It’s not uncommon to think of cybersecurity as primarily a technological challenge, but it’s really more of a human one, Alex Blau writes at the Harvard Business Review, in that cyberattacks so frequently take advantage of human error. Most of the large-scale cyberattacks that have made headlines in the past year at some point involved someone making a mistake or exercising bad judgment and accidentally giving cybercriminals access to sensitive data. Behavioral science, Blau observes, helps explain why people (including your employees) have a hard time adopting good cybersecurity habits:
One major insight from the fields of behavioral economics and psychology is that our behavioral biases are quite predictable. For instance, security professionals have said time and again that keeping software up-to-date, and installing security patches as soon as possible, is one of the best methods of protecting information security systems from attacks. However, even though installing updates is a relative no-brainer, many users and even IT administrators procrastinate on this critical step. Why? Part of the problem is that update prompts and patches often come at the wrong time — when the person responsible for installing the update is preoccupied with some other, presently pressing issue.
Blau’s insight here underscores something we discovered in our recent study of organizational culture at CEB, now Gartner. When culture change efforts fail, it is sometimes because employees are unable to manage the tension between the desired change and their day-to-day workflow. Getting employees to adopt a new habit at work means understanding the tradeoffs they need to make in order to do so, minimizing those tradeoffs as much as possible, and giving employees guidance on how to manage them. When best practices in cybersecurity (or any other area where you’re hoping to change employees’ habits) get in the way of an employee doing their work efficiently, employees are more likely to sidestep them.
In recent years, behavioral economists have become increasingly enthusiastic about the concept of “nudging”—prodding people toward more beneficial behaviors by making them the default option in some of the many choices individuals make about their health or finances. An example of nudging with which employers will be familiar is auto-enrollment in 401(k) plans, which past research has shown results in much higher participation rates than an opt-in system: When the default option is to participate, employees are more likely to do so because it takes more effort not to. Employers have also experimented with nudging strategies to encourage employees toward healthy choices like getting their yearly flu shot.
The Association for Psychological Science highlights a new study, published last week, that “compared the effectiveness of nudge-type strategies with more standard policy interventions” and found that nudges are substantially more effective at encouraging both financial and physical wellness:
In the case of retirement savings, for example, a nudge that prompted new employees to indicate their preferred contribution rate to a workplace retirement-savings plan yielded a $100 increase in employee contributions per $1 spent on implementing the program; the next most cost-effective strategy, offering monetary incentives for employees who attended a benefits fair, yielded only a $14.58 increase in employee contributions per $1 spent on the program.
“Defaults (and their designers) hold immense power,” Lena Groeger writes at ProPublica, “they make decisions for us that we’re not even aware of making”:
Consider the fact that most people never change the factory settings on their computer, the default ringtone on their phones, or the default temperature in their fridge. Someone, somewhere, decided what those defaults should be – and it probably wasn’t you.
Another example: In the U.S. when you register for your driver’s license, you’re asked whether or not you’d like to be an organ donor. We operate on an opt-in basis: that is, the default is that you are not an organ donor. If you want to donate your organs, you need to actively check a box on the DMV questionnaire. Only about 40 percent of the population is signed up to be an organ donor. In other countries such as Spain, Portugal and Austria, the default is that you’re an organ donor unless you explicitly choose not to be. And in many of those countries over 99 percent of the population is registered.
In other words, default settings serve as a powerful means of “nudging” people to prefer one choice over another. Groeger goes on to discuss how this same principle applies to the realm of retirement savings as well.
Robin Mestre, the principal strategic advisor at Google for Work, takes to the pages of Fast Company to discuss how machine learning technologies are transforming our approach to workplace wellness, offering ways to automatically encourage or enable healthier behaviors:
Broadly speaking, the biggest potential advantage to bringing in machine learning for on-the-job health is that it’s all about putting data we already have into action in real time. Fitness trackers do that to a certain extent already, and many of them are designed to use that data to help modify users’ behavior (though some are skeptical as to how successfully they do so).
But not all tools based on machine learning are wearables, and many can still give you in-the-moment nudges in subtle, helpful ways (that don’t make you feel bad about yourself). A recent one that we’ve built here at Google is the Goals feature of Google Calendar, released last month and meant to help people find time for their goals—whether that’s going to the gym or wrapping up creative projects on schedule. After answering a few simple questions like “How often?” and “Best time?” Google Calendar finds the optimal window in your schedule and automatically reschedules if a conflict comes up.