
Cybersecurity and data privacy: How to build smart cyber habits with employee training

Why is it so hard to change behavior?

Whether it’s New Year’s resolutions, eating habits, or cybersecurity, people generally maintain the habits they’ve always had - unless they have the proper resources and an ecosystem that facilitates change.1

Old habits die hard

Cybersecurity practices among long-standing employees aren’t much different, even considering the shift to working from home throughout the COVID-19 pandemic. Though remote work puts employees at increased risk of cyberattack, they still often prefer to do things the same way they’ve done them in the past.2 Mandated training alone simply isn’t very effective at changing this.3

In fact, according to a 2021 investigation, 60% of companies (ranging in size from fewer than 500 employees to more than 1,500) have 500 or more employee accounts with non-expiring passwords - meaning those employees likely haven’t changed their passwords since the day they joined the organization.4

Using behavioral science to break our dangerous habits

While the onboarding process provides an excellent opportunity to form cybersecurity habits (more on that here), what can be done for people whose methods have been shaped by months or years with the same employer? To understand the solutions, it’s important to discern how three behavioral tendencies - social loafing, habituation, and status quo bias - play a role in employee negligence. Once the reasons behind these mental shortcuts are understood, leadership can enact efficient and effective countermeasures.


Three behaviors that contribute to poor cyber habits

Social Loafing: We think our actions don’t matter

Social loafing is the tendency to exert less effort in a group setting than when working alone. It’s especially pronounced in settings where the effort of each member gets combined into a total outcome for the group, making it difficult to parse out who contributed what.5

Social loafing stems from the assumption that our individual actions don’t matter, which can have devastating consequences for cybersecurity. Part of why it happens is our tendency to think that if something is truly important, surely somebody else will be on top of it.6

Conversely, when employees think their security compliance actions significantly and positively impact the organization, they practice better habits and exhibit less social loafing.7

Figure: The Collective Effort Model (CEM)

Social Loafing solution: Enhance social norms 

To remind employees that their actions do matter, social norms need to be shaped to reflect this.

Normative beliefs have a significant influence on the behavior of employees - so perceived expectations from peers, IT management, and superiors will have the most impact on cybersecurity practices.7

Showing employees the cyber-safe steps others are taking - whether through a monthly newsletter or briefly in a weekly meeting - will likely reduce behaviors stemming from social loafing. However, this works only when:

  • The majority of people are actually exhibiting “good” behavior
  • The majority behavior is known
  • Each individual can make a salient comparison between their own behavior and the norm2

To create these conditions and reduce social loafing among employees, supervisors should provide personalized feedback to subordinates about their performance and how it compares to that of others. This not only creates social norms against which people can evaluate their progress, but also makes employees feel supported and satisfied - key factors when trying to foster a sense of belonging within an organization.8
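
To make this concrete, here is a minimal Python sketch of what personalized, norm-referenced feedback could look like. It assumes each employee already has a security-compliance score on a 0-100 scale; the scoring, names, and messages are invented for illustration and are not part of any real tool.

```python
# A minimal sketch of personalized, norm-referenced security feedback.
# Assumes a hypothetical 0-100 compliance score already exists per employee.
from statistics import median

def norm_feedback(scores):
    """Compare each employee's compliance score to the team norm (median)."""
    team_norm = median(scores.values())
    feedback = {}
    for name, score in scores.items():
        if score >= team_norm:
            feedback[name] = (f"Your security score ({score:.0f}) meets or exceeds "
                              f"the team norm ({team_norm:.0f}) - it makes a difference.")
        else:
            feedback[name] = (f"Your security score ({score:.0f}) is below the team "
                              f"norm ({team_norm:.0f}) - small steps like enabling "
                              f"2FA would close the gap.")
    return feedback

if __name__ == "__main__":
    demo_scores = {"avery": 92, "blake": 61, "casey": 78}  # invented data
    for name, message in norm_feedback(demo_scores).items():
        print(f"{name}: {message}")
```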

Habituation: We get used to warnings

During the process of habituation, people react less and less to a recurring stimulus as time goes on.9 An example of this is getting news updates on your phone: if you choose to receive only breaking news alerts, you’ll get just one or two per day and are more likely to respond to them. If you opt in to several notifications an hour, you’ll be less likely to read any of them - including the breaking news updates.

We tend to get tired and overwhelmed if there is too much information. In terms of cybersecurity, habituation may look like employees disengaging from safe behaviors altogether.10

Part of the reason habituation occurs so frequently is that:

  • Cybersecurity situations often look identical, occur in close succession, and carry no immediate consequences when the stimulus is ignored.11 For example, one may receive a warning about creating a stronger password every time they log into their email; because the warnings are easy to ignore and appear so often, the user can become habituated and neglect to change their password at all.
  • Complex language may turn users off from responding to every stimulus because it might take too much time or effort to try to understand what the issue is.12
  • If the downstream effects of one’s responses are not variable and salient, this can further enhance the strength of the habituation effect.13

Habituation solution: Create unique warnings and clarify consequences

How can we make sure that employees don’t become habituated to cyber warnings? One solution is to create warnings that are distinct from each other and describe how the update can help the user.2 Where possible, make these warnings infrequent - focused only on the highest-value points requiring engagement - as continued exposure can deplete employees of cognitive energy14 or lead to stress and fatigue.15
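
As a loose illustration of “distinct and infrequent” in practice, here is a minimal Python sketch that rotates between varied message templates and suppresses any warning type shown within the past week. The templates, the one-week window, and all names are assumptions for illustration, not a real warning system.

```python
# A minimal sketch: vary warning wording to slow habituation, and
# rate-limit each warning type so repetition stays infrequent.
import random
import time

TEMPLATES = {
    "weak_password": [  # invented wording; each variant also states a benefit
        "Your password could be guessed quickly - a longer passphrase keeps your inbox yours.",
        "Attackers try short passwords first. Two minutes now prevents days of cleanup later.",
        "A stronger password protects your files - and your teammates' - from compromise.",
    ],
}

ONE_WEEK = 7 * 24 * 3600          # show each warning type at most weekly
_last_shown = {}                  # warning type -> timestamp last displayed

def maybe_warn(warning_type, now=None):
    """Return a varied warning message, or None if shown too recently."""
    now = time.time() if now is None else now
    if now - _last_shown.get(warning_type, float("-inf")) < ONE_WEEK:
        return None               # suppressed: repetition breeds habituation
    _last_shown[warning_type] = now
    return random.choice(TEMPLATES[warning_type])

print(maybe_warn("weak_password"))  # a randomly varied warning
print(maybe_warn("weak_password"))  # None - within the one-week window
```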

Another avenue is to clearly state the consequences of ignoring warnings. For example, upon receiving a potential phishing email, employees would have to read a detailed description of what it looks like to go through a phishing attack before continuing. Similarly, they could watch a brief video of people who have experienced a phishing attack and then be asked whether they still want to open the suspicious email.2
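
A consequence-first gate of this kind might look like the following minimal sketch, assuming the mail client can intercept the “open” action on a flagged message; the interception point and the wording are invented for illustration.

```python
# A minimal sketch of a consequence-first confirmation gate for emails
# already flagged as suspicious. The flagging itself is assumed to exist.
CONSEQUENCES = (
    "Opening a phishing link can expose your credentials within seconds.\n"
    "Recovering a compromised account often costs days of work for you\n"
    "and the IT team."
)

def open_flagged_email(subject):
    """Show the consequences first, then ask whether to proceed."""
    print(f"'{subject}' was flagged as possible phishing.\n")
    print(CONSEQUENCES)
    answer = input("\nDo you still want to open it? [y/N] ").strip().lower()
    return answer == "y"

if __name__ == "__main__":
    if open_flagged_email("Urgent: verify your payroll details"):
        print("Opening message...")
    else:
        print("Message left quarantined.")
```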

Status Quo Bias: We’re hesitant to change

The status quo bias occurs when we overestimate the positive value of our current circumstances, reducing our motivation to change them.2 It leads us to stick with the status quo even when we really shouldn’t - for example, keeping a two-year-old password or not updating computer software. This behavior is caused by one of three issues:16

  1. Rational decision-making in the face of uncertainty
  2. Cognitive misperceptions
  3. Psychological commitments

  1. Rational decision-making
    Surprisingly, putting off a software update can be a rational decision. In the short term - and in nearly all of the user’s most salient interactions - changes like a software update feel either neutral (i.e. it seems like nothing’s changed) or negative (i.e. the software becomes harder to use, breeding frustration).
    Since users already know how to use their current software, the perceived risk of changing it may be too high. If the update is easy to ignore and offers no immediate value - or even a negative one - it hardly seems worth it. This leads the individual to assume that the status quo is the best option available, and so they don’t update their software.
  2. Cognitive misperceptions
    Following logic similar to loss aversion, people feel the losses of leaving the status quo more saliently than the benefits of that change. Even when there are no loss or gain framing effects, the status quo bias still persists.16
    Cognitive misperceptions that lead to status quo bias occur in decision-making processes that require people to learn about their options in sequential order. Because they’ve already invested time in the first few options, they are more likely to choose one of these rather than spend additional time learning about all the others. For example, say you are starting a new job and trying to decide which health insurance policy to choose. Given that it takes a significant amount of time to weigh the pros and cons of each policy, it’s unreasonable to expect that you’ll thoroughly investigate every option. Your most rational course of action is to look at a subset of your options and ignore the rest. In this scenario, the status quo option holds a competitive advantage simply because of its placement in your decision-making process.
  3. Psychological commitments
    The third and final issue that can lead to status quo bias is when we psychologically commit to something in order to justify the time and effort we’ve already invested (also known as the sunk cost fallacy). The more time one has spent investigating the status quo option, the more likely they are to stick with this commitment in the future, even if better alternatives arise.

Status Quo Bias solution: Communication and ease of use

The first strategy to implement is to change the default settings where possible in order to make them more secure. For example, when bringing in new employees, automatically set them up with two-factor authentication, complex passwords, and a secure password manager. Additionally, change their settings to have passwords automatically expire after a few months. Forcing their passwords to expire would require significantly less effort than allowing passwords to age and trying to encourage employees to change them.
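
As a rough sketch of what secure-by-default onboarding could look like in code - with the identity provider and all of its methods as hypothetical stand-ins rather than any real API:

```python
# A minimal sketch of secure-by-default onboarding. `Directory` is a
# hypothetical stand-in for an identity provider, not a real API.
from dataclasses import dataclass

@dataclass
class SecureDefaults:
    require_2fa: bool = True          # two-factor authentication on by default
    min_password_length: int = 16     # complex passwords from day one
    password_max_age_days: int = 90   # passwords expire after a few months
    password_manager: bool = True     # provision a secure password manager

class Directory:
    """Stand-in identity provider; just logs what it would configure."""
    def apply(self, username, defaults):
        print(f"create user: {username}")
        print(f"  enforce 2FA: {defaults.require_2fa}")
        print(f"  minimum password length: {defaults.min_password_length}")
        print(f"  password expires after: {defaults.password_max_age_days} days")
        if defaults.password_manager:
            print("  assign password-manager app")

Directory().apply("new.hire", SecureDefaults())
```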

A second method is to create a short skill-enhancing quiz for repeat use. The quiz might entail reviewing suspicious emails and deciding whether they are phishing or merely spam. Depending on their scores, employees would get feedback on how to improve the next time they’re faced with unexpected emails. This strategy addresses both the psychological commitments and the cognitive misperceptions behind the status quo bias by showing that habitual behavior can lead to losses - and that it’s worth adapting for better protection.
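
A self-contained sketch of such a quiz might look like the following; the sample emails, labels, and feedback messages are invented for illustration:

```python
# A minimal sketch of a repeat-use phishing quiz with score-based feedback.
# Sample emails are invented; True marks the phishing examples.
SAMPLES = [
    ("IT Support <helpdesk@yourc0mpany-login.net>: "
     "'Your mailbox is full - click here within 24 hours.'", True),
    ("HR <hr@yourcompany.com>: "
     "'Reminder: benefits enrollment closes Friday; see the intranet.'", False),
    ("CEO <ceo.office@gmail.com>: "
     "'Are you at your desk? I need gift cards urgently.'", True),
]

def run_quiz():
    score = 0
    for text, is_phishing in SAMPLES:
        guess = input(f"\n{text}\nPhishing? [y/n] ").strip().lower() == "y"
        if guess == is_phishing:
            score += 1
            print("Correct.")
        elif is_phishing:
            print("Not quite - check the sender's domain and the urgent tone.")
        else:
            print("Not quite - this one is a routine internal notice.")
    print(f"\nYou scored {score}/{len(SAMPLES)}.")
    if score < len(SAMPLES):
        print("Tip: verify sender addresses and be wary of urgent requests.")

if __name__ == "__main__":
    run_quiz()
```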

Finally, if an organization is unable to change default settings or deliver value at points of friction, it can resort to framing effects - for instance, framing the default option as the less attractive one.17 Because this bias is closely related to loss aversion, framing the default option as a loss encourages employees to change their initial settings, thus strengthening cybersecurity practices.17

Address your employees’ bad habits via behavioral science

Understanding how to ensure your tenured employees practice cyber-safe behaviors requires an understanding of the three key tendencies that contribute to unsafe practices: social loafing, habituation, and status quo bias. Most people in your organization don’t exhibit hazardous online behavior to be malicious; they are acting out the same mental shortcuts that we all experience at one point or another, just in different settings. With a psychologically informed and empathetic approach, you can ensure not only that your firm is at reduced risk of a cyberattack, but also that your employees feel more secure and valued in their roles.

The Decision Lab is a behavioral consultancy that uses science to advance social good. We work with some of the largest organizations in the world to spark change and tackle tough societal problems. Data-driven decision-making is key to eliminating bias in the workplace and maximizing the talent organizations already have on hand. If you'd like to tackle this together in your workplace, contact us.

References

  1. Breaking Bad Habits. (2012, January). NIH News in Health. https://newsinhealth.nih.gov/2012/01/breaking-bad-habits
  2. Blau, A., Alhadeff, A., Stern, M., Stinson, S., & Wright, J. (2017). Deep Thought: A Cybersecurity Story. ideas42. https://www.ideas42.org/wp-content/uploads/2016/08/Deep-Thought-A-Cybersecurity-Story.pdf
  3. Cisco Systems, Inc. (2008). Data Leakage Worldwide: The High Cost of Insider Threats [White paper]. https://www.01net.it/whitepaper_library/Cisco_DataLeakage.pdf
  4. Varonis. (2021). 2021 Financial Services Data Risk Report.
  5. Hoffman, R. (2020, June 22). Social loafing: Definition, examples and theory. Simply Psychology. https://www.simplypsychology.org/social-loafing.html
  6. Darley, J. M., & Latane, B. (1968). Bystander intervention in emergencies: Diffusion of responsibility. Journal of Personality and Social Psychology, 8(4), 377–383. https://doi.org/10.1037/h0025589
  7. Herath, T., & Rao, H. R. (2009). Encouraging information security behaviors in organizations: Role of penalties, pressures and perceived effectiveness. Decision Support Systems, 47(2), 154–165. https://doi.org/10.1016/j.dss.2009.02.005
  8. Meyer, J. P., & Allen, N. J. (1991). A three-component conceptualization of organizational commitment. Human Resource Management Review, 1(1), 61–89. https://doi.org/10.1016/1053-4822(91)90011-Z
  9. Thompson, R. F., & Spencer, W. A. (1966). Habituation: A model phenomenon for the study of neuronal substrates of behavior. Psychological Review, 73(1), 16–43. https://doi.org/10.1037/h0022681
  10. Furnell, S., & Thomson, K.-L. (2009). Recognising and addressing ‘security fatigue.’ Computer Fraud & Security, 2009(11), 7–11. https://doi.org/10.1016/S1361-3723(09)70139-3
  11. Amran, A., Zaaba, Z. F., & Mahinderjit Singh, M. K. (2018). Habituation effects in computer security warning. Information Security Journal: A Global Perspective, 27(4), 192–204. https://doi.org/10.1080/19393555.2018.1505008
  12. Bravo-Lillo, C., Cranor, L. F., Downs, J., & Komanduri, S. (2011). Bridging the Gap in Computer Security Warnings: A Mental Model Approach. IEEE Security & Privacy, 9(2), 18–26. https://doi.org/10.1109/MSP.2010.198
  13. Boutros, N., & Davis, T. (2022). Habituation: Definition, Examples, & Why It Occurs. The Berkeley Well-Being Institute. https://www.berkeleywellbeing.com/habituation.html
  14. Pignatiello, G. A., Martin, R. J., & Hickman, R. (2020). Decision fatigue: A conceptual analysis. Journal of Health Psychology. https://doi.org/10.1177/1359105318763510
  15. Salvagioni, D. A. J., Melanda, F. N., Mesas, A. E., González, A. D., Gabani, F. L., & Andrade, S. M. de. (2017). Physical, psychological and occupational consequences of job burnout: A systematic review of prospective studies. PloS One, 12(10), e0185781. https://doi.org/10.1371/journal.pone.0185781
  16. Samuelson, W., & Zeckhauser, R. (1988). Status quo bias in decision making. Journal of Risk and Uncertainty, 1(1), 7–59. https://doi.org/10.1007/BF00055564
  17. Fallahdoust, M. (2022). Nudges and Cybersecurity: Harnessing Choice Architecture for Safer Work-From-Home Cybersecurity Behaviour [Thesis, Carleton University]. https://curve.carleton.ca/92b0cf7c-8751-4587-be25-8baa920f4ea8

About the Authors


Lindsey Turk

Lindsey Turk is a Summer Content Associate at The Decision Lab. She holds a Master of Professional Studies in Applied Economics and Management from Cornell University and a Bachelor of Arts in Psychology from Boston University. Over the last few years, she’s gained experience in customer service, consulting, research, and communications in various industries. Before The Decision Lab, Lindsey served as a consultant to the US Department of State, working with its international HIV initiative, PEPFAR. Through Cornell, she also worked with a health food company in Kenya to improve access to clean foods and cites this opportunity as what cemented her interest in using behavioral science for good.


Dr. Brooke Struck

Dr. Brooke Struck is the Research Director at The Decision Lab. He is an internationally recognized voice in applied behavioural science, representing TDL’s work in outlets such as Forbes, Vox, Huffington Post and Bloomberg, as well as Canadian venues such as the Globe & Mail, CBC and Global Media. Dr. Struck hosts TDL’s podcast “The Decision Corner” and speaks regularly to practicing professionals in industries from finance to health & wellbeing to tech & AI.


Dan Pilat

Dan is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. Dan has a background in organizational decision making, with a BComm in Decision & Information Systems from McGill University. He has worked on enterprise-level behavioral architecture at TD Securities and BMO Capital Markets, where he advised management on the implementation of systems processing billions of dollars per week. Driven by an appetite for the latest in technology, Dan created a course on business intelligence and lectured at McGill University, and has applied behavioral science to topics such as augmented and virtual reality.
