85% of hacks exploit our fundamental behaviors
While most people think of hacks as attacking a line of code or correctly guessing a password, the reality is that an estimated 85% of cyber attacks across the world occur because hackers understand our core behaviors, such as trust and curiosity, and exploit them through carefully crafted scams.1 This psychological manipulation, which aims to get victims to share information or carry out an action, is called social engineering.2 Understanding why it works is key to preventing its success.
A closer look at persuasion strategies
Hackers, aware of our behavioral propensities, exploit our hardwiring to gain information. Even employees with the most cybersecurity experience and education can fall victim if they are unaware of how their reflexes and habits can be used against them.
The most widely accepted framework for persuasion strategies identifies a set of core behaviors that bad actors use to manipulate our nature as social creatures.3 According to this framework, there are six components: reciprocity, conformity/social proof, liking, scarcity, commitment, and authority.3
Six persuasion strategies hackers use
- Hackers exploit Reciprocity to take advantage of our nature of giving something in return when feeling indebted to someone.
- We also tend to seek Conformity by imitating others’ behavior.
- Similarly, cyber criminals try to make their persona similar to the victim because we tend to Like people who are similar to us.
- When items are Scarce, people view a product or service as more valuable and desirable than comparable alternatives.
- Individuals generally Commit to promises, so when hackers coerce a target into making a promise, they can usually trust that the target will follow through.
- Lastly, people tend to obey requests from those who have Authority over them.
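These principles often surface as recognizable language in phishing messages. As a purely illustrative sketch (the cue phrases and threshold below are our own assumptions, not a vetted phishing filter), a simple heuristic might flag messages that stack several persuasion cues at once:

```python
# Illustrative heuristic: flag messages that combine several persuasion cues.
# The cue phrases and the threshold are hypothetical examples for this
# article, not a production-grade detector.
PERSUASION_CUES = {
    "reciprocity": ["free gift", "as a thank you", "we've done this for you"],
    "social_proof": ["your colleagues have", "everyone is", "most users"],
    "liking": ["fellow", "just like you"],
    "scarcity": ["limited time", "only a few left", "expires today"],
    "commitment": ["you agreed", "as promised", "confirm your commitment"],
    "authority": ["your ceo", "it department", "compliance requires"],
}

def persuasion_cues_found(message: str) -> list[str]:
    """Return the persuasion principles whose cue phrases appear in the text."""
    text = message.lower()
    return [
        principle
        for principle, phrases in PERSUASION_CUES.items()
        if any(phrase in text for phrase in phrases)
    ]

def looks_like_social_engineering(message: str, threshold: int = 2) -> bool:
    """Flag a message when it stacks at least `threshold` distinct cues."""
    return len(persuasion_cues_found(message)) >= threshold

msg = "Limited time offer from your CEO - 82% of your colleagues have signed up!"
print(persuasion_cues_found(msg))  # ['social_proof', 'scarcity', 'authority']
```

Real attacks are subtler than keyword matching can catch, which is exactly why the behavioral training discussed below matters; the sketch simply shows how densely a single phishing line can layer these principles.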
Why we fall prey to social engineering: Bounded rationality and FOMO
Criminals shape their attacks around the strategies above in part because, when we are unaware of the following mental shortcuts and reflexive ways of reacting, our behavior becomes more predictable.
What is bounded rationality?
Every day, we make decisions that are limited by the information we have on hand: at best, that information is incomplete, and at worst, it rests on false beliefs. Even if we're well-versed and well-educated on a subject, our ability to sift through our knowledge and make an informed decision is constrained by our human processing power. These limitations encapsulate the concept of bounded rationality.4
In the context of cybersecurity, this means that a user might intellectually know not to click a suspicious email, yet rely only on the judgment they have in the moment: perhaps they've been waiting for a stressful email from their child's school and let their anxiety overwhelm their better judgment. Under bounded rationality, reasoning is hindered by cognitive limitations, time pressure or stress, and imperfect information. In this setting, an employee might be stressed about a looming deadline or an angry boss, so an email that looks like it comes from their boss demanding credit card information exploits their already limited ability to make rational decisions.
FOMO: Not just for teens
First described by author Patrick McGinnis in a 2004 op-ed, the term FOMO, or Fear of Missing Out, has grown in popularity ever since.5 It describes the feeling of apprehension that arises when people feel they're not looped into events, information, or experiences they'd otherwise want to be involved in. When online, those who experience high levels of FOMO are likely to forgo cyber safety practices in order to join in on what they feel they're missing out on.6
A hacker might make it seem like everyone else is doing something that you're not: "82% of your colleagues have already signed up, why haven't you?" They might also put time pressure on a task: "limited time offer - sign up now."
More FOMO, more cybersecurity risk
FOMO has been linked to increased risk-taking behavior and heightened vulnerability while online.6 It also strongly predicted poor information security awareness (ISA), even when controlling for demographics and personality traits.7 Additionally, those with high self-reported scores of FOMO feel more negatively toward ISA, have poorer knowledge of it, and exhibit behaviors that could threaten their organization’s security.7
Even though it may seem impossible to target and reduce this fear, training that focuses on awareness of how FOMO plays a role in cybersecurity sets your employees up to be more self-aware and less likely to fall prey to FOMO or bounded rationality.
Train for reflexes, not just knowledge
Typical cybersecurity training might look like asking an employee to watch a video on the five best ways to protect one's passwords and then take a quiz afterward. They may become aware of the right thing to do, but they might not actually do it. This example illustrates the knowing and doing gap, which suggests that there are weak points between awareness, behavioral intention, and actual behavior.8 Awareness and training activities do increase knowledge; however, that knowledge does not always translate into observable behavioral changes.8
The role of heuristics and habits in our defenses
A more promising approach is to create ongoing training initiatives that focus on boosting awareness of one's habits; this type of learning is also referred to as persuasion knowledge.9 In one study, awareness of how social engineering exploits our behavior led to a decreased susceptibility to hacks, improving organizational safety.10 There is a positive relationship between resistance to persuasion attempts and persuasion knowledge, and people who have been trained accordingly are better equipped to react when being exposed to a social engineering scam.11
Training tailored toward self-awareness has also been shown to combat FOMO: when an employee learns why FOMO happens and how it’s triggered, they build resilience that helps them recognize their emotions before reacting.12 Through increased recognition of how their cognitive tendencies are being exploited, they also build an understanding of the social and personal countermeasures that are available to them.12
Teaching counter strategies to prevent hacks
One way to implement this type of training is to teach relevant psychological principles and counter strategies.9 Drive these points home by incorporating short scenarios or practice exercises in order to showcase the behavioral principles in practice.9 A second strategy includes realistic role playing in which participants have to make a similar decision multiple times.9 While potentially more effective, this strategy is also more effortful and may be best limited to key personnel.9
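To make the scenario-based approach concrete, here is a minimal sketch of how such an exercise could be structured; the scenarios, principle labels, and grading logic are our own hypothetical example, not a tool described in the cited studies:

```python
# Hypothetical scenario-based training exercise: each item pairs a short
# social-engineering scenario with the persuasion principle it exploits.
SCENARIOS = [
    {
        "prompt": "An email from 'IT Support' orders you to reset your "
                  "password via the attached link within one hour.",
        "principle": "authority",
    },
    {
        "prompt": "A vendor offers a free gift card if you fill in your "
                  "corporate account details today.",
        "principle": "reciprocity",
    },
    {
        "prompt": "A message claims only 3 license seats remain and urges "
                  "you to sign up with your work credentials now.",
        "principle": "scarcity",
    },
]

def grade(answers: list[str]) -> tuple[int, list[str]]:
    """Score the learner's answers and collect feedback for each miss."""
    score, feedback = 0, []
    for scenario, answer in zip(SCENARIOS, answers):
        if answer == scenario["principle"]:
            score += 1
        else:
            feedback.append(
                f"'{scenario['prompt'][:40]}...' exploits "
                f"{scenario['principle']}, not {answer}."
            )
    return score, feedback

score, feedback = grade(["authority", "scarcity", "scarcity"])
print(f"{score}/{len(SCENARIOS)} correct")  # 2/3 correct
```

Immediate, scenario-specific feedback like this is what separates practice exercises from the passive video-and-quiz format critiqued above: the learner rehearses the recognition reflex, not just the fact.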
A positive side effect of persuasion knowledge: Engaged employees
By focusing training on how social engineers hack our behavior, rather than putting the blame on the victim, you can also foster a culture of education and support rather than fear.13 This will help keep your staff engaged,13 which is protective against the type of negligence that leads to successful hacking attempts.14 As technology develops and hackers become savvier, traditional training may grow outdated, but behavior-focused training is unlikely to go out of date, since hackers will always try to exploit our human nature.
Not all organizations have an internal team with the behavioral knowledge to create such training programs. The Decision Lab, drawing on expertise in consulting and the social sciences, can help you create behaviorally-informed training programs that will strengthen your organization’s cybersecurity.
Protecting ourselves through behavioral science education
While hackers using social engineering seem to have a sixth sense when it comes to knowing how to exploit our behaviors, thankfully there are ways to circumvent their efforts. One evidence-based strategy is learning more about our own reflexes and the reasons why we fall prey to hacks.
To keep up-to-date on how behavioral science applies to all facets of life, subscribe to The Decision Lab’s monthly newsletter. If you would like to find out more about cybersecurity preparedness and additional strategies, see Strengthen Your Strategy with Cybersecurity by The Decision Lab and Boston Consulting Group.
The Decision Lab is a behavioral consultancy that uses science to advance social good. In the digital age, cybercrime is a threat to all of us — and the root cause of most breaches is human error. We work with some of the most innovative minds in cybersecurity to help organizations navigate this risk by understanding & weeding out sources of bias. If you'd like to tackle this together, contact us.
- Verizon. (2021). 2021 Data Breach Investigations Report. verizon.com/dbir
- Mitnick, K. D., & Simon, W. L. (2011). The Art of Deception: Controlling the Human Element of Security. John Wiley & Sons.
- Cialdini, R. (2009). Influence: The Psychology of Persuasion. Harper Collins.
- Gobet, F., Richman, H., Staszewski, J., & Simon, H. A. (1997). Goals, Representations, and Strategies in a Concept Attainment Task: The EPAM Model. In D. L. Medin (Ed.), Psychology of Learning and Motivation (Vol. 37, pp. 265–290). Academic Press. https://doi.org/10.1016/S0079-7421(08)60504-6
- Social Theory at HBS: McGinnis’ Two FOs. (2004, May 10). The Harbus. https://harbus.org/2004/social-theory-at-hbs-2749/
- Buglass, S. L., Binder, J. F., Betts, L. R., & Underwood, J. D. M. (2017). Motivators of online vulnerability: The impact of social network site use and FOMO. Computers in Human Behavior, 66, 248–255. https://doi.org/10.1016/j.chb.2016.09.055
- Hadlington, L., Binder, J., & Stanulewicz, N. (2020). Fear of Missing Out Predicts Employee Information Security Awareness Above Personality Traits, Age, and Gender. Cyberpsychology, Behavior, and Social Networking, 23(7), 459–464. https://doi.org/10.1089/cyber.2019.0703
- Gundu, T. (2019, May 13). Acknowledging and Reducing the Knowing and Doing Gap in Employee Cybersecurity Compliance. International Conference on Cyber Warfare and Security, Stellenbosch, South Africa.
- Schaab, P., Beckers, K., & Pape, S. (2017). Social engineering defence mechanisms and counteracting training strategies. Information & Computer Security, 25(2), 206–222. https://doi.org/10.1108/ICS-04-2017-0022
- Bullée, J.-W. H., Montoya, L., Pieters, W., Junger, M., & Hartel, P. H. (2015). The persuasion and security awareness experiment: Reducing the success of social engineering attacks. Journal of Experimental Criminology, 11(1), 97–115. https://doi.org/10.1007/s11292-014-9222-7
- Briñol, P., Rucker, D. D., & Petty, R. E. (2015). Naïve theories about persuasion: Implications for information processing and consumer attitude change. International Journal of Advertising, 34(1), 85–106. https://doi.org/10.1080/02650487.2014.997080
- Alutaybi, A., Al-Thani, D., McAlaney, J., & Ali, R. (2020). Combating Fear of Missing Out (FoMO) on Social Media: The FoMO-R Method. International Journal of Environmental Research and Public Health, 17(17), 6128. https://doi.org/10.3390/ijerph17176128
- Caldwell, T. (2016). Making security awareness training work. Computer Fraud & Security, 8–14. https://doi.org/10.1016/S1361-3723(15)30046-4
- Blau, A., Alhadeff, A., Stern, M., Stinson, S., & Wright, J. (2017). Deep Thought: A Cybersecurity Story. ideas42. https://www.ideas42.org/wp-content/uploads/2016/08/Deep-Thought-A-Cybersecurity-Story.pdf
About the Authors
Lindsey Turk
Lindsey Turk is a Summer Content Associate at The Decision Lab. She holds a Master of Professional Studies in Applied Economics and Management from Cornell University and a Bachelor of Arts in Psychology from Boston University. Over the last few years, she’s gained experience in customer service, consulting, research, and communications in various industries. Before The Decision Lab, Lindsey served as a consultant to the US Department of State, working with its international HIV initiative, PEPFAR. Through Cornell, she also worked with a health food company in Kenya to improve access to clean foods and cites this opportunity as what cemented her interest in using behavioral science for good.
Dr. Brooke Struck
Dr. Brooke Struck is the Research Director at The Decision Lab. He is an internationally recognized voice in applied behavioural science, representing TDL’s work in outlets such as Forbes, Vox, Huffington Post and Bloomberg, as well as Canadian venues such as the Globe & Mail, CBC and Global Media. Dr. Struck hosts TDL’s podcast “The Decision Corner” and speaks regularly to practicing professionals in industries from finance to health & wellbeing to tech & AI.
Dan is a Co-Founder and Managing Director at The Decision Lab. He has a background in organizational decision making, with a BComm in Decision & Information Systems from McGill University. He has worked on enterprise-level behavioral architecture at TD Securities and BMO Capital Markets, where he advised management on the implementation of systems processing billions of dollars per week. Driven by an appetite for the latest in technology, Dan created a course on business intelligence and lectured at McGill University, and has applied behavioral science to topics such as augmented and virtual reality.