Why do we believe that nothing bad is going to happen?
Normalcy Bias, explained.
What is the normalcy bias?
The normalcy bias describes our tendency to underestimate the possibility of disaster and believe that life will continue as normal, even in the face of significant threats or crises.
Where this bias occurs
Consider the following hypothetical scenario: Emma, a meticulous planner, lives in a region prone to occasional earthquakes. Despite the area’s history of seismic activity, Emma has never experienced a major earthquake in her lifetime. Consequently, she disregards the importance of preparing an emergency kit or formulating an evacuation plan, firmly convinced that her day-to-day life will remain unaltered. ‘Those things happen to other people, not me’, she continues to tell herself.
One day, as Emma goes about her usual routine, she feels a subtle tremor beneath her feet. Instead of immediately recognizing the potential danger and taking precautionary measures, she brushes off the sensation, attributing it to a passing construction truck or some other mundane cause. Within seconds, the light tremor turns into strong shaking and Emma finds herself stuck inside her apartment during a major earthquake.
As this example shows, the normalcy bias occurs when individuals encounter potential threats but instinctively downplay their significance. Emma’s confidence in the predictability of her daily routine prevented her from acknowledging the risks of a major earthquake and the need to prepare accordingly. This cognitive bias is not just confined to natural disasters; it permeates various aspects of our lives, affecting our ability to recognize and respond effectively to impending challenges such as financial uncertainties and health crises.
The normalcy bias, when left unchecked, can have profound implications on individual decision-making and personal outcomes. People often reject signals of potential disruptions due to their subconscious reliance on the assumption that life will follow a familiar course. For instance, someone might disregard the early signs of a developing health issue and fail to seek immediate medical advice, instead convincing themselves that the problem isn’t that serious and will probably go away on its own.
Similarly, in the realm of personal finance, our tendency to remain optimistic and believe that everything will continue to function as normal can lead us to make poor investment choices. Imagine for a moment that you have stocks in a company that has been struggling and share prices begin to fall. Driven by your confidence that the shares will quickly recover and return to their regular performance, you decide not to take any action. However, if the price continues to plummet and you hold onto your investment, your inability to heed the early warning signs may result in significant losses.
The normalcy bias also impacts broader societal structures, affecting how institutions respond to impending challenges. Take, for instance, a city's emergency management system. When decision-makers underestimate the likelihood or potential impact of a natural disaster, their response and resource allocation may fall short when a disaster actually does strike. This is exactly what happened in February 2021 when Winter Storm Uri slammed into the city of Austin, Texas. The city authorities had never anticipated or planned for a severe winter storm on this scale and, as a result, their emergency response was disorganized and far less effective than it needed to be1.
How it affects product
The normalcy bias can significantly impact product development and innovation by fostering resistance to change and a reluctance to adopt new approaches. Companies and individuals influenced by this cognitive bias may stick to familiar methods and features, even when market trends or technological advancements suggest the need for adaptation. This bias can create a barrier to the successful introduction of novel products or improvements, as stakeholders may underestimate the urgency of staying ahead of evolving consumer needs or industry standards.
A good example of this situation is Blockbuster’s infamous decision to turn down an offer to buy Netflix back in the early 2000s2. Despite Netflix’s potential to disrupt the video rental market, Blockbuster believed that the competitor’s mail-order movie rental service was an anomaly and that people would continue their habit of browsing for VHS tapes and DVDs in store. Blockbuster rejected the deal, and by the end of the decade the company had filed for bankruptcy, unable to compete with the growing popularity of Netflix and other online streaming services. Had Blockbuster viewed Netflix’s innovative idea as a genuine threat, the company might have accepted the deal and could still be around today.
The Normalcy Bias and AI
With the ever-increasing presence of AI in nearly every aspect of our daily lives, the normalcy bias can have a profound effect on how we interact with and perceive new technologies. In a survey of US workers conducted by Gallup in 20233, 22% of respondents said that they were worried their jobs would eventually be replaced by AI. Becoming ‘technologically unemployed’ is nothing new and has been causing anxiety among workers since machines began replacing humans in the mid-1800s. However, while the respondents to Gallup’s survey may be unnecessarily concerned about their future jobs, the fact that they are thinking ahead to the worst-case scenario may not be such a bad thing.
No one can deny that AI represents a significant transformation in the workplace. However, by ignoring the potential impact AI can have on work processes, companies and individuals risk diminishing their future productivity and growth. In fact, a recent study conducted by some of the world’s leading universities in collaboration with Boston Consulting Group suggests that AI can significantly improve employee performance4. The researchers asked hundreds of consultants to complete 18 different work tasks, allowing some to use ChatGPT while others worked without it. Those who used AI to complete the tasks produced results of 40% higher quality than those who didn’t.
So rather than burying our heads in the sand and ignoring the influence AI will have on our future jobs, now is the time to break with normality and find ways to use AI as a tool to enhance human performance.
Why it happens
As humans, we tend to base our actions on how often we see and experience things ourselves (inductive thinking), rather than the likelihood of something actually happening (deductive thinking). In other words, even though we might know that there’s a risk of a negative situation occurring, we choose not to take the threat seriously because we’ve never seen or experienced it before. In this way, the normalcy bias builds up in our minds over time; as more and more normal events happen to us, our tendency to underestimate threats and the likelihood of disaster increases.
Our tendency to ignore potential threats also stems from our desire for security, routine, and certainty in our lives. When the repetitive and predictable nature of our everyday existence is disrupted, or our usual surroundings abruptly change, we can feel anxious and unsettled. From this perspective, the normalcy bias acts as a coping mechanism that helps us to deal with uncertainty and reduce stress.
However, to fully understand why the normalcy bias happens, we also need to look at the influence of other cognitive mechanisms and biases on our perception of reality and ability to make decisions.
Always thinking positively
The normalcy bias is closely linked to the optimism bias: our tendency to overestimate our likelihood of experiencing positive events and underestimate our likelihood of experiencing negative ones. Consider a scenario where an individual is facing a looming deadline at work while experiencing symptoms of burnout. Despite feeling overwhelmed and exhausted, they convince themselves that the task will get done on time and that they can maintain their usual work routine. By fixating on the positive outcome of the situation, the individual underestimates the potential impact of burnout on their performance and fails to take appropriate action.
Attachment to current beliefs
Beliefs are our brain’s way of making sense of our complex world and they help us with decision-making. We tend to believe information that is consistent with our existing beliefs, a phenomenon known as confirmation bias. When faced with potential threats, we may be presented with information that goes against our existing beliefs. If we revisit the example of Blockbuster, we can see how the company ignored the threat of mail-order DVDs because they had an unwavering attachment to their existing belief that browsing in store was still the most popular way to rent movies. By staying committed to our current beliefs, we may dismiss information about a potential negative event which feels unfamiliar or counterintuitive.
Following the crowd
When making decisions about how to act in a complex or unfamiliar situation, we take cues from our environment and from what everyone else around us is doing. The sociologist Thomas Drabek found that, when told to evacuate before a potential hurricane or flood, residents check the severity of the situation with four or more other sources, such as family, neighbors, and newscasters, before taking action5. The principle of ‘social proof’ is a heuristic, or mental shortcut, that we use to help us process large amounts of new data and make decisions on how to act. If everyone around us is downplaying the risks of a potential threat and hesitating to take action, we are more likely to follow their example.
Cognitive Dissonance Reduction
One fundamental reason for the normalcy bias lies in our instinctual desire to avoid cognitive dissonance. When faced with conflicting beliefs—such as the belief in a stable, predictable world versus the recognition of a potential threat—our minds strive to alleviate the discomfort arising from this dissonance. Research, including the seminal work of Leon Festinger, indicates that individuals often resort to downplaying the severity of potential risks in order to maintain internal consistency. In the case of the normalcy bias, this manifests as the tendency to perceive threats as less severe than they might actually be, aligning our beliefs with a more comforting narrative.
Most of us have heard the phrase “to cry wolf”. The saying originates from the fable about the shepherd boy who repeatedly fooled villagers into thinking that a wolf was about to attack their sheep. When a wolf actually came to the village, no one listened to the boy’s warning because they believed it was another false alarm. In today’s information-saturated society, we are continuously advised about potential threats in our daily lives, even when the risks are often minimal. Maybe we are told to look at the safety card on a plane, or to read and sign a liability waiver for a sports event we’re taking part in. Most of the time we don’t read these through carefully, either because we already know what the risks are, or because we don’t think they’re relevant to us. Repeated exposure to warnings leads us to normalize them, so when a real threat presents itself, it’s not surprising that we don’t always take it seriously.
In essence, the normalcy bias is a complex interplay of different cognitive processes aimed at reducing internal conflicts, maintaining satisfaction with the status quo, and grappling with the perceived challenges of change. Recognizing these underlying mechanisms is pivotal in overcoming the inertia associated with the normalcy bias and fostering a more adaptive mindset in the face of uncertainty.
Why it is important
In movies, characters often react swiftly to disasters, displaying quick decision-making and heroic responses. In order to keep us glued to our seats, cinematic portrayals tend to exaggerate and dramatize the immediacy of reactions in crisis. This can create a perception that real-life responses should mirror the rapid and decisive actions seen on screen.
In reality, our response to threats and catastrophes couldn’t be more different. The complexity and variability of human behavior during a crisis has fascinated researchers for decades, with studies concluding that people are actually more likely to remain calm or deliberate in an emergency. During 9/11, for example, survivors waited an average of 6 minutes before evacuating the towers, with some waiting up to half an hour to leave6. Around 1,000 people even took the time to shut down their computers and complete other office activities, a way of continuing to engage in normal routines during an unknown situation.
The normalcy bias affects our decision making in a variety of scenarios. It can influence us whether we have a long time to prepare or a short time to react, and can impact situations on both a small and large scale. At an individual level, the normalcy bias can lead to delayed responses and poor decision making, resulting in adverse effects in our personal lives. At a systemic level, minimising potential risks and failing to plan for catastrophes can have far-reaching consequences when large groups of people are involved.
Above all else, the influence of the normalcy bias on people’s decision making can have dangerous, even life-threatening, implications during a disaster or crisis. Ignoring the signs of a potential catastrophe until it is on our doorstep endangers not only our own lives, but also those of the people around us.
How to avoid it
The normalcy bias lets us evade the stark realization that terrible events can happen to us, not just to other people. As hard as it may be, avoiding the bias starts with acknowledging exactly that: disaster can strike anyone, even us.
This can be difficult to do because planning for potential disasters goes against our desire to be optimistic about the future and to feel safe within our immediate environment. Educating ourselves about potential risks and threats may also make us feel unnecessarily worried, anxious, and paranoid.
According to Jack Soll and John Payne, professors at Duke University, and Katherine Milkman, assistant professor at the Wharton School, one of the most challenging behavioral biases we need to overcome is how narrowly we think about the future7: our tendency to plan for only a single forecast, scenario, or outcome. In the context of the normalcy bias, this describes our failure to think ahead to potential negative events or threats, preferring instead to focus on what normally happens. To tackle this limited outlook, the authors suggest four strategies for broadening our perspective:
- Make three estimates or forecasts for the future: when predicting a potential outcome, make a low, medium, and high estimate.
- Forecast twice: after making one forecast, assume it was wrong and predict again. When we think twice about a problem, we usually look at it from a different perspective the second time around.
- Make a premortem: humans are really good at revisiting and dissecting past negative events to try to understand the cause. While postmortems look backwards, premortems imagine potential disasters and explore their likely causes.
- Take an outsider’s view: when you’ve made a decision about something, take a moment to consider what someone on the outside might think about it.
Think back to the beginning of the COVID-19 pandemic when everyone was panic buying toilet roll and other essential items. Overnight, people who usually just bought what they needed for the week ahead became preppers, or survivalists—the people who proactively prepare for potential emergencies such as natural disasters, war, or global catastrophes.
No one is suggesting that you need to stockpile months’ worth of food and toilet roll for a disaster that might never happen. But engaging in scenario planning and preparedness measures, such as emergency kits or financial planning for uncertainties, can provide tangible steps to navigate disruptions more effectively. Instead of assuming that life will always adhere to routine, mentally explore various potential scenarios, both positive and negative. Develop contingency plans for unexpected events and consider how you would react and adapt. This proactive approach not only enhances your readiness for unforeseen challenges, but also disrupts the complacency associated with the normalcy bias.
How it all started
The term ‘normalcy bias’ was not coined by a single individual or first discovered in a seminal study. Rather, the concept has evolved over time in different fields, such as psychology and disaster studies.
Despite the term’s unknown origins, it’s possible to trace its roots back to the pioneering research of American psychologist Leon Festinger. In his theory of cognitive dissonance, Festinger highlights individuals’ tendency to rationalize and maintain consistency in their beliefs and behaviors, even in the face of contradictory information. This cognitive mechanism extends to what is now understood as normalcy bias, where people cling to the familiar and downplay threats when confronted with a crisis or disruptive change. Festinger’s work provides a foundational understanding of how individuals cope with uncertainty and change, offering insights into why people may resist acknowledging or adapting to new information or challenges.
Since Festinger’s pioneering work in the late 1950s, researchers have been applying the normalcy bias to understand how humans react in various situations. While there isn't an abundance of studies specifically devoted to the examination of normalcy bias, researchers have delved into related cognitive phenomena and decision-making biases that share commonalities with, or directly contribute to, the normalcy bias.
For instance, studies on hurricane evacuation have explored why some people are hesitant to evacuate despite clear warnings, attributing this behavior to a form of normalcy bias. In the realm of financial decision-making, behavioral economics research has investigated biases that may explain why investors persist with failing investments, a behavior that aligns with elements of normalcy bias. Additionally, research on responses to health crises, such as the COVID-19 pandemic, has examined why individuals may resist adhering to preventive measures. Though not explicitly labeled as such, these studies collectively contribute to our understanding of the psychological mechanisms underlying normalcy bias in various contexts.
Example 1 – The unknown virus
The normalcy bias recently manifested itself in early 2020, during the initial stages of the COVID-19 pandemic. As the virus began its global spread, there was a tendency among many to downplay the severity of the situation and cling to the belief that life would soon return to normal. Faced with daily warnings of rising death tolls, full-scale lockdowns, and strict social restrictions, many people refused to believe that the pandemic would last beyond a few months, let alone years. Driven by a desire to get back to normal, people tried to continue with their pre-pandemic routines, believing that they wouldn’t be negatively impacted.
The public’s complacency around the virus was exacerbated by the fact that many people didn’t know anyone who had been seriously impacted by the illness, thus reducing their perception of the threat.
In a study conducted across various European countries8, researchers found that as the pandemic developed, people who didn’t know anyone who had had COVID-19 became increasingly unrealistically optimistic about their chances of evading the virus. In other words, despite clear warning signs around them, such as exponentially rising case and hospitalization numbers, people continued to hold onto the belief that the high levels of infection only concerned other people.
In many cases, the normalcy bias contributed to delayed and sometimes inadequate responses, at both individual and societal levels, as people struggled to accept the gravity of the unfolding crisis.
Example 2 – The ‘unsinkable’ ship
The sinking of the RMS Titanic in 1912 is a poignant illustration of the devastating impact the normalcy bias can have on our decision making in times of crisis. Before she set sail, many described the Titanic as ‘unsinkable’ due to her advanced technology and sheer size. Despite receiving several warnings of icebergs in the area from nearby ships, the captain continued at full speed, believing that the crew would react in time if they spotted danger ahead. The ship’s crew members underestimated the potential impact of a collision because they believed that such a catastrophic event couldn’t happen to a state-of-the-art ship on her maiden voyage.
On top of this, evidence that emerged in 2012 showed that one of the Titanic’s safety officers, Maurice Clarke, warned the ship’s company just hours before departure that they should sail with more lifeboats. In their eagerness to leave on time, the company ignored Clarke’s warning and set sail with only 20 lifeboats for the 2,209 passengers on board. The dismissal of Clarke’s warning ultimately led to the deaths of many of the Titanic’s passengers, who were unable to evacuate the ship.
What it is
The normalcy bias describes our tendency to underestimate the possibility of disaster and believe that life will continue as normal, even in the face of significant threats or crises.
Why it happens
The normalcy bias is a complex interplay of different cognitive processes aimed at reducing internal conflicts, maintaining satisfaction with the status quo, and grappling with the perceived challenges of change. In other words, it’s a heuristic that helps us to make decisions and reduce stress during times of crisis or impending disaster. It’s difficult for us to process the implications of unimaginable events until we have actually experienced them. As humans, we tend to base our actions on how often we see and experience things ourselves, rather than the likelihood of something actually happening. As such, we tend to believe that bad things won’t happen to us and that our lives will continue as normal, even if there’s evidence to suggest otherwise.
Example 1 – The unknown virus
During the early stages of the COVID-19 pandemic, many people downplayed the severity of the virus and convinced themselves that life would soon return to normal. Even when faced with the warning signs of increasing cases and hospitalizations, individuals who didn’t know anyone with the disease felt optimistic that it wouldn’t affect them. The normalcy bias not only affected individual decision making, but also contributed to delayed responses at a systemic level, as governments and organizations struggled with a situation completely unknown to them.
Example 2 – The 'unsinkable' ship
When RMS Titanic set sail on her maiden voyage on April 10, 1912, no one imagined that the journey would end in disaster. Despite receiving warnings about insufficient lifeboats on board and the presence of icebergs along the ship’s route, the crew of the luxury liner continued at full speed towards New York City. A combination of underestimating the possibility of disaster, ignoring early warnings, and failing to prepare for an emergency ultimately contributed to the deaths of many passengers on board.
How to avoid it
The first step to avoiding the normalcy bias is making ourselves aware of it and understanding how the bias affects our perception of reality and our ability to make decisions. By acknowledging the fact that negative events can happen to us, and not just to other people, we can start thinking more broadly about possible future outcomes. This means overcoming our tendency to plan for only one forecast, scenario, or outcome, and considering the potential for negative events. While thinking about the worst possible scenario may feel uncomfortable and go against our natural instinct for optimism, it can help us make better decisions should we need to act in an emergency.
Related TDL articles
In this article, Gleb Tsipursky explores how cognitive biases, including the normalcy bias, led individuals, businesses, and governments to misjudge the risks of new, more infectious COVID-19 strains during the pandemic. Tsipursky is a behavioral economist, cognitive neuroscientist, and bestselling author of several books on decision-making and cognitive biases.
In the same way that we tend to underestimate or dismiss potential disasters, we are also inclined to ignore and avoid negative information. Instead of dealing with an uncomfortable situation, we choose to bury our heads in the sand, a phenomenon known as the ostrich effect. Read this article to learn more about what the ostrich effect is, why it happens, and how we can avoid it.