Why do we maintain the same beliefs, even when we are proved wrong?

Belief Perseverance, explained.

What is Belief Perseverance?

Belief perseverance, also known as conceptual conservatism and closely related to the backfire effect, describes how we continue to hold onto established beliefs even when faced with clear, contradictory evidence. We tend to prioritize our initial conclusions and resist changing our minds, even when it might be in our best interest to do so.

Where this bias occurs

Consider Jane, a dedicated fourth-grade teacher with over three decades of experience. Late into her career, a new study is published suggesting that traditional homework methods might not be as effective as once believed. Jane reads the study but quickly dismisses it, trusting her years of observed success in sending worksheets home with her students. Even when her colleagues begin finding positive results from newer methodologies such as educational video games or encouraging free play outside, Jane sticks to her established routines, confident her approach remains superior.

In this example, Jane exemplifies belief perseverance by holding onto her established belief about traditional homework methods, despite new evidence suggesting otherwise. The new study and the successful outcomes of her colleagues should have encouraged her to re-evaluate her methods. Instead, Jane's deeply rooted beliefs resisted change, illustrating how belief perseverance can influence decisions and keep us anchored to outdated or incorrect views.


Individual effects

Belief perseverance plays a pivotal role in how we perceive and engage with the world. When we cling to a belief, even in the face of conflicting evidence, it has the potential to shape our personal trajectory in multifaceted ways.

One of the most evident manifestations is in decision-making.1 Our beliefs serve as cornerstones upon which we base our choices. If an individual persistently holds onto a debunked belief, their judgment can become clouded, leading to decisions that may not be in their best interest. For instance, someone might continually invest in a failing venture, anchored in the belief that it'll eventually succeed, despite evidence showing its decline.

This steadfast adherence to unfounded beliefs can also impact our interpersonal relationships.2 Differences in beliefs are natural, but when one person remains stubbornly attached to an idea proven to be false, it can cause tension. Consider a parent who firmly believes in an outdated myth about child-rearing. Their partner, equipped with modern research, might find it challenging to navigate discussions about parenting without conflict.

The mental strain of grappling with belief perseverance can be immense: Cognitive dissonance, or the discomfort one feels when confronted with conflicting beliefs or values, can lead to increased stress and anxiety. A person might experience inner turmoil when they discover new, scientifically-proven information that challenges their long-held beliefs about an important facet of their life. In this case, dismissing the new contradicting information may be emotionally easier than seriously considering it.

Moreover, rigidly sticking to outdated or incorrect beliefs can mean missing out on new and beneficial experiences. A refusal to adapt to technological advancements, for instance, can leave someone disconnected from the numerous advantages that modern innovation offers.

Systemic effects

Belief perseverance extends beyond individuals to shape the collective thinking of groups. Entire systems, whether in education, healthcare, the economy, or culture, can be steered off course when dominated by unfounded beliefs.

In educational contexts, belief perseverance can act as a barrier to progress. When educators remain attached to outdated teaching methodologies, convinced of their superiority despite emerging research, it jeopardizes the quality of education students receive. Additionally, teachers often plan lessons based on assumptions about their students’ comprehension level rather than taking the time to evaluate the class’s actual ability. In this case, the pacing of material may be too fast or too slow for what is age-appropriate, leaving students unengaged or struggling to keep up.3

In healthcare, the effects of belief perseverance can, at times, be perilous.4 Despite new findings and advancements, medical professionals who remain committed to outdated treatments or methods might compromise patient care. This could mean patients might not receive the most current and effective treatments available.

On an even larger scale, governments and economies can be significantly impacted when belief perseverance infiltrates policy-making.4 If leaders, driven by old or disproven political or financial beliefs, fail to address contemporary challenges, it can lead to economic downturns or even prolonged periods of civil unrest. For instance, politicians may persist in denying the impending impacts of climate change, despite the warnings of their advisors or the protests of civilians outside their doors.

How it affects product

Belief perseverance isn’t just confined to personal choices or societal constructs – it also finds its way into the realm of product development dynamics. When creators, marketers, and consumers unwaveringly stick to certain beliefs about a product, it can steer its trajectory off course.

Consider Michael, an inventor developing a new type of shoe designed to improve posture and reduce back pain. Despite early prototype testers reporting discomfort and little relief, Michael’s confidence in his initial research makes him persevere with his original design. He invests himself so deeply in his belief about the shoe's success that he overlooks the clear feedback from his team. The result? The shoe hits the market and is immediately swamped by poor sales and reviews, leading to significant financial losses and tarnishing Michael's reputation in the industry.

When developers or businesses cling to initial product concepts or marketing strategies without being receptive to feedback, they risk product failure. Michael’s mistake underscores the importance of iterative testing and genuine openness to feedback – products aren’t made through individual opinion alone.

Belief perseverance and AI

Belief perseverance can determine whether we readily reject or embrace AI, especially when developers or users hold steadfast to certain beliefs about its capabilities or limitations.

On the one hand, if we hold the preconceived notion that machine learning is limited, we might refute any evidence proving otherwise. For example, if you believe “AI can’t be creative” or “AI can’t write with style,” you may dismiss your colleague’s offer to show you how to prompt AI chatbots as a waste of time. Your stubbornness may leave you spending hours writing rough drafts of project reports while your coworkers are already editing their final versions.

On the other hand, we might over-rely on machine learning when convinced it is always superior to human judgment, even when evidence proves otherwise. After all, AI systems depend on user feedback to continue refining and improving their outputs; without this iterative process, their output may be lacking. So if a chatbot falls short on a first draft, your belief perseverance may encourage you to settle for its answer rather than continue to prompt the system with new writing samples.

Why it happens

Belief perseverance results from four interacting factors: causal thinking, cognitive dissonance, confirmation bias, and ego defense mechanisms.

Causal thinking

Causal thinking describes when we attribute cause and effect reasoning to our beliefs—for instance, denying global warming based on the explanation that human activities don’t impact the ozone layer. New research shows that we remember causal explanations independently from their original claim. This means that even when we learn our beliefs are wrong, we still hold tightly onto their explanations because we stored them so deeply in our memory.5, 6 So even if our friends presented us with numerous statistics that the planet’s temperature is going up every year, we may still persist in believing that humans don’t contribute to this average increase.

Cognitive dissonance

First introduced by psychologist Leon Festinger in 1957, cognitive dissonance theory posits that humans desire to maintain consistency in their beliefs and behaviors.15 When confronted with information that challenges our internal consistency, we fall into an uncomfortable state of dissonance. To reduce our discomfort, we often discount the new information and bolster our original ideas, strengthening belief perseverance.7 For example, if a person strictly sticking to a diet is presented with studies showing no significant benefits, they might dismiss the research as flawed to defend all of the hard work they’ve already put in.

Confirmation bias

The confirmation bias is our tendency to seek, interpret, and remember information in a way that confirms our preconceptions. Once we hold a belief, we are more likely to notice and remember instances that support that belief and ignore evidence to the contrary.7 Over time, confirmation bias helps us to reduce cognitive dissonance and strengthen belief perseverance by directing our attention towards confirming information and away from any evidence that contradicts it.1

Ego defense mechanism

Our beliefs often become intertwined with our self-identity. We may perceive recognizing that a long-held belief is incorrect as a personal failure, leading to feelings of inadequacy or insecurity. As a protective mechanism, our egos may prevent the acceptance of such contradictory evidence to preserve our self-esteem.3 Dr. Carol Tavris and Dr. Elliot Aronson, in their book Mistakes Were Made (But Not by Me), explain how the need to preserve our self-concept can lead to self-justification and a strengthening of our original beliefs.8

Why it is important

When individuals hold steadfastly to beliefs even after they are disproven, it can profoundly impact personal decision-making and societal progress.

On a personal level, clinging to outdated or incorrect beliefs can hinder growth, limit opportunities, and strain relationships, as individuals may become resistant to new perspectives or constructive feedback. In professional contexts, belief perseverance can impede innovation and lead to decisions based on outdated or flawed premises. In the realm of public discourse, it can interfere with constructive dialogue, polarizing groups and preventing the synthesis of diverse viewpoints. 

Recognizing the importance of belief perseverance underscores the need for self-awareness, open-mindedness, and a commitment to continuous learning and adaptation in our rapidly evolving world – we’ll never adapt if we’re unable to update our beliefs.

How to avoid it

Overcoming belief perseverance requires a combination of self-awareness, deliberate action, and embracing humility. It's not just about changing our own minds – it’s about fostering an environment in which everyone can be open to new evidence. Here are three strategies to sidestep the pitfalls of belief perseverance in daily life:

Foster critical thinking

Critical thinking begins with questioning our own beliefs and assumptions regularly. By actively seeking out and engaging with sources of information that challenge our pre-existing notions, we can keep our perspectives flexible. For example, reading a diverse range of literature, attending lectures, or participating in discussions outside of our comfort zone might help us open up our minds to new ways of seeing things. In short, try keeping the question "Why do I believe this?" at the forefront of your mind.

Seek feedback from diverse sources

Intentionally surrounding ourselves with a varied group of individuals—those from different backgrounds, professions, and worldviews—can offer a wealth of perspectives. Try encouraging the people in your life to give you honest feedback and be willing to listen without becoming defensive. No matter how difficult it is, openly accepting criticism from an external perspective can highlight biases or incorrect beliefs that we might never have recognized on our own.

Practice reflection

Setting aside time for reflection can help us identify where our ideas fall short. Over time, revisit your beliefs and assess if the evidence still holds or if new experiences and knowledge have provided a different angle. This self-evaluation process can help us recognize when our beliefs might be persisting in the face of contradictory evidence.

Incorporating these strategies into our everyday routines can safeguard us against the natural human tendency to cling to beliefs, even when faced with evidence to the contrary. Remember: the goal isn't to abandon our beliefs. It’s to ensure they remain based on the best available evidence.

How it all started 

Although not empirically explored until more recently, belief perseverance has ancient roots. Philosophers from ancient civilizations, including the Greeks and Chinese, often touched upon the stubbornness of human belief in their writings. Confucius, for instance, once said, "To see what is right and not do it is the want of courage."9 This proverb hints at the challenge of changing our behavior even when new insights come to light.

A significant thrust in the scientific study of belief perseverance emerged in the latter half of the 20th century. The first case study investigating this phenomenon was by psychologists Leon Festinger, Henry Riecken, and Stanley Schachter, who observed the resilient faith of doomsday cult members after their end-of-the-world prophecy failed.10

Later on, Stanford psychologists Lee Ross, Mark Lepper, and Michael Hubbard conducted the first controlled experiment on belief perseverance.11 In their landmark 1975 study, participants who received positive feedback about their performance on a task continued to report higher self-perceptions of their ability, even after learning that the feedback had been predetermined and bore no relation to their actual performance.12 In other words, knowing that the feedback didn’t apply specifically to them didn’t stop participants from using it as evidence to feel good about themselves.

This study illuminated the strength and tenacity of human beliefs, setting the stage for further investigations into the topic. Today, researchers continue to study belief perseverance using a similar design, in which participants learn information about a topic that they later discover is incorrect. Even so, participants tend to continue basing their opinions on this initial information, knowing it is false.12

Example 1 – The Titanic’s “unsinkable” myth

History provides numerous examples of belief perseverance, but few are as chilling as the tragic fate of the Royal Mail Ship (RMS) Titanic.13 Belief in the ship's invincibility was widespread. Many referred to the Titanic as "unsinkable" despite evidence of potential vulnerabilities. This unshakable faith in its construction and design led crucial safety measures to be overlooked, such as carrying an adequate number of lifeboats for all passengers.

When the ship struck an iceberg on its maiden voyage in April 1912, the "unsinkable" belief was shattered. If there had been a more skeptical approach and less adherence to the myth of its invulnerability, precautionary measures could have been more stringent. The disaster, leading to the loss of over 1,500 lives, serves as a stark reminder of the dangers of clinging to beliefs in the face of contradictory evidence.

Example 2 – The Dewey defeats Truman blunder 

The 1948 U.S. presidential election is a classic testament to belief perseverance. Most major pollsters and newspapers were convinced that the Republican candidate, Thomas E. Dewey, would easily defeat the incumbent, President Harry S. Truman. This belief was so deeply rooted that the Chicago Daily Tribune went ahead and printed the headline "Dewey Defeats Truman" on the front page of its early edition.14

As we now know, Truman ended up winning the election. However, the belief in Dewey's inevitable victory persisted even as early election results trickled in. The Tribune, still confident in its prediction based on earlier polls and political analyses, did not wait for the final tally. The iconic image of a triumphant Truman holding up the erroneous newspaper headline speaks volumes about the dangers of clinging to beliefs despite emerging evidence to the contrary.


What it is

Belief perseverance describes when we continue to hold onto our established beliefs even when faced with clear, contradictory evidence. We tend to prioritize our initial conclusions and resist changing our minds, even when it might be in our best interest to do so.

Why it happens

Belief perseverance results from a mixture of four factors. First, causal thinking encourages us to hold onto our initial explanations for beliefs in our memory. Second, cognitive dissonance makes us uncomfortable when we encounter evidence contradicting our beliefs. Third, confirmation bias propels us to dismiss any contrary evidence to preserve our initial ideas. Finally, our ego defense mechanisms sustain our instinct to be correct since we attach our beliefs to our self-identity.

Example 1 – The Titanic's "unsinkable" myth

The widespread public belief that the Titanic was “unsinkable” caused people to overlook precautionary safety measures, such as carrying a sufficient number of lifeboats. This is a tragic example of how belief perseverance on a systemic level can come at the cost of lives.

Example 2 – The Dewey defeats Truman blunder

In 1948, Truman ended up beating Dewey in the presidential election, despite pollsters and newspapers predicting the opposite. In fact, the Chicago Daily Tribune was so convinced that Dewey would win that it published the headline “Dewey Defeats Truman” before the final results were in.

How to avoid it

To avoid belief perseverance, we must foster an environment where everyone can be open to new evidence and ready to adapt. This process means engaging in critical thinking, seeking feedback from diverse sources, and practicing reflection about our established beliefs.

Related TDL articles


In part, belief perseverance is propelled by inertia, our preference for keeping things as they already are. We stick to the default option, whether in our beliefs or our behaviors. Read this article to learn more about what differentiates belief perseverance from inertia, and the different scenarios in which we might succumb to inertia.

10 Decision-Making Errors that Hold Us Back at Work

In this article, Melina Moleskis names belief perseverance as one of the cognitive biases that interfere with information processing in the workplace. Read on to learn more about the other types of decision-making errors we might encounter during our careers, especially when problems are not clearly defined.


  1. Siebert, J., & Siebert, J. U. (2023). Effective mitigation of the belief perseverance bias after the retraction of misinformation: Awareness training and counter-speech. PloS one, 18(3), e0282202. https://doi.org/10.1371/journal.pone.0282202
  2. Laythe, B. R. (2006). Conflict and threat between pre-existing groups: An application of identity to bias, persuasion and belief perseverance. Doctoral Dissertations. 354. https://scholars.unh.edu/dissertation/354 
  3. Savion, Leah (2009). Clinging to discredited beliefs: the larger cognitive story. Journal of the Scholarship of Teaching and Learning, 9(1), 81-92. https://files.eric.ed.gov/fulltext/EJ854880.pdf
  4. S., S. M. (2023, May 1). Belief perseverance. Psychology Dictionary. https://psychologydictionary.org/belief-perseverance/ 
  5. Anderson, C. A. (1989). Causal reasoning and belief perseverance. Proceedings of the Society for Consumer Psychology.
  6. Anderson, C. A., Lepper, M. R., & Ross, L. (1980). Perseverance of social theories: The role of explanation in the persistence of discredited information. Journal of Personality and Social Psychology, 39(6), 1037–1049. https://doi.org/10.1037/h0077720 
  7. Maegherman, E., Ask, K., Horselenberg, R., & van Koppen, P. J. (2021). Law and order effects: on cognitive dissonance and belief perseverance. Psychiatry, Psychology and Law, 1-20. https://doi.org/10.1080/13218719.2020.1855268 
  8. Tavris, C., & Aronson, E. (2007). Mistakes were made (but not by me): Why we justify foolish beliefs, bad decisions, and hurtful acts. Harcourt.
  9. Confucius quotes. BrainyQuote. (n.d.). https://www.brainyquote.com/quotes/confucius_141561 
  10. Festinger, L., Riecken, H. W., & Schachter, S. (1956). When prophecy fails. University of Minnesota Press. https://doi.org/10.1037/10030-000
  11. Ross, L., Lepper, M. R., & Hubbard, M. (1975). Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32(5), 880–892. https://doi.org/10.1037/0022-3514.32.5.880
  12. McFarland, C., Cheam, A., & Buehler, R. (2007). The perseverance effect in the debriefing paradigm: Replication and extension. Journal of Experimental Social Psychology, 43(2), 233-240. https://doi.org/10.1016/j.jesp.2006.01.010
  13. Tikkanen, A. (2023, July 6). Titanic. Encyclopædia Britannica. https://www.britannica.com/topic/Titanic 
  14. Jones, T. (2020, October 31). Dewey defeats Truman: The most famous wrong call in electoral history. Chicago Tribune. https://www.chicagotribune.com/featured/sns-dewey-defeats-truman-1942-20201031-5kkw5lpdavejpf4mx5k2pr7trm-story.html  
  15. Festinger, L. (1957). A theory of cognitive dissonance. Stanford University Press.

About the Authors


Dan Pilat

Dan is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. Dan has a background in organizational decision making, with a BComm in Decision & Information Systems from McGill University. He has worked on enterprise-level behavioral architecture at TD Securities and BMO Capital Markets, where he advised management on the implementation of systems processing billions of dollars per week. Driven by an appetite for the latest in technology, Dan created a course on business intelligence and lectured at McGill University, and has applied behavioral science to topics such as augmented and virtual reality.


Dr. Sekoul Krastev

Sekoul is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. A decision scientist with a PhD in Decision Neuroscience from McGill University, Sekoul's work has been featured in peer-reviewed journals and has been presented at conferences around the world. Sekoul previously advised management on innovation and engagement strategy at The Boston Consulting Group as well as on online media strategy at Google. He has a deep interest in the applications of behavioral science to new technology and has published on these topics in places such as the Huffington Post and Strategy & Business.
