There has not been a time in recent history when the truth has mattered more than today. As governments, the medical system, and global citizens grapple with misinformation surrounding the economic and health costs of COVID-19, knowing what information to trust is now a matter of life and death, and helping people separate fact from fiction is critical.
False beliefs can be stubborn, and popular fact-correction strategies such as the myth-versus-fact format may actually backfire.1 Take, for example, the flesh-eating bananas hoax of 2000. Stories of bananas causing a flesh-eating disease spread like wildfire via emails, text messages, and word of mouth. The Centers for Disease Control and Prevention (CDC) set up a hotline to counter the misinformation and to assure worried Americans that bananas were perfectly safe. Although the story was clearly a hoax, the CDC's efforts in fact lent it credibility and even increased some people's acceptance of it, so much so that similar stories were still doing the rounds a decade later.2,3
The cognitive sciences suggest that we have information processing blind spots that make us susceptible to believing false information. When we encounter a claim, we evaluate its truth by focusing on a limited number of criteria, asking ourselves at least one of the following five questions:3,4
1. Do Others Believe It?
We tend to turn to social consensus to evaluate what is likely to be correct. Research shows that we are more confident in our beliefs if others share them, and we trust our memories more if others remember events the same way.5 In order to gauge consensus, we turn to external resources, or we simply ask ourselves how often we have heard this belief. Chances are that we are more frequently exposed to widely shared beliefs than to beliefs that are held by few people.6 The popularity of a belief is actually quite a poor measure of veracity, and, to complicate this, we tend to do a poor job at tracking how often we have heard something and from whom. So, we end up relying on messages that feel familiar. Small but vocal groups can take great advantage of this situation by employing the illusory truth effect: the more they repeat their message, the more familiar it feels, giving the impression that there is wide social acceptance — when really there isn’t any at all.
2. Is There Much Evidence to Substantiate It?
It is not surprising that we are more likely to believe something when there is evidence to support it. Sometimes we look for evidence in peer-reviewed scientific articles, news reports, and other sources we trust. More often, though, we take a far less taxing and speedier approach, judging a claim by how easy it is to retrieve or obtain some pieces of evidence. For example, when recalling evidence feels difficult, we often conclude that there is less of it, regardless of how much evidence is actually out there.8 This is an example of the availability heuristic, which can have a profound impact on human decision making.5
3. Is It Compatible with What I Believe?7
We are inclined to believe things that are consistent with our own beliefs and knowledge. When something is inconsistent with our existing beliefs, we stumble. This shows up even in simple tasks — we take longer to read a text that we disagree with, and experience negative feelings while doing so. So, it is possible that we believe in false facts simply because they are more compatible with what we already believe.9,10 This is a particular case of cognitive dissonance, where we might try to rationalize our belief of what is known to be false by changing our other beliefs and cognitions.
4. Does It Tell a Good Story?
Who doesn’t like a coherent story? When details are presented as part of a narrative, and individual elements fit together in a coherent frame, we are more likely to think that they are true.8 Research suggests that we react positively to efforts that help improve the coherence of the information we get.11
5. Does It Come from a Credible Source?
Unsurprisingly, we are more likely to accept information from what we believe to be a credible source.12 We evaluate credibility by looking at the source's expertise, past statements, and likely motives. And, as expected, the familiarity of the source matters: merely seeing a face repeatedly is enough to significantly increase perceptions of honesty, sincerity, and general agreement with what that person says. More surprisingly, even the ease of pronouncing a speaker's name influences credibility. A 2010 study demonstrated that people are more likely to believe statements delivered in a familiar, easy-to-understand accent than in one that is difficult to understand.13
How fake news takes hold
We know that we humans process information imperfectly, so it is no surprise that fake news takes hold so easily. The myth-versus-fact format most often adopted to combat fake news is not working. A growing number of studies show that this strategy can have unintended consequences: increasing the acceptance of false beliefs, spreading them to new segments of the population, and creating the perception that these false beliefs are widely shared.14 As seen in the case of the flesh-eating bananas, merely knowing that there might be some controversy about a fact seems to undermine people's beliefs about the truth.
The perfect example of this is the debate on the efficacy of vaccines.15 The anti-vaccine movement owes much of its origin to a paper published in The Lancet, a highly prestigious peer-reviewed general medical journal. This paper, which linked the Measles Mumps Rubella (MMR) vaccine to autism, managed to ignite fierce debate about the supposed relationship between vaccines and autism, all despite the eventual retraction of the paper in 2010. Even though several scientists have since debunked the study — in addition to the author being charged with misconduct and barred from practicing medicine in the UK — some still subscribe to the belief that vaccines cause autism to this day.16
How then can we fight the uptake of false information? Recent research suggests that some simple tactics can be effective:17
- Ideally, ignore false information, and repeat the correct information.
- Remove anecdotes and photos from communications about false information, as they serve only to capture attention, boost comprehension, and enhance acceptance of the false claim.
- Make communication as clear and as simple as possible.
- Make information accessible through clear, step-by-step exposition and avoidance of jargon.
- Keep the public informed — one of the most powerful strategies for avoiding misinformation is knowing that it is coming.
It is true that ours is not the first age of widespread falsehoods. These are early days in the war against fake news and false information, and we will keep finding effective ways to fight them, including the use of technology and artificial intelligence. While we may never fully quash the scourge of false information and fake news, we will surely find new ways to put up a solid fight.
1 Schwarz, N., Newman, E., & Leach, W. (2016). Making the truth stick & the myths fade: Lessons from cognitive psychology. Behavioral Science & Policy, 2(1), pp. 85–95.
2 Fragale, A. R., & Heath, C. (2004). Evolving informational credentials: The (mis)attribution of believable facts to credible sources. Personality and Social Psychology Bulletin, 30, 225–236.
3 Conway-Smith, E. (2011). Mozambique: “Flesh-eating bananas” hoax goes viral. Public Radio International. Retrieved from https://www.pri.org/
4 Schwarz, N. (2015). Metacognition. In M. Mikulincer, P. R. Shaver, E. Borgida, & J. A. Bargh (Eds.), APA handbook of personality and social psychology: Attitudes and social cognition (Vol. 1, pp. 203–229). Washington, DC: American Psychological Association.
5 Festinger, L. (1954). A theory of social comparison processes. Human Relations, 7, 117–140.
6 Visser, P. S., & Mirabile, R. R. (2004). Attitudes in the social context: The impact of social network composition on individual-level attitude strength. Journal of Personality and Social Psychology, 87, 779–795.
7 Foster, J. L., Huthwaite, T., Yesberg, J. A., Garry, M., & Loftus, E. F. (2012). Repetition, not number of sources, increases both susceptibility to misinformation and confidence in the accuracy of eyewitnesses. Acta Psychologica, 139, 320–326.
8 Schwarz, N., Newman, E., & Leach, W. (2016). Making the truth stick & the myths fade: Lessons from cognitive psychology. Behavioral Science & Policy, 2(1), pp. 85–95.
9 Wyer, R. S. (1974). Cognitive organization and change: An information processing approach. Potomac, MD: Erlbaum.
10 Edwards, K., & Smith, E. E. (1996). A disconfirmation bias in the evaluation of arguments. Journal of Personality and Social Psychology, 71, 5–24.
11 Johnson-Laird, P. N. (2012). Inference with mental models. In K. Holyoak & R. G. Morrison (Eds.), The Oxford handbook of thinking and reasoning (pp. 134–145). New York, NY: Oxford University Press.
12 Eagly, A. H., & Chaiken, S. (1993). The psychology of attitudes. Orlando, FL: Harcourt Brace Jovanovich College.
13 Lev-Ari, S., & Keysar, B. (2010). Why don’t we believe non-native speakers? The influence of accent on credibility. Journal of Experimental Social Psychology, 46, 1093–1096.
14 Skurnik, I., Yoon, C., Park, D. C., & Schwarz, N. (2005). How warnings about false claims become recommendations. Journal of Consumer Research, 31, 713–724.
15 Ayoob, K. T., Duyff, R. L., & Quagliani, D. (2002). Position of the American Dietetic Association: Food and nutrition misinformation. Journal of the American Dietetic Association, 102, 260–266.
16 Chou, V. To vaccinate or not to vaccinate? Searching for a verdict in the vaccination debate. Science in the News, Harvard University. Retrieved from https://sitn.hms.harvard.edu
17 Schwarz, N., Newman, E., & Leach, W. (2016). Making the truth stick & the myths fade: Lessons from cognitive psychology. Behavioral Science & Policy, 2(1), pp. 85–95.
About the Author
Siddharth’s diverse education and experience feed his interest in the applicability of behavioral science in understanding our world and solving big problems. His work encompasses international development, consulting, finance, and social innovation. Apart from an MPA from Harvard University, he also has graduate degrees in Political Theory, Human Rights Law, Management, and Economics.