There has not been a time in recent history when the truth has mattered more than today. As governments, the medical system, and global citizens grapple with misinformation surrounding the economic and health costs of COVID-19, knowing what information to trust is now a matter of life and death, and helping people separate fact from fiction is critical.
False beliefs can be stubborn, and popular fact-correction strategies such as the myth-versus-fact format may actually backfire.1 Take, for example, the flesh-eating bananas hoax of 2000. Stories of bananas causing a flesh-eating disease spread like wildfire via emails, text messages, and word of mouth. The Centers for Disease Control and Prevention (CDC) set up a hotline to counter the misinformation and assure worried Americans that bananas were perfectly safe. Although the story was clearly a hoax, the CDC's efforts in fact lent it credibility and even increased some people's acceptance of it, so much so that similar stories were still doing the rounds a decade later.2,3
The cognitive sciences suggest that we have information-processing blind spots that make us susceptible to believing false information. When we encounter a claim, we evaluate its truth by focusing on a limited number of criteria, asking ourselves at least one of the following five questions:3,4
1. Do Others Believe It?
We tend to turn to social consensus to evaluate what is likely to be correct. Research shows that we are more confident in our beliefs if others share them, and we trust our memories more if others remember events the same way.5 To gauge consensus, we turn to external resources, or we simply ask ourselves how often we have heard a belief. Chances are that we are exposed more frequently to widely shared beliefs than to beliefs held by few people.6 The popularity of a belief is actually a poor measure of its veracity, and, to complicate matters, we do a poor job of tracking how often we have heard something and from whom. So we end up relying on messages that feel familiar. Small but vocal groups can exploit this through the illusory truth effect: the more they repeat their message, the more familiar it feels, giving the impression of wide social acceptance when in reality there is none.
2. Is There Much Evidence to Substantiate It?
It is not surprising that we are more likely to believe something when there is evidence to support it. Sometimes we look for that evidence in peer-reviewed scientific articles, news reports, and other sources we trust. More often, though, we take a less taxing and speedier approach, making a judgment based on how easy it is to retrieve or obtain some evidence. When recalling evidence feels difficult, for example, we conclude that there is less of it, regardless of how much evidence is actually out there.8 This is an example of the availability heuristic, a bias that can have a profound impact on human decision making.5
3. Is It Compatible with What I Believe?7
We are inclined to believe things that are consistent with our existing beliefs and knowledge. When something contradicts them, we stumble. This shows up even in simple tasks: we take longer to read a text we disagree with, and we experience negative feelings while doing so. So it is possible that we believe false claims simply because they are more compatible with what we already believe.9,10 This is a particular case of cognitive dissonance, in which we may rationalize a belief we know to be false by changing our other beliefs and cognitions.
4. Does It Tell a Good Story?
Who doesn’t like a coherent story? When details are presented as part of a narrative, and the individual elements fit together in a coherent frame, we are more likely to think they are true.8 Research suggests that we react positively to efforts that improve the coherence of the information we receive.11
5. Does It Come from a Credible Source?
Indeed, we are more likely to accept information from what we believe to be a credible source.12 We evaluate credibility by looking at the source’s expertise, past statements, and likely motives. And, as expected, the familiarity of the source matters: even repeatedly seeing a face is enough to significantly increase perceptions of honesty, sincerity, and general agreement with what that person says. More surprisingly, even the ease of pronouncing a speaker’s name influences credibility. A 2010 study demonstrated that people are more likely to believe statements made in a familiar, easy-to-understand accent than in one that is difficult to understand.13
How fake news takes hold
We know that humans process information imperfectly, so it is no surprise that fake news takes hold so easily. The myth-versus-fact format we most often adopt to combat fake news isn’t working. A growing number of studies show that this strategy can have unintended consequences: increasing acceptance of false beliefs, spreading them to new segments of the population, and creating the perception that these false beliefs are widely shared.14 As the case of the flesh-eating bananas shows, merely knowing that there might be some controversy about a fact seems to undermine people’s belief in the truth.