How do we gauge whether a claim is true or false? Naturally, you would think, we use our existing base of knowledge, and maybe a couple of well-placed Google searches, to compare that claim to the available evidence. No rational person would accept a statement as true without first holding it up to the light and critically examining it, right?
Unfortunately, humans are rarely rational beings. Every single day, we make an average of 35,000 decisions.19 With all of those choices to make, and the huge volume of information that is coming at us every second, we can’t possibly hope to process everything as deeply as we might like.
To conserve our limited mental energy, we rely on countless shortcuts, known as heuristics, to make sense of the world, and this can often lead us to make errors in our judgment. There are a few fundamental heuristics and biases that underlie the illusory truth effect.
We are often cognitively lazy
According to the renowned psychologist Daniel Kahneman, there are two thinking systems in our brains. System 1 is fast and automatic, working without our awareness; meanwhile, System 2 handles deeper, more effortful processing, and is under our conscious control.1 System 2, since it’s doing the harder work, drains more of our cognitive resources; engaging it feels effortful and taxing, which we don’t like. So, wherever possible, we prefer to rely on System 1 (even if we don’t realize that’s what we’re doing).
This preference for easy processing (also known as processing fluency) is more deeply rooted than many of us realize. In one experiment, participants were shown images on a screen while researchers measured the movements of the muscles in their faces. Some of the images were made easier to process by having their outlines appear before the rest of the picture—only by a fraction of a second, so briefly that participants didn’t consciously realize it was happening. Still, when processing was made easier in this way, people’s brows relaxed, and they even smiled slightly.1 Processing fluency even has implications in the business world: stocks with pronounceable ticker symbols (for example, KAR) consistently outperform stocks with unpronounceable ones (such as PXG).
The problem with processing fluency is that it can influence our judgments about the accuracy of a claim. If it’s relatively effortless to process a piece of information, it makes us feel like it must be accurate. Consider another experiment, where participants were given problems that were deliberately designed to trip people up. For example:
If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?
100 minutes OR 5 minutes
Most of us, when we read this problem, intuitively want to say 100 minutes—but it’s actually 5. For some of the participants in this study, researchers made the questions more difficult to read, by presenting them in a barely-legible, small, gray font. When they did this, participants actually performed better—because they had to engage their more effortful System 2 thinking in order to parse the question. Meanwhile, when the problems were written in a normal, easy-to-read font, people were more likely to go with their (incorrect) intuition.1
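The widget puzzle yields to a quick rate calculation, which System 2 handles easily once engaged. A minimal sketch of that arithmetic (the variable names are ours, for illustration only):

```python
# Each machine's output rate: 5 widgets / (5 machines * 5 minutes)
widgets, machines, minutes = 5, 5, 5
rate_per_machine = widgets / (machines * minutes)  # 0.2 widgets per machine-minute

# Time for 100 machines to produce 100 widgets:
time_needed = 100 / (100 * rate_per_machine)
print(time_needed)  # 5.0 minutes, not 100
```

The key insight is that each machine makes one widget every 5 minutes regardless of how many machines are running, so scaling up machines and widgets together leaves the time unchanged.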
Familiarity makes processing easy
What does processing fluency have to do with the illusory truth effect? The answer lies with familiarity. When we’re repeatedly exposed to the same information—even if it’s meaningless, or if we aren’t consciously aware that we’ve seen it before—it gradually becomes easier for us to process. And as we’ve seen, the less effort we have to expend to process something, the more positively we feel about that thing. This gives rise to the mere exposure effect, which describes how people feel more positively about things they’ve encountered before, even very briefly.
In a classic experiment illustrating the mere exposure effect, Robert Zajonc took out ads in student newspapers at two Michigan universities over a period of several weeks. Every day, the front page of each paper featured one or more Turkish words. Some words appeared more frequently than others, and the frequency of each word was also reversed between the two papers, so that the most-frequently appearing word in one paper would be the least-frequently appearing one in the other.
After the exposure period was over, Zajonc sent out a questionnaire to both communities, asking respondents to give their impressions of 12 “unfamiliar” words. Some of the words were the Turkish words that had run in the newspapers. Participants rated each word from 1 to 7 based on whether they thought the word meant something “good” or “bad.” The results showed that, the more frequently participants had been exposed to a given word, the more positively they felt about it.1,2
For a long time, psychologists (reasonably) believed that ease of processing only mattered in situations where we lack knowledge about something—that we use it as a sort of last-ditch tool for coming to a conclusion. Unfortunately, the evidence suggests that things work the other way around: ease of processing is our go-to tool for judging whether something is true, and only when that fails do we turn to our knowledge for help. One study found that college students fell for the illusory truth effect even when a subsequent knowledge test showed that they knew the correct answers—a phenomenon known as knowledge neglect.3