What is Confirmation Bias?
The confirmation bias describes our underlying tendency to notice, focus on, and give greater credence to evidence that fits with our existing beliefs.
Consider the following hypothetical situation: Jane is the manager of a local coffee shop. She is a firm believer in the motto ‘hard work equals success.’ The coffee shop, however, has seen a slump in sales for the past few months. Because of her belief in the effectiveness of ‘hard work’ as a means to success, she concludes that it is because her staff is not working hard enough.
This makes sense, as she did recently catch several employees taking extended lunch breaks. Jane consequently decides to extend the store’s business hours and threatens to dismiss any employee she sees slacking. Despite these efforts, coffee sales do not improve, and the shop is now spending more on employee wages.
She then decides to consult with other coffee shop managers in the area, who identify her store’s new, less visible location as the cause of her sales slump. Jane’s belief in hard work as the most important metric of success led her to mistakenly identify employees’ lack of effort as the reason for the store’s falling revenue while ignoring evidence that pointed to the true cause: the shop’s poor location. This is a result of the confirmation bias, which caused Jane to notice and give greater credence to evidence that fit with her pre-existing beliefs.
This bias can lead us to make poor decisions because it distorts the reality from which we draw evidence. Under experimental conditions, decision-makers have a tendency to actively seek out and assign greater value to evidence confirming their existing beliefs rather than entertaining new ones. This can be considered a form of bias in evidence collection. Conclusions drawn from biased evidence are more likely to be false than those based on a more objective body of evidence, because they are farther removed from reality.
In the aggregate, individual confirmation bias can have troubling implications. If each of us is so deeply entrenched in our preconceptions that we only consider evidence which supports them, broader socio-political cooperation that often requires us to consider other points of view and their corresponding facts can be hindered. This suggests that much of the social divide and stalled policy making we see today may begin with our tendency to favour information that confirms our existing beliefs and ignore evidence which does not.
Confirmation bias is a cognitive shortcut we use when gathering and interpreting information. Evaluating evidence takes time and energy, and so our brain looks for such shortcuts to make the process more efficient.
These shortcuts are called “heuristics.” There is some debate surrounding whether or not confirmation bias can be formally categorized as a heuristic — but one thing is certain: it is a cognitive strategy that we use to look for evidence that best supports our hypotheses, and the most readily available hypotheses are the ones we already have.
It makes sense that we do this. People need to make sense of information quickly, and forming new explanations or beliefs takes time. We have adapted to take the path of least resistance, often out of necessity.
Imagine our ancestors hunting. An angered animal is charging towards them, and they have only a few seconds to decide whether to hold their ground or flee. There is no time to consider all the variables that would go into a fully informed decision. Past experience and instinct might lead them to judge the size of the animal and flee, when in fact the presence of another hunting group tilts the odds of a successful confrontation in their favour. Many evolutionary scientists have pointed out that our use of these shortcuts to make quick decisions in the modern world is rooted in survival instincts.¹
Another reason why we sometimes show confirmation bias is that it protects our self-esteem.
No one likes feeling bad about themselves — and realizing that a belief we valued is false can have this effect. Deeply held views often form our identities, and so disproving them can sometimes be deeply painful. Other times, it can suggest that we lack intelligence. As a result, we often look for information that supports rather than disproves our existing beliefs.2
This can also explain why confirmation bias extends to groups. In an influential 2002 peer-reviewed paper, Jennifer Lerner, a social psychologist, and Philip Tetlock, a political psychologist, posit that when people interact with others whose views are known to them, they tend to adopt a similar position, which they then seek to confirm in order to better fit into the group.
They call this “confirmatory thought” which is said to involve “a one-sided attempt to rationalize a particular point of view.” This is juxtaposed with “exploratory thought,” which entails “even-handed consideration of alternative points of view.”3 Confirmatory thought in an interpersonal setting can produce “groupthink,” in which the desire for conformity in the group results in dysfunctional decision making. So, while confirmation bias is often an individual phenomenon, it can also take place in groups of people.
As mentioned above, confirmation bias can be expressed individually or in a group context. Both can be problematic and deserve careful attention.
At the individual level, the bias affects our decision-making. Our decisions cannot be fully informed if we focus only on evidence that confirms our assumptions. It can cause us to overlook pivotal information both in our careers and in everyday life. A poorly informed decision is more likely to produce suboptimal results because it fails to take stock of the environment in which it is made. A voter might stand by a candidate "right or wrong" while dismissing emerging facts about the candidate's behavior. A business executive might fail to investigate a new opportunity because of preconceived notions formed during a past engagement with similar ideas. Someone who sustains this sort of thinking may also be labeled 'closed-minded.' It is best to approach situations, and the decisions they call for, with an open mind that is aware of confirmation bias.
At a group level, it can produce and sustain the aforementioned “groupthink” phenomenon. In a culture of groupthink, the bias can hinder group decision making by contributing to the assumption that harmony and group coherence are the values most crucial to success. This reduces the likelihood of disagreement within the group and the sometimes essential reassessments it can spark.
Imagine if an employee at a technology company did not disclose a revolutionary discovery she made for fear of reorienting the firm’s direction. Likewise, this bias can prevent people from becoming informed on the differing views of their fellow citizens, and by extension, engaging in the constructive discussion that many democracies are built on.
When we make decisions, this bias is most likely to occur when we are gathering information. It is also likely to occur subconsciously, meaning that we are probably unaware of its influence on our decision making.
As such, the first step to avoiding confirmation bias is being aware that it is a problem. By understanding its effect and how it works, we are more likely to identify it in our decision making. Psychology professor and author Robert Cialdini suggests several approaches to recognizing when these biases are influencing our decision making:
Because the bias is most likely to occur early in the decision making process, we should focus on starting with a neutral fact base. This can be achieved by having one (or ideally, multiple) third parties gather facts to form a more objective body of information.4
In addition, when hypotheses are being drawn from the assembled data, decision makers should consider holding interpersonal discussions that explicitly aim to identify individual cognitive biases in hypothesis selection and evaluation. While it is likely impossible to eliminate confirmation bias completely, these measures may help us manage it and make better decisions in spite of it.
Confirmation bias was known to the ancient Greeks and was written about by the classical historian Thucydides, who observed that people “entrust to careless hope” what they wish to be true. By contrast, they “use […] reason to thrust aside” what they do not.
The phenomenon was first described as confirmation bias by Peter Wason in 1960. In what is known as Wason's Rule Discovery Test, he conducted an experiment in which participants were asked to discover a rule that applied to a series of three numbers. They were told that the sequence '2-4-6' satisfied this rule. To find out what the rule was, they could propose other sets of numbers to see if those, too, satisfied it. An examiner would tell them whether each proposed set satisfied the rule or not.
Most subjects hypothesized that the rule was a sequence of even numbers, and tested this hypothesis by proposing other sequences of even numbers. However, this was not the rule Wason had in mind. The rule was simply that the numbers in the set were increasing.
The experiment showed that most subjects formed a similar hypothesis and only tried number sequences that proved it rather than considering sequences that disproved it.4 They looked to confirm their own rule instead of breaking it.
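The logic of the experiment can be sketched in a few lines of code (a hypothetical reconstruction; the function names and test sequences here are illustrative, not from Wason's materials). It shows why confirming tests fail: every sequence chosen to fit the "even numbers" hypothesis also happens to satisfy the real rule, so the wrong hypothesis is never challenged, while a single disconfirming test exposes it.

```python
def actual_rule(seq):
    """Wason's hidden rule: the numbers are strictly increasing."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def subjects_hypothesis(seq):
    """The rule most subjects guessed: all numbers are even."""
    return all(n % 2 == 0 for n in seq)

# Confirming tests: sequences chosen to FIT the hypothesis.
# Each one passes, so subjects hear "yes" every time and grow
# more confident in a rule that is actually wrong.
for seq in [(8, 10, 12), (20, 22, 24), (100, 102, 104)]:
    assert subjects_hypothesis(seq) and actual_rule(seq)

# A disconfirming test: a sequence that VIOLATES the hypothesis.
# It still satisfies the actual rule, falsifying "even numbers"
# in a single trial.
assert not subjects_hypothesis((1, 3, 5))
assert actual_rule((1, 3, 5))  # odd, yet increasing: rule holds
```

The asymmetry is the whole point: a "yes" to a confirming test is consistent with many possible rules, but a "yes" to a hypothesis-violating test can eliminate the hypothesis outright.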
A major study carried out by researchers at Stanford University in 1979 explored the psychological dynamics of confirmation bias. The researchers recruited undergraduate students who held opposing viewpoints on the topic of capital punishment, and tasked them with evaluating two fictitious studies on the topic.
One of the false studies given to participants provided data in support of the argument that capital punishment deters crime, while the other supported the opposite view (that capital punishment had no appreciable effect on overall criminality in the population).
While both studies were entirely fabricated by the Stanford researchers, they were designed to present "equally compelling" objective statistics. The researchers discovered that responses to the studies broke down along the lines of participants' pre-existing opinions.
So, after being confronted both with evidence that supported capital punishment and evidence that refuted it, both groups reported feeling more committed to their original stance. The net effect of having their position challenged was a re-entrenchment of their existing beliefs.5
The “filter bubble effect” is an example of technology amplifying and facilitating our cognitive tendency toward confirmation bias. The term was coined by internet activist Eli Pariser to describe the intellectual isolation that can occur when websites use algorithms to predict the information a user would want to see, and then provide information to the user according to this prediction.7
This means that as we use particular websites and content networks, those networks become more likely to serve us content that we prefer, while excluding content that our browsing patterns suggest runs contrary to our preferences. We normally prefer content that confirms our beliefs because it requires less critical reflection. So, filter bubbles may favour information that confirms your existing opinions and exclude disconfirming evidence from your online experience.
In his seminal book, “The Filter Bubble: What the Internet Is Hiding from You”, Pariser uses the example of internet searches for an oil spill to show the filter bubble effect:
“In the spring of 2010, while the remains of the Deepwater Horizon oil rig were spewing crude oil into the Gulf of Mexico, I asked two friends to search for the term ‘BP’. They’re pretty similar — educated white left-leaning women who live in the Northeast. But the results they saw were quite different. One of my friends saw investment information about BP. The other saw news. For one, the first page of results contained links about the oil spill; for the other there was nothing about it except for a promotional ad from BP.”8
If this were the only source of information these women were exposed to, they would surely have formed very different conceptions of the BP oil spill. The search engine showed each of them information tailored to the preferences their past searches revealed, and which was accordingly predicted to fit the reaction they would have to the oil spill. Unbeknownst to them, it therefore facilitated confirmation bias.
While the implications of this particular filter bubble may have been harmless, filter bubbles on social media platforms have been shown to influence elections by tailoring the content of campaign messages and political news to different subsets of voters.9 This could have a fragmenting effect that inhibits constructive democratic discussion, as different voter demographics become increasingly entrenched in their political views as a result of a curated stream of evidence that supports them.
Confirmation bias describes our underlying tendency to notice, focus on, and give greater credence to evidence that fits with our existing beliefs.
Confirmation bias is a cognitive shortcut we use when gathering and interpreting information. Evaluating evidence takes time and energy, and so our brain looks for such shortcuts to make the process more efficient. We look for evidence that best supports our existing hypotheses because the most readily available hypotheses are the ones we already have. Another reason why we sometimes show confirmation bias is that it protects our self-esteem. No one likes feeling bad about themselves, and realizing that a belief we valued is false can have this effect. As a result, we often look for information that supports rather than disproves our existing beliefs.
A 1979 study by Stanford researchers found that after being confronted with equally compelling evidence in support of capital punishment and evidence that refuted it, subjects reported feeling more committed to their original stance on the issue. The net effect of having their position challenged was a re-entrenchment of their existing beliefs.
Modern preference algorithms have a “filter bubble effect,” which is an example of technology amplifying and facilitating our cognitive tendency toward confirmation bias. Websites use algorithms to predict the information a user would want to see, and then provide information to the user according to this prediction. We normally prefer content that confirms our beliefs because it requires less critical reflection. So, filter bubbles might exclude information that clashes with your existing opinions from your online experience. This use of filter bubbles and the confirmation bias they can produce in subjects has been shown to influence elections and may inhibit the constructive discussion democracy rests on.
Confirmation bias is most likely to occur when we are gathering the information needed to make decisions. It is also likely to occur subconsciously, meaning that we are most likely unaware of its influence on our decision making. As such, the first step to avoiding confirmation bias is being aware that it is a problem. Because confirmation bias is most likely to occur early in the decision making process, we should also focus on starting with a neutral fact base. This can be achieved by having one (or ideally, multiple) third parties who gather facts to form a more objective body of information.10
This article argues that gender diversity in a firm is associated with higher firm performance. By addressing and drawing on confirmation bias (among other relevant psychological principles), firms may be able to increase diversity and thereby increase performance.
This article argues that the use of ‘trigger warnings’, modern preference algorithms, and other such cues creates a highly curated stream of information that facilitates cognitive biases such as confirmation bias. The author notes that this can prevent us from empathizing with others and from revising our opinions in light of differing ones.