Biases

The Basic Idea

As the world of behavioral science continues to attract the attention of organizations in both the public and private sectors, the term “bias” has become commonplace. A number of heuristics, biases, and fallacies have cropped up in the media and popular science, arguably bringing the concept to buzzword status. In addition to its prominence, it can also seem like there are simply too many biases to keep track of: Wikipedia’s list of cognitive biases contains nearly 200 entries. To prevent information overload, it can help to step back and take a high-level view of cognitive bias to better understand why all these bizarre effects might occur in the first place.

For starters, a bias is essentially a structured error in mental processing that leads to a conclusion that goes against formal logic or normative rationality. In other words, it is a pattern of thinking that can lead one astray; a psychological blind spot.

The confidence people have in their beliefs is not a measure of the quality of evidence but of the coherence of the story the mind has managed to construct.

– Daniel Kahneman

Key Terms

Rationality: Adherence to logic and reason. When applied to a decision-making context, rationality refers to the ability to make judgments that are optimal in relation to one’s preferences.

Heuristic: A mental shortcut that allows the brain to reach a conclusion without explicit deliberation. Though efficient, heuristics can lead to errors, and many cognitive biases reflect the heuristic process that produced a given error.

History

The notion of human bias has been around for centuries, with proverbial phrases such as “to err is human” ubiquitous in folk wisdom across the globe. In the academic field of decision-making – which spans both economics and psychology – the study of biases and heuristics began in 1969, during the meetings of the Mathematical Psychology Society and the American Psychological Association.1 Two Israeli psychologists, Daniel Kahneman and Amos Tversky, administered a short questionnaire on hypothetical research decisions to expert guests at these meetings.2 What Kahneman and Tversky found was that many of these psychologists’ intuitive responses were wrong. While most of them certainly could have calculated the correct answers with pen and paper, a bias persisted in the guesses of these sophisticated participants. This bias would later be identified as the representativeness heuristic, whereby probability judgments are influenced by perceptions of similarity. Put differently, when we assess the probability that A belongs to B, we are guided by whether A “looks like” B. This idea that people sometimes answer a difficult question by substituting an easier one for it would become the basis of cognitive heuristics.
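To see how judging by similarity can conflict with formal probability, consider a small illustrative calculation (the scenario and all numbers below are hypothetical, chosen only to make the arithmetic concrete). Even when a description strongly “looks like” a member of a rare group, Bayes’ rule can point the other way once base rates are taken into account:

```python
# Hypothetical scenario: a personality description strongly "fits" a
# librarian, but librarians are far rarer than salespeople.
p_librarian = 0.02    # assumed base rate of librarians in the population
p_salesperson = 0.98  # assumed base rate of salespeople

p_desc_given_librarian = 0.9    # the description fits most librarians
p_desc_given_salesperson = 0.1  # it fits only some salespeople

# Bayes' rule: P(librarian | description)
posterior = (p_desc_given_librarian * p_librarian) / (
    p_desc_given_librarian * p_librarian
    + p_desc_given_salesperson * p_salesperson
)

print(round(posterior, 3))  # -> 0.155
```

Despite the strong “fit,” the person is still far more likely to be a salesperson, because the base rate dominates; the representativeness heuristic effectively ignores that base rate.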

The desire for cognitive ease and reliance on intuitive judgment (i.e., answering the “easier” question) is a key principle in understanding biases and heuristics and is also the central theme in Kahneman’s seminal book Thinking, Fast and Slow published in 2011, which has been dubbed by many as the bible of behavioral economics. Kahneman and his Princeton colleague Shane Frederick summarized the notion in 2002: “From its earliest days, the heuristics and biases program was guided by the idea that intuitive judgments occupy a position – perhaps corresponding to evolutionary history – between the automatic parallel operations of perception and the controlled serial operations of reasoning.”

As a number of heuristics and biases began to emerge in the literature, an evolutionary framework followed in order to explain why natural selection would allow for cognitive processing that so often leads to “suboptimal” behavior. As the psychologist Hal Arkes put it, “the extra effort required to use a more sophisticated strategy is a cost that often outweighs the potential benefit of enhanced accuracy.” Despite being imperfect, heuristics can be incredibly efficient, making problems easier to solve. Deliberative thought consumes energy – it is “costly” in evolutionary terms – and jumping to conclusions can free up additional energy for other tasks. Of course, what may have been ideal on the African savannah under a nomadic lifestyle may not be the ideal neural architecture for the socially complex world we live in today. So despite the Darwinian pragmatism of mental shortcuts, such cognitive strategies are indeed fallible in many modern contexts.

Research on heuristics and biases continues to develop. Neuroscientific tools such as magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI) have been applied to explore neural correlates for various biases. For example, a 2007 paper by a group of researchers from New York University provided neural support for the optimism bias, whereby the bias is mediated by functional connectivity between the rostral anterior cingulate cortex and the amygdala.3 Additionally, a 2020 paper published in the journal Nature by researchers from University College London explored the neural processing of both confirmatory and contradicting evidence that underlies the well-known confirmation bias.4 Overall, the myriad effects and theories that have come out of the heuristics and biases domain continue to be refined in terms of understanding how they manifest cognitively, as psychologists seek to gain a richer comprehension of these phenomena.

People


Daniel Kahneman & Amos Tversky

Renowned in the fields of economics and psychology, Kahneman and Tversky are best known for their groundbreaking work in the field of judgment and decision-making. The two are responsible for igniting the research domain of biases and heuristics in the early 1970s. Their paper on prospect theory, published in 1979, became a key component in the foundation of behavioral economics. Kahneman is a Nobel Laureate, having won the prize in economics in 2002, an honor that likely would have been shared with Tversky had he not passed away in 1996.

Consequences

Although each cognitive bias has its own unique consequences, we can generalize a bit and look at the ramifications of heuristics and biases as a whole. One considerable impact has been the role that research on biases has had on the field of economics. Prior to and during much of Kahneman and Tversky’s foundational work, the study of economics was loyal to the assumption that individuals are rational agents (i.e., homo economicus). With biases being defined as structured errors that lead to irrational decision-making, they were naturally a thorn in the side of classical economics. The mounting evidence highlighting the fallibility of human reasoning in the form of cognitive bias has shifted how not only economists think about human decision-making, but also psychologists, philosophers, political scientists, and many other professionals who’ve cared to explore the subject.

In light of such biases, attempts by governments and businesses at designing interventions, or “nudges,” to combat cognitive biases continue to gain momentum. One famous example is known as automatic enrollment, aimed at tackling the status quo bias. The idea is that people have a tendency to opt for the default option, or the status quo, so when it comes to enrolling in a retirement savings plan, people tend to stick with the default option of not saving at all. When the default option is changed, however, and people are automatically enrolled in a savings plan, more end up sticking with this default and saving.

Cognitive biases have also caught the attention of various progressive activists. Implicit or unconscious biases have been cited as hindering women and people of color from advancing their careers. These biases have also been front and center in a number of debates around racial profiling in policing. Most people don’t believe that they exhibit racist or sexist behavior; however, research on implicit biases has shown that, regardless of what we think of our own behavior, discrimination can occur beneath our awareness.

Controversies

Not all researchers have eagerly jumped on the bias bandwagon. Gerd Gigerenzer, a German psychologist from the Max Planck Institute for Human Development, is a notable critic of the prominence of cognitive bias in behavioral economics. Gigerenzer has suggested a bias bias,5 which he describes as “the tendency to spot biases even when there are none.” He has also provided a systematic review that challenges the notion that cognitive biases have negative effects on health, wealth, and happiness.

Others such as Koen Smets, who applies behavioral economics to organizational development, have noted how biases have garnered too much popular attention, with the nuance of human behavior being neglected in exchange for mere labels of decision-making errors. In a piece for Behavioral Scientist, Smets wrote: “This focus on biases is unhelpful in several ways. It fails to acknowledge that biases are broad tendencies, rather than fixed traits, and it oversimplifies the complexity of human behavior into an incoherent list of flaws. This leads to misguided applications of behavioral science that have little or no effect, or which can backfire spectacularly.”

Case Study


Applied

“Old hiring methods are biased and ineffective. A typical hiring funnel leaks 60% of the best candidates, mostly from under-represented backgrounds. No team can afford that, and no society should tolerate it,” writes the Applied website. The company’s online recruitment platform seeks to allow organizations to hire candidates while minimizing bias by leveraging tools such as anonymized applications and data-driven evaluation metrics.

Related TDL resources

List of Cognitive Biases and Heuristics

See The Decision Lab’s comprehensive list of notable biases and heuristics in behavioral economics.

Protecting Your Projects from Cognitive Bias

Given the sheer amount of cognitive biases and heuristics, it can seem daunting to combat these mental errors in your organization. This article discusses how agile management can help tackle such a challenge.

Sources

  1. Kahneman, D., & Frederick, S. (2002). Representativeness revisited: Attribute substitution in intuitive judgment. In Heuristics and Biases: The Psychology of Intuitive Judgment, 49–81.
  2. Tversky, A., & Kahneman, D. (1971). Belief in the law of small numbers. Psychological Bulletin, 76(2), 105.
  3. Sharot, T., Riccardi, A. M., Raio, C. M., & Phelps, E. A. (2007). Neural mechanisms mediating optimism bias. Nature, 450(7166), 102–105.
  4. Rollwage, M., Loosen, A., Hauser, T. U., Moran, R., Dolan, R. J., & Fleming, S. M. (2020). Confidence drives a neural confirmation bias. Nature Communications, 11(1), 1–11.
  5. Gigerenzer, G. (2018). The bias bias in behavioral economics. Review of Behavioral Economics, 5(3-4), 303–336.
