The notion of human bias has been around for centuries, with proverbial phrases such as “to err is human” ubiquitous in folk wisdom across the globe. In the academic field of decision-making – which spans both economics and psychology – the study of biases and heuristics began in 1969, during the meetings of the Mathematical Psychology Society and the American Psychological Association.1 Two Israeli psychologists, Daniel Kahneman and Amos Tversky, administered a short questionnaire on hypothetical research decisions to expert guests at these meetings.2 What Kahneman and Tversky found was that many of these psychologists’ intuitive responses were wrong. While most of them certainly could have calculated the correct answers with a pen and paper, a bias persisted in the guesses of these sophisticated participants. This bias would later be identified as the representativeness heuristic, in which probability judgments are influenced by perceptions of similarity. Put differently, when we assess the probability that A belongs to B, we are guided by whether A “looks like” B. This idea that people sometimes answer a difficult question by substituting an easier one in its place would become the basis of cognitive heuristics.
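The substitution can be made concrete with a toy Bayesian calculation (a hypothetical illustration with assumed numbers, not drawn from Kahneman and Tversky’s actual questionnaires). Suppose we judge whether a person is a librarian from a description that “looks like” a librarian; the representativeness heuristic attends only to the similarity term, while Bayes’ rule also weighs the base rate:

```python
# Toy illustration (hypothetical numbers): judging whether a person is a
# librarian from a description that strongly resembles one. The
# representativeness heuristic tracks only P(description | librarian),
# ignoring the base rate P(librarian) in the population.

def posterior(likelihood, base_rate, likelihood_other):
    """Bayes' rule for two mutually exclusive hypotheses:
    P(B | A) = P(A | B) P(B) / [P(A | B) P(B) + P(A | not B) P(not B)]."""
    numerator = likelihood * base_rate
    denominator = numerator + likelihood_other * (1 - base_rate)
    return numerator / denominator

# Assumed numbers: the description fits 90% of librarians but only 20% of
# everyone else, yet librarians are just 1 in 21 of the population.
p = posterior(likelihood=0.9, base_rate=1 / 21, likelihood_other=0.2)
print(f"P(librarian | description) = {p:.2f}")
```

Despite the strong resemblance, the posterior comes out to roughly 0.18: the rarity of librarians dominates. A similarity-driven intuition, substituting “how much does this person resemble a librarian?” for the harder probability question, would land far higher.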
The desire for cognitive ease and reliance on intuitive judgment (i.e., answering the “easier” question) is a key principle in understanding biases and heuristics, and it is also the central theme of Kahneman’s seminal book Thinking, Fast and Slow, published in 2011, which has been dubbed by many the bible of behavioral economics. Kahneman and his Princeton colleague Shane Frederick summarized the notion in 2002: “From its earliest days, the heuristics and biases program was guided by the idea that intuitive judgments occupy a position – perhaps corresponding to evolutionary history – between the automatic parallel operations of perception and the controlled serial operations of reasoning.”
As a number of heuristics and biases began to emerge in the literature, an evolutionary framework followed to explain why natural selection would allow for cognitive processing that so often leads to “suboptimal” behavior. As the psychologist Hal Arkes put it, “the extra effort required to use a more sophisticated strategy is a cost that often outweighs the potential benefit of enhanced accuracy.” Though imperfect, heuristics can be incredibly efficient, making problems easier to solve. Deliberative thought consumes energy – it is “costly” in evolutionary terms – and jumping to conclusions can free up that energy for other tasks. Of course, what may have been ideal on the African savannah under a nomadic lifestyle may not be the ideal neural architecture for the socially complex world we live in today. So despite the Darwinian pragmatism of mental shortcuts, such cognitive strategies are indeed fallible in many modern contexts.
Research on heuristics and biases continues to develop. Neuroscientific tools such as magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI) have been applied to explore the neural correlates of various biases. For example, a 2007 paper by a group of researchers from New York University provided neural support for the optimism bias, showing that the bias is mediated by functional connectivity between the rostral anterior cingulate cortex and the amygdala.3 Additionally, a 2020 paper published in the journal Nature by researchers from University College London explored the neural processing of confirmatory and contradictory evidence that underlies the well-known confirmation bias.4 Overall, the myriad effects and theories that have come out of the heuristics and biases domain continue to be refined, as psychologists seek a richer understanding of how these phenomena manifest cognitively.