What is the availability heuristic?
The availability heuristic describes our tendency to use information that comes to mind quickly and easily when making decisions about the future.
Imagine you are considering either John or Jane, two employees at your company, for a promotion. Both have steady employment records, though Jane has been the highest performer in her department throughout her tenure. However, in Jane’s first year, she accidentally lost a company project when her computer crashed. The vivid memory of that lost project likely weighs more heavily on the decision to promote Jane than it should. This is the availability heuristic at work: singular memorable moments have an outsized influence on our decisions.
The availability heuristic can lead to bad decision-making because easily recalled memories are often poor evidence of how likely similar events are to occur in the future. Ultimately, this leaves the decision-maker relying on low-quality information as the basis of their decision.
Exploring the availability heuristic leads to troubling conclusions across many academic and professional areas. If each of us analyzes information in a way that prioritizes memorability and recency over accuracy, then the model of a rational, logical chooser, predominant in economics and many other fields, is at least sometimes flawed. The implications of the availability heuristic suggest that many academics, policy-makers, business leaders, and media figures must revisit their basic assumptions about how people think and act in order to improve the quality and accuracy of their work.
A heuristic is a ‘rule of thumb’, or mental shortcut, that helps guide our decisions. The availability heuristic makes our choices easier by substituting what comes to mind most readily for careful analysis. However, it compromises our ability to judge the probability of events accurately, because our memories may not be realistic models for forecasting future outcomes.1
For example, if you were about to board a plane, how would you go about estimating the probability that it would crash? Many different factors affect the safety of a flight, and trying to account for them all would be very difficult. Unless you looked up the relevant statistics, your brain would likely take a shortcut instead, one that many of us take every day.
Your brain takes a common mental shortcut: it draws on the information that comes to mind most easily. Perhaps you had just read a news article about a massive plane crash in a nearby country. The memorable headline, paired with the image of a wrecked plane wreathed in flames, left an easily recalled impression, causing you to wildly overrate the chance that you will be involved in a similar crash. This is the availability heuristic at work.
The availability heuristic exists because some memories and facts come to mind spontaneously, whereas others take effort and reflection to recall. Certain memories are recalled automatically for two main reasons: they seem to happen often, or they leave a lasting imprint on our minds.
Events that seem to happen often generally coincide with other shortcuts we use to comprehend our world. This was demonstrated in a 1973 study by Tversky and Kahneman, two pioneers of behavioral science.2 They asked participants whether more words begin with the letter K or have K as their third letter.
Even though a typical text contains twice as many words with K as the third letter as words beginning with K, 70% of participants said that more words begin with K. This is because it is much easier to think of words that begin with K (e.g., kitchen, kangaroo, kale) than words that have K as the third letter (e.g., ask, cake, biking). Since words that begin with K are easier to think of, it seems like there are more of them.
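The comparison behind this study can be sketched in a few lines of code. The word list below is a tiny illustrative sample chosen for this example, not a representative corpus, so the counts only mirror the general tendency.

```python
def count_k_positions(words):
    """Return (words starting with 'k', words with 'k' as their third letter)."""
    first = sum(1 for w in words if w.lower().startswith("k"))
    third = sum(1 for w in words if len(w) >= 3 and w.lower()[2] == "k")
    return first, third

# A small illustrative sample (not a real corpus):
sample = ["kitchen", "kangaroo", "kale", "ask", "cake", "biking", "make", "lake"]
first, third = count_k_positions(sample)
print(first, third)  # 3 5 -- third-letter-K words outnumber first-letter-K words here
```

Running the same counts over a large English corpus is what yields the roughly two-to-one ratio the study cites; intuition, drawing on ease of recall, points the other way.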
Other events leave a lasting impression, which primes their chance of recall when we make decisions. Tversky and Kahneman exposed this tendency in a study conducted in 1983,3 in which half of the participants were asked to guess the chance that a massive flood would occur somewhere in North America, while the other half were asked the likelihood of a massive flood occurring due to an earthquake in California.
By definition, an earthquake-caused flood in California is a special case of a flood somewhere in North America, so it cannot be more likely. Nonetheless, participants rated the California scenario as more probable. One explanation is that an earthquake in California is easier to imagine: there is a coherent story, beginning with a familiar event (the earthquake) that causes the flood, in a context that creates a vivid picture in one’s head. A large, ambiguous area like all of North America creates no such picture, so the prediction has no lasting mental imprint to draw on.
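The statistical point here is the conjunction rule: a joint event can never be more probable than either of the events it combines. A minimal sketch, using made-up numbers (the probabilities below are illustrative assumptions, not real earthquake or flood statistics):

```python
# Conjunction rule: P(A and B) = P(B) * P(A | B) <= P(B).
# A flood caused by a California earthquake is also "a flood in North America",
# so its probability can never exceed that of the broader event.
p_quake = 0.10               # assumed P(major California earthquake)
p_flood_given_quake = 0.30   # assumed P(massive flood | that earthquake)

p_quake_and_flood = p_quake * p_flood_given_quake
assert p_quake_and_flood <= p_quake  # the joint event is never the more likely one
```

Whatever numbers are plugged in, the product of two probabilities (each at most 1) cannot exceed either factor, which is exactly the rule participants’ intuitions violated.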
The availability heuristic has serious consequences in most professional fields and many aspects of one’s daily life. People make thousands of decisions per day and factors such as media coverage, emotional reactions and vivid images have greater influence than they would in an entirely rational calculation. Awareness of our intrinsic biases can be a safeguard against fallacious reasoning, unintentional discrimination or costly mistakes in investments and business decisions.
The availability heuristic is a label for a core cognitive function: saving mental effort. Unfortunately, unlike a sleight-of-hand trick, simply knowing how it works is not sufficient to overcome it completely.4 The heuristic emerges from the many shortcuts our brain takes in order to process the world’s information.
Although awareness alone cannot change one’s thought process, it is essential in order to support and implement policies that take the heuristic into account. Taking steps to recognize and check the availability heuristic is crucial for ensuring fair treatment for consumers and citizens in areas ranging from regulating gambling law, to preventing discrimination, to holding the media accountable.
In practice, guaranteeing thoughtful and rigorous mental analysis is challenging. The availability heuristic is everywhere, so avoiding its effects demands what Daniel Kahneman and Amos Tversky referred to as ‘System 2 thinking’. System 2 refers to the mental processes engaged in deliberative, careful and reflective decision-making,5 as opposed to System 1, which is fast and automatic. The availability heuristic operates in System 1: only upon thorough reflection do people realize that their quick approximations of probable outcomes are skewed.
Overcoming the availability heuristic involves activating System 2 thinking. This is often easier to do in collective decision making because others can catch instances when one is captivated by superficially convincing (but ultimately false) information.
A more deliberate strategy to counter the availability heuristic is called ‘red-teaming.’ Red-teaming involves nominating one member of a group to challenge the prevailing opinion, no matter their personal beliefs.6 Intentionally seeking out the mistakes that occur in individual decision-making can reduce the chance that heuristics are reflexively treated as facts.
For red-teaming, or similar initiatives, to identify the availability heuristic effectively, we must be aware of the bias so that we can observe its effect on the group’s behavior. Understanding a bias may not eliminate it from our decision-making entirely, but it increases the chances that we will spot it in group settings, or in the behavior of colleagues and collaborators.
Heuristics like the availability heuristic remain tenacious even once one understands how they work. A dedicated devil’s advocate can fall prey to the very biases the role is designed to prevent unless they are specifically attentive to the situations where those biases take effect.
Combining expert insights from behavioral science with dedicated resources can prevent bad decision-making and can help increase productivity across a variety of environments. For those of us without an expert consultant on hand, learning about behavioral science is a solid first step towards leveraging its power to influence important choices.
Amos Tversky and Daniel Kahneman’s work in 19737 helped generate insights about the availability heuristic. They described the availability heuristic as “whenever [one] estimates frequency or probability by the ease with which instances or associations could be brought to mind.” In simpler terms, one guesses the likelihood that things happen by using easily recalled memories as a reference.
The concluding remarks of their paper noted that analyzing the heuristics a person uses when making decisions can predict whether their judgement will be too high or too low. Because our brains process a seemingly endless stream of decisions and information every day, everyday life is filled with uncertainty, which is why knowing about common heuristics is so important. By being aware of the availability heuristic, people can make fewer errors of judgement under uncertain conditions.
Let’s say you watch a documentary series, or see a plethora of advertisements, about the luxurious lives of lottery winners. Afterward, you mistakenly figure that your chances of winning are higher than they actually are. Why? The documentary showcased a winner’s luxury house and brand-new sports car, leaving a strong impression that is easy to recall. Later that day, feeling lucky, you bought a Lotto 6/49 ticket with a $40 million jackpot prize.
Because of the documentary, you figured you had a decent chance of winning—after all, those people won, and they were regular people like you before buying that lucky ticket. However, you forgot the homework assignment you did for your statistics class a few years earlier where you calculated the odds of winning the 6/49 lottery as 1 in 13,983,816.8 Unfortunately, your ticket did not win, which may not have surprised you if you could’ve more easily recalled the actual odds you were up against.
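The 1-in-13,983,816 figure follows from counting the ways to choose 6 numbers out of 49, order ignored:

```python
import math

# C(49, 6) = 49! / (6! * 43!) -- the number of distinct 6/49 tickets
tickets = math.comb(49, 6)
print(tickets)  # 13983816, so a single ticket wins with probability 1/13,983,816
```

One vivid documentary is far easier to recall at the ticket counter than this one-line calculation, which is precisely how the heuristic tilts the decision.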
A study by Russell Eisenman in 19939 examined how media coverage of specific topics can shape people’s perceptions via the availability heuristic. In this study, college students were asked whether drug use in the United States was increasing or decreasing. Students tended to say it was increasing, even though reputable survey data from the National Household Survey on Drug Abuse indicated otherwise. Eisenman cited a 1984 study by Tyler and Cook10 which concluded that constant media coverage of certain topics, like drug use, can distort perceptions of how often those events occur in the real world.
The key idea is that news stories about sensationalized and relatively rare topics such as drug use or plane crashes can evoke the availability heuristic. People wildly overestimate the chance that these events happen compared to other deadly events that are statistically more likely, such as heart disease or car accidents. Depending on what you watch and read (and, perhaps most importantly, how much they inform your actions), your decisions could be based on heavily biased information.
The availability heuristic describes the mental shortcut where we make decisions based on emotional cues, familiar facts, and vivid images that leave an easily recalled impression in our minds.
The brain tends to minimize the effort necessary to complete routine tasks. When making decisions — especially ones involving probability — certain memories and knowledge jump out to replace the complicated task of calculating statistics. Some memories leave a lasting impression because they connect to emotional triggers. Others seem familiar because they align with the way we process the world, such as recognizing words by their first letter.
One buys lottery tickets because the lifestyle that follows a winning ticket comes to mind easily and vividly, while the probability of winning is a complex calculation that does not jump out while one is at the ticket counter.
Sensational news stories seem much more likely to occur than unremarkable (yet dangerous) activities. The availability heuristic skews the distribution of fear towards events that leave a lasting mental impression due to their graphic content or unexpected occurrence versus comparatively dangerous yet more probable events.
The best way to avoid the availability heuristic, on a small scale, is to combine expertise in behavioral science with dedicated attention and resources to locate the points where it takes hold of individual choices. On a larger scale, the solution remains similar. Dedicating a specialized team to focus on the role of heuristics in public policy, institutional behavior or media output can achieve more logical outcomes wherever human behavior is concerned.
This article examines how nudging can be used to help drive desirable outcomes in the medical field, from increasing organ donors to reducing the use of misprescribed antibiotics. The author notes that the availability heuristic can get in the way of our efforts to stay healthy, such as when we remember that taking a specific screening test in the past hurt. This can be harmful if it causes us to avoid potentially helpful screening tests in the future.
This article explores how elderly individuals are more likely to be victims of financial fraud from a behavioral science lens, and how this fraud can be prevented. The author notes that elderly individuals may be more susceptible to financial fraud because they think more in the present, which can increase their vulnerability in financial decision-making environments. This is an example of the availability heuristic and explains why fraudulent emails sometimes leverage urgent calls to action.