Why do we take mental shortcuts?

Heuristics, explained.

What are Heuristics?

Heuristics are mental shortcuts that can facilitate problem-solving and probability judgments. These strategies are generalizations, or rules of thumb, that reduce cognitive load. They can be effective for making immediate judgments, but they often result in irrational or inaccurate conclusions.

Where this bias occurs

We use heuristics in all sorts of situations. One type, the availability heuristic, often comes into play when we try to judge how frequently a certain event occurs. Say, for example, someone asked you whether more tornadoes occur in Kansas or Nebraska. Most of us can easily call to mind an example of a tornado in Kansas: the tornado that whisked Dorothy Gale off to Oz in L. Frank Baum’s The Wizard of Oz. Although it’s fictional, this example comes to us easily. On the other hand, most people have a lot of trouble calling to mind an example of a tornado in Nebraska. This leads us to believe that tornadoes are more common in Kansas than in Nebraska, yet the two states actually report similar numbers of tornadoes.1

Individual effects

The thing about heuristics is that they aren’t always wrong. As generalizations, they can yield accurate predictions or good decisions in many situations. However, even when the outcome is favorable, it was not reached through logical means. When we use heuristics, we risk ignoring important information and overvaluing less relevant information. There’s no guarantee that using a heuristic will work out and, even when it does, we will have made the decision for the wrong reason: instead of being based on reason, our behavior results from a mental shortcut with no real rationale to support it.

Systemic effects

Heuristics become even more concerning when applied to politics, academia, and economics. We all resort to heuristics from time to time, and that is true even of members of these institutions who are tasked with making important, influential decisions. It is essential that these figures have a comprehensive understanding of the biases and heuristics that can affect our behavior, so that their decisions can be as accurate as possible.

Why it happens

In their 1974 paper “Judgment Under Uncertainty: Heuristics and Biases,”2 Amos Tversky and Daniel Kahneman identified three different kinds of heuristics: availability, representativeness, and anchoring and adjustment. Each serves to reduce the mental effort needed to make a decision, but each occurs in a different context.

Availability heuristic

The first type of heuristic is the availability heuristic, which was touched upon in the example of judging the frequency with which tornadoes occur in Kansas relative to Nebraska. Kahneman and Tversky define this heuristic as a mental shortcut for making frequency or probability judgments based on “the ease with which instances or occurrences can be brought to mind” (p.1127).3

The availability heuristic occurs because some memories come to mind more easily than others. In one example, Kahneman and Tversky asked participants whether more English words start with the letter K or have K as their third letter; most responded that more words start with K. In actuality, the opposite is true, but it is much harder to think of words with K as the third letter than it is to think of words that start with K.4 Our memories of words beginning with K simply come to mind more readily than words with K as the third letter.
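
The comparison itself is easy to check mechanically given any word list. The short function below is a sketch of that check; the sample list is invented for illustration and is not Kahneman and Tversky’s material (a full dictionary file would give the true ratio):

```python
def count_k_positions(words):
    """Count words starting with 'k' vs. words with 'k' as the third letter."""
    first = sum(1 for w in words if w.lower().startswith("k"))
    third = sum(1 for w in words if len(w) >= 3 and w[2].lower() == "k")
    return first, third

# Hypothetical sample; even here, third-letter-K words outnumber K-initial ones.
sample = ["kite", "king", "acknowledge", "ask", "lake", "bike", "know", "make"]
first, third = count_k_positions(sample)  # first=3, third=5
```

The point of the exercise is that the objective count is easy to compute but hard to intuit, because retrieval from memory is cued by a word’s first letter, not its third.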

Representativeness heuristic

A second type of heuristic is the representativeness heuristic, which we often rely on when making probability judgments. As Kahneman and Tversky illustrated, we tend to classify events into categories, which can give rise to this heuristic: we judge the probability that an object or event belongs to a category based on how similar it is to the prototypical example of that category.5 For example, if someone we meet in one of our university lectures looks and acts like a stereotypical medical student, we may judge it highly likely that they are studying medicine, even without any hard evidence to support that assumption.

The representativeness heuristic is associated with prototype theory.6 This prominent theory in cognitive science explains object and identity recognition. It suggests that we categorize different objects and identities in memory: a category for chairs, a category for fish, a category for books, and so on. Prototype theory posits that we develop a prototypical example for each category by averaging every instance of that category we encounter. Our prototype of a chair, then, should be the most average chair possible, based on our experience with that object. This aids object recognition because we compare every object we encounter against the prototypes stored in memory; the more an object resembles a prototype, the more confident we are that it belongs to that category.

Prototype theory may give rise to the representativeness heuristic: when an object or event closely resembles a stored prototype, we classify it into the category that prototype represents. Returning to the previous example, if your peer closely resembles your prototypical example of a med student, you may place them into that category, and in doing so rely on the representativeness heuristic.
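
Prototype-based categorization can be sketched computationally as a nearest-prototype classifier: average the examples of each category into a prototype, then assign a new item to the category whose prototype it most resembles. The features, numbers, and category names below are invented for illustration, not taken from prototype theory research:

```python
import math

def prototype(examples):
    """Average a list of feature vectors into a single prototype vector."""
    n = len(examples)
    return [sum(v[i] for v in examples) / n for i in range(len(examples[0]))]

def classify(item, prototypes):
    """Assign the item to the category whose prototype is nearest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(prototypes, key=lambda cat: dist(item, prototypes[cat]))

# Invented 2-D features, e.g. (seat height, back height) in arbitrary units.
protos = {
    "chair": prototype([[0.5, 1.0], [0.4, 0.9]]),
    "stool": prototype([[0.7, 0.0], [0.8, 0.1]]),
}
classify([0.45, 0.95], protos)  # lands in "chair": closest to that prototype
```

The representativeness heuristic corresponds to treating this similarity score as if it were a probability, ignoring other relevant evidence such as how common each category is overall.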

Anchoring and adjustment heuristic

The third type of heuristic put forth by Kahneman and Tversky in their initial paper is the anchoring and adjustment heuristic.7 It describes how, when estimating a value, we start from an initial value and then adjust it up or down. However, we often get stuck on that initial value – a tendency referred to as anchoring – and make insufficient adjustments. As a result, our final estimate is biased toward the initial value we anchored on.

In Kahneman and Tversky’s example, participants were asked a question such as “estimate the number of African countries in the United Nations (UN).” A wheel labeled with numbers from 0 to 100 was spun, and participants first said whether the number the wheel landed on was higher or lower than their answer to the question. They were then asked for their estimate. Kahneman and Tversky found that participants tended to anchor on the random number obtained by spinning the wheel: when the wheel landed on 10, the median estimate was 25, while when it landed on 65, the median estimate was 45.8
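
The pattern can be mimicked with a toy model in which the final estimate is a weighted blend of an unanchored belief and the anchor. The weight and the prior belief below are invented parameters, not values from the paper; insufficient adjustment corresponds to any anchor weight above zero:

```python
def anchored_estimate(prior_belief, anchor, anchor_weight=0.4):
    """Blend an unanchored belief with the anchor; a weight above zero
    models insufficient adjustment away from the anchor."""
    return (1 - anchor_weight) * prior_belief + anchor_weight * anchor

# With the same underlying belief, a higher anchor drags the estimate up.
low = anchored_estimate(prior_belief=35, anchor=10)   # pulled down toward 10
high = anchored_estimate(prior_belief=35, anchor=65)  # pulled up toward 65
```

The model reproduces the qualitative finding – estimates move toward whatever number happened to come up on the wheel – even though the anchor carries no information about the answer.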

A 2006 study by Epley and Gilovich, “The Anchoring-and-Adjustment Heuristic: Why the Adjustments Are Insufficient,”9 investigated the causes of this heuristic. They showed that anchoring often occurs because the information we anchor on is more accessible than other information: we have just encountered it. They also provided empirical evidence that our adjustments tend to be insufficient because adjusting requires significant mental effort, which we are not always motivated to expend, and they found that incentives for accuracy led participants to adjust further. In other words, this heuristic generally occurs when there is no real incentive to give an accurate response.

Quick and easy

What these types of heuristics have in common is that they all allow us to respond automatically, without much effortful thought. They provide an immediate response and use little mental energy, freeing our mental resources for other, potentially more pressing matters. In that way, heuristics are efficient, which is a big reason why we continue to use them. That said, we should be mindful of how much we rely on them, because there is no guarantee of their accuracy.

Why it is important

Using heuristics can cause us to engage in various cognitive biases and commit certain fallacies, as Tversky and Kahneman illustrate.10 As a result, we may make poor decisions, as well as inaccurate judgments and predictions. Awareness of heuristics can aid us in avoiding them, which will ultimately lead us to engage in more adaptive behaviors.

How to avoid it

Heuristics arise from automatic System 1 thinking. It is a common misconception that errors in judgment can be avoided by relying exclusively on System 2 thinking; as Kahneman points out, neither System 1 nor System 2 is infallible.11 While System 1 can give rise to heuristics and the biases that follow from them, System 2 can produce other biases, such as confirmation bias.12 Systems 1 and 2 complement each other, and using them together leads to more rational decision-making. That is, we shouldn’t make judgments automatically, without a second thought, but neither should we overthink things to the point where we are looking only for evidence that supports our stance. Heuristics can thus be avoided by making judgments more effortfully, while taking care not to overanalyze the situation.

How it all started

The first three heuristics – availability, representativeness, and anchoring and adjustment – were identified by Tversky and Kahneman in their 1974 paper, “Judgment Under Uncertainty: Heuristics and Biases”.13 In addition to presenting these heuristics and their relevant experiments, they listed the respective biases that each of them can lead to.

To give an example, upon defining the availability heuristic, they illustrated how it may lead to illusory correlation, the mistaken belief that two events frequently co-occur. The availability heuristic can cause us to over- or underestimate the frequency of certain events, and so to see a correlation where there is none. They also discussed how the representativeness heuristic underlies the illusion of validity: our tendency to overestimate the accuracy of our probability judgments. The more representative an object or event is, the more confident we feel in our predictions about it. However, because many factors can influence an outcome’s probability without affecting representativeness, that confidence is often unwarranted.

Example 1 - Advertising

Anyone working in advertising should have a working understanding of heuristics, because consumers often rely on them when making purchasing decisions. One heuristic that frequently comes into play in advertising is the scarcity heuristic. When assessing the value of something, we often fall back on this heuristic, leading us to believe that the rarer the object in question, the more valuable it is.

A 2011 study by Praveen Aggarwal, Sung Yul Jun, and Jong Ho Huh evaluated the impact of “scarcity messages” on consumer behavior. They found that both “limited quantity” and “limited time” advertisements influence consumers’ intentions to purchase, but “limited quantity” messages are more effective. This explains why people get so excited over the one-day-only Black Friday sales, and why the countdowns of units available on home shopping television frequently lead to impulse buys.14

Knowledge of the scarcity heuristic can help businesses thrive, as “limited quantity” messages make potential consumers competitive and increase their intentions to purchase.15 This marketing technique can be a useful tool for bolstering sales and bringing attention to your business.

Example 2 - Stereotyping

One of the downfalls of heuristics is that they have the potential to lead to stereotyping, which is often harmful. Kahneman and Tversky illustrated how the representativeness heuristic may propagate stereotypes: they presented participants with a personality sketch of a fictional man named Steve, along with a list of possible occupations, and asked them to rank the occupations by how likely each was to be Steve’s job. Because the sketch described Steve as shy, helpful, introverted, and organized, participants tended to rate librarian as highly probable.16 The librarian stereotype used here is, of course, less harmful than many other stereotypes, but the example illustrates the link between heuristics and stereotypes.

Published in 1989, Patricia Devine’s paper “Stereotypes and prejudice: Their automatic and controlled components” shows that, even among people low in prejudice, rejecting stereotypes requires a certain level of motivation and cognitive capacity.17 We typically use heuristics to avoid exerting too much mental energy, specifically in cases where we are not motivated to dedicate significant mental resources to the task at hand. Thus, when we are not motivated to make a judgment or decision effortfully, we may fall back on automatic heuristic responses and, in doing so, risk propagating stereotypes.

Stereotypes are an example of how heuristics can go wrong. These broad generalizations do not always apply, and their continued use can have serious consequences. This underscores the importance of effortful, rather than automatic, judgment and decision-making.


What it is

Heuristics are mental shortcuts that allow us to make quick judgment calls based on generalizations, or rules of thumb.

Why it happens

Heuristics in general occur because they are efficient ways of responding when we are faced with problems or decisions. They come about automatically, allowing us to allocate our mental energy elsewhere. The specific types of heuristics occur in different contexts; the availability heuristic happens because we remember certain memories better than others, the representativeness heuristic can be explained by prototype theory, and the anchoring and adjustment heuristic happens due to lack of incentive to put in the effort required for sufficient adjustment.

Example 1 – Advertising

The scarcity heuristic, which refers to how we value items more when they are scarce, can be used to the advantage of businesses looking to increase sales. Research has shown that advertising objects as “limited quantity” increases competitiveness among consumers and increases their intentions to buy the item.

Example 2 – Stereotypes

While heuristics can be useful, we should exert caution, as they are generalizations which may lead us to propagate stereotypes, which may range from inaccurate to harmful.

How to avoid it

Putting more effort into decision-making, instead of making decisions automatically, can help us avoid heuristics. Doing so requires more mental resources, but it will lead to more rational choices.

Related TDL articles

What are Heuristics

This interview with The Decision Lab’s Managing Director Sekoul Krastev delves into the history of heuristics, their applications in the real world, and their consequences, both positive and negative.

What Rock-Paper-Scissors Can Teach Us About Our Decision-Making

This article examines how Rock-Paper-Scissors is tied to heuristics, specifically, the win-stay lose-shift heuristic. It also describes the difference between System 1 thinking and System 2 thinking, or automatic thinking and effortful thinking. This simple, ubiquitous game offers significant insight into our decision-making processes.


  1. Gilovich, T., Keltner, D., Chen, S., & Nisbett, R. (2015). Social Psychology (4th ed.). W. W. Norton & Co.
  2. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.
  3. See 2
  4. See 2
  5. See 2
  6. Mervis, C. B., & Rosch, E. (1981). Categorization of natural objects. Annual Review of Psychology, 32(1), 89–115. https://doi.org/10.1146/annurev.ps.32.020181.000513
  7. See 2
  8. See 2
  9. Epley, N., & Gilovich, T. (2006). The anchoring-and-adjustment heuristic: Why the adjustments are insufficient. Psychological Science, 17(4), 311–318.
  10. See 2
  11. System 1 and System 2 Thinking. The Marketing Society. https://www.marketingsociety.com/think-piece/system-1-and-system-2-thinking
  12. See 11
  13. See 2
  14. Aggarwal, P., Jun, S. Y., & Huh, J. H. (2011). Scarcity messages. Journal of Advertising, 40(3), 19–30.
  15. See 14
  16. See 2
  17. Devine, P. G. (1989). Stereotypes and prejudice: their automatic and controlled components. Journal of Personality and Social Psychology, 56(1), 5–18. https://doi.org/10.1037/0022-3514.56.1.5