Heuristics, explained

Why do we take mental shortcuts?

What are Heuristics?

Heuristics are mental shortcuts that can facilitate problem-solving and probability judgments. These strategies are generalizations, or rules of thumb, that reduce cognitive load. They can be effective for making immediate judgments; however, they often result in irrational or inaccurate conclusions.

Where this bias occurs

We use heuristics in all sorts of situations. For example, one type of heuristic, the availability heuristic, often happens when we’re attempting to judge the frequency with which a certain event occurs. Say someone asked you whether more tornadoes occur in Kansas or Nebraska. Most of us can quickly call to mind an example of a tornado in Kansas: the tornado that whisked Dorothy Gale off to Oz in L. Frank Baum’s The Wizard of Oz. Although it’s fictional, this example comes to us easily. On the other hand, most people have a lot of trouble calling to mind an example of a tornado in Nebraska. This leads us to believe tornadoes are more common in Kansas than in Nebraska. However, the two states report similar tornado activity.1

Heuristics don’t just pop up when we’re trying to predict probability. Simple heuristics show up across various domains of life, streamlining the brain’s decision-making process in the same way that keyboard shortcuts help us copy and paste text or switch between browser tabs. Like the keyboard shortcuts we all know and love, heuristics are mental shortcuts that help us solve problems and make decisions more easily and quickly.

Unfortunately, our cognitive time-savers are not always as accurate or reliable as the ones programmed into our computers. Just as the availability heuristic can cause us to judge the probability of a tornado in Nebraska inaccurately, heuristics often lead us to “good enough” conclusions that seem correct based on our previous experiences or pre-existing ideas but may not be objectively accurate. Why? Our brains often revert to heuristics when finding an optimal solution isn’t possible or practical—for example, you cannot evaluate every single restaurant in a big city before choosing a place to eat, so heuristics step in to help you make a decision that is likely to be satisfactory, even if it’s not optimal.

Individual effects

The thing about heuristics is that they aren’t always wrong. As generalizations, there are a wide range of situations where they can yield accurate predictions or result in good decision-making. For example, it’s often correct to assume that the person in a white lab coat sitting across from you in a medical office is a doctor with reliable and trustworthy health advice.

However, even if the outcome is favorable, relying on heuristics means we are not necessarily using the most logical decision-making path to get there. When we use heuristics, we risk ignoring important information and overvaluing what is less relevant. There’s no guarantee that using heuristics will work out, and even if it does, we’ll have decided for the wrong reasons: instead of being grounded in reason, our behavior results from a mental shortcut with no real rationale to support it. With that in mind, here are some ways heuristics can influence our thinking and behavior:

Heuristics help us make decisions efficiently

Perhaps the number one benefit of heuristics is that they speed up our thinking. We can thank these cognitive shortcuts for enabling us to act quickly in time-sensitive situations and avoid danger without calculating the probability of every risk we face. Heuristics also allow our brains to conserve cognitive energy. By shortening the cognitive distance between problem and solution, they reduce the mental load involved in processing large amounts of information. This is great when outcomes don’t matter all that much—like when choosing which brand of soap to buy or what to wear to work. 

Using heuristics means wasting less of our valuable mental resources deliberating over decisions that have minimal impact on our lives. This frees us up to focus on decisions that do benefit from conscious, deliberate thought, like making big life moves or evaluating conflicting news stories. So, what’s wrong with heuristics? The real problem occurs when we use heuristics for high-stakes decisions, where careful thought and objective rationality can help us avoid making serious errors.

Heuristics make us prone to error

Decision-making errors are a common consequence of heuristics. Relying on heuristic methods to prioritize speed over accuracy can cause us to make snap judgments and overgeneralizations. For example, we might misjudge the probability of random events based on how easily examples of these events come to mind—like with the above tornado example. Using heuristics in this way can distort our risk assessments. For instance, people tend to overestimate the probability of a plane crash when they can recall a recent news story about an airplane incident, feeling that flying is more risky than driving when the opposite is really true.19 Similarly, the normalcy bias describes how we often underestimate the possibility of disaster and assume life will go on as usual, even when facing risk. This bias has been used to explain why people might disregard safety warnings on cooking appliances, resulting in kitchen fires.19

Compensating for our cognitive limitations by reverting to heuristics can also make us resistant to change. Because these mental shortcuts often rely on patterns, they encourage us to prioritize familiarity—a tendency broadly known as the familiarity heuristic. This heuristic motivates us to avoid losses, stick with default options, and seek information that confirms what we already believe. 

All of this can have significant opportunity costs. For example, we might stay in a safe and comfortable job instead of taking a chance on a better-paying opportunity. Similarly, we might invest in a familiar company instead of researching lesser-known options with higher potential. Heuristics can even make us fall for marketing tactics that cause us to pay more for options that “feel right” rather than buying the objectively better product. We’ll explore the most common types of heuristics in depth a little later, but for now, let’s see how these shortcuts can impact groups.

Systemic effects

Heuristics become more concerning when applied to politics, academia, and economics. We all resort to heuristics from time to time, and that is true even of members of important institutions who are tasked with making large, influential decisions. To promote accuracy, these figures must have a comprehensive understanding of the biases and heuristics that can affect our behavior. These are some of the most significant ways in which heuristics can have larger, more systemic consequences:

Distorted group decision-making

Heuristics can get in the way of group decision-making in the same way they impact our individual judgments. In short, heuristics can help groups make quick, efficient decisions, but this often comes at the cost of thorough evaluation. This might mean groups overlook risks, conform to previous decisions, or rely on the judgments of authority figures without question. At the same time, our tendency to rely on fast, automatic thinking processes can cause groups to accept available information quickly while suppressing opinions that go against the consensus. Unsurprisingly, this often leads to undesirable consequences. The Challenger space shuttle disaster is a classic example, where groupthink—the tendency for groups to strive for cohesion and conformity over critical thinking—caused decision-makers to overlook risks and push forward with a decision under the illusion of unanimity.19

Perpetuating prejudice and inequality

On a group level, heuristics also play a central role in systemic inequality. Heuristics cause us to make quick judgments about people based on limited information, which frequently reinforces stereotypes. For example, we often categorize people based on how closely they resemble members of a group instead of evaluating each individual on their own. At the same time, we tend to pay more attention to information about people that confirms whatever beliefs we already hold about them.20 This means we often look for information that aligns with our existing perceptions of people, which are often rooted in media depictions and cultural stereotypes. 

The result? Members of marginalized groups often face unfair assumptions, resulting in systemic inequalities. Heuristics frequently influence individual and group decision-making in high-stakes areas such as hiring, education, medical care, policing, and legal matters. It goes without saying that making snap judgments in any of these industries can lead to decisions and policies that perpetuate existing inequalities.

Why it happens

In their paper “Judgment Under Uncertainty: Heuristics and Biases,”2 Daniel Kahneman and Amos Tversky identified three kinds of heuristics: availability, representativeness, and anchoring and adjustment. Each type of heuristic serves to reduce the mental effort needed to make a decision, but each occurs in a different context.

Availability heuristic

The availability heuristic, as defined by Kahneman and Tversky, is the mental shortcut used for making frequency or probability judgments based on “the ease with which instances or occurrences can be brought to mind.”3 We touched on this in the previous example: judging the frequency with which tornadoes occur in Kansas relative to Nebraska.3

The availability heuristic occurs because certain memories come to mind more easily than others. In Kahneman and Tversky’s example, participants were asked whether more words in the English language start with the letter K or have K as the third letter. Interestingly, most participants responded with the former when, in actuality, the latter is true. The idea is that it is much more difficult to think of words that have K as the third letter than it is to think of words that start with K.4 In this case, words that begin with K are more readily available to us than words with K as the third letter.

Representativeness heuristic

Individuals tend to classify events into categories, which, as illustrated by Kahneman and Tversky, can result in our use of the representativeness heuristic. When we use this heuristic, we categorize events or objects based on how they relate to instances we are already familiar with. Essentially, we have built our own categories, which we use to make predictions about novel situations or people.5 For example, if someone we meet in one of our university lectures looks and acts like what we believe to be a stereotypical medical student, we may judge it highly likely that they are studying medicine, even without any hard evidence to support that assumption.

The representativeness heuristic is associated with prototype theory.6 A prominent theory in cognitive science, prototype theory explains object and identity recognition. It suggests that we categorize different objects and identities in our memory. For example, we may have a category for chairs, a category for fish, a category for books, and so on. Prototype theory posits that we develop prototypical examples for these categories by averaging every example of a given category we encounter. As such, our prototype of a chair should be the most average example of a chair possible, based on our experience with that object. This process aids in object identification because we compare every object we encounter against the prototypes stored in our memory. The more an object resembles the prototype, the more confident we are that it belongs in that category.
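
To make the idea concrete, below is a minimal sketch of nearest-prototype classification in Python. The categories, feature dimensions, and values are invented for illustration: each category’s prototype is simply the average of the examples encountered so far, and a new object is assigned to whichever prototype it most closely resembles.

```python
import numpy as np

# Invented feature vectors for objects we've encountered,
# e.g., [height, width, "sit-ability"] on arbitrary scales.
observed = {
    "chair": np.array([[1.0, 0.5, 0.9], [0.9, 0.6, 0.8], [1.1, 0.5, 1.0]]),
    "table": np.array([[0.8, 1.5, 0.1], [0.7, 1.4, 0.2]]),
}

# Prototype theory: the prototype is the average of all examples
# of a category we have experienced.
prototypes = {cat: examples.mean(axis=0) for cat, examples in observed.items()}

def classify(obj):
    """Assign obj to the category whose prototype it most resembles
    (here, smallest Euclidean distance)."""
    return min(prototypes, key=lambda cat: np.linalg.norm(obj - prototypes[cat]))

print(classify(np.array([1.0, 0.55, 0.85])))  # -> "chair"
```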

Prototype theory may give rise to the representativeness heuristic: when a particular object or event is viewed as similar to the prototype stored in our memory, we classify the object or event into the category represented by that prototype. To go back to the previous example, if your peer closely resembles your prototypical example of a med student, you may place them into that category based on the prototype theory of object and identity recognition. In doing so, however, you commit the representativeness heuristic.

Anchoring and adjustment heuristic

Another heuristic put forth by Kahneman and Tversky in their initial paper is the anchoring and adjustment heuristic.7 This heuristic describes how, when estimating a certain value, we tend to start from an initial value and then adjust it up or down. However, we often get stuck on that initial value (referred to as anchoring), which results in insufficient adjustments. Thus, the adjusted value is biased in favor of the initial value we anchored to.

In one example of the anchoring and adjustment heuristic, Kahneman and Tversky gave participants questions such as “estimate the number of African countries in the United Nations (UN).” A wheel labeled with numbers from 0–100 was spun, and participants were first asked to say whether the number the wheel landed on was higher or lower than their answer to the question. Then, participants were asked to estimate the number of African countries in the UN, independently of the number they had spun. Regardless, Kahneman and Tversky found that participants tended to anchor on the random number obtained by spinning the wheel.8 When the wheel landed on 10, the median estimate given by participants was 25; when the wheel landed on 65, the median estimate was 45.

A 2006 study by Epley and Gilovich, “The Anchoring and Adjustment Heuristic: Why the Adjustments are Insufficient,”9 investigated the causes of this heuristic. They illustrated that anchoring often occurs because the information we anchor to is more accessible than other information. Furthermore, they provided empirical evidence that our adjustments tend to be insufficient because they require significant mental effort, which we are not always motivated to dedicate to the task. They also found that providing incentives for accuracy led participants to make more sufficient adjustments. So, this particular heuristic generally occurs when there is no real incentive to provide an accurate response.

Quick and easy

Though different in their explanations, these three types of heuristics allow us to respond automatically without much effortful thought. They provide an immediate response and do not use up much of our mental energy, which allows us to dedicate mental resources to other matters that may be more pressing. In that way, heuristics are efficient, which is a big reason why we continue to use them. That being said, we should be mindful of how much we rely on them because there is no guarantee of their accuracy.

Why it is important

As illustrated by Tversky and Kahneman, using heuristics can cause us to engage in various cognitive biases and commit certain fallacies.10 As a result, we may make poor decisions and inaccurate judgments and predictions. Defaulting to cognitive heuristics over careful critical thinking can affect every aspect of our lives. Relying too heavily on heuristics can make us vulnerable to misinformation, cause us to misjudge risks, and pass up on promising opportunities. 

Fortunately, awareness of heuristics can aid us in avoiding them, which will ultimately lead us to engage in more adaptive behaviors. Studying heuristics is also crucial for understanding why people tend to make biased judgments that perpetuate stereotypes. If we can learn to recognize when we’re making decisions based on mental shortcuts, we can step in and stop these automatic processes at times when we would benefit from thinking more critically. Consider how the type of decision-making influences whether heuristics are valuable or misleading. Ultimately, the goal is to use heuristics for decisions that don’t require effortful, conscious thought—like deciding what to eat for breakfast—while switching to careful reasoning for high-stakes decisions, such as managing our finances or hiring employees. 

Tversky and Kahneman refer to these two thinking styles as System 1 and System 2 thinking.11 Learning when and how to switch between these can help you to prevent unchecked heuristics from derailing your decision-making. At the same time, it’s also important to be aware of biases that can still pop up when you’re striving for deliberate, rational thought. In the next section, we offer tips for managing heuristics and cognitive biases in both thinking styles.

How to avoid it

Heuristics arise from System 1 thinking: the fast, automatic mode of thought that requires little effort. However, it is a common misconception that errors in judgment can be avoided by relying exclusively on System 2 thinking, the slower, more deliberate system involved in complex decisions and reasoning. As Kahneman points out, neither System 1 nor System 2 is infallible.11 While System 1’s reliance on heuristics can lead to certain biases, System 2 can give rise to others, such as confirmation bias.12 In truth, Systems 1 and 2 complement each other, and using them together can lead to more rational decision-making. That is, we shouldn’t make judgments automatically, without a second thought, but we also shouldn’t overthink things to the point where we’re looking only for specific evidence to support our stance. Heuristics can thus be avoided by making judgments more effortfully, but in doing so, we must actively seek diverse information and evidence instead of relying solely on what is easily accessible and available.

Equally valuable is understanding the conditions under which heuristics are likely to take over our thinking and make us prone to bias. For example, we often default to heuristics when our brains are under significant mental load—like when we’re facing complex decisions or confusing information. Heuristics essentially preserve our brains’ limited computational resources. This is why we’re more likely to switch to System 1 thinking when we’re feeling tired, stressed, or rushed. By recognizing these conditions, you can better spot times when you might be tempted to rely on mental shortcuts to reduce cognitive effort. If you find yourself using heuristics to make decisions or judgments with important consequences, consider taking a break and coming back to the problem at a later time. In other words, “sleeping on it” can actually help you make better decisions.21

FAQ

What is the difference between a heuristic and a cognitive bias?

Heuristics and cognitive biases are closely related but slightly different concepts. While heuristics are mental shortcuts that we use to make fast, efficient decisions, cognitive biases are the resulting systematic errors in thinking that distort our perception of reality. Here’s an example to better illustrate this distinction. Imagine you’re planning a vacation, and you need to book a hotel. You remember that your friend had a bad experience at a certain hotel, so you conclude that the hotel must be bad, even though positive online reviews suggest otherwise. In this case, the heuristic (treating a memorable testimony from a friend as an indicator of quality) resulted in a cognitive bias—you over-relied on memorable information while dismissing broader information, ultimately forming an unfair perception of the hotel. Heuristics often influence how we interpret information and approach decisions in recurring patterns, consistently pushing our decisions in predictable directions. This is why cognitive biases are considered systematic errors in thinking.22

What is a heuristic search?

Heuristics are most often discussed in behavioral and cognitive psychology, but they also pop up in machine learning and artificial intelligence. Heuristic search techniques are problem-solving methods that find a good path from a starting point to a specific goal.23 Unlike an exhaustive search—where an algorithm might explore all possible paths—a heuristic search uses preset patterns, criteria, or rules of thumb to tackle demanding problems more efficiently. In machine learning systems, heuristic functions are essential for reducing the computational resources needed to solve complex problems, focusing on likely solutions rather than searching exhaustively for the optimal one. For example, antivirus software often uses heuristic rules to detect malware, looking for behavioral patterns common to certain viruses. This process works similarly to the way our brains use heuristics, essentially prioritizing efficiency over exhaustive analysis.
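
As an illustration, here is a minimal greedy best-first search sketch in Python; the graph and the distance-to-goal estimates are invented for the example. Rather than exploring every path exhaustively, the algorithm always expands whichever frontier node the heuristic rates closest to the goal, trading guaranteed optimality for speed.

```python
import heapq

# Invented road network: node -> list of neighbors.
graph = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["E"],
    "E": [],
}

# Invented heuristic: estimated distance from each node to the goal "E".
# A heuristic search trusts these estimates instead of checking everything.
h = {"A": 4, "B": 3, "C": 2, "D": 1, "E": 0}

def greedy_best_first(start, goal):
    """Expand whichever frontier node the heuristic rates closest to goal."""
    frontier = [(h[start], start, [start])]
    visited = set()
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor in graph[node]:
            heapq.heappush(frontier, (h[neighbor], neighbor, path + [neighbor]))
    return None  # no path found

print(greedy_best_first("A", "E"))  # -> ['A', 'C', 'E']
```

Like the cognitive shortcuts described above, this search is fast and usually good enough, but nothing guarantees that the path it returns is the best one.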

How it all started

Research on heuristics began with work by computer scientist Herbert Simon in the 1950s.24 Simon was the first to introduce the idea of bounded rationality, challenging the traditional economic assumption that people apply perfect rationality when making decisions. By highlighting these limitations in human decision-making and introducing the idea of heuristics, Simon paved the way for later research in this area.

The first three heuristics – availability, representativeness, and anchoring and adjustment – were identified by Tversky and Kahneman in their 1974 paper, “Judgment Under Uncertainty: Heuristics and Biases.”13 In addition to presenting these heuristics and their relevant experiments, they listed the respective biases each can lead to.

For instance, upon defining the availability heuristic, they demonstrated how it may lead to illusory correlation, which is the erroneous belief that two events frequently co-occur. Kahneman and Tversky made the connection by illustrating how the availability heuristic can cause us to over- or under-estimate the frequency with which certain events occur. This may result in drawing correlations between variables when in reality there are none.  

Kahneman and Tversky also discussed how the representativeness heuristic facilitates the illusion of validity: our tendency to overestimate our accuracy when making probability judgments. The more representative an object or event is, the more confident we feel in predicting certain outcomes. The illusion of validity, as it works with the representativeness heuristic, can be demonstrated by the assumptions we make about others based on past experiences. If you have only ever had good experiences with people from Canada, you will be inclined to judge most Canadians as pleasant. In reality, your small sample size cannot account for the whole population. Representativeness is not the only factor determining the probability of an outcome or event, meaning we should not be so confident in our predictive abilities.

Following Kahneman and Tversky’s exploration of heuristics and biases, psychologist Gerd Gigerenzer introduced yet another perspective on heuristics in the 1990s. He criticized the idea that heuristics always result in systematic errors or irrationality.24 In his critique, Gigerenzer introduced the idea of ecological rationality, suggesting that our decisions can be considered rational and effective if they are adapted to the surrounding environment. 

In other words, heuristics don’t always lead to errors in thinking but can be incredibly valuable in helping us navigate complex environments. Gigerenzer suggested that heuristics serve as a kind of “adaptive toolbox” that helps us make quick and efficient decisions in varying contexts. Much like a real toolbox holds different tools for different tasks, our minds rely on different heuristics to help us through different decision-making situations. For example, the recognition heuristic—where we prioritize the option we recognize—often points us to the correct answer. In a classic experiment, German students unfamiliar with U.S. cities were more likely to assume that San Diego is larger than San Antonio (which is correct) than American students familiar with both cities. This suggests that the recognition heuristic helped students make more accurate judgments with limited knowledge.25

Gigerenzer’s concepts of ecological rationality and the adaptive toolbox contradict the notion that heuristics always lead to errors in thinking or suboptimal decisions. His approach significantly influenced the study of heuristics in psychology, shifting the focus from inherent flaws in human reasoning to valuable cognitive strategies that can be very useful in certain situations. Understanding when heuristics serve as helpful decision-making tools and when they might mislead us continues to be an important research topic to this day. In fact, recent research suggests that CEOs who use heuristics often make more effective decisions than those who carefully analyze information before making choices.26 Rather than trying to eliminate heuristics altogether, it’s important to recognize when heuristics can play an important role in decision-making processes and when they need to be checked by critical thinking.

How it affects product

Heuristics can be useful in product design. Because heuristics are intuitive to us, they can be applied to create a more user-friendly experience that is also more valuable to the customer. For example, color psychology describes how our experiences with different colors and color families can prime certain emotions or behaviors. Taking advantage of the representativeness heuristic, a designer could choose passive colors (blue or green) or more active colors (red, yellow, orange) depending on the goals of the application or product.18 For example, a developer trying to evoke a feeling of calm in an app that provides guided meditations may choose light blues and greens as the program’s primary colors. Colors like red and orange are more emotionally energizing and may be useful in settings like gyms or CrossFit programs.

By integrating heuristics into products, we can enhance the user experience. This idea inspired Jakob Nielsen’s 10 usability heuristics, which are general principles of digital interface design that make products more user-friendly. If an application, device, or item includes features that make it feel intuitive, easy to navigate and familiar, customers will be more inclined to continue to use it and recommend it to others. By appealing to those mental shortcuts, we can minimize the chances of user error or frustration with an overly complicated product.

Heuristics and AI

Artificial intelligence and machine learning tools already use the power of heuristics to inform their output. In a nutshell, simple AI tools operate based on a set of built-in rules, and sometimes heuristics. These are encoded within the system, aiding decision-making and the presentation of learning material. Heuristic algorithms can be used to solve advanced computational problems, providing efficient, approximate solutions. As in humans, the use of heuristics can result in error, so they must be applied with caution. Even so, machine learning tools and AI can be useful in supporting human decision-making, especially when our judgment is clouded by emotion, bias, or irrationality due to our own susceptibility to heuristics.
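
As a sketch of that efficient-but-approximate trade-off, here is the classic nearest-neighbor heuristic for route planning in Python (the city coordinates are invented for the example): instead of evaluating every possible tour, it simply visits the closest unvisited city next, producing a reasonable route quickly but with no guarantee of optimality.

```python
import math

# Invented city coordinates for a delivery route.
cities = {"A": (0, 0), "B": (1, 5), "C": (4, 1), "D": (6, 6)}

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def nearest_neighbor_tour(start):
    """Greedy heuristic: repeatedly hop to the closest unvisited city.
    Fast (roughly n^2 distance checks instead of n! possible tours),
    but the resulting route is approximate, not guaranteed optimal."""
    tour = [start]
    unvisited = set(cities) - {start}
    while unvisited:
        current = cities[tour[-1]]
        nxt = min(unvisited, key=lambda c: dist(current, cities[c]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

print(nearest_neighbor_tour("A"))  # -> ['A', 'C', 'B', 'D']
```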

Example 1 – Advertising

Those in the field of advertising should have a working understanding of heuristics as consumers often rely on these shortcuts when making decisions about purchases. One heuristic that frequently comes into play in the realm of advertising is the scarcity heuristic. When assessing the value of something, we often fall back on this heuristic, leading us to believe that the rarity or exclusiveness of an object contributes to its value.

A 2011 study by Praveen Aggarwal, Sung Yul Jun, and Jong Ho Huh evaluated the impact of “scarcity messages” on consumer behavior. They found that both “limited quantity” and “limited time” advertisements influence consumers’ intentions to purchase, but “limited quantity” messages are more effective. This explains why people get so excited over the one-day-only Black Friday sales, and why the countdowns of units available on home shopping television frequently lead to impulse buys.14

Knowledge of the scarcity heuristic can help businesses thrive, as “limited quantity” messages make potential consumers competitive and increase their intentions to purchase.15 This marketing technique can be a useful tool for bolstering sales and bringing attention to your business.

Example 2 – Stereotypes

One of the downfalls of heuristics is that they have the potential to lead to stereotyping, which is often harmful. Kahneman and Tversky illustrated how the representativeness heuristic might result in the propagation of stereotypes. The researchers presented participants with a personality sketch of a fictional man named Steve, followed by a list of possible occupations. Participants were tasked with ranking the likelihood of each occupation being Steve’s. Since the personality sketch described Steve as shy, helpful, introverted, and organized, participants tended to indicate that it was probable he was a librarian.16 In this particular case, the stereotype is less harmful than many others; however, it accurately illustrates the link between heuristics and stereotypes.

Published in 1989, Patricia Devine’s paper “Stereotypes and Prejudice: Their Automatic and Controlled Components” illustrates how, even among people who are low in prejudice, rejecting stereotypes requires a certain level of motivation and cognitive capacity.17 We typically use heuristics in order to avoid exerting too much mental energy, specifically when we are not sufficiently motivated to dedicate mental resources to the task at hand. Thus, when we lack the mental capacity to make a judgment or decision effortfully, we may rely upon automatic heuristic responses and, in doing so, risk propagating stereotypes.

Stereotypes are an example of how heuristics can go wrong. Broad generalizations do not always apply, and their continued use can have serious consequences. This underscores the importance of effortful judgment and decision-making, as opposed to automatic responding.

Summary

What it is

Heuristics are mental shortcuts that allow us to make quick judgment calls based on generalizations or rules of thumb.

Why it happens

Heuristics, in general, occur because they are efficient ways of responding when we are faced with problems or decisions. They come about automatically, allowing us to allocate our mental energy elsewhere. Specific heuristics occur in different contexts; the availability heuristic happens because we remember certain memories better than others, the representativeness heuristic can be explained by prototype theory, and the anchoring and adjustment heuristic occurs due to lack of incentive to put in the effort required for sufficient adjustment.

Example 1 – Advertising

The scarcity heuristic, which refers to how we value items more when they are limited, can be used to the advantage of businesses looking to increase sales. Research has shown that advertising objects as “limited quantity” increases consumers' competitiveness and their intentions to buy the item.

Example 2 – Stereotypes

While heuristics can be useful, we should exert caution, as they are generalizations that may lead us to propagate stereotypes ranging from inaccurate to harmful.

How to avoid it

Putting more effort into decision-making instead of making decisions automatically can help us avoid heuristics. Doing so requires more mental resources, but it will lead to more rational choices.

Related TDL articles

What are Heuristics?

This interview with The Decision Lab’s Managing Director Sekoul Krastev delves into the history of heuristics, their applications in the real world, and their consequences, both positive and negative.

10 Decision-Making Errors that Hold Us Back at Work

In this article, Dr. Melina Moleskis examines the common decision-making errors that occur in the workplace. Covering everything from taking in customer feedback to cracking the problems of on-the-fly decision-making, Dr. Moleskis delivers workable solutions that anyone can implement.

Sources

  1. Gilovich, T., Keltner, D., Chen, S., & Nisbett, R. (2015). Social Psychology (4th ed.). W. W. Norton & Company.
  2. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.
  3. See 2
  4. See 2
  5. See 2
  6. Mervis, C. B., & Rosch, E. (1981). Categorization of natural objects. Annual Review of Psychology, 32(1), 89–115. https://doi.org/10.1146/annurev.ps.32.020181.000513
  7. See 2
  8. See 2
  9. Epley, N., & Gilovich, T. (2006). The anchoring-and-adjustment heuristic: Why the adjustments are insufficient. Psychological Science, 17(4), 311–318.
  10. See 2
  11. System 1 and System 2 Thinking. The Marketing Society. https://www.marketingsociety.com/think-piece/system-1-and-system-2-thinking
  12. See 11
  13. See 2
  14. Aggarwal, P., Jun, S. Y., & Huh, J. H. (2011). Scarcity messages. Journal of Advertising, 40(3), 19–30.
  15. See 14
  16. See 2
  17. Devine, P. G. (1989). Stereotypes and prejudice: Their automatic and controlled components. Journal of Personality and Social Psychology, 56(1), 5–18. https://doi.org/10.1037/0022-3514.56.1.5
  18. Kuo, L., Chang, T., & Lai, C.-C. (2022). Research on product design modeling image and color psychological test. Displays, 71, 102108. https://doi.org/10.1016/j.displa.2021.102108
  19. Murata, A., Nakamura, T., & Karwowski, W. (2015). Influence of cognitive biases in distorting decision making and leading to critical unfavorable incidents. Safety, 1(1), 44–58. https://doi.org/10.3390/safety1010044

About the Authors

Dan Pilat

Dan is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. Dan has a background in organizational decision making, with a BComm in Decision & Information Systems from McGill University. He has worked on enterprise-level behavioral architecture at TD Securities and BMO Capital Markets, where he advised management on the implementation of systems processing billions of dollars per week. Driven by an appetite for the latest in technology, Dan created a course on business intelligence and lectured at McGill University, and has applied behavioral science to topics such as augmented and virtual reality.

Dr. Sekoul Krastev

Sekoul is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. A decision scientist with a PhD in Decision Neuroscience from McGill University, Sekoul's work has been featured in peer-reviewed journals and has been presented at conferences around the world. Sekoul previously advised management on innovation and engagement strategy at The Boston Consulting Group as well as on online media strategy at Google. He has a deep interest in the applications of behavioral science to new technology and has published on these topics in places such as the Huffington Post and Strategy & Business.
