The Basic Idea

Imagine you are conducting a job interview for an administrative assistant role at your real estate company. The following two candidates are presented to you:

Josh has dyed his hair bleach blond and has a long beard. He has a tattoo sleeve and a nose piercing. He speaks with a heavy accent.

Amelie has long brown hair. She is wearing a little bit of makeup, but no extreme colors. She is well dressed and speaks eloquently.

Based on these two descriptions, which candidate would you hire? Your initial instinct is likely to hire Amelie. Yet, you have absolutely no information on what kind of experience either candidate has – so how did you come to make that choice?

It is likely that you relied on a stereotype. Stereotypes are preconceived ideas about a person based on what people from a similar group are ‘typically’ like.1 They cause us to make snap judgments based on a few characteristics – such as the assumption that dyed hair, tattoos, and piercings make someone a worse candidate for an administrative role. A gender stereotype may also have led you to believe that a woman is better suited to a secretarial position.

While stereotypes can lead to faulty conclusions and discriminatory practices, they do exist for a reason. If we aren’t presented with enough information, is relying on generalizations an appropriate response? Might stereotypes have certain advantages?

The single story creates stereotypes, and the problem with stereotypes is not that they are untrue, but that they are incomplete. They make one story become the only story.

– Nigerian writer Chimamanda Ngozi Adichie, in her TED talk “The Danger of a Single Story”2


Key Terms

(Unconscious) Bias: Our prejudice against or in favor of someone or something. These prejudices, or preconceived beliefs, can be both conscious and unconscious (in other words, explicit and implicit).3 They are conclusions drawn from faulty judgments about particular characteristics.

Social Categorization: Our tendency to think of individuals in terms of their group membership. This usually involves viewing people in dichotomous terms (e.g. either as ‘old’ or ‘young’, ‘male’ or ‘female’ etc.). As a result, we interact with these people based on their group identity instead of their individual identity.4

Heuristics: Mental shortcuts that help us make judgment calls and estimate probabilities. These often lead to stereotyping, as we quickly make judgments based on minimal information to avoid exerting too much mental energy.

Correlation: An association between two sets of data.

Schema: A mental structure that helps an individual organize and interpret knowledge and make decisions.5 A stereotype is a kind of schema.


As stereotypes encompass a wide range of ideas and are involved in many different cognitive biases, it is difficult to pinpoint exactly when they first became a point of interest in behavioral science. Classifying people into categories dates back at least to the ancient Greek philosopher Aristotle, who tried to sort the objects of human perception into one of ten categories.7

However, one of the first theories about stereotypes came from Walter Lippmann, an American political reporter and commentator. In 1922, Lippmann wrote a book entitled Public Opinion that asked (and answered) an important question: can people achieve a basic understanding of political affairs given their limited exposure to the necessary information?8

His answer was essentially no, we cannot, because people do not always process information correctly, or in its entirety. We come to information with preconceived notions and allow signs in our environment to stand in for complex ideas. For example, he stated that “we do not much see this man and that sunset; rather we notice that the thing is man or sunset, and then see chiefly what our mind is already full of on those subjects.”9 Lippmann recognized that humans do not approach information as a tabula rasa (a blank slate). Instead, we pinpoint a particular, sometimes singular, trait in an individual and fill in the rest of our mental picture of that person based on categories we already hold.

Some of the most notable research into stereotypes came from Daniel Kahneman and Amos Tversky in the 1970s. They suggested that people make judgments based on representativeness. We tend to compare objects or people to an existing prototype in our minds and then use their degree of similarity to that prototype to estimate the likelihood of an event.10 They called this cognitive bias the representativeness heuristic.


Eleanor Rosch

Just after Kahneman and Tversky conducted their research on the representativeness heuristic, Eleanor Rosch proposed prototype theory in 1975. She showed that people often think of categories in terms of exemplars, or ‘prototypes’. The prototype is the best example of a particular category, which we use to define other examples in relation to it.11 For example, if we think about the category of birds and are given a choice between flamingos and pigeons, we are likely to think that pigeons more aptly represent the category of ‘bird’, even though both accurately belong in that category.7

Loren & Jean Chapman

Loren and Jean Chapman demonstrated that people tend to perceive a connection between two variables even when no such relationship exists.12 For example, people might have a pair of lucky socks that they think helps them win basketball games. The Chapmans named this phenomenon illusory correlation. This faulty belief can lead to stereotyped behavior, as a person might continue to enact a behavior (like wearing “lucky” socks) despite the fact that it does not actually help them navigate their environment (or win basketball games). In this way, the behavior is similar to stereotyping in that it uses a single example or instance to craft an overly simplistic and generalized narrative.
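The Chapmans’ finding can be illustrated with a small simulation (a hypothetical sketch; the variable names and numbers are illustrative, not from their study): even when wins are generated completely independently of the socks worn, focusing on the memorable “socks + win” games can make a relationship seem real.

```python
import random

random.seed(0)

# Simulate 1,000 games in which winning is independent of the socks worn:
# each game records (wore_lucky_socks, won), both fair coin flips.
games = [(random.random() < 0.5, random.random() < 0.5) for _ in range(1000)]

wins_with = sum(won for socks, won in games if socks)
n_with = sum(socks for socks, _ in games)
wins_without = sum(won for socks, won in games if not socks)
n_without = sum(not socks for socks, _ in games)

p_with = wins_with / n_with           # win rate when wearing the socks
p_without = wins_without / n_without  # win rate without them

# Both rates hover near 0.5, so there is no real correlation. An illusory
# correlation arises when we attend only to the vivid (socks, win) games
# and ignore the other three cells of this 2x2 table.
```

Because the two win rates come out nearly identical, any felt connection between the socks and winning has to be supplied by selective attention, not by the data.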

Peter Wason

Cognitive psychologist Peter Wason was the first to describe the confirmation bias, in 1960. He gave participants a mathematical riddle, asking them to find a rule that applied to the series “2, 4, 6.” To discover the rule, they were asked to show Wason other sets they thought satisfied it. Wason found that participants only showed him sets that supported their hypothesis and did not propose sequences that might disprove it. They searched for information that would confirm their preexisting ideas and ignored information that didn’t.13 The confirmation bias reinforces stereotypes: once formed, people tend to seek out information that supports their preconceived notions and categories instead of allowing contradictory information to dissolve them.
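Wason’s task is concrete enough to sketch in code. In this minimal illustration (the function names and test sequences are mine, following his setup), a participant who only proposes sequences that fit their own guess never discovers that the real rule is broader:

```python
def true_rule(seq):
    """Wason's actual rule: any strictly ascending sequence."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def participant_guess(seq):
    """A typical hypothesis about "2, 4, 6": each number increases by two."""
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

# Confirmatory testing: every probe is chosen to fit the guess, every probe
# passes, and the (too narrow) hypothesis is never challenged.
for probe in [(2, 4, 6), (10, 12, 14), (1, 3, 5)]:
    assert participant_guess(probe) and true_rule(probe)

# A disconfirming probe: it violates the guess but satisfies the true rule,
# producing exactly the evidence confirmatory testing never generates.
assert true_rule((1, 2, 3)) and not participant_guess((1, 2, 3))
```

The last line is the crucial move Wason’s participants rarely made: testing a case they expected to fail is the only way to learn that the rule is simply “any ascending sequence.”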

Claude Steele & Joshua Aronson

In 1995, psychologists Claude Steele and Joshua Aronson identified a dangerous effect of stereotypes: the stereotype threat. The stereotype threat is at work when someone feels at risk of confirming a negative stereotype about their own social group, especially if prompted to think about one in particular. For example, women who were asked to identify their gender before completing a math exam did worse than women who were not, because they were prompted to consider the stereotype that women are worse at math.14


Stereotypes can be oversimplified, biased pathways through which we place people and ideas into discrete categories. Our tendency to rely on stereotypes means that we are quick to judge a book by its cover and arrive at conclusions that may not be warranted. Even if our stereotypes presuppose positive characteristics, they still blind us to complex personhood and reduce people to a single story that we come to understand as truth.

Stereotypes often cause us to conflate correlation and causation. While data may reveal a correlation between a particular race or gender and a particular characteristic, that isn’t to say that the race or gender causes the characteristic. For example, one common stereotype is that Asian people are good at math. There might be a correlation between the racial category and mathematical prowess, but the category does not necessitate the skill. In particular, other factors may be at play, such as a cultural emphasis on mathematics.

More often than not, stereotyping reduces people into singular negative perceptions. We end up feeling prejudiced towards a particular group or category. Unfortunately, this can lead some people to treat others in a particular group – usually one that is different from their own group – in a discriminatory manner.

Two of the most prominent areas in which stereotypes lead to prejudice and discrimination are racial and gender stereotypes. There exists a great deal of racial inequity when it comes to unemployment, health, and wealth, and stereotypes contribute to the growing gap. For example, Indigenous peoples are often stereotyped as lazy, so recruiters may shy away from hiring an Indigenous person. The fact that Indigenous peoples then cannot get a job reinforces the existing stereotype that they are not hard workers, when it is in fact the stereotype preventing them from working in the first place. Stereotypes are often formed in collective thought, not in individual perceptions, which can lead to systemic discrimination against particular demographic groups.

Gender stereotypes also cause people to hold fixed notions of women’s and men’s places within society. Although notions of gender roles are less stringent in the modern day, gender bias still exists. Male scientists are often seen as more trustworthy than female scientists, granting them an imbalance of authority. Additionally, similar to racial stereotypes, gender stereotypes might dissuade women from entering a field they don’t think they will succeed in, which then contributes to the stereotype that the field is reserved for men. Stereotypes can therefore be self-fulfilling prophecies; they are not only opinions we hold about others, but opinions we come to believe about ourselves.


Although stereotypes can cause us to discriminate against particular people or groups without getting to know them first, stereotyping can also sometimes be a necessary skill. It is almost impossible to gather all the information that exists about the objects and people we encounter. Our ability to make decisions is restricted by time, brain capacity, and available information, a phenomenon known as bounded rationality.

Although stereotypes can be dangerous when wrong, they can also be a method through which we efficiently categorize information and free up brain processing power.15 We are exposed to so much data every day that we have to rely on heuristics to navigate our environment.

In his book Thinking, Fast and Slow, Kahneman, who has done a great deal of research on stereotypes, even suggests that stereotypes can be beneficial. He uses the example of asking which cab company you would take in a city that has two (Green and Blue). If the two had the same number of vehicles but you knew that 85% of cab accidents involved the Green company, you might form a stereotype that Green cab drivers are worse and decide to go with Blue. That stereotype might save your life.15
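The arithmetic behind Kahneman’s cab example is simple enough to spell out (a sketch with made-up fleet sizes; only the 85% figure comes from the example above): with equal fleets, the skewed accident share translates directly into a higher per-cab accident rate for Green.

```python
# Equal fleets, so the chance of encountering each company is the same.
green_fleet = blue_fleet = 100
total_accidents = 100

green_accidents = 0.85 * total_accidents  # 85% involve Green (from the example)
blue_accidents = 0.15 * total_accidents

# Accidents per cab: the "Green drivers are worse" stereotype tracks this gap.
green_rate = green_accidents / green_fleet  # 0.85 accidents per cab
blue_rate = blue_accidents / blue_fleet     # 0.15 accidents per cab

# Note: with unequal fleets the comparison would need base rates too. A company
# operating 85% of the cabs would be expected to have 85% of the accidents
# even if its drivers were no worse than anyone else.
```

The closing comment is the catch: the stereotype is only informative here because the fleets are equal, so the accident share genuinely reflects driver quality rather than mere exposure.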

If stereotypes are grounded in reliable evidence, then ignoring them can be likened to ignoring hard facts – and ignoring salient information is arguably irrational. It seems that stereotyping itself is not inherently bad; rather, when taken to extremes, it can presume the inferiority of particular social groups, leading to discrimination and disadvantage.

COVID-19 May Worsen Biases

There are few areas of our lives untouched by the COVID-19 pandemic. Unfortunately, the unprecedented times have reinforced and worsened existing stereotypes and biases.

For one, the virus has disproportionately impacted marginalized groups: Black people are dying at three times the rate of white people. The pandemic has also brought to light the fact that only 5% of doctors in the U.S. are Black, which has severe consequences for healthcare, as data shows that same-race providers have a positive impact on patients’ health outcomes.16

Enforcement of pandemic rules has also reinforced existing stereotypes about people of color (POC), namely that they do not follow rules. At the beginning of the pandemic, from March through May, almost 98% of the people arrested for social distancing violations were POC. It is clear that police unfairly discriminate against POC, both as a result of overt racism and as a result of implicit bias and subconscious stereotypes.17

There has also been a devastating increase in anti-Asian discrimination throughout the pandemic. Fueled in part by former President Trump’s labeling of the virus as the “Chinese Virus,” rhetoric surrounding COVID-19 has perpetuated the idea that Asian people are dirty and carry the virus. This misconception has led to hate speech, violent acts, and racist abuse against Asian people.18

As these stereotypes further concretize in the minds of the population during the pandemic, their effects will be felt long after its conclusion. An already existent racial gap in the labor market is likely to widen for POC, as COVID-induced biases affect and influence the hiring process.19 The COVID-19 pandemic has revealed the worst impacts of racist stereotypes.

Stereotypes Can Inform Laws

Law professor Dr. Dale Nance argued in his book The Burdens of Proof that complex decision-making sometimes requires the use of stereotypes.20 Although we tend to attach a negative connotation to stereotyping, Nance argues that there can be value in generalizations, even if they are not universally valid.

For example, age requirements are a useful stereotype that informs our laws and legislation. Most people are likely to accept that a 14-year-old should not be driving a car – but this is based on a stereotype that children of that age are not competent to do so. There are likely a handful of 14-year-olds out there who could safely drive; yet that isn’t a reason to get rid of the age requirement. Nance argues that in such instances, the cost of a stereotype that may not accurately represent each individual in that category is much smaller than the cost of letting every 14-year-old drive.20

What Nance’s view suggests is that sometimes, when it comes to some stereotypes, we’re better off safe than sorry.

Related TDL Resources

Professional Women and Stereotypes: Moving Past Them

Although we may have progressed from a deep-seated belief that women belong in the kitchen, women still face challenges in professions and fields previously reserved for men. In this article, our writer Aisha Kan outlines three damaging stereotypes that persist today: that women lack the required skills for particular jobs; that women are less dedicated to their work; and that women are primarily responsible for caring for their children and will therefore focus less on work.

I Think I Am, Therefore I Am

Stereotypes do not always have negative effects. If a stereotype elevates your self-perception or the way others perceive you, it can be beneficial. The famous saying ‘fake it until you make it’ suggests that if we choose to inauthentically display certain characteristics, we might end up authentically embodying them. In this article, our writer Andrew Lewis looks at how wearing clothing we associate with an important profession (like a lab coat) can increase our confidence and therefore make us more successful.


  1. Oxford Reference. (n.d.). Stereotype. Retrieved March 9, 2021, from
  2. Adichie, C. N. (2009). The Danger of a Single Story [Video]. Facing History and Ourselves.
  3. Chalaby, O. (2018, April 27). Humans are blinded by biases. But behavioural science can fix that. Apolitical.
  4. Social Categorization and Stereotyping. (2014, September 26). BCcampus Open Textbooks Faculty.
  5. Cherry, K. (2019, September 23). The Role of a Schema in Psychology. Verywell Mind.
  6. Stereotyped behavior. (n.d.). Science Direct. Retrieved March 9, 2021, from
  7. The Decision Lab. (2021, January 27). Representativeness heuristic
  8. Illing, S. (2018, December 20). Intellectuals have said democracy is failing for a century. They were wrong. Vox.
  9. Popova, M. (2019, August 4). The antidote to prejudice: Walter Lippmann on overriding the mind’s propensity for preconceptions. Brain Pickings.
  10. Cherry, K. (2021, January 18). What Is the Representativeness Heuristic? Verywell Mind.
  11. (2017, July 14). Prototype theory
  12. The Decision Lab. (2021, January 22). Illusory correlation
  13. The Decision Lab. (2020, August 24). Confirmation bias
  14. Smets, K. (2018, July 24). There Is More to Behavioral Economics Than Biases and Fallacies. Behavioral Scientist.
  15. Pomeroy, S. (2013, February 26). Don’t Be Afraid to Stereotype Strangers. Real Clear Science.
  16. American Medical Association. (2020, October 7). Impact of COVID-19 on minoritized and marginalized communities
  17. Southall, A. (2020, May 7). Scrutiny of Social-Distance Policing as 35 of 40 Arrested Are Black. The New York Times.
  18. Wingfield, A. H. (2020, May 14). The Disproportionate Impact of Covid-19 on Black Health Care Workers in the U.S. Harvard Business Review.
  19. Ramalingam, S. (2020, September 11). COVID-19 May Worsen Biases During The Hiring Process. Here’s How That Can Be Avoided. The Decision Lab.
  20. Peterson, E. (n.d.). WHY NOT ALL STEREOTYPES ARE BAD: Using Generalizations to Help Make Better Decisions. Case Western Reserve University.

About the Authors


Dan Pilat

Dan is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. Dan has a background in organizational decision making, with a BComm in Decision & Information Systems from McGill University. He has worked on enterprise-level behavioral architecture at TD Securities and BMO Capital Markets, where he advised management on the implementation of systems processing billions of dollars per week. Driven by an appetite for the latest in technology, Dan created a course on business intelligence and lectured at McGill University, and has applied behavioral science to topics such as augmented and virtual reality.


Dr. Sekoul Krastev

Sekoul is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. A decision scientist with a PhD in Decision Neuroscience from McGill University, Sekoul's work has been featured in peer-reviewed journals and has been presented at conferences around the world. Sekoul previously advised management on innovation and engagement strategy at The Boston Consulting Group as well as on online media strategy at Google. He has a deep interest in the applications of behavioral science to new technology and has published on these topics in places such as the Huffington Post and Strategy & Business.
