Curse of Knowledge
What is the Curse of Knowledge?
The curse of knowledge, also known as the curse of expertise, is a cognitive bias in which we assume that the people we are talking to have the same level of understanding as we do on a given subject. This often creates a barrier to effective knowledge sharing, since we misjudge what the other party already knows.
The Basic Idea
People often say, “knowledge is power.” But what happens when we all have different levels of knowledge?
The “curse of knowledge,” or “the curse of expertise,” is a cognitive bias where we incorrectly assume that everyone knows as much as we do on a given topic.1 When we know something, it can be hard to imagine what it would be like not knowing that piece of information. In turn, this makes it difficult to share our knowledge because we struggle to understand the other party’s state of mind. The curse of knowledge can profoundly affect how information is communicated, particularly by experts and educators. These individuals often struggle to convey complex concepts effectively because they have difficulty imagining what it’s like for others to lack the depth of knowledge they possess.
"Lots of us have expertise in particular areas. Becoming an expert in something means that we become more and more fascinated by nuance and complexity. That’s when the curse of knowledge kicks in, and we start to forget what it’s like not to know what we know."
– Chip and Dan Heath, authors of Made to Stick: Why Some Ideas Survive and Others Die
Key Terms
Hindsight bias: Also referred to as the knew-it-all-along phenomenon or creeping determinism, the hindsight bias is the tendency for individuals to perceive past events as having been more predictable than they actually were.
Homo economicus: A hypothetical, idealized person who makes rational decisions aimed at maximizing personal utility or profit. This model assumes that individuals always act in their own best interest, are fully informed, and have the ability to make decisions that are logically consistent and based on clear preferences.
Asymmetric information: A situation in which one party in a transaction has more or better information than the other party. Asymmetric information is a critical concept in economics, as it can distort markets and hinder efficient exchanges.
Dunning-Kruger effect: A cognitive bias where people with low ability or knowledge in a particular domain overestimate their competence, while experts tend to underestimate their own. This occurs because those with limited expertise lack the self-awareness to recognize their own shortcomings.
Inhibition: In the context of the curse of knowledge, this refers to our inability to suppress our own extensive knowledge of a subject when trying to communicate it to someone who is less informed.12
Fluency misattribution: The ease with which information comes to mind influences judgments about how widely that information is known. In other words, we tend to project our understanding, or simply what we think is common knowledge, onto others.12
History
In 1975, American psychologist Baruch Fischhoff published Hindsight ≠ Foresight: The Effect of Outcome Knowledge on Judgement Under Uncertainty.2 Fischhoff developed a method to examine hindsight bias, a cognitive bias where an event’s outcome seems more predictable after we know what happened.
Fischhoff’s method consisted of presenting participants with four possible outcomes to a short story.2 Some participants were told which one of the four outcomes was true; other participants were not given any information. Then, all participants were asked to determine the likelihood of each outcome. Fischhoff found that when a participant was told an outcome was true, they frequently assigned a higher probability to that outcome. On top of overestimating the probability of outcomes for which they had extra information, participants also failed to reconstruct their prior, less knowledgeable states of mind.
There are several famous real-world examples of hindsight bias. During the investigation into the Challenger space shuttle disaster in 1986, for example, the failure of the O-ring seal appeared obvious to the investigators based on what had happened. However, in the lead-up to the launch, the anomalies in the seals were not obvious. Based on what the shuttle engineers knew and the information that was available to them at the time, they acted in line with the prevailing safety processes.11
Stemming from Fischhoff’s work on hindsight bias, the term “curse of knowledge” was first used in the 1989 article The Curse of Knowledge in Economic Settings: An Experimental Analysis by economists Colin Camerer, George Loewenstein, and Martin Weber.3 They credited British-American psychologist Robin Hogarth with coining the term, and they explored the curse of knowledge in the context of economic transactions.
Their study observed that different economic agents have different amounts of knowledge.3 Sellers tend to know more about the value of their products than prospective buyers; workers tend to know more about their skills than prospective employers. Crucially, Camerer, Loewenstein, and Weber argued that the curse of knowledge perpetuates this informational imbalance even when an agent wants to convey what they know. They also argued that this unintentional imbalance had two consequences:
- Better-informed agents might suffer losses—having more information might hurt us! Sellers might overprice products, assuming buyers recognize their value, or investors may misjudge market reactions, leading to poor decisions.
- The curse of knowledge can mitigate market consequences that result from information asymmetry. Well-informed agents may unintentionally share their advantage by lowering prices or making fairer deals, reducing the typical market inefficiencies caused by uneven information.
Following Camerer, Loewenstein, and Weber’s work, Elizabeth Newton, a graduate student in psychology at Stanford in 1990, developed an experiment that is now a classic example of the curse of knowledge.1,4 She asked one group of participants (the “tappers”) to tap out the rhythms of popular songs with their fingers, while another group (the “listeners”) tried to identify the songs. She also asked the tappers to predict how many listeners would guess each melody correctly.
In a sample of 120 melodies, listeners got it right only 2.5% of the time.1 The tappers had predicted a 50% success rate: they grossly overestimated how well the listeners could guess due to the curse of knowledge. Once they were given a song to tap, they couldn’t help but hear the melody that their tapping was based on, so they assumed the listeners would also hear the melody.4 In reality, all the listeners heard was a random series of taps.1
The curse of knowledge was popularized in the 2007 book Made to Stick: Why Some Ideas Survive and Others Die.10 In it, brothers Chip and Dan Heath explore the concept of “stickiness”: making ideas memorable and interesting. Their claim is that, by making ideas sticky, we can avoid the curse of knowledge: they become so memorable that we never forget them in the first place. If our memory of a given choice is vivid enough, we are less prone to reevaluating that choice purely in light of what we know now.
People
Baruch Fischhoff
Renowned psychologist and behavioral scientist known for his foundational work on hindsight bias, a cognitive phenomenon where people perceive past events as more predictable than they actually were. In his 1975 paper Hindsight ≠ Foresight, Fischhoff demonstrated how knowing the outcome of an event can distort individuals’ recollections of their prior judgments and their perception of the event’s inevitability.
Robin Hogarth
British-American psychologist and professor credited with coining the term ‘curse of knowledge’ in the 1980s. Hogarth’s main body of work centered on the psychology of judgment and human decision-making.
Colin Camerer
American behavioral economist whose work focuses on improving the economic analysis of decision-making, gaming, and financial markets. Camerer is particularly known for his work on game theory, as summarized in his 2003 book Behavioral Game Theory: Experiments in Strategic Interaction.
George Loewenstein
American educator and economist (and great-grandson of Sigmund Freud!) who is widely regarded as one of the early founders of the fields of behavioral economics and neuroeconomics. From applying psychology to economics and then economics to psychology, Loewenstein has explored the intersection of psychology and economics from almost every angle. One of his most renowned ideas is the "hot-cold empathy gap", which explains how people in a "cold" (rational) state struggle to predict how their emotions or impulses will influence their decisions when they are in a "hot" (emotional or aroused) state, and vice versa.
Martin Weber
German professor in the field of behavioral economics, decision-making, and psychology. His research focuses on understanding and modelling the human psyche in financial situations.
Elizabeth Louise Newton
Former psychology graduate student at Stanford University, most renowned for her "tapper and listener" experiment, which demonstrated the curse of knowledge.
Impacts
Camerer, Loewenstein, and Weber discussed the implications of the curse of knowledge for economics.3 The first type of situation where the curse of knowledge might be important is in cases of asymmetric information. When people have private information that less informed people lack, the traditional economic assumption is that the better-informed people, behaving as homo economicus, will optimally exploit their informational advantage.
However, the study found the opposite. A homo economicus with an information advantage would make the same offer regardless of the monetary amount to be divided; that way, they could maximize their profits. Yet, Camerer, Loewenstein, and Weber found that people present a larger offer when the amount to be divided is larger. Well-informed people simply assume everyone has the same information, so they make a fairer offer because they think everyone knows what they know. In this situation of informational imbalance, the curse of knowledge yields a fairer deal.
The second situation where the curse of knowledge might be important is when people learn more over time and have to reconstruct their earlier perspectives.3 Here, the curse of knowledge interacts with another bias (hindsight bias). When we’re evaluating our decisions in the past, we tend to mistakenly assume that we knew then what we know now. So, we think that an unfavorable outcome was known to be unfavorable, and that a favorable outcome was always known to be favorable. The curse of knowledge, along with hindsight bias, leads us to judge previous choices according to our current knowledge, even when we didn’t have that information at the time.
A third situation where the curse of knowledge is at play is in teaching, especially for engineering and science in higher education.4 While instructors have extensive expertise in their fields, there tends to be a disconnect between what they understand and what their students understand. Some credit this disconnect to instructors’ lack of training and experience in teaching, their preoccupation with research, or simply a lack of concern for their students’ understanding.5
The curse of knowledge gives us a more straightforward (and more charitable) explanation. While expertise in a field can increase instructors’ confidence in their ability to teach, they struggle to deliver the material in a way that fits what their students know.6 Cognitive scientist Steven Pinker identified the following issues in how instructors in higher education deliver content:7
- Abstract language is used for already complex topics;
- There can be clumsy transitions between related topics, making the connection between them opaque;
- Instructors often use so-called zombie nouns instead of verbs or adjectives (verb + ization, e.g., operationalization); and,
- There can be inadequate interpretations of external sources, further confusing students.
The curse of knowledge also has implications for our everyday lives.9 Many people write to-do lists to keep themselves organized, but sometimes those lists look less like a comprehensive plan and more like a collection of scattered sticky notes. When we lose or forget one of the items on our to-do list and then reencounter it months later, the curse of knowledge means we can no longer reconstruct why that item needed to be done. So, a to-do list item becomes a pointless series of words scrawled on a page.
Overcoming the knowledge gap
Generally speaking, being well-educated and knowledgeable sets you up well for life. For some, knowing more makes them feel more confident. Yet, possessing significantly more information or expertise than others can create a knowledge gap that is harmful in many situations. The knowledge gap can lead to communication imbalances, which in turn cause frustration, confusion, misunderstanding, and disappointment.
Obviously, we can’t ‘unknow’ what we already know—and we don’t necessarily want to. Can you remember what it’s like not to know how to drive or to speak your mother tongue? As we go through life, we gain knowledge which seems ‘innate’ to us over time. But, what we view as innate knowledge is not necessarily shared knowledge. So, how can we overcome the curse of knowledge?
Get to know your audience
For starters, we need to reflect on our relative expertise or privileged knowledge of a topic and compare that level with that of those around us. Having a chat with others to gauge their existing knowledge of a topic or a problem can help us put ourselves in other people’s shoes and then tailor our communications to a less-informed perspective. Effective teachers do this on a daily basis — they predict the issues and the misconceptions that their students may face when learning something new, and they adapt their teaching to account for this.
Simplify
The old rule of thumb that ‘less is more’ is an important principle to consider when communicating complex ideas to other people. By starting with simplicity, we allow room for further elaboration if our audience grasps what we’re talking about and wants to know more. However, if we overwhelm our audience with complexity from the outset, they may struggle to grasp the fundamentals, leaving them confused and disengaged.
Encourage learning
In organizational settings, the impact of the curse of knowledge can be reduced by promoting a culture of continuous learning. Ongoing education and professional development ensures that knowledge gaps are filled and that all members stay informed on key topics. That’s not to say that everyone will become an expert, but the knowledge gap can be diminished significantly.
Case Study
The curse… without knowledge?
Ironically, what exactly causes the curse of knowledge is still a mystery. For years, researchers have been exploring the proposed mechanisms that underlie the bias, two of which are inhibition and fluency misattribution.
The inhibition theory argues that people have difficulty fully suppressing the content of their knowledge when trying to convey it to someone with a less informed perspective. Fluency misattribution, on the other hand, suggests that when something feels easy to remember or recognize (because we've seen it before), we mistakenly think that it must also be easy for everyone else to know. Instead of realizing that the feeling comes from our past exposure, we assume that the information is just naturally easy or obvious.
One study explored whether fluency alone (without actual knowledge) can induce the curse of knowledge bias. The researchers tested the two mechanisms (inhibition and fluency misattribution) across three experiments with 359 undergraduate students.12 In the first experiment, participants learned new facts and were later tested on their recall. Over time, some of the facts were forgotten, creating three distinct conditions. The first, the “known condition,” included facts that participants had learned and successfully remembered. The second, the “fluent condition,” consisted of facts that participants had been exposed to but whose answers they had since forgotten. Finally, the “unknown condition” consisted of facts that participants had never encountered before. Participants overestimated how many peers knew the facts, even when they had forgotten the answers themselves, suggesting that fluency misattribution influenced their judgments.
Experiment two replicated these findings with a longer delay (two weeks), increasing the likelihood that participants would forget the facts. Even without recalling the answers, participants still believed previously seen questions were more widely known, further evidence that familiarity with a question alone, without recall of the actual answer, was enough to inflate estimates of how widely the information was known.
In the third experiment, participants were shown trivia questions multiple times but never given the answers. They were then asked to estimate how many of their peers would know the correct answers. Participants judged the questions they had seen more frequently as more widely known, even though they had never learned the answers themselves: mere familiarity with a question made them assume that others would be more likely to know the information.
The authors concluded that fluency misattribution, rather than a failure of inhibition, is sufficient to induce the curse of knowledge. Even when participants lacked knowledge, they overestimated how commonly known the information was among their peers because of the fluency with which the question itself was processed. The study therefore not only provided evidence for one of the primary mechanisms of the curse of knowledge, but also demonstrated that the bias can occur even when we don’t have knowledge of a particular event or subject.
Faculty development strategies
In light of the discussion around the curse of knowledge in higher education, researchers have explored potential strategies to increase awareness of this cognitive bias among instructors, and close the teacher-student gap.4 In line with the Heath brothers’ suggestion that sticky messages can be used to combat the curse of knowledge,10 the following principles have been suggested for instructors:4
- Simplicity: Break content into sections and distill it into concise, accessible messages.
- Unexpectedness: Step beyond the mindset that since you are curious about your content, your students will be too. Instead, actively share opportunities for curiosity!
- Concreteness: Provide and ask students to provide concrete, specific examples.
- Credibility: Use problem- or project-based learning to engage students with issues related to the content. They may want to learn more about a topic once they’ve engaged with it.
- Emotions: Allow room to cope with the emotional frustration that can be part of learning.
- Stories: Use stories to make content more personal and accessible to students. Stories are also great memory aids!
On top of these six principles, researchers suggest that instructors evaluate students’ mindsets at the beginning of a course, to better understand what they know initially.4 Researchers also suggest collecting feedback throughout the course, to make sure students can convey their feelings, experiences, and stories. Gathering this feedback can inform successful teaching practice, fostering student-teacher connections that make it easier for teachers to understand their students’ perspective. It might even lead to better grades!
Related TDL Content
How working from home can amp up your team’s communication and creativity
You’d be hard-pressed to have a conversation about 2020 and 2021 without mentioning COVID-19, as the pandemic has touched many areas of our lives. One of these areas is the transition to remote work. The curse of knowledge can be difficult enough to navigate during in-person interactions, but it can be an entirely different story in the online world. Take a look at this article for insights on how to capitalize on remote work, and how to beat the curse of knowledge.
Zooming Out: The Impact of Distance on our Decisions
As we saw earlier, the curse of knowledge emerges when there is distance between what we know now and what we knew before. However, this effect of temporal distance on our decision-making is only one way in which distance shapes our thinking. In this article, our staff writer Kaylee Somerville walks through the effects that other kinds of distance have on what we decide.
Sources
- Heath, C., & Heath, D. (2006, December). The curse of knowledge. Harvard Business Review. https://hbr.org/2006/12/the-curse-of-knowledge
- Fischhoff, B. (1975). Hindsight ≠ foresight: The effect of outcome knowledge on judgement under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1(3), 288-299.
- Camerer, C., Loewenstein, G., & Weber, M. (1989). The curse of knowledge in economic settings: An experimental analysis. Journal of Political Economy, 97(5), 1232-1254.
- Froyd, J., & Layne, J. (2008, October). Faculty development strategies for overcoming the “curse of knowledge”. In 2008 38th Annual Frontiers in Education Conference (pp. S4D-13). IEEE.
- Wieman, C. E. (2007). APS News – The back page. The “curse of knowledge” or why intuition about teaching often fails. American Physical Society News, 16(10).
- Fisher, M., & Keil, F. C. (2016). The curse of expertise: When more knowledge leads to miscalibrated explanatory insight. Cognitive Science, 40(5), 1251-1269.
- Leddy, C. (2012, November 8). Exorcising the curse of knowledge. The Harvard Gazette. https://news.harvard.edu/gazette/story/2012/11/exorcising-the-curse-of-knowledge/
- Birch, S. A. J., & Bloom, P. (2007). The curse of knowledge in reasoning about false beliefs. Psychological Science, 18(5), 382-386.
- Berg, A. (2021). 33 Key Principles for Success. Kindle.
- Made to Stick: Why Some Ideas Survive and Others Die. (n.d.). Heath Brothers. https://heathbrothers.com/books/made-to-stick/
About the Authors
Dan Pilat
Dan is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. Dan has a background in organizational decision making, with a BComm in Decision & Information Systems from McGill University. He has worked on enterprise-level behavioral architecture at TD Securities and BMO Capital Markets, where he advised management on the implementation of systems processing billions of dollars per week. Driven by an appetite for the latest in technology, Dan created a course on business intelligence and lectured at McGill University, and has applied behavioral science to topics such as augmented and virtual reality.
Dr. Sekoul Krastev
Sekoul is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. A decision scientist with a PhD in Decision Neuroscience from McGill University, Sekoul's work has been featured in peer-reviewed journals and has been presented at conferences around the world. Sekoul previously advised management on innovation and engagement strategy at The Boston Consulting Group as well as on online media strategy at Google. He has a deep interest in the applications of behavioral science to new technology and has published on these topics in places such as the Huffington Post and Strategy & Business.