Why do we think we understand the world more than we actually do?
The Illusion of Explanatory Depth, explained.
What is the Illusion of Explanatory Depth?
The illusion of explanatory depth (IOED) describes our belief that we understand more about the world than we actually do. It’s often not until we are asked to explain a concept that we come face to face with our limited understanding of it.
Where this bias occurs
Imagine an alien comes to Earth and demands that you explain how a toilet works. That’s easy for you, right? You’ve presumably been using a toilet your entire life, as has everyone you know. In fact, you may even have encountered an array of different types of toilets, from the old-style one with a pull flush in your grandparents’ house to more modern ones with heated seats in places like Japan. Explaining toilets to aliens should be a piece of cake. You assure your new friend that very soon, he’ll be an expert on toilets.
And yet, as the alien takes a seat to listen, you realize you can tell him about the button or the lever you press to flush the toilet, but you can’t explain much about what’s going on inside the toilet. How does the dirty water leave and the clean stuff arrive? What mechanisms or forces are working behind the scenes? And what about the little thing that rises and falls inside the… what’s it called… the bit at the back…
Perhaps you can partially answer one or two of these specific questions (or even more if you happen to be a plumber), but the alien will surely have even more questions you can’t answer. A toilet is such a simple, everyday item, and yet you know much less about it than you’d predicted. This puzzles you greatly.
Your trouble explaining a toilet to the alien stems from the illusion of explanatory depth: sometimes, it’s only when we have to explain our knowledge that we realize how limited it really is. You could replace the example of the toilet with many other everyday items, such as locks, car engines, or light bulbs.
The illusion has two parts. First, the “explanatory” part refers to our belief that we can provide a clear, detailed explanation of how something works. Second, the “depth” part reflects our assumption that the explanation will be thorough and complex enough to convey what we’re trying to explain. In reality, however, when we try to dig into the details, we often find that our explanatory knowledge is much shallower than we initially thought. What we actually end up with is the reality of explanatory shallowness.
Both parts are important to the illusion. That is, it’s only when we explain something that the gaps in our knowledge become apparent to us. Before that, we might still believe that we possess a depth of knowledge. The illusion of explanatory depth is something we’re subject to from an early age, with studies showing that children as young as kindergarten age overestimate their explanatory knowledge.7
The illusion is far stronger for explanatory knowledge than for other domains of knowledge, such as facts, procedures, or narratives.11 And it doesn’t apply if you know nothing about a topic and are happy to admit it; it only arises when we overestimate our understanding of a certain concept or topic.
Individual effects
The illusion of explanatory depth can cause people to make important decisions based on limited information, as we consistently believe we have much more information to work with than we really do. As a result of limited but convincing information, we may become passionate and excited about a new concept. Perhaps you discover a new cause you choose to support or a new class you’re suddenly excited about pursuing. When asked to explain the cause you are fighting for or the class you are signing up for, however, you may find yourself stumped for words. In these cases, your decisions and notions are often based on feelings more than deep thought, as evidenced by your weak ability to explain them. If you consistently make choices based on your immediate feelings, pursuing unclear goals or causes, you might find your values misrepresented by your life choices.
Systemic effects
On the whole, people far too often hold strong opinions about topics for which they have limited information. This can cause social and political movements to attract plenty of supporters with limited ideas of what they are even fighting for. In fact, studies show that those who are more passionate about a certain cause often know less about it. In one such study, cognitive scientist Phil Fernbach and his colleagues asked a group of people a scientific question about genetically modified foods.12 They then asked the participants how strongly they were for or against genetically modified foods. Fernbach found that those who got the scientific question wrong were often more passionate about whichever side they supported.
In our political sphere, issues are often hotly debated on a ‘for or against’ basis, with two opposing sides. While most people can list off reason after reason for why they support or oppose a certain cause, fewer people can actually explain the issue at hand with clarity. Thinking that we understand things far better than we truly do, unfortunately, causes us to jump to conclusions before we fully grasp the issue. As you can see, this bias unnecessarily causes division and a lack of consensus in society.
The illusion of explanatory depth can also make us less susceptible to considering other political opinions. One study by a student at MIT looked at whether getting people to explain a political policy in detail—thereby exposing the illusion of explanatory depth—would make them more open to changing their views when later presented with a persuasive argument.8
The experiment involved asking participants to first rate their understanding and opinion on a political policy. Some were then asked to explain in detail how the policy worked, while others did a neutral task (like describing their morning routine). Afterward, all participants were shown a persuasive argument for the opposite viewpoint and asked to rate their opinion again.
While participants did rate their understanding lower after trying to explain the policy, they were actually less likely to shift their position compared to those who didn’t have to explain anything. This finding suggests that realizing how little we understand a topic might make us even more committed to our original views.
Why it happens
Apart from the fact that it would be quite time-consuming to obtain comprehensive knowledge of everything, there are four main factors that contribute to the illusion of explanatory depth.
Change blindness
Change blindness is the human tendency to miss changes in appearance when the change is interrupted by other visual stimuli. For example, if you see an image of a bike, then glance at something else, you may not notice if the image of the bike returns looking slightly different than before.
In a way, change blindness helps explain why we understand much less than we think we do about many different concepts. When something is not in front of us, we become blind to many of its features―and also blind to our own blindness. You might call it change blindness blindness, or the meta-cognitive error of overestimating our ability to detect change.11 This was demonstrated in another experiment, in which Fernbach and colleagues asked people how much they knew about bikes.13 As you might expect, many of the experimental subjects predicted a high degree of knowledge. When asked to draw a basic bike, however, many produced drawings that were wildly inaccurate compared to real bikes, largely due to change blindness. In fact, Italian artist and designer Gianluca Gimini has been collecting strangers’ attempts at drawing a bike since 2009. His project, Velocipedia, is part testament to the illusion of explanatory depth, part abstract art gallery.4
Unfortunately, when information is not directly in front of us, our memories and conceptions of it are shaky at best―but our egos are still intact.
The illusion of levels
Being able to describe different levels of an item or concept inflates our perception of how much we know about it. For example, if asked to explain how a computer works, you might start by listing its different parts: there’s a screen, a mouse, a keyboard, and a motherboard. If you can move on to the next level―describing what a keyboard does or looks like―you may believe you can keep going deeper and deeper, and that you therefore know a lot. That being said, if you still can’t explain how these parts function together to make the computer work, you can’t actually explain computers.
Indiscrete information
Unlike facts or procedural knowledge, explanations have no discrete ending. In practice, an explanation may not stop until its audience reaches an understanding. While it’s clear to us whether or not we know a fact or can describe a process clearly, explanatory understanding does not reach a natural end; if we wanted to, we could keep going and going. For that reason, anything you can begin to explain leads you to believe you understand it, even if the explanation is not at all sound or comprehensive.
Rarity
Lastly, the illusion of explanatory depth occurs because, quite simply, we rarely explain things. The last time you told someone a fact or taught a process was probably more recent than the last time you explained a concept. We often give reasons why we support or condemn a cause, but rarely explain the cause in detail. In turn, this leads to worse overall understanding and less practice and proficiency in explanations―a classic vicious cycle.
There are other factors at play beyond those outlined above. People rely on sparse, simplified causal explanations called 'intuitive theories' to understand complex systems. These intuitive theories provide a sense of coherence and structure, even when they lack detailed or mechanistic accuracy. Yet we're very rarely asked to give explicit explanations that would put our intuitive theories of everyday phenomena to the test.11
The extent to which we interact with other opinions and ideas can also impact the illusion. If we exist in an insulated community and never seek to challenge our opinions or ideas, we can be left feeling overly confident in our knowledge of a topic. As philosophy professor Daniel Zelinski points out, “Isolation from others’ opinions is compounded by the segregation of communities into people who only agree with each other. [...] This doesn’t give us opportunities for recognizing our own limitations around understanding.”6
Recent research by Ethan Meyers et al. has also discovered that the illusion of explanatory depth has a quasi-domino effect. That is, when a person tries to explain one phenomenon (e.g., how a zipper works), it leads them to report knowing less about a completely different phenomenon (e.g., how snow forms).5 Across a series of three experiments, participants rated their understanding of several mechanical devices, such as zippers, speedometers, and can openers. They were then asked to explain one of the devices in detail. Afterward, the participants re-rated their understanding of all the devices. The results of the first experiment showed that explaining one device reduced the participants’ confidence not just in that device, but in the others as well. In the third experiment, the researchers expanded their scope beyond mechanical devices to include natural phenomena (e.g., how snow forms) and economic policies (e.g., how trade with China affects the U.S. economy). The same pattern emerged: participants’ confidence in their understanding of completely unrelated topics dropped after they explained just one phenomenon.
The study challenges the traditional specificity principle, which is the idea that the illusion of explanatory depth is only revealed when explaining the exact topic in question. Instead, it supports a breadth principle, where recognizing knowledge gaps in one area leads to a broader reassessment of one's overall understanding. The findings also suggest that explanation serves as a powerful, non-confrontational tool for reducing overconfidence, promoting intellectual humility, and increasing openness to new information. This approach could have significant implications for education, misinformation correction, and decision-making, helping individuals to understand the current limitations of their knowledge and seek new learning opportunities.
Why it is important
The illusion of explanatory depth is important because it shows the ignorance on which many aspects of society are based and the false pretenses under which so much division exists. Phrases such as ‘fake it ‘til you make it’ and ‘we’re all making it up as we go along’ highlight the fact that we’re not as knowledgeable as we like to make everyone believe. While many people think they are experts in certain fields, when asked to explain their views, they often fall silent.
People claiming to know more than they really do can be potentially dangerous in today’s hyper-connected world. Misinformed claims such as “AirPods cause cancer!” and “Country X is using your cellphone to spy on you” can spread like wildfire through online channels. Social media platforms such as YouTube, TikTok, and Reddit give the everyday layperson an easily accessible platform on which to voice their ‘expert opinions’ about complex situations without regulation. When so-called experts are allowed to broadcast misinformed opinions that are taken as fact by their audience, disinformation can spread and lead to potentially dangerous behaviors and conflict.
The illusion of explanatory depth shows us that more explanation―and less “reason-giving” and argument―can help our society build consensus, understanding, and deeper knowledge.
How to avoid it
Honesty is the best policy when it comes to the illusion of explanatory depth. If someone asks you to explain something that you’re not qualified to talk about with authority, just admit it. That way, you avoid digging yourself deeper and deeper into a hole with no way out.
Next time you learn about something new, try explaining the concept out loud to yourself or to someone else before you develop a strong opinion about it. Take questions from other people, or predict what their questions may be. As you do so, keep digging for answers, and don’t close yourself off from discovering more information. Unfortunately, the most well-informed people are often the least passionate about a topic, since they understand the many sides and factors at play. Before your own passion clouds your understanding and turns into a hateful post on Facebook, practice explaining concepts out loud and spotting gaps in your own explanatory knowledge.
How it all started
The term “illusion of explanatory depth” was coined in 2002 by Yale researchers Leonid Rozenblit and Frank Keil. In their experiment, Rozenblit and Keil asked Yale undergraduate students to rate their understanding of 48 everyday items, such as sewing machines, cell phones, and zippers. They then asked subjects to write a detailed explanation of how each item works and to re-rate their own explanatory knowledge of these items. Confronted with the limitations of their own understanding, participants consistently scored themselves much lower than they had before writing these explanations. In their paper, “The Misunderstood Limits of Folk Science: An Illusion of Explanatory Depth,” Rozenblit and Keil concluded that having to explain basic concepts helps people realize that they don’t know all they think they do.11
In their paper, Rozenblit & Keil refer to ‘folk theories’ as incomplete, intuitive explanations people use to make sense of the world. They argue that while concepts are often embedded within larger sets of explanatory relations, folk theories are rarely comprehensive or logically rigorous. Unlike formal scientific theories, which undergo scrutiny and revision, folk theories are fragmentary and often remain unexamined by those who hold them. Laypeople tend to believe they understand the world better than they truly do because they rarely need to provide full explanations for the phenomena they think they comprehend.
Rozenblit & Keil suggest that folk theories are influential in everyday reasoning and decision-making, shaping how people categorize, infer, and interpret causal relations. However, because these theories are often based on naïve intuitions rather than deep understanding, they can lead to overconfidence about knowledge and misjudgments about how things work.
Since Rozenblit & Keil first introduced the illusion of explanatory depth, advancements have extended this concept into new areas such as technology and AI. Researchers are now applying the illusion of explanatory depth to understand how non-expert users interact with complex AI systems like chatbots and machine learning models, finding that people often overestimate their grasp of these sophisticated technologies.
How it affects product
The illusion of explanatory depth can significantly influence both consumer behavior and product design. Consumers often overestimate how well they understand a product’s inner workings or even how to use it in the first place. This overconfidence might lead them to make purchasing decisions based on a simplified view of the product, which in turn may result in disappointment if the product doesn’t work as intuitively as expected.
On the design side, product developers may also inadvertently fall into the same trap. They might assume that users share their in-depth understanding of the product’s functionality, which can lead to overly complex interfaces or instructions that fail to simplify the user experience.
Rozenblit and Keil argue that people experience an illusion of understanding when it comes to complex systems and causal relations. That is, they often mistake their ability to recall basic details or apply simple heuristics for a genuine deep comprehension of how things work. Their 2002 paper argues that people often believe they understand mechanisms, concepts, or systems in greater depth than they genuinely do, particularly when they can easily visualize components or rely on intuitive theories.11 This illusion is most pronounced in areas where things appear visually obvious and straightforward. In this case, people tend to mistake their familiarity with visible components or basic functions for a deeper understanding of how the system works in reality. As the next section shows, this can happen with users of AI; although the user interface is easy to navigate, this doesn’t translate into explanatory knowledge of AI systems.
The illusion of explanatory depth and AI
With large language models such as ChatGPT, the world’s knowledge is at our fingertips, literally. But simply having facts about something doesn’t necessarily mean we understand it. As people acquire more and more facts through the democratization of knowledge, our illusion of explanatory depth may become more prevalent. That is, armed with more information, we may think we understand more than we do.
The illusion of explanatory depth also relates to how well we think we understand AI and the way it works. AI systems sometimes produce unexpected results, which has led to a demand for them to be more understandable. Explainable AI (XAI) addresses this by offering simple, local explanations for individual decisions made by the system. However, these explanations only show small parts of the full picture, leaving it up to the users to piece together how the entire model works.
Researchers at the Ludwig Maximilian University of Munich looked at whether non-technical users might fall for the illusion of explanatory depth—believing they understand the whole AI model better than they really do when they see these simple explanations.9 The researchers conducted two studies, one moderated study with 40 participants and one unmoderated study with 107 crowd workers, using a spreadsheet-like explanation tool based on the SHAP framework (short for SHapley Additive exPlanations, a method used to explain the predictions of machine learning models).
In the experiment, the researchers wanted to see if people would overestimate their understanding of an AI model after seeing simple Shapley-based explanations. Participants completed five tasks that explained how the model worked and regularly rated how well they understood it. By comparing how they rated their understanding before and after, the researchers could assess whether the participants realized they understood the model less than they initially believed.
The research team observed how these users built their mental model of the AI’s overall behavior from the local explanations. They found that when users were asked to examine their understanding more closely, they realized they knew much less than they initially thought.
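To make the idea of a “local” explanation concrete, here is a minimal sketch of how a Shapley-based explanation for a single prediction might be generated in Python with the open-source shap library. The dataset and model below are illustrative assumptions, not the setup used in the study; the point is simply that the output describes one prediction at a time, which is exactly the partial view that can feed the illusion.

```python
# A minimal, hypothetical sketch of a "local" Shapley explanation.
# The dataset and model are illustrative, not those used in the study.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Fit a simple model on a standard scikit-learn dataset.
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# Compute Shapley values for a single prediction -- a "local" explanation.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X.iloc[:1])

# Each feature's contribution to this one prediction. Seeing these numbers
# can feel like understanding the whole model, even though they describe
# only how the model behaved for this single input.
for feature, contribution in zip(X.columns, shap_values[0]):
    print(f"{feature}: {contribution:+.3f}")
```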
Researchers at Erasmus University Rotterdam explored whether leveraging the illusion of explanatory depth could discourage unethical use of ChatGPT in academic assignments.10 They believed that if students realized they had a poor understanding of how the AI model worked, they might question whether it was a good idea to rely heavily on the information it gave them. The study tested two strategies to discourage the use of AI models to complete homework and papers: sending a warning message and having students explain how ChatGPT works. Explaining the chatbot did lower students’ self-rated understanding of it, but this didn’t shift their moral stance on using ChatGPT or reduce their likelihood of using it in the future. Similarly, reading a warning message increased their perceived understanding but didn’t affect their likelihood of using ChatGPT or their moral views. The findings suggest that simply addressing students’ explanatory knowledge of ChatGPT is not enough to deter its unethical use—even if students don’t fully understand how the information they use is being generated.
Example 1 - Rash decision-making
Suppose you happen upon information about a new graduate program that seems to be a great fit for you. Its slogan and marketing techniques draw on your values of innovation, betterment in society, and diversity. It’s located in a city you’ve always wanted to move to, and its recent graduates have achieved careers in a variety of fields that you could definitely see in your personal crystal ball. You’re already starting your application, and you begin to tell your friends and family about your excitement.
All of a sudden, one of your friends calls and asks you to tell him more about the program. You launch into the reasons you want to go: it’s in a great city, it has reputable alumni, and its website really speaks to your values.
“No, no, no,” your friend says, stopping you. “I want you to explain the program itself. What are you going to learn about? What is the program actually training you to do?”
Often, new subjects and concepts catch us when we’re thinking fast and running high on passion, and our feelings don’t necessarily translate succinctly into explanation. In these cases, we can make rash decisions without thinking them through, and being asked to explain ourselves can help get us in touch with our own thoughts.
If it weren’t for your friend, you might spend thousands of dollars on a graduate program that you don’t even fully understand. Only the attempt to explain the program can put you back in touch with your own reality and prompt you to do some more research.
Example 2 - Political extremism
In one study by Phil Fernbach and colleagues, experimenters asked subjects what political causes were close to their hearts, and then asked them to give reasons why they supported them. Afterward, the experimenters offered to donate money on the subjects’ behalf to an advocacy group for these particular causes, and more often than not, the subjects agreed.14
In a second trial, experimenters asked subjects what political causes were close to their hearts, and then asked them to explain the causes in detail. Afterward, once again, the experimenters offered to donate money on the subjects’ behalf to an advocacy group for these particular causes. In this trial, however, people largely declined the experimenters’ offers. After explaining the causes, the subjects often realized they didn’t know as much as they thought about these subjects and that they would rather dig deeper into these issues before giving them money.
While the first method confirmed what participants already believed, the second made them pause and reconsider. As you can see, our political landscape’s focus on argument and reason, rather than thorough explanation, often confirms people’s already held views, promoting extremism and division.
Summary
- What it is
The illusion of explanatory depth describes the common realization we have when we begin to explain a concept: that we actually understand it far less than we thought we did.
- Why it happens
The illusion of explanatory depth happens for four reasons:
- When information is not in front of us, our memories of it are foggy, but we aren’t aware of this gap.
- Believing that we can briefly explain multiple parts or levels of a concept leads us to believe we understand the entire concept better.
- Not having a natural endpoint for explanations feeds our ego and leads us to believe we can explain anything well, since there is no such thing as “complete.”
- We rarely explain things, and therefore don’t get the practice or feedback we need to understand our own shortcomings.
- Example 1 - Rash decision-making
Without the opportunity to explain our decisions and therefore spot gaps that we need to look into further, we base decisions too heavily on emotions and impulses, leading to choices we may later regret.
- Example 2 - Political extremism
When we explain the concepts behind our views rather than the reasons or arguments for them, we gain a more holistic understanding of any issue that can combat political extremism and division.
- How to avoid it
You can avoid the illusion of explanatory depth by explaining concepts in detail, out loud, before launching into a debate or making decisions based on them. Keep an open mind, eyes, and ears, and be sure to understand all sides of a topic before picking one to vouch for.
Related TDL articles
The confirmation bias describes our tendency to seek out information that corresponds with beliefs we already hold. Similarly to the illusion of explanatory depth, this bias can lead to division in society as a result of ignorance disguised by arrogance.
Yet another illusion that we’re subject to, the illusion of validity, describes our tendency to be overconfident in the accuracy of our judgments and predictions.