Why do we think we understand the world more than we actually do?

The Illusion of Explanatory Depth


What is the illusion of explanatory depth?

The illusion of explanatory depth (IOED) describes our belief that we understand more about the world than we actually do. It is often not until we are asked to actually explain a concept that we come face to face with our limited understanding of it.

Where this bias occurs

Imagine an alien comes to Earth and demands that you explain the concept of houses. That’s easy for you, right? You’ve presumably lived in a house all your life, as has everyone you know. Every time you go on a walk or a drive, you see at least 50 of them. Explaining houses to an alien should be a piece of cake.

And yet, as the alien takes a seat to listen, you realize you can tell him what a house is, but you can’t explain much about them. How are they built? How did we as a society come to live in houses? How are their prices determined? What are the laws surrounding them? How long have people lived in houses, and what did they live in before? Perhaps you can answer one or two of these specific questions, but surely the alien will have even more questions you can’t answer. You thought housing was a simple concept, yet you actually know much less about it than you’d predicted. This is the illusion of explanatory depth at work: having to explain your knowledge brings you to the realization that you actually know much less than you thought you did.


Individual effects

The illusion of explanatory depth can cause you to make important decisions based on limited information, because you consistently believe you have much more information to work with than you actually do. On the basis of limited but convincing information, you may become passionate and excited about a new concept. Perhaps you discover a new cause you choose to support, or a new class you’re suddenly excited about taking. When asked to explain the cause you are fighting for or the class you are signing up for, however, you may find yourself stumped for words. In these cases, your decisions and inklings are often based on feelings more than deep thought, as evidenced by your weak ability to explain them. If you consistently make choices based on your immediate feelings, toward goals or causes you can’t explain in depth and don’t fully understand, you may find your values misrepresented by your life choices.

Systemic effects

On the whole, people far too often hold strong opinions about topics on which they have limited information, so social and political movements end up with plenty of supporters who have only a limited idea of what they are even fighting for. In fact, studies show that those who are more passionate about a certain cause often know less about it. When cognitive scientist Phil Fernbach and his colleagues asked a group of participants a scientific question about genetically modified foods and then asked them how strongly they were for or against genetically modified foods, they found that those who got the scientific question wrong were often more passionate about whichever side they supported.

In our political sphere, issues are often hotly debated on a ‘for or against’ basis, with two opposing sides. While most people can list off reason after reason why they are for or against a certain cause, fewer people can actually explain the issue at hand with any clarity. Thinking that we understand things far more than we actually do causes us to jump to conclusions about which side we support before we even fully understand the issue. In this way, the bias creates unnecessary division and a lack of consensus in society.

Why it happens

Change blindness

Change blindness is the human tendency to miss changes in appearance when the change is interrupted by other visual stimuli. For example, if you see an image of a bike and then suddenly see something else, you likely won’t notice if the image of the bike returns looking slightly different than before.

In a way, change blindness contributes to why we understand much less than we think we do about all sorts of concepts. When something is not in front of us, we become blind to many of its features―and also blind to our own blindness. This was demonstrated with bikes specifically in another experiment, in which Fernbach and colleagues asked people how much they knew about bikes. As you might predict, many of the subjects reported a high degree of knowledge. When asked to draw a basic bike, however, they produced drawings that were wildly inaccurate compared with real bikes, largely due to change blindness. When information is not directly in front of us, our memories and conceptions of it are shaky at best―but our egos remain intact.

The illusion of levels

Being able to describe different levels of an item or concept inflates our perception of how much we know about it. For example, if asked to explain how a computer works, you might start by listing its different parts: there’s a screen, a mouse, a keyboard, and a motherboard. If you can move on to the next level―describing what a keyboard does or looks like―you may believe you can keep going deeper and deeper, and that you therefore know a lot. But if you still can’t explain how these parts function together to make the computer work, you can’t actually explain computers.

Indiscrete information

Unlike facts or procedural knowledge, explanations have no discrete ending; in practice, they may not stop until the audience reaches an understanding. While it’s clear whether or not we know a fact―yes or no―and whether or not we can describe a process―it either reaches its end goal or it doesn’t―explanatory knowledge has no natural end point. For that reason, anything you can begin to explain leads you to believe you understand it, even if the explanation is neither sound nor comprehensive.

Rarity

Lastly, the illusion of explanatory depth occurs because, quite simply, we rarely explain things. Think about the last time you told someone a fact or taught someone a process; it was probably more recently than the last time you explained a concept. We often give reasons why we support or condemn a cause, but we rarely explain the cause itself in detail. This leads to weaker overall understanding and less practice and proficiency in explanation―a classic vicious cycle.

Why it is important

The illusion of explanatory depth is important because it reveals the ignorance on which many aspects of society are based, and the false pretenses under which so much division exists. While many people think they are experts in certain fields, when asked to explain their views, they often fall silent. The illusion of explanatory depth shows us that more explanation―and less “reason giving” and argument―can help our society build consensus, understanding, and deeper knowledge.

How to avoid it

Next time you learn about something new, try to explain the concept out loud, to yourself or to someone else, before you develop a strong opinion about it. Take questions from other people, or predict what their questions might be. As you do so, keep digging for answers, and don’t close yourself off from discovering more information. Often, the most well-informed people are the least passionate about a topic, because they understand the many sides and factors at play. Before your own passion clouds your understanding and turns into a hateful post on Facebook, practice explaining concepts out loud and spotting the gaps in your own knowledge.

How it all started

The term “illusion of explanatory depth” was coined in 2002 by Yale researchers Leonid Rozenblit and Frank Keil. In their experiment, Rozenblit and Keil asked Yale undergraduate students to rate their understanding of everyday items like sewing machines, cell phones, and zippers. They then asked subjects to write a detailed explanation of how each item works and to re-rate their own knowledge of these items. Confronted with the limitations of their own understanding, participants consistently scored themselves much lower than they had before writing the explanations. In their paper, “The Misunderstood Limits of Folk Science: An Illusion of Explanatory Depth,” Rozenblit and Keil concluded that having to explain basic concepts helps people realize that they don’t know all they think they do.

Example 1 - Rash decision-making

Suppose you happen upon information about a new graduate program that seems to be a great fit for you. Its slogan and marketing techniques draw on your values of innovation, betterment in society, and diversity. It’s located in a city you’ve always wanted to move to, and its recent graduates have gone on to successful careers in a variety of fields that you could definitely see in your personal crystal ball. You’re already starting your application, and you begin to tell your friends and family about your excitement.

All of a sudden, one of your friends calls and asks you to tell him more about the program. You launch into the reasons you want to go: it’s in a great city, it has reputable alumni, and its website really speaks to your values.

“No, no, no,” your friend says, stopping you. “I want you to explain the program itself. What are you going to learn about? What is the program actually training you to do?”

Often, new subjects and concepts catch us when we’re thinking fast and running high on passion, and our feelings don’t necessarily translate into a succinct explanation. In these cases, we may make rash decisions without thinking them through, and being asked to explain ourselves can actually help put us back in touch with our own thoughts.

If it weren’t for your friend, you might have spent thousands of dollars on a graduate program you didn’t even fully understand. Only the attempt to explain the program can put you back in touch with reality and prompt you to do some more research.

Example 2 - Political extremism

In one study by Phil Fernbach and colleagues, experimenters asked subjects what political causes were close to their hearts, and then asked them to give reasons why they supported them. Afterward, the experimenters offered to donate money on the subjects’ behalf to an advocacy group for these particular causes, and more often than not, the subjects agreed.

In a second trial, experimenters asked subjects what political causes were close to their hearts, and then asked them to simply explain the causes in detail. Afterward, once again, the experimenters offered to donate money on the subjects’ behalf to an advocacy group for these particular causes. In this trial, however, people largely declined the offer. After explaining the causes, the subjects often realized they didn’t actually know as much as they thought they did, and that they would rather dig deeper into the issues before giving them money.

While being prompted to think with one side in mind in the first trial confirmed subjects’ already held views, being asked to explain the issue itself in the second trial caused subjects to think twice about supporting a cause. Our political landscape’s focus on arguments and reasons, rather than thorough explanation, often confirms people’s already held views, promoting extremism and division.

Summary

  1. What it is
    The illusion of explanatory depth describes the common realization we have when we begin to explain a concept: that we actually understand it far less than we thought we did.
  2. Why it happens
    The illusion of explanatory depth happens for four reasons:
    1. When information is not in front of us, our memories of it are foggy, but we aren’t aware of this gap.
    2. Believing that we can briefly explain multiple parts or levels of a concept leads us to believe we understand the entire concept better.
    3. Not having a natural end point for explanations feeds our ego and leads us to believe we can explain anything well, since there is no such thing as “complete.”
    4. We rarely explain things, and therefore don’t get the practice or feedback we need to recognize our own shortcomings.
  3. Example 1 - Rash decision-making
    Without the opportunity to explain our decisions and thereby spot gaps we need to look into further, we base decisions too heavily on emotions and impulses, leading to choices we may later regret.
  4. Example 2 - Political extremism
    When we explain the concepts behind our views, rather than the reasons or arguments for them, we gain a more holistic understanding of an issue, which can combat political extremism and division.
  5. How to avoid it
    You can avoid the IOED by explaining concepts in detail, out loud, before launching into debate or making decisions about them. Keep an open mind, eyes, and ears, and be sure to understand all sides of a topic before picking one to vouch for.

Related TDL articles

Confirmation bias describes our tendency to seek out information that corresponds with beliefs we already hold. Similar to the illusion of explanatory depth, this bias can lead to division in society as a result of ignorance disguised by arrogance.

Similar to the illusion of explanatory depth, the illusion of validity describes our tendency to be overconfident in the accuracy of our judgments and predictions.

Sources

  1. Fernbach, P. (2013, November 15). The Illusion of Understanding: Phil Fernbach at TEDxGoldenGatePark. YouTube. https://www.youtube.com/watch?v=2SlbsnaSNNM&ab_channel=TEDxTalks
  2. Technicality. (2017, November 29). You Don’t Know How Toilets Work – The Illusion Of Explanatory Depth [Video]. YouTube. https://www.youtube.com/watch?v=9CodKUa4F2o&ab_channel=Technicality
  3. Waytz, A. (2017). The Illusion of Explanatory Depth. Edge.org. https://www.edge.org/response-detail/27117

About the Authors


Dan Pilat

Dan is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. Dan has a background in organizational decision making, with a BComm in Decision & Information Systems from McGill University. He has worked on enterprise-level behavioral architecture at TD Securities and BMO Capital Markets, where he advised management on the implementation of systems processing billions of dollars per week. Driven by an appetite for the latest in technology, Dan created a course on business intelligence and lectured at McGill University, and has applied behavioral science to topics such as augmented and virtual reality.


Dr. Sekoul Krastev

Sekoul is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. A decision scientist with a PhD in Decision Neuroscience from McGill University, Sekoul's work has been featured in peer-reviewed journals and has been presented at conferences around the world. Sekoul previously advised management on innovation and engagement strategy at The Boston Consulting Group as well as on online media strategy at Google. He has a deep interest in the applications of behavioral science to new technology and has published on these topics in places such as the Huffington Post and Strategy & Business.
