Change blindness is the human tendency to miss changes in appearance when the change coincides with some other visual interruption. For example, if you see an image of a bike, then suddenly see something else, you will not notice if the image of the bike returns looking slightly different than before.
In a way, change blindness helps explain why we understand much less than we think we do about all sorts of concepts. When something is not in front of us, we become blind to many of its features, and also blind to our own blindness. Fernbach and colleagues demonstrated this with bikes specifically: they asked people how much they knew about bikes, and, as you might predict, many of the subjects claimed a high degree of knowledge. When asked to draw a basic bike, however, they produced drawings that were wildly inaccurate compared with real bikes, a failure attributed largely to change blindness. Unfortunately, when information is not directly in front of us, our memories and conceptions of it are shaky at best, but our egos remain intact.
The illusion of levels
Being able to describe the different levels of an item or concept inflates our perception of how much we know about it. For example, if asked to explain how a computer works, you might start by listing its parts: a screen, a mouse, a keyboard, and a motherboard. If you can then move on to the next level and describe what a keyboard does or looks like, you may believe you can keep going deeper and deeper, and that you therefore know a lot. Yet if you still can't explain how these parts function together to make the computer work, you can't actually explain computers.
Unlike facts or process knowledge, explanations have no discrete endpoint; in practice, an explanation may not stop until the audience reaches an understanding. It is clear to us whether we know a fact (yes or no) and whether we can describe a process (it reaches an end goal), but explanatory knowledge has no natural end. For that reason, anything you can begin to explain leads you to believe you understand it, even if the explanation is not at all sound or comprehensive.
Lastly, the illusion of explanatory depth occurs because, quite simply, we rarely explain things. Think about the last time you told someone a fact or taught someone a process: it was probably more recent than the last time you explained a concept. We often give reasons why we support or condemn a cause, but we rarely explain the cause itself in detail. This leads to weaker overall understanding and less practice and proficiency in explanation, a classic vicious cycle.