The Illusion of Explanatory Depth: Why We Know Less Than We Think We Do
The illusion of explanatory depth (IOED) describes our belief that we understand more about the world than we actually do. It is often not until we are asked to actually explain a concept that we come face to face with our limited understanding of it.1
We know less than we think we do.
Most of the time, it’s not a big deal. We can navigate our days just fine.
In many areas, however, it’s critical to distinguish what we know from what we don’t.
We need to be well-calibrated. That is, our confidence needs to match our actual knowledge; the short sketch after the list below shows one way to measure the gap. When we think we know more than we do, we make bad decisions while remaining supremely confident in them. This mismatch has real consequences:
• Thinking you know more about money and investments than you really do means you will lose money
• Thinking you know more about leadership and management than you really do means you will lose good people
• Thinking you know more about communication than you really do means you will strain your relationships
• Thinking you know more about parenting than you really do means you will struggle to raise good kids
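To make “well-calibrated” concrete, here is a minimal Python sketch of one simple way to measure calibration: compare a person’s average stated confidence against their actual accuracy. The quiz data is hypothetical, invented purely for illustration, and this proxy is my own sketch, not a method from the article or the cited studies.

```python
# Minimal calibration check: does stated confidence match actual accuracy?
# The (confidence, correct?) pairs below are hypothetical, for illustration only.
answers = [
    (0.90, True),
    (0.90, False),
    (0.80, True),
    (0.70, False),
    (0.95, False),
    (0.60, True),
]

avg_confidence = sum(conf for conf, _ in answers) / len(answers)
accuracy = sum(correct for _, correct in answers) / len(answers)

print(f"Average stated confidence: {avg_confidence:.0%}")             # ~81%
print(f"Actual accuracy:           {accuracy:.0%}")                   # 50%
print(f"Overconfidence gap:        {avg_confidence - accuracy:+.0%}") # +31%

# A gap near zero means confidence tracks knowledge.
# A large positive gap is the IOED signature: confidence outrunning knowledge.
```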
It’s easy to expose the IOED, and I’ll show you how to demonstrate it in yourself or in others.
We confuse the availability of information with the possession of knowledge. It’s easy to access information. We consume it all day. Almost any fact is accessible with a simple Google search.
But there’s a difference. Accessing information doesn’t mean we’ve converted that information into knowledge.
Knowledge requires a deep understanding of an idea, one you can reconstruct without help from the outside environment (see below) and without leaning on short-term memory.
Short-term memory is not knowledge. Wait a few days and you’ll discover that what was in your short-term memory has vanished; the knowledge never existed.
At present, the IOED is profoundly pervasive given that we have infinite access to information, but consume information in a largely superficial fashion.2
That’s the curse of IOED. We think we know something in the moment, but we fail to anticipate that we’ll soon forget it.
The question is, how do we counteract IOED to make better decisions?
And how do we get others to reveal their actual knowledge rather than taking their professed confidence at face value?
Let’s dive deeper into the reasons for IOED and then discuss tools to overcome it.
Why Does IOED Exist?
Our environment is a crutch. When information is readily accessible, we assume we understand more than we do. It’s easy to explain something right after reading an overview of it. It’s easy to explain how something works while you’re looking at the object. But as studies show, once the object or the information is removed, our ability to explain goes with it.
The most famous example was asking people to draw a bike – something we all think we understand:
When something is not in front of us, we become blind to many of its features―and also blind to our own blindness. This was proven with bikes specifically in another experiment, in which Fernbach and colleagues asked people how much they knew about bikes. As you can predict, many of the experimental subjects forecasted a high degree of knowledge. After asking them to draw a basic bike, however, all of their drawings were wildly inaccurate when compared with real bikes, largely due to change blindness. Unfortunately, when information is not directly in front of us, our memories and conceptions of it are shaky at best―but our egos are still intact.3
Even when we struggle to explain, our egos confidently dismiss the failure as a one-off. We rationalize why we couldn’t explain while still believing we possess the knowledge.
Don’t confuse short-term memory with long-term knowledge. We’ve all crammed for a test, shoving as much information into our brains as possible. Sometimes it works, at least for the test. But even if you score well, the information never makes it into long-term memory, so it’s lost within days. You have no permanent working knowledge to draw on in the future, yet you’ll go about your life believing you still possess it. In a way, you’re worse off than if you had never taken the class, because now you fool yourself into believing you know something you really don’t.
We hold opinions that are too strong. People confuse strong, confident opinions with knowledge. Confidence does not equal possession of knowledge, even though we like to associate confidence with expertise. This is partly ego, but it’s also that people don’t know what they don’t know. They never get feedback to correct their beliefs. They truly believe they know, and they actively avoid challenging those beliefs.
On the whole, people far too often hold strong opinions about topics for which they have limited information, causing social and political movements to have plenty of people behind them with limited ideas of what they are even fighting for.4
IOED is closely related to an idea proposed by two Cornell researchers, David Dunning and Justin Kruger:
In any domain of knowledge, often the most ignorant are the most overconfident in their understanding of that domain… Having to explain a phenomenon forces us to confront this complexity and realize our ignorance. At a time where political polarization, income inequality, and urban-rural separation have deeply fractured us over social and economic issues, recognizing our only modest understanding of these issues is a first step to bridging these divides.5
Dunning and Kruger claim that the most ignorant are the most overconfident in their beliefs. In my experience, this is accurate. The worst ideas and suggestions come from the people with the least knowledge. They can’t even begin to fathom how they could be wrong.
The problems of specialization – we know more and more about less and less. The drive to specialize is another reason for IOED. We don’t read widely, or at all. We do our jobs, come home, and spend zero time learning. We’ve lost our curiosity about the world.
Charlie Munger advocates a multidisciplinary approach to learning. That is, learning the big ideas from all the knowledge disciplines. By following this path, you learn the big principles and ideas that can help build your understanding.
You don’t have to become an expert in every area, but you do need to know the big principles. As mentioned previously, it’s incredibly helpful to understand the big ideas related to money, leadership, relationships, parenting, etc., because screwing up in these areas is catastrophic.
Knowing about IOED isn’t enough, so we need practical steps to combat these problems.
How Can We Counteract IOED?
We need to ask questions. Our ignorance is revealed only when we are asked to explain our understanding to someone else. Explaining reveals understanding. Explaining is how we turn professed beliefs into demonstrated knowledge.
Keep asking questions. Why do you believe this is true? How do you know? What would disprove your belief? Each time they respond, ask the same open-ended questions to reveal the rigor behind their assertions.
Most people will crack quickly. They are rarely pushed to explain their thoughts and can usually explain themselves only at a high level. It soon becomes apparent when they struggle to provide deeper explanations. They won’t have them. They’ll try to make them up. And they’ll struggle valiantly to keep their stories coherent.
Asking questions separates superficial from deep knowledge:
Rozenblit and Keil argue that people tend to have knowledge at one level of explanation (e.g., pressing the flusher causing the water to drain and then fill up again), and this causes them to mistakenly believe that they have knowledge at the other levels of explanation when they really don't. This explains why they don't exhibit the illusion of depth for facts and stories. Facts and stories generally only involve a few causal relations (some facts might not involve any) that can be described at one level of explanation, and thus it's more difficult to mistakenly believe we have explanatory knowledge that we don't actually have.6
Again, superficial knowledge isn’t bad, but never confuse it with deep knowledge; that confusion is where the problems occur.
You have to push your team to reveal their true knowledge. This is critical if you are a leader. You must assess what your team actually knows. Don’t rely on their confidence. Don’t rely on them telling you how good they are. Make them prove it!
This isn’t about being a jerk. You are trying to make everyone better by making sure they don’t BS themselves and the team.
Do this with outside consultants, vendors, and managers. Make them work to prove what they know. Don’t let them settle into a comfortable routine of high-level, vague verbiage that anyone could string together. Don’t take their word on their expertise – make them show it.
Writing. The best way to reveal and eliminate IOED. Writing will expose IOED every time, because you have to know what you’re talking about to write well. There’s no way around it. Rambling, incoherent messages are easy to spot, and so are the logical fallacies and gaps in arguments that you’d miss in speech or in your own thoughts. That’s the power of writing: it forces you to set down everything you know in concrete form. When thinking or talking, we can gloss over details and speak fast enough to cover our ineptitude.
That doesn’t happen with writing. The written word lets readers methodically dissect your arguments and decide whether you’ve earned their agreement.
This is why we need to write – writing forces us to articulate what we know. Writing is a discipline that reveals our true knowledge.
Writing is a powerful way to fill gaps in your knowledge. Without writing, we never get the feedback we need to improve. It takes work to write, which is why so few do it.
Even scientists realize how writing imposes a high level of discipline:
The incompleteness of everyday theories should not surprise most scientists. We frequently discover that a theory that seems crystal clear and complete in our head suddenly develops gaping holes and inconsistencies when we try to set it down on paper.7
The Illusion of Explanatory Depth is a gentle reminder that we are not as smart as we think we are.
Sources:
1. The Decision Lab. The Illusion of Explanatory Depth. https://thedecisionlab.com/biases/the-illusion-of-explanatory-depth/
2. Adam Waytz. The Illusion of Explanatory Depth. https://www.edge.org/response-detail/27117
3. The Decision Lab. The Illusion of Explanatory Depth. https://thedecisionlab.com/biases/the-illusion-of-explanatory-depth/
4. The Decision Lab. The Illusion of Explanatory Depth. https://thedecisionlab.com/biases/the-illusion-of-explanatory-depth/
5. Adam Waytz. The Illusion of Explanatory Depth. https://www.edge.org/response-detail/27117
6. Adam Waytz. The Illusion of Explanatory Depth. https://www.edge.org/response-detail/27117
7. Leonid Rozenblit and Frank Keil. The misunderstood limits of folk science: an illusion of explanatory depth. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3062901/