It’s natural to feel we have a deep understanding of the world.
Unfortunately, we most often don’t.
On the whole, we tend to think we understand how things work in much more detail than we actually do.
Scientists continue to show us just how complex the world really is. And the devices we build are also increasingly intricate in design and function.
We used to fix cars ourselves. High-school kids would rebuild whole engines to get a running vehicle at low cost and, in part, simply to understand them better.
Now, most auto maintenance is nearly impossible without the computer diagnostics that only professionals have access to, and even they have a harder time really knowing how it all fits together.
We need to deeply understand how things work if we want to function well in the world.
So, how well do we really grasp all the details about how things around us work?
How does our actual understanding compare with our intuitions about how much we know?
These are questions about our metacognition. Metacognition is, in part, knowing how much you know about a topic.
Cognitive science research by Frank Keil, described in his paper, “Folkscience: Coarse Interpretations of a Complex Reality,” suggests that:
Our sense of how the world works is often vastly cruder than we think.
Our metacognition is not that great. We too often don’t have that deep understanding we thought we did.
In a number of experiments by Keil and others, people were shown many devices, such as helicopters, zippers, and cylinder locks. In a first pass, folks rated how well they understood and could explain the devices using standard survey-type questions.
The possible answers ranged from having “only the haziest knowledge…” to understanding deeply enough to reproduce a “full working diagram.”
After the first pass, people in the studies were asked to explain how some of the items work in as much detail as possible.
The research participants were then surprised to discover something important.
For the most part, they were only able to give very rough ideas about the actual mechanisms in play.
This is where their metacognition broke down.
Keil called this mismatch between what we think we know and what we really know, “The Illusion of Explanatory Depth.”
Why does this illusion of deep understanding happen?
One possible reason is that we are often familiar with some components of the system and how to use them. For example, you know that a computer has a keyboard, monitor, USB ports, and the like.
Thinking of these familiar things can quickly lead you to feel that you understand the system deeply and could explain how it works. When you actually try, though, you may suddenly find yourself tongue-tied.
You begin to realize that clever engineers have worked out many steps between a keypress and a character showing up on the screen, and those steps are largely a mystery to you (or, at least, to me).
With this realization, your metacognition – your knowledge about how much you know – is starting to improve.
How does this help you build deep understanding?
Keil acknowledges that scientists (and other professionals, surely) usually can give deep, fine-grained explanations for the narrow set of topics they think about all day.
It is possible to think very deeply, or at least as thoroughly as you need or want, especially if you routinely push yourself to get smarter on subjects you care about.
You become smart about those things and also have a better sense of the limits of your knowledge. You develop deep understanding and improved metacognition.
As shown in other cognitive psychology research, one way to improve your thinking is to develop a habit of routinely explaining things to yourself and others.
You can also build concept maps to graph out your explanations.
Verbal and graphical explanations have several benefits.
One thing regular explaining will do is show you where the gaps are in your current understanding. That alone is an improvement in metacognition.
You can then fill them in and begin building deep understanding.
Image credit: Ian B-M