In other words, we like to look for and interpret information in ways that confirm our expectations. The language makes it sound like confirmation bias is something only scientists can be afflicted with. Not so.
If you’re in the business of science, confirmation bias can certainly mess with the questions you ask and how you go about finding answers to them. If you expect that there is or once was water on Mars (maybe you want there to be because you’re hoping to create an extra-terrestrial habitat), you’ll be inclined to look for signs of water.
If you’re hoping that the human race restricts its footprint to planet Earth, you might instead look for evidence that water couldn’t possibly exist in the Martian environment. Either way, your initial expectation will guide the way you go about conducting your science…and the answers you find. Yes, you might indeed find a different answer depending on the question you asked. Feel free to take a minute to reflect on the implications.
To make things worse, the confirmation bias affects pretty much all our attempts to learn about and understand the world and other people in it.
How will you interpret your boss’ comments on your performance evaluation? If you think she’s got it in for you and is just looking for an opportunity to fire you, you’ll think that criticisms confirm your suspicion. If you expect that she’s happy with you and in fact may want to promote you, you might instead see critique as a hint about how you can tip the scales in your favor. Either way, you’re interpreting evidence in a way that suits your expectations.
So, you can’t hide from confirmation bias by staying away from science. But, is there a cure?
Yes. It’s a potion made up of one part sticking to your guns, one part considering far-out possibilities, and one part being open to surprise. I’ll explain.
The Search for a Cure
Clifford Mynatt and his colleagues, Michael Doherty, and Ryan Tweney from Bowling Green State University conducted a now-classic set of studies to figure out if falsification, the most commonly accepted strategy for curing confirmation bias, really is effective. That is, does looking for evidence that disconfirms your initial hypothesis help you discover the right answer more often?
To answer this question, Mynatt and his colleagues designed two complex, simulated computer environments, each containing a number of objects (circles, squares, and triangles) with different attributes. These objects moved around in their environment and interacted with each other in systematic ways, following a series of rules. The task for the participants was to discover these rules. Half of the participants were made aware of the confirmation bias and received instructions on how to use a falsification strategy; the other half didn’t.
3 Effective (and 3 Ineffective) Cures for Confirmation Bias
In these studies the researchers discovered three things about how not to cure confirmation bias. But, don’t despair. I’ll use what didn’t work as a starting place for discussing what does.
Ineffective 1. Knowing that the confirmation bias exists doesn’t cure it.
Those of Mynatt’s participants who received information about the confirmation bias and instructions on falsification still mostly looked for evidence that would confirm their initial hypothesis.
This means that just reading this article and other sources about confirmation bias (or falsification strategies) isn’t enough to cure you. Sorry.
“But,” you may be thinking, “perhaps Mynatt’s participants just didn’t use the strategies they had learned. Surely looking for evidence that goes against your hypothesis is the most effective and evenhanded thing to do.” In fact, it isn’t. Not always. Read on.
Ineffective 2. Looking for evidence that your initial hypothesis is wrong isn’t always the safest way to go.
Mynatt found that participants who completely abandoned their disconfirmed hypotheses sometimes ended up even further from discovering the rules of the system than where they started.
On the other hand, the most successful participants used disconfirmation to modify the incorrect aspects of their initial hypotheses.
What does this mean?
Effective 1. Stick to your guns. Don’t abandon your first guesses too readily. Sometimes your initial expectation may be neither 100% right, nor 100% wrong.
For example, it’s possible that your boss really is trying to get rid of you and promoting you will accomplish that by getting you moved to a different office.
Ineffective 3. Exposing yourself to situations that will give you a lot of information won’t necessarily help you discover the right answer.
Mynatt found that most participants didn’t change their hypothesis when they encountered surprising evidence that supported alternative explanations for how the system worked. The few who did learned more.
So, what can you do with this?
Effective 2: Open your mind. Learn how to think of a few far-out alternatives and keep an eye out for evidence that supports any one of them.
Effective 3: Embrace surprises when they happen to you. When something doesn’t go exactly as you expected, take it as a sign that you need to refine your hypotheses about how things work.
If you adopt a strategy that is one part sticking to your guns, one part considering far-out ideas, and one part paying attention to surprises, you’re ready to adapt to whatever the world throws at you in the way of evidence.
Figuring out how complex things work is a lot like fishing. If you don’t know which lure works for the fish you’re after, start with your best guess and experiment from there. This strategy might come in especially handy if you plan to go fishing on Mars.
Mynatt, C., Doherty, M., & Tweney, R. (1977). Confirmation bias in a simulated research environment: An experimental study of scientific inference. Quarterly Journal of Experimental Psychology, 29(1), 85–95. DOI: 10.1080/00335557743000053

Mynatt, C., Doherty, M., & Tweney, R. (1978). Consequences of confirmation and disconfirmation in a simulated research environment. Quarterly Journal of Experimental Psychology, 30(3), 395–406. DOI: 10.1080/00335557843000007