The internet has made confirmation bias more pervasive than ever before. If you’re seeking evidence for your theory, a quick search is all but certain to confirm your idea, regardless of how reliable the sources of information are.
Coined by cognitive psychologist Peter Wason, a pioneer of the psychology of reasoning (the study of how people reason), confirmation bias describes our tendency to seek out and accept information that validates our personal beliefs and preconceptions, regardless of its soundness.
As engineering professor Barbara Oakley says, emotion-based thinking might feel good (who doesn’t like to be right?), but it can have a negative long-term impact on your mental health and worldview. Having your false intuitions confirmed can be a recipe for disaster.
There are ways to combat confirmation bias, however. First, let’s look at how it functions.
Understanding is believing
Peter Wason ran numerous experiments on reasoning, language, and psycholinguistics before coining confirmation bias. His early studies found that affirmative assertions are often evaluated faster than negative assertions.
For example, our brains process “7 is even” much faster than “9 is not odd.” Negations take a beat longer to evaluate.
Social psychologist Daniel Gilbert wanted to dig deeper into this bias. In his 1991 essay, “How Mental Systems Believe,” he wondered if there was a difference between believing an idea and “merely” understanding it.
Gilbert thought that understanding requires belief—at least initially. Using the Dutch philosopher Baruch Spinoza’s model, he writes,
The acceptance of an idea is part of the automatic comprehension of that idea and the rejection of an idea occurs subsequent to, and more effortfully than, its acceptance.
To test this assumption, Gilbert presented volunteers with a nonsensical sentence: “a dinca is a flame.” He then offered “true” and “false” options. Since the dinca sentence is a positive assertion, participants were more likely to judge it “true,” relying on their intuitive response system.
Gilbert then added a twist: some participants had to memorize numbers while answering, further occupying their brains. With their cognitive resources taxed, their ability to judge the sentence’s validity was hindered.
Participants were later tested to see what they could recall from this study. Those who had to memorize numbers were more likely to judge false sentences as true. With limited cognitive powers, their ability to remember (and therefore to reason) was stunted.
Gilbert concludes,
The comprehension and acceptance of ideas are not clearly separable psychological acts, but rather…comprehension includes acceptance of that which is comprehended.
This helps explain why we become so invested in an idea, even in the face of contradictory evidence. If believing is a natural component of understanding, then unbelieving takes a lot more work. In fact, research on social media has shown that we often first believe headlines before having to endure the more rigorous process of disbelief.
As Daniel Kahneman writes in Thinking, Fast and Slow, this helps explain why we’re susceptible to “empty persuasive messages, such as commercials,” especially when our brains are tired (think: late-night commercials or social media scrolling). Willpower is a limited resource. We’re more likely to believe whatever is shown to us at the end of the day.
Once we understand (and believe) an idea, it takes an inordinate amount of work to change our story. Marketers know—and exploit—this cognitive quirk.
Unconfirming our bias
Cognitive biases rarely work in isolation. The anchoring effect describes our tendency to rely heavily on the first piece of information we encounter on a topic. And since, as Gilbert shows, comprehension requires belief, anchoring is step one in confirmation bias.
This is why ubiquity matters, and why companies make so much money from advertising. When ads are plastered everywhere, you anchor to the promises being made regardless of their truth. Some people are more skeptical than others, but as a general rule of thumb, we give the information we see the benefit of the doubt.
There’s a caveat, however: information that conflicts with pre-existing beliefs is immediately rejected. Since we’ve constructed a worldview, rejecting previously learned (and believed) information can feel, at its extreme, like an existential crisis. But it doesn’t have to be that way.
While Kahneman knows that we’re likely to fall victim to the promises of marketing (especially when tired), he also recognizes that we can recruit our reasoning capabilities to overturn previously learned (and believed) ideas. It just takes more effort than defaulting to what’s conveniently placed in front of us.
How to accomplish this? Let’s return to Barbara Oakley. While the biases mentioned so far seem like a purely intellectual exercise, she advocates for changing physical patterns to gain a new perspective.
You want to try to expose yourself to novel stimuli as much as possible. That doesn’t mean you have to live a topsy-turvy life, but try things like sitting at a different place at the dinner table or brushing your teeth with the other hand. And, of course, travel is a great way of getting out of your comfort zone.
Debate classes function like this: you’re forced to convincingly argue points that you might not hold. Sure, the practice is counterintuitive, and at times frustrating, just like driving a nail into a wall with your left hand (if you’re a rightie). But the ability to overcome initial assumptions and reason out your decisions—to move out of your comfort zone—is an invaluable skill that translates across many domains of life.
You just need to step back from yourself and give it a shot.