How to Overcome Confirmation Bias
Stop Cognitive Bias from Ruining Your Decisions
Confirmation bias is shaping your decisions right now. Not occasionally. Every day. And the unsettling part is that the smarter you are, the harder it is to see it happening.
By the end of this episode, you'll know exactly what confirmation bias is. How to recognize when it has taken over a room. And three specific practices that actually work. Not borrowed frameworks, but what forty years of high-stakes decisions have taught me.
Let's get into it.
What Is Confirmation Bias?
Confirmation bias is your brain's tendency to seek out, favor, and remember information that confirms what you already believe, filtering out everything that contradicts it.
Most people think that just means seeking out information that agrees with them. That's part of it. But here's what makes it truly dangerous.
Once you form a strong belief, three things happen automatically.
Unequal Evaluation. Picture two studies landing on your desk. One says your strategy is working. One says it isn't. You read the first and nod. You read the second and start looking for the flaw: the methodology, the sample size, the funding source.
Selective Memory. Your brain doesn't store evidence equally. What supports your belief stays accessible. What contradicts it becomes harder to recall the longer you hold the belief.
The Backfire Effect. When someone directly challenges a belief you hold, your brain treats it as a threat. The response isn't reconsideration. It's defense. Research shows corrections frequently fail to change the belief, and in some cases people leave the argument more convinced than when they entered it.
Put these together: the longer you hold a belief and the more it matters to you, the harder it becomes to change, no matter how much evidence says you should.
Confirmation Bias in Today's World
Confirmation bias has always been part of human thinking. What's changed is the environment around it.
Algorithms feed you content that matches what you already believe. Social media shows you opinions from people who think like you. Search engines rank results based on what you've clicked before. Every system you interact with daily is built to confirm your existing views. Not by accident, but because confirmation keeps you engaged.
The result compounds. The more confirming information you consume, the stronger your existing beliefs become. The stronger your beliefs become, the more your brain filters out opposing information. The more that information gets filtered, the harder it becomes to update your thinking, even when updating is exactly what the situation demands.
This is mindjacking in action. The systematic replacement of your thinking by systems built to do it for you. And confirmation bias is one of its most powerful tools.
It's visible everywhere. In public discourse where people can no longer agree on basic facts. In organizations that keep funding failing strategies long after the evidence says stop. In leaders who build teams designed to tell them what they want to hear.
You might assume that smarter, more experienced people are less susceptible to this. The research says otherwise.
The Smartest Person in the Room Gets It Wrong
Here's what surprises most people.
Confirmation bias doesn't get weaker as you get smarter. It gets stronger.
Dan Kahan at Yale ran a study. He gave people a math problem whose correct answer contradicted their political beliefs. The more numerate the person, the more likely they were to get the answer wrong, in the direction that protected their belief.
More intelligence, applied more effectively, in service of the conclusion they'd already reached.
A smart person who has formed a wrong belief is better at defending it. They find flaws in the opposing data faster. They construct more sophisticated arguments. They're more convincing to others and to themselves.
I watched this play out in a board meeting. A CEO had championed a major strategy. Three separate analyses came back contradicting it. Each time, he found a different flaw in the methodology. By the end of the meeting he'd convinced the room the data was unreliable. The strategy continued. The outcome was exactly what the data predicted.
He wasn't dishonest. He was skilled. His intelligence was working against him. And everyone in that room let it happen.
If you're intelligent, experienced, and confident in your judgment, you are not immune to confirmation bias. You are more vulnerable to it.
If you know someone who is always the smartest person in the room, send them this episode. They need it more than most.
How to Overcome Confirmation Bias: What Actually Works
Knowing about confirmation bias doesn't stop it. I know this from experience, not from research. I've been in rooms where everyone understood exactly what was happening and it happened anyway.
What works is different from what you've probably been taught.
Catch It in Yourself: The Flip Debate
The moment I've most reliably caught confirmation bias operating in myself hasn't come from a checklist or a framework. It's come from a specific kind of conversation.
I keep a small group of trusted advisors, people I call my kitchen cabinet. These aren't peers. They're almost never inside the organization. They have no stake in the outcome and no incentive to tell me what I want to hear. When I'm about to make a significant decision and I feel the pull of certainty, I take it to one of them.
The conversation has a specific structure. I argue my position, fully and genuinely, the strongest version I can make. Then I stop. And I argue the opposite. Not a token acknowledgment of the other side. A real debate. I take the side I'm most resistant to and make the best case I can for it.
What happens in that second argument is where confirmation bias shows up. The gaps. The assumptions I'd been protecting. The evidence I'd felt the urge to dismiss. When you're forced to argue a case you don't believe, you find the things you didn't want to see when you were arguing the one you do.
An outside advisor is essential. Someone who will push back, ask hard questions, and notice when the flip argument is being faked. You can't do this with someone who needs something from you. The absence of stakes is what makes the honesty possible.
Catch It in a Room: Two Signals to Watch For
I've learned to watch for two signals that tell me confirmation bias has taken over a room. Both are visible before the decision is made. Almost everyone misses them.
The first signal is the unwillingness to debate the other side.
When a room has really decided, before the discussion is officially over, nobody wants to argue the opposing position. Not even hypothetically. Raise the other side and watch what happens. Eyes go flat. The conversation moves on. Someone changes the subject. If a room can't genuinely engage with the strongest case against the preferred direction, confirmation bias is driving.
The second signal is circular justification.
Listen for reasoning that keeps returning to its own starting point. The evidence for the decision is the decision itself. When you can't find an external reason, just a restatement of the conclusion, confirmation bias is driving.
When I hear circular justification in a room, I stop the conversation. Not to embarrass anyone. To name what's happening. "We're not evaluating anymore. We're confirming. Let's go back to the evidence."
That single intervention has changed the outcome of more decisions than any framework I've ever been taught.
Change How You Decide: Full Options, Real Challenge
Here's the most consistent change I've made in my own decision-making, and it comes directly from watching what confirmation bias costs people: I force a full pros and cons analysis on every serious option. Not just the one I'm leaning toward.
This sounds obvious. Almost nobody does it.
The natural pull is to build the case for the option that already feels right and compare it against the weaknesses of the alternatives. That's confirmation bias disguised as analysis. What I do instead is give every option on the table the same treatment. The best case for it. The best case against it. Without knowing in advance which one I'm going to choose.
For decisions that carry real weight, I take it further. I bring in my brain trust: direct reports who will tell me what I don't want to hear, kitchen cabinet advisors, trusted board members. I ask specifically for the challenges. Not validation. Not enthusiasm. The places where the thinking is weak, the assumptions that might not hold, the evidence I might have filtered out.
One question has changed how I approach every major decision: what am I not seeing?
The answers, from people who have no incentive to protect my view, are exactly where the confirmation bias lives.
Confirmation Bias Exercise: Try This Today
This week, before you finalize any decision you've already started leaning toward, do one thing.
Find one person outside your organization, someone with no stake in the outcome, and run the flip debate. Argue your position fully. Then stop and argue the opposite, with the same effort and commitment.
Don't summarize the other side. Argue it. Make the best case you can for the view you're most resistant to.
Notice what comes up in that second argument. The gaps. The assumptions. The evidence you'd been setting aside.
That's where your confirmation bias is living.
Run that exercise this week. Not once. Every time you feel the pull of certainty on a decision that matters.
The Benefits of Overcoming Confirmation Bias
The payoff from these practices compounds over time.
Examined beliefs are more reliable than accumulated ones. Decisions that accounted for opposing evidence hold up better than decisions that filtered it out. Judgment that evaluates rather than confirms earns a different kind of trust from the people around you.
Beyond your own decisions, catching confirmation bias makes you harder to capture. Every algorithm, every platform, and every persuader around you is built to exploit it. Seeing it operate in yourself reduces their leverage over your thinking.
That's what these practices build. Not certainty. Something better.
Examined confidence.
Subscribe To The Weekly Newsletter
Get four decades of decisions. Delivered to your inbox.
Free or paid — your choice.
Endnotes
1. Dan Kahan — Identity-Protective Cognition
Kahan, D.M. (2017). "Misconceptions, Misinformation, and the Logic of Identity-Protective Cognition." Cultural Cognition Project Working Paper Series No. 164. Yale Law School Public Law Research Paper No. 605. Available at SSRN: https://ssrn.com/abstract=2973067
Kahan's research demonstrates that individuals with higher cognitive ability are more likely to process empirically contested information in ways that protect their cultural group identity rather than update their beliefs based on evidence. Higher numeracy and scientific literacy were associated with greater polarization on contested empirical questions, not less. The finding that intelligence amplifies rather than reduces motivated reasoning is the foundation for the section "The Smartest Person in the Room Gets It Wrong."
2. The Backfire Effect
Nyhan, B., & Reifler, J. (2010). "When Corrections Fail: The Persistence of Political Misperceptions." Political Behavior, 32(2), 303-330. https://doi.org/10.1007/s11109-010-9112-2
The original research documented instances where corrective information increased rather than decreased belief in a targeted misperception. Note: Nyhan subsequently published a reassessment in PNAS (2021) indicating that the backfire effect may be less robust and widespread than initially reported. The more durable finding from this research is that corrections frequently fail to reduce misperceptions — not necessarily that they reliably increase them. The script's claim that "knowing about confirmation bias doesn't stop it" reflects the more conservative and better-supported finding.
3. Confirmation Bias — Foundational Research
Nickerson, R.S. (1998). "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises." Review of General Psychology, 2(2), 175-220. https://doi.org/10.1037/1089-2680.2.2.175
The foundational academic review of confirmation bias across multiple domains. Nickerson identifies three primary mechanisms — selective information search, selective interpretation, and selective recall — which parallel the three sub-concepts in the "What Is Confirmation Bias?" section: Unequal Evaluation, Selective Memory, and the Backfire Effect.
4. The Pre-Mortem
Klein, G. (2007). "Performing a Project Premortem." Harvard Business Review, 85(9), 18-19. https://hbr.org/2007/09/performing-a-project-premortem
Klein developed the pre-mortem methodology to interrupt the confirmation bias that protects decisions already made. The technique — assuming future failure before it occurs and working backward to identify causes — is a complementary approach to the practices described here for catching confirmation bias before a decision is finalized.
5. Mindjacking
McKinney, P. "Mindjacking." philmckinney.com. https://www.philmckinney.com/mindjacking/
McKinney, P. "Mindjacking: When Your Opinions Aren't Yours." philmckinney.com. https://www.philmckinney.com/mindjacking-when-your-opinions-arent-yours/
The concept of mindjacking — the systematic replacement of independent thinking by external systems designed to think for you — is Phil McKinney's original framework. Confirmation bias is identified as one of the primary mechanisms through which mindjacking operates at the individual cognitive level.