Confirmation bias is the tendency to search for, interpret, and recall information in ways that confirm existing beliefs. It operates before, during, and after a decision: shaping what evidence you seek, how you read ambiguous data, and what you remember later. In decisions, it typically means the conclusion is formed early and the "research" that follows is unconsciously curated to support it. First documented by Peter Wason in the 1960s, it remains one of the most studied and costly errors in human judgment.
Where this came from
Peter Wason, a British cognitive psychologist, first described confirmation bias through a deceptively simple experiment in 1960. He gave participants a sequence of three numbers: 2, 4, 6. He told them the sequence followed a rule and asked them to discover the rule by proposing new sequences. Participants could test as many sequences as they liked. When they were confident they had identified the rule, they stated it.
Nearly all participants proposed sequences that were consistent with their initial hypothesis: 4, 6, 8 or 10, 12, 14 or 20, 22, 24. The actual rule was simply "any three numbers in ascending order." But because participants only tested sequences that confirmed their hypothesis (the rule involves even numbers, or the rule involves incrementing by 2), they never discovered their hypothesis was wrong. They did not test potentially disconfirming sequences such as 1, 2, 10, which fits the actual rule but violates their hypothesis.
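The structure of the task can be made concrete with a small sketch (an illustration, not a reconstruction of Wason's materials): the hidden rule accepts any strictly ascending triple, while a typical participant's hypothesis is far narrower. Every confirming test passes under both rules, so the wrong hypothesis is never falsified; only a probe like 1, 2, 10 exposes the gap.

```python
# The hidden rule in the 2-4-6 task: any three strictly ascending numbers.
def hidden_rule(seq):
    a, b, c = seq
    return a < b < c

# A typical participant's (overly narrow) hypothesis:
# even numbers increasing by 2.
def hypothesis(seq):
    a, b, c = seq
    return b - a == 2 and c - b == 2 and all(n % 2 == 0 for n in seq)

# Confirmatory tests: every sequence the participant proposes fits both
# the hypothesis and the hidden rule, so no test can falsify the hypothesis.
confirming = [(4, 6, 8), (10, 12, 14), (20, 22, 24)]
assert all(hypothesis(s) and hidden_rule(s) for s in confirming)

# A disconfirming probe: fits the hidden rule but breaks the hypothesis.
# Proposing it would have revealed the hypothesis was too narrow.
probe = (1, 2, 10)
assert hidden_rule(probe) and not hypothesis(probe)
```

The asymmetry is the whole finding: confirming tests can never distinguish the narrow hypothesis from the broad rule, however many of them pass.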
Wason called this "confirmation bias", and subsequent decades of research have extended the finding across virtually every domain of human reasoning: scientific hypothesis testing, medical diagnosis, legal judgments, investment analysis, and strategic planning. The bias is not the result of low intelligence. Wason's participants were university students. It reflects a default property of the human reasoning system.
How it works
Confirmation bias operates through three distinct mechanisms that compound each other.
The first is biased information search. When you hold a hypothesis or preference, you instinctively look for evidence that confirms it. An investor who believes a stock will rise reads bullish analysis more thoroughly than bearish analysis. A manager who has already decided to hire a candidate frames interview questions that allow the candidate to demonstrate strengths rather than expose weaknesses. The search is asymmetric from the start.
The second is biased interpretation. Even when people encounter disconfirming evidence, they tend to scrutinise it more critically than confirming evidence. Psychologists Ziva Kunda and colleagues documented this "motivated reasoning" in multiple studies: people apply stringent methodological standards to studies whose conclusions they dislike and accept confirming studies uncritically. The bar for evidence shifts depending on whether the evidence is welcome.
The third is biased memory. People recall confirming instances more readily than disconfirming ones. After a product launch that underperforms, a team will remember the market signals that pointed to success more vividly than the signals that pointed to risk, unless they have documented both systematically in advance.
In combination, these mechanisms mean that by the time a decision is made, the evidence base often looks far more unambiguous than it actually was. The decision-maker has, without conscious intent, constructed a distorted picture of the available information.
When to use it and when not to
Confirmation bias is not a tool; it is a failure mode. The relevant question is when it is most dangerous and when it is relatively benign.
It is most dangerous in decisions with long feedback loops (where you will not learn you were wrong for months or years), high stakes (where the cost of error is large), and strong prior preferences or identity investment (where being wrong means admitting a mistake about yourself). Investment decisions, hiring senior executives, strategic pivots, and relationship choices all carry these characteristics.
It is relatively low-risk in decisions that are quickly reversible and where objective feedback arrives fast. If you buy a product and immediately discover it does not work, confirmation bias has little opportunity to cause lasting damage. The feedback loop closes the bias before it can compound.
Myside Bias
Myside bias is a subset of confirmation bias that activates specifically when the decision involves your own past choices or identity. Leaving a job you took enthusiastically, exiting a relationship you publicly committed to, or abandoning a business idea you have talked about for years all trigger myside bias. Admitting the choice was wrong feels like an indictment of the person who made it. This makes the disconfirming evidence feel more threatening than it would to a neutral observer, which in turn makes the confirmatory search more intense. It is why intelligent, experienced people often persist longest with their worst decisions.
How to apply it in practice
The most effective single intervention is structured red-teaming. Before finalising any significant decision, spend at least 15 minutes writing the strongest possible case for the opposite conclusion. Do not write a strawman. Write the version of the opposing argument that a smart, informed person who disagrees with you would actually make.
A practical variant: identify the two or three pieces of evidence that, if true, would most strongly suggest your preferred option is wrong. Then actively search for those specific pieces of evidence. This inverts the default search pattern.
For hiring decisions, use structured interviews with pre-set questions and scoring rubrics applied identically to all candidates. This removes the opportunity for biased interpretation of answers.
For investment or business decisions, write down your thesis and the specific conditions under which the thesis would be wrong before you commit. Keep a written record. When those conditions arise, you will have a harder time dismissing them than if you had not pre-committed to them in writing.
Finally, where possible, seek out people who genuinely disagree with your conclusion and ask them to explain their reasoning in full. Not to find holes in their argument, but to understand it on its own terms. If you cannot steelman the opposing view, you have not yet understood the decision you are making.
This is one model from the upcoming Decisions Matter book.
Frequently asked questions
What is confirmation bias?
Confirmation bias is the tendency to seek out, interpret, favour, and recall information in a way that confirms your pre-existing beliefs. It operates at every stage of reasoning: which information you look for, how you read ambiguous evidence, and what you remember afterwards. British psychologist Peter Wason documented the phenomenon through a series of experiments in the 1960s, most famously the 2-4-6 task, in which participants consistently failed to test their hypotheses against disconfirming evidence.
How does confirmation bias affect decision-making?
In decisions, confirmation bias causes people to treat the research phase as a validation exercise rather than a genuine inquiry. By the time most people begin "researching" a choice, they have already formed a preference. They then unconsciously prioritise sources and arguments that support that preference, discount those that challenge it, and remember the supportive evidence more vividly later. This is particularly dangerous in investment decisions, hiring, strategic planning, and relationship choices, where the cost of a wrong decision is high and the feedback loop is slow.
How do you detect confirmation bias in yourself?
Three diagnostic questions are useful. First: when did I form my initial view on this decision, and what was it based on? If the view formed early and quickly, you are at risk. Second: can I state the strongest possible case against my preferred option, in its own terms, without strawmanning it? If you struggle to do this, you have not genuinely engaged with the opposing view. Third: what would have to be true for my preferred option to be the wrong choice? If no answer comes to mind, confirmation bias is likely already operating.
How do you overcome confirmation bias?
The most reliable method is structured red-teaming: explicitly assigning yourself or another person the role of arguing the opposing view in full seriousness. Charlie Munger called this inversion. Jeff Bezos builds it into Amazon's writing culture by requiring six-page memos that must represent the strongest version of any counterargument. For individual decisions, a simpler technique is the "consider the opposite" exercise: before finalising a view, spend 10 minutes writing down every reason the opposite conclusion could be correct. Research by psychologists Mussweiler, Strack, and Pfeiffer found this single step significantly reduces confirmatory reasoning.