Biases and Fallacies: Confirmation Bias

May 7, 2026

We think we follow the evidence. More often, the evidence follows us.

Confirmation bias shapes what we notice, ignore, and remember — reinforcing what we already believe while filtering out what might challenge it. In a data-rich world, that bias doesn’t fade. It becomes more convincing.

This article looks at why that happens — and how to push back against it.

Photo: "confirmation bias", generated with ChatGPT 5.4.

Introduction

We live in an age of overwhelming information. Data, news, opinions and analysis compete for our attention every day. But how much of what we believe is shaped not by evidence, but by invisible shortcuts our brains take without telling us?

Cognitive biases and logical fallacies are the hidden architecture of human thinking. They influence how we interpret statistics, how we read the news, how we make decisions, and how often we get things wrong while feeling completely certain we're right.

Every Thursday, Open Data Insights publishes one bias or fallacy: clearly explained, grounded in research, and connected to the kind of data stories we tell on this site. Because understanding your own thinking is the first step toward understanding the world more clearly.

Welcome to Biases and Fallacies, a weekly guide to the gaps between what we think we know and what the evidence actually shows.

The Concept

Confirmation bias is our tendency to search for, interpret, and remember information in a way that confirms what we already believe — while unconsciously ignoring or dismissing evidence that contradicts it. It is arguably the most pervasive and consequential bias in human thinking. We don't see the world as it is. We see it as we expect it to be.

The Experiment

In 1960, British psychologist Peter Wason designed a deceptively simple test. He showed participants the sequence 2, 4, 6 and asked them to discover the rule behind it by proposing their own sequences. The experimenter would say only whether each proposed sequence followed the rule or not.

Most participants quickly assumed the rule was "even numbers increasing by two" and proposed sequences like 4, 6, 8 or 10, 12, 14, all of which the experimenter confirmed. Satisfied, they announced their rule. They were wrong. The actual rule was simply "any ascending sequence." The participants never tested sequences that would have challenged their assumption, such as 1, 2, 3 or 5, 10, 100. They sought only confirmation.

This elegant experiment, now known as Wason's 2-4-6 task, became one of psychology's most replicated findings and the foundation of confirmation bias research.
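To make the trap concrete, here is a minimal sketch in Python (not part of Wason's study; the function names and probe sequences are illustrative choices): the hidden rule accepts any ascending sequence, while the tester's hypothesis is "even numbers increasing by two". Every confirmatory probe passes, and only a probe designed to violate the hypothesis reveals that the guess was too narrow.

```python
# Minimal simulation of Wason's 2-4-6 task (illustrative sketch only).

def hidden_rule(seq):
    """The experimenter's actual rule: any strictly ascending sequence."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def participant_hypothesis(seq):
    """A typical participant's guess: even numbers increasing by two."""
    return (all(n % 2 == 0 for n in seq)
            and all(b - a == 2 for a, b in zip(seq, seq[1:])))

# Confirmatory probes: chosen because they fit the hypothesis.
for probe in [(4, 6, 8), (10, 12, 14), (20, 22, 24)]:
    print(probe, "follows the rule:", hidden_rule(probe))   # all True -> feels like confirmation

# A disconfirming probe: deliberately violates the hypothesis.
probe = (1, 2, 3)
print(probe, "fits the hypothesis:", participant_hypothesis(probe))  # False
print(probe, "follows the rule:", hidden_rule(probe))                # True -> the guess was too narrow
```

Every confirming probe returns True, so the hypothesis feels proven; only the probe that should have failed shows it never matched the real rule.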

Everyday Examples

You believe a particular neighbourhood is dangerous. Walking through it, you notice every piece of graffiti and every suspicious glance, and mentally file them as evidence. You don't notice the families, the clean streets, the quiet cafés.

A manager believes a new employee is underperforming. In meetings, they notice every hesitation and missed point. The good contributions are registered but quickly forgotten.

During an election campaign, you read articles that support your preferred candidate and scroll past those that criticise them, feeling increasingly certain your choice is obviously correct.

In each case, the belief came first. The evidence was selected to match it.

The Data Connection

Confirmation bias is particularly dangerous when interpreting statistics and data stories, which is why it matters to us here at Open Data Insights. When people believe crime is rising in their city, they interpret every reported incident as confirmation, while remaining unaware that official statistics may show a decade-long decline. When politicians believe a policy is working, they highlight supporting indicators and question the methodology of contradictory ones.

Data doesn't protect us from confirmation bias. If anything, the authority of numbers can make it worse: we feel more justified in our selective reading when we can point to statistics that agree with us. The antidote isn't more data. It's actively seeking the data that could prove you wrong.
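A minimal sketch of what "seeking the data that could prove you wrong" can look like in practice. The yearly counts below are invented purely for illustration and come from no real dataset; the point is the procedure: instead of tallying the incidents that confirm the belief, check the full series for exactly the pattern that would contradict it.

```python
# Illustrative sketch with invented numbers (not from any real dataset):
# test the belief "crime is rising" against the full yearly series,
# rather than against the incidents that happen to confirm it.
reported_incidents = {            # hypothetical counts per year
    2016: 540, 2017: 515, 2018: 498, 2019: 470,
    2020: 455, 2021: 462, 2022: 430, 2023: 410,
}

years = sorted(reported_incidents)
changes = [reported_incidents[b] - reported_incidents[a] for a, b in zip(years, years[1:])]

# Falsification step: count the year-on-year declines the belief says shouldn't be there.
declines = sum(1 for c in changes if c < 0)
print(f"{declines} of {len(changes)} year-on-year changes are declines")
if declines > len(changes) / 2:
    print("The full series points the other way: the belief does not survive the check.")
```

The design choice is the point: the question asked of the data is the one that could falsify the belief, not the one that flatters it.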

How To Counter It

The most effective technique is simple but uncomfortable: steel-manning the opposite view. Before concluding that your belief is supported by evidence, ask yourself: what would the data look like if I were wrong? Then go looking for exactly that. If you can't find any evidence against your position, you probably haven't looked hard enough. Scientists call this falsification — the deliberate attempt to disprove your own hypothesis. It feels unnatural. It works.


Sources & Further Reading

Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129–140.

🤖 This text was generated with the assistance of AI.