The Psychology of Data: Why Even Scientists Fall for Cognitive Biases

Alicia de Mendieta

Why our brains are wired to be wrong, even when we’re trying to be right.

We like to think of scientists as super-rational beings who follow evidence wherever it leads, immune to the mental shortcuts and biases that trip up the rest of us. But here’s the uncomfortable truth: scientists are human too, and that means they’re just as vulnerable to cognitive biases as anyone else.

This isn’t just an academic curiosity – it’s a real problem that affects everything from medical research to climate science. Understanding how cognitive biases influence scientific decision-making is crucial if we want to identify and manage the distortions they introduce into research results.

What Are Cognitive Biases and Why Should We Care?

Cognitive biases are predictable patterns in how our brains process information. These systematic errors are not just a sign of our limited rationality; they also reveal how our judgments and decisions are actually made. Think of them as mental shortcuts that usually help us make quick decisions but sometimes lead us astray.

Your brain evolved to keep you alive, not to make you a perfect scientist. It draws quick, cheap conclusions as a favour to you, and most of the time those conclusions are good enough and roughly relevant to your immediate needs. But sometimes they do you a disservice.

Here’s what makes this particularly tricky: People form opinions based on emotions, such as fear, contempt and anger, rather than relying on facts. New facts often do not change people’s minds. And yes, this applies to scientists too.

The Confirmation Bias Trap: When Scientists Cherry-Pick Evidence

Let’s start with the big one: confirmation bias. This is our tendency to seek out information that confirms what we already believe while ignoring evidence that contradicts our views.

More formally, confirmation bias is the tendency to notice, believe, and remember evidence that fits our expectations or beliefs, while ignoring or rejecting evidence that contradicts them.

The famous example everyone talks about is Arthur Eddington’s 1919 eclipse expedition to test Einstein’s theory of general relativity. Eddington already knew which result he expected to get, and he got there regardless of all the noise in the evidence: when some of the photographic plates didn’t support Einstein’s theory, he simply discarded them without proper justification.

But confirmation bias isn’t just about dramatic historical examples. Journals place a much higher value on positive results than on negative ones, especially in the so-called “Social Sciences”. This creates a system where researchers are unconsciously incentivized to find results that confirm their hypotheses.
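
If that sounds abstract, here is a minimal, purely illustrative sketch of the mechanism in Python. Every number in it (the number of labs, the sample sizes, the significance threshold) is an assumption invented for the simulation, not a figure from any real study: it simply shows that when only significant, hypothesis-confirming results get published, the published record can report a sizeable effect even when no effect exists at all.

```python
# Toy simulation of publication bias: many labs study a treatment with NO
# real effect, but only "positive" (significant, hypothesis-confirming)
# results get published. The published literature then shows an effect
# that does not exist. All parameters are illustrative assumptions.

import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(seed=42)

n_labs = 1000        # independent groups running the same small study
n_per_group = 20     # participants per arm in each study
alpha = 0.05         # conventional significance threshold

all_effects = []
published_effects = []

for _ in range(n_labs):
    control = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    treatment = rng.normal(loc=0.0, scale=1.0, size=n_per_group)  # true effect = 0

    effect = treatment.mean() - control.mean()
    _, p_value = ttest_ind(treatment, control)

    all_effects.append(effect)
    # "Positive result": significant AND in the hypothesised direction.
    if p_value < alpha and effect > 0:
        published_effects.append(effect)

print("True effect:                      0.00")
print(f"Mean effect across ALL studies:   {np.mean(all_effects):+.2f}")
print(f"Mean effect in PUBLISHED studies: {np.mean(published_effects):+.2f}")
print(f"Fraction of studies published:    {len(published_effects) / n_labs:.1%}")
```

Every simulated study investigates an effect that is, by construction, exactly zero, yet the few studies that survive the “positive and significant” filter report a substantial effect on average, and that filtered picture is all a reader of the published literature ever sees.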

The Retrospective Bias: When Everything Seems Obvious in Hindsight

There’s another sneaky bias that affects how we interpret scientific results: retrospective bias, better known as hindsight bias. Once we know that an event has taken place, we overestimate how likely it was all along.

After a scientific breakthrough, it’s easy to look back and think, “Of course that was going to work!” This bias can make scientists overconfident in their theories and less likely to question their assumptions.

The retrospective bias relies on the extraordinary capacity of the human mind for rationalisation, i.e. the justification of events after the fact. We’re storytelling creatures, and we can’t resist creating neat narratives even when the reality was messier and more uncertain.

The Hidden Biases in Research Design

Here’s where things get really interesting – and concerning. Some of the most problematic biases in science happen during the research design phase, where they’re hardest to spot.

Take comparator bias in medical research. Comparator bias emerges when treatments known to be beneficial are withheld from patients participating in controlled trials, for example when a new drug is tested against a placebo rather than against the best existing therapy. This can happen when researchers unconsciously choose a comparison group that makes their treatment look better than it actually is.

The scary part? Comparator bias often cannot be clearly identified by quality assessment tools, so a study can appear to be of good or even excellent quality while still carrying this bias.

When Your Brain Hijacks Your Science

The neuroscience of bias makes this problem even more complex. Your brain is hard-wired to protect you, and that protective wiring can lead it to reinforce your opinions and beliefs, even when they’re misguided.

When scientists encounter evidence that challenges their theories, their brains can literally go into defensive mode. In situations of high stress or distrust, your body releases the stress hormone cortisol, which can hijack your advanced thought processes, reason, and logic.

This isn’t a character flaw – it’s biology. When you feel under threat, your brain’s amygdala, the structure that controls your innate fight-or-flight reaction, becomes more active. Even the most rational scientist can fall prey to this response when their life’s work is questioned.

The Backfire Effect: When Facts Make Things Worse

Sometimes, presenting scientists with contradictory evidence doesn’t just fail to change their minds – it actually makes them more convinced they’re right. This is called the backfire effect.

Being presented with facts that suggest their current beliefs are wrong makes people feel threatened. The reaction is particularly strong when the beliefs in question are tied to their political and personal identities.

This explains why scientific debates can sometimes become so entrenched. When a researcher’s career and reputation are built on a particular theory, challenging that theory can feel like a personal attack.

The Economics of Bias: When Money Talks

Financial incentives create their own set of biases. Industry-sponsored studies are significantly more likely to obtain results favouring their sponsors than independently funded research. The twist? On standard quality measures, industry-sponsored studies often show a lower risk of bias, and their methodological quality is at least as good as, and sometimes better than, that of independent studies.

This suggests that financial bias often operates at an unconscious level, influencing how researchers interpret data rather than how they collect it.

Fighting Back: Can We Debias Science?

The good news is that we’re not helpless against these biases. Recent research suggests that implicit attitudes can change, both through new associations and through deliberate, reasoned argument.

Some promising approaches include:

Cognitive forcing tools: Simple interventions like checklists and structured decision-making processes. The implementation of simple checklists has proven extraordinarily successful in reducing human error in medical settings.

Structured peer review: Creating systems where research decisions are evaluated by multiple people can help catch biases that individual researchers miss.

Transparency in methodology: Making research methods more transparent and requiring researchers to justify their choices can reduce unconscious bias.

The Path Forward: Embracing Our Humanity

The solution isn’t to expect scientists to become perfectly rational machines. Fighting against oneself, against the natural pull of the cognitive biases that weaken our discernment, requires at least a minimal training in what the scientific method is, and not only for those destined for a scientific profession.

Work to keep an open mind. Allow yourself to learn new things. Search out perspectives from multiple sides of an issue. Try to form and modify your opinions based on evidence that is accurate, objective, and verified.

The goal isn’t to eliminate bias entirely – that’s impossible. Instead, we need to build systems that account for our human limitations while still allowing scientific progress to happen.

Understanding cognitive biases in science isn’t just important for researchers – it affects all of us. These biases can influence everything from the medications we take to the policies our governments adopt to address climate change.

Acknowledging that individual scientists are prone to cognitive biases, like any other human being, is the first step towards understanding how a whole series of biases may be shaping scientific research today.

The next time you read about a scientific study, remember that behind every data point is a human being with all the cognitive quirks that entails. That doesn’t make the science less valuable – it just makes it more human.

Biotational – The Open-Access Hub for Computational Science

Collaborate. Share. Innovate.

Website: https://www.biotational.com

Email: info@biotational.com

LinkedIn: https://www.linkedin.com/company/biotational/

© 2025 Biotational. All Rights Reserved.

This article is published under a Creative Commons CC BY-NC license, allowing for non-commercial sharing with proper attribution.

Want to share your research? Submit your article on Biotational today by emailing info@biotational.com!