Picture two jurors sitting through the same trial. They listen to identical testimony, watch the same videos, and take notes on the same pieces of evidence.
Yet by the end of the trial, they are more certain than ever that the other is wrong. One votes guilty, the other not guilty, and both are convinced their interpretation of the facts is the rational one. How can that be?
This familiar scene captures something fundamental about how the brain processes information. In a series of studies, my co-authors and I showed that the biases we see in human judgment are not just social or emotional; they are physiological.
Why Our Brains Are Prone to Bias
The brain is an efficient processor of information, but it operates under constraints. Neurons can only fire so much, and signals must reach a threshold before they trigger an action. These biological limits mean that the information we "see" is never a perfect reflection of what exists in the world; it is an interpreted, filtered version shaped by both our prior beliefs and our preferences.
When new information arrives, it passes through this filtering system. The sensory system encodes the evidence, but before it becomes a decision, the brain compares it to a threshold that represents what we expect or value most. If the evidence is strong enough to surpass that threshold, we adjust our belief. If not, we tend to interpret it as consistent with what we already thought.
This mechanism is adaptive (it allows for fast, energy-efficient decisions), but it also gives rise to confirmatory bias. The very same process that makes the brain efficient also makes it prone to reinforcing existing opinions.
Importantly, this bias does not stem from stubbornness or ideology. It arises because the brain weighs new information against both what it already believes and what it cares about. Preferences shape perception.
Two people may start from the same prior belief but have different motivations or values, leading their brains to tune the threshold differently. The result is that they literally process the same information in distinct ways and end up further apart. What looks like irrationality from the outside is, in fact, a predictable by-product of an efficient but imperfect information system.
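To see how this plays out, consider a toy simulation in Python. This is a deliberately simplified illustration, not the actual model from our studies: each agent updates its belief only when new evidence is surprising enough to cross a threshold, and motivation sets that threshold lower for evidence pointing toward the agent's preferred conclusion.

```python
import random

def update(belief, evidence, preference, lo=0.10, hi=0.35, rate=0.3):
    """One threshold-gated belief update (a toy model for illustration).

    belief and evidence live in [0, 1], where 1 means "guilty."
    Evidence pushing belief toward the agent's preferred verdict only
    needs to clear the low threshold `lo`; evidence pushing away from
    it must clear the higher threshold `hi`. Sub-threshold evidence is
    ignored, i.e., read as consistent with the current belief.
    """
    surprise = evidence - belief
    toward_preference = (surprise > 0) == (preference > belief)
    threshold = lo if toward_preference else hi
    if abs(surprise) > threshold:
        belief += rate * surprise
    return min(max(belief, 0.0), 1.0)

random.seed(7)
# One shared, genuinely mixed stream of evidence: noisy values around 0.5.
stream = [min(max(random.gauss(0.5, 0.25), 0.0), 1.0) for _ in range(300)]

juror_a = juror_b = 0.5                  # identical priors...
for e in stream:
    juror_a = update(juror_a, e, preference=1.0)   # ...different motivations
    juror_b = update(juror_b, e, preference=0.0)

print(f"juror A (motivated toward guilty):     {juror_a:.2f}")
print(f"juror B (motivated toward not guilty): {juror_b:.2f}")
```

Run this, and juror A tends to settle above 0.5 while juror B settles below it, even though both received exactly the same evidence; the divergence comes entirely from the motivational asymmetry in the thresholds.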
When these individual processes unfold in social settings, they can amplify dramatically. Within a jury, a newsroom, or a family dinner, people bring different priors and different motivations to the table. Exposure to the same ambiguous or "mixed" evidence can then lead each brain to confirm its own narrative.
This is polarization: not simply a failure of open-mindedness, but a natural consequence of how the brain encodes and interprets information under constraint. The more people discuss, the more they rehearse and reinforce their interpretations, deepening the divide.
This framework helps explain why providing additional information rarely resolves disagreement. In political debates, climate discussions, or public-health messaging, one might expect that giving people more data would move opinions closer together. Yet the opposite often happens: Each side selectively integrates facts that fit its worldview and dismisses or reinterprets the rest.
Our findings suggest that this outcome is not only psychological but physiological. Brains with different priors and values literally process identical evidence differently. The gap is not informational; it is computational.
Generational divides can be understood in a similar way. People who have lived through different historical periods have accumulated distinct experiences, which form their priors. They also value different things: security versus freedom, innovation versus stability, independence versus belonging.
When they encounter new events or social changes, their brains weigh this evidence through those motivational filters. What one generation sees as progress, another might perceive as loss, not because either side refuses to learn, but because they learn through different internal thresholds.
The same logic applies to the legal system itself. Courts rely on the idea that impartial judges and juries can objectively process evidence to reach a fair verdict. Yet physiological constraints make perfect impartiality impossible.
Early cases in a judge's career can subtly shape the way later cases are interpreted, and even the order in which evidence is presented in a trial can affect outcomes. First impressions are not merely psychological; they shape the neural encoding of subsequent information. This means that verdicts reflect not only the facts but also the brain's structure for interpreting them.
Beyond the courtroom or the ballot box, this mechanism can explain everyday phenomena such as information avoidance, selective attention, and the tendency to overweight particular kinds of evidence when making health, investment, or consumption decisions. We might read an article that confirms our lifestyle choices more carefully than one that challenges them, or interpret financial news differently depending on whether we are risk-averse or optimistic. Our beliefs and preferences are intertwined because the brain circuits that compute value and interpret evidence are intertwined.
Is It Possible to Become More Impartial?
Recognizing this connection does not mean we are doomed to bias. On the contrary, understanding the neurobiology of belief gives us a path toward designing better environments for collective decision-making.
If we accept that all information is filtered, we can create structures (educational, institutional, and social) that account for these natural constraints rather than pretending they do not exist. Encouraging exposure to diverse viewpoints, varying the order in which information is presented, or designing deliberation rules that slow down early anchoring can help offset the brain's built-in tendencies.
Our research suggests that reasoning is not detached from emotion or motivation. It is embodied, bounded, and deeply human. The challenge is not to eliminate bias but to understand its biological roots, and to use that understanding to foster empathy and design systems that make our collective decisions a little less polarized, and a little more fair.