Give 158 scientists the exact same data and ask them a straightforward empirical question about immigration. You'd expect similar answers, right? Not quite.
A new study published in Science Advances reveals something uncomfortable about scientific objectivity: when researchers analyzed identical datasets, their conclusions aligned remarkably well with their pre-existing political views on immigration.
George J. Borjas and Nate Breznau recruited 158 researchers across 71 teams and gave them all the same task: analyze International Social Survey Program data from 1985 to 2016 to determine whether immigration affects public support for social welfare programs. Before starting, participants rated their own stance on immigration policy on a 0-6 scale.
The results were striking. Those 71 teams collectively produced 1,253 different statistical models with vastly different conclusions. And the pattern was clear: teams whose members favored more immigration tended to find evidence that immigration doesn't threaten welfare support, while teams skeptical of immigration tended to find the opposite.
Now, before you assume this is about bad actors cherry-picking data, here's where it gets interesting: the bias didn't come from calculation errors or outright fraud. It came from data-design choices: how to measure immigration levels, which countries to include, how to group variables, and what timeframe to analyze.
These sound like technical minutiae, but five specific methodological decisions accounted for roughly 68% of the variation in outcomes between pro-immigration and anti-immigration teams. The researchers ran a multiverse analysis of 883 statistical models and confirmed that ideology's effect was statistically significant in nearly 88% of cases.
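To make the "forking paths" idea concrete, here is a minimal, purely illustrative sketch of a multiverse-style analysis in Python. It uses synthetic data and made-up analytic choices (immigration measure, country subset, time window, log transform), not the study's actual 883 specifications or code; the point is only how quickly a few defensible data-design decisions multiply into a grid of models, and how one would tally the share of specifications in which the focal coefficient clears p < 0.05.

```python
# Illustrative multiverse-analysis sketch, NOT the study's actual code.
# All variable names, the synthetic data, and the specification choices
# below are hypothetical stand-ins for the kinds of decisions described
# in the article.
import itertools
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic respondent-level data standing in for the ISSP-style survey.
n = 2000
data = {
    "year": rng.integers(1985, 2017, n),
    "country": rng.integers(0, 30, n),
    "migrant_stock": rng.normal(10, 3, n),    # e.g. % foreign-born
    "migrant_flow": rng.normal(0.5, 0.2, n),  # e.g. yearly net inflow
    "welfare_support": rng.normal(0, 1, n),   # outcome scale
}

# Forking-path choices of the kind described above: how to measure
# immigration, which countries to keep, which years, whether to log.
specifications = list(itertools.product(
    ["migrant_stock", "migrant_flow"],  # immigration measure
    [30, 20],                           # all countries vs. a subset
    [(1985, 2016), (1996, 2016)],       # time window
    [False, True],                      # log-transform the measure
))

significant = 0
for measure, n_countries, (y0, y1), use_log in specifications:
    keep = (
        (data["country"] < n_countries)
        & (data["year"] >= y0)
        & (data["year"] <= y1)
    )
    x = data[measure][keep]
    if use_log:
        x = np.log(np.clip(x, 1e-6, None))
    X = sm.add_constant(x)
    fit = sm.OLS(data["welfare_support"][keep], X).fit()
    if fit.pvalues[1] < 0.05:  # p-value on the immigration term
        significant += 1

print(f"{len(specifications)} specifications, "
      f"{significant} with p < 0.05 on the immigration term")
```

In the actual study, the analogous bookkeeping was done across the teams' own models, with the researchers' reported immigration attitudes as a predictor of the results; the sketch above only mimics how a handful of choices fan out into a multiverse and how the share of "significant" specifications gets counted.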
The study also had peer reviewers (blinded to researchers' identities) rate the quality of each analysis. Interestingly, teams with moderate views on immigration received higher quality scores than those holding extreme positions on either end.
"Scientists are also human beings," Breznau told PsyPost. "They are not infallible and are not perfectly objective in their work."
This has real implications for how we think about peer review and scientific consensus. If trained researchers analyzing the same data can reach opposite conclusions based on subtle methodological choices influenced by ideology, what does that mean for fields where the "right" answer is less clear than in physics or chemistry?
The researchers acknowledge important limitations: this was exploratory rather than confirmatory evidence, there was imbalanced representation between pro- and anti-immigration researchers, and they couldn't observe the unconscious decision-making processes that might have driven the bias.
But the core finding stands: ideology shapes science, not through deliberate manipulation, but through a thousand small choices that feel entirely reasonable at the time.
The universe doesn't care what we believe. But apparently, what we believe affects what we think we've discovered about the universe. That's worth sitting with for a moment.


