A peer-reviewed study in Nature just dropped a bombshell: X's algorithmic "For You" feed may be systematically shifting users' political views toward conservatism. And the effects appear to persist even after users stop using the algorithmic feed.
Researchers from Bocconi University and the Paris School of Economics ran a field experiment with nearly 5,000 active X users. They randomly assigned participants to either the algorithmic feed or the chronological "Following" feed for seven weeks, monitoring their content consumption and political attitudes through surveys and browser data.
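If you want intuition for why this design supports a causal claim: because feed assignment was random, the two groups are comparable in expectation, so a simple difference in average outcomes estimates the feed's causal effect. Here's a minimal sketch of that logic in Python. To be clear, the data, variable names, and the Welch's t-test are illustrative assumptions on my part, not the study's actual dataset or analysis pipeline.

```python
# Minimal sketch of the causal logic behind a randomized feed experiment.
# Illustrative only: the data and variable names are hypothetical, not the
# study's actual dataset or statistical methodology.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical outcome: a post-treatment conservatism score per participant,
# e.g. from a survey scale. Because assignment is random, the groups are
# comparable in expectation and the difference in means is a causal estimate.
n = 2500  # participants per arm (the study had nearly 5,000 total)
algorithmic = rng.normal(loc=0.15, scale=1.0, size=n)    # "For You" feed arm
chronological = rng.normal(loc=0.00, scale=1.0, size=n)  # "Following" feed arm

# Average treatment effect: difference in mean outcomes across arms.
ate = algorithmic.mean() - chronological.mean()

# Welch's t-test for whether the difference is distinguishable from zero.
t_stat, p_value = stats.ttest_ind(algorithmic, chronological, equal_var=False)

print(f"Estimated effect: {ate:.3f} (t={t_stat:.2f}, p={p_value:.4f})")
```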
The results were stark. Users on the algorithmic feed engaged more with the platform (no surprise; algorithms are optimized for engagement), but they also adopted more conservative policy priorities and were more likely to follow conservative activist accounts. The algorithm "showed more conservative and activist posts while demoting traditional news outlets."
Here's what makes this particularly concerning: the effects didn't disappear when users switched back to the chronological feed. That suggests the algorithm isn't just showing people different content—it's actually changing how they think about politics.
Now, is this intentional? That's the $44 billion question. Did Elon Musk deliberately tune the algorithm to push users right? Or is this an emergent property of optimizing for engagement, where outrage-driven conservative content happens to perform well? The study doesn't answer that. It just documents the effect.
What we do know is that algorithmic feeds aren't neutral. They shape what we see, which shapes what we think. When a platform with hundreds of millions of users has an algorithm that systematically shifts political attitudes, that's not a bug. It's a feature. The question is whose feature it is.