One in three people no longer actively seek news from traditional outlets. Instead, they've adopted what researchers call the "news will find me" mindset—relying on algorithms and social networks to deliver information passively. And according to new research from Penn State, this makes them significantly more vulnerable to misinformation.
The study, published in Social Media & Society, reveals a troubling pattern. People with higher "news will find me" (NFM) tendencies rate algorithmically recommended content and socially shared posts as no less credible than professionally curated journalism. Those with lower NFM levels, by contrast, evaluate sources more critically and prioritize recommendations from editors and reporters.
S. Shyam Sundar, who led the Penn State research team, noted the core vulnerability: "people with this tendency to rely on news coming to them are trusting algorithms and social media friends to be their news sources."
The implications are significant. When audiences grant algorithms and social networks the same authority as journalists, bad actors can manipulate information spaces far more easily than if they had to impersonate established news organizations. It's easier to game a recommendation algorithm or create viral social content than to build decades of editorial credibility.
This isn't just about being lazy or preferring convenience. The NFM mindset represents a fundamental shift in how people conceptualize information gathering. Traditional news literacy assumes active seeking—you choose which outlets to trust, you develop media literacy skills to evaluate sources. Passive consumption breaks that model.
The research raises uncomfortable questions about platform design. Social media companies and content aggregators have spent years optimizing algorithms to keep users engaged, not to maximize information quality. When engagement becomes the metric, sensational misinformation often performs better than careful journalism.
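The dynamic above can be made concrete with a toy model (not from the study; all headlines and scores below are invented for illustration): if a feed ranks items purely by predicted engagement, a sensational low-credibility item can outrank careful journalism, whereas ranking by an editorial-quality score inverts the order.

```python
# Toy sketch: engagement-ranked vs. credibility-ranked feeds.
# Hypothetical items and scores, invented purely for demonstration.
from dataclasses import dataclass

@dataclass
class Item:
    headline: str
    engagement: float   # e.g., predicted click-through rate
    credibility: float  # e.g., an editorial-quality score

feed = [
    Item("Careful investigative report", engagement=0.02, credibility=0.9),
    Item("Sensational rumor", engagement=0.12, credibility=0.2),
    Item("Routine policy explainer", engagement=0.03, credibility=0.8),
]

# Optimizing for engagement surfaces the rumor first...
by_engagement = sorted(feed, key=lambda i: i.engagement, reverse=True)
# ...while optimizing for credibility surfaces the report first.
by_credibility = sorted(feed, key=lambda i: i.credibility, reverse=True)

print(by_engagement[0].headline)   # top of an engagement-ranked feed
print(by_credibility[0].headline)  # top of a credibility-ranked feed
```

Real recommender systems are vastly more complex, but the core tension is the same: whichever quantity the ranking objective rewards is the quantity that rises to the top of the feed.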
Sundar proposed targeted media literacy interventions for high-NFM individuals, specifically educating them about information origins and journalistic verification processes. That's a reasonable suggestion, but it treats symptoms rather than causes. The deeper problem is structural: we've built information ecosystems that reward passive consumption while simultaneously making passive consumption risky.
The universe doesn't care how we prefer to get our news. But understanding how information actually reaches us—who curated it, what incentives shaped that curation, which verification processes it underwent—matters for distinguishing truth from fiction. Let's find out what's actually true, not just what algorithms think will keep us scrolling.