The European Union just fired a warning shot at TikTok and Meta that targets the core of how their platforms actually work. Not content moderation. Not data privacy. The algorithms themselves.
The enforcement action goes after algorithm design that targets children and creates addictive behavior. If the EU wins this fight, it could force a fundamental redesign of how social platforms work worldwide.
Here's what makes this different from previous regulatory actions: every other crackdown has focused on what platforms show. This one targets how they decide what to show. That's a much bigger deal.
The EU's argument is straightforward: TikTok and Meta deliberately design their recommendation algorithms to maximize engagement, particularly among young users. Infinite scroll. Autoplay. Personalized feeds that keep you watching. These aren't accidents; they're features.
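To see what "designed to maximize engagement" means concretely, here's a toy sketch of an engagement-first feed ranker. Every name, signal, and weight below is invented for illustration; real platform rankers are far more complex ML systems, but the shape of the objective is the point.

```python
# Toy sketch of an engagement-maximizing feed ranker.
# All field names and weights are illustrative inventions,
# not any platform's actual system.

def engagement_score(item: dict, user_topics: set) -> float:
    """Score an item purely by predicted engagement signals."""
    score = float(item["predicted_watch_seconds"])
    # Personalization: boost topics the user already engages with.
    if item["topic"] in user_topics:
        score *= 1.5
    # Recency bonus keeps the feed feeling "fresh", sustaining the scroll.
    score += 10.0 / (1.0 + item["hours_old"])
    return score

def build_feed(candidates: list, user_topics: set, n: int = 3) -> list:
    # Note what is absent: no term for user wellbeing, age, or time spent.
    return sorted(candidates,
                  key=lambda c: engagement_score(c, user_topics),
                  reverse=True)[:n]

candidates = [
    {"id": "a", "topic": "dance", "predicted_watch_seconds": 40, "hours_old": 2},
    {"id": "b", "topic": "news",  "predicted_watch_seconds": 25, "hours_old": 1},
    {"id": "c", "topic": "dance", "predicted_watch_seconds": 15, "hours_old": 30},
]
feed = build_feed(candidates, user_topics={"dance"})
print([item["id"] for item in feed])  # → ['a', 'b', 'c']
```

The objective function contains only engagement terms, which is exactly the design choice regulators are now treating as a systemic risk rather than a neutral engineering default.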
And according to EU regulators, they're illegal under the Digital Services Act.
The DSA, which went into full effect last year, requires platforms to assess and mitigate risks from their systems. That includes risks to mental health, especially for minors. The EU is now claiming that addictive algorithms are a systemic risk that platforms failed to address.
If they're right, this gets expensive fast. The DSA allows fines of up to 6% of global annual revenue. For Meta, that could be billions.
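The 6% cap is simple arithmetic. The revenue figure below is an illustrative placeholder, not Meta's actual reported number:

```python
# DSA penalties are capped at 6% of global annual turnover.
# The revenue input here is a round illustrative figure only.
DSA_FINE_CAP = 0.06

def max_dsa_fine(global_annual_revenue_usd: float) -> float:
    """Upper bound on a DSA fine for a given annual revenue."""
    return global_annual_revenue_usd * DSA_FINE_CAP

# e.g. a platform with $100B in annual revenue:
print(f"${max_dsa_fine(100e9) / 1e9:.1f}B")  # → $6.0B
```

At the revenue scale of the largest platforms, the cap comfortably clears the "billions" threshold the article describes.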
But the bigger question is whether you can actually regulate "addictiveness" without breaking the business model. Because here's the uncomfortable truth: the algorithms work exactly as designed. They maximize time on platform. They surface content you'll engage with. They're incredibly good at keeping you scrolling.
That's not a bug. That's the product.
Social media companies have known this for years. Leaked internal research at Facebook (now Meta) showed Instagram harmed the mental health of teenage users, particularly girls. The company shipped the features anyway. TikTok's algorithm is even more aggressive, with recommendation systems that can hook users in minutes.
The platforms will argue that users want these features. That parents should monitor their kids' screen time. That regulation will stifle innovation. They've made these arguments before, in every market they operate in.
The EU is calling their bluff.
What's interesting is the precedent this sets. If European regulators can force algorithmic changes, other countries will follow. The UK is already watching closely. So is Australia. Even some U.S. states are exploring similar frameworks.
TikTok and Meta can't just exit Europe the way they've threatened to do with smaller markets. Europe is too big, too wealthy, and too important to abandon. Which means if the EU wins, they'll have to actually change how their platforms work.
The question is: can you make TikTok less addictive without making it less TikTok? Can Meta's algorithm prioritize wellbeing over engagement and still keep shareholders happy?
I'm skeptical. The entire economic model of social media is built on attention capture. You don't become a trillion-dollar company by optimizing for users' mental health. You do it by optimizing for time on site.
But maybe that's exactly why regulation is necessary. Because left to their own devices, these companies have shown they'll prioritize growth over everything else.
The EU's enforcement action won't resolve quickly. These cases take years. But the signal is clear: algorithms are no longer untouchable. If you build systems that harm users, especially kids, regulators will come for you.
And unlike content moderation debates, where platforms can claim they're just hosting speech, this is about the systems they built. The code they wrote. The choices they made.
They can't claim they're just a platform when the algorithm itself is the product.