A Los Angeles jury found Meta and YouTube liable in a landmark case over social media addiction. The verdict establishes that platforms can be held responsible for deliberately addictive design - and those internal emails just became legal liabilities.
As someone who's been inside tech companies, I can tell you about the "engagement" metrics that drive these products. Time on site. Daily active users. Session length. Every feature is A/B tested to make whichever of those numbers the team owns go up. The jury just said that's not innovation - it's negligence.
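To make that concrete, here is a deliberately simplified, hypothetical sketch of how an engagement A/B test gets scored. Every name in it is invented for illustration; real pipelines are far more elaborate, but the shape of the decision is the same: the variant wins on whichever metric goes up, and there is no column for harm.

```python
# Hypothetical A/B scoring sketch: pick the variant with the bigger
# engagement number. Note what's absent: any measure of user wellbeing.
from statistics import mean

def pick_winner(sessions_a, sessions_b):
    """Return the variant whose mean session length (seconds) is higher."""
    return "A" if mean(sessions_a) >= mean(sessions_b) else "B"

# Made-up numbers: variant B (say, an autoplay tweak) keeps people
# scrolling longer, so it "wins" - the scorecard never asks whether
# longer sessions are good for the people having them.
control = [310, 280, 295, 330]     # session lengths under current design
autoplay = [420, 390, 445, 410]    # session lengths with the tweak enabled
print(pick_winner(control, autoplay))  # prints "B"
```

When the only question the test can answer is "which number went up," the product can only ever get stickier.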
The case centered on claims that Meta (Facebook and Instagram) and Google's YouTube deliberately designed their platforms to be addictive, particularly for young users. The plaintiffs presented internal documents showing the companies knew their products were causing harm but prioritized growth over safety.
Sound familiar? It should. This is the tobacco playbook: know your product is harmful, optimize it anyway, and deny responsibility when people get hurt.
What makes this verdict significant is that it pierces the Section 230 shield that typically protects platforms from liability for user-generated content. The jury found that the design of the platforms - the infinite scroll, the algorithmic feeds, the notification systems - was itself the harm. You can't Section 230 your way out of that.
For tech companies, this changes everything. Those internal Slack messages about "increasing stickiness"? Discovery. Those A/B tests that measured how to keep teens scrolling? Evidence. Those emails where someone pointed out the mental health risks and got ignored? Exhibit A.
Every major platform uses similar engagement tactics. They all have internal research about the psychological effects. They all made the same calculation: growth now, deal with consequences later. "Later" just arrived.
I expect appeals, and this will take years to fully resolve. But the jury has spoken, and they said platforms can't hide behind "we just provide the technology" while deliberately engineering addiction. They made choices, those choices caused harm, and they're liable.
