A jury just rejected Meta's attempt to blame users for social media addiction, and it's the most important tech verdict you haven't heard enough about.
I've built products designed to maximize engagement. The industry knows exactly what it's doing. Every A/B test, every notification strategy, every algorithmic tweak is measured against one north star metric: time on platform. We hire behavioral psychologists. We study dopamine loops. We test infinite scroll against paginated feeds and—surprise—infinite scroll wins because it removes the natural stopping point.
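To make that concrete, here's a minimal sketch of what that scoring loop looks like. Everything in it is hypothetical: the variant names, the 1.35x effect size, the exponential session model. The shape is what matters: split users between designs, measure minutes, ship the winner.

```python
# A minimal, hypothetical sketch of an engagement A/B test scored on
# time-on-platform. The effect size and session model are illustrative
# assumptions, not data from any real experiment.
import random
import statistics

random.seed(42)

def simulate_session_minutes(variant: str) -> float:
    """Fake session lengths: infinite scroll tends to run longer because
    it removes the natural stopping point at the end of a page."""
    base = random.expovariate(1 / 12)  # ~12-minute average session
    return base * (1.35 if variant == "infinite_scroll" else 1.0)

# Randomly assign users to one of the two feed designs.
sessions = {"paginated": [], "infinite_scroll": []}
for _ in range(10_000):
    variant = random.choice(list(sessions))
    sessions[variant].append(simulate_session_minutes(variant))

# The north star readout: which design keeps people on the platform longer?
for variant, minutes in sessions.items():
    print(f"{variant:16s} mean session: {statistics.mean(minutes):5.1f} min")
```

Notice what the readout never asks: whether the extra minutes were good for anyone. The metric is minutes, so the winner is whatever produces minutes.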
Then when people can't put their phones down, we shrug and say "user choice."
Meta's defense in this case was textbook tech industry deflection: People choose to use social media. They choose to keep scrolling. We just built a product people love. The implication is that if you're addicted, that's on you. The jury didn't buy it, and they shouldn't have.
Here's the thing about addiction and choice: When you engineer a product to be habit-forming, you can't claim neutrality. Las Vegas casinos know this. They design everything from the carpet patterns to the lack of clocks to keep you gambling longer. No one thinks that's an accident. No one thinks slot machines are just "giving people what they want."
Social media works the same way. The autoplay. The read receipts. The algorithmic feed that shows you exactly the content that makes you angriest or most engaged. The notifications calibrated to arrive just often enough to keep you checking. None of that is accidental. It's engineered behavioral design, informed by decades of psychology research.
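What does "calibrated" mean in practice? Here's a hedged sketch, assuming a variable-interval schedule, the unpredictable reinforcement pattern the psychology literature links to the strongest checking habits. The gap bounds and the hold-back rule are my illustrative inventions, not any company's actual parameters.

```python
# A hypothetical sketch of variable-interval notification scheduling.
# All parameters here are illustrative assumptions.
import random

def next_notification_delay(pending: int, avg_gap_minutes: float = 45.0) -> float:
    """Return minutes until the next push. Two deliberate design choices:
    1. Jitter: an unpredictable gap trains more frequent checking than a
       fixed, predictable one would (variable-interval reinforcement).
    2. Hold-back: when little is pending, wait and batch, so the eventual
       reward feels bigger when it lands.
    """
    jitter = random.uniform(0.5, 1.5)        # unpredictability is the feature
    hold_back = 2.0 if pending < 3 else 1.0  # starve, then reward
    return avg_gap_minutes * jitter * hold_back

# Simulate one user's waking day of pushes.
clock, pending = 0.0, 1
while clock < 16 * 60:
    delay = next_notification_delay(pending)
    clock += delay
    print(f"t={clock / 60:5.1f}h  push after {delay:4.0f} min (pending={pending})")
    pending = random.randint(0, 6)           # new activity accumulates
```

None of those constants is neutral. Each one is a dial someone tuned while watching a retention graph.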
This verdict matters because it says companies can't engineer addiction and then claim it's the user's fault. It creates liability for design choices. That changes incentives. Suddenly the product team has to care about whether users can actually put the app down, not just whether they're hitting engagement targets.
Will this fix social media? No. One verdict isn't a silver bullet. But it's a crack in the "we're just a platform" defense that tech companies have hidden behind for a decade. The products aren't neutral. The algorithms aren't just recommendations. They're behavioral modification engines, and now there's legal precedent that says companies can be held responsible for how they deploy them.

