Internal Meta research found that their parental supervision tools do virtually nothing to curb teens' compulsive social media use. The company built the features anyway, likely knowing they were security theater designed to placate regulators rather than protect kids.

This is the tobacco playbook applied to social media. Meta knew their parental controls were useless but shipped them anyway to check a regulatory box. The question isn't whether Big Tech will regulate itself—it's whether we'll finally stop pretending it will.

Let's talk about what these parental controls actually do. They let parents set time limits. They let parents see who their kids are messaging. They give the appearance of supervision, of control, of responsible platform design. But Meta's own research shows they don't actually reduce problematic use.

Why? Because addiction doesn't care about parental controls. When a platform is engineered to maximize engagement through dopamine-driven feedback loops, letting Dad set a screen time limit is like putting a speed bump on a highway. It might slow things down for five seconds, but it doesn't change the fundamental design.

The truly damning part is that Meta knew this. They researched it. They found that their parental supervision features weren't effective. And they launched them anyway, because they needed to show regulators and angry parents that they were "doing something" about teen mental health.

I've covered tech companies for years, and this pattern is exhaustingly familiar. Build a feature you know doesn't work. Launch it with a press release about how much you care about safety. Use it to deflect regulation. Repeat.

The social media industry has spent years insisting it can self-regulate. That it'll build the tools parents need. That it takes teen mental health seriously. And all the while, its internal research keeps showing that none of it actually works.

Parental controls aren't the problem. The platforms themselves are the problem.
You can't fix a slot machine by letting parents set betting limits. You have to question whether kids should be playing slots at all.

Europe is moving toward real regulation—age verification, restrictions on algorithmic feeds for minors, actual penalties for violations. The U.S. keeps hoping voluntary measures will work. Meta's internal research proves they won't.

The question now is how many more leaked documents we need before we stop believing that Facebook, Instagram, and their peers will ever prioritize child safety over engagement metrics. Because every time we give them a chance to do the right thing, they choose engagement instead.

Parental controls don't work.
