A father found himself trapped in a support nightmare after discovering that Discord knew his teenager had lied about their age, but only took action after a data breach exposed the discrepancy. The case raises uncomfortable questions about platform liability and age verification enforcement that the tech industry would prefer to ignore.

Every platform has age requirements. Almost no platform enforces them until something goes wrong. The reason is simple: actually enforcing age restrictions would mean banning millions of users and tanking engagement metrics. The incentives are completely misaligned.

Discord had the data to know this kid was underage. The information was sitting in their systems. But treating that information as real, as something requiring action rather than just a checkbox for legal compliance, would mean acknowledging that a huge percentage of their user base lied about their age.

According to the father's account, it took a data breach dumping Discord's internal records for the company to suddenly care about the age discrepancy it had known about all along. That's not a system failure. That's a system working exactly as designed: ignore the problem until external pressure forces action.

This pattern repeats across social platforms. Instagram, TikTok, YouTube: they all claim to prohibit users under 13, all have age verification at signup, and all know perfectly well that millions of kids lie and get through anyway. The platforms could implement stricter verification. They choose not to.

The father's support nightmare is instructive. Once he discovered the issue and tried to resolve it through official channels, he hit the classic tech support wall: automated responses, transfers between departments, no human with actual authority to solve the problem. This is what happens when companies optimize for scale over accountability.

The technology to verify age more strictly exists. Require ID verification. Use AI to estimate age from photos and behavior patterns. Cross-reference data with school records or parents. But each of those solutions has costs: in user friction, in privacy concerns, in excluding legitimate users.

So platforms choose the cheapest option: pretend to care, collect birthdates that nobody verifies, and hope nothing bad happens. When something does happen, blame the users for lying rather than the platforms for not checking.

The question isn't whether Discord can prevent underage users. It's whether they actually want to.
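To make the "checkbox for legal compliance" concrete, here is a minimal sketch (not Discord's actual code; the function name, minimum age, and logic are illustrative assumptions) of what a typical signup age gate amounts to: arithmetic on a birthdate the user typed in, with no verification against any external source.

```python
from datetime import date

MIN_AGE = 13  # COPPA-style minimum that most platforms claim to enforce


def passes_age_gate(self_reported_birthdate: date, today: date) -> bool:
    """The typical signup 'check': trust whatever birthdate the user types.

    Nothing here verifies the date against an ID, a photo, or any external
    record. The gate only stops users who tell the truth.
    """
    # Compute age, subtracting one if this year's birthday hasn't passed yet.
    age = today.year - self_reported_birthdate.year - (
        (today.month, today.day)
        < (self_reported_birthdate.month, self_reported_birthdate.day)
    )
    return age >= MIN_AGE


# A child born in 2015 who truthfully enters their birthdate is blocked...
print(passes_age_gate(date(2015, 6, 1), date(2025, 6, 1)))  # False
# ...while the same child who types a 1990 birthdate is waved through.
print(passes_age_gate(date(1990, 6, 1), date(2025, 6, 1)))  # True
```

The point of the sketch is that the stored birthdate is a self-reported claim, not a verified fact, which is exactly why a platform can simultaneously "have the data" showing a user is underage and treat it as meaningless until outside pressure arrives.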
