The European Commission is preparing major fines against Meta for systematic failures to prevent under-13 users from creating accounts on Facebook and Instagram—undermining age restrictions the company has pointed to for years whenever child safety concerns arise.
The EU is essentially saying those restrictions are theater. And regulators have the evidence to prove it.
The Charges
According to sources familiar with the investigation, the European Commission found that Meta's age verification systems are "inadequate to the point of negligence." Creating an account requires users to enter a birthdate, but there's no meaningful verification that the date is accurate.
Researchers created test accounts using obviously false birthdates—listing ages of 5, 8, and 10 years old—and all were approved without additional verification. The systems flagged suspicious activity in some cases, but users could bypass these checks by simply changing the birthdate to meet the 13+ requirement.
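The bypass works because a self-asserted birthdate gate has nothing to check the entered date against. A minimal sketch of that kind of gate (hypothetical code, not Meta's actual system) shows why re-entering a different date is all it takes:

```python
from datetime import date

MIN_AGE = 13

def age_on(birthdate: date, today: date) -> int:
    """Compute age in whole years as of `today`."""
    years = today.year - birthdate.year
    # Subtract one year if the birthday hasn't occurred yet this year.
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def signup_allowed(claimed_birthdate: date, today: date) -> bool:
    """A self-asserted age gate: trusts whatever date the user enters."""
    return age_on(claimed_birthdate, today) >= MIN_AGE

today = date(2025, 1, 1)
# A 10-year-old entering a truthful birthdate is rejected...
print(signup_allowed(date(2015, 6, 1), today))  # False
# ...but resubmitting with an earlier year sails through.
print(signup_allowed(date(2005, 6, 1), today))  # True
```

Nothing in the gate ties the claimed date to the actual user, which is the gap the Commission's researchers exploited.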
The Commission's argument: if Meta can deploy sophisticated AI to detect copyright infringement, hate speech, and ad fraud, it can verify user ages. The fact that it doesn't suggests the company benefits from lax enforcement.
Why Meta's Defense Falls Apart
For years, Meta has responded to child safety criticism by pointing to its terms of service: users must be 13 or older. The implied argument: if underage users lie about their age, that's on them and their parents, not on Meta.
The EU isn't buying it. Under Europe's Digital Services Act, platforms have affirmative obligations to prevent illegal activity—not just prohibit it in fine print. If millions of children are using your platform despite age restrictions, and you're doing nothing meaningful to stop it, you're liable.
Internal documents obtained by EU regulators reportedly show Meta has known about the scope of underage usage for years. They've discussed implementing stricter verification but decided against it due to "friction concerns"—corporate speak for "it might reduce user growth."
The Potential Penalties
The Digital Services Act allows fines up to 6% of global annual revenue. For Meta, that could mean penalties exceeding $7 billion. The Commission hasn't announced final figures, but sources indicate the fine will be "substantial enough to ensure compliance."
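The "exceeding $7 billion" figure follows directly from the 6% cap. A quick check, assuming the roughly $134.9 billion in revenue Meta reported for 2023 (the revenue figure is an assumption for illustration, not from the Commission):

```python
# DSA ceiling: fines up to 6% of global annual revenue.
DSA_MAX_RATE = 0.06
meta_annual_revenue_usd = 134.9e9  # assumed: Meta's reported 2023 revenue

max_fine = DSA_MAX_RATE * meta_annual_revenue_usd
print(f"${max_fine / 1e9:.1f} billion")  # ≈ $8.1 billion
```

Any full-year revenue above about $117 billion puts the statutory maximum over the $7 billion mark.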
Beyond the fine, Meta will likely be required to implement robust age verification systems—potentially including biometric verification, ID checks, or third-party age estimation technology. All of which will add friction, reduce growth, and cost money.
The Broader Implications
This case matters because it sets precedent for how platform liability works in the digital age. If companies can dodge responsibility by putting age restrictions in their terms of service while doing nothing to enforce them, those restrictions are meaningless.
The EU is saying: if you operate a platform, you're responsible for who's using it. Not just for detecting copyright violations or removing hate speech after the fact, but for preventing prohibited users from accessing the service in the first place.
Other countries are watching. The UK's Online Safety Act includes similar provisions. Australia is considering mandatory age verification for social media. If the EU wins this case, expect a wave of enforcement actions targeting platforms that have turned a blind eye to underage usage.
Meta has spent years building a business model that relies on maximum user growth and minimal friction. The EU is about to make that a lot more expensive.
