Unsealed court documents reveal that Meta staff explicitly warned executives that encrypting Messenger would cause 7.5 million annual child sexual abuse reports to vanish—and the company rolled out encryption anyway.
The documents, obtained by IBTimes UK from a New Mexico lawsuit, show internal communications from 2022 in which Meta's Trust and Safety team quantified exactly what would be lost when end-to-end encryption went live. The number, 7.5 million reports per year, represented the vast majority of the child safety reports Meta was filing with the National Center for Missing & Exploited Children.
To be clear: Meta knew. They had the number. They made the decision anyway.
End-to-end encryption means Meta can't see message content, even if they want to. That's good for privacy: when a government demands your private conversations, Meta has nothing to hand over. But it also means Meta's automated systems can't scan for child sexual abuse material (CSAM), and automated scanning is how the company generated those 7.5 million reports.
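To make that concrete, here's a minimal sketch of how that kind of scanning works: hash matching against a database of known material. Everything here is illustrative, not Meta's actual code. Real systems, such as Microsoft's PhotoDNA, use perceptual hashes that survive resizing and re-encoding; SHA-256 stands in only to keep the sketch self-contained.

```python
import hashlib

# Toy illustration of hash-based CSAM detection. All names are
# hypothetical; production systems use perceptual hashing, not SHA-256.

# In production this is a database of hashes supplied by NCMEC and
# industry partners, not a hard-coded set.
KNOWN_ABUSE_HASHES = {"0000placeholder0000"}

def matches_known_material(attachment: bytes) -> bool:
    """Compare an attachment's hash against the known-material list."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_ABUSE_HASHES

def relay(attachment: bytes) -> None:
    """Server-side delivery path: scan the plaintext, report on a match."""
    if matches_known_material(attachment):
        file_report(attachment)
    deliver(attachment)

def file_report(attachment: bytes) -> None:
    print("match: filing a report")  # stand-in for a real reporting pipeline

def deliver(attachment: bytes) -> None:
    print("delivering attachment")   # stand-in for message delivery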
The internal documents show staffers proposing alternative detection methods: scanning images before encryption, using on-device scanning, or building AI systems that could work with encrypted data. All were either rejected or deprioritized. Meta's position was that privacy comes first, and any scanning—even for CSAM—undermines the promise of encryption.
That's a defensible philosophical position. But it's hard to square with Meta's public messaging, which has always emphasized child safety as a top priority. If you knew you were about to make 7.5 million annual reports disappear, wouldn't you at least try to build alternative detection systems first?
The New Mexico lawsuit alleges that Meta chose speed over safety. The company was facing pressure from Apple and Signal, both of which offer encrypted messaging. WhatsApp already had encryption, and Meta needed Messenger to catch up or risk losing users. Encryption became a competitive feature, and child safety was the acceptable cost.
Meta disputes this characterization. A spokesperson told me that the company has invested heavily in safety features that work with encryption, including reporting tools for users, parental controls, and partnerships with law enforcement. They also point out that most CSAM doesn't originate on Meta platforms—it's shared from other sources.
That's true, but it's also irrelevant. If Meta's systems were generating 7.5 million reports per year, those systems were working. Turning them off without a replacement is a choice.
The broader question is whether it's possible to have both privacy and safety. Apple proposed on-device scanning in 2021: a system called NeuralHash that would check photos against a database of known CSAM hashes before they were uploaded to iCloud. Privacy advocates freaked out, arguing it was a backdoor that could be expanded to scan for anything. Apple shelved the plan.
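Apple's design moved the same comparison onto the phone, so the check could run before anything was encrypted or uploaded. Here's a rough sketch of the shape of that idea, with hypothetical names; the real system was considerably more involved, layering blinded hash tables and a match threshold on top of this.

```python
import hashlib

# On-device variant: the hash check runs on the phone, before encryption,
# so the server never needs plaintext. Names are hypothetical; Apple's
# actual design also hid individual match results behind threshold
# secret sharing.

ON_DEVICE_HASH_DB = {"0000placeholder0000"}

def upload_photo(photo: bytes) -> None:
    flagged = hashlib.sha256(photo).hexdigest() in ON_DEVICE_HASH_DB
    ciphertext = encrypt_for_cloud(photo)  # hypothetical client-side encryption
    send_to_cloud(ciphertext, flagged)     # plaintext never leaves the device

def encrypt_for_cloud(photo: bytes) -> bytes:
    return photo[::-1]  # placeholder; a real client uses authenticated encryption

def send_to_cloud(ciphertext: bytes, flagged: bool) -> None:
    print(f"uploading {len(ciphertext)} bytes, flagged={flagged}")
```

The objection privacy advocates raised maps directly onto this structure: the device checks your content against a hash list the vendor controls, and nothing technical stops that list from growing beyond CSAM.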
But the alternative is what we have now: platforms that are either fully scanned (no privacy) or fully encrypted (no scanning). There's no middle ground, and that's a policy failure, not a technical limitation.
The New Mexico case is part of a wave of lawsuits targeting tech platforms for enabling abuse. New Mexico's attorney general is arguing that Meta knew their product was being used to exploit minors and failed to take adequate action. The unsealed documents are damning because they show awareness at the highest levels.
Mark Zuckerberg personally approved the encryption rollout. That's in the documents. So he knew about the tradeoff.
This isn't a story about encryption being bad. I use encrypted messaging. I think privacy matters. But I also think child safety matters, and I don't accept the argument that we can't have both. The fact that Meta didn't even try to build an alternative detection system before flipping the switch is indefensible.
The lawsuit will take years. In the meantime, those 7.5 million reports are gone. The exploitation is still happening—Meta just can't see it anymore.
The technology isn't the problem. The question is whether Meta prioritized the right things when it deployed that technology. And based on these documents, the answer is no.
