EVA DAILY

TECHNOLOGY | Thursday, February 19, 2026 at 6:32 PM

UK Law Will Force Tech Platforms to Remove Abusive Images Within 48 Hours

New UK legislation will require tech companies to remove abusive images - including non-consensual intimate images and CSAM - within 48 hours of notification, backed by real legal penalties. The law sets a concrete, enforceable deadline that goes further than most existing platform content moderation frameworks and could serve as a global regulatory model.

Aisha Patel

Photo: Unsplash / Alicja Ziajowska

For years, the tech industry's approach to abusive image content has followed a familiar playbook: voluntary commitments, self-regulatory frameworks, and promises to 'take this seriously' that have rarely translated into measurable action at the speed victims need. The United Kingdom is changing that calculus.

New UK legislation reported by the BBC will impose a hard 48-hour deadline on tech companies to remove abusive images after receiving notification - including non-consensual intimate images (colloquially known as 'revenge porn') and child sexual abuse material. The law sets a concrete, legally enforceable window backed by real penalties, going significantly further than most existing platform content moderation frameworks.

To understand why this matters, you need to understand the current state of content moderation at scale. Meta, Google, X (formerly Twitter), and other major platforms process billions of pieces of content. Their moderation systems are a mix of automated detection, hash-matching against databases of known illegal content, and human review queues. The hash-matching approach - built on Microsoft's PhotoDNA technology and the hash databases maintained by the National Center for Missing and Exploited Children - is reasonably effective at detecting known CSAM. The gaps are novel content, content on smaller platforms, and intimate images that never trigger automated detection.
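To make the pattern concrete, here is a minimal sketch of the hash-matching idea in Python, using the open-source imagehash library rather than PhotoDNA itself, which is proprietary. The blocklist hash and the distance threshold are purely illustrative, not values from any real system.

```python
# Minimal sketch of perceptual-hash matching against a blocklist of
# known abusive images. This illustrates the general technique only;
# PhotoDNA is proprietary, so the open-source imagehash library stands
# in for it here.
from PIL import Image
import imagehash

# Illustrative blocklist: in production this would be a large database
# of hashes distributed by a clearinghouse such as NCMEC.
KNOWN_HASHES = [imagehash.hex_to_hash("d1d1b1a1c1e1f101")]

# Maximum Hamming distance treated as a match. Real systems tune this
# to trade off false positives against recall; 8 is illustrative.
MATCH_THRESHOLD = 8

def matches_known_content(path: str) -> bool:
    """Return True if the image's perceptual hash is near a known hash."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)
```

The weakness described above falls directly out of this design: an image that was never hashed into the database, or one altered enough to exceed the distance threshold, passes through undetected.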

A 48-hour deadline with teeth differs from a voluntary commitment in two ways. First, it creates genuine legal liability that corporate legal teams must manage. Second, it forces platforms to build infrastructure capable of processing and acting on removal requests within that window - infrastructure some platforms currently lack.
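What that infrastructure means in practice is, at minimum, a queue that stamps every notification with a hard deadline and always works the request closest to breach first. The sketch below shows one way that could look; all names here are hypothetical, not drawn from any platform's actual systems.

```python
# Illustrative sketch of a takedown queue with a hard 48-hour deadline.
# Each notification is stamped on arrival; the queue always surfaces
# the request closest to breaching its legal window.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
import heapq

REMOVAL_WINDOW = timedelta(hours=48)  # the statutory deadline

@dataclass(order=True)
class RemovalRequest:
    deadline: datetime                      # orders the priority queue
    content_id: str = field(compare=False)  # excluded from ordering

class TakedownQueue:
    def __init__(self) -> None:
        self._heap: list[RemovalRequest] = []

    def notify(self, content_id: str) -> RemovalRequest:
        # The clock starts at notification, not at triage or review.
        request = RemovalRequest(
            deadline=datetime.now(timezone.utc) + REMOVAL_WINDOW,
            content_id=content_id,
        )
        heapq.heappush(self._heap, request)
        return request

    def next_due(self) -> RemovalRequest | None:
        # Pop the request with the least time remaining.
        return heapq.heappop(self._heap) if self._heap else None
```

The hard part for platforms is not this data structure; it is staffing and automating the review behind next_due() so that every popped request is actually resolved inside the window.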

The tech industry will not love this. The objections are predictable: the 48-hour window is operationally difficult for platforms that receive thousands of removal requests per day; the law raises questions about encrypted messaging, where platform operators genuinely cannot see content; and there are concerns about scope creep into legitimate content.

Some of those concerns are reasonable. Encrypted messaging is a genuine technical challenge - a platform like Signal is designed so that it cannot see user content, and it cannot remove what it cannot see. And content moderation at speed creates real risks of over-removal, where automated systems or rushed human review catch legitimate content in the net.

But those technical limitations, real as they are, have also served as convenient cover for inaction on mainstream platforms that have the technical capability to act much faster than they typically do. The gap between what these platforms can do when they choose to prioritize something and what they do as standard practice is enormous.

Victims of non-consensual image sharing lose relationships, careers, and mental health while platforms work through backlogs measured in weeks. Forty-eight hours is not fast in an absolute sense. It is, however, an enforceable standard where none existed before.

The bigger question is whether the UK's approach triggers regulatory convergence elsewhere. The EU's Digital Services Act has created reporting requirements and risk assessment obligations. The United States has FOSTA-SESTA for certain content but nothing comparably broad. If Britain's 48-hour rule survives legal challenge and proves workable, it will be watched closely in Brussels and Washington as a potential model.

For platforms, the message is increasingly clear: voluntary cooperation has not been enough. Mandatory deadlines are coming, and building the infrastructure to meet them is no longer optional.
