Leaked internal communications from YouTube show employees explicitly discussing strategies to create addictive viewing patterns—and shelving safety tools that would have reduced user engagement.
This isn't about algorithms accidentally keeping people hooked. It's about intentional design choices prioritizing watch time over user wellbeing.
The chat logs, revealed through legal discovery, show YouTube engineers and product managers using terms like "viewer addiction" in internal discussions about recommendation algorithms. More damning: the documents show that safety features designed to help users moderate their viewing habits were deliberately scrapped because they would reduce engagement metrics.
I've been in rooms where these tradeoffs get made. At a tech company, growth metrics aren't just numbers; they're tied to bonuses, promotions, and the next funding round. The pressure to optimize for engagement is immense.
But there's a difference between building an engaging product and engineering compulsive behavior. These chat logs show YouTube crossing that line.
One particularly telling exchange shows engineers discussing how to increase "session length" by making the autoplay feature more aggressive. Another thread reveals that a proposed "take a break" reminder feature was killed because early tests showed it reduced daily active users.
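To make "more aggressive" concrete: in a feature like autoplay, aggressiveness usually comes down to a handful of configuration knobs. Here's a hypothetical sketch in Python; the names and numbers are mine, not from the leaked documents.

```python
# Hypothetical autoplay configuration. "Aggressiveness" in a feature like
# this usually reduces to a few knobs; these names and values are invented
# for illustration and are not taken from the chat logs.
from dataclasses import dataclass

@dataclass
class AutoplayConfig:
    countdown_secs: int   # how long the "up next" countdown runs
    show_cancel: bool     # whether a visible cancel control is offered
    default_on: bool      # whether autoplay is on without user opt-in

gentle = AutoplayConfig(countdown_secs=10, show_cancel=True, default_on=False)
aggressive = AutoplayConfig(countdown_secs=3, show_cancel=False, default_on=True)
```

Shrinking the countdown and hiding the cancel control is the whole trick. No exotic engineering required, just a decision about whose interests the defaults serve.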
The technology isn't subtle. YouTube's recommendation algorithm is designed to keep you watching, and it's very good at its job. It learns what keeps your attention and feeds you more of it, creating what researchers call a "rabbit hole effect."
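To see why optimizing for watch time produces rabbit holes, consider a deliberately simplified sketch. Nothing below is YouTube's actual code; the names, fields, and numbers are hypothetical, but the scoring logic captures the incentive.

```python
# A minimal, hypothetical sketch of watch-time-first ranking.
# Invented names and weights; this illustrates the optimization
# target, not any real system.
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    p_click: float              # predicted probability the user clicks
    expected_watch_secs: float  # predicted watch time if clicked

def rank_for_watch_time(candidates: list[Candidate]) -> list[Candidate]:
    """Order candidates purely by expected watch time contributed.

    Note what is absent: no term for user intent, satisfaction,
    or time already spent in the session. That absence is the point.
    """
    return sorted(
        candidates,
        key=lambda c: c.p_click * c.expected_watch_secs,
        reverse=True,
    )

feed = rank_for_watch_time([
    Candidate("a", p_click=0.10, expected_watch_secs=900),  # long rabbit-hole video
    Candidate("b", p_click=0.40, expected_watch_secs=60),   # quick, satisfying answer
])
print([c.video_id for c in feed])  # ['a', 'b']: the rabbit hole wins
```

The objective never asks whether the viewer wanted to still be watching. It only asks whether they did.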
Guillaume Chaslot, a former YouTube engineer who worked on the recommendation algorithm, has been warning about this for years. "The algorithm is not neutral," he's said. "It's optimized for watch time, and watch time is not the same as user wellbeing."
YouTube has long maintained that its goal is to help people find videos they'll enjoy. But these internal communications suggest a more cynical calculation: maximizing watch time, even when the company knew it could implement safeguards.
The question now is what regulators will do with this information. In Europe, the Digital Services Act already requires platforms to give users more control over recommendation algorithms. The United States has been slower to act, but internal documents like these could change the conversation.
From a technical perspective, building less addictive features isn't hard. YouTube already has the capability to implement break reminders, viewing time caps, and less aggressive autoplay. The chat logs show these features were built and tested.
They just chose not to deploy them.
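For a sense of how little engineering a safeguard actually takes, here's a hypothetical sketch of a break reminder: a session clock and a threshold. The names and the 45-minute figure are invented for illustration, not drawn from the documents.

```python
# Hypothetical "take a break" reminder: a session clock and a threshold.
# Names and numbers invented for illustration.
import time

BREAK_AFTER_SECS = 45 * 60  # e.g. nudge after 45 minutes of viewing

class ViewingSession:
    def __init__(self) -> None:
        self.started_at = time.monotonic()
        self.reminded = False

    def should_remind(self) -> bool:
        """True once per session after the threshold is crossed."""
        elapsed = time.monotonic() - self.started_at
        if elapsed >= BREAK_AFTER_SECS and not self.reminded:
            self.reminded = True
            return True
        return False

# In a player loop, this would run on each playback tick:
session = ViewingSession()
if session.should_remind():
    print("You've been watching for a while. Time for a break?")
```

A feature this simple doesn't die for technical reasons. Per the chat logs, it died because it worked.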
The technology is impressive. The question is whether it's being used responsibly. These chat logs suggest YouTube already knew the answer—and chose engagement metrics anyway.