When Instagram chief Adam Mosseri told the BBC that 16 hours of daily platform use is "problematic" but stopped short of calling it addiction, most listeners probably heard a distinction without a difference. They would be wrong. That semantic choice is among the most consequential legal and regulatory decisions a tech executive can make, and it was not accidental.
Let me explain why this matters.
If Instagram causes addiction, the company faces product liability. Addiction is a recognized clinical category with measurable criteria. Products that cause addiction invite the kind of regulatory oversight the FDA already applies to tobacco and pharmaceuticals. Plaintiffs in class action lawsuits can argue that an addictive product caused them harm the same way cigarettes caused cancer.
If Instagram causes "problematic use," the company can offer screen time tools, parental controls, and educational resources — and call that a sufficient response. There is no legal or regulatory framework specifically targeting "problematic use." It is a free pass from liability dressed up in clinical-sounding language.
This is the playbook that tobacco companies perfected and social media companies have been quietly refining for years. Meta's own internal research, disclosed by whistleblower Frances Haugen, showed that Instagram was harmful to teenage girls' mental health, yet the company continued optimizing for engagement anyway. That is not ignorance; that is awareness of the harm combined with a legal strategy to avoid liability for it.
The number Mosseri chose to contextualize this with is telling. Sixteen hours a day is not a hypothetical. It is two-thirds of the entire day on a single app; for anyone who sleeps eight hours, it is every waking minute. The fact that the Instagram chief's response to this scenario is "problematic" rather than "we would never allow that to happen" tells you something important about the company's actual relationship with heavy users.
Heavy users are valuable users. A person spending 16 hours a day on Instagram is seeing an enormous number of ads. Their behavioral data is generating extremely detailed profiles for targeted advertising. They are creating content that keeps other users engaged. Instagram's business model benefits from maximizing time-in-app, and its recommendation algorithms are specifically engineered to do exactly that.
The legislative environment is relevant here. Multiple U.S. states have passed or are considering laws restricting social media use by minors. Congressional hearings have put social media executives on record about harms to young users. Mark Zuckerberg famously apologized to parents of harmed children at a Senate hearing. The pressure for legal accountability is real and growing.
In that environment, Mosseri's word choice is not casual. "Problematic" is the version of the statement that does not generate new legal exposure. "Addictive" is the version that would bring trial lawyers to the company's door within a week.
I am not claiming that Instagram is clinically addictive in the same sense as heroin. The science on behavioral addiction and social media is genuinely contested, and honest people disagree about where compulsive use ends and addiction begins. But Mosseri's framing is not motivated by scientific precision. It is motivated by legal risk management.
The technology that enables 16 hours of daily engagement is genuinely impressive. The question is whether the company building it has any obligation to the people it has engineered to use it that much.




