Scarlett Johansson has joined a coalition of actors, musicians, and artists accusing AI companies of large-scale intellectual property theft, escalating the legal and political battle over how artificial intelligence systems are trained.
The group argues that AI firms scraped their voices, likenesses, and creative works without permission to train systems that can now generate synthetic performances—potentially eliminating the need to hire the actual artists.
Johansson has particular standing in this fight. Last year, OpenAI released a voice assistant that sounded remarkably similar to the AI character she voiced in the film Her, even though she had declined to work with the company. OpenAI pulled the voice after public backlash, but the incident highlighted how easily AI can clone vocal patterns.
The technology has gotten disturbingly good. Modern voice cloning systems need just a few minutes of audio to create convincing synthetic speech. Image generation models can produce photorealistic images "in the style of" specific artists. Video generation is catching up quickly.
Here's the legal question: is training an AI model on copyrighted works fair use, or is it infringement? AI companies argue they're doing the digital equivalent of an art student studying masterworks. Artists argue it's more like photocopying their work and selling the copies.
The courts haven't decided yet. Multiple lawsuits are working through the system, including cases brought by authors, visual artists, and now performers. The outcomes will determine whether AI companies need to license training data or can continue scraping anything publicly available.
What makes the celebrity involvement significant: these are people with resources to fight multi-year legal battles against well-funded tech companies. Most individual artists can't afford that. But Scarlett Johansson can.
The AI companies' position is that restricting training data would cripple AI development. They're probably right—current models require massive datasets. But "we need to steal your work to build our product" isn't typically a winning legal argument.
What's at stake: if artists win, AI companies face billions in licensing costs and may have to retrain models from scratch using only licensed or public-domain data. If AI companies win, the entire creative industry faces competition from systems trained on its own work.
The technology is impressive. The question is whether you can build it on the backs of artists without paying them.




