Evan Spiegel isn't some Luddite critic standing outside the tech industry throwing rocks. He runs Snap, a company with 400 million daily users and billions in revenue. So when he tells other tech leaders they're "drastically underestimating" public anger about AI, it's worth paying attention.
Speaking at a tech conference this week, Spiegel warned that the industry is courting a serious backlash over AI—specifically the economic displacement, creative theft, and erosion of trust that generative AI is causing. He didn't mince words: "We're building resentment faster than we're building value."
The Three Pillars of Anger
Spiegel identified three sources of growing public frustration:
1. Economic displacement without solutions. AI companies promise that automation will free humans from drudgery while offering no coherent plan for people whose livelihoods depend on the work being automated. "We're telling writers, artists, and programmers that AI will make them more productive," Spiegel said. "What we're actually doing is making their skills less valuable."
2. Creative theft at scale. Generative AI models are trained on copyrighted work without permission or compensation. Artists see their styles replicated. Writers find their prose regurgitated. Musicians discover AI-generated tracks that sound suspiciously like their compositions. The industry's response has been, essentially, "that's how machine learning works." That's not flying anymore.
3. Erosion of trust. AI-generated content is flooding the internet—fake news, deepfakes, synthetic media that's indistinguishable from reality. The technology makes it trivial to generate convincing lies at scale. "We're teaching people they can't trust anything they see or read," Spiegel warned. "That has consequences."
Why This Matters
The tech industry has weathered backlash before. Privacy scandals, antitrust investigations, content moderation controversies—companies have navigated all of it with a combination of lobbying, PR campaigns, and minor policy adjustments.
But Spiegel argues AI is different. It's not a single issue that can be addressed with better privacy controls or content policies. It's a technology that fundamentally threatens people's economic security, creative autonomy, and ability to distinguish truth from fiction.
"This isn't like Cambridge Analytica," he said, referencing the Facebook data scandal. "This is structural. And it's accelerating."
Is This Genuine Concern or Strategic Positioning?
Cynics will note that Snap has been cautious with generative AI compared to competitors. The company uses AI for features like augmented reality lenses and recommendation algorithms, but hasn't gone all-in on generative content the way Meta, Google, and Microsoft have.
Is Spiegel genuinely worried about societal backlash, or is he positioning Snap as the "responsible" tech company to differentiate from competitors?
Probably both. Spiegel has a history of taking contrarian positions that turn out to be strategically smart. Snap resisted algorithmic feeds when everyone else was chasing engagement metrics, and it prioritized privacy features like disappearing messages when competitors were monetizing user data. Both moves looked risky at the time and aged well.
If AI backlash materializes—and there are signs it's already starting—Snap's restraint could look prescient.
The Brewing Backlash
There's evidence Spiegel is onto something. Artists are organizing boycotts of AI image generators. The Writers Guild of America successfully negotiated contract terms limiting AI use in Hollywood. Lawsuits from The New York Times, Getty Images, and multiple authors are working through courts.
Public opinion is shifting too. Early excitement about AI tools is giving way to concern about job security, misinformation, and concentration of power in tech companies. The "move fast and break things" ethos that worked for social media isn't playing well when the thing being broken is people's careers.
What Happens Next
The tech industry can either address these concerns proactively—fair compensation for training data, transparency about AI use, investment in economic transition support—or it can wait for regulation and public backlash to force changes.
Spiegel's warning suggests he thinks the industry is choosing the latter. And if he's right, the AI boom could end not with regulation or technical limitations, but with a public that simply refuses to accept what's being built.
