Alaska's House just unanimously passed a bill that tackles two of tech's thorniest problems: AI-generated sexual imagery and predatory social media design targeting kids. It's not the first state to move on these issues, but the combination and the bipartisan support signal that the regulatory walls are closing in.
House Bill 47, sponsored by Republican Rep. Sarah Vance, does something smart: it treats AI-generated child sexual abuse material the same as abuse material depicting real children. No more legal gray area where prosecutors have to prove an actual child was harmed. If you're generating synthetic CSAM, it's a crime. Full stop.
The bill also establishes penalties for creating deepfake sexual imagery of adults without consent - addressing the revenge porn problem that has exploded alongside accessible AI tools. This matters because the technology has gotten really good and really accessible. You don't need technical skills anymore. You just need a photo and a sketchy website.
On the social media side, the bill requires parental permission before anyone under 18 can create an account, gives parents full access to their kids' accounts, and imposes a default 10:30 PM usage curfew. It also bans algorithmic feeds and targeted advertising for minors.
Now, let's talk about what's actually enforceable here. The intent is right. The implementation is going to be messy.
How do you verify age online without creating a surveillance nightmare? How do you authenticate that the person giving parental permission is actually the parent? How do you enforce a usage curfew without platforms monitoring every user constantly? These are hard problems, and the bill doesn't solve them - it just mandates that someone figure it out.
But here's why I think this legislation matters even if the execution is imperfect: it's forcing the conversation. For years, tech platforms have argued that regulation would be impossible to implement, too burdensome, or a violation of free speech. And lawmakers mostly bought it.
Now states are calling that bluff. Alaska's bill includes a provision making AI companies liable for $1 million per occurrence if their systems are used to create child sexual abuse material. That's not a slap on the wrist. That's money.