While Western governments are still figuring out what questions to ask about AI, China is already writing rules for digital humans. Whether you like their approach or not, they're setting precedents that will influence how this technology develops globally.

China announced new regulations governing AI-generated digital humans, including restrictions on addictive features in services marketed to children. The move represents one of the first government attempts to regulate synthetic personas as they become more prevalent in entertainment and commerce.

Digital humans are AI-generated avatars that can speak, interact, and appear in videos or live streams. In China they're used as virtual influencers, customer service representatives, news anchors, and entertainment personalities. The technology has advanced to the point where distinguishing real from synthetic is genuinely difficult.

China's new rules require companies creating digital humans to register with authorities, disclose when content features synthetic personas, and avoid designs that might be "addictive" to children.

That last part is interesting. What makes a digital human addictive? The regulations don't define it precisely, but the intent is clear: China is worried about children forming parasocial relationships with AI personalities designed to maximize engagement.

The technology is impressive. The question is whether it should be deliberately engineered to keep kids hooked. China thinks not.

These regulations follow the country's broader pattern of aggressive tech regulation focused on social harm. The government previously restricted gaming hours for minors, limited livestream tipping for kids, and banned certain social media algorithms.

Western countries have mostly taken a hands-off approach. The United States has no specific regulations for digital humans. The EU is working on AI rules but hasn't addressed synthetic personas directly.
Meanwhile, the technology is advancing faster than policy.

China's regulations might be imperfect, even heavy-handed. But at least they're addressing questions that matter. Should digital humans disclose they're synthetic? Should there be restrictions on using them to target children? What responsibility do companies have for the parasocial relationships their AI personas create?

These aren't hypothetical concerns. Virtual influencers already have millions of followers. AI companion apps are growing rapidly. Digital humans are becoming commercial and emotional infrastructure.

If those digital humans are designed to maximize engagement without regard for user wellbeing, especially children's wellbeing, that creates real harm. China is trying to prevent that harm before it scales.

Western critics will point out that China's regulations serve government control as much as consumer protection. True. The disclosure requirements make it easier to monitor who's creating digital humans and what they're saying.

But that doesn't mean the underlying concerns are invalid. Digital humans are powerful technology. They can inform, entertain, and assist. They can also manipulate, deceive, and exploit.

Regulation is coming everywhere eventually. China is writing the first draft. Other countries will either learn from it or reinvent the wheel.

Some of China's requirements might become global norms. Disclosing synthetic content makes sense. Restricting deliberately addictive features for children has parallels to existing advertising regulations.

Other aspects are specific to China's political system and won't translate. But the fact that China is regulating digital humans while Western governments debate whether to regulate AI at all means China will shape this market's development.

Tech companies building digital human products should pay attention. Today it's China. Tomorrow it's the EU. Eventually, it's the US. The rules might differ, but the direction is clear.

Digital humans aren't going away.
The question is what responsibilities come with creating synthetic personas that people trust, follow, and emotionally invest in. China thinks those responsibilities include disclosure, registration, and protecting children from addictive design. That's more than most countries have done so far.

