Manitoba will prohibit social media and AI chatbot access for young people, becoming the first Canadian province to impose such restrictions. The move follows similar efforts in Australia and multiple U.S. states.
It also comes after Australia's own ban failed spectacularly, which apparently wasn't a deterrent.
Another jurisdiction is trying to put the social media genie back in the bottle. These laws sound decisive, but implementation is always the problem. The real story is why governments keep reaching for unenforceable bans instead of regulation that might actually work.
Manitoba's approach will likely mirror Australia's: age verification systems that teenagers will immediately figure out how to bypass. Face masks to defeat facial recognition. Parents' IDs to pass identity checks. VPNs to route around geographic restrictions. The playbook for defeating these systems is already widely distributed on TikTok — the platform they're supposedly being banned from.
The ban also extends to AI chatbots, which raises interesting questions about enforcement. Social media platforms have centralized chokepoints — app stores, domain names, hosting providers. AI chatbots can run locally, operate through APIs, or be accessed through countless websites and services. How exactly does Manitoba plan to prevent a teenager from accessing ChatGPT or Claude?
Governors and premiers love these bans because they look like action. Standing up to Big Tech, protecting children, doing something about youth mental health. Great soundbites, good optics, and a nod to legitimate parental concerns.
But the actual mechanism never works. We have decades of evidence that age verification systems are trivial to bypass. Every teenager knows how to lie about a birthdate. Many have access to a parent's identification. Tech-savvy teens can use VPNs, create accounts with fake information, or simply find the countless platforms and services that don't even try to verify age.
What's frustrating is there are policy approaches that might actually help. Mandatory algorithm transparency so we understand how platforms manipulate engagement. Restrictions on infinite scroll and notification patterns designed to maximize addiction. Required parental notification for data collection and targeted advertising. Actual enforcement of existing privacy laws that already restrict what companies can do with minors' data.
