Civil society organizations just sent Meta a warning that's impossible to ignore: if you add facial recognition to your Ray-Ban and Oakley smart glasses, you're building tools for stalkers and sexual predators.

The statement is blunt because the concern is real. The technology to recognize faces in real time through wearable cameras absolutely works. Meta knows it works; the company has already demoed it internally. The capability exists. The question isn't whether Meta can do it. It's whether society can handle it if it does.

And according to Wired's reporting, privacy advocates are saying: no, we can't.

Here's the scenario they're worried about. Someone wearing smart glasses can walk through a crowd, look at strangers, and instantly pull up their names, social media profiles, work histories, and home addresses, all without consent, and all without the person even knowing they've been scanned. For journalists, activists, and abuse survivors, that's not a convenience feature. It's a nightmare.

Students at Harvard already built a proof of concept using Meta's current Ray-Ban smart glasses combined with facial recognition APIs. It worked flawlessly. Meta doesn't even need to add the feature; third parties are already reverse-engineering it. The difference is whether Meta legitimizes it by making it official.

The company has so far resisted adding facial recognition to its glasses, likely because it knows the backlash would be catastrophic. But the pressure to add it is enormous. Competitors are experimenting with it. Users want it. And from a purely technical standpoint, it's the logical next step.

The problem is that logical and ethical aren't the same thing. This is one of those rare cases where the tech works perfectly, and that's exactly why it shouldn't ship. Not everything that can be built should be built. And not every feature that users want is one they should have.

Meta's decision here will set the standard for the entire wearable AI industry. Either it holds the line and accepts that some features are too dangerous to release, or it ships the feature and deals with the consequences, which will likely include regulation, lawsuits, and very real harm to vulnerable people.

The technology is impressive. The question is whether anyone needs it. And more importantly, whether the people who don't want it should have to live in a world where everyone else has it.
