Domestic abuse organizations and privacy advocates are sounding the alarm over Meta's plan to add facial recognition to its smart glasses. They warn the technology could enable stalking and harassment on an unprecedented scale, particularly for women fleeing abusive partners.
Every privacy concern about Google Glass is about to come roaring back—except now the tech actually works.
The Meta Ray-Ban smart glasses already let wearers capture photos and video discreetly. Adding facial recognition means they could soon identify anyone the wearer looks at in real time. Walk past someone on the street, and the glasses could pull up their name, workplace, social media profiles, and home address, all without the target ever knowing they'd been scanned.
UK-based domestic abuse charities are particularly concerned. They work with thousands of women in protective housing, living under assumed names to escape violent ex-partners. Facial recognition smart glasses could bypass years of careful operational security with a single glance.
Ruth Davison, CEO of Refuge, told The Independent: "This technology could quite literally put women's lives at risk. We work with survivors who have gone to extraordinary lengths to stay hidden. Smart glasses with facial recognition would make that nearly impossible."
The engineering is genuinely impressive. Meta's demos show near-instant identification, even in crowded environments with poor lighting. The on-device processing means queries don't need to round-trip to the cloud. It's fast, accurate, and increasingly hard to detect.
That's exactly the problem.
When Google Glass launched in 2013, the backlash was immediate. "Glassholes" got kicked out of bars, restaurants, and gyms. The visible camera and awkward design made wearers conspicuous. The social immune response killed the product before the technology matured.
Meta learned those lessons. The Ray-Ban collaboration produces glasses that look normal. The recording indicator light is small enough that most people don't notice it. And the facial recognition happens invisibly, on-device, with no obvious external sign.
