An engineer who just wanted to control his robot vacuum with a PlayStation 5 controller ended up with a $30,000 bug bounty after discovering a security vulnerability affecting 7,000 devices—a reminder that the most serious security flaws are often found by people just trying to make their gadgets do something fun.
The vulnerability, found in a popular brand of Wi-Fi-enabled robot vacuums, would have allowed an attacker to remotely access the device's camera and microphone, effectively turning household cleaning robots into mobile surveillance devices. The affected vacuums were internet-connected, app-controlled, and—as it turns out—seriously under-secured.
Here's how it happened: The engineer—who has chosen to remain anonymous—was reverse-engineering the vacuum's communication protocols to build custom firmware that would let him control it with a gaming controller. In the process, he discovered that the vacuum's authentication system was essentially decorative. Once he understood the API structure, he could access any vacuum on the network without proper credentials.
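The real protocol and API haven't been published, but the flaw described here matches a well-known vulnerability class: broken object-level authorization, where the server checks that *a* credential is valid but never checks that it belongs to the device being requested. A minimal sketch, with entirely hypothetical device IDs, tokens, and function names:

```python
# Illustrative sketch only -- the actual vacuum API is not public.
# Models "decorative" authentication: any valid token unlocks ANY
# device, because ownership is never checked.

DEVICES = {
    "vac-001": {"owner_token": "alice-token", "camera_feed": "feed-001"},
    "vac-002": {"owner_token": "bob-token", "camera_feed": "feed-002"},
}

# The server only tracks which tokens exist, not what they own.
VALID_TOKENS = {"alice-token", "bob-token"}

def get_camera_feed_vulnerable(device_id: str, token: str) -> str:
    # Authentication happens ("is this a real token?")...
    if token not in VALID_TOKENS:
        raise PermissionError("invalid token")
    # ...but authorization never does ("does this token own this
    # device?"), so any account can read any vacuum's camera feed.
    return DEVICES[device_id]["camera_feed"]

# Alice's token pulls Bob's camera feed -- the cross-device access
# the engineer stumbled into once he understood the API structure.
print(get_camera_feed_vulnerable("vac-002", "alice-token"))
```

With a pattern like this, "access any vacuum on the network" reduces to iterating over device IDs with a single legitimate account.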
That's bad. But it gets worse.
The vacuums included cameras for navigation and obstacle avoidance. Those cameras were accessible through the same poorly secured API. An attacker who knew of the vulnerability could potentially watch live feeds from thousands of robot vacuums in homes around the world. The microphones, used for voice commands, were similarly exposed.
The engineer reported the vulnerability through the manufacturer's bug bounty program. To their credit, the company responded quickly—confirming the issue, patching the firmware, and pushing updates to affected devices within weeks. They also paid the $30,000 bounty, which is on the higher end for IoT security findings and suggests they understood the severity.
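The contents of the actual patch aren't public, but the standard fix for this class of bug is per-object authorization: verify that the presented token owns the specific device before serving anything. A sketch under that assumption, again with hypothetical names:

```python
# Illustrative sketch only -- not the manufacturer's actual fix.
# The standard remedy: tie every request to an ownership check.

DEVICES = {
    "vac-001": {"owner_token": "alice-token", "camera_feed": "feed-001"},
    "vac-002": {"owner_token": "bob-token", "camera_feed": "feed-002"},
}

def get_camera_feed_patched(device_id: str, token: str) -> str:
    device = DEVICES.get(device_id)
    # Authorization, not just authentication: the token must own
    # this particular device.
    if device is None or device["owner_token"] != token:
        # Same error for "no such device" and "not your device",
        # so the API doesn't leak which device IDs exist.
        raise PermissionError("access denied")
    return device["camera_feed"]
```

The design point is that the check is object-level, performed on every request, rather than a one-time gate at login.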
What makes this case interesting isn't just the vulnerability itself—IoT devices are notoriously insecure, and camera-equipped gadgets with weak authentication are practically a cliché in security research. What's interesting is how it was found. The engineer wasn't conducting a security audit. He wasn't trying to find flaws. He just wanted his vacuum to respond to a game controller, thought it would be a fun project, and stumbled into a serious privacy issue along the way.
This is how a lot of significant security research happens. Someone wants to customize a device, starts poking around, discovers the security is held together with digital duct tape, and realizes they've found something genuinely concerning. The line between hobbyist tinkering and security research is often just a matter of what you find when you start looking.

