An app literally called TeleGuard that guards nothing.
Security researchers found that TeleGuard, a chat app with over a million downloads that markets itself as ultra-secure, uploads users' private encryption keys to company servers. That makes its encryption functionally useless. You might as well be sending plaintext.
This is security theater at its worst.
Here's how encryption is supposed to work: you have a private key that never leaves your device and a public key you can share freely. Messages encrypted with your public key can only be decrypted with your private key. As long as the private key stays private, the system is secure.
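The flow above can be sketched with textbook RSA and deliberately tiny primes. This is a toy for illustration only, not TeleGuard's actual protocol; real systems use keys thousands of bits long plus padding schemes, so never use anything like this for actual security:

```python
# Toy textbook RSA with tiny primes (p=61, q=53) -- illustration only.
# Real deployments use ~2048-bit-plus keys and padding (e.g. OAEP).

p, q = 61, 53
n = p * q   # 3233: the modulus, part of both keys
e = 17      # public exponent: (n, e) is the PUBLIC key, shared freely
d = 2753    # private exponent: (n, d) is the PRIVATE key, never leaves your device
# d is chosen so that (e * d) % ((p - 1) * (q - 1)) == 1

message = 65                       # a message, encoded as a number smaller than n
ciphertext = pow(message, e, n)    # anyone can encrypt using only the public key
plaintext = pow(ciphertext, d, n)  # only the private-key holder can reverse it

print(ciphertext)  # 2790 -- unreadable without d
print(plaintext)   # 65   -- recovered with the private key
assert plaintext == message
```

The asymmetry is the whole point: the public half can be broadcast to the world, and the system stays secure precisely and only as long as `d` stays on your device.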
TeleGuard uploads your private key to their servers.
Once your private key is on someone else's server, the entire security model collapses. The company can decrypt your messages. Anyone who hacks the company can decrypt your messages. Law enforcement with a warrant can decrypt your messages. The encryption is technically present but cryptographically meaningless.
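The collapse is easy to demonstrate, because decryption is just arithmetic over the ciphertext and the private key; the math does not care who supplies the key. Using the same kind of toy textbook-RSA numbers as above (tiny primes, illustration only, not TeleGuard's real scheme), a "server" that has received your private exponent decrypts exactly as your device would:

```python
# Toy textbook RSA numbers (p=61, q=53) -- illustration only.
n, e = 3233, 17   # public key, shared with everyone
d = 2753          # private key -- suppose the app uploaded it to the server

ciphertext = pow(65, e, n)  # a message someone encrypted to you

# Your device decrypts:
on_device = pow(ciphertext, d, n)

# The server holds the same d, so it decrypts identically --
# and so does anyone who breaches that server or subpoenas it:
on_server = pow(ciphertext, d, n)

assert on_device == on_server == 65  # same plaintext, no code-breaking required
```

Nothing about the ciphertext or the algorithm is "broken" here; the key custody is. That is why encryption with escrowed private keys is cryptographically meaningless regardless of how strong the cipher is.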
What makes this particularly egregious is the marketing. TeleGuard promotes itself as a secure alternative to mainstream chat apps. It uses all the right buzzwords: end-to-end encryption, military-grade security, privacy-focused. A million people downloaded it specifically because they wanted private communications. They got surveillance with extra steps.
I see this pattern constantly in the security space: companies that use cryptographic complexity as cover for fundamentally broken implementations. They'll tout their use of AES-256 or RSA-4096, impressive-sounding numbers that mean nothing if you're handing the keys to a third party.
The scary part isn't that one app is broken. It's that a million people trusted it with their private conversations and had no way to know they were being deceived. Unless you're technically sophisticated enough to audit the code or reverse-engineer the protocol, you're relying entirely on the company's claims. And those claims were lies.
This is why security researchers get so frustrated with "trust us" encryption. Signal publishes its protocol. WhatsApp uses Signal's protocol and allows independent audits. When a company says "we've built our own proprietary security system, just trust that it works," that's a red flag the size of a billboard.
TeleGuard went even further: they built a system that looked like end-to-end encryption to casual inspection but secretly undermined it by uploading keys. That's not incompetence. That's either gross negligence or deliberate deception.
The researchers who discovered this reported it responsibly. As of this writing, TeleGuard is still available in app stores, still marketed as secure, still uploading private keys. A million downloads. Ongoing betrayal of user trust. No consequences.
For users, the lesson is simple: if a chat app isn't using a well-established, independently audited encryption protocol, don't trust it. "We built our own security system" is almost never a good sign. Use Signal. Use WhatsApp. Use something where the encryption has been verified by people who aren't being paid by the company selling it.
The technology existed to make this app actually secure. The question is why the company chose to build security theater instead.