Why Your Texts Were Never Private (The Encryption Wars)
Your texts were never private—and not just the old ones. Even now, the messages you think are encrypted can be pulled, stored, and quietly analyzed.
iMessage downgrades to SMS without telling you. WhatsApp backups sit unprotected in the cloud. And Telegram? Most of it isn’t even encrypted end-to-end.
In this video, I’ll show you exactly how your “secure” messages leak—from metadata to backup keys to quantum threats—and what you can actually do about it.
I’m Addie. I spent 8 years doing cryptography in the Air Force. Now I help companies prepare for post-quantum risk—and help people like you stay 10 steps ahead.
What If Everything You Thought About Privacy Was Dead Wrong?
Everyone thinks their texts are private. But here's the truth—it’s been a lie.
If that sentence just made your stomach drop a little—good. Let it. Because this is where the encryption rabbit hole begins.
Here’s the reality: sending an SMS is like writing a postcard in pencil and trusting strangers not to read it. Your carrier keeps a copy, and law enforcement can pull it without breaking a sweat. Often just a routine records request, and you’ll never even be notified. Just… swoop, and they’re in.
And it’s not just SMS. You know who else thought they were safe? Paul Manafort. Yeah, that Paul Manafort. He was chatting on WhatsApp, assuming “end-to-end encryption” meant FBI-proof. It didn’t. They just yanked his iCloud backups—in plain, readable text.
If the government can casually pull messages off a politician’s iPhone, what do you think they could do with yours?
Let that simmer.
Billions of people are walking around with phones they believe are locked vaults. Meanwhile, those “vaults” have side doors wide open—and barely anyone’s talking about it.
So if even political insiders and everyday users are being exposed without realizing it…
who’s actually safe?

Are You Unknowingly Fighting a War You Didn’t Sign Up For?
Here’s what no app store disclaimer tells you: you’re already a soldier in The Encryption Wars—you just didn’t know you’d been drafted.
You’ve probably installed a bunch of messaging apps that promise “secure,” “encrypted,” “private.” But let’s be real: do you actually know what those terms mean? Or who’s on the other end of that encryption?
You shouldn’t have to be a cryptography PhD to figure out if your messages are safe. And guess what? You don’t. That’s why you’re here. I’m not your professor—I’m your guide. I’ve been deep in this mess, I’ve mapped the landmines, and I’m here to make sure you don’t step on one.
Because this isn’t just about tech—it’s about power. Who gets to see what you say. Who owns your communication. Who can build a profile of your life from your digital breadcrumbs.
And the good news? You can absolutely learn how to protect yourself. By the end of this, you’ll know which apps are smoke and mirrors… and which ones are solid steel.
You’ll know how to push back. To resist.
If you’ve already been drafted into a war you didn’t sign up for… then what’s hiding inside the apps you’re trusting with your most private thoughts?
What’s Hiding Behind the Apps You Trust Most?
You think you’re safe because the app says “encrypted.”
Let’s test that.
SMS: No Encryption. No Excuse.
SMS is raw text. No encryption in transit. No encryption at rest. Your carrier can read it. So can cops. Or anyone with the right hardware.
It’s still used for bank codes, password resets, and medical alerts.
It’s not private. It never was.
iMessage: Secure—Until You Back It Up
iMessage is encrypted between Apple devices—but only blue bubbles. Text an Android user and it silently falls back to SMS; the only warning is the bubble turning green.
Even worse: with standard iCloud backups (on by default), Apple keeps a copy of your messages and the keys to decrypt them. Apple can read them. So can law enforcement with a warrant. And they have.
Apple does now offer fully end-to-end encrypted backups—but only if you manually enable Advanced Data Protection. Most people haven’t.

WhatsApp: Strong Encryption, Leaky Ecosystem
WhatsApp uses Signal Protocol to encrypt messages. Good.
But if you back up chats to iCloud or Google Drive? By default that backup isn’t end-to-end encrypted (WhatsApp added an opt-in encrypted-backup feature, but you have to turn it on). It’s a readable copy of your private life, ready to be pulled with a warrant or a breach.
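To make “end-to-end” concrete, here’s a toy sketch in Python. This is not the Signal Protocol (which adds ratcheting, forward secrecy, and much more), just the bare idea: the keys live only on the two devices, so the server in the middle sees gibberish. A plaintext cloud backup skips every line of it.

```python
# pip install pynacl. A toy illustration of end-to-end encryption, NOT the Signal Protocol.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice = PrivateKey.generate()
bob = PrivateKey.generate()

# Alice encrypts to Bob: only someone holding Bob's (or Alice's) private key can open it.
ciphertext = Box(alice, bob.public_key).encrypt(b"meet at 7, usual place")

# The server relaying the message only ever sees `ciphertext`.
plaintext = Box(bob, alice.public_key).decrypt(ciphertext)
print(plaintext)  # b'meet at 7, usual place'

# A cloud backup that uploads `plaintext` bypasses everything above.
```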
Metadata’s also fair game:
Who you talk to
When, where, how often
Device info and IP
All of it gets logged—and shared with Meta.
Telegram: Privacy Theater
Telegram talks a big game—but chats aren’t encrypted by default.
Only “Secret Chats” are end-to-end. Group chats? Regular messages? They’re stored on Telegram’s servers, with Telegram holding the keys.
They also built their own encryption protocol, MTProto, instead of using a proven one. Security experts don’t love that.
And when pressured by governments? Telegram has folded before.
Signal: The One App That Doesn’t Play You
Signal is the outlier:
End-to-end encrypted by default
No backups unless you opt in
Next to no metadata retained
No company with ads to sell you out
When subpoenaed, Signal handed over exactly two things:
→ Account creation date
→ Last time it connected
That’s it.
So… How Private Are Your Messages, Really?
Most people don’t lose their privacy because they were hacked.
They lose it because they trusted the defaults.
Encryption only works if it’s actually on, the keys are safe, and the backups aren’t leaking everything behind your back.
If you’re using iMessage, WhatsApp, or Telegram without changing settings—
you’re not encrypted. You’re exposed.
The real threat isn’t that encryption is broken—it’s that you’re trusting tools that never turned it on.
And the only reason encryption exists at all… is because a few rebels refused to let it die.

Why Did the Government Try to Ban Encryption in the '90s—and How Did Hackers Beat Them?
The reason you even have private messaging today?
It’s because a few pissed-off cryptographers decided not to roll over.
Back in the early 1990s, the U.S. government was in full panic mode. The internet was exploding, encryption was going public, and they realized—for the first time—they might actually lose the ability to watch everything.
So they tried to stop it.
Their golden ticket? The Clipper Chip—a piece of hardware the government wanted to embed in every phone and device. It would encrypt your messages, sure… but they’d keep a copy of the keys. A “lawful access” backdoor.
The whole idea was built on one word: trust. Trust us with the master key. Trust that we’ll only use it when we really need to. Trust that it won’t leak.
No one bought it.
Then came the backlash.
Enter Phil Zimmermann, the cryptography activist who created PGP—Pretty Good Privacy. It was one of the first tools that gave regular people access to strong encryption. The government hated it. Under U.S. export rules at the time, strong crypto was classified as a munition, so shipping PGP abroad was treated like smuggling arms.
So what did Zimmermann do?
He printed the source code as a book.
Because while software exports were illegal, books were protected by the First Amendment. And so PGP spread—legally—across borders. People scanned it, typed it back in, shared it, and started using it. The government couldn’t stop it.
They launched an investigation. It dragged on for years. But they never charged him. The pressure fizzled. The Clipper Chip died. And the first crypto war ended with a rare outcome:
We won.
But it was barely a win. And it came at a cost.
The only reason encryption survived is because a tiny number of people refused to give it up. Not corporations. Not governments. Not the tech elite. Just a handful of stubborn weirdos working out of basements and cafés.
And the irony? Today’s encryption infrastructure—the stuff inside WhatsApp, Facebook Messenger’s Secret Conversations, and even parts of Google’s messaging system—is built on the Signal Protocol, which came out of a nonprofit with a headcount of around 50. No ad revenue. No data harvesting. Just code.
Billions of people rely on it every day without realizing where it came from.
It wasn’t inevitable. It wasn’t planned.
It was accidental resilience, duct-taped together by people who knew what was coming.
They won that round. Barely. But now, governments have traded wiretaps for AI, and phone taps for policy.
How Much Can Someone Learn About You Without Reading a Single Word You Said?
What if I told you that someone could know you were suicidal—without ever reading a single message?
You don’t need content to surveil someone. You need metadata.
Metadata is the shadow of everything you do online. It’s not what you said—it’s when, where, to whom, how often, and from what device. And it’s deadly accurate. You called a suicide hotline from the Golden Gate Bridge? That shows up. You messaged your doctor, then an HIV clinic, then your insurance provider—back-to-back? That shows up too.
This stuff isn’t just speculation. It’s been used in court. It’s been sold to advertisers. It’s been turned into targeting data for cops and governments. The NSA calls it “contact chaining.” Think six degrees of Kevin Bacon—except you’re Kevin, and your friends, family, exes, coworkers, and therapist all get mapped because you sent one emoji to the wrong person.
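If “contact chaining” sounds abstract, here’s roughly what it looks like as code. The records below are made up, and real programs run at carrier scale, but the mechanics really are this simple: a graph of who talked to whom, walked outward a couple of hops. No message content anywhere.

```python
from collections import defaultdict

# Hypothetical metadata records: (sender, recipient, timestamp). No content, just the shadow.
records = [
    ("you", "therapist", "2024-03-01 02:14"),
    ("you", "hiv_clinic", "2024-03-01 02:31"),
    ("you", "insurance", "2024-03-01 02:40"),
    ("journalist", "you", "2024-03-02 23:55"),
]

# Build an undirected contact graph from the metadata.
graph = defaultdict(set)
for sender, recipient, _ in records:
    graph[sender].add(recipient)
    graph[recipient].add(sender)

def contact_chain(start, hops):
    """Everyone reachable from `start` within `hops` contacts."""
    frontier, seen = {start}, {start}
    for _ in range(hops):
        frontier = {n for node in frontier for n in graph[node]} - seen
        seen |= frontier
    return seen - {start}

print(contact_chain("you", 2))  # your contacts, and their contacts, mapped without reading a word
```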
Encrypted messages don’t hide metadata. That’s what caught Natalie Edwards. She was a whistleblower leaking Treasury documents to a journalist through WhatsApp. The feds couldn’t read her messages, but they saw who she messaged and when—and that was enough.
Even Signal, the most locked-down app in the game, can’t completely avoid this. They minimize it—but if your phone's OS is bleeding metadata and your contacts are leaky, Signal’s privacy ceiling can’t protect you from your phone’s floor.
So if governments can already paint a full picture of your life just from metadata…
why are they still demanding access to your actual messages, too?

Is the Second Crypto War Already Happening—and Are We Losing?
The first crypto war was a fistfight. The second? It’s turning into a quiet execution.
While we’ve been debating E2EE like it’s a fringe nerd issue, governments have been reloading—and now they’re going for the throat.
The U.K.’s Online Safety Act demands that companies build a way to scan encrypted messages for illegal content. That means one thing: backdoors. In practice, client-side scanning that reads your messages on your own device, before encryption ever happens. The EU has floated similar proposals under the sugar-coated name “chat control.” And the U.S.? The FBI's still pushing the tired “Going Dark” narrative—claiming they need access to encrypted chats to fight crime, terrorism, whatever sells.
But here’s the bait-and-switch: backdoors don’t just let in the “good guys.” Once the door’s there, anyone can walk through. You don’t get a “safety backdoor” and a “hacker-proof one.” It’s one key, and whoever gets it—wins.
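If that sounds hand-wavy, here’s a deliberately dumb toy model of key escrow in Python. Real proposals (the Clipper Chip included) are more elaborate, but the failure mode is the same: one escrowed key, total access for whoever ends up holding it.

```python
# pip install cryptography. A toy model of key escrow, not any real backdoor design.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()   # the "lawful access" master key, held by... someone
escrow = Fernet(escrow_key)

# Imagine every user's traffic also gets wrapped for the escrow holder.
intercepts = [
    escrow.encrypt(b"user 1: protest starts at noon"),
    escrow.encrypt(b"user 2: the test results came back"),
    escrow.encrypt(b"user 3: wire the money on Friday"),
]

# Whoever holds escrow_key reads everything: agency, rogue insider, or the attacker who stole it.
for blob in intercepts:
    print(escrow.decrypt(blob))
```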
Meanwhile, WhatsApp, under pressure, has hired over 1,000 content moderators. They aren’t just handling abuse reports; they review flagged media and accounts. Meta says no one is reading your messages. So how do moderators review them? When a chat gets reported, the app forwards decrypted copies of the recent messages straight to the reviewers.
Here’s the unspoken reality: The second crypto war isn’t theoretical. It’s already happening. And it’s dressed in nice suits, “safety” bills, and AI-powered flagging systems.
If they win this round—privacy doesn’t just take a hit. It flatlines.
The gloves are off. But what if encryption itself is about to expire… not from policy—but from physics?
Can You Actually Protect Yourself—Without Being a Tech Genius?
Before we look at what’s coming, let’s talk about what you can do right now.
Because while governments are gutting privacy in broad daylight, you still have leverage—you just need to use it.
You don’t need to learn cryptography to protect your privacy. But you do need to stop assuming the defaults have your back.
Let’s turn this into a mission. Real actions. Real gains.
🛑 Don’t use SMS. Ever. If your bank still texts you codes, switch to an authenticator app or push notifications (there’s a quick sketch of how those app codes work right after this list). If your friends insist on texting—pull them aside and explain that SMS is a wide-open megaphone, not a message.
📦 Turn off cloud backups on WhatsApp and iMessage, or switch them to the end-to-end encrypted options (WhatsApp’s encrypted backups, Apple’s Advanced Data Protection). Seriously, right now. That cute group chat backup? By default it’s sitting on a server with someone else holding the keys. That’s your vault door, wide open.
📱 Use Signal by default. It’s not perfect, but it’s the cleanest option we’ve got: open-source, zero logs, no ads, no creepy data pipelines. Get your circle on it. Make it the norm.
🕳️ Enable disappearing messages. Not just for privacy, but hygiene. You shouldn’t need to scroll through six years of convos to find the one important thing. Set it and forget it.
👁️🗨️ Stay alert to legal and policy changes. Follow the EFF, Privacy International, Fight for the Future. These orgs track the bills, court cases, and policy creep that tries to slide backdoors into your apps while you're just trying to live your life.
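For the curious: the “authenticator app” codes from that first bullet are just TOTP, a one-time code derived from a shared secret plus the current time, computed entirely on your device. Nothing travels over the carrier network. Here’s a minimal sketch, assuming the pyotp library:

```python
# pip install pyotp. A minimal sketch of the TOTP codes an authenticator app generates.
import pyotp

secret = pyotp.random_base32()   # in real life, handed to you as a QR code during setup
totp = pyotp.TOTP(secret)

code = totp.now()                # the 6-digit code, derived from secret + current time
print(code)
print(totp.verify(code))         # True. No SMS, no carrier, nothing for a SIM-swapper to grab.
```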
Now try these:
📌 Go into your phone settings. Check if WhatsApp backups are on.
📌 Scroll to your Signal settings and enable disappearing messages.
📌 DM three people and ask them to switch to Signal. Send this video as context.
You don’t need to wait for tech to save you. You can start resisting surveillance today—with just your thumbs.
But even if you take every step… the war doesn’t stop with you.
So where does this fight actually end?

What If Everything You’re Encrypting Today Is Already Being Stored to Break Tomorrow?
Even if you take every precaution—even if you follow everything we just walked through—it still might not be enough.
Because there’s a threat encryption was never designed to survive.
And it’s not coming from lawmakers or tech companies.
It’s coming from physics.
Here’s the part that feels like science fiction—but isn’t.
Right now, governments and intelligence agencies are collecting encrypted messages en masse, even the ones they can’t read yet. Why? Because they’re betting they will be able to—soon.
Quantum computing isn’t about faster Netflix. It’s about breaking math. RSA, ECC, even the cryptography that protects banking systems, military comms, and yes—your messages—all rely on problems that are hard for normal computers to solve. Quantum tears through them like wet paper.
This isn’t a someday threat—it’s already started. The NSA straight-up told vendors to start migrating to post-quantum cryptography. China is heavily investing in quantum tech, and the U.S. is scrambling to keep up. Some experts believe that if adversaries capture your traffic now, they’ll be able to decrypt it retroactively in 5–15 years.
So if you’re a journalist, a whistleblower, an activist, or just someone having private conversations you don’t want state access to later… you’ve got a problem.
The good news? There’s already a shortlist of post-quantum algorithms that could survive the storm—Kyber, Dilithium, SPHINCS+, and a few others now being standardized by NIST. The bad news? Almost no consumer apps use them yet. Signal has started layering post-quantum key agreement into its protocol, but most of the ecosystem hasn’t moved.
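If you want to poke at the post-quantum stuff yourself, here’s a rough sketch of a Kyber key exchange, assuming the Open Quantum Safe liboqs-python bindings are installed (the algorithm name depends on your version, e.g. "Kyber768" in older releases vs. "ML-KEM-768" in newer ones):

```python
# pip install liboqs-python. A sketch assuming the Open Quantum Safe bindings;
# treat the algorithm name as a placeholder for whatever your installed version supports.
import oqs

alg = "Kyber768"
with oqs.KeyEncapsulation(alg) as receiver, oqs.KeyEncapsulation(alg) as sender:
    public_key = receiver.generate_keypair()                  # receiver publishes a public key
    ciphertext, secret_tx = sender.encap_secret(public_key)   # sender derives a shared secret
    secret_rx = receiver.decap_secret(ciphertext)             # receiver recovers the same secret
    assert secret_tx == secret_rx                             # now key your symmetric cipher with it
```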
So yeah—even the strongest encryption today has an expiration date.
And most people? They’ll never know their “private” messages were just ticking time bombs, waiting to be decrypted.
You can harden your tools. You can upgrade your habits. But at some point, the question stops being what you can do…
And starts being:
Who’s still fighting to make privacy possible at all?
What If the Fight for Privacy Never Actually Ended—And You’re the Next Line of Defense?
Here’s the twist they don’t teach in school: you were never given privacy. You inherited it from people who refused to shut up.
You only have encryption today because rebels printed code in books, because whistleblowers risked prison, because coders in basements pushed updates faster than governments could draft laws. Every time they tried to slip in a backdoor, somebody slammed it shut with code, protests, lawsuits, or raw public pressure.
But that cycle never stopped. It just changed uniforms.
Round one was the Clipper Chip. Round two is in your pocket right now—wrapped in legislation with names like "Online Safety" and "Child Protection." And round three? It’s already in the labs, quietly prepping the moment encryption shatters under quantum brute force.
So the question isn’t if this war ends. It’s who fights it next.
You don’t need to be some cybersecurity prodigy. You don’t need permission. You just need to stay awake. Shift your tools. Push your friends. Learn the system well enough to bend it back in your favor. You can’t “opt out” of surveillance capitalism, but you can sure as hell make it work harder.
This is the part where you decide whether you keep sleepwalking through the digital world with your data bleeding out…
Or you fight.
So—
what side are you on?
Stay Curious,
Addie LaMarr