How the UK is Weakening Safety Worldwide
This week, the UK put the entire world at risk. I understand that may sound like alarmist hyperbole, but follow me. To understand, we have to go back to 2016, a year that I like to describe as “the year the world got collectively blackout drunk and decided to call our ex.” A lot of wild stuff happened that year – especially in the UK – so one thing that might’ve slipped under the radar was the passing of a little law called the Investigatory Powers Act. This law would later go on to be nicknamed “The Snooper’s Charter” by critics, and it allowed the UK to dramatically expand their electronic surveillance powers. How dramatically, you might ask? Two weeks ago, Apple was ordered to insert an encryption backdoor (more on that in a moment) into iCloud. And they weren’t allowed to publicly disclose it. And also even if they wanted to fight it, they’d still have to comply while the courts considered their appeal.
Yeah. That dramatically.
The thing is, Apple has made it abundantly clear that they will refuse any request by any government to knowingly insert backdoors into their software. So, unable to fight the order without complying in the meantime, Apple simply decided to remove iCloud encryption entirely for UK users.
Advanced Data Protection
Some readers may be a little confused about how all this works (and my wording above wasn't 100% accurate for the sake of brevity), so let me break it down real quick. Encryption is the practice of taking data and making it unreadable to unauthorized parties, much like a secret code or language. Most modern devices and internet traffic are encrypted, but in such a way that the service provider still has access. Your Gmail inbox, for example, is encrypted as it travels between your phone and Google’s servers, but it’s decrypted at those two endpoints, meaning anyone at Google with the proper access can read your emails. (There are plenty of valid reasons for this, but it also opens up your data to being spied on, sold, or otherwise shared and potentially abused.) Some providers, however, take it a step further and implement what's called “end-to-end encryption” (aka “E2EE” or “zero-knowledge encryption”). This is data encrypted in such a way that it’s only ever readable on your devices, never the provider’s. (More on all of this in the page I linked earlier in this paragraph.) While there are no doubt a handful of evil people who would abuse E2EE to better cover their harmful tracks, it also benefits ordinary, law-abiding users by giving them a huge defensive boost against data breaches, massive data collection, unchecked mass surveillance, and a myriad of other threats online.
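To make that distinction concrete, here’s a deliberately simplified sketch: a textbook one-time pad in Python, not a real protocol (real E2EE systems use vetted ciphers like AES and far more careful key management). The point is purely structural: the key is generated and kept on the user’s device, so the provider only ever stores ciphertext it cannot read.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each byte with the key. With the key, the
    # operation reverses itself; without it, the output is unreadable.
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

# The key is generated on the user's device and never sent anywhere.
device_key = secrets.token_bytes(64)
note = b"my private note"

# What the provider receives and stores: ciphertext only.
uploaded = xor_cipher(note, device_key)

# Decryption is only possible where the key lives: on the device.
assert xor_cipher(uploaded, device_key) == note
```

In provider-side encryption, by contrast, the equivalent of `device_key` lives on the provider’s servers as well, and that server-side copy is exactly the access a backdoor order seeks to exploit.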
This brings us to Apple’s “Advanced Data Protection” program. Released in 2022, ADP was a huge upgrade to Apple’s iCloud suite, encrypting nearly everything stored in iCloud (except emails, contacts, and calendars) using E2EE. The feature required users to manually enable it, and while I always encourage readers to explore other options by privacy-first companies, I (among many other privacy enthusiasts) still touted this as a win for giving the everyday user an easy, effective way to protect their data. This is the encryption that Apple removed in the UK. Certain Apple data – like iMessage, Health, and Passwords – remains end-to-end encrypted as it always has been, but ADP is specifically what the UK ordered a backdoor into, and thus ADP is what was pulled.
Backdoors & Salt Typhoon
Now, quickly, let me explain what an “encryption backdoor” is since I’ve already used it several times and it’s critical to this discussion. A “backdoor” is a term used to describe any sort of loophole in the encryption (or program) that allows someone – usually the developer – to gain direct access to the program or data. Usually this is done quietly, in the background, without the consent or awareness of the end user. (This should not be confused with things like analytics, telemetry, or granting access to tech support voluntarily.)
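As a toy illustration (the names here are entirely hypothetical and have nothing to do with Apple’s actual code), a backdoor can be as simple as a second credential baked into an access check – one that works no matter who presents it:

```python
import hmac

USER_SECRET = b"user-chosen-passphrase"  # known only to the legitimate user
MASTER_KEY = b"vendor-master-key"        # the backdoor: always accepted

def unlock(attempt: bytes) -> bool:
    # A backdoored check: the user's secret OR the master key opens it.
    # Nothing here can distinguish a "good guy" holding the master key
    # from a "bad guy" who stole or leaked it.
    return (hmac.compare_digest(attempt, USER_SECRET)
            or hmac.compare_digest(attempt, MASTER_KEY))

assert unlock(USER_SECRET)   # the legitimate user gets in
assert unlock(MASTER_KEY)    # so does ANYONE who learns the master key
assert not unlock(b"guess")
```

That last property is the whole problem: the mechanism has no concept of who is *authorized* to use it, only of who holds the key.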
For years, politicians (and others) have called for backdoors in various software – usually encrypted messengers and the like – with the promise that they will only ever be used by the “good guys.” The problem is that backdoors are very aptly named: the back door of a house can be a great tool, allowing residents to come and go from their back yard conveniently and freely. Many households will even leave the back door unlocked as a failsafe for residents to get in or out, reasoning that the risk is low – the back door is behind the house, maybe behind a fence, hidden out of sight where it may be less tempting to would-be crooks than an unlocked front door.
At least, that’s the theory, right? But we all know that it’s a feel-good gesture at best. Bad guys have no way of knowing which doors are locked until they try them, and if the back door is unlocked, all it takes is one brave criminal scouting the house to hop the fence and try it. Furthermore, even if it is locked, I’ve personally noted in the past that residential locks are criminally (no pun intended) easy to pick, often taking just a few seconds for someone with even the slightest amount of practice.
There's a reason the term was applied to encryption. Technology is difficult for most people to understand. For those without training in code or electronic engineering, computers seem akin to magic boxes powered by fairy dust and electricity. Even for those trying to learn, it can be difficult to find a good entry point. As such, it’s easy – and tempting – to hear someone promise “a backdoor that only good guys can use” and go “I don’t understand the details but surely it must be true.” After all, deepfakes, cyberespionage, and quantum computers are real things. Surely the magic box full of YouTube and cats makes anything possible, right?
Except – as the tech savvy in the crowd know – no, that’s not at all how it works. A software backdoor is exactly like a physical one: it only keeps out people acting in good faith. Those determined to get in will find their way, whether that’s a vulnerability in the software/encryption itself, or by phishing a legitimate user and stealing their login credentials or any other number of ways. And that’s to say nothing about the idea of “good guys;” who are they, exactly? Insider threats are a real thing. An abusive homeowner has “legitimate” access to the house’s backdoor, just as a rogue employee would have access to the same backdoor in your encryption. People are people. Some of them are traitors, or make mistakes, or give in to temptation.
In fact, we have real-world proof that backdoors are indiscriminate: Salt Typhoon. In late 2024, it came to light that the Chinese government had hacked into at least nine American telecom providers (as well as providers in dozens of other countries) using – among other techniques – a backdoor specifically designed to be used by law enforcement for court-approved wiretapping. I cannot impress upon you, dear reader, how big of a deal this is. Every major American cell provider – Verizon, T-Mobile, AT&T, and others – had their backdoor abused by the “bad guys,” giving attackers access to call logs, messages, and tons of other sensitive data for people up to and including President Donald Trump and Vice President Kamala Harris – using the exact same tool designed for the “good guys.” This is absolute, unarguable proof that “a backdoor only the good guys can use” is a fantasy. Politicians who promise this may as well promise unicorns and orcs while they’re at it.
The Broader Pattern
At this point, you may be convinced that backdoors are bad and the UK was wrong for ordering Apple to implement one. But you may wonder how this threatens users worldwide as I claimed at the start of this article. After all, ADP was only removed from the UK. The answer lies in the larger context: this is a pattern.
I promise I’ll keep this brief, but allow me to give you a quick bit of important internet history: if you’re reading this article, there’s a possibility you've at least heard of encrypted email providers like ProtonMail or Tuta. Services like these trace their lineage back to an encryption scheme called “PGP,” which premiered in 1991. Almost immediately, the US government tried to stifle PGP, claiming that it should be classified as a weapon and therefore subject to the same regulations as weapons exports in terms of how it was distributed to foreign users. The fight was taken to court, where a federal appeals court ultimately ruled that code was free speech, that subjecting code to the same regulations as weapons exports was insane, and that intentionally weakening code to try to one-up our enemies was pointless (and also insane). Thanks to this ruling, we now have a much safer digital world with the proliferation of encryption technologies like TLS and AES, allowing for safe online shopping, encrypted storage, and much more. This case became one of the most prominent episodes of what was later dubbed the Crypto Wars. (Fun fact: this case was also what put the Electronic Frontier Foundation on the map, and their current Executive Director, Cindy Cohn, was the lead attorney in the case.)
The thing is, the Crypto Wars never really ended. It just comes and goes in phases. For example, during the first Trump administration, US Attorney General William Barr very outspokenly supported encryption backdoors. But don’t worry, that’s not a partisan take, because the Biden administration also indirectly supported backdoors via the 2022 Kids Online Safety Act, which likely would’ve encouraged platforms to weaken, backdoor, or disable encryption to comply. (No word on how Biden would’ve responded to even more hostile bills like the EARN IT Act or the STOP CSAM Act.)
But lest you think this is a US trend, rest assured that it’s not. The most infamous assault on your digital safety in Europe comes in the form of Chat Control, a proposal to force messengers like WhatsApp to scan all your messages for CSAM (Child Sexual Abuse Material). If that sounds familiar, that’s because Apple voluntarily proposed a similar system a few years ago, intending to scan photos on-device for illicit content before they were uploaded to iCloud. After public backlash – including several proofs-of-concept showing that the system was faulty and an admission that this, too, was a slippery slope ripe for abuse – Apple concluded there was no way to implement the program in a way that was both effective and reasonably abuse-proof, and dropped it altogether. (Chat Control – and other abominations like the EARN IT Act – still rear their hideous heads from time to time. Apparently the memo is lost in transit somewhere.)
The UK is now spearheading the smear campaign against privacy with things like their #NoPlaceToHide campaign, which aimed to paint anyone using encryption as a pedophile. (No, that’s not an exaggeration, actually.) There are still numerous brazen threats to privacy across the western world, including calls to erase online anonymity to fight hate speech, vast social media surveillance programs, and more. The UK’s latest demand of Apple is not just an attack on their own citizens. It’s another step forward in a global assault against privacy, online safety, and self-sovereignty for individuals. Placed in the larger context of these ongoing proposals, regulations, and attempts to shift the public narrative, this is another blow to privacy – and at least for now, an effective one.
Fighting Back
In light of the overwhelming evidence against backdoors – things like Salt Typhoon and rampant data breaches – the only conclusion I can come to is that those who push for backdoors are either ignorant (uneducated on how technology works and thus misled to believe that such a thing is possible) or malicious (knowing full well it’s not possible but prioritizing their own goals – be it national security or control – over the safety of the individuals under them). But for the UK and those who view encryption as a threat to their ends, it doesn’t matter. The end result is the same: UK users’ iCloud data is now accessible to Apple – and therefore to anyone who can compel or compromise Apple. This is a huge encouragement to other regimes seeking to undermine user safety in the name of whatever packaging they’ve picked this week: child safety, national security, etc.
There are some who speculate that Apple’s withdrawal was strategic: by no longer offering ADP in the UK, Apple can fight the order without having to comply while it works its way through the courts. I don’t know if that’s true (I haven’t seen any credible sources confirming it), but I would like to believe it is – not because I’m an Apple fan (I discourage my readers from using Apple services even if they do decide to use an iPhone), but because Apple fighting back and winning would regain lost ground for the right to privacy, much as the PGP case was ultimately a foundational win that future internet safety would go on to be built on top of. It would set a legal precedent saying “no, you can’t order backdoors” – or at least “you can’t order them in secret and force compliance in the meantime.” Either of those rulings would be a step forward for the safety of everyday, ordinary digital citizens everywhere. When one country – especially a global superpower – does something, it emboldens others to follow suit. Usually for the worse.
In the meantime, I would like to leave readers with some actionable takeaways. Even if Apple fights this case, it will take years to play out, and we have no guarantee of how it will go.
First things first: those who use iCloud should stop doing so. While ADP remains available in other countries, I’ve personally never been a fan of trusting all your data to a single, proprietary company. While iCloud is very convenient, I’m not entirely convinced it’s worth it. Many of Apple’s services – like photos, calendar, notes, and even drive storage – can be safely and easily replicated with other trustworthy services. These are companies that put privacy and security first – not bolting it on as an afterthought the way companies like Apple have done. They're also transparent and rarely have to appease investors, meaning they are incentivized to protect you – the user – instead of selling your data to maximize shareholder profits. The New Oil offers a list of such services here.
For those things that can’t be easily replicated – like health data, reminders, and digital wallets – I would encourage users to consider what they can store on-device – or better yet, live without. While I will be the first to admit the convenience of things like Apple Wallet or enjoying the insight into the quality of my sleep, I can also confidently say that the trade-off is very minimal. Studies have shown cash to be more effective at helping you stay on budget, and for most people basic sleep hygiene (like getting off screens 30 minutes before bed, keeping the bedroom dark, and being consistent with your routines) will be plenty.
Finally, I believe it’s always worth getting involved politically. While we can’t always rely on politicians and laws to protect us, I strongly believe that good privacy laws can act as a layer of defense (just compare the problem of “people search websites” between America and the EU, where the GDPR makes them virtually nonexistent). If you’re a UK citizen, you should absolutely find the contact information for your representatives and (politely) express your displeasure at this order, demanding they drop it and start protecting users online instead. (For those who need help making sure they’re communicating effectively: you’ll want to say you disapprove of the UK’s “technical capability notice” issued to Apple in January of this year regarding their Advanced Data Protection offering.) If you're unsure where to even start with political action, check out the “Get Involved” section of my website. Some of the organizations listed – like Big Brother Watch and Privacy International – are UK-based.
Conclusion
Nobody in the privacy community defends crime, especially serious crimes that harm others. But we must always balance our desire to stop the tiny percentage of criminals against our right to freedom as citizens (I have written about this many times before, probably most notably here). Many of those who support privacy-hostile proposals do so in the name of protecting the children. I don’t have kids of my own, but I do have nieces and nephews that I adore and would do anything for. I can’t help but wonder what sort of world we’re leaving them: one where they have no freedom to explore and grow, where they’re forever haunted by their mistakes in a digital permanent record, where the only thing standing between their freedom of expression and a fear of self-exploration is policy rather than real, technical safeguards that make such abuse impossible. Protecting people is admirable, and we should take all reasonable measures we can to do so. Banning encryption is neither reasonable nor protective. It’s actively putting people at risk. Force the UK to respect your privacy and safety by switching to services outside their reach, and let them know that you – a voter – do not support this authoritarian red herring.
Tech changes fast, so be sure to check TheNewOil.org for the latest recommendations on tools, services, settings, and more. You can find our other content across the web here or support our work in a variety of ways here. You can also leave a comment on this post here: Discuss...