The Online Safety Act isn't just about age verification – end-to-end encryption is also at risk


The internet has been imploding since July 25, the day mandatory age verification finally landed in the UK. Britons must now prove they are over 18 to access adult-only content or view potentially harmful material online.

Many experts, technologists, politicians, and everyday users have already raised concerns about the effects these checks could have on people's online privacy, security, free speech, and access to information. Yet age verification is only one part of the 300-page Online Safety Act.

Even before becoming law, the legislation – which aims to make the internet a safer place – attracted strong criticism for another critical issue: undermining encryption. That risk is still very much on the table.

What happened to the spy clause?

Dubbed the "spy clause" by critics, Section 122 of the Online Safety Bill introduced a requirement for tech companies to client-side-scan private, encrypted messages for illegal content, such as terrorism or child sexual abuse material.

Client-side scanning – analyzing data on a device before it's encrypted or after it's decrypted – is a practice that experts have long argued cannot happen without breaking the security model of encrypted services and violating people's privacy.
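As a purely illustrative sketch (the function name and the exact-hash blocklist here are hypothetical; real deployments use perceptual hashing such as PhotoDNA rather than exact SHA-256 matches), client-side scanning boils down to inspecting the plaintext on the device before it is ever encrypted:

```python
import hashlib

# Hypothetical blocklist of hashes of known illegal content.
# Real systems match perceptual fingerprints, not exact hashes.
BLOCKLIST = {
    hashlib.sha256(b"known-bad-file-bytes").hexdigest(),
}

def client_side_scan(payload: bytes) -> bool:
    """Return True if the payload matches the blocklist.

    This runs on the sender's device *before* encryption -- which is
    exactly why critics say it defeats end-to-end guarantees: the
    plaintext must be inspected before it is protected.
    """
    return hashlib.sha256(payload).hexdigest() in BLOCKLIST

# The app would call this before encrypting and sending a message
print(client_side_scan(b"known-bad-file-bytes"))       # True
print(client_side_scan(b"an innocent holiday photo"))  # False
```

The point of the sketch is structural, not cryptographic: however the matching is done, the check necessarily sees the unencrypted content of every message.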

Back in 2023, as the bill was about to become law, popular messaging apps like WhatsApp and Signal threatened to leave the UK if the legislation was implemented as it was.

Such strong backlash pushed the UK government to a notable concession – the scanning requirement in Section 122 would be shelved until it is "technically feasible" to implement.

The devil is in the details, though. The fact that this controversial provision has merely been shelved, not scrapped, means lawmakers are very likely still working in the background on a technically sound way to scan our chats.


So, two years on, what happened to the spy clause?

According to the Internet Society's Senior Director for Internet Trust, Robin Wilton, there are two possibilities authorities are looking at.

The first would be simply scanning messages before they are encrypted. This would mean preemptively looking at everyone's communications, "and that should be unacceptable from the outset; certainly in the EU it would be interpreted as a general monitoring obligation," Wilton told TechRadar.

This may leave authorities with only one feasible option – forcing device manufacturers to create a technical switch that law enforcement can turn on to access suspects' chats before they are encrypted. The plan seems to echo the EU's push to make digital devices monitorable at all times.

Certainly more targeted, but "that's putting the technology of mass surveillance onto every device that everyone has," said Wilton. "I don't think governments should be contemplating that either."

Why breaking encryption is a bad idea

Encryption is the process of scrambling our online communications to prevent unauthorized access.

Encrypted messaging apps – as well as secure email providers like Proton Mail and Tuta, and the best VPN services – rely on end-to-end encryption to ensure our communications remain private between the sender and the receiver, and nobody in between.
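To see why this matters, here is a deliberately insecure toy sketch of symmetric encryption, invented for illustration only (real apps use vetted ciphers like AES-GCM or the Signal protocol, never a homemade SHA-256 keystream). It shows the property a backdoor would undermine: only someone holding the shared key can recover the plaintext.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic pseudo-random keystream from the key (toy only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR plaintext with the keystream; decryption is the same operation
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

key = b"shared-secret-known-only-to-the-two-ends"
ciphertext = encrypt(key, b"meet at noon")
print(decrypt(key, ciphertext))           # b'meet at noon'
print(decrypt(b"wrong key", ciphertext))  # unreadable bytes, not the message
```

Anyone without the key – an eavesdropper, the service provider, or a government – sees only scrambled bytes, which is precisely what a mandated backdoor or pre-encryption scan would change.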

Law enforcement bodies, however, have long argued that this level of protection is an obstacle during investigations and have been pushing to create an encryption backdoor (in and out of the UK) for years.

Digital rights experts keep pushing back against this idea, though, warning that a backdoor could do more harm than good.

According to Wilton, a lawful encryption backdoor would inevitably leave citizens vulnerable to abuse by the authorities, as a new government may take a totally different view of the safeguards in place.

Take the overturning of Roe v. Wade in the US, which saw millions of women lose their reproductive data privacy protections overnight.

Beyond privacy, experts have repeatedly warned that an encryption backdoor is also dangerous because it creates a cybersecurity vulnerability that both hostile governments and cybercriminals could exploit.

"Imagine being able to attack the digital infrastructure of a country by using a botnet that is exploiting every single Android phone. It would be catastrophic. Yet, that's what happens if you build a systemic vulnerability into a monoculture of devices," said Wilton.

Beyond the Online Safety Act


Apple removed its Advanced Data Protection in the UK on February 21, 2025, after experts warned that an iCloud backdoor "jeopardizes the security and privacy of millions." (Image credit: Shutterstock / nikkimeel)

A push to scan people's private messages – undermining encryption along the way – certainly isn't exclusive to the UK government.

In the EU, for example, multiple efforts in this direction are ongoing at the time of writing. The proposal dubbed Chat Control may be closer to an agreement than at any point since 2022, coming shortly after the Commission shared further plans to be able to decrypt Europeans' private data by 2030.

That said, the looming spy clause hidden among the Online Safety Act's provisions may be only the tip of the UK surveillance iceberg.

An amendment to the already controversial Investigatory Powers Act (IPA) now requires tech firms offering messaging services (like Apple and Meta) to seek government approval before releasing new security features. These firms must also block or disable security features at the Home Office's request, without telling users.

It was a Technical Capability Notice (TCN) issued under the IPA that led Apple to kill its iCloud end-to-end encryption feature in the UK. The tech giant is currently challenging the UK government in court over the matter.

So, on one side, we have IPA rules that can interfere with the design and development of tech devices and their features – encryption included. On the other, the Online Safety Act seeks to force tech providers to take preemptive steps to prevent potential harm.

As Wilton put it, these companies are "being held liable for the behavior of their users and turned into enforcement agencies."

We don't yet know when or how all the Online Safety Act's provisions will be enforced. Given all this, though, it's safe to say that age verification may be just the beginning of our privacy and security nightmare.


Chiara Castro
News Editor (Tech Software)

Chiara is a multimedia journalist committed to covering stories to help promote the rights and denounce the abuses of the digital side of life – wherever cybersecurity, markets, and politics tangle up. She believes an open, uncensored, and private internet is a basic human need and wants to use her knowledge of VPNs to help readers take back control. She writes news, interviews, and analysis on data privacy, online censorship, digital rights, tech policies, and security software, with a special focus on VPNs, for TechRadar and TechRadar Pro. Got a story, tip-off, or something tech-interesting to say? Reach out to chiara.castro@futurenet.com
