This company believes it has the solution to ChatGPT's privacy problems

ChatGPT OpenAI logo on smartphone in conceptual Artificial intelligence futuristic background
(Image credit: Shutterstock / Rokas Tenys)

AI is certainly the hot topic in tech this year. While governments are busy drafting ad-hoc legislation to regulate this exciting new world, commentators are increasingly divided between those thrilled by the new possibilities powered by AI applications and those fearing for users' safety.

Those concerned about safety point out that AI chatbots are trained on information available online without consent, and keep learning as they talk to users. We have already touched on the issues around how large language models (LLMs) collect and use people's data, turning ChatGPT and similar software into the latest privacy nightmare.

Now, a cryptography company believes it has found a solution to these problems. Zama claims to be able to implement fully homomorphic encryption on LLMs. So, how will encrypting ChatGPT work in practice?

Fully homomorphic encryption (FHE): what is it?

Encryption is the process of scrambling data to prevent third parties from accessing information without the right decryption key. Encryption is essential to the running of many online services, including VPNs. For messaging services like Signal and WhatsApp, and for secure email services, one of the main forms of encryption used is end-to-end encryption (E2E).
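To make the classic model concrete, here is a minimal sketch in Python using the third-party cryptography package (the data is invented for illustration). The key point is that anyone who wants to read or process the data has to decrypt it first.

```python
# Classic symmetric encryption with the third-party "cryptography"
# package (pip install cryptography). Illustrative data only.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # whoever holds this key can decrypt
cipher = Fernet(key)

token = cipher.encrypt(b"blood pressure: 120/80")
print(token)                  # scrambled bytes, useless without the key

# To process the data in any way, it must first be decrypted --
# and that decryption step is exactly where privacy can leak.
print(cipher.decrypt(token))  # b'blood pressure: 120/80'
```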

"They call it end to end, but it's just in transit," said Pascal Paillier, CTO at Zama. "It's encrypted by the user and decrypted by the server. But it's just a protection, a layer of encryption on top of the transmission."

This form of encryption is great for messaging apps, but it falls short when you don't want the second party to know what the data contains. If you want to process a large amount of medical data for predictive analysis without raising privacy issues, E2E encryption cannot help.

"Fully homomorphic encryption (FHE) allows encryption in processing, which means you can process data without decrypting it," said Paillier, adding that FHE is compatible with the classic way of encrypting data as well.

FHE therefore brings a big advantage in terms of security, as it enables third parties to manipulate data blindly, without breaching users' privacy.

Paillier went on to say, "This is the big magic of FHE. It's really, I would say, the third pillar of encryption which was missing for so many decades. We had encryption in transit. We had encryption at rest. We never had encryption at use."

Padlock on wire mesh network and glowing particle data.

(Image credit: Getty Images)

The road to encrypting ChatGPT

Paillier has been working as a cryptographer for many years, and most of his career has been devoted to solving the conundrum of how to implement FHE.

As mentioned, FHE is a type of encryption that allows data to be processed without the need to decrypt it. The concept has been around for almost 50 years, but it hasn't seen many use cases as it's extremely difficult to implement. Paillier did his PhD on FHE in the 90s, managing to create a scheme that could perform additions on encrypted data.
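For a flavor of what that early work enabled, below is a toy sketch of an additively homomorphic scheme in the style of Paillier's cryptosystem, with tiny hardcoded primes that are purely illustrative (and wildly insecure): multiplying two ciphertexts produces an encryption of the sum of their plaintexts, without ever decrypting.

```python
import math
import random

# Toy additively homomorphic encryption in the style of Paillier's
# scheme. The primes are tiny and hardcoded purely for illustration;
# this is NOT secure and NOT production code.
p, q = 17, 19
n = p * q                        # public modulus
n2 = n * n
lam = math.lcm(p - 1, q - 1)     # private key
mu = pow(lam, -1, n)             # modular inverse of lam mod n

def encrypt(m: int) -> int:
    """E(m) = (1 + n)^m * r^n mod n^2, with random r coprime to n."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """D(c) = L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# The homomorphic property: multiplying ciphertexts adds plaintexts.
c1, c2 = encrypt(20), encrypt(22)
assert decrypt((c1 * c2) % n2) == 42   # 20 + 22, computed while encrypted
```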

"In the 90s it was still a big challenge to invent FHE where you can do any type of computation," he said. "Then, in 2009, a cryptographer from IBM discovered a way to make it work. It was suddenly an eye opener for all the cryptographers in the scientific community."

This is the big magic of FHE... We had encryption in transit. We had encryption at rest. We never had encryption at use.

Pascal Paillier, CTO Zama

At the time, performance was so slow it made adoption of FHE impossible. Now, Paillier believes we are just a step away from finally building the necessary tools to apply FHE where classic encryption methods aren't enough—and this includes AI chatbots like ChatGPT. 

In simple terms, FHE would encrypt all the communication between users and the machine. AI chatbots would then reply to these queries by directly manipulating secure, encrypted data, so there's no risk that OpenAI and co. could retain and potentially misuse people's sensitive information.

Developers would also be able to train the algorithm in a more privacy-friendly way, encrypting the training data before it is processed. This most powerful functionality, though, is likely to come at a later stage.

ChatGPT is just one of many examples where fully homomorphic encryption is going to be a game changer for data privacy.

It's not difficult to envision, for example, how healthcare data privacy (a central issue, especially in the US after the fall of Roe v. Wade) would benefit considerably from this kind of encryption.

"Right now it's a question of engineering," Paillier Told TechRadar. "We need new development tools to enable any developer to actually automatically generate homomorphic applications. What we're building at Zama is this technology stack."

The challenges of fully homomorphic encryption (FHE)

There are two challenges to solve before we can fully adopt FHE: its high cost and its slow performance, both of which are currently heavily tied to hardware limitations.

Processing encrypted data means that computing platforms need to handle a much bigger and heavier volume of information at the same time. Not only that, the computations required are also more complex, as homomorphic software needs to "manipulate the ciphertexts [encrypted text] instead of the cleartext data," explained Paillier.
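The toy Paillier sketch from earlier shows this in miniature: plaintexts live modulo n while ciphertexts live modulo n², so even that best case doubles the bit length, and practical FHE schemes expand ciphertexts far more.

```python
# Ciphertext expansion in the toy Paillier sketch above:
# plaintexts are numbers mod n, ciphertexts are numbers mod n^2.
n = 323
print(n.bit_length())          # 9  -> bits available per plaintext
print((n * n).bit_length())    # 17 -> bits stored per ciphertext
```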

For the same reasons, this type of encryption is also way slower. "Everything is bigger, everything is longer in terms of time. We've reached pretty much the limit of what we can do at the mathematical level, but at the engineering side we can improve things drastically. The most dramatic way to improve this is hardware acceleration."

It's going to be some kind of small revolution within the AI revolution.

Pascal Paillier, CTO Zama

The incredible success of generative AI applications seems to be just the right push to develop new solutions even faster.

ChatGPT-like apps require ever bigger and more powerful hardware, and tech companies have been busy shaping their products accordingly. One solution currently under development uses Field Programmable Gate Arrays (FPGAs), reprogrammable chips whose architecture can be tailored specifically to crunching homomorphically encrypted data.

Paillier said he is excited about the progress so far, confirming that Zama has a lot of partnerships with FPGA designers. "It's coming faster than we expected. By 2025 there will be commercially available hardware accelerators for FHE. That was the only thing missing for mass adoption. It's going to be a revolution for everything that has to do with handling digital data. Some kind of small revolution within the AI revolution, if you will."

What's next?

It looks like we are just a couple of years away from being able to enjoy a more secure AI-powered experience. In the meantime, government regulations could help to limit the risks. But that's a road not without bumps. 

"Regulators are trying to find the best compromise between a lot of constraints. Privacy is obviously a big one, but it's not the only one," said Paillier. "It's going to be a mess for a while." 

What's certain now is that Zama and a few other players out there are working hard to speed up the process of getting fully homomorphic encryption ready for the masses. While competitors are working on specific use cases, Zama is targeting the open source developer community, with the mission of bringing FHE into the mainstream through adaptable, diversified iterations.

Chiara Castro
Senior Staff Writer

Chiara is a multimedia journalist committed to covering stories to help promote the rights and denounce the abuses of the digital side of life—wherever cybersecurity, markets and politics tangle up. She mainly writes news, interviews and analysis on data privacy, online censorship, digital rights, cybercrime, and security software, with a special focus on VPNs, for TechRadar Pro, TechRadar and Tom’s Guide. Got a story, tip-off or something tech-interesting to say? Reach out to chiara.castro@futurenet.com