Why sending the keys to the castle – with the castle – isn’t a smart idea


Encryption has been on a long journey, spanning centuries, from leather scrolls to quantum-resistant algorithms. But if we look at security requirements for businesses 25 years ago, the world was very different. I sold encryption software back in 2000, when hardly anyone outside government knew they needed it. Much of it was free for home or personal use, so trying to sell encryption in a world where users could borrow it free of charge was tough.

One of the most notable examples at the time was OpenSSL, an open-source project that provided free encryption tools for securing internet communication. Another popular tool was PGP (Pretty Good Privacy), which had both free and commercial versions. The free version was widely used by privacy advocates, journalists, and tech-savvy users to encrypt emails and files.

However, eventually, public-key cryptography and tools like PGP started to gain traction for secure email and online communication. It was a pivotal moment as encryption moved from being a nice-to-have to becoming essential for privacy and trust online.

Tilo Weigandt

Co-Founder and COO, Vaultree.

Unlocking Encrypted Data Without Compromising Security

Fully Homomorphic Encryption (FHE) has also been in the mix for about 15 years. But it has been labelled too complicated: too processor-intensive, too storage-hungry, too slow, and more.

But we’ve seen a breakthrough in FHE whereby customers are using it not just to encrypt data, but to query, decrypt and use it. They can also search data faster when it's encrypted than when it's not!
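The core idea – computing on data while it stays encrypted – can be illustrated with a toy version of an additively homomorphic scheme (Paillier, a much simpler relative of FHE). This is a sketch with tiny hard-coded primes for illustration only, not a secure or production implementation:

```python
import math
import random

# Toy Paillier cryptosystem: additively homomorphic, i.e. you can add two
# plaintexts by multiplying their ciphertexts, without ever decrypting.
# Tiny hard-coded primes for illustration only -- NOT secure.
p, q = 293, 433
n = p * q
n_sq = n * n
g = n + 1                          # standard choice of generator
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)               # modular inverse, valid because g = n + 1

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    u = pow(c, lam, n_sq)
    return ((u - 1) // n * mu) % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts.
c1, c2 = encrypt(20), encrypt(22)
c_sum = (c1 * c2) % n_sq
print(decrypt(c_sum))  # 42 -- computed without decrypting c1 or c2
```

FHE goes much further than this toy (arbitrary computation, not just addition), but the principle is the same: the party doing the computation never sees plaintext.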

Without going into a lengthy explanation of how the underlying integers (the raw material of encryption) and cryptographic algorithms work: in simple terms, only the user can access, read, and understand the data. To anyone else, it is just garbage.

Why is this so important? If we look at how encryption tools work today, we create data, which we then encrypt. Every time we search, every time we move it, we decrypt – because that’s the only way to make the data work for us. Once we have decrypted it, we must re-encrypt it to keep it safe. So there are multiple touch points where data that we tell users is encrypted isn’t actually encrypted.
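That cycle can be sketched in a few lines. Here a toy XOR “cipher” stands in for real encryption (illustration only, not secure); the point is where plaintext appears:

```python
# Sketch of the conventional pattern: data is stored encrypted, but every
# search must decrypt each record, creating plaintext "touch points".
# Toy single-byte XOR cipher for illustration only -- NOT a real cipher.
KEY = 0x5A

def enc(s: str) -> bytes:
    return bytes(b ^ KEY for b in s.encode())

def dec(c: bytes) -> str:
    return bytes(b ^ KEY for b in c).decode()

store = [enc("alice"), enc("bob"), enc("carol")]

def search(term: str) -> bool:
    # Every record is decrypted to compare -- plaintext exists in memory
    # at each of these touch points, exactly the exposure described above.
    return any(dec(c) == term for c in store)

print(search("bob"))  # True
```

An FHE-backed search would instead evaluate the comparison over the ciphertexts themselves, so no plaintext ever materializes on the server.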

This might be acceptable for data at rest, but once you migrate data from A to B, in many cases the encryption keys are sent with it. This is the equivalent of sending the keys to the castle with the castle. And the moment the user wants to do anything with that data (arguably when it is most useful), all that safeguarding is gone.

Being able to let customers use data while it still has a safety net around it, and keep it confidential throughout, is therefore a huge leap forward. Most importantly, encryption keys don’t travel with the data; they are generated at the point of login.

Is Bootstrapping FHE The Answer?

Many in the industry advocate bootstrapping FHE as a workaround. This is a clever trick that allows encrypted data to be refreshed so it can be used in computations without becoming too noisy to decrypt. But it’s important to recognize both its promise and its challenges.
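The “noise” problem bootstrapping addresses can be shown with a toy encoding: a bit lives in the high part of an integer, noise in the low part, and homomorphic addition also adds the noise terms. This is an assumed toy model, not a real FHE scheme, and the noise values are fixed here for clarity:

```python
# Toy illustration of ciphertext noise growth -- NOT a real FHE scheme.
SCALE = 1000  # decoding works while |accumulated noise| < SCALE // 2

def encrypt(bit: int, noise: int) -> int:
    # In a real scheme the noise is small and random; explicit here.
    return bit * SCALE + noise

def decrypt(ct: int) -> int:
    return round(ct / SCALE) % 2

print(decrypt(encrypt(0, 37)))   # 0 -- fresh, low-noise ciphertext is fine

# "Add" thirty encryptions of 0: noise accumulates to 30 * 40 = 1200,
# crossing the threshold, so the result decrypts incorrectly.
noisy = sum(encrypt(0, 40) for _ in range(30))
print(decrypt(noisy))            # 1 -- accumulated noise flipped the bit

# Bootstrapping must refresh the ciphertext BEFORE the threshold is hit.
partial = sum(encrypt(0, 40) for _ in range(10))  # noise = 400, still OK
refreshed = encrypt(decrypt(partial), 5)          # noise reset to 5
# Here we cheat by decrypting openly; real bootstrapping performs this
# refresh homomorphically, under encryption -- which is why it is costly.
print(decrypt(refreshed))        # 0 -- computation can now continue
```

The expensive part in real schemes is exactly that last step: evaluating the decryption circuit itself under encryption, which is why bootstrapping dominates FHE running times.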

Done properly, bootstrapping has significant potential if implemented efficiently. It could unlock powerful capabilities in computation, especially in domains where data sensitivity is paramount.

However, bootstrapping remains computationally intensive. Even with optimized schemes it can take seconds per operation, which rules it out for most real-time applications, and implementations are highly complex.

That said, dismissing bootstrapping outright may be premature. As research and engineering efforts continue - including our own - there’s reason to believe that more efficient, scalable implementations are within reach.

We live in a data-driven world, and FHE is going to be key to enabling access to data and the use of technologies like AI tools. However, I recently read an article in which Elon Musk argued that we’re running out of real-world data to train AI models.

This idea (often called the “peak data” theory) suggests that the internet no longer offers enough fresh, high-quality data to improve large AI models meaningfully.

In other words, we will hit a point where data no longer develops in a way that is useful to organizations because of all the rules we put around it – some self-imposed, others required by regulators.

Many Cast Caution Aside

Many regulations are vague, so organizations interpret them in whatever way reduces their risk, usually by taking a more cautious approach. Other companies throw caution to the wind. We’re seeing that play out in real time.

Take Meta as an example. The company has faced multiple fines recently. The most notable in 2025 is a €200 million penalty from the European Commission under the Digital Markets Act (DMA). The fine was issued because Meta’s “pay-or-consent” model, where users had to either pay for ad-free access or consent to data tracking, was found to violate users’ rights to genuine choice.

Regulators argued that this setup pressured users into giving up their data, undermining the DMA’s goal of fair digital competition.

Improving Lives

I would argue that we must make data more usable, particularly where data helps make the world a better place. Clinical trials, for example, have faced challenges due to disparate data, which hampers progress. If a trial is quite niche, the organization needs to be able to get a decent data set to perform tests and analyze the results in a meaningful way.

We are helping organizations keep data in the jurisdiction it is supposed to be in and still collaborate, without moving the data around and without plaintext data ever flying across the internet. This means we can make people’s lives better; we can stop people dying of preventable illnesses because we understand more.

Of course, we want to use data in a way that still respects privacy. This is where we see FHE as an enabler: AI can use encrypted data to make better decisions without the underlying data, or its owner’s privacy, ever being exposed. It’s a brave new world – but an exciting one.


This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

