Making quantum cloud computing sing


Ask a quantum computer the right question in the right way, and it will answer in milliseconds rather than the seconds, hours, days or even years that the best supercomputers on Earth might manage. It’s fair to say, then, that the coming quantum revolution will be game-changing in the extreme, with quantum technology enabling breakthroughs that are unthinkable with today’s classical computers. But in this article, I want to explore something wholly ‘thinkable’ and indeed entirely imaginable… how the quantum computing cloud could transform business workflows.

For this exercise I’m going to use an imagined future situation within the pharmaceutical industry. Business is good, the pipeline is busy, and an employee of a company called Futura Pharma Inc is uploading omics data to a cloud server. She knows full well that within just a few hours – probably while she’s eating dinner and getting a good night’s sleep – proprietary macros will call upon a set of quantum algorithms to generate a personalized treatment plan for a patient, as well as a suite of drug variants.

Over the next few days, the team at Futura Pharma Inc will synthesize and test the recommendations to hone the optimal treatment. Files will be automatically archived to ensure they comply with regulatory requirements. And the whole team will be nicely relaxed, knowing that good outcomes are more or less guaranteed. They might even find themselves moving on to more challenging and satisfying tasks.


Challenges

Let’s get back to the present day and look at the obstacles currently standing in the way of this scenario. The first thing to realize is that such a vision requires more than just an improvement in quantum hardware. Quantum computers are complex pieces of kit requiring dedicated infrastructure and expertise. For the foreseeable future, most companies won’t be able to simply go out and buy, own and maintain their own quantum computers – they’re going to need to access remote machines on the cloud.

Quantum clouds are great in concept, but there’s a huge gap between the maturity of quantum and classical high-performance computing services. The quantum cloud infrastructure to transfer data securely and distribute tasks effectively doesn’t yet exist. Today, classical cloud services such as Amazon Web Services (AWS) provide a flexible, secure, convenient means of access; successful quantum cloud systems will need to offer the same, and in doing so will help drive growth of this powerful, emerging technology. On the plus side, we are seeing a burgeoning ecosystem of cloud-based quantum computing. New quantum software tools and cloud services such as Amazon Braket and Microsoft Azure Quantum are up and running. But these cloud platforms are really for quantum algorithm specialists to conduct research, explore new use cases and test for potential quantum advantage.

Educational tools such as IBM Quantum Experience are enabling application specialists to get up to speed on how quantum computers work. But the fact remains that a considerable gap has still to be bridged between the current pre-production research platforms and mature quantum cloud services ready for integration into commercial workflows.

The biggest factor in all of this, of course, is us humans. We like stuff that’s easy to use, enjoyable to use, rewarding to use. Your hand might linger over a faster, more powerful tablet in the electronics store – but chances are you’ll go for the one with the seamless, more fluid user experience. This truism mustn’t be underestimated. It’s imperative that we find a way to make cloud-based quantum services easier to use: if quantum computing is to become truly commercially successful, workflows have to feel great.

Solutions

The solution here lies in digital service development. That means a focus on design thinking, with the end user’s goals always front of mind. Here’s what can be abstracted away… here are the features that really need to be accessible… this is the best way to transfer data… this is how much uptime we need. You see where I’m going. This is the process that will make sure quantum cloud solutions really sing – and succeed.

Excellent digital service design doesn’t happen by chance. It is driven by excellent service architecture. As I’ve said, designing around user needs is paramount, and requirements and trade-offs will no doubt vary depending on the use case and industry. So let me expand on that by dipping into the quantum trade-off between reliability and performance.

The fragility of quantum hardware means it requires regular calibration, which takes part of the machine offline. Whereas the rate of runtime errors on a classical computer doesn’t vary with time, on a quantum computer it gets progressively worse as system parameters drift. Calibration brings error rates back down to a base level, but deciding how often to do it affects the quality of service users experience. How often you choose to calibrate should therefore depend on what users need. If they are running many short programs, fewer calibrations are needed, since an individual failure is cheap to retry; resources can also be utilized at a higher rate, reducing costs. If they are running long programs, it is important those programs run to completion, so the computer should be calibrated regularly to increase the probability of success.
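To make that trade-off concrete, here’s a minimal sketch of how a provider might pick a calibration interval from a job profile. Everything in it – the linear drift model, the numbers, the JobProfile fields – is an illustrative assumption, not real device behavior.

```python
# Hypothetical sketch: choosing a calibration interval from the jobs a
# service typically runs. The linear drift model and all numbers are
# illustrative assumptions, not measurements from real hardware.

from dataclasses import dataclass

BASE_ERROR_RATE = 0.01   # per-hour failure rate just after calibration (assumed)
DRIFT_PER_HOUR = 0.005   # growth in that rate as system parameters drift (assumed)

@dataclass
class JobProfile:
    duration_hours: float    # typical runtime of one program
    max_failure_prob: float  # highest per-job failure probability users accept

def failure_rate_at(hours_since_calibration: float) -> float:
    """Per-hour failure rate after the device has drifted for a while."""
    return BASE_ERROR_RATE + DRIFT_PER_HOUR * hours_since_calibration

def max_calibration_interval(job: JobProfile) -> float:
    """Longest wait between calibrations that keeps even the worst-placed
    job (one starting at the very end of the interval) within tolerance."""
    worst_allowed_rate = job.max_failure_prob / job.duration_hours
    drift_budget = worst_allowed_rate - BASE_ERROR_RATE
    if drift_budget <= 0:
        return 0.0  # tolerance so tight we must calibrate before every job
    return drift_budget / DRIFT_PER_HOUR

# Many short screening runs, where a failed run is simply retried:
print(max_calibration_interval(JobProfile(0.1, 0.1)))  # ~198 hours
# One long simulation that must run to completion:
print(max_calibration_interval(JobProfile(8.0, 0.1)))  # ~0.5 hours
```

With these toy numbers, short screening jobs tolerate days between calibrations, while the eight-hour simulation forces a calibration roughly every half hour – exactly the divergence in service design described above.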

A hybrid approach

I’d like to zero in on life sciences and pharma again now, to talk about a quantum cloud solution optimized either for target identification or for quantum chemistry simulations. Submissions can be batched for initial sweeps of massive databases of potential drug targets, where low-accuracy solutions are acceptable. That means runs can be small – but because you’ll want to make a lot of them, those runs also need to be inexpensive.

Once promising leads have been identified, large, complex simulations will be required to model the drug chemistry to high accuracy. A cloud provider with a good understanding of these user needs could adapt its calibration schemes accordingly. Quality-of-service guarantees will depend on the use case, and on partnerships between quantum cloud providers and their customers.
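Here’s a rough sketch of those two submission patterns using the Amazon Braket SDK mentioned earlier. The screening_circuit function is a toy placeholder rather than a real target-identification or chemistry encoding, and the local simulator stands in for cloud hardware so the example is self-contained.

```python
# Sketch of the two submission patterns: many cheap low-accuracy sweeps,
# then one expensive high-accuracy run. Circuits are toy placeholders.

from braket.circuits import Circuit
from braket.devices import LocalSimulator

def screening_circuit(candidate: int) -> Circuit:
    """Placeholder for a small, cheap circuit screening one candidate."""
    circuit = Circuit().h(0)
    if candidate % 2:
        circuit.cnot(0, 1)
    return circuit

device = LocalSimulator()  # swap for an AwsDevice ARN to target real QPUs

# Initial sweep: many small, low-accuracy runs, so keep shots (and cost) low.
sweep = [device.run(screening_circuit(c), shots=50) for c in range(20)]
coarse = [task.result().measurement_counts for task in sweep]

# Follow-up on a promising lead: one large, high-accuracy run with many shots.
lead_result = device.run(screening_circuit(3), shots=20_000).result()
print(lead_result.measurement_counts)
```

The design point is the shots parameter: the coarse sweep keeps it (and the bill) low across many candidates, while the one run that matters buys accuracy with volume.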

When it comes to heterogeneous computing, quantum processing units (QPUs) will be able to carry out currently unfathomable computations with ease, using an entirely different form of logic from today’s computational hardware. On the other hand, some tasks which are easily carried out on classical computers today will be hard for QPUs. To maximize their potential, then, QPUs will be integrated into a hybrid computational environment that mixes classical and quantum elements.
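The canonical example of this mixing is a variational loop, where a classical optimizer repeatedly adjusts the parameters of a quantum circuit. The sketch below is hypothetical: evaluate_energy mocks the QPU call with a cosine so it runs anywhere, but the structure – a quantum evaluation inside a classical update loop – is the real pattern.

```python
# Minimal hybrid sketch: a classical optimizer (CPU) steering a quantum
# subroutine (QPU). evaluate_energy is a mock stand-in for a parameterized
# circuit run on quantum hardware, so this file runs without any QPU.

import math

def evaluate_energy(theta: float) -> float:
    """Stand-in for a QPU call returning a measured expectation value."""
    return math.cos(theta)  # a real implementation would submit a circuit

def hybrid_minimize(theta: float = 0.1, lr: float = 0.2, steps: int = 50) -> float:
    """Gradient descent on the CPU, with gradients estimated from pairs of
    QPU evaluations via the parameter-shift rule."""
    shift = math.pi / 2
    for _ in range(steps):
        grad = 0.5 * (evaluate_energy(theta + shift) - evaluate_energy(theta - shift))
        theta -= lr * grad  # classical update between quantum calls
    return theta

print(hybrid_minimize())  # drifts toward pi, the minimum of the mocked energy
```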

The seamless management of computation and data flows across heterogeneous components will be vital within the quantum cloud infrastructure. Without automated, easy-to-use tools, wide uptake of the technology won’t happen and its commercial value will erode.

But let’s not beat around the bush here. Platforms which deliver high-quality orchestration of heterogeneous computational elements will create equally high commercial value – and they will beat platforms that have higher individual component performance but lack simple integration. In the life sciences context, users need to be able to focus on the science rather than worrying about how to get different computational resources to work together coherently.
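As a toy illustration of what ‘simple integration’ might look like from the user’s side, here is an entirely hypothetical orchestration layer: the scientist declares the stages, and the platform routes each one to the right backend and forwards the data between them.

```python
# Hypothetical orchestration sketch: the user lists pipeline stages; the
# platform picks a backend for each and handles the data flow in between.

from typing import Any, Callable

def classical_preprocess(data: list[int]) -> list[int]:
    return sorted(data)  # stand-in for CPU-friendly data preparation

def quantum_kernel(data: list[int]) -> int:
    return sum(data)  # stand-in for the part a QPU would accelerate

BACKENDS: dict[str, Callable[..., Any]] = {
    "cpu": classical_preprocess,
    "qpu": quantum_kernel,
}

def run_pipeline(stages: list[str], data: Any) -> Any:
    """Route each stage to its backend, forwarding results automatically."""
    for stage in stages:
        data = BACKENDS[stage](data)
    return data

# The scientist's whole view of the heterogeneous system is one line:
print(run_pipeline(["cpu", "qpu"], [3, 1, 2]))
```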

The scenarios and visions I’ve set out here are transferable. Quantum computing will have a huge impact on just about every sector you can think of. The user needs will differ, and so will the requirements of new service platforms. But if we are to serve the new generation of companies liberating their business workflows with quantum cloud systems, then current system architectures focused on research simply must evolve.


Edmund Owen is Principal Quantum Physicist of Cambridge Consultants, part of Capgemini Invent.