A professional’s guide to Quantum Technology
- Preface: Why this guide?
- Part 1: The background: Why are we so enthusiastic about Quantum Technology?
- Part 2: The applications of Quantum Computing: how will these machines change society?
- Part 3: When can we expect a Quantum Computer? (Coming soon)
- Part 4: Why can’t we trust every claimed Quantum breakthrough? (Coming soon)
- Part 5: How can my organisation get started with Quantum? (Coming soon)
- Further reading: An overview of resources
Part 2: The applications of Quantum Computing: How will these machines change society?
In the previous part, we saw that Quantum Computers are actually quite slow computers, but they happen to solve certain problems more efficiently, that is: in fewer steps. Their true usefulness, of course, depends on which specific problems they happen to solve well!
What are projected applications and use cases?
1. Simulation of other quantum systems and materials
Most materials can be studied very well using our classical computers. However, for certain materials, the locations of the electrons within molecules are notoriously hard to describe, and require quantum mechanics to be carefully simulated. Such problems are prototypical examples where a quantum computer could offer a significant advantage. Realistic applications could be in making chemical reactions faster (leading to more efficient factories), estimating the effects of new medicine, or working towards materials with certain properties (like superconductors or semiconductors). Of course, scientists will also be excited to simulate the physics that occurs in exotic circumstances, like at the Large Hadron Collider or inside black holes.
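To get a feeling for why classical machines run out of steam here, consider a standard back-of-the-envelope estimate (the numbers below are generic, not tied to any particular material): describing the full quantum state of n qubits requires 2^n complex amplitudes, so the memory cost explodes exponentially.

```python
# Back-of-the-envelope: storing the full quantum state of n qubits requires
# 2**n complex amplitudes. At 16 bytes per amplitude (two 8-byte floats),
# even modest system sizes overwhelm any classical computer.

def state_vector_memory_bytes(n_qubits: int) -> int:
    """Memory needed to store the full state vector of n qubits."""
    return 16 * 2 ** n_qubits

for n in (10, 30, 50):
    gib = state_vector_memory_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.4g} GiB")
# 30 qubits already need 16 GiB; 50 qubits need roughly 16 million GiB.
```

This is exactly the regime where a quantum computer, which stores such a state natively in its qubits, could shine.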
Examples of interesting business news:
- Startup Qu & Co creates a software package to simulate molecules, which runs on Amazon’s platform; the company also started a project with LG to simulate ‘multiphysics’
- Startup PhaseCraft studies the famous Fermi-Hubbard model using a quantum computer
- Startup Zapata reduces the runtime and error rate of a famous chemistry algorithm
- IBM and Daimler research next-gen batteries
2. Cracking certain types of cryptography
The security of today’s internet communication relies heavily on a cryptographic protocol invented by Rivest, Shamir and Adleman (RSA) in the late 70s. The protocol ensures that distant internet users can establish secret keys for encryption (so that nobody else can read their messages), and guarantees the origin of files and webpages (so that when you open your bank’s webpage, you can be confident that you don’t send your password to some malicious attacker). RSA works thanks to an ingenious mathematical trick: honest users can set up their encryption using relatively few computational steps, whereas adversaries who aim to crack others’ cryptography would need to solve an extremely hard problem. For the RSA system, that problem is prime factorization, where the goal is to decompose a very large number (as a small example for illustration purposes, let’s think of 15) into its prime factors (here: 3 and 5). As far as we know, for sufficiently large numbers, this task requires such a long computation that nobody would bother to do so. At least, this was the case until computer scientist Peter Shor started working on what would later become one of the most famous and most impactful quantum algorithms.
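To make the factoring problem concrete, here is the naive classical approach, trial division, sketched in Python. It factors toy numbers like 15 instantly, but for the roughly 617-digit numbers used in real 2048-bit RSA keys, no known classical method comes remotely close to finishing in a reasonable time.

```python
def prime_factors(n: int) -> list[int]:
    """Factor n by trial division: fine for tiny numbers, hopeless for
    the ~617-digit numbers behind real 2048-bit RSA keys."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out each prime factor as often as it fits
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

print(prime_factors(15))    # the toy example from the text: [3, 5]
```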
In 1994, Shor published a method to crack RSA (and also its cousin Elliptic Curve Cryptography) in a relatively efficient way using a quantum computer. To be more concrete, according to this paper, a plausible quantum computer could factor a 2048-bit number in roughly 8 hours (and using approximately 20 million imperfect qubits). The authors needed to make several assumptions about what a quantum computer would look like in the future, because many things are unknown. However, they did so in a very careful way, assuming hardware that is close to current experiments (but with many more qubits), and using existing theory for error correction and algorithm optimization. Note that future breakthroughs could reduce the stated time and qubit requirements even further.
Shor’s algorithm will have a massive impact on cybersecurity, and will force a major fraction of today’s IT to migrate to a form of so-called post-quantum cryptography (PQC). PQC is a term used for all forms of cryptography that are resilient to attackers with a large-scale quantum computer, and most PQC does not require the honest users (that’s you) to have anything fancier than a conventional computer. In other words, PQC can be done using purely classical computers. However, don’t be mistaken: IT transitions (‘migrations’) that change core security code can take many years and are relatively costly, especially for larger organizations. Several authorities make guides available to approach this migration:
- TNO (the Dutch Organisation for Applied Scientific Research): Migration to quantum-safe cryptography
- AIVD (Dutch secret service): Bereid je voor op de dreiging van quantumcomputers [NL]
- NCSC (Dutch National Cyber Security Center): Factsheet Postquantumcryptografie [NL]
- US National Institute for Standards and Technology (NIST): Getting Ready for Post-Quantum Cryptography: Explore Challenges Associated with Adoption and Use of Post-Quantum Cryptographic Algorithms
3. Optimization and machine-learning
This is the part where most enterprises become enthusiastic. The theory of quantum mechanics relies heavily on linear algebra, and the actions of a quantum computer are mathematically described by very (very!) large matrices. For this reason, many researchers that aim to find new quantum algorithms are searching in directions where classical computers benefit from faster operations with very big matrices, such as optimization problems, solving differential equations, machine learning and artificial intelligence. For conciseness, we will collect all these fields under the term ‘optimization’. The classical field of optimization is of great relevance to many more specific disciplines, such as operations research, finance, manufacturing, logistics, weather forecasting, and so forth. The classical branch of machine learning has proven itself through celebrated technological breakthroughs such as self-driving cars and computers that beat humans at board games like Chess and Go.
However, the quantum optimization algorithms have not proven themselves yet. There exist various quantum optimization algorithms that offer so-called “quantum speedups”, meaning that a quantum computer would require fewer elementary steps to solve a problem than a classical machine. For example, a famous quantum algorithm invented by Lov Grover allows us to find the maximum of a function using fewer steps than any classical computer could. The main question is whether this also means that the quantum computer requires less time: recall that a quantum computer needs much more time to execute a single step. According to a recent paper, most of the so-called ‘quadratic speedups’ that are often found in quantum algorithms will, in practice, not be competitive with a modern-day classical computer in terms of speed.
Further reading: What exactly is a ‘quadratic speedup’ or an ‘exponential speedup’?
In general, a “quantum speedup” does not automatically mean that Quantum Computers will ever solve that specific problem faster in practice!
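The arithmetic behind this caveat is easy to sketch. Suppose, purely for illustration, that a classical step takes a nanosecond while a quantum step takes a microsecond (made-up round numbers, not measurements): a quadratic speedup in step count then only pays off for very large problem sizes.

```python
# Illustrative only: the step times below are made-up round numbers.
CLASSICAL_STEP = 1e-9   # seconds per classical step (assumed)
QUANTUM_STEP = 1e-6     # seconds per quantum step (assumed, 1000x slower)

def classical_time(n: int) -> float:
    return n * CLASSICAL_STEP            # brute force: ~n steps

def quantum_time(n: int) -> float:
    return (n ** 0.5) * QUANTUM_STEP     # Grover-style: ~sqrt(n) steps

for n in (10**4, 10**6, 10**9, 10**12):
    winner = "quantum" if quantum_time(n) < classical_time(n) else "classical"
    print(f"n = {n:>13,}: {winner} wins")
```

With these assumed numbers, the break-even point lies around n = 10^6; below that, the fast-but-many-steps classical machine simply finishes first.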
On the other hand, there are algorithms that promise much larger, so-called ‘exponential speedups’, but do not have undeniable evidence to back this up. Often, these algorithms are tested on a small dataset using the small quantum computers that are available today. Small examples do not yield any guarantee that such algorithms will also work in more difficult cases. Nonetheless, it is these algorithms with bold claims that receive media attention.
Unfortunately, there does not yet exist a killer optimization algorithm that is both useful (like Grover, and many other optimization algorithms) and significantly faster in practice than its classical competitors (like Shor’s algorithm is for prime factorization).
As a result, the field of quantum optimization is rather hyped: many terms, such as ‘quantum artificial intelligence’, sound very promising, but in reality, we do not yet know of any algorithms that can live up to these high expectations.
Still, most larger enterprises keenly follow the developments of quantum optimisation tools, and several even experiment with early proof-of-concepts themselves. To name some: Volkswagen and ExxonMobil investigated how routes for buses and transport ships can be optimised, Roche started a project to find medicines for Alzheimer’s, and the three major Dutch banks collaborated on a workshop to make employees aware of quantum technology.
Which quantum optimisation algorithms can one experiment with today?
An example of a promising but also hyped approach is the quantum version of neural networks. These have intimidating names such as Quantum Approximate Optimization Algorithm (QAOA), variational quantum eigensolvers (VQE), or simply variational circuits. The strength of this approach is that it can be applied on today’s hardware: these algorithms work on small quantum computers (with a few hundred qubits), don’t require error correction, and a large portion of the computation can be taken care of by classical computers. For these reasons, variational circuits are often chosen for a first investigation by companies that want to explore the use of a quantum computer. However, once again, there are no guarantees that these algorithms will have serious advantages over even the most basic classical algorithms for practical use cases.
- A survey of the QAOA algorithm, see Chapter 3
- An implementation tutorial using IBM’s QISKIT
- An implementation tutorial using Xanadu’s Pennylane
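To illustrate the hybrid structure of these variational algorithms, here is a minimal sketch in plain Python. The genuinely quantum part, preparing a parameterized circuit and measuring an expectation value, is replaced by a stand-in classical cost function, so this shows only the classical optimization loop around it; all names and numbers are illustrative.

```python
import math
import random

def energy(theta: list[float]) -> float:
    """Stand-in for the measured expectation value of the circuit.
    On real hardware, this would prepare the parameterized circuit
    and average many measurement outcomes."""
    return math.cos(theta[0]) + math.cos(theta[1])  # true minimum is -2

def optimize(n_iters: int = 2000, step: float = 0.1) -> tuple[list[float], float]:
    """Simple classical outer loop: randomly perturb the circuit
    parameters and keep any change that lowers the 'energy'."""
    random.seed(0)
    theta = [random.uniform(0, 2 * math.pi) for _ in range(2)]
    best = energy(theta)
    for _ in range(n_iters):
        candidate = [t + random.gauss(0, step) for t in theta]
        e = energy(candidate)
        if e < best:
            theta, best = candidate, e
    return theta, best

theta, best = optimize()
print(best)  # converges close to the true minimum of -2
```

Real implementations (see the Qiskit and Pennylane tutorials above) use smarter optimizers, but the division of labour is the same: the quantum device only evaluates the cost, while a classical computer steers the parameters.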
Another approach is so-called Quantum Annealing, most popularly offered by the Canadian company D-Wave. There are some important distinctions between annealers and other types of quantum computers like those made by IBM, Google and IonQ. D-Wave’s machine is able to execute just a single quantum algorithm, namely quantum annealing (and it’s extremely good at that). If you want to run anything else, then tough luck. Most other quantum computers are so-called ‘universal’ computers, meaning that they can run any algorithm that’s thrown at them. (So, we need not worry that D-Wave will run Shor’s algorithm to break cryptography any time soon.)
Sceptics will directly ask two questions: is D-Wave’s machine actually quantum, and does it have any advantage over conventional supercomputers? There is no scientific consensus about this. As it stands today, we believe that D-Wave’s claims about thousand-fold speedups are exaggerated, but we do notice that skilled programmers can get a lot of performance out of D-Wave’s service, as long as the problem aligns well with the annealer’s strengths. In the future, quantum annealers may gain a greater advantage, although the long-term advantage over classical machines is limited, mainly because annealers lack quantum error correction which is likely required for even larger problems. We highlight some contrasting claims below:
The bottom line for the technology, as it stands, is that D-Wave offers a very mature service to solve hard optimization problems that could have benefits for very specific applications — but conventional supercomputers remain more versatile and reliable.
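The problem class that annealers target can be made concrete. A QUBO (quadratic unconstrained binary optimization) problem asks for binary variables x that minimize a quadratic cost determined by a matrix Q; an annealer samples low-energy solutions in hardware. For a tiny, made-up Q matrix we can simply brute-force all assignments to see what ‘solving’ means here.

```python
from itertools import product

# A made-up 3-variable QUBO instance: minimize sum_ij Q[i][j] * x[i] * x[j]
# over binary x. Annealers tackle such problems with thousands of variables,
# where brute force over 2**n assignments is no longer an option.
Q = [
    [-1,  2,  0],
    [ 0, -1,  2],
    [ 0,  0, -1],
]

def qubo_value(x, Q):
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

best_x = min(product((0, 1), repeat=3), key=lambda x: qubo_value(x, Q))
print(best_x, qubo_value(best_x, Q))  # (1, 0, 1) with cost -2
```

The diagonal entries reward switching a variable on, while the off-diagonal entries penalize switching on certain pairs together; translating a real logistics or finance problem into such a Q matrix is where most of the programmer’s skill goes.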
All these different types of computers…
In the figure below, we summarize the differences between various available types of quantum computers. Universal computers could in principle run any quantum algorithm (that’s what it means to be ‘universal’), although all devices that we have today are still noisy (meaning that they are prone to errors during a computation). There exist so-called quantum simulators that are much larger than universal quantum computers, but these are more specialized: they can only simulate a certain class of materials. In other words, they solve a subset of the problems we addressed in the first section on “Simulation of other quantum systems and materials”. Lastly, quantum annealers, in the way that they’re made by D-Wave, are another example of a specialized device. These can run just a single algorithm, namely quantum annealing for a certain class of problems.
The capabilities of a machine are not directly linked to the type of qubit that’s used. For example, the technology used in IBM’s universal computer is closely related to that in D-Wave’s annealers. We highlight some of the manufacturers and the number of qubits that have been proven to work in experiments (although this list is far from complete).
What about future quantum optimization algorithms?
Well, we simply don’t know! However, some useful technical hints may be:
- When reading data is a limiting factor (as in “big data” applications), then quantum computers are very likely NOT faster. Getting the data into a quantum computer likely takes at least as long as processing the data on a much cheaper supercomputer. This holds, for example, when searching through a database (i.e., the famous ‘finding a needle in a haystack’ search algorithm by Grover), but also for data-intensive simulations like weather forecasting.
- Algorithms that claim an ‘exponential speedup’ often have important caveats. For example, the algorithm that solves systems of linear equations (a very important task in many relevant applications!) does not return its answer in a format that we humans can efficiently read or retrieve.
- Classical computers are very fast, and many real-world computational problems are not bottlenecked by the computer’s speed or the algorithm used. If a current application does not already require a supercomputer, it’s unlikely that anyone is willing to invest in using a quantum computer in the future.
Finally, don’t fall for these common pitfalls:
- Quantum computers will not replace conventional computers, not even in the relatively long term. They will most likely be used in synergy with their conventional counterparts.
- Quantum computers will not magically output new breakthroughs in chemistry and optimization. They will most likely be used like current computers: as a tool, to support a team of hard-working and knowledgeable experts who may choose to incorporate quantum computers in their ‘classical’ workflow.
- We sometimes run into articles that claim that a new algorithm “finds solutions faster” on a quantum computer. However, upon deeper inspection, it turns out that these solutions are sub-optimal or otherwise less reliable. Suspiciously, no timing comparison is given at the accuracy level we expect from state-of-the-art methods. In other words: speed isn’t everything!
By now, you should have a good overview of the (im)possibilities of quantum computers, and how they fit in with existing (classical) IT. In particular, it should be clear that it is highly uncertain how big the impact of quantum computers will be on real-world problem solving!
Next time, we will zoom into the timelines that are sketched for quantum hardware. In other words: when can we expect a decently-sized quantum computer?