
A professional’s guide to Quantum Technology

Part 2: The applications of Quantum Computing: How will these machines change society?

In the previous part, we saw that quantum computers are actually quite slow computers, but that they happen to solve certain problems more efficiently, that is: in fewer steps. Of course, their true usefulness then depends on which specific problems they happen to solve well!

What are projected applications and use cases?

1. Simulation of other quantum systems and materials

Most materials can be studied very well using our classical computers. However, for certain materials, the locations of the electrons within molecules are notoriously hard to describe, and require quantum mechanics to be carefully simulated. Such problems are prototypical examples where a quantum computer could offer a significant advantage. Realistic applications could be in designing new chemical processes (leading to more efficient factories), estimating the effects of new medicines, or working towards materials with desired properties (like superconductors or semiconductors). Of course, scientists will also be excited to simulate the physics that occurs in exotic circumstances, like at the Large Hadron Collider or in black holes.
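To get a feeling for why classical simulation hits a wall, consider that describing the full quantum state of n interacting two-level particles requires storing 2^n complex numbers. A minimal back-of-envelope sketch in Python (the numbers below count only the raw storage cost, ignoring any clever compression tricks):

```python
def state_vector_bytes(n_particles: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to store all 2**n complex amplitudes of an
    n-particle quantum state (one complex double = 16 bytes)."""
    return (2 ** n_particles) * bytes_per_amplitude

# The cost doubles with every particle added:
for n in (10, 30, 50):
    print(f"{n} particles: {state_vector_bytes(n):,} bytes")
# 10 particles fit in kilobytes, 30 need ~16 GiB, 50 need ~16 pebibytes.
```

This exponential blow-up is exactly the regime where a quantum computer, which stores such a state natively, could shine.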

Simulation is, however, not a silver bullet, and quantum computers will not be spitting out patents by themselves. Breakthroughs in chemistry, pharma and material science will still require a mix of theory, lab testing, and computation, and most of all the hard work of smart engineers. From this perspective, quantum computers offer an inspiring new toolset for a certain niche of problems.


2. Cracking certain types of cryptography

The security of today’s internet communication relies heavily on a cryptographic protocol invented by Rivest, Shamir and Adleman (RSA) in the late 70s. The protocol ensures that distant internet users can establish secret keys for encryption (so that nobody else can read their messages), and guarantees the origin of files and webpages (so that you know that the latest Windows update actually came from Microsoft, and not from a malicious cybercriminal). RSA works thanks to an ingenious mathematical trick: honest users can set up their encryption using relatively few computational steps, whereas adversaries who aim to crack others’ cryptography would need to solve an extremely hard problem. For the RSA cryptosystem, that problem is prime factorization, where the goal is to decompose a very large number (for illustration purposes, let’s think of 15) into its prime factors (here: 3 and 5). As far as we know, for sufficiently large numbers, this task would take a classical computer such a long time that nobody would ever succeed in breaking a relevant code. RSA was deemed adequately secure, at least until computer scientist Peter Shor started working on what would later become one of the most famous and most impactful quantum algorithms.
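For the toy number 15, anyone can find the factors by trial division; the catch is that this naive approach needs on the order of √n division attempts, which is hopeless for the 600-plus-digit numbers used in real RSA keys. A minimal sketch (illustrative only, and of course not how cryptanalysts actually attack RSA):

```python
def factorize(n: int) -> list[int]:
    """Factor n by trial division: try every divisor from 2 upward.
    Trivial for small n, astronomically slow for RSA-sized numbers."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:          # whatever remains is itself prime
        factors.append(n)
    return factors

print(factorize(15))   # [3, 5]
```

A 2048-bit RSA modulus has roughly 617 decimal digits; even the best known classical algorithms (far smarter than trial division) cannot factor such numbers in any feasible amount of time.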

In 1994, Shor published a method to crack RSA (and also its cousins, the Diffie-Hellman key exchange and elliptic-curve cryptography) in a relatively efficient way using a quantum computer. To be more concrete, according to a recent paper, a plausible quantum computer could factor a 2048-bit number in roughly 8 hours (using approximately 20 million imperfect qubits). The authors needed to make several assumptions about what a quantum computer will look like in the future, because many things are unknown. However, they did so in a very careful way, assuming hardware that is close to current experiments (but with many more qubits), and using existing theory for error correction and algorithm optimization. Note that future breakthroughs could reduce the stated time and qubit requirements even further.

Luckily, not all cryptography is broken as easily by a quantum computer. RSA falls in the category of public-key cryptography, which delivers a certain range of functionalities. A different class of protocols is symmetric-key cryptography, which is reasonably safe against quantum computers, but which also doesn’t provide the same rich functionality that public-key crypto does. The most sensible approach is to replace RSA by so-called post-quantum cryptography (PQC): one of many (public-key) cryptosystems that are resilient to attackers with a large-scale quantum computer. Interestingly, PQC does not require the honest users (that’s you) to have anything fancier than a conventional computer. In other words, PQC can be done using purely classical computers.

See also: NIST’s competition to standardize new public-key algorithms 
Further reading: The race to save the Internet from quantum hackers (a popular science article from Nature).

Over the coming decade, a major fraction of today’s IT will need to migrate to a new cryptographic standard. For large organisations, updates that change core security code can take many years and are relatively costly. On top of that, the new protocols could require more computational power, and could come with new risks: PQC systems are fairly new and have not been as thoroughly tested as their old, quantum-vulnerable counterparts. That’s why it is important to start preparations as soon as possible – preferably today rather than tomorrow! Several authorities make guides available on how organisations should approach such a migration.

The quantum threat to encryption is often mentioned in news articles relating to quantum key distribution (QKD), sometimes (incorrectly) presented as an ‘unhackable internet’. QKD allows honest users to establish secret keys or passwords by using a quantum communication channel. It’s important to know that QKD does not offer all the functionality of public-key cryptography, and also that it’s not necessarily unhackable. Therefore, it is not considered a full alternative by itself. We discuss QKD further in a future post.

See also: The NCSC states that it “does not endorse the use of QKD for any government or military applications”

3. Optimization and machine learning

This is the part where most enterprises become enthusiastic. The theory of quantum mechanics relies heavily on linear algebra, and the actions of a quantum computer are mathematically described by very (very!) large matrices. For this reason, many researchers who aim to find new quantum algorithms are searching in areas that would benefit from faster operations on very big matrices, such as optimization problems, solving differential equations, machine learning and artificial intelligence. For conciseness, we will collect all these fields under the term ‘optimization’. The classical field of optimization is of great relevance to many more specific disciplines, such as operations research, finance, manufacturing, logistics, weather forecasting, and so forth. The classical branch of machine learning has proven itself through celebrated technological breakthroughs such as self-driving cars and computers that beat humans at board games like chess and Go.

However, quantum optimization algorithms have not proven themselves yet. There exist various quantum optimization algorithms that offer so-called “quantum speedups”, meaning that a quantum computer would require fewer elementary steps to solve a problem than a classical machine. For example, a famous quantum algorithm invented by Lov Grover (and later extended by Dürr and Høyer) allows us to find the maximum of a function using fewer steps than any classical computer could using brute-force search. The main question is whether this also means that the quantum computer requires less time. Recall that a quantum computer needs much more time to execute a single step. According to a recent paper, most of the so-called ‘quadratic speedups’ that are often found in quantum algorithms will, in practice, not be competitive with a modern-day classical computer in terms of speed. Moreover, Grover’s algorithm only gives a speedup over brute-force classical approaches, where we simply try all possible solutions to find the best one. For many practical optimization problems there exist much smarter classical algorithms, and it is not at all clear that quantum computers would be more efficient.
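The steps-versus-time trade-off can be made concrete with a back-of-envelope calculation. If a classical step takes t_c and a (much slower, error-corrected) quantum step takes t_q, then a quadratic speedup, which reduces N steps to √N, only wins once N exceeds (t_q/t_c)². The step times below are hypothetical round numbers, purely for illustration:

```python
import math

def breakeven_problem_size(classical_step_s: float, quantum_step_s: float) -> float:
    """Smallest problem size N for which sqrt(N) quantum steps
    take less wall-clock time than N classical steps."""
    return (quantum_step_s / classical_step_s) ** 2

# Hypothetical: a classical step takes 1 ns, an error-corrected
# quantum step takes 1 ms (a million times slower).
n = breakeven_problem_size(1e-9, 1e-3)
print(f"break-even at N ≈ 10^{math.log10(n):.0f}")  # break-even at N ≈ 10^12
```

With these (made-up) numbers, the quantum machine only starts paying off for problems with around a trillion candidate solutions, and by that point a classical machine may already have been running for minutes, which is exactly the concern raised in the paper mentioned above.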

Further reading: What exactly is a ‘quadratic speedup’ or an ‘exponential speedup’?

In general, a “quantum speedup” does not automatically mean that Quantum Computers will ever solve that specific problem faster in practice! 

On the other hand, there are algorithms that promise much larger, so-called ‘exponential speedups’, but which lack solid evidence to back this up. Often, these algorithms are tested on a small dataset using the small quantum computers that are available today. Such small examples do not give any guarantee that the algorithms will also work in more difficult cases. Nonetheless, it is these algorithms with bold claims that receive the media attention.

Further reading: Professor Scott Aaronson’s warning to ‘Read the fine print’ of optimization algorithms. [Appeared in Nature Physics, with paywall]

Unfortunately, there does not yet exist a killer optimization algorithm that is both useful (like Grover’s, and many other optimization algorithms) and significantly faster than its classical competitors (like Shor’s algorithm is for prime factorization). As a result, the field of quantum optimization is rather hyped: many terms, such as ‘quantum artificial intelligence’, sound very promising, but in reality, we do not yet know of any algorithms that can live up to these high expectations.

Still, most larger enterprises keenly follow the developments of quantum optimisation tools, and several even experiment with early proofs of concept themselves. To name some: Volkswagen and ExxonMobil investigated how routes for buses and transport ships can be optimised, Roche started a project to find medicines for Alzheimer’s, and the three major Dutch banks collaborated on a workshop to make employees aware of quantum technology.

What about future quantum optimization algorithms?

Well, we simply don’t know! However, some useful technical hints may be:

  • When reading data is a limiting factor (as in “big data” applications), then quantum computers are very likely NOT faster. Getting the data into a quantum computer likely takes at least as long as processing the data on a much cheaper supercomputer. This holds, for example, when searching through a database (i.e., the famous ‘finding a needle in a haystack’ search algorithm by Grover), but also for data-intensive simulations like weather forecasting. 
  • Algorithms that claim an ‘exponential speedup’ often have important caveats. For example, the algorithm that solves systems of linear equations (a very important task in many relevant applications!) does not return its answer in a format that we humans can efficiently read or retrieve. 
  • Classical computers are very fast, and many real-world computational problems are not bottlenecked by the computer’s speed or the algorithm used. If a current application does not already require a supercomputer, it’s unlikely that anyone will be willing to invest in using a quantum computer in the future.
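The first caveat above can be quantified with a toy model: even if Grover’s search itself needs only about √N queries, simply loading N data items into the machine already costs on the order of N operations, so the end-to-end cost never drops below that of a classical linear scan. A sketch under these (deliberately simplified) assumptions:

```python
import math

def grover_total_ops(n_items: int) -> float:
    """Toy model: end-to-end operation count for Grover search,
    counting the cost of loading the data first."""
    load = n_items                # ~N operations just to get the data in
    search = math.sqrt(n_items)   # ~sqrt(N) Grover iterations
    return load + search

def classical_total_ops(n_items: int) -> float:
    """A classical linear scan reads each item once."""
    return n_items

N = 1_000_000
print(grover_total_ops(N) >= classical_total_ops(N))  # True
```

The data-loading term dominates, which is why “big data” applications are generally considered poor candidates for a quantum advantage.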

After reading this post, you should have a good overview of the (im)possibilities of quantum computers, and how they fit in with existing (classical) IT. In particular, it should be clear that it is highly uncertain how big the impact of quantum computers will be on real-world problem solving!

That leads us to a crossroads of follow-up questions: do we even know examples of killer algorithms that will make quantum computers produce groundbreaking results? And how long will it take before these applications are within reach? 

One may also look beyond the applications of isolated computers, and wonder what will happen when we connect users with a quantum network or a quantum internet. [This post will follow Q3 2022].

Part 4: The applications of quantum networks (coming soon!)
