# A professional’s guide to Quantum Technology

- Preface: Why this guide?
- Part 1: The background: Why are we so enthusiastic about Quantum Technology?
- Part 2: The applications of Quantum Computing: how will these machines change society?
- Part 3: When can we expect a Quantum Computer? (Coming soon)
- Part 4: Why we cannot trust every claimed Quantum breakthrough (Coming soon)
- Part 5: How can my organisation get started with Quantum? (Coming soon)
- Further reading: An overview of resources

## Part 1: The background: why are we so enthusiastic about Quantum Technology?

## What is Quantum?

**Quantum Physics** is the theory that describes the smallest particles, like electrons, atoms, and sometimes even small molecules. It’s the machinery that describes things that happen at the scale of nanometers. You may contrast it to Newton’s classical physics, which works great for objects that are the size of a human or even the size of a bacterium, but becomes inaccurate at much smaller scales. Quantum is, in a sense, a *refinement* of classical physics: the theories agree for large objects, but one requires the more difficult quantum theory to describe small things.

It should not come as a surprise that quantum physics is relevant for several kinds of technology that operate on a small scale, like __nano-cars that consist of just a few atoms__, or __ever-smaller transistors on computer chips__. However, quantum technology goes further than simply telling us how to handle small particles: **quantum allows us to perform certain processes in a fundamentally different way.** We will see an insightful example of this very soon.

In particular, we will refer to ‘classical’ technology to mean devices that don’t carefully exploit the new possibilities of quantum physics. For example, you’re most likely reading from a classical computer or phone right now, using the classical internet. Whereas classical computers work with ‘bits’, quantum technology will use a quantum mechanical variant called ‘quantum bits’, or ‘qubits’ for short. Generally, under the umbrella of **quantum technology**, we distinguish four categories:

- **Quantum computers**: Devices that perform automatic calculations in order to solve a certain problem. Computing is the main subject of this blog.

- **Quantum internet**: A network that sends *qubits* between connected computers. The most relevant use case is to strengthen cryptography used by classical computers, but there are many more applications.

- **Quantum sensors**: Devices that exploit quantum-mechanical effects to accurately measure certain quantities, like a magnetic field or the strength of the earth’s gravity. Quantum clocks also fall in this category.

- **Quantum simulators**: Devices that carefully reproduce the behaviour of atoms and electrons in a piece of material, allowing us to understand the properties of said material.

In this blog, we mainly focus on **computers** and **internet**: simply because these are most likely to be used by a large number of (business) users. For further reading about the physics details, we recommend the following sources as a great introduction:

__Quantum Country__ – a great online textbook about Quantum Computing by Andy Matuschak and Michael Nielsen.

__QuTech Academy’s *School of Quantum*__ – A broad range of topics explained using short videos.

__The Map of Quantum Computing (Youtube)__ – A 30-minute video that forms a great supplement to this blog.

## What can a Quantum Computer do for us?

Our new protagonist is the quantum computer: a device that can perform certain quantum computations at our command. The ‘quantum’ part stems from the fact that it can actually exploit the laws of quantum mechanics. This should be contrasted with our conventional ‘classical’ computers, which surely are *described* very accurately by quantum physics, but happen to also work fine according to classical physics. They were never designed to deal with superposition or entanglement. A proper quantum computer, by contrast, does exploit these purely quantum-mechanical effects.

A naive view of quantum computers is that they’re simply *faster* than their conventional counterparts. Or perhaps one may naively point at Moore’s Law: with transistors reaching atomic scales, we run into quantum effects, hence quantum physics may help us make better chips. However, none of these are proper motivations to look at quantum computers.

Firstly, **quantum computers are much, much slower than conventional computers,** if one focuses only on the ‘number of operations’ they can do. A modern CPU can handle 64-bit numbers (meaning that all numbers between 0 and 2^64-1 can be represented), and it can perform basic arithmetic like addition, multiplication, and so forth on these numbers. The speed of these modern classical machines is incredible: a typical Intel or AMD processor bought in 2022 performs these steps at roughly 3 GHz, or 3 billion steps per second. There’s no way a human can possibly keep up with such speeds when it comes to plain calculations!
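To put those numbers in perspective, here is a small back-of-the-envelope calculation in Python (the 3 GHz figure is the rough clock rate quoted above, used here purely for illustration):

```python
# Largest value representable in an unsigned 64-bit integer
max_u64 = 2**64 - 1
print(max_u64)  # 18446744073709551615

# At roughly 3 billion simple operations per second,
# merely counting from 0 to 2^64 would still take centuries:
ops_per_second = 3e9  # ~3 GHz
seconds = 2**64 / ops_per_second
years = seconds / (60 * 60 * 24 * 365)
print(round(years))  # ~195 years
```

Even a task as trivial as counting exhausts a human lifetime at this scale, which is exactly why raw operation counts matter.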

Now, quantum computers are supposed to be even faster, right? Well, it’s not hard to find backing for that claim:

- __Google creates quantum chip millions of times faster than the fastest supercomputer__
- __Chinese Scientists Create Quantum Processor 60,000 Times Faster Than Current Supercomputers__

However, you may be disappointed to hear that quantum computers, at this moment, cannot even add or multiply numbers of more than 3 or 4 bits. And even if they could, their rate of operation would by no means reach 3 GHz, but more likely several kHz, at best. In other words, they’re more than a million times *slower*. To make things worse, the information in quantum computers is extremely fragile and needs to be constantly checked and corrected, using so-called **error correction**. This is a form of overhead that could make quantum computers another several orders of magnitude slower. Even in the far future, when quantum computers are more mature and more reliable, we still expect them to be much slower than the classical chips of that time.
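The gap is easy to quantify. The sketch below uses the ballpark rates from above; both figures are rough illustrations, not measured specifications of any particular device:

```python
# Hypothetical rates, purely for illustration (not measured specs)
classical_ops_per_sec = 3e9  # ~3 GHz conventional CPU
quantum_ops_per_sec = 3e3    # ~3 kHz, an optimistic estimate for today's devices

# Ratio of raw operation rates, before any error-correction overhead
slowdown = classical_ops_per_sec / quantum_ops_per_sec
print(f"{slowdown:,.0f}x slower")  # 1,000,000x slower
```

And this is before error correction, which adds further orders of magnitude on top.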

How does this square with the news about ever-faster quantum computers? And why are we still interested in these slow machines? As we claimed before, we hope to do certain computations in a fundamentally different way. We borrow this great analogy from Andy Matuschak and Michael Nielsen’s Quantum Country.

Imagine that you’d like to travel from Morocco to Spain. If your technology does not allow you to cross the Strait of Gibraltar, your only choice is to traverse all the way through North Africa, past the Arabian Peninsula, and through Europe, before you can reach your destination. This represents the steps taken by a classical algorithm. In the same analogy, a quantum computer endows you with the ability to traverse the sea at the Strait of Gibraltar, like a boat or a plane. The fundamentally new possibilities that quantum offers allow us to do computations in *fewer steps*: even with a slower vehicle (computer), one may arrive at the destination sooner. In fact, this advantage often grows as problems become larger and more complicated. It is still an active area of research to completely map the landscape over which quantum computers can travel, and hence to determine which problems can be sped up, and which cannot.
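A well-known example of ‘fewer steps’ is Grover’s search algorithm: searching an unstructured list of N items takes on the order of N steps classically, but only on the order of √N quantum steps. The sketch below just counts idealized steps (constant factors and error-correction overhead are deliberately ignored):

```python
import math

def classical_steps(n):
    # Unstructured search: in the worst case, examine every item
    return n

def quantum_steps(n):
    # Grover's algorithm needs on the order of sqrt(n) iterations
    return math.isqrt(n)

for n in [100, 10_000, 1_000_000]:
    print(n, classical_steps(n), quantum_steps(n))
# 100 100 10
# 10000 10000 100
# 1000000 1000000 1000
```

Note how the ratio between the two columns grows with n: the larger the problem, the bigger the quantum advantage, exactly as the travel analogy suggests.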

We like this analogy in particular, because it indicates that some problems can profit greatly from a quantum computer, whereas many problems will only be much slower when run on a quantum machine. For this reason, we don’t expect that classical computers will be ‘replaced’ any time soon: instead, different types of processors (classical, quantum, perhaps other new types) should synergize and work together to solve a problem as efficiently as possible.

In the analogy with the Strait of Gibraltar, the precise route that you travel is the algorithm. More precisely, an algorithm is a step-by-step list of instructions that guarantees that a human or a computer will find the solution. The *‘steps’* here should be sufficiently simple that it is completely unambiguous how to perform them: for example, adding, multiplying or comparing numbers. A quantum algorithm has a wider spectrum of allowed ‘steps’, as made possible by a quantum computer. In the end, simply finding the ‘recipe’ itself is not enough: the algorithm has to be turned into quantum software, a piece of computer code that tells a computer explicitly how to execute the step-by-step instructions, given the possibilities and limitations of the computer hardware.
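As a concrete (classical) illustration of an algorithm in this sense, here is the familiar recipe for finding the largest number in a list, built entirely out of the simple, unambiguous steps mentioned above (comparisons):

```python
def find_maximum(numbers):
    """Step-by-step: remember the first number, then compare it
    against each remaining number, keeping the larger of the two."""
    largest = numbers[0]
    for candidate in numbers[1:]:
        if candidate > largest:  # a single 'simple step': one comparison
            largest = candidate
    return largest

print(find_maximum([3, 41, 7, 26]))  # 41
```

A quantum algorithm has the same step-by-step character, except that its repertoire of allowed steps includes operations on qubits that no classical machine can perform.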

The difference between ‘algorithms’ and ‘software’ is subtle: you may think of the algorithm as a recipe to bake the perfect chocolate cookie. The algorithm should unambiguously describe the required steps to bake the cookie: what ingredients are needed, in what order they should be mixed, how long the cookie should be baked, etcetera. However, if you’d like to build a factory that produces these cookies, you need to be even more specific: out of what pipe does the dough flow, and in what pot does it have to be mixed? What exactly is the tool that performs the mixing? How are cookies laid next to each other in the oven?

Hence, the ‘algorithm’ is a very general solution to a problem: once found, it can be reused many times, by different software developers. The ‘software’ is specific to a (type of) computer (here: the ‘factory’).

In the next part, we’ll dive into precisely the **quantum algorithms** we know of, and how much impact these will have. And we’ll ask the question: what new algorithms may we find?