Back to Table of contents

Primeur weekly 2016-07-18

Special

SoftBank Group to acquire ARM for about 29 billion euro (24 billion GBP) ...

Focus

Moore's Law is all about economics but there are alternative technologies on the way ...

CMOS is still here to stay but we need to think out of the box to reclaim efficiency ...

Why the hunt for killer applications to run on quantum computers is challenging ...

Quantum computing

Researchers at UCSB blur the line between classical and quantum physics by connecting chaos and entanglement ...

Soprano and quantum computer combine for world first performance ...

Focus on Europe

New data centre Amsterdam Science Park for SURFsara supercomputers is completed ...

Middleware

Mellanox simplifies RDMA deployments with enhanced RoCE software ...

Hardware

Researchers devise secure, efficient anonymity network ...

Berkeley Lab scientists grow atomically thin transistors and circuits ...

Final expansion stage of the high-performance computer at DKRZ starts operation ...

Huaxintong Semiconductor licenses ARMv8-A architecture ...

SiFive introduces industry's first open-source chip platforms ...

AMAX launches GPU solutions based on latest NVIDIA Tesla P100 GPU accelerators ...

Applications

Netherlands eScience Center initiates three new Path-Finding projects ...

Physicists couple distant nuclear spins using a single electron ...

Supercomputers fire lasers to shoot gamma ray beam ...

Physics researchers question calcium-52's magic ...

Jack Wells brings HPC expertise to seismology meeting ...

Bridges connects evolutionary biologists with genomes of wild species ...

The Cloud

IBM Cloud ranked second in worldwide Infrastructure as a Service (IaaS) ...

Why the hunt for killer applications to run on quantum computers is challenging


22 Jun 2016 Frankfurt - In the ISC 2016 session titled "Scaling Beyond the End of Moore's Law", chaired by John Shalf, the second speaker was Damian Steiger from the Theoretical Physics Lab at ETH Zurich. His goal was to show why quantum computing can solve, at least for a few specific applications, some of the problems that CMOS cannot. Damian Steiger defined a quantum computer as a machine that is fundamentally based on different principles of information processing. About one hundred years ago, we figured out that nature does not follow the classical laws of physics but the laws of quantum physics. Since then, we have seen many interesting effects, such as quantum tunneling, entanglement, and superposition, that describe how atoms and electrons move. Now, with quantum computers, this is the first time that we try to use these laws of physics not just to build devices but to actually do information processing.

A quantum computer has pretty much the same structure as a classical computer. You need to store your information in quantum bits or qubits. You have to process them and apply operations on them in order to get your final results. To start, you need something that represents a zero and something that represents a one. You can have a system with an atom whose electron is either in a ground state or in an excited state. If you build a computer which acts according to the laws of quantum mechanics, you can have a general quantum state of your qubit which is a superposition. The qubit is not just a zero or a one; sometimes it is a zero and a one at the same time, Damian Steiger explained.
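
As a minimal illustration, not taken from the talk, a single qubit can be written down classically as a vector of two complex amplitudes. The Python sketch below shows an equal superposition and the measurement probabilities that follow from it.

    # A minimal sketch (not the speaker's code): a single qubit as a vector of
    # two complex amplitudes over the basis states |0> and |1>.
    import numpy as np

    zero = np.array([1, 0], dtype=complex)   # ground state  |0>
    one = np.array([0, 1], dtype=complex)    # excited state |1>

    # An equal superposition: the qubit is "a zero and a one at the same time".
    psi = (zero + one) / np.sqrt(2)

    # Observing the qubit collapses it; the outcome probabilities are |amplitude|^2.
    print(np.abs(psi) ** 2)                  # [0.5 0.5]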

We do not have a good intuition of how this works because we never see superpositions in everyday life. There is a law of quantum mechanics which says that if you try to observe any quantum superposition, it has to collapse to a classical state, which is zero or one. The qubit is with some probability a zero and with some probability a one, except that these probabilities can be negative or imaginary. This is puzzling, and that is why it is hard to think about qubits with everyday, common intuition, as Damian Steiger pointed out.

Now you have one qubit and you can write the state of this qubit as a formula with two complex numbers. You can store two complex numbers with infinite precision in one qubit. That is much more than you can do with one bit, in which you can only store a zero or a one. If you start scaling and you have two qubits in the same system, they can be in all possible classical states of two bits. This means that you can store four complex numbers. If you have N qubits, you can store 2^N complex numbers. This is a huge amount of information. Now you can process this information. You can make one operation and you manipulate all 2^N complex numbers at once. The only problem is that when you want to read out, you actually have to look at it, so you will only get one classical answer. But if you make a smart algorithm, you can make use of essentially doing 2^N operations in one step, Damian Steiger explained.
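
A small numpy sketch, again purely illustrative, makes the counting concrete: N qubits are described by 2^N complex amplitudes, a single layer of gates touches all of them, and reading out returns only one classical bit string.

    # Illustrative only: N qubits correspond to a state vector of 2**N complex
    # amplitudes, and applying gates manipulates all of them at once.
    import numpy as np

    N = 3
    state = np.zeros(2 ** N, dtype=complex)
    state[0] = 1.0                                 # start in |000>

    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

    # One Hadamard on every qubit: the combined operator acts on all 2**N amplitudes.
    U = H
    for _ in range(N - 1):
        U = np.kron(U, H)
    state = U @ state                              # uniform superposition over 2**N states

    # Reading out collapses the state to a single classical bit string.
    outcome = np.random.choice(2 ** N, p=np.abs(state) ** 2)
    print(f"{int(outcome):03b}")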

Unfortunately, we do not yet have large-scale quantum computers, but we already have a few different types of qubits. There are different proposals. Damian Steiger mentioned three of them. The best qubits are ion traps. You take one atom, you capture it, and you use the electron to actually do the computation. The good thing is that they live for hours when isolated from the environment. One operation takes 10 microseconds. The problem is that we currently can do only about 100 operations. Single atoms are small and well isolated from the environment and you can put them in a vacuum. However, they are relatively slow to do computation and hard to scale beyond 20 qubits. People are working on this problem and will probably solve it, but it will take a few years, Damian Steiger said.
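
A back-of-the-envelope calculation, using the rough figures quoted in the talk, shows what the ratio of coherence time to gate time would allow in principle; in practice, gate errors limit useful circuits long before the coherence time runs out.

    # Rough numbers from the talk (assumed, not measured here): an hours-long
    # coherence time and 10-microsecond gates bound how many operations could
    # fit before the qubit decoheres; gate errors cap it far earlier (~100).
    ion_lifetime_s = 3600.0      # "lives for hours" when isolated
    ion_gate_s = 10e-6           # ~10 microseconds per operation

    print(f"upper bound: {ion_lifetime_s / ion_gate_s:.0e} operations")   # ~4e8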

The other type of qubits are superconducting qubits. These are superconducting circuits on the surface of a semiconductor which essentially act as qubits. The current investments come from Google and IBM. They have a lifetime of about 100 microseconds. With a gate time of about 10 nanoseconds, one operation on 2^N complex numbers takes only 10 nanoseconds. The advantage is that these qubits are scalable because they are on a semiconductor surface. They are not single atoms that you need to capture and put in a vacuum. Unfortunately, they have short lifetimes. However, in recent years there has been much progress, so these qubits are actually good enough to be error-corrected. They are below a threshold where you can just add more of these physical qubits. In the error-correcting scheme you can make one logical qubit which lives longer than each individual physical qubit. With this technology you can think of 10,000 small qubits sitting on your chip which are then error-corrected, Damian Steiger explained.
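
The idea of trading many noisy physical qubits for one longer-lived logical qubit can be illustrated with a deliberately simplified classical analogy. The sketch below uses majority voting over redundant copies, which is not the surface code actually used for superconducting qubits, but it shows the below-threshold behaviour described above.

    # Toy classical analogy (NOT the real quantum error-correction scheme):
    # majority voting over redundant noisy copies makes a "logical" bit fail
    # less often than a single copy, as long as the per-copy error rate is
    # below the threshold of the voting scheme.
    import random

    def logical_error_rate(p_physical, n_copies, trials=100_000):
        errors = 0
        for _ in range(trials):
            flips = sum(random.random() < p_physical for _ in range(n_copies))
            if flips > n_copies // 2:        # the majority vote comes out wrong
                errors += 1
        return errors / trials

    # 10% error per copy is below the 50% majority-vote threshold, so adding
    # more physical copies keeps driving the logical error rate down.
    for n in (1, 3, 9, 15):
        print(n, logical_error_rate(p_physical=0.1, n_copies=n))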

Topological qubits are the Microsoft approach. They are going in a different direction. In three dimensions there are two kinds of particles in physics: fermions and bosons. However, mathematical theory predicts that in two dimensions or in one dimension you can have different kinds of particles, for instance at the two ends of a one-dimensional system. These have a very nice property when you do calculations with them. They are not located at one point in space; they are located at two points. You cannot interact with them locally. No noise can act locally on just one side and change anything in your computation. No local noise can detect or destroy the state. They will be much more error-resistant. People have been looking for the Majorana particle for 60 years, and they can probably show it next year, at least with enough evidence to make the physics community accept that the Majorana particle exists, as Damian Steiger stated.

The main topic of this talk was quantum computing beyond Exascale. The question Damian Steiger prepared for this session was: what are important applications that we can solve on a quantum computer but cannot solve on special-purpose post-Exascale classical hardware that we may build in ten years? Usually, Damian Steiger would show small experiments performed in the lab, but if one wants people to invest hundreds of millions to build such a computer, one needs something better. In classical computing, we know how to go forward. Quantum computing is a huge risk to take.

The most famous algorithm is Shor's algorithm for factoring, from 1995. Take a number of 768 bits. The problem is: can you find its two prime factors? This was actually done with classical CMOS in 2009, taking 2000 CPU years to get the solution. This scales exponentially on all classical hardware we know. We cannot prove that it has to scale exponentially, but for all algorithms we know it does. However, we know that on a quantum computer there is an algorithm, called Shor's algorithm, which can solve this problem in polynomial time. You only need 2N + 3 qubits, which is a small number, to solve this problem. Why is this useful? Damian Steiger asked. RSA encryption is based on the fact that you cannot factor large integers. As the audience could see for the CPUs, it took 2000 CPU years to crack the 768-bit code. It would take billions of years to actually crack the codes one is currently using. A quantum computer can do this in hours, assuming a 10-nanosecond gate time and the minimal number of 2N + 3 qubits, and in seconds when using more qubits.
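
To put the gap in perspective, here is a rough Python sketch using an assumed sub-exponential cost model for classical factoring, scaled from the 2000 CPU-year data point, and an assumed cubic gate count for Shor's algorithm at 10 nanoseconds per gate. Real resource estimates, including error-correction overhead, are considerably larger; the scaling contrast is the point.

    # Rough scaling sketch (assumed cost models, not the talk's own estimates).
    # Classical factoring grows sub-exponentially in the bit length N, while
    # Shor's algorithm needs only polynomially many gates (~N**3 assumed here),
    # ignoring the substantial error-correction overhead of a real machine.
    import math

    def gnfs_cost(bits):
        n = bits * math.log(2)            # natural log of an N-bit number
        return math.exp((64 / 9) ** (1 / 3) * n ** (1 / 3) * math.log(n) ** (2 / 3))

    cpu_years_768 = 2000.0                # the classical 768-bit record quoted above
    for bits in (768, 1024, 2048):
        classical = cpu_years_768 * gnfs_cost(bits) / gnfs_cost(768)
        shor_seconds = (bits ** 3) * 10e-9
        print(f"{bits:4d} bits: ~{classical:.1e} CPU years classically, "
              f"~{shor_seconds:.0f} s of raw gate time for Shor")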

That is why a lot of funding came into the field of quantum computing, simply because the US government is interested in actually cracking RSA encryption. Even though this is the most famous algorithm, it is not useful. There have been books written on post-quantum encryption. Once we know someone has a quantum computer, the encryption scheme can be switched to quantum encryption, where completely new devices and a new infrastructure are required. Even more easily, one can go to lattice-based cryptography or other classical encryption schemes which are not based on factoring. In that case, the quantum computer is useless for trying to crack the code. It will at most provide a quadratic speed-up, which is not useful.
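
A one-line calculation shows why a quadratic speed-up does not worry cryptographers, assuming for illustration a 256-bit key, an exhaustive search over all keys, and 10-nanosecond quantum steps.

    # Why a quadratic speed-up is "not useful" for code breaking (illustrative
    # assumptions: 256-bit keys, Grover-style search, 10 ns per quantum step).
    k = 256
    quantum_steps = 2 ** (k // 2)            # brute force drops from 2**k to ~2**(k/2)
    years = quantum_steps * 10e-9 / 3.15e7
    print(f"~{years:.0e} years")             # ~1e23 years: still hopeless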

So what are the killer applications for quantum computing? Damian Steiger first looked at which applications are running at scale on today's supercomputers. A fair share of them are simulations of quantum systems in materials science, chemistry, and condensed matter physics, which are important problems to solve and also have economic impact. Can a quantum computer do something for these problems? That is why the first quantum computer was proposed by Nobel Prize laureate Richard Feynman, who said in 1982: "Solving a quantum system is easiest when you have quantum resources." For people who are working in the field: you have a quantum mechanical system with a wave function. This wave function consists of 2^N complex numbers, where N depends on the number of particles or orbitals you are considering. On a classical computer, storing 2^N complex numbers is very hard. On a quantum computer, N qubits can represent 2^N complex numbers. That is the basic reason why quantum computers are very good at these problems, Damian Steiger explained.

Damian Steiger showed his wish list of things he wants to do: design a room-temperature superconductor, develop a catalyst for carbon sequestration, and develop better catalysts for nitrogen fixation. The nice thing about these problems from a quantum computing perspective is that they are hard for classical computers. You need high accuracy to get the solutions, and you cannot achieve it with any of the approximate classical algorithms. Getting more accuracy is exponentially hard the classical way. However, these algorithms scale polynomially on a quantum computer. That is usually the point where people in quantum information theory stop. Only in recent years have people started looking at what it actually takes. The pre-factors matter.

Three years ago, Microsoft looked at a small problem that is just large enough that it would take about 10 years to solve on an Exascale machine. If you consider quantum chemistry, the problem size is measured in spin-orbitals. To design his wished-for superconductor, Damian Steiger probably needs 10,000 spin-orbitals; for interesting chemical reactions, 200 to 400. Classically, one can handle only about 70. The required accuracy is about 1 mHa. The quantum algorithms essentially scale with the number of spin-orbitals you look at. If you want to have 200 spin-orbitals, you need 200 qubits. Classically, you would need to store 2^200 complex numbers, which is a larger number than there are particles in the universe, but with a quantum computer this is easy.
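
A quick memory count, assuming 16 bytes per complex amplitude, makes the contrast concrete; the spin-orbital counts below are only examples.

    # Illustrative memory count (16 bytes per complex double assumed): the full
    # wave function over N spin-orbitals needs 2**N amplitudes classically,
    # while a quantum computer needs on the order of N qubits.
    for n in (36, 50, 200):
        pib = (2 ** n) * 16 / 2 ** 50
        print(f"{n:4d} spin-orbitals: ~{pib:.3g} PiB classically vs {n} qubits")
    # 2**200 amplitudes exceed the number of particles in the observable universe.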

The Microsoft team picked one molecule to calculate. If you have a planar architecture and need to do error correction, you can only apply a discrete set of operations. For this tiny little molecule, they came out with a gate count of 10^18. This is very bad. Of course, it scales polynomially. It is better than exponential, but this is not going to happen, especially when it means having a quantum computer work for 30 years. You cannot pause the calculation, because as soon as you look at it you destroy it, so you would have to have the quantum computer work for 30 years non-stop. This is not going to work, as Damian Steiger pointed out.
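
The 30-year figure follows from simple arithmetic; the sketch below assumes, purely for illustration, an effective 1 nanosecond per sequential gate.

    # Back-of-the-envelope (assumed 1 ns effective time per sequential gate):
    # 1e18 gates already translate into decades of uninterrupted running.
    gates = 1e18
    seconds_per_gate = 1e-9
    print(f"~{gates * seconds_per_gate / 3.15e7:.0f} years")   # roughly 30 years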

In the past two years, a series of papers came out that improve the algorithms, especially the constant factors, and look at high-performance quantum computing. The researchers brought the runtime down to two minutes. That is a reduction of many orders of magnitude, and now this problem can be solved on a quantum computer. Still, the next question to ask was: can we get the money for it, because it takes a few hundred million to build such a quantum computer?

The second application Damian Steiger showed was published very recently. Nitrogen fixation is an application with economic benefit and impact that might be solved on a small-scale quantum computer. The problem is simple: you need fertilizers for your plants to grow, to produce enough food to sustain humanity. Therefore, you need to turn nitrogen from the air into ammonia, preferably at room temperature and with as little energy as possible. The only way we know how to get this done is the hundred-year-old Haber-Bosch process. This requires high pressures and high temperatures to produce the fertilizer. It consumes about 3 to 5 percent of the world's natural gas and 1 to 2 percent of humanity's annual energy consumption. This is a huge energy cost. There must be a better solution, because before the Haber-Bosch process was invented there also were fertilizers, produced by simple bacteria. Bacteria in soil have neither high pressures nor high temperatures, yet they contain an effective catalyst that runs this reaction at sufficiently high rates.

Currently, chemists are struggling to synthesize such a catalyst. They have a few, but those work very poorly. All classical methods to compute how the reaction works, in order to construct a better catalyst, have failed completely because the problem is exponentially hard. So this is the task described in a paper that was published by two groups from ETH Zurich and Microsoft. They tried to figure out how much this might take on a quantum computer, not just taking the abstract scaling of an algorithm but taking a specific architecture, compiling down to the operations you can actually perform, adding the error correction, and also seeing how it applies to all kinds of different architectures. What came out in the end is that you can design such a catalyst, because one can calculate the energies accurately enough to imitate what the bacteria are doing, and one can do this with a small quantum computer. One needs a quantum computer of the order of size needed to factor a 4,000-bit number, Damian Steiger explained.

Currently, we are at the stage where we want to look at what a quantum computer can do, but we want to find killer applications: not just better-scaling algorithms but problems we can bring down to a count of how many resources we need, how many qubits, how many operations. This is a very hard challenge, first of all because of the competition from the HPC community. Every time we come up with a problem, it has to be hard enough that it cannot be solved on an Exascale machine, because otherwise you had better build an Exascale machine. The problem also has to be amenable to quantum acceleration. We only know of a handful of algorithms which have a scaling advantage compared to the classical ones. A quantum computer can technically do any calculation you can do with a classical architecture, but it will not necessarily be faster or scale much better. The error-correction overhead would probably make it not really affordable. That is why we have to find new algorithms. Finally, the crossover scale has to be short enough. This is something that Damian Steiger has encountered many times: an algorithm that scales better, but with a pre-factor of 10^27.
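
The pre-factor point is easy to see numerically; the sketch below uses invented cost models (cubic classical cost versus quadratic quantum cost with a 10^27 constant) simply to show where the crossover lands.

    # Hypothetical cost models, for illustration only: a better exponent with a
    # 1e27 constant factor never wins at any problem size you could actually run.
    def classical_cost(n):
        return n ** 3                 # assumed well-optimised classical scaling

    def quantum_cost(n):
        return 1e27 * n ** 2          # better exponent, enormous pre-factor

    for n in (1e3, 1e6, 1e9, 1e12):
        print(f"n = {n:.0e}: classical {classical_cost(n):.1e}, quantum {quantum_cost(n):.1e}")
    # The crossover sits near n = 1e27, far beyond any feasible input size.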

Potential applications are factoring and code breaking. If you are one of the government agencies, this is probably useful. More important are quantum chemistry and materials simulations. These are extremely challenging. You have to spend years on optimizing algorithms and obviously build a large-scale quantum computer with enough qubits. This is probably worth investing in. There are also other applications such as quantum machine learning. There are algorithms for machine learning and artificial intelligence which scale better on a quantum computer. Here, you are not talking about a small-scale quantum computer but about a medium-sized one, given the amount of data that you have to feed into it. Solving linear systems is yet another application. There is a beautiful algorithm which, under a few assumptions, can solve a linear system in logarithmic time; however, you will not get the full answer vector of complex numbers out, only a logarithmic amount of information about it. The question is: do we have small enough applications? Probably not, because we can solve linear systems on classical computers fast enough until they grow to a huge size, but then we also need a medium-sized quantum computer to solve them, Damian Steiger concluded.

The workshop is covered in full in five articles:

  1. Moore's Law is all about economics but there are alternative technologies on the way
  2. CMOS is still here to stay but we need to think out of the box to reclaim efficiency
  3. Why the hunt for killer applications to run on quantum computers is challenging
  4. SpiNNaker and BrainScaleS neuromorphic systems ready for non-expert use in stochastic inference computing
  5. Beyond Moore's Law panelists address business maturity of new technologies, future developments, neuromorphic chip training, and memristors

Leslie Versweyveld
