Kristel Michielsen illustrated the involvement and interest of industry in quantum computing, referring to a 2019 McKinsey report. As of 2019, the most important industries are the chemical and agriculture industries, followed by the finance industries and then the pharmaceutical industries. Value creation per industry vertical follows individual timelines: some industries are better set up to benefit from quantum computing than others. The three industries Kristel Michielsen referred to share the same low entry barrier, which makes them highly attractive for quantum computing, and in terms of the size of the prize they apparently stand to benefit a lot. The chemical and agriculture industries seem able to benefit rather soon from quantum computing, whereas for the finance and pharmaceutical industries it might take a little longer.
There are four distinct quantum tools which may enable use cases across the different industries. The capabilities rely on dedicated quantum algorithms. A first quantum tool is quantum simulation for chemicals and materials. Examples include the simulation of quantum systems for research and development on chemicals, including molecules, solids or polymers. The scope ranges from small molecules up to huge macromolecules, such as proteins. The quantum algorithm used for this is the variational quantum eigensolver (VQE). This is a quantum-classical hybrid algorithm, meaning that one uses a conventional computer on the one hand and a quantum device on the other. The second tool is optimization. The goal is to solve numerical optimization problems with a huge number of variables. Such problems can be found in the manufacturing and finance industries. With this method one can also perform optimization in supply chains and logistics. A third quantum tool is provided by artificial intelligence (AI) and quantum machine learning. The idea is to improve the algorithms for conventional AI and machine learning. It is used for applications in manufacturing and the pharmaceutical industry. The fourth quantum tool is prime factorization to break or create encryption. The goal is to factorize large numbers into their prime factors. One use will be to break encryption to get access to intelligence. Another goal is to create safer communication systems and to secure critical data for finance. The algorithm used here is Shor's algorithm for factorizing large numbers.
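The hybrid character of VQE can be sketched in a few lines. The following toy example, written in plain NumPy and SciPy, illustrates only the structure of the loop, not real quantum software: the one-qubit Hamiltonian H = Z + 0.5 X and the single-parameter ansatz are invented for this sketch, and the energy evaluation, which on real hardware would be estimated from repeated measurements on the quantum device, is here done exactly by matrix algebra.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy one-qubit Hamiltonian H = Z + 0.5 X (made up for illustration).
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
H = Z + 0.5 * X

def ansatz(theta):
    """Trial state |psi(theta)> = Ry(theta)|0>, prepared on the quantum device."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """Energy expectation <psi|H|psi>; on hardware this number would be
    estimated from repeated measurements, here it is computed exactly."""
    psi = ansatz(theta)
    return psi @ H @ psi

# The classical half of the hybrid loop: a conventional optimizer
# searches for the parameter that minimizes the measured energy.
result = minimize_scalar(energy, bounds=(0.0, 2 * np.pi), method="bounded")
exact_ground = np.linalg.eigvalsh(H)[0]
print(result.fun, exact_ground)  # both close to -sqrt(1.25) ≈ -1.118
```

The classical optimizer steers the quantum state preparation; this division of labour between conventional computer and quantum device is what makes VQE a quantum-classical hybrid algorithm.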
Many potential users are already positioning themselves in the world of quantum computing. They are driven by the prospect of benefiting first and by the fear of being left behind. These users already have early in-house development. Then there is a class of users who pursue joint development with upstream players, including big companies like IBM or Google, or small start-up companies that provide consultancy to these users. Another class of users is still waiting and observing whether quantum computing will bring them a benefit. There are also users who will pass on the opportunity, but they constitute a minority.
Why should you as an industry consider moving early? Kristel Michielsen asked. It might be that it is highly profitable for you, or it might be that quantum computing is easily accessible. Among industries, there is a high degree of competition to use this new technology. There are also strategic considerations which might warrant early action. Chemical industries like Dow and BASF, oil and automotive industries, including Daimler and Volkswagen, insurance industries and banks, and aerospace industries such as Airbus and NASA are early adopters.
Kristel Michielsen showed the current value chain in quantum computing. First, there are the enablers. Their mission is to provide an enabling service or existing components to be used in quantum computers. The goal of the hardware players is to provide full hardware solutions or to address a specific hardware issue, such as the control of qubits or cooling equipment. The aim of the software players and user interface providers is to provide software interfaces between industry users and the hardware. They also identify and solve concrete customer needs: together with the industry, they can determine what kind of problems an industry has and which of them are suitable to solve with quantum computing.
As of 2019, the value chain is fragmented, as the industry has not yet settled on one hardware concept and one dominant business model. There are many different quantum platforms: superconducting qubits, semiconductor qubits, and even more exotic ones, like the spins of large molecules. Start-ups in particular tend to be active in several areas of the value chain, covering hardware as well as the accompanying software interface. Specialization and consolidation are expected once quantum computing becomes more mature.
One often poses the question of the timeline: when should we be ready to use quantum computing? According to the McKinsey report, there are three stages. The early stage is the incubation stage, running from 1981 to 2020. The target in this stage was basic research and technology development. We are now entering the second stage, the NISQ era, which is the stage of mid-term market development. NISQ stands for Noisy Intermediate-Scale Quantum technology. This stage will last for ten years, until 2030. The target in the NISQ stage is commercialization wherever quantum computing can bring early value, such as in small prototype applications. The really interesting era will be the one in which universal quantum computing can create full value, starting from 2030 onwards. Here the target is to bring quantum computing to the masses, maximize value and reduce cost.
The commitment of the European Commission is expressed in project calls ranging from basic research to the building of infrastructures. Europe wants to install a quantum computing and simulation infrastructure: an infrastructure for Big Data, artificial intelligence and high-performance computing that includes quantum computing. In 2021, the EuroHPC Joint Undertaking will support the first hybrid HPC/quantum computing infrastructure in Europe.
What is available nowadays in the area of quantum computing hardware? There are already various quantum computing devices available, such as the IBM chip, the Google Sycamore quantum processor and a Rigetti Computing quantum chip. These three have in common that they are built from superconducting qubits. Currently, the maximum number of qubits is about 53. There is another quantum computer built from superconducting qubits, but of a different type, namely an annealer, which is commercially available from D-Wave Systems. Some smaller devices are the IonQ device, which is an ion-trap quantum computer, a semiconductor quantum computer, and a neutral-atom simulator as delivered by Pasqal. All these devices have different quantum technology readiness levels, which describe the maturity of quantum computing technology.
Thomas Lippert and Kristel Michielsen made a scheme with nine quantum technology readiness levels and indicated at which level the different devices currently are. Experimental multi-qubit systems received level 2 or 3. These devices are usually rather small, and the quality of their qubits might not be as high as that of the qubits fabricated by companies like IBM, Google and Rigetti Computing. At level 8, we find the D-Wave quantum annealer. Although this is a different type of quantum computer, it has been put in the same scheme because at the moment there is only one such scheme. It has a relatively high level. Up to now, D-Wave Systems' quantum processor has doubled in size every two years. This makes it an attractive system for potential prototype applications, although it also brings huge challenges, alongside opportunities, in the development of prototype applications and use cases.
In order to solve problems optimally, to arrive at prototype applications, and later on at real-world applications, high-performance computers and quantum computers have to be linked. Nowadays, high-performance computers are used to perform HPC simulations of quantum computing and annealing devices. This provides the insight needed to understand how these quantum devices operate. As such, one can also help in the design of the quantum hardware by identifying where things do not go as expected according to quantum theory. One can also use these simulations for benchmarking. The goal, however, is to use the linked computers to perform hybrid simulations for real-world applications.
The linking of quantum hardware and HPC computers is exactly what researchers plan to do in JUNIQ, which stands for the JÜlich UNified Infrastructure for Quantum computing. JUNIQ is a quantum computer user facility that is now being set up at the Jülich Supercomputing Centre. The quantum hardware will be integrated in a modular supercomputer architecture consisting of different modules. One module is called the Cluster, which mainly consists of CPUs. The second module is the Booster, which mainly consists of GPUs. Modules 3 and 4 are a Data Analytics module and a Storage System. Finally, there are two more exotic modules: one for neuromorphic computing and one for quantum computing, where JUNIQ is hosted. The quantum module has the perspective to develop quantum-classical hybrid computing models.
JUNIQ will provide a uniform portal for access to quantum computer simulators and quantum computing technologies at different levels of maturity. The simulator part consists of the Jülich Universal Quantum Computer Simulator, which holds the world record for simulating a quantum computer with 48 qubits, and the Atos Quantum Learning Machine (QLM), which is capable of simulating a quantum computer with 30 qubits. Among the quantum computing technologies, there will be the OpenSuperQ quantum computing device with up to 100 qubits, which will be developed in the OpenSuperQ project of the European Quantum Flagship. At Jülich, one will also try to provide access to IBM and Google quantum computers. By the end of 2020 or the beginning of 2021, Jülich will host a D-Wave quantum annealer of the newest generation with more than 5000 qubits. JUNIQ does not only give access to hardware: the researchers also develop quantum algorithms, protocols, tools and prototype use cases, together with users from academia and industry in Europe. Users also receive support and training in HPC and quantum computing usage.
Kristel Michielsen went on to talk about the D-Wave quantum annealer. Up to now, researchers have not yet seen a prototype application showing quantum speed-up, but this is not the only thing one has to look for. The D-Wave quantum annealer is very efficient: it consumes only 25 kilowatts of power, while a traditional supercomputer consumes 2500 kilowatts and an exascale computer even a factor of ten more. If the D-Wave quantum annealer can serve as a special-purpose computer that solves part of an optimisation problem as well as a traditional computer does, then one already has a huge benefit in energy consumption. This is something one has to take into account, next to quantum speed-up.
As for the Open Superconducting Quantum Computer from the OpenSuperQ Quantum Flagship project, the main goal is to design, build and operate a quantum computer with up to 100 superconducting qubits. The researchers at Jülich have to benchmark the device; to simulate prototype applications, including the ground-state calculation for small molecules, which is a quantum chemistry application, by using quantum-classical hybrid algorithms; to develop software code that simulates a realistic model of the hardware; and to provide Cloud-based access to the hardware. If the researchers combine three systems in JUNIQ, namely the conventional supercomputer, one of the NISQ devices and an annealer, they can perform a benchmarking analysis of an optimisation problem. For this purpose, they use the quantum approximate optimisation algorithm (QAOA) on the conventional supercomputer and the NISQ devices, and the quantum annealing algorithm on the quantum annealer.
QAOA is also a variational quantum algorithm and, like the algorithm for the quantum chemistry problems, a hybrid one. The algorithm relies on iteratively applying a series of parametrized unitary transformations to a quantum register, measuring the resulting state and evaluating the energy expectation value. To solve the optimisation problem properly, the number of iterations p, that is, the number of layers of unitary transformations, must be at least 1. A classical optimisation algorithm is used to optimize the parameters beta and gamma of the unitary transformations. From theory it is known that if p goes to infinity and beta and gamma are chosen according to a quantum annealing scheme, the solution is guaranteed to be found.
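A minimal statevector sketch of these ingredients, written in plain NumPy for a made-up MaxCut instance on a triangle graph, may make the procedure concrete. This illustrates the algorithm's structure only, not the researchers' benchmarking setup; the crude grid search at the end stands in for the classical parameter optimizer.

```python
import numpy as np

edges = [(0, 1), (1, 2), (0, 2)]   # toy instance: MaxCut on a triangle (max cut = 2)
n = 3
dim = 2 ** n

# Diagonal of the cost function C: number of cut edges for each bitstring.
cost = np.array([sum((z >> i & 1) != (z >> j & 1) for i, j in edges)
                 for z in range(dim)], dtype=float)

def qaoa_state(gammas, betas):
    """Apply p layers of exp(-i*gamma*C) and the mixer exp(-i*beta*X) per qubit."""
    psi = np.full(dim, 1 / np.sqrt(dim), dtype=complex)   # uniform superposition
    for gamma, beta in zip(gammas, betas):
        psi = np.exp(-1j * gamma * cost) * psi            # phase separation
        rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                       [-1j * np.sin(beta), np.cos(beta)]])
        mixer = rx
        for _ in range(n - 1):
            mixer = np.kron(mixer, rx)                    # same rotation on every qubit
        psi = mixer @ psi
    return psi

def expected_cut(params):
    """Energy expectation value of C in the prepared state."""
    p = len(params) // 2
    psi = qaoa_state(params[:p], params[p:])
    return np.abs(psi) ** 2 @ cost

# Crude grid search for p = 1 stands in for the classical optimizer.
grid = np.linspace(0, np.pi, 40)
best = max(([g, b] for g in grid for b in grid), key=expected_cut)
print(expected_cut(best))   # close to the maximum cut of 2
```

Even at depth p = 1 the optimized expectation comes close to the maximum cut for this tiny instance; for larger problems, deeper circuits and a proper classical optimizer for beta and gamma are needed.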
For the benchmarking, the researchers used the Jülich Universal Quantum Computer Simulator on the conventional supercomputer and an IBM Q device. The problem the researchers solved is hard not for the conventional supercomputer but for the quantum annealer. There is a large difference between the simulation result and the result obtained on the quantum computing device. This result is not specific to the quantum computing device made by IBM: the researchers observed similar results on other quantum computing devices with superconducting qubits. To make the comparison with the D-Wave annealer, the researchers solved the same problems on the annealer. The quantum annealer performs better than the QAOA algorithm. However, the results show that the quantum annealer also has a hard time solving this optimisation problem. This example clearly shows that a quantum computer cannot solve all problems.
Another prototype application is quantum machine learning. In this application, the researchers have implemented the Support Vector Machine (SVM) on a quantum annealer. Support Vector Machines are supervised machine learning algorithms for classification and regression problems. In a classical SVM, the training corresponds to a convex quadratic optimisation problem. This is one of the rare minimisation problems in machine learning that has a global minimum. The challenge is that the global minimum for the training dataset is not necessarily optimal for the test dataset. Moving to the quantum Support Vector Machine on a D-Wave quantum annealer, the advantage of the quantum annealer is that it produces an ensemble of close-to-optimal solutions for the training data. The different solutions often emphasize different features of the training data. Applying a combination of these solutions to test data might solve the classification task better than a classical SVM does.
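A heavily simplified sketch of how such an SVM ends up on an annealer: the dual training problem is turned into a QUBO by encoding each Lagrange multiplier with a few binary variables, and the low-energy bit strings form the ensemble of classifiers. The training points, encoding parameters and penalty weight below are invented for illustration, and exhaustive enumeration stands in for the annealer's samples.

```python
import itertools
import numpy as np

# Made-up, linearly separable toy training set: 2-D points with labels +/-1.
X = np.array([[0.5, 0.5], [1.0, 0.75], [-0.5, -0.5], [-1.0, -0.25]])
y = np.array([1.0, 1.0, -1.0, -1.0])
N = len(y)

K_bits, B, xi = 2, 2.0, 1.0     # bits per multiplier, encoding base, penalty weight
kernel = X @ X.T                # linear kernel

# QUBO for the SVM dual: alpha_n = sum_k B**k * a[K_bits*n + k], with a
# penalty term xi * (sum_n alpha_n * y_n)**2 for the equality constraint.
M = K_bits * N
Q = np.zeros((M, M))
for n in range(N):
    for m in range(N):
        for k in range(K_bits):
            for j in range(K_bits):
                Q[K_bits*n + k, K_bits*m + j] += (
                    0.5 * B**(k + j) * y[n] * y[m] * (kernel[n, m] + xi))
for n in range(N):
    for k in range(K_bits):
        Q[K_bits*n + k, K_bits*n + k] -= B**k   # linear term of the dual

def energy(a):
    a = np.asarray(a, dtype=float)
    return a @ Q @ a

# Exhaustive enumeration stands in for the annealer: keep the five
# lowest-energy bit strings as the ensemble of close-to-optimal solutions.
samples = sorted(itertools.product([0, 1], repeat=M), key=energy)[:5]
alphas = [np.array([sum(B**k * a[K_bits*n + k] for k in range(K_bits))
                    for n in range(N)]) for a in samples]

def classify(alpha, x):
    """Decision function of one SVM solution (bias omitted for brevity)."""
    return np.sign(sum(alpha[n] * y[n] * (X[n] @ x) for n in range(N)))

# Let the ensemble vote on a test point from the positive side.
votes = [classify(al, np.array([0.75, 0.5])) for al in alphas]
print(np.sign(sum(votes)))
```

On the real D-Wave annealer, each read returns one such low-energy bit string; combining many reads gives the ensemble whose members emphasize different features of the training data.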
The researchers applied the quantum SVM to a classification task on biophysical data. The question was to decide whether a certain protein binds to a certain DNA sequence. The researchers observed that the quantum SVM can produce stronger classifiers than the classical SVM for the same small training data and parameters. The quantum SVM performs better for small training datasets than its classical counterpart. If one considers all datasets, the quantum SVM performs better than or comparably to the classical SVM. This is a real benefit of the quantum solution.
The hybrid usage of high-performance computers and quantum computers will be the key to the successful development of quantum computing applications.