20 Jun 2017 Frankfurt -

At http://ibm.com/ibmq , IBM launched its five-qubit system. The company also launched a 16-qubit system. Compared to classical computing, quantum computing allows users to tackle certain problems that they would otherwise have no chance of solving. These are problems whose run time scales exponentially with the size of the input - for example, the 'traveling salesman' problem. The salesman needs to move between different cities and find the optimal path between them. This turns out to be a very complex problem to solve: with 57 dots on the map, there are some 10^{76} different possible ways to move between all those cities. As the problem continues to grow, it becomes harder and harder for a classical computer to solve, to the point that it is no longer possible.
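The figure quoted above can be checked directly: counting every possible ordering of 57 cities gives 57! routes, which is indeed on the order of 10^{76}. A minimal sketch in Python:

```python
import math

# The article puts the number of ways to move between 57 cities
# at about 10^76; that matches counting all 57! orderings.
orderings = math.factorial(57)
print(f"{orderings:.2e}")  # about 4e+76
```

(If one instead fixes the start city and counts each direction once, the count is (57-1)!/2, around 10^74 - still far beyond exhaustive search.)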

With quantum computing, one can identify different types of algorithms that can be used to tackle these kinds of problems. The way a quantum computer is architected is fundamentally different. It allows developers to explore the exponential set of possibilities and come upon the one correct solution to the problem. It can be used in a hybrid way on top of what is being done in classical computing - one can see it almost as an accelerator - but it really requires a different way of interacting and different types of algorithms to take advantage of it. This is something anyone can take advantage of as well: IBM Q has systems for the public to use for free, with different types of user guides. You don't need to be a quantum physicist, and you don't need to know anything about linear algebra. There is a beginner's user guide that you can step into and learn how to set up the programme yourself, Christopher Schnabel explained.

IBM also has a commercial programme through which the company is going to offer the most advanced quantum computing hardware in the world. The commercial programme will have the first systems delivered this year, and it will be offering 50 qubits within a few years. At 50 qubits, one is suddenly at the point where the world's largest supercomputer is no longer able to simulate what the system can do. This is where it gets interesting, because if one has something that cannot be simulated, the question is: 'Is it really valuable?' This is the work that needs to be done with these systems right now: identifying the different types of use cases where one can take something in that 50-100 qubit range and look at how to take advantage of such a system. How does one map real problems so that one can do something useful with these systems? At IBM, developers believe that it is not the power of the systems that is going to be the real limitation. The real limitation is going to be our own imagination and our own skill in mapping the problems from what has traditionally been done in classical computing, expanding that into the space one could not access before, and figuring out how to do this on these near-term quantum systems.
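The claim about 50 qubits can be made concrete with a back-of-the-envelope calculation: a straightforward classical state-vector simulation of n qubits must store 2^n complex amplitudes, so memory grows exponentially with qubit count. A minimal sketch, assuming 16 bytes per amplitude (double-precision complex numbers):

```python
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to hold the full 2^n-amplitude state vector."""
    return (2 ** n_qubits) * bytes_per_amplitude

print(statevector_bytes(30) // 2**30, "GiB")  # 16 GiB: feasible on a workstation
print(statevector_bytes(50) // 2**50, "PiB")  # 16 PiB: beyond any single machine
```

Sixteen pebibytes just to hold the state, before any gate is applied, is why 50 qubits is commonly cited as the rough threshold where brute-force classical simulation breaks down.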

These systems right now are relatively small, and the qubits have errors associated with them. It will take some time before one moves to much larger systems that are fault-tolerant, meaning that they don't have errors, with multiple - perhaps up to 1000 - physical qubits making up one logical qubit with no error. Such a larger system would be a universal quantum computer, meaning it has a full gate set with which any type of algorithm can be mapped onto the system. There have been different discoveries in the field of quantum algorithms, for instance Shor's algorithm, which factors very large numbers into their primes. This type of factorization requires such a fault-tolerant, universal quantum computer. IBM is moving along that path, but it is still decades away, and in the meantime we have approximate universal quantum computers. These quantum computers still have a universal set of gates with which one can implement algorithms, while mitigating and tolerating the error that is present in the system. The real trick right now in quantum computing, and a lot of the research work that needs to be done, is around how one takes those problems and maps them onto the approximate universal quantum systems that will be available in the near term, Christopher Schnabel explained.
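The error-correction overhead mentioned above translates into simple arithmetic: if each error-free logical qubit requires on the order of 1000 physical qubits, even a modest logical machine is physically very large. A sketch, with the 1000:1 ratio taken from the article (the actual overhead depends on the error-correcting code and the physical error rate):

```python
PHYSICAL_PER_LOGICAL = 1000  # the article's "perhaps up to 1000" estimate

def physical_qubits(n_logical: int, overhead: int = PHYSICAL_PER_LOGICAL) -> int:
    """Physical qubits needed for n error-corrected logical qubits."""
    return n_logical * overhead

# Even 100 logical qubits would already need ~100,000 physical qubits,
# far beyond the 5-50 qubit devices discussed here.
print(physical_qubits(100))
```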

Users have the opportunity to learn about all this at the IBM Q website, to explore and hopefully take advantage of it, because we are at the beginning of a long phase of learning how to apply quantum computing. IBM is very excited to be a leader in this and to be able to help move the world into this next generation of computing.

*Primeur Magazine* asked whether a 25-qubit quantum computer is half as fast as a 50-qubit quantum computer.

Christopher Schnabel answered that, because of how these quantum computers scale, adding qubits provides an exponentially growing space within which to work. All of these qubits enable superposition across the different pieces. What this means is that, at a maximum, each added qubit doubles the number of possible paths through the system. However, that is really the maximum; when talking about the power of a quantum computer, particularly in the approximate universal quantum computing space, one needs to talk about the notion of quantum volume. This expresses what type of computing one is actually able to do with the system. If one compares the 5-qubit system to the 16-qubit system, the 16-qubit system has about twice the quantum volume of the 5-qubit system. Why is that? Although the number of qubits has grown, one also needs to drive the error rates down and improve the connectivity within the system - not all qubits are connected to all other qubits. Going from the 16-qubit system to the 17-qubit system, one is again going to double the quantum volume. Quantum volume is really the measure of power one needs to use for a quantum system.
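The doubling per qubit shows why "half as fast" is the wrong mental model even before quantum volume is taken into account: the raw state space of a 50-qubit machine is not twice that of a 25-qubit machine but 2^25 times larger. A quick sketch:

```python
def state_space(n_qubits: int) -> int:
    """Dimension of the state space: each added qubit doubles it."""
    return 2 ** n_qubits

# 50 qubits vs 25 qubits: not 2x, but 2^25 (~33.5 million) times larger.
ratio = state_space(50) // state_space(25)
print(ratio)
```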

*Primeur Magazine* wanted to know what the power consumption of a quantum computer is.

Christopher Schnabel answered that the processor itself is superconducting, so its power consumption is very low, though the system effectively looks like a very large cylindrical refrigerator. The processor is kept at 15 milli-Kelvin. IBM uses commercial components to refrigerate it in a closed-loop system. A lot of the power consumption is really driven by the cooling needed to maintain the system. The cooling is what allows IBM to maintain the superconducting state of the processor and to keep the noise level low, because a large part of the quantum volume depends on keeping those error rates down.

Christopher Schnabel hoped that everybody would take advantage of the IBM Quantum Experience because it is there for everyone to use.
