Back to Table of contents

Primeur weekly 2014-06-30

Special

Innovative HTC facilities needed to support computational genomics at the application and data level ...

HPC-assisted cell-based immunotherapy successful in curing melanoma ...

Restless hearts are simulated with real-time modelling at IBM Research in Zurich ...

A live report from the Adapteva A-1 "smallest supercomputer in the world" launch at ISC'14 ...

The Cloud

HP launches Helion Managed Services for optimizing Cloud storage workloads ...

Oracle unveils next generation Virtual Compute Appliance ...

Desktop Grids

ATLAS@Home crowd computing project launched at CERN ...

EuroFlash

a.s.r. uses ADVA Optical Networking's GNOC for critical network monitoring and maintenance ...

ADVA Optical Networking launches new era of data centre connectivity with Big Data transport solution ...

KTH wins from crystal clear insight into application performance with Allinea Performance Reports ...

Neurological simulation milestone reached after UCL embraces Allinea's tools on UK's largest supercomputer ...

Bright Computing is building on its success in data centres across Europe ...

UK Atomic Weapons Establishment launches SGI supercomputer ...

Spectra Logic tape library to archive the UK's fastest supercomputer ...

Physicists find way to boot up quantum computers 72 times faster than previously possible ...

In the fast lane: Mediatec uses Calibre UK LEDView530 scalers at FIA World Endurance Championships ...

The upcoming cybernetic age is one of intellectual capital ...

International Supercomputing Conference moves to Frankfurt, Germany in 2015 ...

USFlash

DataDirect Networks helps EMSL speed climate, energy and bioscience discoveries with high performance and massively scalable storage ...

Spectra and the Tandy Supercomputer shorten calculation times from days to minutes, saving time and lives ...

New A*STAR-SMU centre combines high-powered computing and behavioural sciences to study people-centric issues ...

Scheduling algorithms based on game theory make better use of computational resources ...

National Renewable Energy Laboratory supercomputer tackles power grid problems ...

Simulations help scientists understand and control turbulence in humans and machines ...

Stampede supercomputer enables discoveries throughout science and engineering ...

Supercomputing simulations crucial to the study of Ras protein in determining anticancer drugs ...

Stampede supercomputer powers innovations in DNA sequencing technologies ...

Stampede supercomputer helps researchers design and test improved hurricane forecasting system ...

NSF-supported Stampede supercomputer powers innovations in materials science ...

D-Wave and predecessors: From simulated to quantum annealing ...

HPC server market shrinks 9.6% in the first quarter of 2014, according to IDC ...

University of Maryland's Deepthought2 debuts in global supercomputer rankings ...

Nine ways NSF-supported supercomputers help scientists understand and treat the disease ...

CAST releases wysiwyg R33 ...

Physicists find way to boot up quantum computers 72 times faster than previously possible

23 Jun 2014 Saarbrücken - Press the start button, switch on the monitor, grab a cup of coffee and off you go. That is pretty much how most of us experience booting up a computer. But with a quantum computer the situation is very different. So far, researchers have had to spend hours making dozens of adjustments and fine calibrations in order to set up a chip with just five quantum bits so that it can be used for experimental work. One quantum bit or 'qubit' is the quantum physical equivalent of a single bit in a conventional computer. Any small error in the adjustment and calibration procedure would leave the chip unusable.

The problem is that, not unlike musical instruments, quantum computers react to small changes in the local environment. If, for example, it is a little warmer or a little colder or if the ambient air pressure is a little higher or a little lower than the day before then the complex network of qubits will no longer function - the computer is detuned and has to be readjusted before it can be used.

"Up until now, experimental quantum physicists have had to sit down each day and see how conditions have changed compared to the day before. They then had to remeasure each parameter and carefully recalibrate the chip", explained Frank Wilhelm-Mauch, Professor of Theoretical Quantum and Solid-State Physics at Saarland University. Only a very small error rate of less than 0.1 percent is permissible when measuring ambient conditions.

Frank Wilhelm-Mauch explained this sensitivity thus: "That means that an error can occur in only one in a thousand measurements. If just two in a thousand measurements are in error, the software will be unable to correct for the errors and the quantum computer will not operate correctly." With around 50 different parameters involved in the calibration process, one begins to get an idea of the sheer effort involved in calibrating a quantum computer.

Working together with his doctoral student, Frank Wilhelm-Mauch began to consider a fundamentally new approach to the problem. "We asked ourselves the question: Why is it necessary each and every day to understand how conditions differ from those of the day before? The answer we eventually came up with was that it isn't necessary. What's important is that the set-up procedure produces the right results. Why it produces the right results is not so relevant." It was this pragmatic approach that underlay the work carried out by Wilhelm-Mauch and his doctoral student Egger. "For the calibration procedure we used an algorithm from engineering mathematics, strictly speaking from the field of civil and structural engineering, as that's another area in which experiments are costly", explained Professor Wilhelm-Mauch.

Using this technique, the two theoreticians were able to reduce the calibration error rate to below the required 0.1 percent threshold, while at the same time speeding up the calibration process from six hours to five minutes. The Saarbrücken methodology, which goes under the name Adaptive Hybrid Optimal Control (Ad-HOC), has now been subjected to rigorous testing by a group of experimental physicists from the University of California, Santa Barbara. Their experimental work is published in the issue of Physical Review Letters that also contains the Saarbrücken paper.
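The article does not spell out the Ad-HOC algorithm, but the task it automates can be pictured as derivative-free optimization: treat the chip as a black box whose fidelity can be measured, and search the control parameters for the setting that maximizes it. The sketch below is a minimal illustration under assumptions of our own: a hypothetical two-parameter fidelity function stands in for hardware measurements, and a simple pattern search stands in for the actual optimizer described in the Physical Review Letters paper.

```python
# Hypothetical stand-in for a fidelity measurement on the chip: a smooth
# function of two control parameters with a known optimum at (0.3, 0.7).
# Illustrative only -- the real objective is measured on hardware.
def measured_fidelity(params):
    x, y = params
    return 1.0 - ((x - 0.3) ** 2 + (y - 0.7) ** 2)

def calibrate(objective, start, step=0.25, tol=1e-4, max_evals=2000):
    """Derivative-free pattern search: probe each parameter up and down,
    keep any improvement, and halve the step when no probe helps."""
    best = list(start)
    best_f = objective(best)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(best)):
            for delta in (step, -step):
                cand = list(best)
                cand[i] += delta
                f = objective(cand)
                evals += 1
                if f > best_f:
                    best, best_f = cand, f
                    improved = True
        if not improved:
            step *= 0.5
    return best, best_f

params, fid = calibrate(measured_fidelity, start=[0.0, 0.0])
print(params, fid)  # converges near (0.3, 0.7) with fidelity near 1.0
```

The appeal of such black-box methods for this setting is exactly what Wilhelm-Mauch describes: the optimizer does not need to understand why the environment changed, only to find parameters that make the measured result right.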

This development is of major importance for future experimental research into quantum computing. Physicists in quantum computing laboratories no longer have to spend hours every day preparing their system for just a short period of experimental work. "As many of the parameters, such as temperature, light and air pressure, do not remain stable during a long calibration phase, a lengthy set-up further shortens the time window in which the chip is running error-free and in which it can therefore be used for experiments", stated Frank Wilhelm-Mauch, adding that the new method is scalable. Up until now, technical constraints have meant that experiments have been carried out using a single chip housing five qubits that perform the actual computational operations. The new method, in contrast, is not restricted to chips of this magnitude and can be applied to quantum processors of almost any size.

Frank Wilhelm-Mauch jokingly pointed out another appealing feature of the new methodology: "Unlike the previous approach of manual calibration, our method is fully automated. The researcher really does just push a button like on a conventional computer. They can then go off to get themselves a coffee while the quantum computer boots up." This is a major improvement in the life of experimental research scientists working in the field.

The fundamental principle of quantum technology is that a particle, e.g. an atom, electron or photon, can be in two quantum-mechanical states at the same time. This is referred to as a superposition of states. In a conventional computer, information is represented by bits with each bit assuming either the value 0 or 1. In a quantum computer, in contrast, information is carried by quantum bits or 'qubits' with each qubit able to assume the values 0 or 1 or any combination ('superposition') of the two. One way of realizing a quantum computer is with a memory unit composed of atoms whose quantum states can be excited and manipulated in a controlled manner using laser light.

Computational operations can then be performed simultaneously or 'in parallel' on both parts of the superposition state (1 and 0). In the time it takes for a 32-bit conventional computer to process one of its 2 to the power of 32 possible states, a quantum computer can process all of these states in parallel. The quantum computer can therefore carry out computations orders of magnitude faster than a normal computer. However, quantum computing power can only be exploited for special problems for which appropriate quantum algorithms have been developed.
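The counting argument above can be made concrete with a toy state-vector calculation (our own illustration; it is a classical simulation and so gains no speedup, but it shows how n qubits carry 2**n amplitudes at once):

```python
from math import sqrt

def hadamard_all(n):
    """Return the state vector after applying a Hadamard gate to each of
    n qubits starting from |00...0>: an equal superposition of all 2**n
    basis states, each with amplitude 1/sqrt(2**n)."""
    dim = 2 ** n
    amp = 1.0 / sqrt(dim)
    return [amp] * dim

n = 5  # the five-qubit chips mentioned in the article
state = hadamard_all(n)
print(len(state))                  # 32 amplitudes for 5 qubits
print(sum(a * a for a in state))   # ~1.0: probabilities sum to one
```

For 32 qubits the same list would already need 2**32 amplitudes, which is the state space the article compares against a 32-bit conventional computer.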

In many of the superposition states, the quantum bits are 'entangled', which means that the superposition can only be described as a whole and not in terms of the independent states of the particles involved. However, both superposed and entangled states are highly sensitive to any interaction with their environment and rapidly lose their quantum character. For quantum computing, this means that a great deal of effort has to be put into screening the system from environmental influences. In another area of quantum technology, this sensitivity to environmental factors is being specifically exploited. In the field of quantum communication, confidential information can be encoded in the form of entangled or superposed states. Anyone endeavouring to access the information would end up destroying the quantum state and the attempted interception would be discovered.
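The claim that an entangled superposition cannot be described qubit-by-qubit can be checked directly for two qubits: writing the four amplitudes as a 2x2 matrix, the state factors into two independent single-qubit states exactly when that matrix has zero determinant (a standard Schmidt-rank argument; the code is our illustration, not from the article):

```python
from math import sqrt

def is_product_state(amps, tol=1e-12):
    """A two-qubit state [a00, a01, a10, a11] is a product of two
    single-qubit states iff the matrix [[a00, a01], [a10, a11]] has
    zero determinant (Schmidt rank 1); otherwise it is entangled."""
    a00, a01, a10, a11 = amps
    return abs(a00 * a11 - a01 * a10) < tol

bell = [1 / sqrt(2), 0.0, 0.0, 1 / sqrt(2)]   # entangled Bell state
plus_plus = [0.5, 0.5, 0.5, 0.5]              # |+>|+>, a product state

print(is_product_state(bell))        # False: not describable qubit-by-qubit
print(is_product_state(plus_plus))   # True: splits into independent qubits
```

The Bell state is the textbook example of a superposition that can only be described as a whole, which is exactly the property quantum communication exploits for eavesdropping detection.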

The research work "Adaptive Hybrid Optimal Quantum Control for Imprecisely Characterized Systems" was published on 20 June in the journal Physical Review Letters - DOI: 10.1103/PhysRevLett.112.240503.
Source: Saarland University
