Back to Table of contents

Primeur weekly 2013-09-30

Focus

2013: Another year on the road to Exascale - An Interview with Thomas Sterling and Satoshi Matsuoka - Part I ...

The Cloud

VEP 2.1 is now available ...

Cloud-for-Europe gears up for the future ...

Cloud computing must tackle the security challenge ...

European Space Agency delivers its SuperSites Exploitation Platform on Interoute Virtual Data Centre ...

Oracle announces roadmap for Nimbula Director and OpenStack API integration with Oracle Exalogic Elastic Cloud ...

IBM announces US $17 million investment in new Cloud data centre ...

HP Enterprise Services launches new Cloud-based Analytics as a Service based on HP’s Big Data analytics platform ...

EuroFlash

Bull and Sinequa announce unmatched performance on bullion servers, enabling new real time Big Data experience ...

ADVA Optical Networking shows first SDN-automated end-to-end provisioning for dispersed data centres ...

European Landscape Study of Research Data Management ...

HPC Midlands and Bull lead the way in making supercomputing accessible to industry ...

Domain walls as new information storage medium ...

Counting on neodymium ...

USFlash

CoolIT Systems selected by CoinTerra to partner for liquid cooling solutions ...

Agilent Technologies accelerates smartphone and defense simulations by a factor of 64 ...

University of Utah's Christopher Johnson, leader in scientific visualization and computing, will receive 2013 IEEE Computer Society Sidney Fernbach Award ...

LSU researchers receive $4 million NSF grant for new supercomputing cluster ...

Convey Computer launches hybrid-core Memcached appliance - speeds performance, reduces cost of ownership ...

Terascala advances its vision of optimizing high-performance computing work flows for faster time to results ...

Teradata named in two Dow Jones Sustainability Indices for 4th consecutive year ...

Clot busting simulations test potential stroke treatment ...

Supercomputers help solve a 50-year homework assignment ...

Supercomputers help solve a 50-year homework assignment


26 Sep 2013 Brookhaven - Kids everywhere grumble about homework. But their complaints will hold no water with a group of theoretical physicists who've spent almost 50 years solving one homework problem - a calculation of one type of subatomic particle decay aimed at helping to answer the question of why the early universe ended up with an excess of matter.

Without that excess, the matter and antimatter created in equal amounts in the Big Bang would have completely annihilated one another. Our universe would contain nothing but light - no homework, no schools, but also no people, or planets, or stars.

Physicists long ago figured out that something must have happened to explain the imbalance - and our very existence. "Our results will serve as a tough test for our current understanding of particle physics. The fact that we have a universe made of matter strongly suggests that there is some violation of symmetry", stated Taku Izubuchi, a theoretical physicist at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory.

The physicists call it charge conjugation-parity (CP) violation. Instead of everything in the universe behaving perfectly symmetrically, certain subatomic interactions happen differently if viewed in a mirror - violating parity - or when particles and their oppositely charged antiparticles are swapped for one another - violating charge conjugation symmetry. Scientists at Brookhaven - James Cronin and Val Fitch - were the first to find evidence of such a symmetry "switch-up" in experiments conducted in 1964 at the Alternating Gradient Synchrotron, with additional evidence coming from experiments at CERN, the European Organization for Nuclear Research. Cronin and Fitch received the 1980 Nobel Prize in Physics for this work.

What was observed was the decay of a subatomic particle known as a kaon into two other particles called pions. Kaons and pions - and many other particles as well - are composed of quarks. Understanding kaon decay in terms of its quark composition has posed a difficult problem for theoretical physicists.

"That was the homework assignment handed to theoretical physicists, to develop a theory to explain this kaon decay process - a mathematical description we could use to calculate how frequently it happens and whether or how much it could account for the matter-antimatter imbalance in the universe", Taku Izubuchi stated.

The mathematical equations of Quantum Chromodynamics, or QCD - the theory that describes how quarks and gluons interact - have a multitude of variables and possible values for those variables. So the scientists needed to wait for supercomputing capabilities to evolve before they could actually solve them. The physicists invented the complex algorithms and wrote nifty software packages that some of the world's most powerful supercomputers used to describe the quarks' behaviour and solve the problem.

In the physicists' software, the particles are "placed" on an imaginary four-dimensional space-time lattice consisting of three spatial dimensions plus time. At one end of the time dimension lies the kaon, made of two kinds of quarks - a "strange" quark and an "anti-down" quark - held together by gluons. At the opposite end, they place the end products, the four quarks that make up the two pions. Then the supercomputer computes how the kaon transforms into two pions as it flies through space and time. Conducting these computations on the lattice greatly simplifies the problem.
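The lattice setup described above can be pictured with a minimal sketch. This is purely illustrative: the lattice sizes, field representation, and quark placements below are toy assumptions, not the actual production lattices or data structures used by the Brookhaven group.

```python
# Hedged sketch of the four-dimensional space-time lattice described above:
# three spatial directions plus time. The kaon's quark content sits at one
# end of the time dimension and the two pions' quark content at the other.
# Sizes and contents are illustrative only.

L, T = 4, 8  # illustrative spatial extent and time extent (lattice sites)

# Build a T x L x L x L grid, indexed as lattice[t][x][y][z].
lattice = [[[[None for _ in range(L)]   # z direction
                   for _ in range(L)]   # y direction
                   for _ in range(L)]   # x direction
                   for _ in range(T)]   # time direction

# Place the kaon's quarks (a strange quark and an anti-down quark) at the
# first time slice...
lattice[0][0][0][0] = ("strange", "anti-down")

# ...and the quark content of the two end-product pions at the last time
# slice, as the article describes (an up/anti-down pair and a down/anti-up
# pair).
lattice[T - 1][0][0][0] = ("up", "anti-down")
lattice[T - 1][1][0][0] = ("down", "anti-up")
```

In the real calculation each site carries quark and gluon field values, and the computer evaluates how the fields propagate between the two time slices; the grid here only shows the geometry of the setup.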

"We use the supercomputers to look at how each quark is flying - its velocity, direction - in other words, the dynamics of the strong QCD interaction", Taku Izubuchi stated.

Somewhere in the middle of this complicated space-time grid, with some degree of probability, the strange quark of the kaon - which the strong force keeps strongly bound with its anti-down quark partner - suddenly starts to change into a down quark by the so-called electroweak interaction. Since a kaon is heavier than two pions, the energy released creates a new quark/anti-quark pair - an "up" and an "anti-up" quark - from the vacuum. These quarks then combine with the new down quark and the leftover anti-down quark to make the two pions.

"The experiments showed how frequently these 'K→ππ' processes happen, but the part that violates CP symmetry is the strange quark converting into a down quark through the weak interaction", Taku Izubuchi stated. "That's the part we really wanted to know more about to understand the strength of this CP violation. That information will give us a hint of why the universe is matter-rich, and/or confirm the correctness of our current understanding of particle physics."

The supercomputers crunched tens of billions of numbers to evaluate the equation that describes this part of the process, producing a result that should reproduce the decay patterns and frequencies observed in the experiments.

"The result of the calculation tells us how frequently this CP-violating weak interaction occurs and the strength of the CP violation at the quark level", Taku Izubuchi stated. "It's a kind of reverse-engineering what experimenters have seen in kaon decays to solve the problem."

After publishing their initial results in 2012, the physicists further improved their calculation to more closely simulate what happens with these particles in nature. These new calculations allow them to compare their numbers with the experimental results more accurately, but they also increase the computational "cost" considerably - requiring more computing power and time. Even with the newest supercomputers, the homework would have taken many years if not for a new efficient algorithm developed by the Brookhaven group in late 2012.

"This new algorithm, called all-mode averaging (AMA), divides the whole calculation into a 'difficult' but small piece and an 'easier' large piece, and devotes more computation time to the latter part to save the total computation required", Taku Izubuchi stated. "It accelerates the speed of the computations by a factor of ten or more. This very simple idea of dividing the calculation into two pieces actually helped to reduce the statistical error of the computation by a lot."
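The split-and-correct idea behind AMA can be shown with a toy estimator. This is a hedged illustration only: the "observables" below are stand-in functions, not lattice QCD quantities, and the real AMA machinery is far more involved. The sketch shows the core trick - average a cheap approximation over many samples, then correct its bias with the expensive exact calculation on a small subset.

```python
# Toy illustration of the all-mode averaging (AMA) idea described above.
# An expensive exact measurement is replaced by many cheap approximate
# measurements plus a bias-correcting term computed on a small subset,
# reducing statistical error without paying the full cost every time.

def exact_measurement(x):
    """Stand-in for the expensive, exact observable (toy function)."""
    return x * x

def approx_measurement(x, offset=0.05):
    """Stand-in for a cheap approximation with a small systematic offset."""
    return x * x + offset

def ama_estimate(samples, n_exact=4):
    """AMA-style estimator: cheap average over all samples, corrected by
    the exact-minus-approximate difference measured on a small subset."""
    # The "easier, large piece": cheap approximation on every sample.
    cheap = sum(approx_measurement(x) for x in samples) / len(samples)
    # The "difficult, small piece": exact correction on a few samples.
    subset = samples[:n_exact]
    correction = sum(exact_measurement(x) - approx_measurement(x)
                     for x in subset) / len(subset)
    return cheap + correction
```

Because the toy offset here is constant, the correction removes it exactly: `ama_estimate([1, 2, 3, 4])` returns 7.5, the true average of the squares. In the real calculation the savings come from running the cheap piece on many more samples than the exact piece.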

Is the calculated strength of the weak interaction strong enough to account for the matter-antimatter asymmetry in the early universe?

"That's the million-dollar question", stated Taku Izubuchi. "So far people think this is not the full answer. We cannot explain why the universe is matter-rich based solely on the amount of CP violation that this kaon decay accounts for. So there may be other sources of CP violation other than the weak interaction that would be revealed if a discrepancy were found between our calculation and the experimental results."

Then Izubuchi confessed that the theorists have only solved half of their homework problem. "When we say we theoretically understood this process, it is only half true. There are two different ways the two end-result pions can combine with each other - called isospin states - and we've only solved the problem for one combination, the isospin 2 channel."

The experiments have measurements for both isospin states, so the theorists are working on calculating the second process as well.

"The other, isospin 0, is more challenging, and we are getting there by employing the faster supercomputers and new theoretical ideas and computation algorithms. But, for now, we have finished half of 50 years' homework."

This research is part of DOE's Scientific Discovery through Advanced Computing (SciDAC-3) programme "Searching for Physics Beyond the Standard Model: Strongly-Coupled Field Theories at the Intensity and Energy Frontiers", supported by the DOE Office of Science.

The supercomputing resources used for this research included: QCDCQ, a pre-commercial version of the IBM Blue Gene supercomputers, located at the RIKEN/BNL Research Center - a centre funded by the Japanese RIKEN laboratory in a co-operative agreement with Brookhaven Lab; a Blue Gene/Q supercomputer of the New York State Center for Computational Science, hosted by Brookhaven; half a rack of an additional Blue Gene/Q funded by DOE through the US-based lattice QCD consortium, USQCD; a Blue Gene/Q machine at the Edinburgh Parallel Computing Centre; the large installation of Blue Gene/P (Intrepid) and Blue Gene/Q (Mira) machines at Argonne National Laboratory funded by the DOE Office of Science; and PC cluster machines at Fermi National Accelerator Laboratory and at RIKEN.

Source: Brookhaven National Laboratory
