
Primeur weekly 2017-03-20

Focus

CEO Steve Cotter elaborates on GEANT's backbone infrastructure, service deployment and KPI development ...

Exascale supercomputing

MicroVisor, a new hypervisor technology, paves the way for low-power data centres ...

Crowd computing

iEx.ec announces token crowdsale to launch the first distributed Cloud Crowdcomputing platform ...

Quantum computing

Digital pioneering work: Volkswagen uses quantum computers ...

D-Wave and Virginia Tech join forces to advance quantum computing ...

D-Wave 2000Q system to be installed at Quantum Artificial Intelligence Lab run by Google, NASA, and Universities Space Research Association ...

Focus on Europe

NVIDIA partners with Bosch for system based on next-generation DRIVE PX Xavier platform ...

New supercomputer for RWTH Aachen ...

European 'Big Data' e-infrastructure to support biodiversity research: LifeWatch now an ERIC legal entity ...

ISC 2017 dedicates a day to deep learning ...

Hardware

Los Alamos National Laboratory donation adding to University of New Mexico supercomputing power ...

BASF selects HPE to build supercomputer for global chemical research ...

Top weather and climate sites rely on DDN storage for more accurate, faster simulations, forecasts and predictions ...

Noodle.ai unveils BEAST Enterprise Artificial Intelligence supercomputing technology - powered by NVIDIA DGX-1 AI supercomputer ...

Simons Foundation's Flatiron Institute to repurpose SDSC's 'Gordon' supercomputer ...

Applications

A scientist and a supercomputer re-create a tornado ...

Arthritis Research UK introduces IBM Watson-powered 'virtual assistant' to provide information and advice to people with arthritis ...

High-precision calculations on supercomputers help reveal the physics of the universe ...

IDx and IBM Watson Health forge alliance for eye health ...

IBM Watson Health to integrate MedyMatch Technology into cognitive imaging offerings to help doctors identify head trauma and stroke ...

Simultaneous detection of multiple spin states in a single quantum dot ...

Computer simulation of protein synthesis reveals awesome complexity of cell machinery ...

Computer simulations first step toward designing more efficient amine chemical scrubbers ...

The prototype of a chemical computer detects a sphere ...

The Cloud

Oracle announces fiscal 2017 Q3 results ...

IBM and Red Hat collaborate to accelerate hybrid Cloud adoption with OpenStack ...

High-precision calculations on supercomputers help reveal the physics of the universe

With the theoretical framework developed at Argonne, researchers can more precisely predict particle interactions, such as this simulation of a vector boson plus jet event. Image by Taylor Childers.

9 Mar 2017 Argonne - On their quest to uncover what the universe is made of, researchers at the U.S. Department of Energy's (DOE) Argonne National Laboratory are harnessing the power of supercomputers to make predictions about particle interactions that are more precise than ever before. Argonne researchers have developed a new theoretical approach, ideally suited for high-performance computing systems, that is capable of making predictive calculations about particle interactions that conform almost exactly to experimental data. This new approach could give scientists a valuable tool for describing new physics and particles beyond those currently identified.

The framework makes predictions based on the Standard Model, the theory that describes the physics of the universe to the best of our knowledge. Researchers are now able to compare experimental data with predictions generated through this framework, to potentially uncover discrepancies that could indicate the existence of new physics beyond the Standard Model. Such a discovery would revolutionize our understanding of nature at the smallest measurable length scales.
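
The logic of such a comparison can be sketched with a toy example; the bin contents, uncertainties and three-sigma threshold below are invented for illustration only and are not taken from the Argonne analysis:

    # Toy data-versus-prediction comparison (illustrative only, not the Argonne framework).
    import math

    # Invented event counts per bin of some observable, with theory uncertainties.
    predicted  = [1000.0, 420.0, 150.0, 45.0]   # Standard Model prediction
    theory_err = [30.0, 15.0, 8.0, 4.0]         # uncertainty on the prediction
    measured   = [1012.0, 431.0, 149.0, 75.0]   # "observed" counts

    for i, (pred, err, meas) in enumerate(zip(predicted, theory_err, measured)):
        stat_err  = math.sqrt(meas)             # Poisson statistical error on the data
        total_err = math.hypot(err, stat_err)   # theory and statistics combined in quadrature
        pull = (meas - pred) / total_err        # deviation in units of the total uncertainty
        note = "  <-- worth a closer look" if abs(pull) > 3.0 else ""
        print(f"bin {i}: pull = {pull:+.2f} sigma{note}")

In this toy, only a bin whose measured count sits well outside the combined uncertainty would be flagged as a possible sign of new physics; the rest are consistent with the Standard Model prediction.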

"So far, the Standard Model of particle physics has been very successful in describing the particle interactions we have seen experimentally, but we know that there are things that this model doesn't describe completely. We don't know the full theory", stated Argonne theorist Radja Boughezal, who developed the framework with her team.

"The first step in discovering the full theory and new models involves looking for deviations with respect to the physics we know right now. Our hope is that there is deviation, because it would mean that there is something that we don't understand out there", she stated.

The theoretical method developed by the Argonne team is currently being deployed on Mira, one of the fastest supercomputers in the world, which is housed at the Argonne Leadership Computing Facility, a DOE Office of Science User Facility.

Using Mira, researchers are applying the new framework to analyze the production of missing energy in association with a jet, a particle interaction of particular interest to researchers at the Large Hadron Collider (LHC) in Switzerland.

Physicists at the LHC are attempting to produce new particles that are known to exist in the universe but have yet to be seen in the laboratory, such as the dark matter that comprises a quarter of the mass and energy of the universe.

Although scientists have no way today of observing dark matter directly - hence its name - they believe that dark matter could leave a "missing energy footprint" in the wake of a collision that could indicate the presence of new particles not included in the Standard Model. These particles would interact very weakly and therefore escape detection at the LHC. The presence of a "jet", a spray of Standard Model particles arising from the break-up of the protons colliding at the LHC, would tag the presence of the otherwise invisible dark matter.
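
Concretely, "missing energy" is a transverse-momentum imbalance: the momenta of all visible particles, measured in the plane transverse to the beam, should sum to zero, so a large leftover imbalance signals something invisible recoiling against the jet. A minimal bookkeeping sketch (with made-up momenta, not real event data) looks like this:

    # Minimal sketch of a missing-transverse-momentum calculation.
    # The "visible" momenta are invented numbers, not real LHC event data.
    import math

    # (px, py) of the visible particles in GeV, e.g. the constituents of a jet.
    visible = [(120.0, 35.0), (48.0, 12.0), (15.0, -4.0)]

    # Transverse momentum must balance, so whatever is missing is attributed
    # to invisible particles (neutrinos, or possibly dark matter).
    missing_px = -sum(px for px, _ in visible)
    missing_py = -sum(py for _, py in visible)
    met = math.hypot(missing_px, missing_py)

    print(f"missing transverse momentum: {met:.1f} GeV")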

In the LHC detectors, however, the production of a particular kind of interaction - called the Z-boson plus jet process - can mimic the signature of the potential signal that would arise from as-yet-unknown dark matter particles. Radja Boughezal and her colleagues are using their new framework to help LHC physicists distinguish the Z-boson plus jet signature predicted by the Standard Model from other potential signals.

Previous attempts using less precise calculations to distinguish the two processes carried so much uncertainty that they were simply not useful for drawing the fine mathematical distinctions that could potentially identify a new dark matter signal.
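
A toy counting experiment shows why the precision of the background prediction is decisive: a hypothetical dark matter excess only becomes visible once it exceeds the combined statistical and theoretical uncertainty on the Z-boson plus jet background. The event counts and uncertainty levels below are invented purely to illustrate the point:

    # Toy counting experiment (invented numbers): how the precision of the
    # background prediction limits sensitivity to a small excess.
    import math

    background = 10000.0   # predicted Z-boson plus jet events in the search region
    signal     = 300.0     # hypothetical dark matter excess on top of it

    def significance(bkg, sig, rel_theory_err):
        """Rough significance of an excess for a given relative theory uncertainty."""
        stat_err   = math.sqrt(bkg)            # Poisson fluctuation of the background
        theory_err = rel_theory_err * bkg      # uncertainty of the prediction itself
        return sig / math.hypot(stat_err, theory_err)

    for rel_err in (0.10, 0.05, 0.01):         # 10%, 5% and 1% theory precision
        print(f"{rel_err:.0%} theory uncertainty -> {significance(background, signal, rel_err):.1f} sigma")

In this toy, a ten-percent background uncertainty buries the excess, while at the one-percent level it begins to stand out, which illustrates why pushing the background calculation to high precision matters.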

"It is only by calculating the Z-boson plus jet process very precisely that we can determine whether the signature is indeed what the Standard Model predicts, or whether the data indicates the presence of something new", stated Frank Petriello, another Argonne theorist who helped develop the framework. "This new framework opens the door to using Z-boson plus jet production as a tool to discover new particles beyond the Standard Model."

Applications for this method go well beyond studies of the Z-boson plus jet. The framework will impact not only research at the LHC, but also studies at future colliders which will have increasingly precise, high-quality data, Radja Boughezal and Frank Petriello said.

"These experiments have gotten so precise, and experimentalists are now able to measure things so well, that it's become necessary to have these types of high-precision tools in order to understand what's going on in these collisions", Radja Boughezal stated.

"We're also so lucky to have supercomputers like Mira because now is the moment when we need these powerful machines to achieve the level of precision we’re looking for; without them, this work would not be possible."

Funding and resources for this work were previously allocated through the Argonne Leadership Computing Facility's (ALCF's) Director's Discretionary programme; the ALCF is supported by the DOE Office of Science's Advanced Scientific Computing Research programme. Support for this work will continue through allocations coming from the Innovation and Novel Computational Impact on Theory and Experiment (INCITE) programme.

Source: Argonne National Laboratory
