Back to Table of contents

Primeur weekly 2020-01-06

Focus

The LUMI supercomputer is not just a very fast supercomputer, it is first of all a competence development platform - Interview with Kimmo Koski, CSC, Finland ...

Quantum computing

ORNL researchers advance performance benchmark for quantum computers ...

In leap for quantum computing, silicon quantum bits establish a long-distance relationship ...

The Quantum Information Edge launches to accelerate quantum computing R&D ...

Focus on Europe

The coolest LEGO in the universe ...

Middleware

BP looks to ORNL and ADIOS to help rein in data ...

Hardware

New year brings new directory structure for OLCF's high-performance storage system ...

GIGABYTE brings AI, Cloud solutions and smart applications to CES 2020 to enable future today ...

During its final hours of operation, the Titan supercomputer simulated the birth of supernovae ...

Big iron afterlife: How ORNL's Titan supercomputer was recycled ...

Applications

Stanford researchers build a particle accelerator that fits on a chip ...

Brain-like functions emerging in a metallic nanowire network ...

Award-winning engineer helps keep US nuclear deterrent safe from radiation ...

New algorithm could mean more efficient, accurate equipment for Army ...

Paul Ginsparg named winner of the 2020 AIP Karl Compton Medal ...

'Super' simulations offer fresh insight into serotonin receptors ...

Researchers accelerate plasma turbulence simulations on Oak Ridge supercomputers to improve fusion design models ...

Researchers accelerate plasma turbulence simulations on Oak Ridge supercomputers to improve fusion design models


The ITER divertor, illustrated in red, will sustain a high heat load as particles carrying the exhaust heat bombard its surface. Image credit: ITER Organization.
2 Jan 2020 Oak Ridge - In 1934, physicist Ernest Rutherford and his colleagues produced the first fusion reaction - the fusing of light nuclei to release energy - in a laboratory by converting deuterium, a heavy hydrogen isotope, to helium. Since then, scientists have built increasingly efficient fusion energy devices with the goal of achieving net fusion energy, or usable power. Today, the world's largest fusion experiment is being built by seven international members, including the United States. The ITER fusion facility is expected to produce 10 times more power than the thermal power required to heat the plasma, thereby demonstrating the feasibility of commercial-scale fusion power.
The DIII-D National Fusion Facility, operated by General Atomics for the U.S. Department of Energy, is the largest magnetic fusion research facility operating in the U.S. Image credit: General Atomics.

If fusion power plants become a reality, they could provide nearly inexhaustible energy using fuel derived from seawater, which is a globally abundant source of both deuterium and lithium.

But fusion has some stellar challenges to overcome first. Hot, gaseous plasma formed in a fusion device reaches extreme temperatures higher than those in the core of the Sun, nature's fusion factory. Electric currents running through the plasma rip apart hydrogen atoms into their constituent ions and electrons.

Because of these extreme and remote conditions, plasma behaviour is difficult to study experimentally, and scientists often must pair experiment with computational simulations to understand fusion processes.

That's why a team of scientists - including Christopher Holland of the University of California San Diego, Jeff Candy of General Atomics, and Nathan Howard of MIT - are using the world's smartest and fastest supercomputer, the 200-petaflop IBM AC922 Summit system at the Oak Ridge Leadership Computing Facility (OLCF), to better understand turbulence, an important characteristic of plasma behaviour that affects performance in fusion devices such as ITER.

The OLCF is a US Department of Energy (DOE) Office of Science User Facility located at DOE's Oak Ridge National Laboratory.

Following an Innovative and Novel Computational Impact on Theory and Experiment (INCITE) project led by Christopher Holland on the Cray XK7 Titan supercomputer - Summit's 27-petaflop predecessor at the OLCF that was decommissioned in 2019 - Jeff Candy is leading an Advanced Scientific Computing Research Leadership Computing Challenge (ALCC) project on Summit.

Many fusion devices use superconducting magnets to confine plasma in a tokamak, a donut-shaped vessel. The tokamak's design allows magnetic field lines to run in two directions, long and short, through the plasma.

"As charged ions and electrons move around those field lines, they spin in a helix motion. The radius of this motion is known as the gyroradius", Christopher Holland stated.

The heavier ions sweep out a much larger gyroradius than the far lighter electrons. But as the ions and electrons spin along the magnetic field, they also push and pull on each other across the field, leading to fluctuations in ion and electron speed and energy. These energized wobbles result in turbulence that can rapidly transport heat away from the plasma centre, reducing the number of fusion reactions that occur. Turbulence at one scale can inhibit or enhance turbulent fluctuations on other scales, impacting heat transport and, therefore, fusion performance.

"Standard plasma turbulence simulations only capture wavelengths at the ion scale, which is about 60 times bigger than the electron scale", Nathan Howard stated. "But we've found that simulating the larger scale alone is ineffective for explaining heat losses. We need both the long and short wavelengths in turbulence to explain levels of heat loss observed in experiment."
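The roughly 60-fold scale separation follows from the square-root mass ratio: at equal temperature, a particle's thermal speed scales as 1/sqrt(m), so its gyroradius m*v/(q*B) scales as sqrt(m), and ion and electron scales differ by sqrt(m_i/m_e). A quick check for a deuterium plasma (the scaling argument comes from basic plasma physics; the rounded constants are standard values, not from the article):

```python
import math

# Standard particle masses (rounded CODATA values), in kg
m_e = 9.109e-31          # electron mass
m_D = 2.014 * 1.661e-27  # deuteron mass, ~2.014 atomic mass units

# At equal temperature T, thermal speed v ~ sqrt(T/m), so the
# gyroradius rho = m*v/(q*B) scales as sqrt(m). The ion/electron
# scale separation is therefore sqrt(m_i / m_e).
ratio = math.sqrt(m_D / m_e)
print(f"ion/electron gyroradius ratio ~ {ratio:.1f}")  # ~60.6
```

This reproduces the "about 60 times bigger" figure quoted above.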

Augmenting experiment with high-performance computing is a must if researchers want to improve fusion performance in future reactors.

The team analyzes experimental data from the DIII-D National Fusion Facility tokamak, operated by General Atomics as a national user facility for DOE's Office of Science, and carries out unprecedented simulations with their new CGYRO gyrokinetic code on Titan and Summit. As much as 90 percent of the energy loss in fusion devices is due to turbulence, which makes understanding turbulence vital for designing an economically attractive fusion power plant.

"We want to be able to understand and predict levels of heat transport observed in an experiment so we can develop computational models that inform the design of future fusion devices and predict their performance", Nathan Howard stated. "Ultimately, we want to make fusion energy a reality."

Computational simulations enable researchers to overcome some experimental barriers created by the extreme environment in a fusion device.

"Because you have this superheated gas, you cannot measure things directly in the gas with a solid probe", Jeff Candy stated. "In DIII-D, we use diagnostics, which measure things like the radiated light to understand what is going on in the plasma core."

Not only is observing turbulence at the scale of particles impossible in an experiment, but there will also be key differences between current experiments like DIII-D and future devices like ITER, such as experiment duration, power, and size.

"A typical discharge on DIII-D will run for 4 or 5 seconds, whereas ITER discharges will extend for many minutes", Christopher Holland stated.

DIII-D uses 20 megawatts (MW) of power, whereas ITER is expected to use about 50 MW of power and produce up to 500 MW. Once built, ITER will be the world’s largest fusion reactor, with a plasma volume 10 times larger than that of any fusion device today.
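These figures can be checked against the tenfold power gain cited earlier; Q, the ratio of fusion power out to heating power in, is the standard fusion gain factor, and the megawatt values are the ones quoted above:

```python
# Fusion gain Q = fusion power out / heating power in,
# using the ITER target figures from the article.
heating_mw = 50   # expected heating power, MW
fusion_mw = 500   # expected fusion power output, MW
Q = fusion_mw / heating_mw
print(f"ITER target gain Q = {Q:.0f}")  # Q = 10
```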

Despite this larger volume, the small-scale turbulence of electrons does not get lost. Instead, it adds up.

"For larger ITER-sized plasma discharges, electrons become more important", Nathan Howard stated.

"Because you're simulating ion and electron scales together, you have to resolve both big and small structures at the same time, so the simulation grid is finer", Nathan Howard stated. "And because time scales tend to be slower for ions than electrons, you have to simulate more time steps."

Supercomputers like Summit provide the hundreds of thousands of processing cores needed to include all relevant time and spatial scales. Even simulating a 10- to 20-percent slice of the tokamak "donut" results in millions of grid points.

"The object - a unit of the computing programme - that describes the positions and velocities of the particles uses 300 gigabytes (GB), so we're moving a 300 GB object around for tens of thousands of time steps. The entire code usage is 50 times that, so it takes the full power of these cutting-edge computers", Jeff Candy stated.
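To see how a single distribution object can reach hundreds of gigabytes, consider a gyrokinetic distribution function stored over a five-dimensional phase-space grid. The back-of-envelope sketch below uses hypothetical round-number grid sizes chosen only to illustrate how the quoted ~300 GB figure can arise; they are not CGYRO's actual resolution, and the 16-byte complex-double storage is likewise an assumption:

```python
# Illustrative memory estimate for a 5D gyrokinetic distribution
# function f(r, theta, n, v_parallel, mu) per species. All grid
# sizes below are hypothetical, not CGYRO's real settings.
bytes_per_value = 16  # complex double: two 8-byte floats
grid = {
    "radial":        512,
    "toroidal_mode": 1024,
    "poloidal":      48,
    "v_parallel":    24,
    "mu":            16,
    "species":       2,
}
n_values = 1
for size in grid.values():
    n_values *= size
gb = n_values * bytes_per_value / 1e9
print(f"{n_values:.2e} values ~ {gb:.0f} GB")  # ~309 GB
```

Multiplying out even modest per-dimension resolutions quickly yields tens of billions of values, which is why the full code footprint demands leadership-class machines.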

Although Titan enabled the team to begin probing the required range of spatio-temporal scales - pushing the total number of grid points past 25 billion - and to improve the performance of the new CGYRO code, they knew Summit would provide a new level of computing performance.

"At General Atomics, we purchased two nodes with an architecture similar to Summit so we could optimize the CGYRO code for Summit's V100 GPUs, which have a lot of memory", Jeff Candy stated.

Now, on Summit, the team is running its CGYRO code six to eight times faster.

"A factor of six to eight is a really big win", Nathan Howard stated. "We're simulating enough cases to make rigorous comparisons with experiments."

J. Candy, I. Sfiligoi, E. A. Belli, K. Hallatschek, C. Holland, N. T. Howard, and E. D'Azevedo are the authors of the paper titled "Multiscale-Optimized Plasma Turbulence Simulation on Petascale Architectures", published in Computers and Fluids 188 (2019): 125 - doi:10.1016/j.compfluid.2019.04.016.

Source: Oak Ridge Leadership Computing Facility - OLCF
