Primeur weekly 2020-01-06

Focus

The LUMI supercomputer is not just a very fast supercomputer, it is first of all a competence development platform - Interview with Kimmo Koski, CSC, Finland ...

Quantum computing

ORNL researchers advance performance benchmark for quantum computers ...

In leap for quantum computing, silicon quantum bits establish a long-distance relationship ...

The Quantum Information Edge launches to accelerate quantum computing R&D ...

Focus on Europe

The coolest LEGO in the universe ...

Middleware

BP looks to ORNL and ADIOS to help rein in data ...

Hardware

New year brings new directory structure for OLCF's high-performance storage system ...

GIGABYTE brings AI, Cloud solutions and smart applications to CES 2020 to enable future today ...

During its final hours of operation, the Titan supercomputer simulated the birth of supernovae ...

Big iron afterlife: How ORNL's Titan supercomputer was recycled ...

Applications

Stanford researchers build a particle accelerator that fits on a chip ...

Brain-like functions emerging in a metallic nanowire network ...

Award-winning engineer helps keep US nuclear deterrent safe from radiation ...

New algorithm could mean more efficient, accurate equipment for Army ...

Paul Ginsparg named winner of the 2020 AIP Karl Compton Medal ...

'Super' simulations offer fresh insight into serotonin receptors ...

Researchers accelerate plasma turbulence simulations on Oak Ridge supercomputers to improve fusion design models ...

During its final hours of operation, the Titan supercomputer simulated the birth of supernovae


University of Tennessee astrophysicist Eric Lentz. Image credit: Rachel Harken.
2 Jan 2020 Oak Ridge - Titan, the groundbreaking Cray XK7 supercomputer operated by the Oak Ridge Leadership Computing Facility (OLCF) at the US Department of Energy's (DOE) Oak Ridge National Laboratory (ORNL), was officially decommissioned on August 1, 2019. The petascale machine ran countless simulations over its 7 years of service, and its sheer computational power was consistently in demand by researchers. But for a brief window, just prior to Titan's decommissioning, only one simulation was running.

This simulation - the last to ever occupy the supercomputer - examined the final moments of a star's life.

When stars like our Sun run out of fuel, they become red giants and later white dwarfs. Stars at least about 10 times more massive than the Sun instead collapse into extremely dense neutron stars before launching massive shockwaves in the form of supernova explosions.

University of Tennessee astrophysicist Eric Lentz, the last user of Titan, answered a few questions about the supercomputer's impact on his project, as well as what the future holds for his research.

Eric Lentz explained that his team has been modelling core-collapse supernovae - the explosions of massive stars. The state of an exploding star changes constantly: it starts as a relatively low-density, white dwarf-like iron core, and in the first second of collapse its density increases by about four orders of magnitude. These simulations are computationally expensive, as most things on Titan tend to be, so researchers do not get many chances to repeat runs; one or two full simulations per allocation is generally the limit of what the team has been able to do.
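
To put that density jump in numbers, the sketch below uses textbook ballpark values rather than figures from the article: very roughly 1e10 g/cm^3 for a pre-collapse, white dwarf-like iron core and roughly 2.7e14 g/cm^3 for nuclear saturation density, where the proto-neutron star forms.

```python
import math

# Ballpark values assumed for illustration (not from the article):
# a pre-collapse, white dwarf-like iron core sits at very roughly
# 1e10 g/cm^3; nuclear saturation density, where the proto-neutron
# star forms, is roughly 2.7e14 g/cm^3.
rho_initial = 1.0e10  # g/cm^3, assumed pre-collapse central density
rho_final = 2.7e14    # g/cm^3, approximate nuclear saturation density

orders = math.log10(rho_final / rho_initial)
print(f"Density increase: ~{orders:.1f} orders of magnitude")
# -> ~4.4, consistent with the "about four orders of magnitude" above
```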

The simulations are part of a project to explore the variations in the inputs that go into core collapse - the properties that shape a star's fate, such as its composition and its mass. A lower-mass star like our Sun meets a very different end than a star that starts with, say, 10 or 20 times its mass. The explosions the researchers are studying, which shape developing galaxies by injecting newly made elements into them, themselves arise from a fairly wide variety of initial conditions. Among the simulations the team was running during the final month or so on Titan were models representing stars with about 25, 15, and 10 times the mass of the Sun.

Eric Lentz has been using Titan almost since the beginning of general access. The biggest change over that time has been the growth in available node and processor counts. Titan has allowed the team to properly capture the resolution needed for 3D runs, particularly with codes like theirs, in which the physics evolves during the simulation.

The overall growth of top-end supercomputers like Titan is what makes the team's complex 3D simulations possible. In particular, the team has taken advantage of Titan's node count, and that has been critical: being able to run on 20,000, 30,000, or 40,000 cores for 1000 hours per run is an extremely rare opportunity. That level of computational power is simply not available on a departmental or institutional cluster, so without machines like Titan, some of this work would not be possible at all. The team has been very pleased with what it has been able to do on Titan over the years, and the experience has whetted its appetite to do more, Eric Lentz stated.
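
For a rough sense of scale, the sketch below turns those figures into core-hours. The core counts and the 1000-hour run time are from the article; the 1,000-core "departmental cluster" is a hypothetical comparison point, and the assumption of perfect scaling is optimistic.

```python
# Core counts and the 1000-hour run time are from the article; the
# 1,000-core departmental cluster is a hypothetical comparison point,
# and perfect scaling is assumed (optimistic in practice).
WALL_HOURS = 1000
CLUSTER_CORES = 1_000  # hypothetical departmental/institutional cluster

for cores in (20_000, 30_000, 40_000):
    core_hours = cores * WALL_HOURS
    cluster_years = core_hours / CLUSTER_CORES / 24 / 365
    print(f"{cores:>6} cores x {WALL_HOURS} h = {core_hours / 1e6:.0f}M "
          f"core-hours (~{cluster_years:.1f} years on the small cluster)")
```

Even the smallest of those runs corresponds to 20 million core-hours - several years of exclusive use of a typical institutional cluster for a single simulation.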

While the team is still analyzing the data from the last run, the results already show how volatile massive stars are as they collapse and explode. They also show how much more the team needs to run to approximate the physics of core collapse ever more precisely. Studying that final simulation has also helped guide the team forward as it prepares for the jump to Summit.

Part of the dilemma researchers face in designing a simulation is balancing how well resolved it is against how long it takes to run. Eric Lentz considers one unfortunate aspect of the prior 3D work to be the 1000-hour run time needed just to get an explosion started, let alone fully developed to an asymptotic state.

Unfortunately, all of the physics is wrapped up in a single model, which means the team cannot isolate individual aspects for smaller runs. The researchers have focused on doing large-scale simulations that incorporate all the physics as well as they can approximate it, and on continually improving both those approximations and the overall performance of the simulation. However, this leaves them in an awkward position: to run simulations in 3D, they need a large computer for a long time, Eric Lentz explained.

So the team has slightly reoriented itself towards Summit. The researchers settled on a compromise that lets them run more simulations at slightly lower resolution, in exchange for running them longer and exploring more of the physics.
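
The article does not describe the code's actual cost scaling, but a common rough model for explicit 3D grid codes - cost per simulated second proportional to the fourth power of resolution, from three spatial dimensions plus a CFL-limited timestep - illustrates why a modest resolution cut can buy substantially longer runs.

```python
# Rough cost model for the resolution/run-length tradeoff described
# above. Assumption (ours, not the article's): for an explicit 3D grid
# code, the cost of advancing one simulated second scales like
# resolution**4 -- three spatial dimensions plus a CFL-limited
# timestep that shrinks in proportion to the grid spacing.
def relative_cost(resolution_factor: float) -> float:
    """Cost per simulated second relative to a baseline resolution.

    resolution_factor < 1 means a coarser grid (larger cell spacing).
    """
    return resolution_factor ** 4

baseline = relative_cost(1.0)
coarser = relative_cost(0.8)  # a "slightly less resolved" run
print(f"Cost ratio at 80% resolution: {coarser / baseline:.2f}")
print(f"Extra simulated time for the same budget: x{baseline / coarser:.1f}")
# 0.8**4 ~= 0.41, so the same core-hour budget buys roughly 2.4x more
# simulated time -- one way a modest resolution cut lets runs go longer.
```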

Eric Lentz said that right now the team is focusing on a few final improvements to the code to meet its goals for next year, and there is still some speed-up left to find. With the combination of Summit's GPUs and those code improvements, he thinks the run times can be made much shorter. On Titan, the team would run simulations modelling half a second or three-quarters of a second of core collapse, and those were roughly 1000-hour runs. With Summit, the team is aiming to run a full second in 700 or 800 hours.
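
Taking those quoted figures at face value, a short sketch of the implied throughput gain: the 0.5-0.75 simulated seconds per 1000-hour Titan run and the 700-800-hour Summit target come from the article, while the per-1000-wall-hour normalization is ours, added only for comparison.

```python
# Quoted figures: Titan runs covered 0.5-0.75 simulated seconds in
# ~1000 wall-clock hours; the Summit goal is 1 second in 700-800 hours.
def sim_seconds_per_khour(sim_seconds: float, wall_hours: float) -> float:
    """Simulated seconds delivered per 1000 wall-clock hours."""
    return sim_seconds / (wall_hours / 1000)

titan = sim_seconds_per_khour(0.75, 1000)  # best case quoted for Titan
summit = sim_seconds_per_khour(1.0, 800)   # conservative Summit target
print(f"Titan:  {titan:.2f} simulated s per 1000 wall hours")
print(f"Summit: {summit:.2f} simulated s per 1000 wall hours")
print(f"Implied throughput gain: x{summit / titan:.1f}")
# ~1.7x in the conservative case; comparing a 0.5-second Titan run
# against a 700-hour Summit target, the gain approaches ~2.9x.
```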

The code is relatively mature in that the team has run fairly long simulations in 2D, so a lot of the basic physics challenges have already been captured - though there is always the potential for the 3D components to reveal something new. Eric Lentz thinks the team has done a total of eight or nine relatively long 3D runs, plus many 2D runs that have gone much longer. That is what the team hopes to do next year with Summit: take 3D runs much further than it ever has before.
Source: Oak Ridge Leadership Computing Facility - OLCF
