The cosmological simulation team modelled nearly two trillion particles in each simulation to trace the distribution of matter, generating a total of eight petabytes of data. These "virtual universes" will enable scientists to compare observations from next-generation telescopes against today's reigning physics theories, including how dark energy and the mass of neutrinos, once thought to be massless, shape the large-scale structure of the universe.
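Simulations like these evolve particles under their mutual gravity over cosmic time. The production codes reach trillions of particles only through tree and particle-mesh approximations on supercomputers; the sketch below is a toy direct-summation N-body step with a leapfrog integrator, written only to illustrate the underlying calculation (all names, units, and parameter values here are illustrative assumptions, not the team's actual code).

```python
import math
import random

G = 1.0  # gravitational constant in arbitrary code units (toy value)

def accelerations(positions, masses, softening=0.05):
    """Direct-summation gravity: O(N^2) pairwise forces.

    Real cosmology codes avoid this brute-force loop with tree or
    particle-mesh methods; a softening length prevents the force
    from diverging when two particles pass very close together.
    """
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        xi, yi, zi = positions[i]
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - xi
            dy = positions[j][1] - yi
            dz = positions[j][2] - zi
            r2 = dx * dx + dy * dy + dz * dz + softening ** 2
            inv_r3 = G * masses[j] / (r2 * math.sqrt(r2))
            acc[i][0] += dx * inv_r3
            acc[i][1] += dy * inv_r3
            acc[i][2] += dz * inv_r3
    return acc

def leapfrog_step(positions, velocities, masses, dt=0.01):
    """Advance one kick-drift-kick leapfrog step (updates in place)."""
    acc = accelerations(positions, masses)
    for i in range(len(positions)):
        for k in range(3):
            velocities[i][k] += 0.5 * dt * acc[i][k]   # half kick
            positions[i][k] += dt * velocities[i][k]   # drift
    acc = accelerations(positions, masses)
    for i in range(len(positions)):
        for k in range(3):
            velocities[i][k] += 0.5 * dt * acc[i][k]   # half kick
```

Because the softened pairwise forces are symmetric, a step like this conserves total momentum, which is one basic sanity check such integrators must pass.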
"This data will be very valuable for our studies with the Large Synoptic Survey Telescope," said Katrin Heitmann, an Argonne physicist and computational scientist and an Oak Ridge Leadership Computing Facility Early Science user. Currently under construction, the telescope will image about 37 billion stars and galaxies, and supercomputing simulations are preparing the project for the monumental data analysis that will be required.
The nuclear simulation team's experiments tracked 100 billion particle histories, the chains of unique, individual neutron interactions that occur within a reactor core, on Summit's sophisticated GPU-based architecture, taking advantage of the supercomputer's full capacity. ORNL scientist Steven Hamilton said the Monte Carlo radiation transport codes used in the simulation ran 30 to 40 times faster, and with four to five times greater efficiency, on Summit than in the same experiments performed on Titan.
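In Monte Carlo transport, each "particle history" follows one neutron through randomly sampled free flights and collisions until it is absorbed or escapes. The sketch below is a deliberately minimal 1D slab-transmission model of that idea, not the production ExaSMR codes; the geometry, cross-section values, and function names are all illustrative assumptions.

```python
import math
import random

def run_history(slab_width, sigma_total, p_absorb, rng):
    """Follow one neutron history through a 1D slab (toy model).

    A history is the chain of interactions a single neutron undergoes:
    free-flight distances are sampled from an exponential distribution
    set by the total cross section, and each collision either absorbs
    the neutron or scatters it into a new direction.
    """
    x = 0.0          # position along the slab (mean free paths if sigma_total = 1)
    direction = 1.0  # +1 moving into the slab, -1 moving back out
    while True:
        # Sample distance to the next collision: p(s) ~ exp(-sigma_total * s)
        s = -math.log(rng.random()) / sigma_total
        x += direction * s
        if x < 0.0:
            return "reflected"    # leaked out the near face
        if x > slab_width:
            return "transmitted"  # leaked out the far face
        if rng.random() < p_absorb:
            return "absorbed"     # history ends at this collision
        # Otherwise scatter: in 1D, pick a new direction at random
        direction = 1.0 if rng.random() < 0.5 else -1.0

def tally(n_histories, slab_width=5.0, sigma_total=1.0, p_absorb=0.3, seed=42):
    """Run many independent histories and tally their fates."""
    rng = random.Random(seed)
    counts = {"reflected": 0, "transmitted": 0, "absorbed": 0}
    for _ in range(n_histories):
        counts[run_history(slab_width, sigma_total, p_absorb, rng)] += 1
    return counts
```

Each history is independent, which is exactly what makes this method map so well onto GPUs: Summit can run enormous numbers of such histories in parallel, and the statistical error shrinks as more are accumulated.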
Results of the simulations, detailed in Annals of Nuclear Energy, feed into ExaSMR, a project within the Exascale Computing Project. "Our hope is to more accurately predict how the reactors will behave before they are built," Hamilton said. "This research instills more confidence that the reactor is going to behave exactly as predicted."