Primeur weekly 2019-08-19

Focus

Close to 800 million funding allocated to 119 HPC related research projects in European H2020 programme ...

Exascale supercomputing

Cray announces Shasta software to power the exascale era ...

DOE/NNSA and Lawrence Livermore National Laboratory announce partnership with Cray to develop NNSA's first exascale supercomputer ...

Focus on Europe

GW4 supercomputer Isambard can compete with Intel ...

Middleware

Taashee Linux Services joins Bright Partner Programme to bring clustered infrastructure management to Indian customer base ...

Hardware

Huntsville Center programme procures mobile, containerized supercomputer for Defense Department ...

Blue Waters computational resources available through 2020 ...

E4 Computer Engineering delivers AMD EPYC 7002 series processor solutions to customers, setting a new standard for the modern data centre ...

ScaleMP delivers AMD EPYC 7002 series processor solutions to customers, setting a new standard for the modern data centre ...

Grant and gift totaling more than $11 million will accelerate high-performance computing at IACS ...

Tachyum joins PCI-SIG in support of mission to satisfy performance needs of data centre, AI and HPC workloads ...

GRC launches the ICEraQ Micro, an immersion cooled micro-modular data centre solution ...

Applications

Predicting the risk of cancer with computational electrodynamics ...

Virtual 'Universe Machine' sheds light on galaxy evolution ...

Simulating blood flow at the cellular level ...

CANDLE illuminates new pathways in fight against cancer ...

Breathe in, breathe out? It's complicated ...

Brookhaven Lab and University of Delaware begin joint initiative ...

US Department of Energy to offer a sneak peek of tomorrow's Artificial Intelligent technologies ...

Subaru Corporation and ANSYS power the future of hybrid electric vehicle design ...

Supercomputing prodigies win prestigious honours ...

The Cloud

Nimbix launches HyperHub to enable point-and-click supercomputing across public, hybrid, and multiple Clouds ...

Virtual 'Universe Machine' sheds light on galaxy evolution


A UA-led team of scientists generated millions of different universes on a supercomputer, each of which obeyed different physical theories for how galaxies should form. Image: NASA, ESA, and J. Lotz and the HFF Team/STScI.
9 Aug 2019 Tucson - How do galaxies such as our Milky Way come into existence? How do they grow and change over time? The science behind galaxy formation has remained a puzzle for decades, but a University of Arizona-led team of scientists is one step closer to finding answers thanks to supercomputer simulations.
The Hubble Space Telescope took this image of Abell 370, a galaxy cluster 4 billion light-years from Earth. Several hundred galaxies are tied together by gravity. The arcs of blue light are distorted images of galaxies far behind the cluster, too faint for Hubble to see directly. Image: NASA, ESA, and J. Lotz and the HFF Team/STScI.

Observing real galaxies in space can only provide snapshots in time, so researchers who want to study how galaxies evolve over billions of years have to resort to computer simulations. Traditionally, astronomers have used this approach to invent and test new theories of galaxy formation one at a time. Peter Behroozi, an assistant professor at the University of Arizona (UA) Steward Observatory, and his team sidestepped this slow, theory-by-theory process by generating millions of different universes on a supercomputer, each of which obeyed different physical theories for how galaxies should form.

The findings, published in the Monthly Notices of the Royal Astronomical Society, challenge fundamental ideas about the role dark matter plays in galaxy formation, how galaxies evolve over time and how they give birth to stars.

"On the computer, we can create many different universes and compare them to the actual one, and that lets us infer which rules lead to the one we see", stated Peter Behroozi, the study's lead author.

The study is the first to create self-consistent universes that are such exact replicas of the real one: computer simulations that each represent a sizeable chunk of the actual cosmos, containing 12 million galaxies and spanning the time from 400 million years after the Big Bang to the present day.

Each "Ex-Machina" universe was put through a series of tests to evaluate how similar galaxies appeared in the generated universe compared to the true universe. The universes most similar to our own all had similar underlying physical rules, demonstrating a powerful new approach for studying galaxy formation.

The results from the "UniverseMachine", as the authors call their approach, have helped resolve the long-standing paradox of why galaxies cease to form new stars even when they retain plenty of hydrogen gas, the raw material from which stars are forged.

Commonly held ideas about how galaxies form stars involve a complex interplay between cold gas that collapses under gravity into dense pockets and gives rise to stars, and other processes that counteract star formation.

For example, it is thought that most galaxies harbour supermassive black holes in their centres. Matter falling into these black holes radiates tremendous energies, acting as cosmic blowtorches that prevent gas from cooling down enough to collapse into stellar nurseries. Similarly, stars ending their lives in supernova explosions contribute to this process. Dark matter, too, plays a big role, as it provides for most of the gravitational force acting on the visible matter in a galaxy, pulling in cold gas from the galaxy's surroundings and heating it up in the process.

"As we go back earlier and earlier in the universe, we would expect the dark matter to be denser, and therefore the gas to be getting hotter and hotter. This is bad for star formation, so we had thought that many galaxies in the early universe should have stopped forming stars a long time ago", Peter Behroozi stated. "But we found the opposite: galaxies of a given size were more likely to form stars at a higher rate, contrary to the expectation."

In order to match observations of actual galaxies, Peter Behroozi explained, his team had to create virtual universes in which the opposite was the case - universes in which galaxies kept churning out stars for much longer.

If, on the other hand, the researchers created universes based on current theories of galaxy formation - universes in which the galaxies stopped forming stars early on - those galaxies appeared much redder than the galaxies we see in the sky.

Galaxies appear red for two reasons. The first is only apparent and has to do with a galaxy's age: if it formed earlier in the history of the universe, it will be moving away from us faster, shifting its light towards the red end of the spectrum. Astronomers call this effect redshift. The other reason is intrinsic: if a galaxy has stopped forming stars, it will contain fewer blue stars, which typically die out sooner, and be left with older, redder stars.
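
To make the first, purely apparent effect concrete: cosmological redshift stretches every emitted wavelength by a factor of (1 + z). The short Python sketch below uses illustrative numbers only (the 450 nm blue wavelength and the redshift values are not from the study) to show how blue light from ever earlier, faster-receding galaxies is shifted towards and beyond the red.

# Illustrative only: cosmological redshift stretches wavelengths by (1 + z).
# The 450 nm "blue" wavelength and the redshift values are example numbers,
# not taken from the UniverseMachine study.
rest_wavelength_nm = 450.0  # roughly blue visible light

for z in (0.1, 0.5, 1.0, 2.0):
    observed_nm = rest_wavelength_nm * (1.0 + z)
    print(f"z = {z:3.1f}: {rest_wavelength_nm:.0f} nm emitted -> {observed_nm:.0f} nm observed")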

"But we don't see that", Peter Behroozi stated. "If galaxies behaved as we thought and stopped forming stars earlier, our actual universe would be coloured all wrong. In other words, we are forced to conclude that galaxies formed stars more efficiently in the early times than we thought. And what this tells us is that the energy created by supermassive black holes and exploding stars is less efficient at stifling star formation than our theories predicted."

According to Peter Behroozi, creating mock universes of unprecedented complexity required an entirely new approach that was not limited by computing power and memory, and provided enough resolution to span the scales from the "small" - individual objects such as supernovae - to a sizeable chunk of the observable universe.

"Simulating a single galaxy requires 10 to the 48th computing operations", he explained. "All computers on Earth combined could not do this in a hundred years. So to just simulate a single galaxy, let alone 12 million, we had to do this differently."

In addition to utilizing computing resources at NASA Ames Research Center and the Leibniz-Rechenzentrum in Garching, Germany, the team used the "Ocelote" supercomputer at the UA High Performance Computing cluster. Two thousand processors crunched the data simultaneously over three weeks. Over the course of the research project, Peter Behroozi and his colleagues generated more than 8 million universes.

"We took the past 20 years of astronomical observations and compared them to the millions of mock universes we generated", Peter Behroozi explained. "We pieced together thousands of pieces of information to see which ones matched. Did the universe we created look right? If not, we'd go back and make modifications, and check again."

To further understand how galaxies came to be, Peter Behroozi and his colleagues plan to expand the UniverseMachine to include the morphology of individual galaxies and how their shapes evolve over time.

The paper, "UNIVERSEMACHINE: The Correlation between Galaxy Growth and Dark Matter Halo Assembly from z = 0-10", is co-authored by Risa Wechsler at Stanford University, Andrew Hearin at Argonne National Laboratory and Charlie Conroy at Harvard University. Funding was provided by NASA, the National Science Foundation and the Munich Institute for Astro- and Particle Physics.
Source: University of Arizona
