27 Jun 2018 Frankfurt - In the session on Astrophysics and HPC at ISC'18 in Frankfurt, Germany, Eliu Huerta from the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign provided insight into what the fields of high-performance computing (HPC), high-throughput computing (HTC) and Artificial Intelligence can contribute to multimessenger astrophysics. The fusion of HPC and HTC has been essential to the discovery of gravitational waves. Multimessenger astrophysics is a driver for convergence in leadership-class supercomputers, and deep learning at scale is already actively used in the field.
Eliu Huerta started his talk by confirming that gravitational waves exist: systems of two black holes form and collide within the age of the Universe, and Einstein's theory of general relativity holds even in the most extreme astrophysical environments. The detection of these waves earned the 2017 Nobel Prize in Physics. Observations lead to theory, which informs models and simulations that eventually culminate in scientific discovery. Currently, researchers are studying black hole and neutron star collisions; future research will focus on supernovae and oscillating neutron stars.
Eliu Huerta then described the LIGO Data Grid, which is part of the National Strategic Computing Initiative and consists of nine clusters with more than 17,000 cores. It has been connected to the Open Science Grid and XSEDE since 2015. The needs of the science community span simulation and data-driven science, from data generation through data processing and transformation to data analytics. For the first time, containerized LIGO workflows can seamlessly use Blue Waters compute resources: BOSS-LDG is a novel computational framework that brings together Blue Waters and the Open Science Grid, configuring Blue Waters as an Open Science Grid compute element and combining it with an HPC container solution for scientific discovery.
HPC is used to numerically simulate neutron star collisions, combining Einstein's general relativity with magnetohydrodynamics and microphysics. There is now a fusion of HPC and HTC, containers, the Open Science Grid, the LIGO Data Grid, and CVMFS.
Turning to the gravitational wave discovery itself, Eliu Huerta said that the existing detection algorithms are computationally expensive and scale poorly, and that extending them to explore a deeper parameter space is computationally prohibitive: researchers currently probe only a 4-dimensional manifold out of the 9-dimensional signal manifold available to LIGO. He asked whether researchers are missing astrophysically motivated sources in the LIGO data. With KAGRA and LIGO-India eventually coming online, he wondered whether researchers will seize all available HPC and HTC resources.
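The cost argument can be illustrated with a toy matched filter, the classical technique behind these searches: a waveform template is slid along the detector data and correlated at every offset, so a search must repeat this for every template covering the parameter space, and the template count explodes as more dimensions are added. The chirp waveform, sample rate, and injection below are illustrative placeholders, not LIGO's actual pipeline.

```python
import numpy as np

# Toy matched filter: correlate a known template against noisy data.
# Purely illustrative; real searches use banks of many thousands of
# templates and whitened detector strain.

rng = np.random.default_rng(0)
fs = 1024                       # sample rate in Hz (illustrative)
t = np.arange(0, 1, 1 / fs)     # one second of template

# A toy "chirp": frequency sweeping upward, loosely like an inspiral.
template = np.sin(2 * np.pi * (50 * t + 40 * t**2))

# Hide a scaled copy of the template in Gaussian noise at a known offset.
data = rng.normal(0.0, 1.0, 4 * fs)
offset = 2 * fs
data[offset:offset + template.size] += 0.5 * template

# Slide the template along the data; the correlation acts as an SNR proxy.
snr = np.correlate(data, template, mode="valid")
peak = int(np.argmax(snr))
print(peak)  # lag of the best match, near the injection offset
```

Each additional template multiplies this sliding-correlation cost, which is why extending the search from a 4-dimensional to the full 9-dimensional signal manifold is prohibitive with this approach.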
A wider detector network with ever-increasing detection sensitivity demands more computational power. There is currently a six-hour time lag between gravitational wave detection and the production of a sky map. What if researchers could do this in real time? What if scientists could handle noise anomalies with no human intervention?
On disruptive changes and data revolutions, Eliu Huerta stated that the HPC and Big Data revolutions coexist, and that there should be a roadmap for their convergence. Deep learning is evolving from early optimism to breakthroughs in technology and science, and both early Artificial Intelligence and machine learning are stirring excitement. The Open Science Grid acts as a universal adapter for disparate compute resources and science communities. Aurora is coming and Summit is already there.
Eliu Huerta then expanded on emergent trends for simulation and data-driven science. The US Presidential Strategic Initiative is promoting the convergence of Big Data and HPC. Artificial Intelligence programmes aim to learn and reason like humans; machine learning offers algorithms with the ability to learn without being explicitly programmed; and deep learning is a subset of machine learning based on artificial neural networks. Deep learning is transforming the way we do science using very deep networks of artificial neurons.
Innovations at NCSA focus on adapting the existing deep learning paradigm to the classification and regression of time-series data, replacing the pixels of images with time-series vectors. HPC is used to understand the sources through numerical relativity simulations and the data sets they produce. Deep Filtering uses spectrograms; its detection sensitivity is similar to that of a matched filter in Gaussian noise, but it is orders of magnitude faster and enables the detection of new types of gravitational wave sources, Eliu Huerta explained.
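The idea of swapping image pixels for time-series vectors can be sketched as a 1-D convolutional forward pass: learned kernels slide over the signal samples exactly as 2-D kernels slide over image pixels. This is a minimal NumPy sketch of that architecture, not the NCSA group's actual network; the kernel weights here are random placeholders where a trained model would have learned values.

```python
import numpy as np

rng = np.random.default_rng(1)

def conv1d(x, kernels):
    """Valid-mode 1-D convolution of signal x with each kernel row."""
    k = kernels.shape[1]
    windows = np.lib.stride_tricks.sliding_window_view(x, k)
    return windows @ kernels.T          # shape: (len(x) - k + 1, n_kernels)

def relu(z):
    return np.maximum(z, 0.0)

# A toy noisy sinusoid stands in for one detector time series.
t = np.linspace(0, 1, 256)
signal = np.sin(2 * np.pi * 8 * t) + 0.1 * rng.normal(size=t.size)

kernels = rng.normal(size=(4, 16))      # 4 filters of width 16 (placeholders)
features = relu(conv1d(signal, kernels))
pooled = features.max(axis=0)           # global max pooling per filter

# A final dense layer would map the pooled features to outputs, e.g. a
# detection score (classification) or source parameters (regression).
w = rng.normal(size=4)
score = float(pooled @ w)
print(features.shape, pooled.shape)
```

Training such a network end to end is what replaces the hand-built template bank: the filters learn waveform morphology from examples instead of enumerating templates.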
Eliu Huerta concluded by stating that the fusion of Artificial Intelligence, HPC and scientific visualization will enable real-time detection and regression of real events in raw LIGO data. The NCSA Gravity Group's vision is to fully realize multimessenger astronomy.