Primeur live 2018-06-27

Start

Primeur Live! coverage from ISC 2018 - three issues - 50 articles ...

Focus on Europe

Atos wins deal to establish supercomputing Centre of Excellence in Wales ...

Atos delivers the most powerful supercomputer in Germany to Forschungszentrum Jülich ...

GENCI launches series of Grand Challenges to test capacity of its 9-petaflop Atos supercomputer ...

The new BSC machine is Europe's greenest supercomputer ...

Cray storage systems integrated by Atos in Joliot-Curie supercomputer for GENCI in France ...

Middleware

Storage expert DDN acquires Lustre File System capability from Intel ...

44GB/s throughput on the Boston Flash-IO Talyn - NVMeOF Solution ...

Hardware

CoolIT Systems enables liquid cooled HPC infrastructure at Dell EMC HPC and AI Innovation Lab ...

DDN announces new solutions to accelerate HPC workloads and enable the AI-ready data centre ...

Immersed Computing solution makes waves at ISC 2018 ...

Boston and Nyriad launch NVIDIA GPU‐accelerated SSD storage array at ISC 2018 ...

Applications

HPC to provide huge support in unravelling phenomena in Einstein's theories ...

Deep and machine learning, as well as advanced HPC facilities are bound to accelerate and maximize scientific discovery ...

Framework-based collaborative development key to research efficiency and sustainability in relativistic astrophysics ...

Company news

Huawei to resell Altair PBS Works for High-Performance Computing ...

RSC launches HPC-targeted hyper-converged solution based on proven RSC Tornado architecture utilizing the newest Intel SSD DC P4511 and Intel Optane SSD P4800X M.2 with IMDT ...

Framework-based collaborative development key to research efficiency and sustainability in relativistic astrophysics

27 Jun 2018 Frankfurt - In the session on Astrophysics and HPC at ISC'18 in Frankfurt, Germany, Eloisa Bentivegna from IBM Research UK addressed the topic of software between theory and observation and described the challenges and strategies in scientific computing for relativistic astrophysics. In the gravitational Universe, both the most compact and the most extended systems studied to date are dominated by gravity. Upcoming missions will collect information in unprecedented detail in the quest to probe this fundamental field. Strong gravitational fields also provide excellent laboratories for fundamental physics.

In order to model the gravitational field, scientists use a single master equation governing it. In general relativity, the gravitational field is determined by solving the Einstein field equations: G_{μν} = (8πG/c^4) T_{μν}. Here, G_{μν} is the Einstein tensor, T_{μν} is the stress-energy tensor, G is Newton's gravitational constant, and c is the speed of light. G_{μν} and T_{μν} are four-dimensional tensors, which need to be suitably projected and expanded to allow for numerical integration.
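The projection mentioned above is commonly the 3+1 (ADM) split used in numerical relativity. As a sketch (the standard textbook form, not necessarily the exact formulation used in the talk), the field equations decompose into evolution equations plus the following constraints, written here in units G = c = 1:

```latex
\begin{align}
  G_{\mu\nu} &= \frac{8\pi G}{c^4}\, T_{\mu\nu}
    && \text{(Einstein field equations)} \\
  \mathcal{H} &\equiv {}^{(3)}R + K^2 - K_{ij}K^{ij} - 16\pi\rho = 0
    && \text{(Hamiltonian constraint)} \\
  \mathcal{M}^i &\equiv D_j\!\left(K^{ij} - \gamma^{ij}K\right) - 8\pi S^i = 0
    && \text{(momentum constraints)}
\end{align}
```

Here γ_{ij} is the spatial metric, K_{ij} its extrinsic curvature with trace K, ^{(3)}R the spatial Ricci scalar, D_j the covariant derivative compatible with γ_{ij}, and ρ and S^i the energy and momentum densities seen by normal observers. These constraints are what the "typical recipe" below refers to solving for the initial data.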

Eloisa Bentivegna explained that the typical recipe consists of choosing the topology and the stress-energy content, and then solving the Einstein constraints. The problem spans several spatial and temporal scales: the radius R of the compact object needs to be resolved with about 100 gridpoints, while the orbital length scale and the surrounding matter extend to about 10R.
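A back-of-envelope calculation, assuming a uniform 3D grid, shows why these scale requirements are demanding. The figure of 25 evolved grid functions below is an illustrative assumption (roughly a BSSN-like evolution system), not a number from the talk:

```python
# Cost of a uniform 3D grid for a compact-object simulation, using
# the scales quoted in the talk: the object radius R is resolved with
# ~100 gridpoints, and the domain spans ~10R per dimension.

points_per_R = 100       # gridpoints across the compact object radius R
domain_in_R = 10         # domain size per dimension, in units of R
n_vars = 25              # assumed number of evolved grid functions
bytes_per_value = 8      # double precision

points_per_dim = points_per_R * domain_in_R   # gridpoints per dimension
total_points = points_per_dim ** 3            # gridpoints in 3D
memory_bytes = total_points * n_vars * bytes_per_value

print(f"{points_per_dim} points per dimension")
print(f"{total_points:.1e} gridpoints in 3D")
print(f"{memory_bytes / 2**30:.0f} GiB for a single timelevel")
```

Under these assumptions, a single uniform grid already needs on the order of 10^9 gridpoints and a few hundred gigabytes per timelevel, which is why production codes rely on adaptive mesh refinement rather than uniform grids.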

This setup presents several critical challenges. Modelling the gravitational Universe requires efficient and scalable parallel algorithms, correctness and optimization, and reproducibility. Exploring the physics of these systems requires scanning large parameter spaces, and realistic simulation involves multi-physics codes.

Eloisa Bentivegna described the developer community: fewer than 20 core groups worldwide, with fewer than five researchers per group. Turnover is high, with many members having fewer than three years of experience, and their basic science background includes little to no formal training in computational science. Eloisa Bentivegna stressed that cooperation is needed for code review and code reuse, as well as for economies of scale in training and support.

She talked about the framework-based collaborative development performed by the Einstein Toolkit Consortium, a community involving approximately 100 institutions and 200 members. The framework offers high portability and cross-pollination, and the capability to integrate new members quickly. The consortium has a vast repertoire of tools for modelling gravitational wave sources, new research directions capitalize on previous work, and schools and workgroups are being organized.

Eloisa Bentivegna also focused on the importance of reproducibility and open science. On 11 February 2016, the first detection of a gravitational wave was announced: a new kind of signal, independent of light and exhibiting different properties, was discovered as the "sound" of the Universe. The first experimental data in this band provided a chance for code validation, and solving the inverse problem allowed researchers to reconstruct the system's parameters and their degeneracies. The Einstein Toolkit code base is capable of modelling this event and similar ones in a transparent, ab initio way.

The simulation's physical parameters and properties are openly documented: an open tutorial provides insights into visualization, the database, system requirements, and instructions for compiling, running and analysing the simulation. The released material also includes the parameter file and code, as well as the simulation data, published on Zenodo.

According to Eloisa Bentivegna, code extension and new applications are needed in order to constrain the gravitational interaction. How much does simulating a relativistic Universe cost, she asked: the size of the smallest feature needs to be resolved, and complex spacetimes can be modelled by combining different parameters. Researchers have to construct numerical, fully relativistic spacetimes satisfying the cosmological principle above a certain threshold. This work has yielded many new insights into how nonlinear gravitational structures assemble and how light propagates through them.

Speaking about dust cosmologies, Eloisa Bentivegna raised the question of how universal these results are. Research is needed on the growth of structures, light propagation, gravitational lensing, local versus global expansion rates, and the linear versus the non-linear regime. Interaction between theory and computation is required to test long-standing conjectures in specific cases and to perform comparisons between Newtonian and relativistic codes.

Eloisa Bentivegna concluded that computational modelling is essential in relativistic astrophysics and cosmology, a field that currently faces a unique set of challenges and strategies.
Leslie Versweyveld
