
Primeur weekly 2017-04-03

Focus

Luxembourg capital of supercomputing in Europe ...

Quantum computing

Quantum communication: How to outwit noise ...

Rigetti Computing raises $64 million in Series A and B funding, led by Andreessen Horowitz and Vy Capital ...

Focus on Europe

HPC Summit Week 2017 in Barcelona to gather main HPC stakeholders in Europe ...

Six High Performance Computing centres in the UK to be officially launched ...

It is easier for a DNA knot ...

Prof. Dieter Kranzlmüller is new Chairman of the Board of Directors at Leibniz Supercomputing Centre ...

GW4 world-first ARM based production supercomputer launched at national exhibition in the UK ...

Middleware

New version of Univa Grid Engine 2x faster than Sun Grid Engine 6.2u5 ...

Huawei and Altair sign MoU to jointly pursue High-Performance Computing (HPC) opportunities and build industrial simulation Cloud solutions ...

Hardware

New HPC installation to deploy Asetek liquid cooling solution ...

HPE to acquire Nimble Storage to strengthen leadership in hybrid IT ...

New Supermicro rack scale design brings open management for on-demand Cloud scale data centres ...

Supermicro launches new Intel Optane SSD optimized platforms ...

GBP 3 million awarded to Oxford-led consortium for national machine learning computing facility ...

EPCC invests GBP 2.4 million in Tier-2 Cirrus system ...

Eurotech continues to lead innovation in embedded and high-performance computing: New designs featuring 2nd generation Intel Xeon Phi and Intel Atom E39XX ...

Officials dedicate Ohio OSC's newest, most powerful supercomputer ...

USC's Information Sciences Institute tapped to produce computing chips technology ...

Applications

Supercomputer puts Colombia on fast track to designing new HIV drugs, science and engineering discovery ...

SDSC announces Annual Supercomputing & Data Science Workshop ...

IBM patents cognitive system to manage self-driving vehicles ...

Cryo-electron microscopy achieves unprecedented resolution using new computational methods ...

Psychologists enlist machine learning to help diagnose depression ...

A seismic mapping milestone ...

Information storage with a nanoscale twist ...

Nanomagnets for future data storage ...

Researchers shoot for success with simulations of laser pulse-material interactions ...

A system based on data science to predict student dropouts ...

The Cloud

Oracle unveils industry-first Cloud Converged Storage to help organisations bridge on-premises and Oracle Cloud storage ...

A seismic mapping milestone


This visualization shows the first global tomographic model constructed using adjoint tomography, an iterative full-waveform inversion technique. The model results from data from 253 earthquakes and 15 conjugate gradient iterations, with transverse isotropy confined to the upper mantle. Credit: David Pugmire, ORNL.
28 Mar 2017 Oak Ridge - Because of Earth's layered composition, scientists have often compared the basic arrangement of its interior to that of an onion. There's the familiar thin crust of continents and ocean floors; the thick mantle of hot, semisolid rock; the molten metal outer core; and the solid iron inner core.

But unlike an onion, peeling back Earth's layers to better explore planetary dynamics isn't an option, forcing scientists to make educated guesses about our planet's inner life based on surface-level observations. Clever imaging techniques devised by computational scientists, however, offer the promise of illuminating Earth's subterranean secrets.

Using advanced modelling and simulation, seismic data generated by earthquakes, and one of the world's fastest supercomputers, a team led by Jeroen Tromp of Princeton University is creating a detailed 3D picture of Earth's interior. Currently, the team is focused on imaging the entire globe from the surface to the core-mantle boundary, a depth of 1800 miles.

These high-fidelity simulations add context to ongoing debates related to Earth's geologic history and dynamics, bringing prominent features like tectonic plates, magma plumes, and hotspots into view. In 2016, the team released its first-generation global model. Created using data from 253 earthquakes recorded by seismographic stations scattered around the world, the team's model is notable for its global scope and high scalability.

"This is the first global seismic model where no approximations--other than the chosen numerical method - were used to simulate how seismic waves travel through the Earth and how they sense heterogeneities", stated Ebru Bozdag, a coprincipal investigator of the project and an assistant professor of geophysics at the University of Nice Sophia Antipolis. "That's a milestone for the seismology community. For the first time, we showed people the value and feasibility of running these kinds of tools for global seismic imaging."

The project's genesis can be traced to a seismic imaging theory first proposed in the 1980s. To fill in gaps within seismic data maps, the theory posited a method called adjoint tomography, an iterative full-waveform inversion technique. This technique leverages more information than competing methods, using forward waves that travel from the quake's origin to the seismic receiver and adjoint waves, which are mathematically derived waves that travel from the receiver to the quake.
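
In outline, full-waveform inversion poses imaging as an optimization problem. As a hedged sketch in generic notation (not taken from the article), the method seeks the Earth model m that minimises a least-squares misfit between synthetic seismograms s and observed data d:

    \chi(\mathbf{m}) = \frac{1}{2} \sum_{r} \int_{0}^{T} \left\lVert \mathbf{s}(\mathbf{x}_r, t; \mathbf{m}) - \mathbf{d}(\mathbf{x}_r, t) \right\rVert^{2} \, dt

where the sum runs over receiver locations x_r. The economy of the adjoint approach is that the gradient of the misfit with respect to every model parameter is obtained by cross-correlating the forward wavefield with a single adjoint wavefield, generated by injecting the time-reversed residuals s - d at the receivers - roughly two simulations per earthquake per iteration, no matter how many parameters describe the model.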

The problem with testing this theory? "You need really big computers to do this", Ebru Bozdag stated, "because both forward and adjoint wave simulations are performed in 3D numerically."

In 2012, just such a machine arrived in the form of the Titan supercomputer, a 27-petaflop Cray XK7 managed by the US Department of Energy's (DOE's) Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at DOE's Oak Ridge National Laboratory. After trying out its method on smaller machines, Jeroen Tromp's team gained access to Titan in 2013 through the Innovative and Novel Computational Impact on Theory and Experiment, or INCITE, programme.

Working with OLCF staff, the team continues to push the limits of computational seismology to greater depths.

When an earthquake strikes, the release of energy creates seismic waves that often wreak havoc for life at the surface. Those same waves, however, present an opportunity for scientists to peer into the subsurface by measuring vibrations passing through the Earth.

As seismic waves travel, seismometers can detect variations in their speed. These changes provide clues about the composition, density, and temperature of the medium the wave is passing through. For example, waves move more slowly when passing through hot magma, such as mantle plumes and hotspots, than they do when passing through colder subduction zones, locations where one tectonic plate slides beneath another.
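
To make the mechanism concrete, here is a minimal, hypothetical back-of-envelope in Python; the layer thicknesses and speeds are illustrative assumptions, not values from the team's model. A wave crossing a slow, hot layer arrives measurably later, and that delay is the raw signal tomography inverts.

    # Hypothetical layered model: (thickness in km, wave speed in km/s).
    layers = [
        (35.0, 6.0),     # crust
        (635.0, 10.0),   # upper mantle
        (100.0, 9.0),    # assumed hot, slow anomaly
        (2120.0, 13.0),  # lower mantle
    ]

    # One-way vertical travel time: sum of thickness / speed per layer.
    travel_time = sum(thickness / speed for thickness, speed in layers)
    print(f"One-way travel time: {travel_time:.1f} s")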

Each seismogram represents a narrow slice of the planet's interior. By stitching many seismograms together, researchers can produce a 3D global image, capturing everything from magma plumes feeding the Ring of Fire, to the Yellowstone hotspot, to subducted plates under New Zealand.

This process, called seismic tomography, works in a manner similar to imaging techniques employed in medicine, where 2D x-ray images taken from many perspectives are combined to create 3D images of areas inside the body.

In the past, seismic tomography techniques have been limited in the amount of seismic data they can use. Traditional methods forced researchers to make approximations in their wave simulations and to restrict observational data to major seismic phases only. The adjoint tomography based on 3D numerical simulations employed by Jeroen Tromp's team isn't constrained in this way. "We can use the entire data - anything and everything", Ebru Bozdag stated.

Running its GPU version of the SPECFEM3D_GLOBE code, Jeroen Tromp's team used Titan to apply full-waveform inversion at a global scale. The team then compared these "synthetic seismograms" with observed seismic data supplied by the Incorporated Research Institutions for Seismology (IRIS), calculating the difference and feeding that information back into the model for further optimization. Each repetition of this process improves the global model.
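
The shape of that iteration can be shown with a deliberately tiny, runnable Python analogue. Everything in it is assumed for illustration - straight-ray travel times stand in for the 3D wave simulations SPECFEM3D_GLOBE performs - but the loop is the same: simulate, compare with observations, step down the misfit gradient, repeat.

    # Toy inversion: recover a medium's slowness (s/km) from "observed"
    # travel times via gradient descent on a least-squares misfit.
    distances = [100.0, 250.0, 400.0]      # km, source-receiver offsets
    true_slowness = 0.125                  # i.e. an 8 km/s wave speed
    observed = [true_slowness * d for d in distances]

    model = 0.1                            # starting guess
    step = 1e-6                            # gradient-descent step length
    for _ in range(15):                    # 15 iterations, as in the article
        synthetics = [model * d for d in distances]   # "forward simulation"
        residuals = [s - o for s, o in zip(synthetics, observed)]
        gradient = sum(r * d for r, d in zip(residuals, distances))
        model -= step * gradient           # model update

    print(f"Recovered slowness: {model:.4f} s/km (true value: {true_slowness})")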

"This is what we call the adjoint tomography workflow, and at a global scale it requires a supercomputer like Titan to be executed in reasonable timeframe", Ebru Bozdag stated. "For our first-generation model, we completed 15 iterations, which is actually a small number for these kinds of problems. Despite the small number of iterations, our enhanced global model shows the power of our approach. This is just the beginning, however."

For its initial global model, Jeroen Tromp's team selected earthquake events that registered between 5.8 and 7 on the Richter scale - a standard for measuring earthquake magnitude. That range can be extended slightly to include more than 6000 earthquakes in the IRIS database - about 20 times the amount of data used in the original model.
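
As a hedged illustration of how such an event list might be assembled - the article does not say what tooling the team uses - the ObsPy library can query the IRIS catalogue for events in that magnitude window:

    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    # Query the IRIS FDSN event service for earthquakes in the magnitude
    # range the article mentions; the date window here is illustrative.
    client = Client("IRIS")
    catalog = client.get_events(
        starttime=UTCDateTime("1995-01-01"),
        endtime=UTCDateTime("2017-01-01"),
        minmagnitude=5.8,
        maxmagnitude=7.0,
    )
    print(f"{catalog.count()} candidate events")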

Getting the most out of all the available data requires a robust automated workflow capable of accelerating the team's iterative process. Collaborating with OLCF staff, Jeroen Tromp's team has made progress toward this goal.

For the team's first-generation model, Ebru Bozdag carried out each step of the workflow manually, taking about a month to complete one model update. Team members Matthieu Lefebvre, Wenjie Lei, and Youyi Ruan of Princeton University and the OLCF's Judy Hill developed new automated workflow processes that hold the promise of reducing that cycle to a matter of days.

"Automation will really make it more efficient, and it will also reduce human error, which is pretty easy to introduce", Ebru Bozdag stated.

Additional support from OLCF staff has contributed to the efficient use and accessibility of project data. Early in the project's life, Jeroen Tromp's team worked with the OLCF's Norbert Podhorszki to improve data movement and flexibility. The end result, called Adaptable Seismic Data Format (ASDF), leverages the Adaptable I/O System (ADIOS) parallel library and gives Jeroen Tromp's team a superior file format to record, reproduce, and analyze data on large-scale parallel computing resources.
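
For readers who want to experiment with the format, a minimal sketch using pyasdf - the openly released Python implementation of ASDF, an assumption on my part since the article names only the ADIOS-based tooling - looks like this:

    import obspy
    import pyasdf

    # One ASDF volume holds waveforms, station metadata, and event
    # information together in a container suited to parallel I/O.
    ds = pyasdf.ASDFDataSet("example.h5")

    stream = obspy.read()                  # ObsPy's bundled demo waveforms
    ds.add_waveforms(stream, tag="raw_recording")
    print(ds)                              # summarise the volume's contents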

In addition, the OLCF's David Pugmire helped the team implement in situ visualization tools. These tools enabled team members to check their work more easily from local workstations by allowing visualizations to be produced in conjunction with simulation on Titan, eliminating the need for costly file transfers.

"Sometimes the devil is in the details, so you really need to be careful and know what you're looking at", Ebru Bozdag stated. "David's visualization tools help us to investigate our models and see what is there and what is not."

With visualization, the magnitude of the team's project comes to light. The billion-year cycle of molten rock rising from the core-mantle boundary and falling from the crust - not unlike the motion of globules in a lava lamp - takes form, as do other geologic features of interest.

At this stage, the resolution of the team's global model is becoming advanced enough to inform continental studies, particularly in regions with dense data coverage. Making it useful at the regional level or smaller, such as for the mantle activity beneath Southern California or the earthquake-prone crust beneath Istanbul, will require additional work.

"Most global models in seismology agree at large scales but differ from each other significantly at the smaller scales", Ebru Bozdag stated. "That's why it's crucial to have a more accurate image of Earth's interior. Creating high-resolution images of the mantle will allow us to contribute to these discussions."

To improve accuracy and resolution further, Jeroen Tromp's team is experimenting with model parameters under its most recent INCITE allocation. For example, the team's second-generation model will introduce anisotropic inversions, which are calculations that better capture the differing orientations and movement of rock in the mantle. This new information should give scientists a clearer picture of mantle flow, composition, and crust-mantle interactions.

Additionally, team members Dimitri Komatitsch of Aix-Marseille University in France and Daniel Peter of King Abdullah University in Saudi Arabia are leading efforts to update SPECFEM3D_GLOBE to incorporate capabilities such as the simulation of higher-frequency seismic waves. The frequency of a seismic wave, measured in hertz, is the number of waves passing through a fixed point in one second. For instance, the current minimum frequency used in the team's simulations is about 0.05 hertz (1 wave per 20 seconds), but Ebru Bozdag said the team would also like to incorporate seismic waves of up to 1 hertz (1 wave per second). This would allow the team to model finer details in the Earth's mantle and even begin mapping the Earth's core.
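
The arithmetic behind that goal is worth making explicit: resolvable feature size scales with wavelength, and wavelength is wave speed divided by frequency, so a twentyfold increase in frequency shrinks the resolvable scale twentyfold. A quick hedged calculation, assuming a representative mantle wave speed of 10 km/s:

    # Back-of-envelope resolution estimate; the wave speed is an assumed
    # representative value, not a figure from the article.
    speed = 10.0                           # km/s
    for frequency in (0.05, 1.0):          # current minimum vs. the 1 Hz goal
        period = 1.0 / frequency           # seconds per wave
        wavelength = speed / frequency     # km
        print(f"{frequency} Hz: period {period:.0f} s, wavelength ~{wavelength:.0f} km")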

To make this leap, Jeroen Tromp's team is preparing for Summit, the OLCF's next-generation supercomputer. Set to arrive in 2018, Summit will provide at least five times the computing power of Titan. As part of the OLCF's Center for Accelerated Application Readiness, Jeroen Tromp's team is working with OLCF staff to take advantage of Summit's computing power upon arrival.

"With Summit, we will be able to image the entire globe from crust all the way down to Earth's centre, including the core", Ebru Bozdag stated. "Our methods are expensive - we need a supercomputer to carry them out - but our results show that these expenses are justified, even necessary."

Source: DOE/Oak Ridge National Laboratory
