
Primeur weekly 2016-11-07

Focus

EGI to finalize service catalogue and ISO certification ...

Exascale supercomputing

SLAC and Berkeley Lab researchers prepare for scientific computing on the exascale ...

Quantum computing

Researchers nearly reached quantum limit with nanodrums ...

Focus on Europe

New approach for ARM-based technology to halve the cost of powering data centres ...

PRACE to award contracts in third and final phase of Pre-Commercial Procurement (PCP) ...

PRACE welcomes new Managing Director Serge Bogaerts ...

PRACE 2016 Digest Special Edition on Industry is out ...

Supercomputer comes up with a profile of dark matter ...

Middleware

Bright Computing supplies Bright OpenStack to Stony Brook University ...

DDN Annual High Performance Computing Trends survey reveals rising deployment of flash tiers and private/hybrid Clouds versus public for HPC ...

With Corral 3, TACC provides a more unified data structure and increased space ...

Hardware

Mellanox launches open source software initiative for routers, load balancers, and firewalls ...

Mellanox Multi-Host technology reshapes data centre economics ...

Cray awarded $26 million contract from the Department of Defense High Performance Computing Modernization Programme ...

Hewlett Packard Enterprise completes acquisition of SGI ...

Centre for Modelling & Simulation in Bristol launches new supercomputer ...

Baylor University selects Cray CS400 cluster supercomputer to power innovative research ...

SGI awarded $27 million systems contract with the Army Research Laboratory Defense Supercomputing Resource Center ...

Applications

XSEDE spins off annual conference to unite research computing community ...

Researchers at UCSB explore the delicate balance between coherence and control with a simple but complete platform for quantum processing ...

Cosmic connection: KITP's Greg Huber worked with nuclear physicists to confirm a structural similarity found in both human cells and neutron stars ...

New technique for creating NV-doped nanodiamonds may be boost for quantum computing ...

New bacteria groups, and stunning diversity, discovered underground ...

The Cloud

IBM drives Cloud storage with new all-flash and software defined solutions ...

Capital markets firms continue to invest in hardware for compute Grids alongside growing Cloud adoption, according to TABB Group Research ...

SLAC and Berkeley Lab researchers prepare for scientific computing on the exascale


Development and testing of future exascale computing tools for X-ray laser data analysis and the simulation of plasma wakefield accelerators will be done on the Cori supercomputer at NERSC, the National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory. Credit: NERSC.
3 Nov 2016 Menlo Park - Researchers at the Department of Energy's SLAC National Accelerator Laboratory are playing key roles in two recently funded computing projects with the goal of developing cutting-edge scientific applications for future exascale supercomputers that can perform at least a billion billion computing operations per second - 50 to 100 times more than the most powerful supercomputers in the world today.

The first project, led by SLAC, will develop computational tools to quickly sift through enormous piles of data produced by powerful X-ray lasers. The second project, led by DOE's Lawrence Berkeley National Laboratory, will reengineer simulation software for a potentially transformational new particle accelerator technology, called plasma wakefield acceleration.

The projects, which will each receive $10 million over four years, are among 15 fully funded application development proposals and seven proposals selected for seed funding by the DOE's Exascale Computing Project (ECP). The ECP is part of President Obama's National Strategic Computing Initiative and intends to maximize the benefits of high-performance computing for U.S. economic competitiveness, national security and scientific discovery.

"Many of our modern experiments generate enormous quantities of data", stated Alex Aiken, professor of computer science at Stanford University and director of the newly formed SLAC Computer Science division, who is involved in the X-ray laser project. "Exascale computing will create the capabilities to handle unprecedented data volumes and, at the same time, will allow us to solve new, more complex simulation problems."

X-ray lasers such as SLAC's Linac Coherent Light Source (LCLS) have proven to be extremely powerful "microscopes", capable of glimpsing some of nature's fastest and most fundamental processes at the atomic level. Researchers use LCLS, a DOE Office of Science User Facility, to create molecular movies, watch chemical bonds form and break, follow the path of electrons in materials and take 3D snapshots of biological molecules that support the development of new drugs.

At the same time, X-ray lasers also generate enormous amounts of data. A typical experiment at LCLS, which fires 120 flashes per second, fills up hundreds of thousands of gigabytes of disk space. Analyzing such a data volume in a short amount of time is already very challenging, and the situation is set to become dramatically harder: the next-generation LCLS-II X-ray laser will deliver 8,000 times more X-ray pulses per second, resulting in a similar increase in data volumes and data rates. Estimates are that the data flow will greatly exceed a trillion data bits per second and will require hundreds of petabytes of online disk storage.
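As a rough check on these numbers, a few lines of arithmetic reproduce the scaling. The 120 pulses per second, the 8,000-fold increase and the trillion-bits-per-second figure are taken from the text above; the per-day extrapolation is a simple illustrative calculation, not a project estimate:

    # Back-of-the-envelope scaling from LCLS to LCLS-II data rates (Python).
    # Input figures are quoted in the article; derived numbers are plain arithmetic.
    lcls_pulses_per_s = 120                     # LCLS repetition rate
    lcls2_factor = 8_000                        # LCLS-II delivers ~8,000x more pulses
    print(f"LCLS-II pulse rate: ~{lcls_pulses_per_s * lcls2_factor:,} pulses/s")  # ~960,000

    data_rate_bits_per_s = 1e12                 # "greatly exceed a trillion bits per second"
    data_rate_gb_per_s = data_rate_bits_per_s / 8 / 1e9
    petabytes_per_day = data_rate_gb_per_s * 24 * 3600 / 1e6
    print(f"Sustained rate: ~{data_rate_gb_per_s:.0f} GB/s, "
          f"i.e. ~{petabytes_per_day:.0f} PB per day of continuous running")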

As a result of the data flood even at today's levels, researchers collecting data at X-ray lasers such as LCLS presently receive only very limited feedback regarding the quality of their data.

"This is a real problem because you might only find out days or weeks after your experiment that you should have made certain changes", stated Berkeley Lab's Peter Zwart, one of the collaborators on the exascale project, who will develop computer algorithms for X-ray imaging of single particles. "If we were able to look at our data on the fly, we could often do much better experiments."

Amedeo Perazzo, director of the LCLS Controls & Data Systems Division and principal investigator for this "ExaFEL" project, stated: "We want to provide our users at LCLS, and in the future LCLS-II, with very fast feedback on their data so that they can make important experimental decisions in almost real time. The idea is to send the data from LCLS via DOE's broadband science network ESnet to NERSC, the National Energy Research Scientific Computing Center, where supercomputers will analyze the data and send the results back to us - all of that within just a few minutes." NERSC and ESnet are DOE Office of Science User Facilities at Berkeley Lab.
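To see why the "few minutes" goal is plausible, a minimal turnaround-budget sketch helps. Only the roughly trillion-bits-per-second detector rate comes from the article; the burst length, the network-path bandwidth and the analysis throughput below are hypothetical placeholders, not project figures:

    # Hypothetical turnaround budget for the LCLS -> ESnet -> NERSC -> LCLS feedback loop.
    # Only the detector data rate is quoted in the article; the rest are assumptions.
    def turnaround_minutes(burst_s, detector_gbps, network_gbps, analysis_gbps):
        """Ship and analyse one burst of data; crude serial model, return trip ignored."""
        data_gbit = burst_s * detector_gbps
        transfer_s = data_gbit / network_gbps   # LCLS -> NERSC over ESnet
        analysis_s = data_gbit / analysis_gbps  # processing at NERSC
        return (transfer_s + analysis_s) / 60

    # 60 s of data at ~1 Tbit/s (article), over an assumed 400 Gbit/s network path,
    # analysed at an assumed 1 Tbit/s effective rate:
    print(f"~{turnaround_minutes(60, 1000, 400, 1000):.1f} minutes per burst")  # ~3.5 minutes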

X-ray data processing and analysis is quite an unusual task for supercomputers. "Traditionally these high-performance machines have mostly been used for complex simulations, such as climate modelling, rather than processing real-time data", Amedeo Perazzo stated. "So we're breaking completely new ground with our project, and foresee a number of important future applications of the data processing techniques being developed."

This project is enabled by the investments underway at SLAC to prepare for LCLS-II, with the installation of new infrastructure capable of handling these enormous amounts of data.

A number of partners will make additional crucial contributions.

"At Berkeley Lab, we’ll be heavily involved in developing algorithms for specific use cases", stated James Sethian, a professor of mathematics at the University of California, Berkeley, and head of Berkeley Lab's Mathematics Group and the Center for Advanced Mathematics for Energy Research Applications (CAMERA). "This includes work on two different sets of algorithms. The first set, developed by a team led by Nick Sauter, consists of well-established analysis programmes that we'll reconfigure for exascale computer architectures, whose larger computer power will allow us to do better, more complex physics. The other set is brand new software for emerging technologies such as single-particle imaging, which is being designed to allow scientists to study the atomic structure of single bacteria or viruses in their living state."

The "ExaFEL" project led by Perazzo will take advantage of Aiken’s newly formed Stanford/SLAC team, and will collaborate with researchers at Los Alamos National Laboratory to develop systems software that operates in a manner that optimizes its use of the architecture of the new exascale computers.

"Supercomputers are very complicated, with millions of processors running in parallel", Alex Aiken stated. "It's a real computer science challenge to figure out how to use these new architectures most efficiently."

Finally, ESnet will provide the necessary networking capabilities to transfer data between the LCLS and supercomputing resources. Until exascale systems become available in the mid-2020s, the project will use NERSC's Cori supercomputer for its developments and tests.

The second project aims at making use of exascale computing capabilities to simulate plasma wakefield accelerators. In this technology, charged particles such as electrons and their antimatter siblings, positrons, gain energy as they "surf" on waves of hot, ionized gas - known as 'plasma'.

The plasma wave is created by passing either an extremely powerful laser beam or a very energetic beam of electrons through the plasma. The first approach is being tested at facilities like Berkeley Lab's BELLA Center, and the second at SLAC's Facility for Advanced Experimental Tests (FACET), also a DOE Office of Science User Facility. Plans for a FACET-II upgrade are currently under DOE review.

Due to the extremely large accelerating electric fields in plasma waves, which are 1,000 times stronger than in conventional accelerators, the technology could lead to much more compact and potentially less expensive particle accelerators for X-ray laser science and particle physics.
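One line of arithmetic shows what that factor of 1,000 means in practice. The gradient ratio is from the article; the absolute gradient and the target beam energy below are purely illustrative assumptions:

    # Illustrative accelerator length comparison; only the factor of 1,000 is from the article.
    conventional_gradient = 0.05                        # GeV per metre, an assumed RF gradient
    plasma_gradient = 1000 * conventional_gradient      # plasma fields ~1,000x stronger
    target_energy = 10.0                                # GeV, hypothetical beam energy
    print(f"Conventional accelerator: ~{target_energy / conventional_gradient:.0f} m")  # ~200 m
    print(f"Plasma-based accelerator: ~{target_energy / plasma_gradient:.1f} m")        # ~0.2 m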

"To design plasma accelerators and better understand the complex processes inside plasmas, we frequently do demanding computer simulations", stated Jean-Luc Vay, a senior physicist at Berkeley Lab, who is the principal investigator for this exascale project. "Simulations of a single plasma stage, which do not even take into account all complexity, take days to weeks on today's supercomputers. If we ever want to be able to run simulations for future particle colliders, which will require up to 100 stages chained together, exascale computing is absolutely necessary."

The goal of the project is to take Berkeley Lab's existing accelerator simulation code "Warp" and combine it with a high-resolution refinement tool named "BoxLib" to form a new code, "WarpX", designed for future exascale computing architectures. Until these become available, tests will be performed on NERSC's Cori supercomputer.

"The code developments will be largely done at Berkeley with contributions from Lawrence Livermore National Laboratory", stated Cho-Kuen Ng, head of the Computational Electrodynamics Department in SLAC's Technology Innovation Directorate. "SLAC has decades of experience with high-performance computing in the simulation of accelerators, and we'll contribute, among other activities, by helping optimize the new code and making sure it runs fast and efficiently."

Mark Hogan, who leads SLAC's plasma acceleration efforts, stated: "The approaches at SLAC and Berkeley of creating plasma waves are quite different. Our lab's role in this project is also to ensure that Berkeley's codes will be applicable to particle-driven plasma accelerators - a technology that we'll continue to develop at FACET-II."
Source: SLAC National Accelerator Laboratory
