Back to Table of contents

Primeur weekly 2016-06-13

Exascale supercomputing

Paul Messina discusses rewards, challenges for new exascale project ...

Quantum computing

Analogue quantum computation has been universally digitized using superconducting circuits ...

World-first pinpointing of atoms at work for quantum computers ...

Controlling quantum states atom by atom ...

Focus on Europe

Euro 2016: Computer predicts football results ...

HPC community gears up for ISC 2016 - record number of attendees expected ...

Middleware

Bright Computing partners with ProfitBricks to offer customers an elastic high performance computing solution ...

One Stop Systems introduces the GPUltima with Bright Computing's HPC Cluster Manager ...

BeeGFS Omni-Path certification: 12 GB/s per server ...

Bright Computing unveils its plans for the International Supercomputing Expo ...

Mellanox paves the way to higher efficiency data centres with 25 Gb/s Ethernet ...

IBM invests to accelerate development and commercialization of disruptive technologies in Asia ...

Hardware

DVV data centre to host new magnitUDE supercomputer ...

CSIR's Centre for High Performance Computing unveils the fastest computer in Africa ...

PNNL appoints Liyuan Liang director of Environmental Molecular Sciences Laboratory ...

Forrester Research names Cray a strong performer in Big Data Hadoop Optimized Systems ...

PRACE to issue Newsletter 17 ...

Dell enables customers to maximize performance of enterprise applications and core business workloads with PowerEdge four-socket servers ...

Michigan State University celebrates arrival of new supercomputer ...

Applications

Materials project releases massive trove of battery and molecule data ...

NERSC fields its first Student Cluster Competition team ...

Simulating chemical evolution with Piz Daint ...

Skyrmions à la carte ...

The Cloud

IBM launches industry's first development environment for Apache Spark - delivered in the Cloud for rapid adoption ...

Leadtek looks to build on its healthcare solutions, leveraging its longtime high-performance visual computing technologies, by eyeing medical IoT opportunities ...

Paul Messina discusses rewards, challenges for new exascale project


Argonne Distinguished Fellow Paul Messina has been tapped to lead the DOE and NNSA’s Exascale Computing Project with the goal of paving the way toward exascale supercomputing.

8 Jun 2016 Argonne - The U.S. exascale initiative has an ambitious goal: to develop supercomputers a hundred times more powerful than today's systems. That's the kind of speed that can help scientists make serious breakthroughs in solar and sustainable energy technology, weather forecasting, batteries and more.

Last year, President Obama announced a unified National Strategic Computing Initiative to support U.S. leadership in high-performance computing; one key objective is to pave the road toward an exascale computing system.

The U.S. Department of Energy (DOE) has been charged with carrying out that role in an initiative called the Exascale Computing Project.

Argonne National Laboratory Distinguished Fellow Paul Messina has been tapped to lead the project, heading a team with representation from the six major participating DOE national laboratories: Argonne, Los Alamos, Lawrence Berkeley, Lawrence Livermore, Oak Ridge and Sandia. The project programme office is located at Oak Ridge.

Paul Messina has made fundamental contributions to modern scientific computing and networking, and for eight years served as Director of Science for the Argonne Leadership Computing Facility, a DOE Office of Science User Facility. He will now help usher in a new generation of supercomputers with the capability to change our everyday lives.

Exascale-level computing could have an impact on almost everything, Paul Messina said. It can help increase the efficiency of wind farms by determining the best locations and arrangements of turbines, as well as optimizing the design of the turbines themselves. It can also help severe weather forecasters make their models more accurate and could boost research in solar energy, nuclear energy, biofuels and combustion, among many other fields.

"For example, it's clear from some of our pilot projects that exascale computing power could help us make real progress on batteries", Paul Messina stated.

Brute computing force is not sufficient, however, Paul Messina stated: "We also need mathematical models that better represent phenomena and algorithms that can efficiently implement those models on the new computer architectures."

Given those advances, researchers will be able to sort through the massive number of chemical combinations and reactions to identify good candidates for new batteries.

"Computing can help us optimize. For example, let's say that we know we want a manganese cathode with this electrolyte; with these new supercomputers, we can more easily find the optimal chemical compositions and proportions for each", he stated.
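The kind of search Paul Messina describes can be sketched in miniature. The snippet below is purely illustrative: the `predicted_capacity` surrogate and its numbers are invented for the example, whereas a real screening campaign would score each candidate with expensive first-principles simulations on the supercomputer. The brute-force structure of the sweep, however, is the same.

```python
from itertools import product

def predicted_capacity(mn_fraction, electrolyte_molarity):
    """Hypothetical surrogate scoring a candidate cell design.
    A real project would replace this stand-in with quantum-chemistry
    or continuum-scale simulations run at scale."""
    # Invented objective: peaks near 60% manganese and a 1.2 M electrolyte.
    return -(mn_fraction - 0.6) ** 2 - (electrolyte_molarity - 1.2) ** 2

# Sweep a coarse grid of compositions and keep the best-scoring candidate.
mn_fractions = [i / 20 for i in range(1, 20)]      # 5% .. 95% manganese
molarities = [0.5 + i * 0.1 for i in range(16)]    # 0.5 M .. 2.0 M
best = max(product(mn_fractions, molarities),
           key=lambda candidate: predicted_capacity(*candidate))
print(f"best candidate: Mn fraction {best[0]:.2f}, {best[1]:.1f} M")
```

Exascale machines make the real version of this loop tractable: each grid point becomes a full simulation rather than a one-line formula, and the grid can be far finer and higher-dimensional.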

Exascale computing will help researchers get a handle on what's happening inside systems where the chemistry and physics are extremely complex. To stick with the battery example: the behaviour of liquids and components within a working battery is intricate and constantly changing as the battery ages.

"We use approximations in many of our calculations to make the computational load lighter", Paul Messina stated, "but what if we could afford to use the more accurate - but more computationally expensive - methods?"
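A toy numerical analogue of that trade-off (it has nothing to do with actual battery codes, which use far more sophisticated approximations): truncating a series keeps a computation cheap at the price of accuracy, and spending more compute buys the error back.

```python
import math

def exp_approx(x, terms):
    """Truncated Taylor series for e**x: cheap with few terms,
    more accurate (and more work) with more."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= x / (n + 1)
    return total

# More terms means more arithmetic but a smaller error.
x = 2.0
for terms in (3, 6, 12):
    err = abs(exp_approx(x, terms) - math.exp(x))
    print(f"{terms:2d} terms -> error {err:.2e}")
```

The exascale question Paul Messina poses is exactly this dial turned the other way: with a hundred times the compute, applications can afford the "more terms" end of their own trade-offs.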

In addition, Paul Messina said that one of the project's goals is to boost U.S. industry, so the Exascale Computing Project will be working with companies to make sure the project is in step with their goals and needs.

Paul Messina spoke further on the four areas where the project will focus its efforts.

The applications software to tackle these larger computing challenges will often evolve from current codes, but will need substantial work, Paul Messina said.

First, simulating more challenging problems will require some brand-new methods and algorithms. Second, the architectures of these new computers will be different from the ones we have today, so to be able to use existing codes effectively, the codes will have to be modified. This is a daunting task for many of the teams that use scientific supercomputers today.

"These are huge, complex applications, often with literally millions of lines of code", Paul Messina stated. "Maybe an application took the team 500 person-years to write, and now you need to modify it to take advantage of new architectures, or even translate it into a different programming language."

The project will support teams that can provide the people-power to tackle a number of applications of interest, he said. For example, data-intensive calculations are expected to be increasingly important and will require new software and hardware features.

The goal is to have "mission-critical" applications ready when the first exascale systems are deployed, Paul Messina said.

The teams will also identify both what new supporting software is needed, and ways that the hardware design could be improved to work with that software before the computers themselves are ever built. This "co-design" element is central for reaching the full potential of exascale, he said.

"The software ecosystem will need to evolve both to support new functionality demanded by applications and to use new hardware features efficiently", Paul Messina stated.

The project will enhance the software stack that DOE Office of Science and NNSA applications rely on and evolve it for exascale, as well as conduct R&D on tools and methods to boost productivity and portability between systems.

For example, many tasks are the same from scientific application to application and are embodied as elements of software libraries. Teams writing new code use the libraries for efficiency - "so you don't have to be an expert in every single thing", Paul Messina explained.

"Thus, improving libraries for numerical tasks, visualization and data analytics, as well as programming languages, for example, would benefit many different users", he stated.
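The library argument can be seen in miniature even in Python's standard library, standing in here for the HPC numerical libraries Paul Messina has in mind: a tuned routine embodies expertise that every caller inherits for free.

```python
import math

# Naive summation accumulates floating-point rounding error; the library
# routine math.fsum implements a compensated-summation algorithm that
# tracks the lost low-order bits.
values = [0.1] * 10
naive = sum(values)            # hand-rolled accumulation
accurate = math.fsum(values)   # tuned library routine

print(naive == 1.0)     # False: rounding error crept in
print(accurate == 1.0)  # True: exact result
```

Every code that calls `fsum` got more accurate the day that algorithm was added, without changing a line; the same leverage is why the project invests in shared numerical, visualization and data-analytics libraries rather than leaving each application team to reinvent them.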

Teams working on these components will work closely with the applications taskforce, he said. "We'll need good communication between these teams so everyone knows what's needed and how to use the tools provided."

In addition, as researchers are able to get more and more data from experiments, they'll need software infrastructure to more effectively deal with that data.

While the computers themselves are massive, they aren't a big part of the commercial market.

"Scientific computers are a niche market, so we make our own specs to get the best results for computational science applications", Paul Messina stated. "That's what we do with most of our scientific supercomputers, including here at Argonne when we collaborated with IBM and Lawrence Livermore National Laboratory on the design of Mira, and we believe it really paid off."

For example, companies are used to building huge banks of servers for business computing applications, for which it's not usually important for one cabinet's worth of chips to be able to talk to another one. "For us, it matters a lot", he stated.

This segment will work with computer vendors and hardware technology providers to accelerate the development of particular features for scientific and engineering applications - not just those DOE is interested in, but also priorities for other federal agencies, academia and industry, Paul Messina said.

Supercomputers need very special accommodations - you can't stick one just anywhere. They need a good deal of electricity and cooling infrastructure; they take up a fair amount of square footage, and all of the flooring needs to be reinforced. This effort will work to develop sites for computers with this kind of footprint.

The Exascale Computing Project is a complex project with many stakeholders and moving parts, Paul Messina said. "The challenge will be to effectively coordinate activities in many different sites in a relatively short time frame - but the rewards are clear."

The project will be jointly funded by the U.S. Department of Energy's Office of Science and the National Nuclear Security Administration's Office of Defense Programmes.
Source: Argonne National Laboratory
