Primeur weekly 2017-02-20

Focus

HPC expert Genias Benelux to show its skillful expertise on brand-new website ...

Are billion Euro Flagships the right way to finance innovative areas like graphene, human brain research and quantum computing? ...

Exascale supercomputing

Advanced fusion code led by PPPL selected to participate in Early Science Programmes on three new DOE Office of Science pre-exascale supercomputers ...

Focus on Europe

From robotics to particle physics: Data analytics gets the spotlight in Distinguished Talk series at ISC 2017 ...

A new spin on electronics ...

Data mining tools for personalized cancer treatment ...

Why host HPC in Iceland to tackle Big Data for life sciences at Earlham Institute ...

Biological experiments become transparent - anywhere, any time ...

Middleware

IBM delivers new platform to help clients address storage challenges at massive scale ...

Hewlett Packard Enterprise unveils most significant 3PAR Flash storage innovations to date ...

Hardware

Tokyo Institute of Technology partners with DDN on Tsubame3.0 to build forward-looking AI and Big Data computing infrastructure ...

Mellanox demonstrates four times improvement in crypto performance with Innova IPsec 40G Ethernet network adapter ...

Supermicro launches BigTwin - the industry's highest performing Twin multi-node system supporting the full range of CPUs, maximum memory and all-flash NVMe ...

Applications

Researchers catch extreme waves with higher-resolution modelling ...

Researchers are creating software to 'clean' large datasets, making it easier for scientists and the public to use Big Data ...

Designing new materials from 'small' data ...

Success by deception ...

DNA computer brings 'intelligent drugs' a step closer ...

'Lossless' metamaterial could boost efficiency of lasers and other light-based devices ...

Perimeter Institute researchers apply machine learning to condensed matter physics ...

When treating brain aneurysms, two isn't always better than one ...

Real-time MRI analysis powered by supercomputers ...

Analyzing data for transportation systems using TACC's Rustler, XSEDE ECSS support ...

NCSA facilitates performance comparisons with China's no. 1 supercomputer ...

IBM delivers Watson for cyber security to power cognitive security operations centres ...

The Cloud

Optimizing data centre placement and network design to strengthen Cloud computing ...

Dutch start-up solution impacts data centres ...

OpenFog Consortium releases landmark reference architecture for Fog computing ...

IBM brings machine learning to the private Cloud ...

IBM accelerates hybrid Cloud adoption by enabling channel partners to offer VMware solutions ...

Oracle launches Cloud service to help organisations integrate disparate data and drive real-time analytics ...

Researchers catch extreme waves with higher-resolution modelling

Figure: The maximum wave heights in the time series show differences in storm characteristics, including the presence or absence of tropical cyclones, when different resolutions are used. At a resolution of 25 km (bottom panel), the dark storm-track lines are much narrower and more frequent, particularly in areas such as the central and western Pacific where tropical cyclones are influential. Many of these storm lines are wider or even absent in the 100-km case (top panel). Credit: Ben Timmermans/Berkeley Lab.

15 Feb 2017, Berkeley - Surfers aren't the only people trying to catch big waves. Scientists at the Department of Energy's Lawrence Berkeley National Laboratory are trying to do so, too, at least in wave climate forecasts.

Using decades of global climate data generated at a spatial resolution of about 25 kilometers, researchers were able to capture the formation of tropical cyclones, also referred to as hurricanes and typhoons, and the extreme waves that they generate. The same models, when run at a resolution of about 100 kilometers, missed both the tropical cyclones and the big waves, some up to 30 meters high.

Their findings, published in the February 16 issue of Geophysical Research Letters, demonstrate the importance of running climate models at higher resolution. Better predictions of how often extreme waves will hit are important for coastal cities, the military, and industries that rely upon shipping and offshore oil platforms. And, of course, for surfers.

"It's well known that to study tropical cyclones using simulations, the models need to be run at high resolution", stated study lead author and postdoctoral fellow Ben Timmermans. "The majority of existing models used to study the global climate are run at resolutions that are insufficient to predict tropical cyclones. The simulations in our study are the first long-duration global data sets to use a resolution of 25 kilometers. It's also the first time a study has specifically examined the impact of resolution increase for ocean waves at a global climatological scale."

The other authors on this study are Dáithí Stone, Michael Wehner, and Harinarayan Krishnan. All authors are scientists in Berkeley Lab's Computational Research Division (CRD).

Climate models work by dividing the globe into a grid and simulating the exchange of air, water, and energy between the grid "boxes". In today's state-of-the-art climate models, these boxes are typically 100 to 200 kilometers wide. That level of detail is good enough to catch the formation and movement of midlatitude storms, the researchers said, because such systems tend to be quite large.
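To put those box sizes in perspective, here is a rough back-of-the-envelope sketch, our illustration rather than anything from the study, of how many grid boxes are needed to tile the globe at a given resolution:

```python
EARTH_SURFACE_KM2 = 510e6  # Earth's surface area, roughly 510 million km^2

def approx_grid_boxes(resolution_km: float) -> int:
    """Rough number of square boxes of the given width needed
    to cover the globe (ignores map projection details)."""
    return round(EARTH_SURFACE_KM2 / resolution_km ** 2)

for res_km in (200, 100, 25):
    print(f"{res_km:>3} km resolution: ~{approx_grid_boxes(res_km):,} boxes")
# 200 km resolution: ~12,750 boxes
# 100 km resolution: ~51,000 boxes
#  25 km resolution: ~816,000 boxes
```

Refining from 100 km to 25 km alone multiplies the number of boxes by 16, before accounting for the shorter time step discussed below.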

In contrast, tropical cyclones tend to cover a smaller area. While the overall footprint of a hurricane can be broad, the eye of a hurricane can be very compact and well defined, the researchers noted.

"The problem with that 100-kilometer resolution is that it misses key details of the hurricanes and tropical cyclones, which are clearly relevant to the generation of extreme waves", stated Dáithί Stone. "But going to a 25-kilometer resolution data set is computationally challenging. It requires 64 times more computational resources than a 100-kilometer simulation."

The study relied upon the data-crunching power of the National Energy Research Scientific Computing Center (NERSC), a scientific computing user facility funded by the DOE Office of Science and based at Berkeley Lab.

The researchers ran the Community Atmosphere Model version 5 (CAM5) at a low resolution of 100 kilometers and at a high resolution of 25 kilometers, with output saved in three-hour increments. They found that the high-resolution simulations produced tropical cyclones where the low-resolution ones did not.

To see whether the cyclones had an effect on waves, they then ran global wave models at both resolutions. They saw extreme waves in the high-resolution runs that did not appear in the low-resolution ones.
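As a minimal illustration of that kind of comparison, and emphatically not the authors' analysis code, the sketch below takes 3-hourly significant-wave-height series from two runs (random stand-in data here), extracts the maximum at each grid point, and flags locations where only the high-resolution run produces an extreme; the array names and the 15 m threshold are our own choices:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Stand-in 3-hourly significant wave height [m] from the two wave runs,
# shaped (time, lat, lon); one year of 3-hourly steps is 365 * 8 = 2920.
swh_100km = rng.gamma(shape=2.0, scale=1.5, size=(2920, 30, 60))
swh_25km = rng.gamma(shape=2.0, scale=1.8, size=(2920, 30, 60))

# Maximum wave height over the whole series at each grid point.
max_100 = swh_100km.max(axis=0)
max_25 = swh_25km.max(axis=0)

# Locations where only the high-resolution run exceeds an
# (arbitrary) extreme-wave threshold.
THRESHOLD_M = 15.0
only_high_res = (max_25 >= THRESHOLD_M) & (max_100 < THRESHOLD_M)
print(f"Extremes present only at 25 km: {only_high_res.sum()} grid points")
```

In the study itself, the wave fields come from global wave models forced by the CAM5 winds rather than from random draws.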

"Hurricanes are tricky things to model", stated Dáithί Stone. "We've shown the importance of using a high-resolution data set for producing hurricanes. But the characteristics of hurricanes could change with the climate. People are making projections of changes in ocean waves in a future, warmer world. It's not clear if the 25-kilometer resolution is sufficient for capturing all of the processes involved in the development of a hurricane. But we do know that it's better than 100 kilometers."

While additional high-resolution simulations of the future are on the way, the researchers were able to take a first look at possible conditions at the end of the 21st century. Michael Wehner noted that the biggest waves in Hawaii are projected to be substantially larger in a much warmer future world.

The researchers added that this study only looked at averages of wind-generated waves. One-off "rogue" or "freak" waves cannot be reproduced in these kinds of models, and large waves such as tsunamis are very different since they are caused by seismological activity, not the wind.

The data from this study will be made freely available for use by the wider scientific community.

"In the same way that weather patterns are part of the climate, ocean wave patterns are also part of the 'wave' climate", stated Ben Timmermans. "Ocean waves are relevant to the interaction between the ocean and the atmosphere, which affects the planet's climate as a whole."

This work was supported by DOE's Office of Science.
Source: DOE/Lawrence Berkeley National Laboratory
