
Primeur weekly 2016-03-29

Crowd computing

Tuberculosis to be tackled using crowd computing power ...

Quantum computing

Unlocking the gates to quantum computing ...

Focus on Europe

ISC to announce the 2016 Hans Meuer Award winning research paper ...

PRACE Ada Lovelace Award for HPC nominations are now open ...

Middleware

Bright Computing names Dan Kuczkowski as Senior Vice President of Worldwide Sales ...

Hardware

Intersect360 Research to present 8th HPC Budget Allocation Map: Budget Expectations ...

SC16 Student Cluster Competition still open till April 15, 2016 ...

Mellanox announces first 200Gb/s silicon photonics devices, doubling the performance in the QSFP form factor ...

Mellanox and InnoLight announce the availability and interoperability of 100Gb/s PSM4 transceivers at 1310 and 1550nm wavelengths ...

Lawrence Livermore National Laboratory and IBM collaborate to build brain-inspired supercomputer ...

Applications

KAUST IT Research Computing implements HPC Add-ons for research applications in order to automate the access to HPC resources ...

Could material defects actually improve solar cells? ...

NERSC announces 4th annual HPC Achievement Award winners ...

Research powers ahead with new supercomputer at University of Adelaide ...

Predicting severe hail storms ...

Honda selects IBM Watson IoT technology enabling real-time racing decisions for Formula One power unit operation ...

New computational method reveals significant degeneration of knee cartilage in overweight people ...

Breakthrough technology to improve cyber security ...

Researchers to develop new computing technology to minimize risks of data breaches ...

Record-speed data transmission could make Big Data more accessible ...

Robust network of connections between neurons performing similar tasks shows fundamentals of how brain circuits are wired ...

The Cloud

IBM opens Bluemix Garage in Europe to fuel Cloud development ...

GEANT and Amazon Web Services - breaking down barriers to cloud services adoption ...

Oracle unveils suite of breakthrough services to help simplify Cloud adoption by global corporations ...

Predicting severe hail storms


Radar imagery from 6:56 p.m. shows a close-up of the Mayfest supercell centred west of Benbrook, Texas. The pink and darkest red colours represent radar indications of large hail associated with this storm. The storm impacted the Mayfest festival at 7:10 p.m.
22 Mar 2016 Arlington - When a hail storm moved through Fort Worth, Texas, on May 5, 1995, it battered the highly populated area with hail up to 4 inches in diameter and struck a local outdoor festival known as the Fort Worth Mayfest. The Mayfest storm was one of the costliest hailstorms in U.S. history, causing more than $2 billion in damage and injuring at least 100 people.

Scientists know that storms with a rotating updraft on their southwestern sides - which are particularly common in the spring on the U.S. southern plains - are associated with the biggest, most severe tornadoes and also produce a lot of large hail. However, clear ideas on how they form and how to predict these events in advance have proven elusive.

A team based at the University of Oklahoma (OU), working on the Severe Hail Analysis, Representation and Prediction (SHARP) project, aims to solve that mystery, with support from the National Science Foundation (NSF).

Performing experimental weather forecasts using the Stampede supercomputer at the Texas Advanced Computing Center, researchers have gained a better understanding of the conditions that cause severe hail to form, and are producing predictions with far greater accuracy than those currently used operationally.

To predict hail storms, or weather in general, scientists have developed physics-based mathematical models of the atmosphere and the complex processes within it, implemented as computer codes that represent these processes on a grid of millions of points. The models are integrated forward in time, starting from the observed current conditions, to determine how a weather system will evolve and whether a serious storm will form.
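
As a rough illustration of what integrating a model forward in time on a grid means, the sketch below steps a simple two-dimensional advection equation forward with finite differences. It is a toy model only - the grid size, wind speeds and numerics are assumptions chosen for illustration, not the SHARP team's actual forecast code.

    import numpy as np

    # Toy forward integration on a grid: a 2-D linear advection equation
    # du/dt + cx*du/dx + cy*du/dy = 0, solved with first-order upwind
    # differences and periodic boundaries. Illustrative only.
    nx, ny = 200, 200          # grid points in each horizontal direction
    dx = dy = 500.0            # grid spacing in metres (cf. the 500 m grid)
    cx, cy = 10.0, 5.0         # assumed constant advection speeds (m/s)
    dt = 10.0                  # time step (s), chosen to satisfy a CFL limit

    u = np.zeros((nx, ny))
    u[90:110, 90:110] = 1.0    # initial disturbance in the centre of the domain

    def step(u):
        """Advance the field one time step with upwind differences."""
        dudx = (u - np.roll(u, 1, axis=0)) / dx
        dudy = (u - np.roll(u, 1, axis=1)) / dy
        return u - dt * (cx * dudx + cy * dudy)

    for n in range(360):       # integrate one simulated hour forward
        u = step(u)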

Because of the wide range of spatial and temporal scales that numerical weather predictions must cover, and the fast turnaround required, they are almost always run on powerful supercomputers. The finer the resolution of the grid used to simulate the phenomena, the more accurate the forecast - and the more computation required.

The National Weather Service's highest-resolution official forecasts use a grid spacing of three kilometers - one point for every three kilometers. The model the Oklahoma team is using in the SHARP project, on the other hand, uses one grid point for every 500 meters - six times finer in each horizontal direction.

"This lets us simulate the storms with a lot higher accuracy", stated Nathan Snook, an OU research scientist. "But the trade-off is, to do that, we need a lot of computing power - more than 100 times that of three-kilometer simulations. Which is why we need Stampede."

Stampede is currently one of the most powerful supercomputers in the U.S. for open science research and serves as an important part of NSF's portfolio of advanced cyberinfrastructure resources, enabling cutting-edge computational and data-intensive science and engineering research nationwide.

According to Nathan Snook, there's a major effort underway to move to a "warning on forecast" paradigm - that is, to use computer-model-based, short-term forecasts to predict what will happen over the next several hours and use those predictions to warn the public, as opposed to warning only when storms form and are observed.

"How do we get the models good enough that we can warn the public based on them?" Nathan Snook asked. "That's the ultimate goal of what we want to do - get to the point where we can make hail forecasts two hours in advance. 'A storm is likely to move into downtown Dallas, now is a good time to act'."

With such a system in place, it might be possible to prevent injuries to vulnerable people, divert or move planes into hangars and protect cars and other property.

To study the problem, the team first reviews the previous season's storms to identify the best cases to study. They then perform numerical experiments using new, improved techniques to see whether their models can predict these storms better than the original forecasts did. The idea is to ultimately transition the higher-resolution models they are testing into operation in the future.
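
The article does not say which verification measures the team uses to judge "better"; one common approach for yes/no severe-hail forecasts is a contingency-table score such as the critical success index. The sketch below applies it to made-up forecast and observation masks, purely for illustration.

    import numpy as np

    # Hypothetical verification step: compare a forecast hail mask against a
    # radar-derived observation mask using the critical success index (CSI).
    # The masks here are random placeholders, not real SHARP output.
    rng = np.random.default_rng(0)
    forecast_hail = rng.random((200, 200)) > 0.9
    observed_hail = rng.random((200, 200)) > 0.9

    hits = np.sum(forecast_hail & observed_hail)
    misses = np.sum(~forecast_hail & observed_hail)
    false_alarms = np.sum(forecast_hail & ~observed_hail)

    csi = hits / (hits + misses + false_alarms)
    print(f"CSI: {csi:.3f}")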

Now in the third year of their hail forecasting project, the researchers are getting promising results. Studying the storms that produced the May 20, 2013 Oklahoma-Moore tornado - which led to 24 deaths, destroyed 1,150 homes and resulted in an estimated $2 billion in damage - they developed zero- to 90-minute hail forecasts that captured the storm's impact better than the National Weather Service forecasts produced at the time.

"The storms in the model move faster than the actual storms", Nathan Snook stated. "But the model accurately predicted which three storms would produce strong hail and the path they would take."

The models required Stampede to solve multiple fluid dynamics equations at millions of grid points and also incorporate the physics of precipitation, turbulence, radiation from the sun and energy changes from the ground. Moreover, the researchers had to simulate the storm multiple times - as an ensemble - to estimate and reduce the uncertainty in the data and in the physics of the weather phenomena themselves.
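
The ensemble idea can be sketched in a few lines: perturb the starting state, run each perturbed copy through the model, and treat the spread across members as a measure of uncertainty. The stand-in "model" and perturbation size below are invented for illustration; a real system perturbs full three-dimensional atmospheric states and runs the complete forecast model for every member.

    import numpy as np

    # Toy ensemble forecast: each member starts from a slightly perturbed
    # initial state and is run through a stand-in "model"; the ensemble mean
    # and spread summarize the forecast and its uncertainty.
    rng = np.random.default_rng(1)
    n_members = 20
    analysis = np.sin(np.linspace(0, 2 * np.pi, 100))   # stand-in initial state

    def toy_model(state, steps=50):
        """Stand-in forecast model: repeated smoothing of the state."""
        for _ in range(steps):
            state = 0.5 * (np.roll(state, 1) + state)
        return state

    members = [toy_model(analysis + 0.05 * rng.standard_normal(analysis.shape))
               for _ in range(n_members)]

    ensemble_mean = np.mean(members, axis=0)    # best-estimate forecast
    ensemble_spread = np.std(members, axis=0)   # larger spread = more uncertainty
    print(ensemble_spread.mean())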

"Performing all of these calculations on millions of points, multiple times every second, requires a massive amount of computing resources", Nathan Snook stated.

The team used more than a million computing hours on Stampede for the experiments and additional time on the Darter system at the National Institute for Computational Sciences for more recent forecasts. The resources were provided through the NSF-supported Extreme Science and Engineering Discovery Environment (XSEDE) programme, which acts as a single virtual system that scientists can use to interactively share computing resources, data and expertise.

Though the ultimate impact of the numerical experiments will take some time to realize, its potential motivates Nathan Snook and the severe hail prediction team.

"This has the potential to change the way people look at severe weather predictions", Nathan Snook stated. "Five or 10 years down the road, when we have a system that can tell you that there's a severe hail storm coming hours in advance, and to be able to trust that - it will change how we see severe weather. Instead of running for shelter, you'll know there's a storm coming and can schedule your afternoon."

Ming Xue, the leader of the project and director of the Center for Analysis and Prediction of Storms (CAPS) at OU, gave a similar assessment.

"Given the promise shown by the research and the ever increasing computing power, numerical prediction of hailstorms and warnings issued based on the model forecasts, with a couple of hours of lead time, may indeed be realized operationally in a not-too-distant future, and the forecasts will also be accompanied by information on how certain the forecasts are."

The team published its results in the proceedings of the 20th Conference on Integrated Observing and Assimilation Systems for Atmosphere, Oceans and Land Surface (IOAS-AOLS); they will also be published in an upcoming issue of the American Meteorological Society journal Weather and Forecasting.

"Severe hail events can have significant economic and safety impacts", stated Nicholas F. Anderson, programme officer in NSF's Division of Atmospheric and Geospace Sciences. "The work being done by SHARP project scientists is a step towards improving forecasts and providing better warnings for the public."
Source: National Science Foundation
