
Primeur weekly 2012-03-05

The Cloud

Australian National University selects SGI to help harness explosion in data ...

Big science teams up with big business to kick‐start European Cloud computing ...

AMD to acquire SeaMicro: accelerates disruptive server strategy ...

HP launches Enterprise Security Cloud Connections partner programme ...

Dell advances public Cloud security with Trend Micro SecureCloud solutions ...

EuroFlash

Stories from the Grid launched at International Symposium on Grids and Clouds ...

IBM selected to build new supercomputer of the Max Planck Society ...

A supercomputer for research: University of Oldenburg to upgrade technical infrastructure ...

Workshop Call for Abstracts: Collaborative research using eScience infrastructure and high speed networks ...

USFlash

Cray's YarcData Division launches new Big Data graph appliance ...

Cray reports 2011 full year and fourth quarter financial results ...

National Institute of Genetics adopts SGI HPC solution for new supercomputer system ...

NASA scales SGI Pleiades InfiniBand Cluster to 25,000 Intel Xeon processor cores ...

SGI announces new President and CEO Jorge Luis Titinger ...

IBM Research advances device performance for quantum computing ...

ORNL completes first phase of Titan supercomputer transition ...

Researchers harness Kraken supercomputer to model explosions via transport ...

Climate scientists compute in concert ...

High-Performance Computing customers gain greater capabilities to address Big Data requirements through NetApp's collaboration with OpenSFS and EOFS communities ...

New supercomputer to accelerate Taiwan's Render Farm ...

On the path to 1 terabit-per-second networks ...

XSEDE's Extended Collaborative Support programme shares insights via symposium series ...

IBM captures leadership position in worldwide server market in fourth quarter of 2011 ...

Illinois supercomputers' expertise to help determine winner of genomics prize ...

SDSC and UC Santa Cruz to host summer school on astroinformatics ...

Climate scientists compute in concert

27 Feb 2012 Oak Ridge - Researchers at Oak Ridge National Laboratory (ORNL) are sharing computational resources and expertise to improve the detail and performance of a scientific application code that is the product of one of the world's largest collaborations of climate researchers. The Community Earth System Model (CESM) is a mega-model that couples components of atmosphere, land, ocean, and ice to reflect their complex interactions. By continuing to improve the scientific representations and numerical methods in the simulations, and by exploiting modern computer architectures, researchers expect to further improve the CESM's accuracy in predicting climate change. Achieving that goal requires teamwork and coordination rarely seen outside a symphony orchestra.
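
The coupling described above can be pictured with a minimal sketch: independent component models advance in time and exchange boundary fields through a central coupler. The Python toy below is purely illustrative; the component names, the single exchanged field, and the averaging step are assumptions for clarity and do not represent the actual CESM coupler interfaces.

# Toy sketch of a coupled Earth system model time loop: each component
# advances one coupling interval, then a central coupler merges what the
# components report and hands the merged field back out. Illustrative
# only; not the real CESM/CPL7 interfaces.

class Component:
    def __init__(self, name):
        self.name = name
        self.state = {"surface_temp": 288.0}   # toy state, in kelvin

    def step(self, forcing):
        # A real component solves its own physics; here we just nudge the
        # toy state toward the forcing received from the coupler.
        self.state["surface_temp"] += 0.1 * (forcing - self.state["surface_temp"])
        return self.state["surface_temp"]

def run_coupled(n_steps=5):
    components = [Component(n) for n in ("atmosphere", "land", "ocean", "sea_ice")]
    shared_field = 288.0                        # field passed through the coupler
    for step in range(n_steps):
        reports = [c.step(shared_field) for c in components]
        shared_field = sum(reports) / len(reports)
        print(f"step {step}: coupler field = {shared_field:.3f} K")

if __name__ == "__main__":
    run_coupled()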

"Climate is a complex system. We're not solving one problem, but a collection of problems coupled together", stated ORNL computational Earth scientist Kate Evans. Of all the components contributing to climate, ice sheets such as those covering Greenland and Antarctica are particularly difficult to model - so much so that the Intergovernmental Panel on Climate Change (IPCC) could not make any strong claim about the future of large ice sheets in its 2007 Assessment Report, the most recent to date.

Kate Evans and her team began the Scalable, Efficient, and Accurate Community Ice Sheet Model (SEACISM) project in 2010 in an effort to fully incorporate a three-dimensional, thermomechanical ice sheet model called Glimmer-CISM into the greater CESM. The research is funded by the Department of Energy's (DOE's) Office of Advanced Scientific Computing Research (ASCR). Once fully integrated, Glimmer-CISM will be able to exchange information with the other CESM component codes, making it the first fully coupled ice sheet model in the CESM.

Through the ASCR Leadership Computing Challenge, the team of computational climate experts at multiple national labs and universities received allocations of processor hours on the Oak Ridge Leadership Computing Facility's (OLCF's) Jaguar supercomputer, which is capable of 2.3 petaflops, or 2.3 thousand trillion calculations per second.

Kate Evans said the team is on track to have the code running massively parallel by October 2012. So far, simulations of a small test problem have employed 1,600 of Jaguar's 224,000 processors. She expects that number to expand substantially in the near future when the team begins simulating larger problems with greater realism.

The CESM began as the Community Climate Model in 1983 at the National Center for Atmospheric Research (NCAR) as a means to model the atmosphere computationally. In 1994, NCAR scientists pitched to the National Science Foundation (NSF) the idea of expanding their model to include realistic simulations of other components of the climate system. The result was the Climate System Model - adding land, ocean, and sea ice component models - which was renamed the Community Climate System Model (CCSM) to recognize the many contributors to the project. Development of the CCSM also benefitted from DOE and National Aeronautics and Space Administration (NASA) expertise and resources.

The CCSM developed into the CESM as its complexity increased. Today the model is a computational collection of the Earth's oceans, atmosphere, and land, as well as ice covering land and sea. Its components calculate in chorus. "The model is about getting a higher level of detail, improving our accuracy, and decreasing the uncertainty in our estimates of future changes", stated Los Alamos National Laboratory climate scientist Phil Jones, who leads that laboratory's Climate, Ocean, and Sea Ice Modelling group. The group develops the first-principles ocean, sea ice, and ice sheet components of the CESM. Its members are interested in sea-level rise, high-latitude climate, and changes in ocean thermohaline circulation - the ocean's transport of heat and salt around the globe.

The team is transitioning its ocean code from the Parallel Ocean Program to the Model for Prediction Across Scales-Ocean (MPAS-Ocean) code. Unlike the Parallel Ocean Program, MPAS-Ocean is a variable-resolution, unstructured-grid model. It will allow researchers to sharpen simulation resolution on a regional scale when they want to look at climate impacts in particular localities.
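
The idea behind a variable-resolution mesh can be sketched with a toy spacing function: cells are fine inside a region of interest and coarsen smoothly away from it. The focus point, cell sizes, and blending rule below are assumptions for illustration; MPAS-Ocean builds genuine unstructured meshes on the sphere rather than the simple spacing rule shown here.

# Conceptual sketch of regional refinement: target cell size is `fine` km
# near a chosen focus point and relaxes to `coarse` km far from it.
import math

def cell_spacing_km(lat, lon, focus=(36.0, -76.0), fine=15.0, coarse=120.0, radius=20.0):
    """Return a target cell size in km: fine near the focus point, coarse far away."""
    dist = math.hypot(lat - focus[0], lon - focus[1])   # crude distance in degrees
    blend = min(dist / radius, 1.0)                     # 0 near the focus, 1 far away
    return fine + (coarse - fine) * blend

# Example: resolution sharpened near a hypothetical coastal study region.
for lon in range(-80, -40, 10):
    print(lon, round(cell_spacing_km(36.0, lon), 1), "km")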

The CESM has continually grown in intricacy, enabling researchers to calculate more detail over larger spatial scales and longer time scales. Further, researchers are able to introduce more complex physics variables and simulate in greater detail Earth's biogeochemical components - chemical and ecosystem impacts on the climate. But these advances do not come without a price.

"At any given time when we're integrating the model forward, it's very (computationally) expensive - very time consuming and using large amounts of memory", Kate Evans stated. "It all has to work in concert to generate huge amounts of data that we then need to analyze afterward."

Researchers and code developers for the CESM are scattered around the United States. One of their biggest challenges is tying together separate climate code components created in different places on different computer architectures. What's more, climate researchers usually focus on a specific aspect of the climate, such as the ocean or the atmosphere. An atmosphere scientist, for example, needs the ability to raise the resolution in the atmosphere but may want to lower the resolution used in the ocean to minimize the computational cost of the simulation. ORNL computer scientist Patrick Worley helps researchers optimize their codes. That makes him one of a small group in the CESM community who conduct the climate code orchestra.

"There are many scientific issues with getting simulations right, and the computer scientists are involved with helping the scientists test and optimize them", stated Patrick Worley. He serves as a co-chair in the CESM software engineering working group, which is dedicated to solving the unique challenges climate research imposes on computing resources.

A number of computational issues distinguish climate science from other simulation-heavy scientific disciplines, Patrick Worley said. First, when researchers change the problem size, they must also rework the physical processes represented in the simulation to suit the correspondingly higher or lower resolution. Second, many researchers focus only on particular regions of the Earth, meaning they need only part of the CESM framework running at high resolution. Finally, climate simulations run at varying time scales, sometimes spanning several thousand years; the required time to solution and the available computing resources can force researchers to choose between high spatial resolution and an extended simulated period.
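
That last trade-off can be made concrete with back-of-the-envelope arithmetic: for a fixed allocation of core-hours, refining the horizontal resolution shrinks the simulated period a team can afford. The cubic cost scaling (two horizontal dimensions plus a proportionally smaller time step) and the baseline cost below are assumptions for illustration, not measured CESM figures.

# Rough cost model: core-hours per simulated year grow roughly with the
# cube of the horizontal refinement factor (assumed scaling, not a CESM
# benchmark). A fixed budget then caps how many years can be simulated.

def core_hours(resolution_deg, simulated_years, base_cost=1.0e3):
    """Estimated core-hours, with base_cost per simulated year at 1-degree resolution."""
    refinement = 1.0 / resolution_deg           # e.g. 0.25 degrees -> 4x refinement
    return base_cost * refinement**3 * simulated_years

budget = 5.0e6                                  # hypothetical allocation in core-hours
for res in (1.0, 0.5, 0.25):
    affordable_years = budget / core_hours(res, 1.0)
    print(f"{res:>5} deg: ~{affordable_years:,.0f} simulated years within the budget")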

According to Kate Evans, computer scientists like Patrick Worley help climate scientists deal with that complexity in the coupled model. "Pat does bridge across many components. He can look at an ice code, which solves very different equations and is structured in a very different way than an atmospheric model, and be able to help us run both systems not only individually at their maximum ability but coupled together", she stated. "There are few in the climate community that have an understanding of how all of it works."

With software improvements and increasingly more powerful supercomputers, resolution and realism in climate simulations have reached new heights. But there is still work to be done. "What we can't do yet in the current model is get down to regional spatial scales so we can tell people what specific impacts are going to happen locally", stated Phil Jones. "The current IPCC simulations are at coarse enough resolution that we can only give people general trends."

Climate research that concluded in 2010 makes up the final pieces of information that will go into the next IPCC Assessment Report, due for release in 2013. Meanwhile, climate researchers are preparing for the future. "The climate research community doesn't have a single climate center where they run all of their climate simulations", Patrick Worley stated. "They run wherever there is supercomputing time available. So it's important for the codes to run on as many different platforms as possible." Currently, US climate codes are running on National Oceanic and Atmospheric Administration, NASA, DOE, and NSF supercomputing resources.

One of the biggest challenges facing climate research, according to Worley, is writing the computational score for hybrid architectures. Next-generation supercomputers, such as the OLCF's Titan, a 10-20 petaflop machine, will use both central processing units and graphics processing units to share the computational workload. This novel approach will require closer attention to all levels of parallelism and will alter the approach to computing climate. "There are a number of people that want to make sure we are not surprised by the new machines", Patrick Worley stated. If all goes well, CESM researchers may hear calls for an encore.
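
The point about exploiting all levels of parallelism can be sketched in miniature: the global domain is first decomposed into subdomains (the distributed-memory level), and the work inside each subdomain is written as bulk array operations of the kind a GPU or vector unit can execute (the accelerator level). The decomposition, the stencil, and the array library used below are generic assumptions, not the actual CESM or Titan programming model.

# Two-level parallelism sketch: 4x4 domain decomposition across "nodes",
# with node-local updates expressed as whole-array operations that an
# accelerator could take over. Illustrative only.
import numpy as np

def step_subdomain(block):
    # Simple 4-point averaging stencil written as array operations
    # (periodic within the block for brevity).
    return 0.25 * (np.roll(block, 1, 0) + np.roll(block, -1, 0) +
                   np.roll(block, 1, 1) + np.roll(block, -1, 1))

field = np.random.rand(512, 512)                 # toy global field
subdomains = [np.array_split(row, 4, axis=1)     # 4x4 decomposition across "nodes"
              for row in np.array_split(field, 4, axis=0)]

# In a real model each subdomain would be advanced by a different MPI rank,
# with halo exchanges between neighbours; here they are stepped in a loop.
updated = [[step_subdomain(b) for b in row] for row in subdomains]
field = np.block(updated)
print("updated field mean:", field.mean())
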
Source: Oak Ridge National Laboratory
