Primeur weekly 2011-10-03

Special

The perfect data manager ...

The Cloud

Cloud computing - current scenario, trends & key players, according to ReportsnReports ...

ROLF Group signs with HP to move technology infrastructure into the Cloud ...

IBM expands business partner initiative with new Cloud channel offering ...

Red Lambda's MetaGrid software transforms security and operations for customers with big data IT, network and Cloud infrastructures ...

Desktop Grids

Desktop Grid middleware XtremWeb-HEP 7.6.0 released ...

EuroFlash

Bull launches its new mainframe family of GCOS 7 systems leveraging Extreme Computing technologies ...

Airbus completes data centre transformation with HP PODs ...

"Efficient use of GPU-accelerators to solve large problems" contest third tage starts ...

Unique supercomputer complex presented at Tomsk State University ...

T-Platforms Company has become a Top50 leader in the Russian supercomputer list ...

USFlash

Scientists release most accurate simulation of the universe to date ...

Mongolia's National Agency of Meteorology and Environmental Monitoring orders a Cray XE6m supercomputer ...

How graphene's electrical properties can be tuned ...

Japan's KEK Research and IBM agree to develop powerful KEK central computer system ...

Intel Labs announces latest Science and Technology Center focused on next generation of pervasive computing ...

U.S. Department of Energy selects NetApp as the storage foundation for one of the world's most powerful supercomputers ...

Oracle achieves world record result with SPECjEnterprise2010 benchmark ...

Canon and Oracle join forces to integrate Canon's imaging technologies with Oracle ...

Oracle Utilities Meter Data Management running with Oracle Exadata Database Machine and Oracle Exalogic Elastic Cloud demonstrates extreme performance in processing data from smart meters ...

Oracle announces Hybrid Columnar Compression support for ZFS Storage Appliances and Pillar Axiom Storage Systems ...

Oracle unveils the world's fastest general purpose engineered system - the SPARC SuperCluster T4-4 ...

Oracle launches next generation SPARC T4 servers ...

Oracle's SPARC T4 servers deliver record-breaking performance results ...

SDSC and SDSU share in $4.6 million NSF grant to simulate earthquake faults ...

Stampede charges computational science forward in tackling complex societal challenges ...

SDSC and SDSU share in $4.6 million NSF grant to simulate earthquake faults

23 Sep 2011 San Diego - Researchers from the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, and San Diego State University (SDSU) will be assisting researchers from six other universities and the U.S. Geological Survey (USGS) to develop detailed, large-scale computer simulations of earthquake faults under a new $4.6 million National Science Foundation (NSF) grant.

The computer simulations will use Gordon, SDSC's innovative new supercomputer, which is scheduled to officially enter production in January 2012. The result of a five-year, $20 million NSF award, Gordon is the first high-performance supercomputer to use large amounts of flash-based solid state drive (SSD) storage. Flash memory is common in smaller devices such as mobile phones and laptop computers, but remains unusual in supercomputers, which generally rely on slower spinning-disk technology.

SDSC recently took delivery of Gordon's flash-based I/O nodes and is providing access to early users for benchmarking and testing.

The five-year earthquake simulation project is being led by the University of California, Riverside (UCR), and also includes researchers from the University of Southern California (USC), Brown University, and Columbia University. Scientists will develop and apply the most capable earthquake simulators available to investigate plate boundary fault systems, focusing first on the North American plate boundary and the San Andreas system of Northern and Southern California.

Such systems occur where the world's tectonic plates meet, and control the occurrence and characteristics of the earthquakes they generate. The simulations can also be performed for other earthquake-prone areas where there is sufficient empirical knowledge of the fault system geology, geometry and tectonic loading.

"Observations of earthquakes go back to only about 100 years, resulting in a relatively short record", stated James Dieterich, a distinguished professor of geophysics in UCR's Department of Earth Sciences, and principal investigator of the project. "If we get the physics right, our simulations of plate boundary fault systems - at a one-kilometer resolution for California - will span more than 10,000 years of plate motion and consist of up to a million discrete earthquake events, giving us abundant data to analyze."

The simulations will provide the means to integrate a wide range of observations from seismology and earthquake geology into a common framework, according to James Dieterich. "The simulations will help us better understand the interactions that give rise to observable effects", he stated. "They are computationally fast and efficient, and one of the project goals is to improve our short- and long-term earthquake forecasting capabilities."
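
As an illustration of how such a long synthetic catalog could feed forecasting work, the sketch below fits a Gutenberg-Richter b-value and a mean recurrence interval from a randomly generated catalog. The catalog itself, the magnitude threshold, and the use of Aki's maximum-likelihood estimator are assumptions made for illustration; they are not taken from the project.

```python
# Minimal sketch: recurrence statistics from a synthetic earthquake catalog.
# The catalog below is randomly generated for illustration only.
import numpy as np

rng = np.random.default_rng(42)

M_MIN = 5.0          # assumed completeness magnitude
B_TRUE = 1.0         # b-value used to generate the synthetic catalog
N_EVENTS = 100_000   # arbitrary catalog size
SPAN_YEARS = 10_000  # catalog duration, matching the quoted simulation span

# Magnitudes from an exponential (Gutenberg-Richter-like) distribution,
# event times uniform over the catalog span.
magnitudes = M_MIN + rng.exponential(1.0 / (B_TRUE * np.log(10)), N_EVENTS)
years = np.sort(rng.uniform(0, SPAN_YEARS, N_EVENTS))

# Aki (1965) maximum-likelihood estimate of the b-value.
b_hat = 1.0 / (np.log(10) * (magnitudes.mean() - M_MIN))
print(f"Estimated b-value: {b_hat:.2f}")

# Mean recurrence interval of M >= 7 events across the whole synthetic system.
big_event_times = years[magnitudes >= 7.0]
print(f"M>=7 events: {big_event_times.size}, "
      f"mean recurrence: {np.diff(big_event_times).mean():.1f} years")
```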

More accurate forecasting has practical advantages - earthquake insurance, for example, relies heavily on forecasts. More importantly, better forecasting can save lives and prevent injuries.

SDSC/UC San Diego researchers participating in this project include Yifeng Cui, director of the High Performance GeoComputing Laboratory at SDSC, and Dong Ju Choi, a senior computational scientist with the same laboratory. Other researchers include Steve Day and Kim Olsen from SDSU, David Oglesby and Keith Richards-Dinger from UCR, Terry Tullis from Brown, Bruce Shaw from Columbia, Thomas Jordan from USC, Ray Wells and Elizabeth Cochran from USGS, and Michael Barall from Invisible Software.

"The primary computational development for this effort is to enable an existing earthquake simulator developed at UCR to run efficiently on supercomputers with tens of thousands of cores", stated SDSC's Yifeng Cui, who recently participated in a project to create the most detailed simulation ever of a Magnitude 8.0 earthquake in California, whose related code will be used in the UCR project for detailed single event rupture calculations.

"Simulations on this scale have been made possible by concurrent advances on two fronts: in our scientific understanding of the geometry, physical properties, and dynamic interactions of fault systems across a wide range of spatial and temporal scales; and in the scale of available computational resources and the methodologies to use them efficiently", stated SDSU's Steve Day, a researcher specializing in dynamic rupture simulation. Steve Day also noted that the project will enrich research opportunities for students in the SDSU/UCSD Joint Doctoral Program in Geophysics, an important focus of which is to advance the scientific understanding of earthquake hazards.

The overall earthquake simulation project will require tens of millions of hours of supercomputer processing time, according to Yifeng Cui, who plans to conduct some of these simulations on Gordon, slated to go into production in January 2012.

Capable of performing in excess of 200 teraflops, with a total of 64 terabytes (TB) of memory and 300 TB of high-performance solid state drives served via 64 I/O nodes, Gordon is designed for data-intensive applications spanning domains such as genomics, graph problems, and data mining, in addition to geophysics. Gordon will be capable of handling massive databases while providing speeds up to 100 times faster than hard disk drive systems for some queries.
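
The arithmetic behind some of those figures is simple to check. The sketch below derives the flash capacity per I/O node from the numbers above and shows where a roughly 100-fold gap for seek-bound queries can come from, using assumed, typical 2011-era latency values rather than measured Gordon figures.

```python
# Quick arithmetic on the Gordon figures quoted above.
TOTAL_FLASH_TB = 300
IO_NODES = 64
print(f"Flash per I/O node: {TOTAL_FLASH_TB / IO_NODES:.1f} TB")

# Assumed, typical latencies (illustrative, not measured Gordon values):
HDD_SEEK_MS = 10.0   # average seek + rotation for a 7,200 rpm disk
SSD_READ_MS = 0.1    # random read latency for enterprise flash
print(f"Random-access latency ratio: {HDD_SEEK_MS / SSD_READ_MS:.0f}x")
```
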
Source: University of California - San Diego
