Primeur weekly 2013-08-05

Special

There is room for two major supercomputer conferences each year ...

Historic Heidelberg setting for new Big Data Conference ...

Exascale supercomputing

Largest neuronal network simulation to date achieved using Japanese supercomputer ...

Designing a new operating system for exascale architectures ...

The Cloud

Over a hundred vulnerabilities were found in browsers in the course of The University of Oulu's effective data security testing programme ...

IBM unveils new PowerLinux system for analytics and Cloud computing ...

ESDS selects IBM PureSystems over HP and Dell for Cloud and Big Data offerings ...

EuroFlash

Walking on Water: projectiondesign powers interactive portable 360-degree Igloo experience at La Biennale di Venezia 2013 ...

SUSE predicts supercomputer capabilities to become part of mainstream IT for enterprise customers ...

GEANT's terabit upgrade gives European science the data network of the future ...

ADVA FSP 150 delivers sub-microsecond timing for high-frequency trading ...

Scientists realize quantum bit with a bent nanotube ...

USFlash

Stanford engineers receive award to improve supercomputing and solar efficiency ...

Fujitsu PRIMERGY computational power at Australian National University takes high capability Australian research to the world stage ...

NOAA's National Weather Service more than doubles computing capacity ...

Cray Inc. reports second quarter 2013 results ...

UCSC acquires powerful new astrophysics supercomputer system ...

20 years of TOP500 data show Linux's role in supercomputing breakthroughs ...

Secretary Moniz dedicates new supercomputer at the National Energy Technology Laboratory ...

Online tools accelerating earthquake-engineering progress ...

CSIR to launch new supercomputer ...

NIH commits $24 million annually for Big Data Centres of Excellence ...

NASA relies on RTI Connext DDS for Human Exploration Telerobotics Project ...

Omni Circuit Boards produces working aluminum trace circuit board for quantum computing applications ...

Computer scientists develop mathematical jigsaw puzzles to encrypt software ...

Stanford engineers receive award to improve supercomputing and solar efficiency


1 Aug 2013 Stanford - Some mathematical simulations used to predict the outcomes of real events are so complex that they stump even today's top supercomputers. To incubate the next generation of supercomputers for tackling real-world problems, the National Nuclear Security Administration (NNSA) has selected Stanford as one of its three new Multidisciplinary Simulation Centers. The Stanford effort, led by Gianluca Iaccarino, an associate professor of Mechanical Engineering and the Institute for Computational and Mathematical Engineering, will receive $3.2 million per year for the next five years under the NNSA's Predictive Science Academic Alliance Programme II (PSAAP II). The University of Utah and the University of Illinois at Urbana-Champaign were selected to house the other two centres.

Participants in PSAAP II will devise new computing paradigms within the context of solving a practical engineering problem. The Stanford project will work on predicting the efficiency of a relatively untested and poorly understood method of harvesting solar energy. The project will draw on expertise from the Mechanical Engineering, Aeronautics and Astronautics, Computer Science and Math departments.

Traditional solar-thermal systems use mirrors to concentrate solar radiation on a solid surface and transfer energy to a fluid, the first step toward generating electricity. In the proposed system, fine particles suspended within the fluid would absorb sunlight and directly transfer the heat evenly throughout the fluid. This technique would allow for higher energy absorption and transfer rates, which would ultimately increase the efficiency of the overall system.

A critical aspect of assessing this technique involves predicting uncertainty within the system. For example, the alignment of the mirrors is imprecise and the suspended particles aren't of uniform size. "We need to rigorously assess the impacts of these sensitivities to be able to compute the efficiency of a system like this", Gianluca Iaccarino stated. "There is currently no supercomputer in the world that can do this, and no physical model."
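The kind of assessment described above is often done by sampling the uncertain inputs and propagating them through a model of the system. The sketch below illustrates the idea with Monte Carlo sampling; the efficiency model, parameter names and uncertainty ranges are invented for illustration and are not the Stanford project's actual code or physics.

```python
import random

def efficiency(misalignment_mrad, particle_diameter_um):
    """Hypothetical toy model: efficiency degrades with mirror misalignment
    and with deviation from a nominal 10-micron particle diameter."""
    eff = 0.60
    eff -= 0.02 * misalignment_mrad ** 2
    eff -= 0.01 * abs(particle_diameter_um - 10.0)
    return max(eff, 0.0)

def monte_carlo_efficiency(n_samples=100_000, seed=42):
    """Propagate assumed input uncertainties to a predicted efficiency
    distribution by random sampling."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        misalignment = rng.gauss(0.0, 0.5)   # mrad; assumed spread
        diameter = rng.gauss(10.0, 2.0)      # microns; assumed spread
        samples.append(efficiency(misalignment, diameter))
    mean = sum(samples) / n_samples
    var = sum((s - mean) ** 2 for s in samples) / (n_samples - 1)
    return mean, var ** 0.5

mean, std = monte_carlo_efficiency()
print(f"predicted efficiency: {mean:.3f} +/- {std:.3f}")
```

The point of the exercise is the spread, not the mean: the standard deviation quantifies how much the unavoidable input variability limits confidence in the predicted efficiency. Real predictive-science codes do this over far more expensive physics models, which is part of why exascale resources are needed.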

To crack this problem, Stanford and the other programme participants will need to address the other part of the NNSA's directive: develop programming environments and computational approaches that, by 2018, target an exascale computer, a machine 1,000 times faster than today's fastest supercomputers. The overall task presents several challenges.

"The supercomputer paradigm has reached a physical apex", Gianluca Iaccarino stated. "Energy consumption is too high, the computers get too hot, and it's too expensive to compute with millions of commodity computers bundled together. Next generation supercomputers will have completely different architectures."

A particular design challenge facing engineers is that just one faulty processing unit will halt a simulation. An exascale computer consisting of millions of units will be prone to frequent failure, and so scientists will need to design intelligent backup systems that will allow the simulation to continue running even when one portion of the computer has crashed.

The ambiguity surrounding exascale architecture complicates matters on the software development end: Gianluca Iaccarino and his colleagues will essentially have to program blind for such a system. The programming models they develop will need to have enough flexibility to work on whatever computer model emerges as viable in 2018.

Stanford is particularly well suited to handling the task, Gianluca Iaccarino said. "Fifteen years ago, the Computer Science Department and the Mechanical Engineering Department joined forces to embrace massively parallel computing in solving challenging multi-physics engineering problems", he stated.

That relationship, Gianluca Iaccarino said, contributed to the NNSA continuing its 15-year collaboration with the university; Stanford is the only university to receive funding in every round since the programme's inception.

In concert with the simulation work, the researchers will operate a physical experiment of the solar collector to test the predictions and identify other critical sensitivities. Beyond the lab, Gianluca Iaccarino said that the overall effort will spur new graduate-level courses and workshops dedicated to exploring the intersection of computational science and engineering.

Stanford will also collaborate with five universities on the project: the University of Michigan, the University of Minnesota, the University of Colorado-Boulder, the University of Texas-Austin and the State University of New York-Stony Brook.
Source: Stanford University
