Primeur weekly 2012-07-02

Exascale supercomputing

Ultra fast supercomputers at UK lab will better prepare us for severe weather and save millions of pounds ...

The Cloud

HP expands t410 Smart Zero Client family ...

Opscode announces integration with Google Compute Engine ...

Red Hat Cloud Ecosystem Gains Global Momentum ...

Red Hat to acquire FuseSource ...

Imperial College London and University of Cambridge launch CORE to deliver unrivalled UK e-Infrastructure capability to industry ...

Valeo chooses Agarik to host secure portal to its office automation Cloud ...

IEEE brings Cloud computing expertise and user resources together to foster worldwide collaboration and innovation ...

Desktop Grids

One billion results returned by World Community Grid volunteers ...

EuroFlash

French Ministry of Culture and Communication gives Bull its approval to preserve public archives on digital media ...

STFC's Joule in the crown is UK's most powerful supercomputer ...

PRACE looks back on successful Scientific Conference at ISC'12 ...

CERN to give update on Higgs search as curtain raiser to ICHEP conference ...

Bull-Joseph Fourier Prize 2012 recognizes three scientific teams for their advances in research and innovation ...

USFlash

Cray to add Intel Xeon Phi coprocessors to its next-generation Cascade supercomputer ...

Cray signs $40 million supercomputer agreement with the National Energy Research Scientific Computing Center (NERSC) ...

YarcData kicks off the $100,000 Graph Analytics Challenge and announces contest judges ...

Graph500 adds new measurement of supercomputing performance ...

Health care publisher Lifescript selects HP 3PAR Storage to expand offerings and enhance customer service ...

IBM and Lawrence Livermore researchers form Deep Computing Solutions Collaboration to help boost industrial competitiveness ...

RIKEN and Fujitsu complete operational testing of the K computer ...

DataDirect Networks powers industrial innovation with NCSA Private Sector Programme ...

Fujitsu wins supercomputer bid from Taiwan's Central Weather Bureau ...

Reaching and researching between stars ...

BGI demonstrated genomic data transfer at nearly 10 gigabits per second between US and China ...

Reaching and researching between stars

26 Jun 2012 Austin - From Earth, observers use telescopes to study distant luminous objects. But the telescope often isn't the only instrument involved. Karl Gebhardt, professor of astrophysics at the University of Texas at Austin and one of the principal investigators for the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX) project, makes revolutionary discoveries about dark matter by combining deep-space observations with the powerful Lonestar supercomputer at the Texas Advanced Computing Center (TACC).

Dark matter exerts a gravitational pull on matter in a galaxy, including the stars that orbit the galaxy's centre. Since dark matter neither emits nor absorbs light or other electromagnetic radiation, it cannot be seen directly with telescopes. However, through indirect evidence, scientists estimate that dark matter constitutes about 83% of the matter in the universe and about 23% of its total mass-energy; the two figures are consistent, since matter of all kinds accounts for only roughly 28% of the universe's mass-energy budget.

This represents a significant portion of the universe. For that reason, astronomers like Karl Gebhardt feel compelled to learn more about dark matter, its influences on the formation of galaxies, and its effects on the structure of the cosmos.

"We believe dark matter is a new type of particle that has yet to be discovered", Karl Gebhardt stated. "In a lot of our experiments, we hone in on it, even though we don't know its nature yet."

To detect dark matter, researchers collect data on the motions of stars. These data drive the simulations and provide a way to separate the gravitational effects of dark matter from those of the visible matter in a galaxy.

Karl Gebhardt works with two teams, one at the McDonald Observatory, a research unit of the University of Texas at Austin, and the other at the National Aeronautics and Space Administration (NASA). The data collection involves the Mitchell Spectrograph, an instrument on the McDonald Observatory's 2.7-meter telescope, and NASA's Hubble Space Telescope. Based on the data he receives, Karl Gebhardt builds computer models and maps to represent the distribution of dark matter throughout different galaxies.

Telescopes are time-travelling devices, enabling scientists to see earlier eras of the cosmos. But astronomers cannot look back far enough to view the development of the early universe directly, so theoretical models and computer simulations remain a significant element of current research.

For a long time, a discrepancy persisted between the distribution of dark matter inferred from observations and the one predicted by computational models. "We are trying to put that to rest by making a definitive study of how the dark matter is distributed", Karl Gebhardt stated.

Dark matter tends to lie at the edges of a galaxy, beyond its visible components. This means simulations to explore dark matter cannot be too localized and need to account for an almost unfathomable number of elements. About a hundred billion galaxies are observable, and each has on the order of ten billion stars. So there are a lot of elements to study, Karl Gebhardt said.

"The large number of data sets require a huge computer programme that can basically mimic a galaxy", Karl Gebhardt stated. "That's why we need a supercomputer."

In 2004, Karl Gebhardt received his first allocation on the original Lonestar supercomputer at TACC. As TACC's computational resources have grown, Karl Gebhardt's simulations have also continued to advance. Now, his research teams include about a dozen researchers around the world.

"Before using TACC resources, I would run the data on my computer, crunching continuously, but it would take me a month just to process the data sets of one galaxy", Karl Gebhardt stated. "Now it takes about two hours."

Using Lonestar, Karl Gebhardt creates nearly 100,000 different models of one galaxy, representing the range of possible ways stars can move throughout a galaxy.

Observations show that stars orbit the centre of a galaxy at roughly the same speed regardless of their distance from the centre, whereas the gravity of the visible matter alone would make the outer stars move more slowly. Those flat rotation curves led to the idea that unseen dark matter supplies the extra gravitational pull, as the back-of-envelope sketch below illustrates.
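
The reasoning can be made concrete with Newtonian dynamics. In the minimal Python sketch below (not from the article; the 220 km/s flat orbital speed is an assumed, Milky-Way-like value), balancing gravity against centripetal acceleration for a circular orbit gives v^2/r = G M(<r)/r^2, so M(<r) = v^2 r / G: if the orbital speed stays constant with radius, the enclosed mass must keep growing linearly with r, well past the radius where the starlight fades.

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19    # one kiloparsec in metres
M_SUN = 1.989e30  # one solar mass in kg

v_flat = 220e3    # assumed flat orbital speed in m/s (Milky-Way-like)

# Enclosed mass implied by a flat rotation curve: M(<r) = v^2 * r / G.
for r_kpc in (5, 10, 20, 40):
    r = r_kpc * KPC
    m_enclosed = v_flat**2 * r / G
    print(f"r = {r_kpc:>2} kpc -> M(<r) = {m_enclosed / M_SUN:.2e} solar masses")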

"We are learning a lot and are finding a different answer than what most theorists had predicted", Karl Gebhardt stated. Through the simulations, Karl Gebhardt has determined that the dark matter is more spread out at the edge of the galaxy than considered in the past.

"The total amount of dark matter is the same as previously assumed, but it is fluffier (more distributed) than we thought", Karl Gebhardt stated.

Karl Gebhardt's research process works by mimicking a galaxy on the computer. He then compares the simulation to reality using observations - from the Mitchell Spectrograph - of how the stars are moving. Next, he repeats the process 100,000 times with different model parameters. From the whole set of simulations, he finally selects the model that best represents the data.

"The model that best mimics the data then determines the structure of the dark matter and how the stars orbit in the galaxy", Karl Gebhardt explained.

Initial results of Karl Gebhardt's research were published in the Astrophysical Journal in January 2012.

Karl Gebhardt's studies of dark matter provide information about its fundamental properties, which may help scientists substantiate previous theories or generate new findings about the functioning of the universe.

Next year, the National Science Foundation and academic partners will deploy the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX), the first major experiment to search for the evolution of dark energy, the mysterious force causing the expansion of the universe to speed up over time.

Over three years, HETDEX will collect data on at least one million galaxies that are nine billion to 11 billion light-years away, yielding the largest map of the universe ever produced. The map will allow astronomers to measure how fast the universe was expanding at different times in history. The project will use the giant Hobby-Eberly Telescope at McDonald Observatory and a set of spectrographs to map the three-dimensional positions of one million galaxies. HETDEX will generate about one petabyte (one million gigabytes) of data and require a lot of computer processing cycles.
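
A quick back-of-envelope calculation (not from the article) shows what those headline numbers imply per galaxy and as a sustained data rate:

# Rough figures implied by the HETDEX numbers quoted above:
# about one petabyte of data, one million galaxies, three years.
petabyte = 1e15                  # bytes ("one million gigabytes")
galaxies = 1_000_000
seconds = 3 * 365 * 24 * 3600    # three years of observing

print(f"per galaxy:     {petabyte / galaxies / 1e9:.1f} GB")
print(f"sustained rate: {petabyte / seconds / 1e6:.1f} MB/s")
# -> roughly 1 GB per galaxy and ~11 MB/s averaged over the survey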

"It will be a huge amount of data", Karl Gebhardt stated. "So we will continue to be large users of TACC allocations."

Karl Gebhardt will carry on investigating galaxies, searching for the next discovery about the matter that exists beyond the stars. "I want to understand how the entire universe works", Karl Gebhardt stated. "And no other field but astronomy can say that its answers are out of this world."
Source: University of Texas at Austin, Texas Advanced Computing Center
