
Primeur weekly 2014-04-28

Exascale supercomputing

MONTBLANC scaling new heights of computing performance ...

The Cloud

HP extends mission-critical enterprise environments ...

UTSA hosts Open BigCloud Symposium and OCP Workshop May 7-8 ...

IBM provides SoftLayer ecosystem partners with new resources for generating Cloud revenue ...

Desktop Grids

Parabon launches crowdsourcing initiative to tackle Alzheimer's Disease with computing capacity donated by concerned citizens ...

SETI.Germany to organize BOINC Pentathlon ...

New BOINC server VM image available ...

EuroFlash

Tower International selects Altair's Compute Manager ...

Forschungszentrum Jülich joins the OpenPOWER Foundation ...

CSC leads a Nordic project on sensitive data ...

CSC joins Intel Parallel Computing Center programme ...

Earthquake simulation tops 1 quadrillion flops ...

Master in High Performance Computing (MHPC) programme at ICTP and the International School for Advanced Studies (SISSA) ...

Catherine Rivière, PRACE Council Chair, speaks at ICRI 2014 on the way forward for PRACE and HPC in Europe ...

PRACE Spring School: Researchers from all over the world come to school in Austria ...

USFlash

Third Person, first rate projection: TDC delivers tech and crew to ground-breaking Australian movie production for TEDx Sydney ...

OpenPOWER Foundation unveils first innovations and roadmap ...

NCAR & CU join Intel Parallel Computing Centers programme ...

AltaSim Technologies wins DOE grant for additive manufacturing ...

Georgia Tech researchers use NSF XSEDE supercomputers to understand and predict how black holes swallow stars ...

NEC selects Chelsio adapters for Vector Supercomputer ...

IBM Watson Group invests in Fluid to transform the consumer shopping experience ...

HP introduces next-generation enterprise array for mission-critical workloads ...

Big data poses great challenges and opportunities for databases ...

IBM tackles Big Data challenges with Open Server Innovation Model ...

Georgia Tech researchers use NSF XSEDE supercomputers to understand and predict how black holes swallow stars


Image credit: NASA; S. Gezari, The Johns Hopkins University; and J. Guillochon, University of California, Santa Cruz
14 Apr 2014 Austin - Somewhere in the cosmos an ordinary galaxy spins, seemingly dormant. Then, all of a sudden, a flash of light explodes from the galaxy's centre. A star orbiting too close to the event horizon of the galaxy's central supermassive black hole is torn apart by the force of gravity, heating up its gas and sending out a beacon to the far reaches of the universe.

In a universe with tens of billions of galaxies, how would we see it? What would such a beacon look like? How would we distinguish it from other bright, monumental intergalactic events, like supernovas?

"Black holes by themselves do not emit light", stated Tamara Bogdanovic, Assistant Professor of Physics at the Georgia Institute of Technology. "Our best chance to discover them in distant galaxies is if they interact with stars and gas that are around them."

In recent decades, with improved telescopes and observational techniques designed to repeatedly survey the vast numbers of galaxies on the sky, scientists noticed that some galaxies that previously looked inactive would suddenly light up at their very centre.

"This flare of light was found to have a characteristic behaviour as a function of time", Tamara Bogdanovic explained. "It starts very bright and its luminosity then decreases in time in a particular way. Astronomers have identified those as galaxies where a central black hole just disrupted and 'ate' a star. It's like a black hole putting up a sign that says: 'Here I am'."

Tamara Bogdanovic relies on National Science Foundation-funded supercomputers like Stampede at the Texas Advanced Computing Center and Kraken at the National Institute for Computational Sciences. Using these systems, she and her collaborators recently simulated the dynamics of these violent encounters between stars and supermassive black holes, charting their behaviour with numerical models. Stampede and Kraken are part of the Extreme Science and Engineering Discovery Environment (XSEDE), a single virtual system that scientists use to interactively share computing resources, data and expertise.

Using a mix of theoretical and computational approaches, Tamara Bogdanovic tries to predict the observational signatures of events like the black-hole-devouring-star scenario described above, also known as a "tidal disruption" - or two supermassive black holes merging, another of her interests. Such events would have a distinct signature to someone analyzing data from a ground-based or a space-based observatory.
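As a rough guide to when such an event is visible at all (a standard order-of-magnitude argument, not a calculation taken from this research), a star is shredded once it wanders inside the tidal radius r_t ≈ R_* (M_BH / M_*)^(1/3), and the resulting flare can only be seen if that radius lies outside the black hole's event horizon. The Python sketch below works this out for a Sun-like star and an assumed black hole of a million solar masses:

    # Back-of-the-envelope comparison of the tidal radius with the
    # Schwarzschild radius. The star and black hole parameters are
    # illustrative assumptions, not values from the study.
    G     = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    c     = 2.998e8          # speed of light, m/s
    M_sun = 1.989e30         # solar mass, kg
    R_sun = 6.957e8          # solar radius, m

    M_bh   = 1.0e6 * M_sun   # assumed black hole mass
    M_star = 1.0 * M_sun     # Sun-like star
    R_star = 1.0 * R_sun

    r_tidal = R_star * (M_bh / M_star) ** (1.0 / 3.0)   # tidal disruption radius
    r_schw  = 2.0 * G * M_bh / c ** 2                   # event horizon radius

    print(f"tidal radius         ~ {r_tidal:.2e} m")
    print(f"Schwarzschild radius ~ {r_schw:.2e} m")
    if r_tidal > r_schw:
        print("the star is torn apart outside the horizon -> visible flare")
    else:
        print("the star crosses the horizon intact -> no flare")

For this assumed combination the tidal radius lies well outside the horizon, which is why the debris can heat up and shine; for black holes much heavier than roughly 10^8 solar masses the ordering reverses and a Sun-like star would be swallowed whole, with no flare to observe.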

Tidal disruptions are rare cosmic occurrences.

Astrophysicists have calculated that a Milky Way-like galaxy stages the disruption of a star only once in about 10,000 years. The luminous flare of light, on the other hand, can fade away in only a few years. This difference in timescale highlights the observational challenge in pinpointing such events in the sky and underlines the importance of astronomical surveys that monitor vast numbers of galaxies at the same time.

So far, only a few dozen of these characteristic flare signatures have been observed and deemed "candidates" for tidal disruptions. But with data from Pan-STARRS, GALEX, the Palomar Transient Factory and other upcoming astronomical surveys becoming available to scientists, Tamara Bogdanovic believes this scarcity will change dramatically.

"As opposed to a few dozen that have been found over the past 10 years, now imagine hundreds per year - that's a huge difference", she stated. "It means that we will be able to build a varied sample of stars of different types being disrupted by supermassive black holes."

With hundreds of such events to explore, astrophysicists' understanding of black holes and the stars around them would advance by leaps and bounds, helping determine some key aspects of galactic physics.

"A diversity in the type of disrupted stars tells us something about the makeup of the star clusters in the centers of galaxies", Tamara Bodganovic stated. "It may give us an idea about how many main sequence stars, how many red giants, or white dwarf stars are there on average."

It also tells us something about the population and properties of supermassive black holes that are doing the disrupting.

"We use these observations as a window of opportunity to learn important things about the black holes and their host galaxies", she continued. "Once the tidal disruption flare dims below some threshold luminosity that can be seen in observations, the window closes for that particular galaxy."

In a recent paper submitted to The Astrophysical Journal, Tamara Bogdanovic, working with Roseanne Cheng of the Center for Relativistic Astrophysics at Georgia Tech and Pau Amaro-Seoane of the Albert Einstein Institute in Potsdam, Germany, considered the tidal disruption of a red giant star by a supermassive black hole using computer modelling.

The paper comes on the heels of the discovery of a tidal disruption event named PS1-10jh, 2.7 billion light years from Earth, in which a black hole disrupted a helium-rich stellar core thought to be the remnant of a red giant star.

The sequence of events they described aims to explain some unusual aspects of the observational signatures associated with this event, such as the absence of the hydrogen emission lines from the spectrum of PS1-10jh.

As a follow-up to this theoretical study, the team has been running simulations on Georgia Tech's Keeneland supercomputer, in addition to Stampede and Kraken. The simulations reconstruct the chain of events by which a stellar core, similar to the remnant of a tidally disrupted red giant star, might evolve under the gravitational tides of a massive black hole.

"Calculating the messy interplay between hydrodynamics and gravity is feasible on a human timescale only with a supercomputer", Roseanne Cheng stated. "Because we have control over this virtual experiment and can repeat it, fast forward, or rewind as needed, we can examine the tidal disruption process from many perspectives. This in turn allows us to determine and quantify the most important physical processes at play."

The research shows how computer simulations complement and constrain theory and observation.

"There are many situations in astrophysics where we cannot get insight into a sequence of events that played out without simulations", Tamara Bogdanovic stated. "We cannot stand next to the black hole and look at how it accretes gas. So we use simulations to learn about these distant and extreme environments."

One of Tamara Bogdanovic's goals is to use the knowledge gained from simulations to decode the signatures of observed tidal disruption events.

"The most recent data on tidal disruption events is already outpacing theoretical understanding and calling for the development of a new generation of models", she explained. "The new, better quality data indicates that there is a great diversity among the tidal disruption candidates. This is contrary to our perception, based on earlier epochs of observation, that they are a relatively uniform class of events. We are yet to understand what causes these differences in observational appearance and computer simulations are guaranteed to be an important part of this journey."
Source: University of Texas at Austin, Texas Advanced Computing Center
