
Primeur weekly 2020-03-16

Focus

EuroHPC work programme 2020 calls for consolidation type of projects with a budget of 170 million euro ...

Details of the three upcoming 2020 EuroHPC JU Calls for Proposals ...

EuroHPC JU will move to the Drosbach building in Luxembourg ...

Exascale supercomputing

MEEP Project: A flexible system supporting next generation European open source software and hardware ...

Ying-Chih Yang joins SiPearl as CTO ...

Quantum computing

Preparing Ireland for quantum - Irish Centre for High-End Computing brings prestigious European Quantum Technologies Conference to Dublin ...

Finnish researchers look at noisy quantum computer ...

NSF CAREER Award supports framework for photons as quantum transistors ...

Engineers crack 58-year-old puzzle on way to quantum breakthrough ...

New error correction method provides key step toward quantum computing ...

IDC survey finds optimism that quantum computing will result in competitive advantage ...

Focus on Europe

New HLRS "Hawk" supercomputer to deliver unparalleled performance, capacity, and density for science and research ...

It is time to register for the Supercomputing Frontiers 2020 virtual conference ...

Middleware

SDSC announces comprehensive data sharing resource ...

Hardware

Supermicro unveils MegaDC servers - The first commercial off-the-shelf systems designed exclusively for hyperscale data centres ...

Mellanox delivers Spectrum-3 based Ethernet switches - First 12.8 Tbps networking platforms optimized for Cloud, storage, and AI ...

Semtech announces production of Tri-Edge, a PAM4 CDR platform for 200G and 400G data centre applications ...

Innovium delivers production grade SONiC/SAI for TERALYNX based switch systems ...

Innovium and Credo announce interoperability of production TERALYNX 7 switch family with Credo's Dual 400G MACsec solution ...

Kingston Technology releases enterprise-grade data centre NVMe SSD for mixed use ...

MaxLinear's 2nd generation PAM4 DSP selected by Centera Photonics to deliver sub-3.5W 100G optical modules for hyperscale data centres ...

Applications

UK supercomputer to combat Africa's worst locust outbreak in decades ...

Formula for possible treatment of coronavirus developed by innovative Bulgarian company ...

CSC has selected Mahti supercomputer Pilot Projects ...

HETDEX experiment, led by UT Austin, uses advanced computing resources at TACC to pin down the expansion rate of the universe ...

Oden Institute's Feliciano Giustino applies TACC's supercomputing power to the development of novel materials at the quantum scale ...

Computer model solves mystery of how gas bubbles build big methane hydrate deposits ...

A flexible brain for AI ...

AI assists CT analysis in identifying COVID-19 patients ...

The Cloud

Centre-wide support for, and R&D around, containers helps researchers compute with ease at TACC and elsewhere ...

Oracle announces fiscal 2020 third quarter financial results ...

Hydro66 and maincubes sign partnership for European coverage ...

HETDEX experiment, led by UT Austin, uses advanced computing resources at TACC to pin down the expansion rate of the universe


HETDEX is the first major experiment to search for dark energy. It uses the giant Hobby-Eberly Telescope at McDonald Observatory and a set of spectrographs to map the three-dimensional positions of one million galaxies. The survey covers a large region of the sky that overlaps the Big Dipper; while the Dipper's stars are only a few dozen light-years away, the galaxies HETDEX targets are around 10 billion light-years away.

11 Mar 2020 Austin - Two decades ago, Saul Perlmutter, Brian Schmidt, and Adam Riess shocked the world when they published research showing not only that the universe was expanding, but that the expansion was occurring at an accelerating rate. The discovery came as a complete surprise, even to the astronomers themselves, and netted them a Nobel Prize in 2011.

The accelerating expansion was attributed to "dark energy", which is neither dark nor energy, but which represents some force fighting gravity's pull, causing galaxies to speed apart from one another faster than prevailing theories would predict. What this force is remains unknown.

The discovery of accelerating expansion also had another consequence. It spurred efforts to better quantify this expansion in the hopes of understanding what was at the root of it.

With that goal in mind, Karl Gebhardt, a professor of Astronomy at the University of Texas at Austin and an expert at uncovering the dynamics of distant, invisible phenomena, came up with an experiment that would look deeper into the cosmic past than ever before to determine with great accuracy how fast the universe was expanding.

"We're struggling on the theory side to explain what's going on", Karl Gebhardt stated. "There's a lot of ideas out there, but we have a lack of observations."

The experiment - known as the Hobby-Eberly Telescope Dark Energy Experiment, or HETDEX - involved upgrading an optical telescope of extreme sensitivity to survey the sky for galaxies that were active 10 billion years ago and determine how fast they were traveling outward. The project is supported by more than $42 million in grants from the state of Texas, the United States Air Force, and the National Science Foundation (NSF), as well as the contributions of many private foundations and individuals.

"The expansion rate that had been used was only done on data that was relatively recent in history of the universe", Karl Gebhardt stated. "HETDEX is looking back in time where no one else has looked before. We don't know if the rate has been consistent. If it's a constant, then that calls for a specific physical interpretation. If it changes, it's a very different physical interpretation."

Today, more than 12 years after the project was first proposed, the experiment is underway and the team is gathering one of the largest data collections in astrophysics to answer one of the most essential questions in astronomy.

Based at the McDonald Observatory in West Texas, HETDEX equips one of the largest telescopes in the world with the largest spectrograph on the planet - actually a set of 156 spectrographs. Each spectrograph is fed by 225 fibers that take the light from a patch of sky 1/3600 the size of the full moon. When completed, the instrument will have 35,000 optical fibers (156 × 225 ≈ 35,000) focusing the light from a large portion of the sky into the spectrographs, measuring thousands of objects simultaneously. It currently has 25,000.

The five-year HETDEX survey will create a spectral map of a region of the northern sky near the Big Dipper equivalent to the size of 1000 moons. If a galaxy aligns with one of the fibers, its light will be carried to a spectrograph for analysis. This survey will produce not only a map of the region, but also spectra of all objects within it, allowing the survey to measure their velocities and other physical characteristics.

As complicated as the experiment is, Karl Gebhardt said that in the end the exploration comes down to two factors: the velocity at which the galaxies are moving, and the distance they have traveled from their origin following the Big Bang.

The velocity can be gleaned from the redshift of spectral lines toward longer wavelengths - the red end of the spectrum - a shift that is proportional to the velocity. The distance is determined by comparing against a baseline structure of the universe that astronomers established long ago and that served as the basis for most cosmology prior to the discovery of dark energy.
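To put rough numbers on the first factor: the distant galaxies HETDEX hunts are typically found through their Lyman-alpha emission line, whose rest wavelength is 1215.67 Angstroms. Here is a minimal sketch in Python of turning an observed line position into a redshift and a velocity; the observed wavelength is an invented example, not HETDEX data:

```python
# Sketch: redshift and recession velocity from a shifted emission line.
# The observed wavelength is an invented example, not HETDEX data.

C_KM_S = 299_792.458          # speed of light, km/s
LYMAN_ALPHA_REST = 1215.67    # rest-frame Lyman-alpha wavelength, Angstroms

def redshift(observed: float, rest: float) -> float:
    """z = (lambda_observed - lambda_rest) / lambda_rest."""
    return (observed - rest) / rest

def recession_velocity_km_s(z: float) -> float:
    """Relativistic Doppler form; for small z this reduces to v ~ c*z.
    (Cosmological redshift is strictly an effect of expanding space,
    but this conveys the scale of the numbers.)"""
    return C_KM_S * ((1 + z) ** 2 - 1) / ((1 + z) ** 2 + 1)

z = redshift(4255.0, LYMAN_ALPHA_REST)   # line observed at 4255 Angstroms
print(f"z = {z:.2f}, v ~ {recession_velocity_km_s(z):,.0f} km/s")
```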

"When galaxies are made, they're made in a pattern, like a fingerprint", Karl Gebhardt stated. "If we can measure that pattern, it's the same as measuring the distance between the ridges in your fingerprint. You can tell how much the universe has expanded. We do this for millions of galaxies and from those millions of galaxies what you get is a map."

This map tells astronomers how far those galaxies had traveled from the Big Bang until the light left them and traveled 10 billion light-years to the telescope. By identifying and mapping millions of galaxies from this early period of the universe's expansion, and then determining their velocities and the distances they have traveled, the team believes it can determine the rate of expansion at this earlier phase to within 1 percent.
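To give a feel for how redshift and distance combine into an expansion rate, here is a minimal sketch assuming a standard flat universe, with placeholder parameter values (the Hubble constant and matter fraction below are illustrative, not HETDEX results). The predicted distance to a given redshift follows from the expansion history H(z), so comparing a predicted distance against one measured from the galaxy "fingerprint" pattern constrains H(z) at that epoch.

```python
# Sketch: comoving distance in a flat matter + dark-energy universe.
# Parameter values are illustrative placeholders, not HETDEX measurements.
from scipy.integrate import quad

C_KM_S = 299_792.458     # speed of light, km/s
H0 = 70.0                # Hubble constant today, km/s/Mpc (placeholder)
OMEGA_M = 0.3            # matter density fraction (placeholder)
OMEGA_L = 1.0 - OMEGA_M  # dark-energy fraction in a flat universe

def hubble(z: float) -> float:
    """Expansion rate H(z) for flat Lambda-CDM, in km/s/Mpc."""
    return H0 * (OMEGA_M * (1.0 + z) ** 3 + OMEGA_L) ** 0.5

def comoving_distance_mpc(z: float) -> float:
    """D_C = c * integral from 0 to z of dz'/H(z'), in megaparsecs."""
    integral, _ = quad(lambda zp: 1.0 / hubble(zp), 0.0, z)
    return C_KM_S * integral

# A measured distance at z = 2.5 that disagreed with this prediction
# would signal a different expansion history than the assumed one.
print(f"D_C(z=2.5) ~ {comoving_distance_mpc(2.5):,.0f} Mpc")
```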

Establishing that rate and how it has changed over time will help solve the mystery of dark energy and what's causing the expansion.

"It could be that we don't understand gravity on large scales - it could go from an attractive to a repulsive force. Or the expansion could be caused by the energy of empty space", Karl Gebhardt stated. "There are five or six other hypotheses - extremely different ideas. Measurements of the expansion rate at different times in the universe is how we limit these models."

Over the course of the experiment, HETDEX will capture 400 billion resolution elements - a mountain of data that travels continuously from a mountain-top in West Texas straight to the Texas Advanced Computing Center (TACC), where some of the most powerful academic supercomputers in the world analyze it. In particular, Karl Gebhardt has relied heavily on TACC's Wrangler supercomputer, a powerful data analysis system supported by NSF.

"We're running a code to find individual point sources: a filter over all of the spatial elements to find individual galaxies", Karl Gebhardt stated. "Most of the CPU time is used to perform that analysis."

As with any idea formulated years before implementation, some aspects of the experiment have proved tricky. In particular, Karl Gebhardt underestimated the noisiness of the data generated by the optical fibers.

He had anticipated a straightforward analysis, but found that he first needed a way to separate real target galaxies from false positives. Strangely enough, humans can readily detect the difference, but most computational algorithms cannot.

So, to address this problem, he is training a machine learning algorithm using human-labelled readings to make the distinction. Working with students in the UT Computer Science department, he created an app that he calls 'AstroTinder' to assist in the process.

Individuals with minimal training are able to look at spectral lines and images of point sources and swipe left or right, depending on whether they believe it is a real galaxy or something else - an artifact of the algorithm or a speck of dust on the sensor.

After enough of these determinations are made, Karl Gebhardt will use TACC's machine learning-centric Maverick supercomputer to train the system to make the distinction itself. The system will then be off to the races, sifting through the billions of data points to identify and map the 100,000 target galaxies.
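A minimal sketch of what such a supervised training step could look like, assuming each candidate is summarized by a small feature vector (line width, signal-to-noise, continuum level); the features, label distributions, and choice of a random forest are illustrative assumptions, not the actual HETDEX/Maverick workflow:

```python
# Sketch: train a classifier on human-labelled candidates to separate
# real galaxies from false positives. All data here are invented
# placeholders; this is not the actual HETDEX training pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Placeholder feature vectors: [line width, S/N, continuum level].
n = 2000
real = rng.normal([3.0, 8.0, 0.5], [1.0, 2.0, 0.2], size=(n, 3))
fake = rng.normal([1.0, 5.5, 0.3], [1.5, 2.0, 0.3], size=(n, 3))
X = np.vstack([real, fake])
y = np.concatenate([np.ones(n), np.zeros(n)])   # the human "swipes"

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```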

From those, further analysis will establish the velocity and distance, and then the rate of acceleration of the expansion.

After three years of operation, the survey is about 20 percent complete. Karl Gebhardt anticipates his team will need about one-third of the data before they can say anything definitive about the expansion rate. In the meantime, the survey is collecting lots of interesting, unintended data about astronomical objects, including naked black holes, highly active star-forming galaxies, asteroids, and meteorites.

"No one's looked at the universe in this way", he stated. "We're finding things that couldn't be discovered in any other way."

The data processing and data management challenges for HETDEX harken back to a statement by Tony Tyson, chief scientist for the Large Synoptic Survey Telescope - recently renamed the Vera C. Rubin Observatory - that "the telescope is just a peripheral to the data management system".

"Great care has gone into the design of the data management system such that both the HETDEX science team and the astronomical research community in general will be able to exploit the data, both for the dark energy studies that are the core focus of the project, as well as for other purposes", stated Bob Hanisch, director of the Office of Data and Informatics within the Material Measurement Laboratory at the National Institute for Standard and Technology (NIST), who is not directly involved in the project.

"It is great to see the close collaboration between the HETDEX astronomers and the computational scientists at TACC, such that HETDEX data can be moved to the HPC and data storage facilities efficiently and made available for analysis and distribution."

"Astronomical data science has evolved in amazing ways over the past 20 years", stated Niall Gaffney, director of Data Intensive Computing at TACC and former designer of the archives at the Space Telescope Science Institute which holds the data from the Hubble Space Telescope. "HETDEX is taking the lessons learned from missions like the Hubble Space Telescope and Kepler and combining them with modern machine learning techniques pioneered in industry to better understand a fundamental force in nature in ways we could not 20 years ago. Having a facility and staff like those at TACC help bridge these technologies to bring about these new discoveries."

In terms of science outcomes, Karl Gebhardt believes the experiment will have a major impact, either by helping astronomers understand how gravity works or how the Big Bang occurred. For non-scientists, the research helps resolve our place in the universe.

"We are completely insignificant as humans in the universe, but we're able to understand how the universe evolved", Karl Gebhardt stated. "Being able to do that, I think, is amazing."
Source: University of Texas at Austin, Texas Advanced Computing Center - TACC
