Primeur weekly 2016-12-05

Exascale supercomputing

Hewlett Packard Enterprise demonstrates world's first Memory-Driven Computing architecture ...

Crowd computing

Einstein@home discovers new gamma-ray pulsar ...

Computing tackles the mystery of the dark universe ...

Quantum computing

Construction of practical quantum computers radically simplified ...

Researchers take first look into the 'eye' of Majoranas ...

More reliable way to produce single photons for quantum information imprinting ...

Focus on Europe

Deadline for ISC research paper submission extended to December 16 ...

PRACE to hold awards ceremony at Cineca for Summer of HPC 2016 ...

An open harbour for research data ...

New MareNostrum4 supercomputer to be 12 times more powerful than MareNostrum3 ...

Hardware

Lincoln Laboratory's supercomputing system ranked most powerful in New England ...

PolyU and Huawei jointly set up the first lab in optical communication and advanced computing system in Hong Kong ...

DDN Infinite Memory Engine burst buffer exceeds 1 TB per second file system performance for Japan's fastest supercomputer ...

Fujitsu develops in-memory deduplication technology to accelerate response for large-scale storage ...

Fujitsu announces start of operations for Japan's fastest supercomputer ...

Applications

Using supercomputers to illuminate the Renaissance ...

IBM unveils Watson-powered imaging solutions for health care providers ...

New algorithm could explain human face recognition ...

Researchers use Stampede supercomputer to study new chemical sensing methods, desalination and bacterial energy production ...

Physicists spell 'AV' by manipulating Abrikosov vortices ...

BSC researchers to study the response of European climate to Arctic sea ice depletion ...

IBM and Pfizer to accelerate immuno-oncology research with Watson for Drug Discovery ...

Fujitsu offers deep learning platform with world-class speed and AI services that support industries and operations ...

The Cloud

Cloud Systems Management Software Market: Global Industry Analysis and Opportunity Assessment 2016-2026 ...

Juniper Networks simplifies Cloud transition for enterprises with carrier-grade routing and unified security for AWS marketplace ...

An open harbour for research data


Hard disk storage with petabyte capacity for research data will be part of the Helmholtz Data Federation. Photo: KIT/SCC.
2 Dec 2016 Karlsruhe - Large-scale experiments and simulations in science produce ever larger volumes of data. Turning those data into knowledge, however, also requires storage and analysis capabilities of a new quality. The Helmholtz Association is now taking a pioneering role in storing data permanently, securely, and in usable form. To manage Big Data in science, it has established the Helmholtz Data Federation (HDF), coordinated by the Karlsruhe Institute of Technology (KIT).

A viable data infrastructure is the backbone of Germany as a research location, the President of KIT, Professor Holger Hanselka, emphasized. "To master the big challenges of energy, mobility, and information, we have to be capable of turning Big Data rapidly into smart data. At KIT, the research university in the Helmholtz Association, we pool the competencies necessary for this purpose."

The Helmholtz Centres are prepared to preserve research data in suitable data infrastructures over the long term and to make them as open as possible for later use by science and society, Professor Otmar D. Wiestler, President of the Helmholtz Association, said.

Germany's leading data centres are joining the Helmholtz Data Federation in order to store the flows of research data from various scientific disciplines in an organised manner, to interconnect them, and to make them available for shared use, Professor Achim Streit of KIT, coordinator of the HDF, pointed out. The HDF could serve as a blueprint for data-intensive research in Germany and Europe: an open harbour for access to and exchange of research data.

The HDF is a central element of the recently adopted position paper of the Helmholtz Association on the handling of research data, entitled "Die Ressource Information besser nutzbar machen" (Making the resource 'information' more usable). Thanks to its secure federation structure and the establishment of multi-thematic data centres, the HDF will enable data-intensive science communities to make their scientific data visible, to share them while retaining data sovereignty, to use them across disciplines, and to archive them reliably.
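As a rough illustration of what the paragraph above means by sharing data while retaining data sovereignty, cross-disciplinary use, and reliable archiving, the following minimal Python sketch models a hypothetical federated dataset record. The class, its field names, and the identifiers are illustrative assumptions only and do not describe the actual HDF software or its interfaces.

    # Hypothetical sketch of a federated research-data record; not the HDF's
    # actual software. It only illustrates the concepts named in the text:
    # visibility via metadata, sovereignty via owner-controlled access, and
    # reliable archiving via checksum verification of replicas.
    from dataclasses import dataclass, field
    import hashlib

    @dataclass
    class DatasetRecord:
        identifier: str                 # persistent identifier, e.g. a DOI (placeholder)
        owning_centre: str              # the centre that retains data sovereignty
        discipline: str                 # research field, enables cross-disciplinary search
        sha256: str                     # checksum used to verify archived copies
        allowed_partners: set[str] = field(default_factory=set)

        def grant_access(self, partner: str) -> None:
            # Only the owning centre decides who may use the data.
            self.allowed_partners.add(partner)

        def may_access(self, partner: str) -> bool:
            return partner == self.owning_centre or partner in self.allowed_partners

        def verify_copy(self, payload: bytes) -> bool:
            # Check that an archived replica is bit-identical to the original.
            return hashlib.sha256(payload).hexdigest() == self.sha256

    # Usage: a centre registers a dataset, shares it with a partner, verifies a replica.
    data = b"simulation output"
    record = DatasetRecord(
        identifier="doi:10.9999/example",   # placeholder identifier
        owning_centre="KIT",
        discipline="Earth and Environment",
        sha256=hashlib.sha256(data).hexdigest(),
    )
    record.grant_access("DESY")
    assert record.may_access("DESY")
    assert not record.may_access("unapproved-site")
    assert record.verify_copy(data)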

The federation is based on three key elements: innovative software for research data management, excellent user support, and state-of-the-art storage and analysis hardware. The partners plan medium-term investments in storage systems of double-digit petabyte capacity and in tens of thousands of processor cores for data analysis and management. By 2021, a total of 49.5 million euro is to be financed from the strategic development funds of the Helmholtz Association.

The HDF partners in the first phase are six centres covering five research fields of the Helmholtz Association: the Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research (Earth and Environment), Deutsches Elektronen-Synchrotron DESY and the GSI Helmholtz Centre for Heavy Ion Research (both in the research field Matter), the German Cancer Research Centre (Health), and Forschungszentrum Jülich and the Karlsruhe Institute of Technology (both in Energy, Key Technologies, Matter, and Earth and Environment). The HDF represents the nucleus of a national research data infrastructure spanning science organisations that is open to users in the entire German science community. International connections will make it compatible with the future European Open Science Cloud (EOSC).

KIT already operates several infrastructures for Big Data. The Smart Data Innovation Lab (SDIL) provides a Germany-wide research platform with state-of-the-art analysis capabilities for companies. The Smart Data Solution Center Baden-Württemberg (SDSC) supports small and medium-sized enterprises in the region in adopting smart data technologies. The GridKa data centre is part of the worldwide distributed computing network for the European particle accelerator centre CERN. With the Large-Scale Data Facility (LSDF) for science in Baden-Württemberg and the Large-Scale Data Management and Analysis (LSDMA) initiative of the Helmholtz Association, KIT has already laid the groundwork for coordinating the HDF. In addition, KIT's informatics institutes study analysis methods, evaluation algorithms, and data security.

Source: Karlsruher Institut für Technologie - KIT
