Back to Table of contents

Primeur weekly 2013-11-12

Special

e-Infrastructure Commons should connect the supercomputing, Big Data, networking and HPC islands in Europe ...

Big European research centres define road map to e-Infrastructure Commons ...

Bert's Bike measures Dutch cities' microclimates and other stories from the Dutch eScience conference ...

The Cloud

New publication: Developing Cloud Software - Algorithms, Applications, and Tools ...

Amazon Web Services deploys NVIDIA GRID GPUs ...

New computing model could lead to quicker advancements in medical research ...

Red Hat brings OpenShift Online Platform-as-a-Service to 14 new countries and expands gear size offerings for developers ...

Desktop Grids

BOINC 7.2.28 released to the public ...

EuroFlash

EUDAT and RDA/RDA Europe go to China ...

ESI releases a new version of ProCAST, the leading software solution for casting process simulation ...

SysFera to showcase strength in HPC management and remote visualization at Supercomputing Conference 13 ...

DLR takes new supercomputer into operation ...

New project "Execution Models for Energy-Efficient Computing Systems" has launched ...

Fortissimo project to issue open call for proposal submissions for HPC-Cloud-based application experiments ...

PRACE Summer of HPC Awards Excellence ...

Allinea DDT leaps ahead with support for NVIDIA CUDA 5.5 and CUDA on ARM ...

USFlash

Blue Waters and XSEDE sign collaborative agreement ...

Cray expands data management portfolio with new Tiered Adaptive Storage solution ...

Lustre file system version 2.5 released ...

Los Alamos National Laboratory selects DataDirect Networks' high-performance storage to support petascale collaboration among scientists & researchers ...

Micron's Hybrid Memory Cube earns high praise in next-generation supercomputer ...

Lawrence Livermore, Intel, and Cray produce Big Data machine to serve as catalyst for next-generation HPC clusters ...

SDSC's high-performance computing systems benefit San Diego-area companies ...

Kemal Ebcioglu named recipient of 2013 IEEE Computer Society B. Ramakrishna Rau Award ...

University of New Hampshire celebrates first-in-the-state supercomputing capabilities ...

Innodisk to debut 1M+ IOPS FlexiRemap technology at Supercomputing Conference 2013 ...

Suzuki accelerates the speed of vehicle design and innovation with DataDirect Networks high performance scale-out storage ...

DataCore Software's latest SANsymphony-V release magnifies the scope and sets the new standard for software-defined storage platforms ...

Bert's Bike measures Dutch cities' microclimates and other stories from the Dutch eScience conference


7 Nov 2013 Amsterdam - The Dutch eScience community met in Amsterdam last week for a conference on "Optimizing Discovery in the Era of Big-Data". It was the first national conference organised by the Dutch national eScience centre, the Netherlands eScience Center. The centre was established two years ago and currently employs about 25 people, the core being formed by eScience engineers who help scientists integrate computing and Big Data into their scientific applications. The centre is housed in the same building as the supercomputer centre SURFsara and EGI.

The keynote at the conference was delivered by Peter Coveney, University College London. He is a heavy supercomputer user and active in many UK and European eScience projects. Key to making supercomputing, Big Data and other resources available to scientists are workflows that provide easy access. This way ensemble molecular dynamics can be done, and multi-scale physics, using a variety of resources, becomes feasible. When you need to analyse big scientific data, you often need big computers. Hence it is useful that a number of EUDAT Big Data centres are also supercomputer centres. Even though supercomputers are expensive, their cost is small change compared to the really large scientific instruments such as the Square Kilometre Array. Peter Coveney does not understand why, on a European scale, we have separate infrastructures for Big Data, HPC and supercomputing.

Big Data over the Internet? The real Big Data can be found in scientific instruments, said Rob van Nieuwpoort in his presentation. For instance, the LOFAR telescope produces 20 Tbit/s of raw data, while the Amsterdam Internet Exchange (AMS-IX) handles only about 1.5 Tbit/s. Especially in astronomy, Big Data gets so large that you cannot store the raw data. Different algorithms are needed to process the data as a stream. Often "old" algorithms are revived that were replaced decades ago by more efficient ones, because those more efficient algorithms unfortunately require all the data to be in memory.
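The presentation does not name the specific algorithms involved, but a classic illustration of this trade-off is one-pass streaming statistics: each sample is processed once and then discarded, so memory use stays constant however large the stream grows. A minimal Python sketch using Welford's online method (chosen here as an example; not one mentioned in the talk):

```python
def streaming_mean_variance(stream):
    """Compute mean and population variance in a single pass (Welford's
    method). Each sample is seen exactly once and never stored, so this
    works on streams far too large to hold in memory."""
    n = 0
    mean = 0.0
    m2 = 0.0  # running sum of squared deviations from the current mean
    for x in stream:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    variance = m2 / n if n > 0 else float("nan")
    return mean, variance

# Example: statistics over a generator that is never materialised in memory.
mean, var = streaming_mean_variance(x * 0.5 for x in range(1_000_000))
```

An in-memory two-pass formula would be slightly simpler and can be numerically nicer, which mirrors the point above: the "old" one-pass formulations come back precisely when the data cannot be stored.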

Next, Willem Bouten explained e-Ecology, which currently is at the level of data-driven statistical science. Ecological systems, too, are just too complex for full simulations.

Piek Vossen explained next that sensor data is not actually raw data: it is already an interpretation. In the humanities it is even worse: researchers have their own, often subjective, interpretation of the data. Surprisingly, that does not hinder the uptake of Big Data analytics techniques, said Piek Vossen. It is not a problem that researchers interpret data subjectively, as long as these interpretations can be formalised and compared. He is working on a project called NewsReader ( http://www.newsreader-project.eu/ ) to prove the usefulness of Big Data analytics in news collection and interpretation. NewsReader tries to collect and interpret all the news in the world: millions of items each day. On a local university cluster at the VU Amsterdam, processing one day of news would take some 15 years.

The Netherlands Forensic Institute (the "Dutch CSI") needs to process terabytes of digital media data within an hour. The time constraint is the challenge, explained Erwin van Eijk. It is often imposed by legal requirements: the judge needs the results. The situation is further complicated by provenance, by access to the data, and by the interpretation of the data.

What does the average Dutch man or woman look like? Look at the genes, of course. Swertz explained how they created an "ultra-sharp genetic group portrait of the Dutch". They used a lightpath to transport data between Groningen, where the data repository is located, and Amsterdam, where the computing infrastructure is.

Do you believe the weather forecast? It is fine when you live in a rural area, but cities contain microclimates that can be rather different. Bert Holtslag is working on a project, "Summer in the City", to measure and model these microclimates. He started in his home town of Wageningen. Next year he will model Amsterdam.

How does Bert Holtslag collect measurements in a city? Typically Dutch: with a bike packed with measurement equipment.
Ad Emmen
