
Primeur weekly 2013-10-14

Special

Big Data analytics: a complex story of disruptive hype countered with converging technologies ...

Big Data forcing enterprises to look into the direction of HPC solutions ...

River no longer too deep between HPC and data intensive computing ...

HPC is HPC, and enterprise is enterprise, and never the twain shall meet? Can Big Data be the catalyst? ...

High Performance Data Analysis ecosystem to grow to more than $2 billion by 2016 ...

Focus

2013: Another year on the road to Exascale - An Interview with Thomas Sterling and Satoshi Matsuoka - Part III ...

The Cloud

Contrail project partners to release version R1.3 of the Contrail software ...

Fujitsu begins global packaged sales of "SPATIOWL" location data Cloud service ...

Dynamically managing network bandwidth in a Cloud ...

EuroFlash

The Transinsight Award for Semantic Intelligence goes to the "Wishart" team from the University of Alberta, Canada ...

Final Call for the HPCAC-ISC 2014 Student Cluster Competition Submission ...

CGG slashes development time with Allinea DDT ...

Bull launches GCOS7 V12 for its large mainframes ...

Bull launches Bull Optimal Database Booster, to optimize the performance of Oracle databases running on its Escala servers ...

Adept project: investigating energy efficiency in parallel technologies ...

DataDirect Networks' scalable, high-performance storage powers Wellcome Trust Sanger Institute's worldwide research efforts to reduce global health burden ...

The Human Brain Project has begun ...

Intel and Janssen Pharmaceutica to collaborate with imec and 5 Flemish universities to open ExaScience Life Lab ...

PRACE to showcase the principles of HPC at the European Union Contest for Young Scientists (EUCYS) ...

The 2013 Nobel Prize in Chemistry goes for multiscale models development ...

USFlash

Jack Dongarra receives high honour for supercomputing accomplishments ...

Cray enhances coprocessor and accelerator programming with support for OpenACC 2.0 ...

NCSA joins OpenSFS ...

Juniper Networks enables the discovery of new data insights in IBM Research Accelerated Discovery Lab ...

Louisiana State University researchers awarded nearly $1 million for Big Data research ...

Winchester Systems introduces FlashDisk RAID arrays with iSCSI 10Gb ...

HPC is HPC, and enterprise is enterprise, and never the twain shall meet? Can Big Data be the catalyst?


26 Sep 2013 Heidelberg - At ISC'13 Big Data, held in Heidelberg, Germany, September 25-26, 2013, Addison Snell from Intersect360 Research moderated a vendor panel in which vendors presented short summaries of their current HPC Big Data offerings and customer case studies. Before the interactive discussion between the audience and the vendors, Addison Snell gave a short introduction showing where HPC meets enterprise when it comes to Big Data.

Between April and August 2013, Intersect360 Research conducted a survey among end users in which they discussed their environments, challenges, solutions, and 'satisfaction gaps' in addressing their Big Data challenges.

There is a different mindset between technical and enterprise computing, according to Addison Snell. Technical computing on the one hand is driven by price/performance and involves fast adoption of new technologies, algorithms, and approaches. Enterprise computing on the other hand keeps the business running and is used for communicating and collaborating; marketing and selling products; and accounting, HR, and finance. It is driven by reliability, availability, and serviceability (RAS). Because of this, new technologies, algorithms, and approaches are adopted slowly in enterprise computing.

Addison Snell told the audience that Big Data constitutes a big opportunity. A lot of money is being spent on it: 60% of enterprises that responded to the survey will spend more than 10% of their IT budget on technology related to Big Data. However, he cautioned against speaking of 'the Big Data market' as if it were a single, well-defined market.

In an earlier survey in 2012, only 17% of the respondents mentioned Hadoop when describing their Big Data applications. In 2013, that share went down, driven by the enterprise respondents. Deployments might be based on Hadoop, but the majority of Big Data implementations rely on in-house applications and algorithms. The most common source of data is also 'in-house'. ISV software for Big Data is thinly scattered. So there is more to Big Data than just Hadoop, Addison Snell showed.

Performance metrics show up as key factors in enterprise as well as technical computing. Big Data will be a driver for expanded usage of HPC, provided HPC developers can also meet enterprise requirements.

For the vendor panel, the following companies were invited: SAS, Intel, SAP, SGI, and Quantum.