Back to Table of contents

Primeur weekly 2016-11-07

Focus

EGI to finalize service catalogue and ISO certification ...

Exascale supercomputing

SLAC and Berkeley Lab researchers prepare for scientific computing on the exascale ...

Quantum computing

Researchers nearly reached quantum limit with nanodrums ...

Focus on Europe

New approach for ARM-based technology to halve the cost of powering data centres ...

PRACE to award contracts in third and final phase of Pre-Commercial Procurement (PCP) ...

PRACE welcomes new Managing Director Serge Bogaerts ...

PRACE 2016 Digest Special Edition on Industry is out ...

Supercomputer comes up with a profile of dark matter ...

Middleware

Bright Computing supplies Bright OpenStack to Stony Brook University ...

DDN Annual High Performance Computing Trends survey reveals rising deployment of flash tiers and private/hybrid Clouds versus public for HPC ...

With Corral 3, TACC provides a more unified data structure and increased space ...

Hardware

Mellanox launches open source software initiative for routers, load balancers, and firewalls ...

Mellanox Multi-Host technology reshapes data centre economics ...

Cray awarded $26 million contract from the Department of Defense High Performance Computing Modernization Programme ...

Hewlett Packard Enterprise completes acquisition of SGI ...

Centre for Modelling & Simulation in Bristol launches new supercomputer ...

Baylor University selects Cray CS400 cluster supercomputer to power innovative research ...

SGI awarded $27 million systems contract with the Army Research Laboratory Defense Supercomputing Resource Center ...

Applications

XSEDE spins off annual conference to unite research computing community ...

Researchers at UCSB explore the delicate balance between coherence and control with a simple but complete platform for quantum processing ...

Cosmic connection: KITP's Greg Huber worked with nuclear physicists to confirm a structural similarity found in both human cells and neutron stars ...

New technique for creating NV-doped nanodiamonds may be boost for quantum computing ...

New bacteria groups, and stunning diversity, discovered underground ...

The Cloud

IBM drives Cloud storage with new all-flash and software defined solutions ...

Capital markets firms continue to invest in hardware for compute Grids alongside growing Cloud adoption, according to TABB Group Research ...

DDN Annual High Performance Computing Trends survey reveals rising deployment of flash tiers and private/hybrid Clouds versus public for HPC


2 Nov 2016 Santa Clara - DataDirect Networks (DDN) has issued the results of its annual High Performance Computing (HPC) Trends survey, which show that end users in the world's most data-intensive environments, like their counterparts in general IT, are increasing their use of the Cloud. Unlike general IT environments, however, the HPC sector is overwhelmingly opting for private and hybrid Clouds instead of public ones. HPC sites are also increasingly upgrading specific parts of their environments with flash as they modernize their data centres. According to the survey, managing mixed I/O performance and rapid data growth remain the biggest challenges driving these infrastructure changes for HPC organisations.

Conducted by DDN for the fourth consecutive year, the survey polled a cross-section of 143 end users managing data-intensive infrastructures worldwide and representing hundreds of petabytes of storage investment. Respondents included individuals responsible for high performance computing as well as networking and storage systems at financial services, government, higher education, life sciences, manufacturing, national lab, and oil and gas organisations. The volume of data under management in each of these organisations is staggering and is steadily increasing each year. Of the organisations surveyed:

  • 73 percent manage or use more than one petabyte of data storage; and
  • 30 percent manage or use more than 10 PBs of data storage, up 5 percentage points year-over-year.

Data and data storage remain the most strategic part of the HPC data centre, according to an overwhelming majority of survey respondents (77 percent), as end users seek to solve data access, workflow and analytics challenges to accelerate time to results.

Survey respondents revealed the rising use of private and hybrid Clouds within HPC data centres. Respondents planning to leverage a Cloud for at least part of their data in 2017 rose to 37 percent, up almost 10 percentage points year-over-year. Of those, more than 80 percent are choosing private or hybrid Clouds versus a public Cloud option. "These responses are consistent with the trends DDN observes in customer conversations, the most prevalent of which is organizations rebounding from public Cloud due to cost, poor latency and sheer data immobility issues", stated Laura Shepard, senior director of marketing, DDN.

Use of flash in HPC data centres has intensified, with more than 90 percent of respondents using flash storage at some level within their data centres today. Perhaps surprisingly, while all-flash arrays are perceived by many to be the fastest storage available in the market, only 10 percent of surveyed users from these most data-intensive environments are using an all-flash array. The vast majority of respondents (80 percent) are using hybrid flash arrays, either as an extension to storage-level cache, to accelerate metadata, or to accelerate data sets associated with key or problematic I/O applications.
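The hybrid approach the respondents describe amounts to a placement policy: a small flash tier absorbs metadata and hot data, while bulk cold data stays on disk. The following is a minimal, hypothetical sketch of that idea (the `HybridArray` class and its methods are illustrative, not DDN's implementation or API):

```python
class HybridArray:
    """Toy model of a hybrid flash array: a small fast tier for hot
    data and metadata, a large slow tier for everything else.
    Purely illustrative -- not any vendor's actual design."""

    def __init__(self, flash_capacity=4):
        self.flash = {}                      # small, fast tier
        self.disk = {}                       # large, high-capacity tier
        self.flash_capacity = flash_capacity

    def put(self, key, value, hot=False):
        # Hot data (e.g. metadata or an active working set) lands on
        # flash while capacity lasts; cold bulk data goes to disk.
        if hot and len(self.flash) < self.flash_capacity:
            self.flash[key] = value
        else:
            self.disk[key] = value

    def get(self, key):
        # Look in the fast tier first, then fall back to disk;
        # also report which tier served the request.
        if key in self.flash:
            return self.flash[key], "flash"
        return self.disk.get(key), "disk"


arr = HybridArray(flash_capacity=2)
arr.put("inode-table", "metadata", hot=True)   # accelerated on flash
arr.put("archive-001", "cold-blob")            # bulk data on disk
value, tier = arr.get("inode-table")
print(tier)  # flash
```

The point of the sketch is the asymmetry the survey highlights: flash is applied selectively where it accelerates the workflow, rather than holding the entire data set as an all-flash array would.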

A diverse set of applications and an upsurge in site-wide file systems, data lakes and active archives are driving fast-paced data growth in large-scale environments and analytical workflows, placing rigorous demands on storage infrastructures and creating unique challenges for HPC users. Performance ranks as the number one storage and Big Data challenge for a strong majority (76 percent) of those polled, and mixed I/O performance was cited as the biggest concern by a majority (61 percent) of the respondents, an eight percentage point increase compared with last year's survey results.

An even higher portion of respondents (68 percent) identify storage I/O as the main bottleneck specifically in analytics workflows. As these responses demonstrate, performance issues have escalated as Big Data environments contend with a proliferation of diverse applications creating mixed I/O patterns and stifling the performance of their storage infrastructure.

Only a small and diminishing percentage of respondents believe today's file systems and data management technologies will be able to scale to Exascale levels, while almost 75 percent of respondents believe new innovation will be required. This belief is illustrated in respondents' views on addressing performance issues:

  • The strong majority (60 percent) view burst buffers to be the most likely technology to take storage to the next level as users seek faster and more efficient ways to offload I/O from compute resources, to separate storage bandwidth acquisition from capacity acquisition and to support parallel file systems to meet Exascale requirements.
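The burst-buffer idea the respondents favour is straightforward: compute nodes write checkpoints into a fast intermediate tier and resume work immediately, while the data drains to the slower parallel file system in the background. A minimal sketch of that offloading pattern, using a queue as a stand-in for the fast tier (all names here are illustrative):

```python
import queue
import threading


class BurstBuffer:
    """Toy model of a burst buffer: writes land in a fast staging
    tier and return immediately; a background drainer flushes them
    to the (slower) parallel file system off the critical path."""

    def __init__(self):
        self.staged = queue.Queue()   # fast tier absorbing the write burst
        self.parallel_fs = []         # stand-in for the backing file system
        # Daemon thread so the sketch exits cleanly without a stop signal.
        threading.Thread(target=self._drain, daemon=True).start()

    def write(self, block):
        # The compute node only pays the fast-tier latency here.
        self.staged.put(block)

    def _drain(self):
        while True:
            block = self.staged.get()
            self.parallel_fs.append(block)  # slow write, off the critical path
            self.staged.task_done()

    def flush(self):
        # Block until every staged write has reached the backing store.
        self.staged.join()


bb = BurstBuffer()
for i in range(100):                  # a checkpoint burst from compute nodes
    bb.write(f"checkpoint-{i}")
bb.flush()
print(len(bb.parallel_fs))  # 100
```

This separation is exactly what the respondents describe: the bandwidth needed to absorb a burst is provisioned in the buffer, independently of the capacity provisioned in the parallel file system behind it.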

As an increasing number of HPC sites move to real-world implementation of multi-site HPC collaboration, concerns about security, privacy, and data sharing have intensified significantly. A strong majority of those surveyed (70 percent) view security and data-sharing complexity as the biggest impediments to multi-site collaborations.

  • One HPC organisation at the forefront of employing leading-edge technologies to keep its research data secure and private is Weill Cornell Medicine, a large, multi-site organisation managing massive volumes of patient data. In a video, Vanessa Borcherding, Director of the Scientific Computing Unit at Weill Cornell Medicine, discusses the organisation's approach to securing potentially sensitive genomic data that must remain available for collaborative research.

With storage performance a critical requirement for today's large-scale and petascale-level data centres, site-wide file systems continue to be a significant infrastructure trend in HPC environments, according to 72 percent of the HPC customers polled. Site-wide file systems allow architects to consolidate data storage for multiple compute clusters on the same storage platform and/or to upgrade storage and servers independently as needed. In addition to some of the largest supercomputing sites, such as Oak Ridge National Laboratory (ORNL), the National Energy Research Scientific Computing Center (NERSC) and the Texas Advanced Computing Center (TACC), site-wide file systems are expanding into more mid-sized data centres with fewer or smaller compute clusters.

"The results of DDN's annual HPC Trends Survey reflect very accurately what HPC end users tell us and what we are seeing in their data centre infrastructures. The use of private and hybrid Clouds continues to grow although most HPC organisations are not storing as large a percentage of their data in public Clouds as they anticipated even a year ago. Performance remains the top challenge, especially when handling mixed I/O workloads and resolving I/O bottlenecks. Given this, it's not surprising that 90 percent of those surveyed are using flash within their data centres, but what is notable is that the more storage experience a site has, the more likely they are to use flash to accelerate multiple tiers of storage rather than putting it all in one tier for one part of the workflow", added Laura Shepard. "Survey respondents also reaffirmed that data storage is the most strategic part of the data centre."

Source: DataDirect Networks - DDN
