
Primeur weekly 2014-11-17

Special

China's Tianhe-2 supercomputer retains top spot on fourth consecutive TOP500 List ...

TOP500 site gets new look and expands content ...

Focus

Netherlands Forensic Institute to crack crime using HPC and Big Data ...

Intel to support merging of HPC and data analytics on the hardware, software and application level ...

Exascale supercomputing

U.S. to build two flagship supercomputers for national labs ...

Department of Energy awards $425 million for next generation supercomputing technologies ...

The Cloud

Altair launches HyperWorks Unlimited - Virtual for Amazon Web Services ...

EuroFlash

SysFera-DS V5.0 released ...

Bright Computing appoints industry veteran Clement Lau as Director for Asia Pacific ...

Bright Cluster Manager now supports RHEL 7 and CentOS 7 ...

UK Laboratory celebrates 50 years of supercomputing ...

Groundbreaking Welsh research project seeks to substantially cut cyclist deaths ...

MEGWARE and Numascale extend their collaboration and co-exhibit at SC14 ...

IBM, Forschungszentrum Jülich and NVIDIA team to establish POWER Acceleration and Design Center ...

E4 Computer Engineering and Applied Micro Circuits Corporation will showcase the ARKA server RK003 at SC14 ...

USFlash

Indiana University showcases advanced computing tech at annual supercomputing conference ...

John West to head Strategic Initiatives at TACC ...

IDC finds growth, consolidation, and changing ownership patterns in worldwide data centre forecast ...

Supercomputers fuel global high-resolution climate models ...

100 supercomputers later, Los Alamos high-performance computing still supports national security mission ...

Asia Supercomputer Community to launch ASC15 at SC14 ...

ArrayFire is now open source ...

Mellanox announces availability of 100Gb/s Direct Attach Copper and Active Optical Cables ...

Mellanox provides leading application performance with FDR InfiniBand for HPC-purpose built Dell PowerEdge C4130 ...

Mellanox EDR 100Gb/s InfiniBand solutions chosen for leading CORAL supercomputer project ...

CoolIT Systems features full range of Direct Contact Liquid Cooling solutions for HPC servers and data centres at SC14 ...

Supercomputing invited plenary talks focus on high-performance computing at DreamWorks Animation and NIH's vision for solving future computational challenges in health care ...

Supercomputing beyond genealogy reveals surprising European ancestors ...

U.S. Department of Energy selects IBM "data centric" systems to advance research and tackle Big Data challenges ...

Oak Ridge to acquire next generation supercomputer ...

Ohio supercomputing experts to leverage conference presence ...

Intel to support merging of HPC and data analytics on the hardware, software and application level


2 Oct 2014 Heidelberg - At the ISC Big Data'14 Conference in Heidelberg, Primeur Magazine had the opportunity to interview Alan Priestley, Director of Big Data Analytics, and Stephan Gillich, Director of HPC and Workstations, both at Intel EMEA. According to Stephan Gillich, there is an overlap between High Performance Computing and Big Data Analytics, starting with the underlying technology. Hadoop is one of the key tools in data analytics, while in HPC the main instruments are clusters and servers; in practice the machines look pretty much the same, and Intel provides technology and products for those servers. There is also an overlap in the applications. In genomics, for example, there is a lot of data to analyze, but also a lot of simulation being done, which makes the field a good example of where data analytics and HPC meet. We can call this high performance data analytics.
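
To make that overlap concrete, here is a minimal sketch of a genomics-flavoured analysis - counting k-mers in sequencing reads - written in plain Python rather than Hadoop. The input file name and the value of k are illustrative assumptions, but the map and reduce structure is the same shape of work a Hadoop job or an HPC cluster run would spread over many nodes.

```python
# Minimal sketch: a map/reduce-style k-mer count, the kind of genomics
# analysis that fits both the Hadoop and the HPC cluster model.
# "reads.txt" (one read per line) and K = 8 are hypothetical choices.
from collections import Counter
from multiprocessing import Pool

K = 8  # k-mer length (illustrative)

def map_kmers(read):
    """Map step: count every k-mer in one sequencing read."""
    read = read.strip().upper()
    return Counter(read[i:i + K] for i in range(len(read) - K + 1))

def reduce_counts(partials):
    """Reduce step: merge the per-read counts into one table."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    with open("reads.txt") as f:
        reads = [line for line in f if line.strip()]
    with Pool() as pool:                       # the map phase runs in parallel,
        partials = pool.map(map_kmers, reads)  # as it would across cluster nodes
    print(reduce_counts(partials).most_common(10))
```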

Alan Priestley added that the HPC community has been processing lots of data and running analyses against it for a long time. What we are now starting to see is that many IT organisations and enterprises are faced with the challenge of new data types. In the past they simply brought traditional data into relational databases: they knew what they had, they stored it, they processed it, and they did business intelligence.

Now you see lots of new data types coming in - we have spoken about the Internet of Things and social media sources - and the IT organisations are faced with two challenges. The first is how to store the data; the second is how to monetize that data and get benefit from it for their enterprises.

The HPC community is used to processing such data. Many IT organisations are searching for ways to deal with these huge amounts of data and to translate them into value for their business.

Primeur Magazine addressed the topic of open source.

Alan Priestley said that much of the software used for Big Data analytics is open source. Many organisations are starting to pick up open source solutions such as Hadoop and use them internally; it costs them very little to start experimenting. A number of companies take the pure software from Apache, turn it into distributions and make those available to their customers, comparable to what happened earlier in the Linux community.

Intel is very supportive of the open source community: it has many contributors in the community and has contributed a lot of source code. Recently, Intel has done a lot of work around Hadoop running on Intel processors. This is not only about performance but also about security: Intel has been optimizing Hadoop to make it more secure, leveraging some of the hardware features that Intel has built up over the years.

Intel recently made an investment in Cloudera. Cloudera has integrated the Intel Hadoop features to make them available to a broader community.

Primeur Magazine also asked about the virtualization of the infrastructure.

Many people are taking the server and virtualizing it, Alan Priestley confirmed. Intel has a software-defined server, which is a precursor to the private Cloud: you can distribute the Cloud on the server. Intel is now bringing the same software-defined approach to networking and storage. Many networks in data centres have their routing largely predefined. With software-defined networking (SDN), the data plane and the control plane are deployed separately; the control plane is moved to another server that manages all the devices in the data centre.

The administrators can then manage the network, reconfigure it, and change its settings very quickly.
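
As an illustration of that split (a toy sketch, not actual Intel or OpenFlow code), in the example below the forwarding state lives in the switches while all routing decisions live in a separate controller object, so one change on the controller reprograms every device in the fabric:

```python
# Toy model of software-defined networking: the data plane (switches) only
# forwards according to installed rules; the control plane (controller) runs
# elsewhere and programs every switch. Class and rule names are invented.
class Switch:
    """Data plane: forwards traffic according to its flow table."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}              # destination prefix -> output port

    def install_rule(self, dst, port):
        self.flow_table[dst] = port

    def forward(self, dst):
        return self.flow_table.get(dst, "drop")

class Controller:
    """Control plane: a separate server that manages all devices centrally."""
    def __init__(self, switches):
        self.switches = switches

    def set_route(self, dst, port):
        for switch in self.switches:        # one policy change reconfigures
            switch.install_rule(dst, port)  # the whole fabric at once

fabric = [Switch("tor-1"), Switch("tor-2")]
controller = Controller(fabric)
controller.set_route("10.0.1.0/24", port=3)   # administrator acts centrally
print(fabric[0].forward("10.0.1.0/24"))       # -> 3
```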

Storage used to involve big monolithic blocks, but Intel is now heading for a flexible storage environment. The next level beyond that is software-defined infrastructure: Intel is moving towards an approach where the server is separated into compute, storage, and I/O modules, interconnected by a high-speed interconnect based on silicon photonics. That way it becomes possible to reconfigure the physical infrastructure as well as the software.

To deploy a workload, it is now possible to determine from the management console what server infrastructure is required, configure the data centre to meet that need, and deploy the workload into it.

Analytics requires big, high performance computers with lots of memory, but you do not want to run analytics all the time; sometimes the systems are idle. Being able to reconfigure the data centre in response to the needs of the moment provides flexibility and efficiency.
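
A rough sketch of that workflow, assuming nothing more than a simple shared resource pool (the class names and capacities are hypothetical, not an Intel management API): the console composes a logical server for the workload, the job runs, and the modules go back to the pool when the system would otherwise sit idle.

```python
# Hypothetical sketch of composable, software-defined infrastructure:
# disaggregated compute, storage and I/O modules sit in a shared pool and a
# management console composes logical servers out of them on demand.
from dataclasses import dataclass

@dataclass
class ResourcePool:
    cpus: int = 512           # compute modules
    memory_gb: int = 8192     # memory/storage modules
    nics: int = 64            # I/O modules

@dataclass
class LogicalServer:
    cpus: int
    memory_gb: int
    nics: int

class ManagementConsole:
    """Composes logical servers from the pool and returns them afterwards."""
    def __init__(self, pool):
        self.pool = pool

    def compose(self, cpus, memory_gb, nics):
        if (cpus > self.pool.cpus or memory_gb > self.pool.memory_gb
                or nics > self.pool.nics):
            raise RuntimeError("not enough free modules in the pool")
        self.pool.cpus -= cpus
        self.pool.memory_gb -= memory_gb
        self.pool.nics -= nics
        return LogicalServer(cpus, memory_gb, nics)

    def release(self, server):
        self.pool.cpus += server.cpus
        self.pool.memory_gb += server.memory_gb
        self.pool.nics += server.nics

console = ManagementConsole(ResourcePool())
node = console.compose(cpus=64, memory_gb=1024, nics=4)  # configure for the workload
# ... run the analytics job on the composed node ...
console.release(node)          # hand the modules back once the job is done
```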

The flexible infrastructure is an enabler for Cloud infrastructure in general, added Stephan Gillich, and it is useful for HPC as well. A large part of the market, called midmarket HPC, is looking for flexible solutions to support its high performance needs.

Many people today still use workstations for the evaluation and simulation parts of the product life cycle. If a Cloud-like flexible infrastructure is available, they can get access to higher performance systems in a flexible way, and they will be able to develop their products faster and better.

Intel is trying to support this evolution from many angles: from the hardware side with high performance technology and products, but also from the software side with Intel's efforts in the open source space.

If you have a flexible infrastructure, you still have applications running on top of it. The hardware is becoming more and more parallel, but the application software needs to keep up with that. Intel has started an initiative, the Intel Parallel Computing Centres, where Intel works with the owners of so-called community codes - used by research but also by industry in many fields - to make these applications really capable of exploiting the parallelism. This is why we call them parallelization labs. Intel needs to work with the software industry to achieve this.
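
As a toy illustration of what making an application capable of using the parallelism means in practice (a generic sketch, not code from an Intel Parallel Computing Centre), a serial numerical loop can be split into independent chunks that run on all cores at once:

```python
# Generic sketch of exposing parallelism in a numerical kernel: the serial
# loop is split into independent chunks and run on all cores.
from concurrent.futures import ProcessPoolExecutor
import math

def partial_integral(bounds):
    """Trapezoid rule for f(x) = sin(x) over one sub-interval."""
    a, b, steps = bounds
    h = (b - a) / steps
    s = 0.5 * (math.sin(a) + math.sin(b))
    for i in range(1, steps):
        s += math.sin(a + i * h)
    return s * h

if __name__ == "__main__":
    a, b, steps, chunks = 0.0, math.pi, 1_000_000, 8
    width = (b - a) / chunks
    tasks = [(a + i * width, a + (i + 1) * width, steps // chunks)
             for i in range(chunks)]
    with ProcessPoolExecutor() as executor:   # the chunks run concurrently
        total = sum(executor.map(partial_integral, tasks))
    print(total)    # close to 2.0, the integral of sin(x) over [0, pi]
```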

Ad Emmen
