10 Nov 2016, Sunnyvale, Yokneam - Mellanox Technologies Ltd., a supplier of high-performance, end-to-end interconnect solutions for data center servers and storage systems, has introduced the world's first 200Gb/s data center interconnect solutions. Mellanox ConnectX-6 adapters, Quantum switches, and LinkX cables and transceivers together provide a complete 200Gb/s HDR InfiniBand interconnect infrastructure for the next generation of high performance computing, machine learning, Big Data, Cloud, Web 2.0 and storage platforms. These 200Gb/s HDR InfiniBand solutions maintain Mellanox's generation-ahead leadership while enabling customers and users to leverage an open, standards-based technology that maximizes application performance and scalability while minimizing overall data center total cost of ownership. Mellanox 200Gb/s HDR solutions will become generally available in 2017.
"The ability to effectively utilize the exponential growth of data and to leverage data insights to gain that competitive advantage in real time is key for business success, homeland security, technology innovation, new research capabilities and beyond. The network is a critical enabler in today's system designs that will propel the most demanding applications and drive the next life-changing discoveries", stated Eyal Waldman, president and CEO of Mellanox Technologies. "Mellanox is proud to announce the new 200Gb/s HDR InfiniBand solutions that will deliver the world's highest data speeds and intelligent interconnect and empower the world of data in which we live. HDR InfiniBand sets new performance and scalability records while delivering the next generation of interconnect needs to our customers and partners."
"Ten years ago, when Intersect360 Research began its business tracking the HPC market, InfiniBand had just become the predominant high-performance interconnect option for clusters, with Mellanox as the leading provider", stated Addison Snell, CEO of Intersect360 Research. "Over time, InfiniBand continued to grow, and today it is the leading high-performance storage interconnect for HPC systems as well. This is at a time when high data rate applications like analytics and machine learning are expanding rapidly, increasing the need for high-bandwidth, low-latency interconnects into even more markets. HDR InfiniBand is a big leap forward and Mellanox is making it a reality at a great time."
"The leadership scale science and data analytics problems we are working to solve today and in the near future require very high bandwidth linking compute nodes, storage, and analytics systems into a single problem solving environment", stated Arthur Bland, OLCF Project Director, Oak Ridge National Laboratory. "With HDR InfiniBand technology, we will have an open solution that allows us to link all of our systems at very high bandwidth."
"Data movement throughout the system is a critical aspect of current and future systems. Open network technology will be a key consideration as we plan the next generation of large-scale systems, including ones that will achieve Exascale performance", stated Bronis de Supinski, chief technology officer for Livermore Computing. "HDR InfiniBand solutions represent an important development in this technology space."
"We are excited to see Mellanox continue its leadership in high speed interconnects", stated Parks Fields, SSI team lead, HPC-Design, at Los Alamos National Laboratory. "HDR InfiniBand will provide us with the performance capabilities needed for our applications."
"High-speed storage is critical to maximizing the performance benefits of today's HPC, machine learning, media production and Big Data solutions", stated Kurt Kuckein, director of product management, DDN Storage. "DDN and Mellanox HDR 200Gb/s technology will enable unmatched performance in high-performance storage solutions for our end-customers that demand the ultimate in performance for their real-time workloads."
"Whether it's high performance computing, Big Data or Cloud, Mellanox and Dell EMC HPC Systems customers will benefit from the extreme performance, scalability and first to market speed advantage of our joint end-to-end solutions", stated Jim Ganthier, senior vice president, Validated Solutions Organization and HPC, Dell EMC. "Our collaborative innovation with Mellanox helps customers accelerate time to insights and results, utilizing an open standards-based approach and enabling their next discoveries."
"Fabrics are key to high performance clusters", stated Scott Misage, vice president, HPC Solutions and Apollo Pursuits, Hewlett Packard Enterprise. "Mellanox 200Gb HDR products will help our joint customers take full advantage of the scalability of HPE's purpose-built Apollo HPC solutions, maximizing overall application efficiency for their High Performance Computing workloads."
"Mellanox is not only an innovator for networking solutions but an advocate for improving data center ROI", stated Mr. Qiu Long, president of the Huawei Server Product Line. "With the introduction of this new 200Gb/s HDR solution, high performance computing and many other demanding applications can forge ahead."
"Mellanox is advancing the bandwidth, latency, and programmability of fabrics with 200Gb HDR InfiniBand solutions for the OpenPOWER ecosystem, and we are looking forward to integrating HDR InfiniBand into the OpenPOWER technology portfolio", stated Brad McCredie, vice president and IBM Fellow, Systems and Technology Group CTO, IBM Systems. "The OpenPOWER ecosystem incorporates the best of new technologies through collaborative innovation, and we're excited to see how ConnectX-6 and Quantum will push performance to the next level."
"Mellanox has taken a quantum leap forward in data center networking with InfiniBand solutions that now provide world-class performance of 200 million messages per second", stated Mr. Leijun Hu, VP of Inspur Group. "In addition, the new Mellanox Quantum 200Gb/s HDR InfiniBand switches now represent the world's fastest, most flexible switch with an extremely low latency of 90ns."
"Demanding HPC workloads, such as Artificial Intelligence, require extremely high bandwidth for enormous amounts of data crunching. HDR InfiniBand will be an increasingly important technology for the modern data center. Mellanox intelligent interconnect solutions are the foundation for many of our market-leading HPC solutions, from big to small; we're excited to deliver the advantages of HDR to a broader set of HPC clients running exceptionally challenging workloads", stated Scott Tease, Executive Director, High Performance Computing, Lenovo Data Center Group.
"ConnectX-6 significantly improves bandwidth to NVIDIA GPUs resulting in better scale out solutions for HPC, deep learning, and data center applications", stated Dr. Ian Buck, Vice President of the Accelerated Computing Group at NVIDIA. "With integrated support for NVIDIA GPUDirect technology, Mellanox interconnect and NVIDIA's high performance Tesla GPUs will enable direct data transfers across clusters of GPUs, essential to addressing complex and computationally intensive challenges in very diverse markets."
"Our customers are constantly looking towards the next cutting edge infrastructure that gives them the competitive advantage", stated Ken Claffey, VP and GM, Seagate Cloud Systems and Silicon Group. "Seagate couldn't be more excited to embrace Mellanox's HDR 200Gb/s capabilities that will deliver unmatched storage platforms for network-intense applications like media streaming and compute clustering."
"We are thrilled to see Mellanox's newest solutions that literally double data speeds from the previous generation", stated Mr. Chaoqun Sha, SVP of Technology at Sugon. "These new solutions are not only ideal for both InfiniBand and the Ethernet standards-based protocols, but also give customers the flexibility to take advantage of Mellanox's innovative Multi-Host technology."
The ConnectX-6 adapters offer single- and dual-port 200Gb/s Virtual Protocol Interconnect options, doubling the data speed of the previous generation. The adapters support both the InfiniBand and Ethernet standard protocols, and provide the flexibility to connect with any compute architecture - x86, GPU, POWER, ARM, FPGA and more. With world-class performance of 200 million messages per second, ultra-low latency of 0.6 microseconds, in-network computing engines such as MPI-Direct, RDMA, GPU-Direct, SR-IOV and data encryption, as well as the innovative Mellanox Multi-Host technology, ConnectX-6 will enable the most efficient compute and storage platforms in the industry.
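As a back-of-the-envelope check (our own arithmetic, not a figure from the announcement), the quoted 200Gb/s line rate and 200 million messages per second together imply a per-message wire budget, which shows the peak message rate is a small-message figure:

```python
# Sanity check of the ConnectX-6 figures quoted above. The line rate and
# message rate come from the announcement; the derived per-message wire
# budget is our own arithmetic, not a Mellanox specification.

LINE_RATE_BPS = 200e9    # 200 Gb/s HDR InfiniBand line rate
MSG_RATE_PER_S = 200e6   # 200 million messages per second

bits_per_message = LINE_RATE_BPS / MSG_RATE_PER_S
bytes_per_message = bits_per_message / 8

# At the full message rate, each message can occupy at most ~125 bytes on
# the wire, i.e. the peak rate applies to small RDMA sends/writes.
print(f"wire budget per message: {bytes_per_message:.0f} bytes")
```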
The Quantum 200Gb/s HDR InfiniBand switch is the world's fastest switch, supporting 40 ports of 200Gb/s InfiniBand or 80 ports of 100Gb/s InfiniBand connectivity for a total of 16Tb/s of switching capacity, with an extremely low latency of 90ns. Mellanox Quantum advances the support of in-network computing technology, delivers optimized and flexible routing engines, and is the most scalable switch IC available. The Mellanox Quantum IC will be the building block for multiple switch systems - from 40 ports of 200Gb/s or 80 ports of 100Gb/s for top-of-rack solutions, to 800 ports of 200Gb/s and 1,600 ports of 100Gb/s in modular switch systems.
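The switch figures above are internally consistent, which a short sketch can confirm. The port counts and per-port rates are from the announcement; the two-tier (leaf/spine) sizing is standard non-blocking fat-tree arithmetic, not a Mellanox datasheet:

```python
# Sanity check of the Mellanox Quantum switch figures quoted above.
# Port count and per-port rate are from the announcement; the rest is
# standard fabric arithmetic.

PORTS_HDR = 40          # 200 Gb/s ports per Quantum switch IC
RATE_HDR_TBPS = 0.2     # 200 Gb/s per port, in Tb/s

# Aggregate switching capacity: 40 ports x 200 Gb/s x 2 (full duplex).
capacity_tbps = PORTS_HDR * RATE_HDR_TBPS * 2
print(f"switching capacity: {capacity_tbps:.0f} Tb/s")

# A non-blocking two-tier fabric built from radix-r switches exposes
# r * r / 2 end ports: r leaf switches, each with r/2 host-facing ports.
r = PORTS_HDR
modular_ports_hdr = r * r // 2
print(f"max 200Gb/s ports in a two-tier system: {modular_ports_hdr}")

# Splitting each 200 Gb/s port into 2 x 100 Gb/s doubles the port count.
print(f"max 100Gb/s ports: {modular_ports_hdr * 2}")
```

The result matches the announced 16Tb/s capacity and the 800-port (200Gb/s) / 1,600-port (100Gb/s) modular system sizes.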
To complete the end-to-end 200Gb/s InfiniBand infrastructure, Mellanox LinkX solutions will offer a family of 200Gb/s copper and Silicon Photonics fiber cables.