InfiniBand ranks first among high-speed interconnects on the TOP500 list, connecting 3 to 7.5 times more of the list's new supercomputers than proprietary networks. InfiniBand remains the interconnect of choice for HPC and AI infrastructures.
The TOP500 list has evolved in recent years to include more hyperscale, cloud, and enterprise platforms alongside high-performance computing and machine learning systems. Nearly half of the systems on the November 2018 list can be categorized as non-HPC application platforms; most of these represent US, Chinese, and other hyperscale infrastructures and are interconnected with Ethernet. Mellanox Ethernet solutions connect 130 systems, or 51% of the Ethernet-connected systems on the list.
"Mellanox InfiniBand and Ethernet solutions now connect the majority of systems on the TOP500 list, an increase of 38 percent over the last twelve-month period. InfiniBand In-Network Computing acceleration engines provide the highest performance and scalability for HPC and AI applications, and accelerate the top three supercomputers in the world. InfiniBand enables record performance in HPC and AI, advancing the academic and scientific research that is reshaping our world. We continue to win new opportunities and are proud to have deployed the first HDR InfiniBand supercomputer at the University of Michigan. We expect to see more HDR InfiniBand-connected platforms this year," stated Eyal Waldman, president and CEO of Mellanox Technologies.
"For the first time, Mellanox's Ethernet solutions connect the majority of the Ethernet-based platforms on the TOP500 list, demonstrating the growing adoption of our 25 gigabit per second and faster Ethernet adapters, switches and cables for hyperscale, cloud and enterprise infrastructures. The innovations built into our InfiniBand and Ethernet solutions deliver the highest return on investment for compute and storage infrastructures, and enable the next generation of the world's leading supercomputers, hyperscale and enterprise data centres. We have already started planning the NDR 400 gigabit InfiniBand technology that will empower future Exascale supercomputing and machine learning platforms."
Key takeaways from the November 2018 list include: