"The increasing complexity of science and engineering research at Purdue is driving a need for increasingly faster and scalable computational resources", stated Michael Shuey, HPC system manager at Purdue University. "Mellanox's FDR InfiniBand solutions, and in particular their Connect-IB adapters, allow MPI codes to scale more readily than our previous systems. This enables more detailed simulations, and helps empower Purdue scientists to push the envelope on their research in weather, bioscience, materials engineering and more."
"We are pleased to have Mellanox's FDR InfiniBand solution as the interconnect of choice for Purdue's Conte supercomputer, the nation's fastest university-owned supercomputer", stated Gilad Shainer, vice president of marketing at Mellanox Technologies. "Utilizing Mellanox's Connect-IB adapters, Purdue is able to take advantage of the adapter's leading message rate and bandwidth performance to provide its scientists with unmatched performance and capabilities to enhance and accelerate their highly complex simulation modelling."
Connect-IB is the world's most scalable server and storage adapter solution for High-Performance Computing (HPC), Web 2.0, Cloud, Big Data, financial services, virtualized data centres and storage environments. Connect-IB adapters deliver industry-leading throughput of 100Gb/s utilizing PCI Express 3.0 x16, unmatched scaling with innovative transport services, sub-microsecond latency and 137 million messages per second - a 4x higher message rate than competing solutions.
Available today, Mellanox's FDR 56Gb/s InfiniBand solution includes Connect-IB adapter cards, SwitchX-2-based switches ranging from 12 to 648 ports, fiber and copper cables, and ScalableHPC accelerator and management software.