The system to be cooled is housed at the NRC's radio astronomy facility near Penticton, B.C. The sealed container functions as a Faraday cage, preventing leakage of electromagnetic energy that would contaminate the telescope's observations. The CHIME collaboration realised early in its planning process that cooling the custom GPU-intensive servers with traditional air conditioning would be difficult and costly, and began exploring liquid-cooled solutions.
In order to complete its primary cosmological mission, mapping out the largest volume of space ever attempted in a survey, CHIME requires a powerful signal processing back-end capable of sustaining real-time correlation of high-cadence radio data. Given the scale of the telescope, with 400 MHz of bandwidth and 2,048 receiving elements, this requires ~8×10¹⁵ integer operations per second (~8 Pop/s) operating 24/7 on a 6.4 Tb/s input stream. All nodes must be able to operate in high ambient temperatures: up to 45 °C for extended periods of time.
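The quoted figures follow from the telescope's scale. A rough sketch of the arithmetic, assuming 4-bit complex sampling and roughly eight real operations per complex multiply-accumulate (assumptions not stated in the source, but typical for FX correlators):

```python
# Back-of-envelope estimate of CHIME's correlator load.
# Assumed (not from the source): 4-bit real + 4-bit imaginary samples,
# and ~8 real operations per complex multiply-accumulate (CMAC).
N_ELEMENTS = 2048        # receiving elements (from the source)
BANDWIDTH_HZ = 400e6     # 400 MHz of bandwidth (from the source)
BITS_PER_SAMPLE = 8      # assumed 4-bit complex samples
OPS_PER_CMAC = 8         # assumed: 4 multiplies + 4 additions

# Input data rate: one complex sample per element per Hz of bandwidth.
input_rate_bps = N_ELEMENTS * BANDWIDTH_HZ * BITS_PER_SAMPLE
print(f"Input stream: ~{input_rate_bps / 1e12:.1f} Tb/s")

# Full correlation: one CMAC per baseline (element pair, including
# autocorrelations) per frequency-time sample.
n_baselines = N_ELEMENTS * (N_ELEMENTS + 1) // 2
ops_per_second = n_baselines * BANDWIDTH_HZ * OPS_PER_CMAC
print(f"Correlation cost: ~{ops_per_second / 1e15:.1f} Pop/s")
```

Under these assumptions the estimate lands at roughly 6.6 Tb/s of input and ~7×10¹⁵ operations per second, consistent in order of magnitude with the 6.4 Tb/s and ~8 Pop/s quoted above.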
CoolIT Systems' custom Rack DCLC implementation will provide a net cooling effect on room temperature. The liquid-cooled system will consist of 256 rack-mounted General Technics GT0180 custom 4U servers housed in 26 racks managed by CoolIT Systems Rack DCLC CHx40 Heat Exchanger Modules. The custom direct-contact cooling loops will manage 100% of the heat generated by each node's single Intel Xeon E5-2620 v3 CPU and dual AMD FirePro S9300 x2 GPUs, while simultaneously pulling heat from the ambient air into the liquid coolant loops.
"We chose to work with CoolIT Systems because their solutions are modular and robust, and as a result the most flexible and efficient for our situation", stated Keith Vanderlinde, Assistant Professor at the University of Toronto. "With the custom liquid cooling solution, we can drastically reduce CHIME's energy consumption and squeeze additional processing out of the GPUs."
"Our success with CHIME proves that our solutions are truly versatile, and that diverse HPC scenarios can increase their efficiency through the advantage of liquid cooling systems", commented CoolIT Systems CEO Geoff Lyon about the company's first experience cooling the computing system for a radio telescope. "We look forward to similarly unique installations in the future, and to continuing to work on less-traditional HPC cooling challenges", he concluded.
The liquid cooling installation's estimated completion date is early 2017.