Nvidia goes (even more) green with new liquid-cooled GPUs

[Image: Nvidia A100 (Image credit: Nvidia)]

Nvidia has lifted the lid on a new line of liquid-cooled A100 and H100 GPUs that promise to bring greater energy efficiency to the data center.

Announced at Computex 2022, the new configurations are billed as “the next step in accelerated computing for Nvidia GPUs”, marking the first time the company has offered direct-to-chip liquid cooling.

Liquid-cooled A100 GPUs will be available in a few months’ time in a PCIe card format and will feature inside the HGX A100 server. The new H100 card, meanwhile, will be available in the HGX H100 server from early next year.

The cooling conundrum

Traditionally, data center operators - from large enterprises to cloud vendors - have relied on air conditioning to keep servers and other equipment from overheating.

However, chilling the air inside a data center is both inefficient and expensive. This is especially true for facilities in tropical climates, such as Hong Kong or Singapore, where operators are locked in a never-ending battle with the environment.

With organizations placing greater emphasis than ever on sustainability, attention has turned to identifying methods of cooling data centers more effectively, without compromising on performance.

“Data center operators aim to eliminate chillers that evaporate millions of gallons of water a year to cool the air inside data centers. Liquid cooling promises systems that recycle small amounts of fluids in closed systems focused on key hot spots,” Nvidia explained.

“We plan to support liquid cooling in our high-performance data center GPUs and our NVIDIA HGX platforms for the foreseeable future.”

In testing, the new liquid-cooled A100 cards were able to execute identical workloads to their air-cooled counterparts while using roughly 30% less energy. By Nvidia’s calculations, switching out CPU-only servers running AI and HPC workloads for GPU-accelerated systems worldwide could save up to 11 trillion watt-hours of energy.

Another benefit is that the liquid-cooled GPUs occupy just one PCIe slot in a server, whereas their air-cooled counterparts take up two, allowing operators to pack more compute into the same space.

Nvidia says that upwards of a dozen server manufacturers - from ASUS to Gigabyte, Supermicro and more - intend to integrate the new cards into their products later this year, with the first systems to hit the market in Q3.
