- May 24, 2022
Nvidia has announced its new plan for reducing the energy use of data centers crunching massive amounts of data or training AI models: liquid-cooled graphics cards. The company announced at Computex that it's introducing a liquid-cooled version of its A100 compute card, which it says consumes 30 percent less power than the air-cooled version. Nvidia's also pledging that this isn't just a one-off: it's already got more liquid-cooled server cards on its roadmap, and it hints at bringing the tech to other applications, like in-car systems that need to stay cool in enclosed spaces. Of course, Tesla's recent recall for overheating chips shows how tricky that can be, even with liquid cooling.
According to Nvidia, reducing the energy needed to perform complex computations could make a big impact — the company says data centers use over one percent of the world's electricity, and 40 percent of that goes to cooling. Cutting that by almost a third would be a big deal, though it's worth noting that graphics cards are only one part of the equation; CPUs, storage, and networking equipment also draw power and need cooling. Nvidia's broader claim is that, with liquid cooling, GPU-accelerated systems would be far more efficient than CPU-only servers on AI and other high-performance workloads.
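To put those percentages in perspective, here is a minimal back-of-the-envelope sketch using only the figures quoted above (one percent of world electricity for data centers, 40 percent of that for cooling, and Nvidia's claimed 30 percent reduction). The inputs are rough, rounded values from the article, not measured data.

```python
# Rough arithmetic from the article's quoted figures; all inputs are
# approximate shares, not measurements.
datacenter_share = 0.01  # data centers: ~1% of world electricity
cooling_share = 0.40     # cooling: ~40% of data center power
reduction = 0.30         # Nvidia's claimed 30% power saving

# Cooling's share of total world electricity
cooling_fraction_of_world = datacenter_share * cooling_share

# How much of world electricity a 30% cut in cooling would save
savings_fraction_of_world = cooling_fraction_of_world * reduction

print(f"Cooling: ~{cooling_fraction_of_world:.2%} of world electricity")
print(f"Potential savings: ~{savings_fraction_of_world:.2%} of world electricity")
```

In other words, even under these optimistic assumptions, the headline 30 percent figure translates to roughly a tenth of a percent of global electricity use — meaningful at that scale, but far from a wholesale fix.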