Innovative ‘direct cooling’ systems lower processing temperatures and support smart energy-efficiency strategies for the world’s data centers
Sustainability is crucial in IT, particularly for compute-intensive AI and GenAI workloads, because of the significant energy they consume and the heat they generate. This high energy demand strains energy availability and efficiency, making sustainable approaches to cooling and energy use essential.
Traditional air-conditioning-based data center systems require far more electricity than liquid cooling. Air is a poor heat conductor, which makes air cooling less effective at heat exchange. Direct cooling instead exploits the superior thermal properties of liquid, reducing temperatures across processing units more efficiently while cutting energy consumption and cost.
Innovative cooling technologies, such as Hewlett Packard Enterprise’s (HPE) Direct Liquid Cooling (DLC) systems, offer a sustainable solution by efficiently reducing processing temperatures and operational costs. Adopting such technologies minimizes utility costs, lowers carbon emissions, and supports the stable operation of AI-ready data centers. HPE has recently announced significant advancements in its Direct Liquid Cooling technology.

In October 2024, HPE introduced the industry’s first 100% fanless direct liquid cooling system architecture. This innovation enhances the energy and cost efficiency of large-scale AI deployments by delivering 20% more performance per kilowatt, resulting in an 87% carbon reduction and 86% cost savings compared to traditional air-cooled systems. The architecture includes an 8-element cooling design covering all critical components, such as the GPU, CPU, and network fabric.
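As a back-of-the-envelope illustration of what “20% more performance per kilowatt” can mean for an energy bill, the sketch below compares a hypothetical air-cooled rack with a fanless DLC equivalent doing the same work. The rack power, electricity price, and continuous utilization are assumptions for illustration only, not HPE figures.

```python
# Illustrative arithmetic only: the 20% figure comes from the article,
# but the 40 kW rack and $0.15/kWh electricity price are hypothetical.

def annual_energy_cost(power_kw: float, price_per_kwh: float = 0.15) -> float:
    """Cost of running a constant load for one year (8,760 hours)."""
    hours_per_year = 24 * 365
    return power_kw * hours_per_year * price_per_kwh

# Hypothetical air-cooled rack drawing a constant 40 kW.
air_cooled_cost = annual_energy_cost(40)

# 20% more performance per kilowatt means the same workload needs
# 1 / 1.2 of the power, all other factors held equal.
dlc_power_kw = 40 / 1.2
dlc_cost = annual_energy_cost(dlc_power_kw)

print(f"Air-cooled:  ${air_cooled_cost:,.0f}/year")
print(f"Fanless DLC: ${dlc_cost:,.0f}/year")
```

Note that this captures only the performance-per-kilowatt effect; the article’s larger cost and carbon figures also reflect removing fans and reducing facility-level cooling overhead.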
In November 2024, HPE expanded its direct liquid-cooled supercomputing solutions, introducing two new AI systems designed for large language model training, natural language processing (NLP), and multi-modal model training. These new offerings are part of HPE’s leadership-class HPC portfolio and are geared towards service providers and large enterprises looking to fast-track AI system deployment. While 100% Direct Liquid Cooling is the sustainability norm for the world’s fastest supercomputers, two liquid cooling methods are key for enterprise use:
- The liquid-to-air cooling method uses liquid coolant to absorb heat from the air, either at the inlet or the exhaust of the server systems. One common approach is the Rear Door Heat Exchanger (RDHX), which cools the exhaust air with chilled liquid at the back of the server rack. Another uses systems such as HPE Adaptive Rack Containment (ARC), which cools the inlet air of the servers: the heated air passes through a heat exchanger, and the warmed fluid is then cooled by the facility’s water system.
- The 70% Direct Liquid Cooling (DLC) method uses cold plates on GPUs and CPUs to extract about 70% of the heat generated by the server. The remaining 30% is removed by fans, which typically run at lower speeds than in traditional air-cooled systems. The coolant from the servers passes through a heat exchanger that transfers its heat to the facility water without mixing the primary and secondary side fluids.
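The 70/30 split described above can be sketched with basic thermodynamics. The example below estimates the heat carried by each path and the water flow rate the cold-plate loop would need for a given coolant temperature rise; the 10 kW server power and 10 K temperature rise are generic assumptions for illustration, not HPE specifications.

```python
# Back-of-the-envelope sketch of the 70% DLC heat split.
# Server power and coolant temperature rise are hypothetical values.

WATER_CP = 4186.0  # specific heat of water, J/(kg*K)

def dlc_heat_split(server_power_w: float, liquid_fraction: float = 0.70):
    """Split server heat between the cold-plate liquid loop and the fans."""
    to_liquid = server_power_w * liquid_fraction
    to_air = server_power_w - to_liquid
    return to_liquid, to_air

def coolant_flow_kg_s(heat_w: float, delta_t_k: float) -> float:
    """Mass flow needed to carry heat_w with a delta_t_k coolant temperature rise."""
    return heat_w / (WATER_CP * delta_t_k)

to_liquid, to_air = dlc_heat_split(10_000)          # hypothetical 10 kW server
flow = coolant_flow_kg_s(to_liquid, delta_t_k=10)   # roughly 0.17 kg/s

print(f"Liquid loop: {to_liquid:.0f} W, fans: {to_air:.0f} W, "
      f"flow about {flow * 60:.1f} L/min")
```

Since 1 kg of water is about 1 litre, the mass flow converts directly to roughly 10 L/min here, which shows why modest flow rates can move heat loads that would require large volumes of air.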
HPE’s direct cooling technology now runs on four of the top ten systems on the Green500, the list that ranks the world’s most energy-efficient supercomputers among the TOP500.
This underscores HPE’s leadership in direct liquid cooling technologies, which are essential for enterprise use of modern AI, GenAI, and emerging agentic AI workloads, offering enhanced performance, efficiency, and sustainability.
This is a contribution by Stefan De Schuyter, HPE BeLux Country CT. The editors are not responsible for the content.