Data center operators are switching from air cooling to liquid cooling as rack densities increase
The rising workloads from artificial intelligence are pushing traditional air cooling systems to their limits. To address this challenge, both chipmakers and data center operators are turning to a more efficient solution: liquid cooling.
Almost all data center operators plan to use liquid cooling. For example, Digital Realty has launched a high-density colocation service with liquid cooling, handling workloads of up to 70 kilowatts (kW) per rack.
As rack densities are expected to exceed 70 kW, the only viable way to cool these high-performance servers is liquid cooling – mainly direct-to-chip cooling, similar to that used in high-end gaming PCs, or immersion cooling, which poses greater environmental challenges and requires more maintenance.
In data centers, we have reached a point where rack densities have surpassed what is possible with air cooling. Liquid cooling is an inevitable development to handle the increased workloads of processing units.
For data center operators under pressure to improve their power usage effectiveness (PUE), the efficiency gains from liquid cooling offer a significant advantage: less energy is needed to remove the same amount of heat, leading to substantial cost savings.
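To make the PUE comparison concrete, here is a minimal sketch of the metric (total facility power divided by IT equipment power); the kilowatt figures below are hypothetical, chosen only to illustrate how a smaller cooling overhead improves the ratio.

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power.

    A PUE of 1.0 would mean every watt drawn by the facility
    reaches the IT equipment, with zero cooling/overhead losses.
    """
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Hypothetical air-cooled hall: 1,000 kW IT load + 500 kW overhead
print(pue(1500, 1000))  # 1.5

# Same IT load, liquid-cooled, overhead cut to 200 kW (assumed figure)
print(pue(1200, 1000))  # 1.2
```

The lower the overhead power spent moving heat, the closer PUE gets to 1.0, which is why cooling efficiency dominates the metric.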
The benefits of liquid cooling
But the benefits of liquid cooling go beyond energy and cost savings. A liquid cooling installation eliminates the need for mechanical air cooling equipment, freeing up valuable floor space for additional data hall capacity.
Another advantage, in the case of an immersion bath, is the lower floor-to-floor height requirement of four meters, similar to a grade A office space. This significant reduction, compared to the standard six meters needed for most existing data centers, comes with the trade-off of needing increased structural floor loading.
If the same amount of computing power is installed within the same floor area, the structural floor loading increases from the typical 12 to 15 kilopascals (kPa) to at least 20 kPa in a liquid cooling installation.
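The floor-loading figures above follow from simple weight-over-area arithmetic. This sketch converts an equipment weight and footprint into kPa; the tank weight and footprint are illustrative assumptions, not figures from the article.

```python
def floor_load_kpa(total_weight_kg: float, area_m2: float) -> float:
    """Uniform floor load in kilopascals (1 kPa = 1 kN/m^2)."""
    g = 9.81  # gravitational acceleration, m/s^2
    force_n = total_weight_kg * g        # weight in newtons
    return force_n / area_m2 / 1000.0    # N/m^2 -> kPa

# Hypothetical immersion tank: 1,600 kg of coolant and servers
# concentrated on a 0.8 m^2 footprint
print(round(floor_load_kpa(1600, 0.8), 1))  # ~19.6 kPa
```

Because the coolant itself is heavy and the equipment sits on a smaller footprint, immersion installations concentrate mass and push loads toward or past the 20 kPa mark.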
Despite the clear benefits of liquid cooling, switching cooling technology is a major undertaking for operators, potentially requiring an overhaul of the existing infrastructure and design.
One of the key design changes involves plumbing installations. Traditionally, we’ve put a lot of effort into keeping water out of the data halls, using pre-action sprinklers or water detection systems. But in direct-to-chip cooling, small pipes are needed to deliver the coolant directly to the chips for heat removal.
This means that new facilities, purpose-built around the high-performance data processing units needed for AI, are better positioned to adopt liquid cooling. While existing data centers with spare capacity can be partially upgraded, it's unlikely that an existing facility can be fully converted to AI workloads due to infrastructure constraints.
Any major change in a live data center environment is a time of high risk and must be carefully managed. Operators need to consider how, from a resiliency standpoint, these works may result in service disruptions in other data halls.
The data center industry generally aims to be leading-edge, not bleeding-edge. Operators are often slow to adopt change and typically favor proven solutions. But advancements in AI and the wider technology industry are forcing data centers to adapt.
Data center companies are figuring out how best to deploy AI, whether in smaller, older buildings or in the design of their future facilities.