BY XIAOZHI LIM
Data centres are energy guzzlers – and they’re growing hungrier due to the rise in internet traffic, demand for cloud-based computing and AI. Researchers are trying to find solutions, fast. XiaoZhi Lim reports
In 2020, data centres consumed some 300TWh, or 1.5% of the world’s electricity (Nature, 2025, DOI: 10.1038/s41586-025-08832-3). But not all that electricity is used for powering computer chips. The ratio of the total energy a data centre consumes to the energy drawn by its computing equipment is a key sustainability metric called the power usage effectiveness (PUE). In 2022, the average global PUE was around 1.55, and much of the additional energy consumed is used to keep servers cool. Data centres are ‘huge heat generation factories,’ says Mohammad Azarifar of Auburn University, US. By one estimate, 99% of the electricity that chips consume ultimately turns into waste heat, all of which needs to be dissipated (Appl. Therm. Eng., 2024, DOI: 10.1016/j.applthermaleng.2024.123112).
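PUE is a simple ratio, so the figures above can be turned into a quick back-of-envelope sketch. The facility size below is purely illustrative, not taken from any real data centre:

```python
# Power usage effectiveness (PUE): total facility energy / IT equipment energy.
# The 1.55 figure is the 2022 global average quoted in the text;
# the 10 GWh IT load is an illustrative assumption.

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

it_energy = 10_000_000            # kWh drawn by the servers themselves
total = it_energy * 1.55          # total facility consumption at PUE 1.55
overhead = total - it_energy      # cooling, power conversion, lighting...

print(f"PUE = {pue(total, it_energy):.2f}")        # 1.55
print(f"Overhead = {overhead:,.0f} kWh")           # 5,500,000 kWh, much of it cooling
```

At the global average, a little over a third of the facility's total draw goes to something other than computing.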
Currently, most data centres are cooled with air-conditioning. In tropical countries such as Singapore, cooling can take up to 40% of the energy consumed, says Ruban Whenish, who manages the Sustainable Tropical Data Centre Testbed (STDCT) at the National University of Singapore (NUS). However, AC is no longer sufficient to keep data centres cool.
‘Air has very low thermal conductivity so it’s facing a limit,’ says Renkun Chen, from the University of California San Diego, US (UCSD). With chip manufacturers squeezing more transistors into their chips and data centre operators packing more servers into their racks, the amount of heat generated is fast outpacing the speed at which cool air can remove it. AC is also becoming less effective as ambient temperatures rise with a warming climate.
Liquid coolants can pick up much more heat from a hot surface than air. While performances vary, liquid-cooling techniques can handle today’s electronics, which still generate less than 100W/cm² of heat. But researchers expect chips to grow hotter as they run faster. And besides the servers in data centres, other high-powered electronics such as lasers or batteries will need to stay cool in a warming world too.
Cooling hotspots
Singapore, with more than 70 data centres and daily temperatures averaging 31-33°C, has been researching better ways to keep data centres cool. Housed behind the natural history museum at NUS, the STDCT is a working data centre that hosts the university’s cloud services and is also a testbed for research into emerging cooling techniques. In 2023, the STDCT launched with a target to find solutions, or combinations of solutions, that would reduce the data centre PUE to below 1.2.
A top priority is to find ways to safely raise the temperature of the cooled supply air to 26°C or above, because every 1°C rise in the air-conditioning set point could save a data centre 2-5% of its total power consumption. But doing so risks overheating the electronic equipment. Besides degrading the servers, overheating slows chip performance; and if heat cannot be dissipated quickly enough, that also limits how closely servers can be packed together in a data centre, necessitating a larger footprint.
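The per-degree savings compound over several degrees. A rough sketch, assuming the 2-5% per-degree saving quoted above applies multiplicatively and taking an illustrative 4°C rise in set point:

```python
# Rough estimate of power saved by raising the AC set point.
# Assumes each 1°C rise saves a fixed fraction (2-5%, per the text)
# of total power, applied multiplicatively; purely illustrative.

def fraction_saved(degrees_raised, saving_per_degree):
    return 1 - (1 - saving_per_degree) ** degrees_raised

for rate in (0.02, 0.05):
    saved = fraction_saved(4, rate)  # e.g. raising the set point from 22°C to 26°C
    print(f"{rate:.0%}/°C over 4°C: ~{saved:.1%} of total power saved")
```

Even at the conservative end, a few degrees of headroom translates to a saving of several per cent of the facility's total draw.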
One of the earliest liquid-cooling methods was simply to pipe cold water to heat exchangers fitted on the doors of server racks in a data hall. Introduced by IBM in 2005, such rear-door heat exchangers cool only the space within the racks. ‘You don’t need to cool the entire data hall, just the servers,’ Whenish says.
An emerging strategy is direct-to-chip cooling, which targets just the hottest components: the CPU and the GPU chips themselves. NVIDIA’s Hopper chip, for example, generates 86W/cm² of heat. In this strategy, a cold plate is fixed on top of a chip; cool fluid flows through channels within the plate, picks up heat and exits warmer than it entered. At STDCT, Poh Seng Lee and colleagues are developing a cold plate design that can cool via both liquid and air: alongside an array of internal microchannels, the team added finned structures on top of the plate to dissipate further heat through air cooling.
This dual liquid-and-air cooling design builds in redundancy in case liquid cooling fails, Lee explains. So far, the team has opted to use propylene glycol and water – an industry standard – for the working fluid. Additive manufacturing of the cold plate eliminates joints, which helps reduce risks of fluid leakage. The team has launched a Singapore-based start-up, CoolestDC, to commercialise the technology.
Immersing servers
Direct-to-chip cooling tackles only the server hotspots; the remaining heat generated must still be removed by AC. A different strategy called immersion cooling absorbs all the heat a server generates by submerging the server entirely in a cool fluid. Keeping servers submerged within a liquid also helps protect the electronic equipment from dust and, in desert climates, sand, says STDCT researcher Melvan Tan. Such immersion fluids must be dielectric, or electrically insulating, so they do not damage the electronic equipment.
Hydrocarbon-based coolants are currently leading candidates. Major oil companies are moving into this new market – for example, Shell has partnered with STDCT and supplied the fluid for the immersion cooling system being tested.
The manufacturers of servers, however, aren’t quite ready for immersion cooling. ‘Not all the parts are ready’ in some servers, says Lee. Getting more server manufacturers to honour equipment warranties when servers are placed in a cooling fluid is a top priority before immersion cooling can advance, he says.
Immersion fluids are also more viscous than air, so they don’t flow well through the narrow fins of the heat sinks on servers built to dissipate heat through air, adds Tan. He is working on designs that can support viscous fluid movement.
Meanwhile, Azarifar has identified another point for improvement – the speed of the fluid circulating through the tank. In a typical immersion cooling tank, pumps circulate the fluid, which starts out cool at the bottom of the tank and rises past the submerged servers while removing heat. ‘The velocity of the [fluid] is very low, it’s at most 10mm/s,’ he says.
Increasing the speed of fluid flow would increase the rate of heat removal. ‘One of the most efficient ways to move a liquid is the way jellyfish swim,’ Azarifar says. Azarifar and colleagues built a system that mimics this, employing jets that use a diaphragm to suck in and expel liquid (Appl. Therm. Eng., 2025, DOI: 10.1016/j.applthermaleng.2024.125007). The jets shoot water perpendicularly at an immersed, waterproofed server, just as a high-pressure power washer removes dirt best when the stream hits the side of a house or a car head-on, explains Azarifar.
Using high-speed cameras, the team documented fluid speeds up to 1.5m/s. Azarifar envisages outfitting existing immersion cooling tanks with these liquid jet systems, which draw just 0.2W of power and contain no moving parts, while boosting the rate of heat removal 12-fold. The researchers have tested their system on other immersion fluid candidates such as mineral oil and have applied for a patent.
Phase change
Single-phase techniques in which a liquid simply enters cold and exits hot cannot remove as much heat as two-phase techniques. The latter allow liquids to remove heat by evaporation before condensing away from the server and recirculating. Much more heat can be absorbed by a liquid as it changes phase, compared with changing temperature, says UCSD’s Chen. Evaporating a given volume of water removes some 10-20 times more heat than simply raising its temperature.
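The 10-20-fold figure follows directly from water's standard thermophysical properties. A quick check, using textbook values and an illustrative temperature rise of the kind a single-phase loop might allow:

```python
# Heat absorbed by evaporating water vs simply heating it, per kg.
# Textbook values: latent heat of vaporisation ~2257 kJ/kg at 100°C,
# specific heat ~4.18 kJ/(kg*K). The temperature rises are illustrative.

LATENT_HEAT = 2257.0    # kJ/kg
SPECIFIC_HEAT = 4.18    # kJ/(kg*K)

def heat_ratio(delta_t_kelvin):
    """Heat removed by evaporation vs by a delta_t temperature rise."""
    sensible = SPECIFIC_HEAT * delta_t_kelvin
    return LATENT_HEAT / sensible

print(f"vs 50 K rise: ~{heat_ratio(50):.0f}x")   # ~11x
print(f"vs 25 K rise: ~{heat_ratio(25):.0f}x")   # ~22x
```

The smaller the temperature rise a single-phase loop can tolerate, the bigger the advantage of letting the fluid evaporate instead.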
Two-phase immersion cooling has garnered plenty of interest since the late 2010s, says Tan. As with single-phase immersion cooling, it involves submerging servers in a tank of fluid, but the fluid boils at a relatively low 50-60°C. The vapour is collected, condensed outside the tank, and recirculated. According to a recent life-cycle analysis by Microsoft, two-phase immersion cooling could offer 20% reductions in energy demand and greenhouse gas emissions plus a 48% reduction in water consumption, compared with air cooling. In comparison, the single-phase direct-to-chip and immersion cooling techniques offer 15% and 31-45% savings, respectively.
Because two-phase immersion cooling fluids have more stringent property requirements, including specific boiling points and the ability to maintain the integrity of electrical data signals through the liquid, candidates are usually limited to fluorinated liquids.
More recently, adoption of two-phase immersion cooling has slowed due to regulatory uncertainty over per- and poly-fluoroalkyl substances (PFAS), particularly in the EU. Another blow to the adoption of this technique was 3M’s decision in December 2022 to stop producing PFAS products, including fluorinated dielectric fluids considered top candidates for two-phase immersion cooling, says Tan.
Another fluorochemicals manufacturer, Chemours, remains confident of the success of its fluorochemical fluids to aid data centre cooling. In August 2025, Samsung Electronics qualified Chemours’ fluids for use in two-phase immersion cooling systems with its latest solid-state hard drives. Chemours has also begun partnering with Navin Fluorine to produce the fluids, with initial manufacturing beginning in 2026.
However, two-phase immersion cooling still faces a theoretical limit on how quickly it can remove heat – estimated at about 200W/cm².
The main mechanism of heat removal is by boiling off the liquid, when gaseous bubbles escape from the liquid’s surface, explains Chen. ‘There’s not much you can do about the kinetics, like how fast you can move the bubbles, that’s why there’s a limit.’
In comparison, two-phase direct-to-chip cooling could theoretically remove up to 1500W/cm² of heat. Here, a coolant evaporates as it passes through a cold plate on a chip. Then, the vapour is routed to a condenser outside the server before being condensed and pumped back. Chemours has also identified two-phase direct-to-chip cooling as an emerging cooling technique, and the firm is working with industrial partners to optimise a set of fluids to support that application as well.
Chen and colleagues prepared an evaporator based on a porous membrane for two-phase direct-to-chip cooling (Joule, 2025, DOI: 10.1016/j.joule.2025.101975). Instead of membranes with single-channel pores, the researchers selected glass fibre membranes. Such membranes comprise long glass fibres that have been randomly pressed into a flat mat, so the gaps between the fibres become a network of pores. Single-channel pores can clog, Chen explains, creating weak points of failure where the liquid cannot evaporate.
Using water as the working fluid, the researchers dissipated up to 800W of heat across a 0.5cm² area of the glass fibre mat.
Chen believes the heat removal rate could be boosted by optimising the membrane’s microstructure and by using better heat conductors, such as metals. The team is also investigating how well the system performs with dielectric fluids other than water.
Sending heat to space
Regardless of the cooling technique, any heat removed must ultimately be transferred to a cold sink, most commonly cold water. An alternative could be cold outer space, at a temperature of just 3K. Normally, our atmosphere traps heat, but infrared energy with wavelengths between 8 and 13µm can radiate away through an atmospheric ‘window’ into outer space.
Zhen Li at Tsinghua University, China, and his colleagues attempted to find out whether this cold sink of outer space could help support cooling of data centres (J. Therm. Sci., 2025, DOI: 10.1007/s11630-025-2146-x). The team used a commercially available radiative cooling film, optimised to emit at wavelengths of 8-13µm, to build radiative cooling structures that absorb heat from water passing through outdoor pipes. After testing the structures’ performance, the team simulated a full year of operation in which they help cool liquid refrigerant moving in and out of a data centre.
The researchers found the radiative coolers could reduce the water’s temperature by 2-3°C. Radiative cooling doesn’t work equally well throughout the year – performance drops during humid summer months, for example. Even so, over the simulated year the PUE of a data centre in Beijing fell from 1.38 to 1.19 when supported with a radiative cooler.
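Because total facility energy scales with PUE for a fixed IT load, the reported drop from 1.38 to 1.19 maps directly onto a facility-level energy saving. Simple arithmetic on those two figures:

```python
# For a fixed IT load, total facility energy is proportional to PUE,
# so a drop in PUE translates directly into an energy saving.
# Figures are the Beijing simulation results quoted in the text.

def energy_saving(pue_before, pue_after):
    return (pue_before - pue_after) / pue_before

saving = energy_saving(1.38, 1.19)
print(f"Facility energy saved: {saving:.1%}")   # ~13.8%
```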
Drying the air
In humid climates such as in Singapore, dehumidification can reduce the burden on air-conditioning. Besides lowering room temperatures, AC systems must expend energy to dry the air, and removing this latent heat can consume up to 40% of the electricity used for cooling (Coord. Chem. Rev., 2023, DOI: 10.1016/j.ccr.2023.215384). Humid air also lowers the efficiency of indirect evaporative cooling, which cools by water evaporation and is considered one of the most efficient techniques. Partnering with Nortek Air Solutions, Meta has built such a cooling system, called StatePoint Liquid Cooling, at its data centre in Singapore.
To reduce the impact of humidity, STDCT researchers led by Seeram Ramakrishna are attempting to pre-dry the air passively before it goes through the evaporative cooling system. For this effort, Ramakrishna and his colleagues coated heat exchangers with a high-performing desiccant, such as a metal-organic framework. Warm humid air is first dried in a desiccant-coated heat exchanger, making the subsequent evaporative cooling stage more efficient. Waste heat from the system then dries out the water-saturated desiccant so it can be reused.