Nexalus co-founders (L to R): Dr. Cathal Wilson (COO), Professor Anthony Robinson (CSO) and Kenneth O'Mahony (CEO).
Everyone in the power generation business is talking about the huge increase in data center energy consumption. The IEA released a report on April 10th saying that data center energy use would double in the next five years, with a 4x increase in energy required to run AI models. The IEA forecasts that by 2030, data center energy demand will exceed the aggregate energy demand of the entire nation of Japan.
The climate impact of this power consumption cannot be overstated. Some of the new generating capacity will come from clean energy sources, but the projected jump in demand is so large that utilities are postponing the decommissioning of carbon-intensive coal-fired plants.
Nexalus, a startup based in Ireland, the data center capital of Europe, has developed an innovative way to lower data center power consumption by over one-third and use the “waste heat” from these facilities to do other useful work, thereby lowering overall electricity consumption and fossil fuel demand.
The basics of data center energy consumption
You would think that most of the energy consumed by a data center powers its servers. Think again. The industry metric PUE (power usage effectiveness) is the ratio of a facility's total power consumption to the power delivered to its servers. According to the U.S. EPA, the average PUE for a U.S. data center is 2.0, meaning the facility draws two watts of power for every one watt used by its servers.
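To make the metric concrete, here is a rough back-of-the-envelope calculation using the EPA average above; the 1,000 kW server load is an illustrative assumption, not a figure from any particular facility.

```python
# Back-of-the-envelope PUE calculation using the EPA average cited above.
# The 1,000 kW server load is an illustrative assumption, not a real facility's figure.

it_power_kw = 1_000               # power drawn by the servers themselves
total_facility_power_kw = 2_000   # servers plus cooling, lighting, power conversion

pue = total_facility_power_kw / it_power_kw
overhead_kw = total_facility_power_kw - it_power_kw

print(f"PUE: {pue:.1f}")                              # 2.0 -> two watts in per watt of compute
print(f"Cooling and other overhead: {overhead_kw} kW")  # 1,000 kW that never reaches a chip
```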
Where does the other half of the energy go? Mainly, it goes into cooling the servers’ chips, which heat up while performing 50 trillion floating-point calculations per second so you and your coworkers can enjoy doctored images of a Cybertruck crushed by a block of cheese. This is why you see so many air conditioner compressors in photos of data centers.
This Google data center in Council Bluffs, Iowa, shows a typical data center design.
How to lower data center energy consumption
Many smart startups are working on solutions to the problem of reducing data center energy consumption. Everyone understands that air cooling is wasteful and that waste heat has the potential to be a valuable resource.
Some startups are fighting fire with fire, using AI to find opportunities for greater efficiency in data center cooling. NexDCCool Technologies, a startup spun out of a Penn State lab, and Fluix, founded by a recent graduate of the University of Central Florida, both take this approach.
Another class of solutions deals with immersion cooling. In server farms, this is done by designing sealed server enclosures that are flooded with dielectric liquids (liquids that do not conduct electricity and are often used as insulators in high-voltage equipment like transformers). Immersion cooling is very good at preventing components from overheating, but it is also very costly: servers need to be modified or redesigned, dielectric liquids are not cheap, the tank and fluid-management infrastructure adds further expense, and maintenance can be a nightmare.
Nexalus has focused on a technique known as direct liquid cooling (DLC), a technology that has been receiving a lot of interest due to its relative ease of implementation and its effectiveness at cooling chips under high-temperature AI workloads.
Legacy DLC systems pump water through microchannel arrays in a cold plate that sits atop a server's chips, or flow low-boiling-point refrigerants across such a plate. The liquid carries heat away from the chips as it passes through. Two firms using the water-based DLC model are CoolIT from Canada and Asetek from Denmark.
The main selling point of refrigerant-based designs (one prominent manufacturer of this model is the Israeli firm Zutacore) is that they do not use high-pressure water flows, so the perceived risk to equipment from leakage is lower.
While water's physical properties make it a better coolant than synthetic refrigerants, legacy water-based systems have a big disadvantage: cold plate microchannels are so narrow that larger, high-pressure pumps must be used to overcome the friction between the water and the plate, requiring more energy and posing a greater risk of server damage in the event of a leak.
Nexalus’s solution is to surround the chip with a watertight shroud containing tiny high-velocity water jets which strike the cold plate perpendicularly at the precise areas where the chip runs hottest, rather than running water evenly through microchannels over the entire chip.
A Nexalus shroud built for a popular type of Dell server.
This precise high-velocity flow cools the chip more effectively, and the lack of microchannels reduces the pumping pressure and attendant pumping power by a factor of 25.
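The arithmetic behind that claim is simple: hydraulic pumping power scales with flow rate times pressure drop, so cutting the pressure a pump must overcome by 25x cuts its power draw by roughly the same factor at a given flow rate. The sketch below uses illustrative numbers; the flow rate, pressure drops and pump efficiency are my assumptions, not Nexalus specifications.

```python
# Hydraulic pumping power: P = Q * dP / efficiency
# All numbers below are illustrative assumptions, not vendor specifications.

def pumping_power_watts(flow_m3_per_s: float, pressure_drop_pa: float,
                        pump_efficiency: float = 0.7) -> float:
    """Electrical power needed to push coolant against a given pressure drop."""
    return flow_m3_per_s * pressure_drop_pa / pump_efficiency

flow = 0.0005                  # assumed 0.5 L/s of water per rack
microchannel_dp = 250_000      # assumed ~2.5 bar drop across narrow microchannels
jet_dp = microchannel_dp / 25  # the 25x pressure reduction Nexalus cites

p_micro = pumping_power_watts(flow, microchannel_dp)
p_jet = pumping_power_watts(flow, jet_dp)

print(f"Microchannel pumping power: {p_micro:.0f} W")    # ~179 W
print(f"Jet-impingement pumping power: {p_jet:.1f} W")   # ~7.1 W
```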
These two innovations reduce data center power demand considerably, but the development team at Nexalus has rethought server case design to cut power draw even more and create a circular system for the heat pulled out of server cases.
Managing surplus energy for local businesses and municipalities
In most data centers, servers are air-cooled. Even DLC systems only use water to cool the hottest components—the chips themselves—and cool everything else with air, making a server room about as loud as a subway car going through a station.
Cooling components with air requires air handling and conditioning systems that run around the clock. These systems use a lot of water, which is lost to the atmosphere as heat is drawn out of the building.
Nexalus's airtight server cases feature internal fans that direct hot air over water lines, which draw heat away from the rest of the circuitry. This water joins the water from the chip-cooling system at an average temperature of around 60°C (140°F), then flows to a heat exchanger, where it is cooled. The cold water then flows back to cool the servers. The system is a closed loop, drastically reducing data center water usage, an important environmental issue when data centers are built in arid areas.
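The amount of heat a loop like this can carry follows from the standard energy-balance relation: heat flow equals mass flow times specific heat times temperature change. The sketch below uses the 60°C supply temperature cited above; the flow rate and the temperature drop across the heat exchanger are illustrative assumptions.

```python
# Heat carried by the water loop: Q = m_dot * c_p * delta_T
# Flow rate and heat-exchanger return temperature are illustrative assumptions.

water_cp = 4186            # specific heat of water, J/(kg*K)
mass_flow_kg_s = 1.0       # assumed loop flow rate, roughly 1 L/s
supply_temp_c = 60         # loop temperature cited in the article (~140 F)
return_temp_c = 40         # assumed temperature after the heat exchanger

heat_recovered_kw = mass_flow_kg_s * water_cp * (supply_temp_c - return_temp_c) / 1000
print(f"Recoverable heat: {heat_recovered_kw:.0f} kW")   # ~84 kW per 1 L/s of flow
```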
Nexalus provides the 140°F waste heat from the exchanger to local businesses or municipalities in a form that can be used in industrial applications or for district heating with nearly zero energy lost in the process.
Recovered heat used for other applications represents power that does not need to be generated by burning fossil fuels or pulled off the electrical grid, so a fully integrated Nexalus-equipped data center would relieve grid congestion and reduce overall carbon footprints.
Laws in several European countries were recently changed to require the use of heat recovery systems before a municipality will sign off on data center development plans. European data center developers have responded by signing deals to heat swimming pools, supply trout farms with heat, and provide neighborhoods with district heating.
Nexalus claims that its DLC systems can reduce the carbon footprint of a typical data center by 23,000 metric tons per year, and an additional 24,000 metric tons can be offset through the waste heat recovery process.
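Taking both of those company-supplied figures at face value, the combined claim works out as follows; neither number has been independently verified here.

```python
# Combining the two figures Nexalus cites (company claims, not independently verified).
dlc_savings_t = 23_000           # metric tons CO2/year attributed to direct liquid cooling
heat_recovery_offset_t = 24_000  # metric tons CO2/year attributed to waste heat reuse

total_t = dlc_savings_t + heat_recovery_offset_t
print(f"Claimed annual reduction per data center: {total_t:,} metric tons CO2")  # 47,000
```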
Co-founder and CEO Kenneth O’Mahony spoke with me off the record about the firm’s incipient partnership announcements. Suffice it to say that I was impressed. The firm has publicly announced a deal with Hewlett-Packard Enterprise to integrate its cooling system with three of HPE’s leading server models, and I expect further announcements with other top-tier server manufacturers and data center developers.
The Nexalus team in December 2024.
O’Mahony told me that the company was also receiving interest from telecom providers looking to push more computing power nearer to end users. “Edge computing” is a hot topic, and O’Mahony believes Nexalus’s self-contained, small-footprint systems can help facilitate it.
I do not fly private jets, but I do love my subscription to a popular LLM. I feel a twinge of climate guilt whenever I ask it a question, however, because I know my query stokes data center energy consumption. O’Mahony and his colleagues at Nexalus know how important it is to reduce data center carbon footprints while facilitating the growing use of AI technology. We must all balance business imperatives with climate constraints in this post-Climate world. Intelligent investors take note.