
Sponsored by

Cooling
Supplement

Keeping IT cool
INSIDE

The end of over-cooling
> Data centers are letting temperatures go up

An intro to liquid cooling
> There are many kinds of liquid cooling. Here’s how it works

Plant-based cooling
> Don’t use fossil fuels for immersion cooling
Water is Precious.
So is Your Business.
It’s time for a new wave of cooling solutions.
Vertiv’s water-free and energy-efficient thermal
management solutions and system-level controls are
designed to preserve our limited natural resources.

Visit Vertiv.com to learn more.



Contents

The end of over-cooling
Data centers are switching off chillers and letting temperatures go up. But is that really the best idea?

An introduction to liquid cooling
There are many kinds of liquid cooling. Find out which one is right for you

Plant based cooling
If liquid cooling is environmentally friendly, why does it use fossil fuel-based immersion fluids?

Keeping a cool head

Cooling has been at the center of thinking about green data centers since the very dawn of the discipline, more than 15 years ago. You might think it's all been sorted out by now - but if you thought that, you'd be wrong.

The last ten years have seen a sustained effort to reduce the energy used in data center cooling systems, traditionally provided by air-conditioning technologies, shifting energy to the racks. At the same time, there's been a growing movement that says liquid cooling is going to be more efficient, and better for the environment.

This supplement questions both those assumptions - and provides a primer on liquid cooling, because we believe that, whatever the details, liquid cooling is the future.

A fluid choice
Liquid cooling is not a new technology, and it's not a single technique. It's been around for 60 years, and is available in around half-a-dozen forms.

Our primer takes you through the history of the liquid cooling movement, which has been widespread in computers from mainframes to humble desktop computers.

We also walk you through the various options, from simple circuits of water, kept from the electronics by coldplates, through immersion tanks, up to two-phase systems where sophisticated fluids bubble and recondense.

As liquid cooling emerges from its niche, you will need to learn a lot of new tools.

Goodbye to over-cooling?
The biggest effort in the green data center movement has been around reducing the energy wasted in cooling data centers.

The old consensus was that data centers should be kept at a chilly temperature below 20°C, to be absolutely sure that the electronics were not stressed. Hardware vendors disagreed, and industry bodies assured data center operators that warmer temperatures were safe. Now the conservative colocation sector is acting on those recommendations.

But wait. Research engineers are sounding a cautious note. It turns out that the electronics use more power when the temperature goes up, to perform the same work.

Some are suggesting that the move to raise temperatures is a big mistake, based on over-reliance on simplistic efficiency measures like PUE (power usage effectiveness). Could it be that we need a bigger dataset and more research to sort this out?

Plant-based immersion baths
Finally, immersion cooling has earned its reputation for environmental friendliness. It uses less energy, it's quieter, and it removes heat in a concentrated form which makes heat reuse practical.

But there's a small snag. All too often, the immersion fluids used by the tank makers are fossil fuel-based hydrocarbons. We spoke to a food giant that wants to fill your immersion tanks with a plant-based alternative. And apparently it's fully recyclable, too.
Over-cooling is past its sell-by date

Peter Judge
Executive Editor

Fourteen years after definitive proof that warmer is better, colocation companies are still struggling to turn their cooling systems down


In 2008, ASHRAE issued definitive guidance that it is perfectly safe to run data centers at temperatures up to 27°C (80°F). But large parts of the industry persist in over-cooling their servers, wasting vast amounts of energy and causing unnecessary emissions.

There are signs that this may be changing, but progress has been incredibly slow - and future developments don’t look like speeding things up very much.

Don’t be so cool
When data centers first emerged, operators kept them cool to avoid any chance of overheating. Temperatures were pegged at 22°C (71.6°F), which meant that chillers were working overtime to maintain an unnecessarily cool atmosphere in the server rooms.

In the early 2000s, more energy was spent in the cooling systems than in the IT rack itself, a trend which seemed obviously wrong. The industry began an effort to reduce that imbalance, and created a metric, PUE (power usage effectiveness), to measure progress.

PUE is the total power used in the data center, divided by the power used in the racks - so an “ideal” PUE of 1.0 would mean all power is going to the racks. Finding ways to switch off the air conditioning, and letting temperatures rise, was a major strategy in approaching this goal.
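As a quick illustration of the arithmetic (a sketch with hypothetical facility figures, not numbers from this article):

# Minimal PUE calculation - illustrative numbers only.
def pue(total_facility_kw: float, it_rack_kw: float) -> float:
    """PUE = total facility power / power delivered to the IT racks."""
    return total_facility_kw / it_rack_kw

# A facility drawing 1,500kW in total, with 1,000kW reaching the racks:
print(pue(1500, 1000))  # 1.5 - a third of all power is cooling and other overhead
# Switch off chillers and let temperatures rise, and the overhead shrinks:
print(pue(1200, 1000))  # 1.2 - closer to the "ideal" of 1.0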
In 2004, ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) recommended an operating temperature range from 20°C to 25°C. In 2008, the society went further, suggesting that temperatures could be raised to 27°C. Following that, the society issued Revision A1, which raised the limit to 32°C (89.6°F) depending on conditions.

This was not an idle whim. ASHRAE engineers said that higher temperatures would have little effect on the lifetime of components, but would offer significant energy savings. Figures from the US General Services Administration suggested that data centers could save four percent of their total energy, for every degree they allowed the temperature to climb.

Hyperscale companies are often best placed to pick up advanced technology ideas. They own the building, the cooling systems, and the IT. So if they allow temperatures to climb, then it’s their own equipment that feels the heat. So it’s no surprise that cloud giants were the first to get on board with raising data center temperatures.

Facebook quickly found it could go beyond the ASHRAE guidelines. At its Prineville and Forest City data centers, they raised the server temperatures to 29.4°C, and found no ill effects.

“This will further reduce our environmental impact and allow us to have 45 percent less air-handling hardware than we have in Prineville,” Yael Maguire, then Facebook’s director of engineering, said.

Google went up to 26.6°C, and Joe Kava, then vice president of data centers, said the move was working: “Google runs data centers warmer than most because it helps efficiency.”

Intel went furthest. For ten months in 2008, the chip giant took 900 servers, and ran half of them in a traditionally cooled data center, while the other 450 were given no external cooling. The server temperatures went up to 33.3°C (92°F) at times.

At the end of the ten months, the chip giant compared those servers with another 450 which had been run in a traditional air-conditioned environment. The 450 hot servers had saved some 67 percent of the power budget.

In this higher-temperature test, Intel actually found a measurable increase in failure. Amongst the hot servers, two percent more failed. But that failure rate may have had nothing to do with the temperature - the 450 servers under test also had no air filtration or humidity control, so the small increase in failure rate may have been due to dust and condensation.

Some like it hot
Academics backed up the idea, with support coming from a 2012 paper from the University of Toronto titled Temperature Management in Data Centers: Why Some (Might) Like It Hot.

“Our results indicate that, all things considered, the effect of temperature on hardware reliability is weaker than commonly thought,” the Canadian academics conclude. “Increasing data center temperatures creates the potential for large energy savings and reductions in carbon emissions.”

At the same time, server makers responded to ASHRAE’s guidelines, and confirmed that these new higher temperatures were acceptable without breaking equipment warranties.

Given that weight of support, you might have expected data center temperatures to rise dramatically across the industry - and you can still find commentary from 2011, which predicts a rapid increase in cold aisle temperatures.

However, look around for recommended data center temperatures today, and figures of 22°C and 25°C are still widely quoted.

This reluctance to change is widely put down to the industry’s reputation for conservatism, although there are some influential voices raised against the consensus that higher temperatures are automatically better (see Box).

Equinix makes a cautious move
All of which makes a recent announcement from Equinix very interesting. On some measures, Equinix is the world’s largest colocation player, housing a huge chunk of the servers which are not either in on-premises data centers or in the cloud.

In December, Equinix announced that it would “adjust the thermostat of its colocation data centers, letting them run warmer, to reduce the amount of energy spent cooling them down unnecessarily.”

“With this new initiative, we can intelligently adjust the thermostat in our data centers in the same way that consumers do in their homes,” said Raouf Abdel, EVP of global operations for Equinix.

Equinix’s announcement features congratulatory quotes from analysts and vendors.

Rob Brothers, program vice president, data center services, at analyst firm IDC, explains that “most data centers … are unnecessarily cooler than required.”

Brothers goes on to say that the announcement will see Equinix “play a key role in driving change in the industry and help shape the overall sustainability story we all need to participate in.” The announcement will “change the way we think about operating temperatures within data center environments,” he says.

Which really does oversell the announcement somewhat. All Equinix has promised to do is to make an attempt to push temperatures up towards 27°C - the target which ASHRAE set 14 years ago, and which it already recommends can be exceeded.

No Equinix data centers will get warmer straight away, either. The announcement will have no immediate impact on any existing customers in any Equinix data centers. Instead, customers will be notified at some unspecified time in the future, when Equinix is planning to adjust the thermostat at the site where their equipment is hosted.

“Starting immediately, Equinix will begin to define a multi-year global roadmap for thermal operations within its data centers aimed at achieving significantly more efficient cooling and decreased carbon impacts,” says the press release.

And in response to a question from DCD, Equinix supplied the following statement: “There is no immediate impact on our general client base, as we expect this change to take place over several years. Equinix will work to ensure all clients receive ample notification of the planned change to their specific deployment site.”

Customers like it cool
Reading between the lines, it is obvious that Equinix is facing pushback from its customers, who are ignoring the vast weight of evidence that higher temperatures are safe, and are unwilling to budge from the traditional 22°C temperature which has been the norm.

Equinix pushes the idea of increased temperatures as a way for its customers to meet the goal of reducing Scope 3 emissions, the CO2 equivalent emitted from activity in their supply chain. For colocation customers, the energy used in their colo provider’s facility is part of their Scope 3 emissions, and there are moves to encourage all companies to cut their Scope 3 emissions to reach net-zero goals.

Revealingly, Equinix does not provide any supporting quotes at all from customers eager to have their servers hosted at a higher temperature.

For Equinix, the emissions for electricity used in its cooling systems are part of its Scope 2 emissions, which it has promised to reduce. Increasing the temperature will be a major step towards achieving that goal.

“Our cooling systems account for approximately 25 percent of our total energy usage globally,” said Abdel. “Once rolled out across our current global data center footprint, we anticipate energy efficiency improvements of as much as 10 percent in various locations.”

Equinix is in a difficult position. It can’t increase the temperature without risking the displeasure of its customers, who might refuse to allow the increase, or go elsewhere.

It’s a move that needs to be made, and Equinix deserves support for setting the goal. But the cautious nature of the announcement makes it clear that this could be an uphill battle.

However, Equinix clearly believes that future net-zero regulations will push customers in the direction it wants to be allowed to go.

“Equinix is committed to understanding how these changes will affect our customers and we will work together to find a mutually beneficial path toward a more sustainable future,” says the statement from the company. “As global sustainability requirements for data center operations become more stringent, our customers and partners will depend on Equinix to continue leading efforts that help them achieve their sustainability goals.”

IS OVER-COOLING REALLY SO BAD?

Surprisingly, after the industry seems to have reached a consensus that data centers should be run as warmly as possible, there are dissenting voices - some of them very authoritative.

There are two main objections to running data centers warmer. One is that data center staff working in a contained hot aisle will be subjected to a harsher working environment. The other is that the chips in the servers will also be subjected to more extreme conditions.

John Haile, a retired 24-year veteran of Telehouse, commented on a LinkedIn discussion about Equinix’s announcement: “The people that work in the data center generally have to work in the hot aisle once the row goes live. The temperatures in there are well over 40°C - it dries your eyes out.”

While many professionals are prepared to work at higher temperatures, and some even relish the opportunity to work in shorts, others question whether the effort is even beneficial in the first place.

Running with hotter air temperatures may create a completely spurious benefit, based on over-reliance on one efficiency metric, argues Professor Jon Summers, research lead in data centers at Research Institutes of Sweden (RISE).

Data centers measure efficiency by aiming for a low PUE (power usage effectiveness), which is achieved by shifting power consumption from the building’s air conditioning to the racks. This makes sense if the energy in the racks is all used for computation, but some is used for cooling fans, points out Professor Summers.

“Increasing temperatures will improve the ISO PUE of a DC, which a vast majority appear to cite as a measure of efficiency,” says Summers. His research shows that a reduction in energy used by the air conditioning will be offset by the increased energy used in servers.

“At RISE Research Institutes of Sweden, in the ICE data center we have researched the effect of supply temperature on DC IT equipment using wind tunnels, full air-cooled data centers, direct-to-chip, and immersion systems connected to well-controlled liquid cooling testbeds,” says Summers. “The upshot is that irrespective of the cooling method, the microprocessors draw more power when operated hotter for the same digital workload due to current leakages.”

This effect varies between different processors, says Summers, with Xeon E5-2769-v3 CPUs running at 50 percent workload drawing 8W more when the temperature was increased from 40°C to 75°C in a wind tunnel, with the server fans set to target a fixed CPU temperature.

Essentially, when the air inlet temperature goes up, the cooling work shifts from the air conditioning systems to the fans in the servers, which have to work harder. This automatically reduces the PUE, because the fans are in the racks, and PUE rewards energy used within the racks, compared to energy used in the external cooling systems.

Running at hotter temperatures can create completely illusory benefits, says Summers: “With increased supply temperatures we do see an increased overall energy consumption even though the PUE drops.”
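The mechanism is easy to sketch in numbers (hypothetical figures, chosen only to illustrate the effect Summers describes, not RISE’s measurements):

# Raising supply temperature moves cooling work from the facility air
# conditioning into the server fans and leakier, hotter chips - so PUE
# falls even while total consumption rises. Hypothetical figures only.

it_kw, cooling_kw = 1000, 300                              # cool supply air
print((it_kw + cooling_kw) / it_kw, it_kw + cooling_kw)    # PUE 1.30, 1300kW

# Warmer supply air: the AC saves 100kW, but server fans and CPU leakage
# add 120kW *inside* the racks, where PUE counts them as useful IT load.
it_kw, cooling_kw = 1120, 200
print((it_kw + cooling_kw) / it_kw, it_kw + cooling_kw)    # PUE ~1.18, 1320kW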
In immersion tanks, systems have no server fans, and Summers ran 108 of the same CPUs in an immersion tank. In this situation, his team found there was a six percent drop in power requirements at the same 50 percent workload when the tank coolant was dropped from 50°C to 30°C.

“Other than less energy consumed by the DC cooling equipment resulting in a lower ISO PUE, what are the reasons for pushing up air-supply temperatures?” Summers asks.

Summers’ colleague Tor Björn Minde, head of RISE’s ICE data center, agrees: “Why in the world would you like to do this?”

Allowing warmer temperatures might make sense if the outside air temperature is above +30°C, says Minde, but otherwise “you should run it as cold as possible. The power draw of the IT is less at low temperatures. If you have free-cooling, run it cold. You will have less fan speed overall both in the facility and on the servers.”

Minde thinks the industry should aim for constant CPU temperatures, and use the air conditioning compressor only when the CPU temperature is getting too high.

Further work will be done on this - and Interact, a division of TechBuyer, has also been researching the issue, and will be publishing a paper with the IEEE in 2023.
 Advertorial

How Data Center Cooling Is Evolving to Meet the Needs of Slab Floor Data Centers

Andrea Moscheni
Product Manager, Vertiv

Slab, or non-raised floor, data centers have helped cloud and colocation providers meet growing capacity demand by accelerating speed-to-market and reducing capital costs. Those benefits have, however, come with new data center cooling challenges. Cooling solutions not tailored to the needs of slab floor facilities can jeopardize equipment reliability and reduce cooling system efficiency. But with new challenges come new opportunities, and recent developments in control strategies and cooling technologies are enabling high-performing cooling in non-raised floor environments.

Data Center Cooling Challenges in Slab Floor Facilities

When slab floor data centers were first gaining traction, the airflow control strategy that had proven effective in raised floor environments was applied to these non-raised floor data centers. But this strategy — which manages airflow and fan speed based on pressure differential, or Delta P — hasn’t been as effective in slab floor data centers as it is in raised floor environments.

Without the duct provided by the space beneath the floor, pressure is more difficult to measure and manage in slab floor data centers. Data center designers also lose the ability to control airflow to racks using properly sized and positioned floor tiles. Instead of cold air being distributed directly to the front of racks through the tiles, air must travel the length of the row. To compensate, many operators drive fan speeds too high, wasting fan energy and resulting in lower return air temperatures that prevent cooling units from operating at their design efficiency.

The need for air to travel down the row also creates airflow patterns that can limit the ability to cool racks closest to the cooling units when standard data center cooling units are used. The velocity of the air at the beginning of the row has to be high enough to ensure adequate airflow at the end of the row. With standard cooling units, that requires velocities so high they create negative pressures in front of racks at the beginning of the row. This increases the potential for temperature-related failures in these racks.

As a result, operators of slab data centers have had to compromise both cooling system efficiency and equipment reliability. But that is no longer necessary, as new strategies and technologies designed specifically for slab floor data centers are now available.

A More Effective Control Strategy for Slab Floor Data Center Cooling

With control based on Delta P proving inefficient in slab floor data centers, Vertiv developed a control strategy based on the temperature differential (Delta T) between the supply air leaving the cooling units and the return air to the cooling units.

Temperature is much easier to measure than pressure, and by setting a temperature control point for return air above the supply air temperature, operators can ensure enough airflow is reaching each rack.

This strategy takes into consideration numerous failure conditions, such as blocked cold aisles, and provides monitoring to ensure air temperatures at the rack are precisely controlled and consistently meet temperature service level agreements (SLAs) — something that isn’t possible with a Delta P control strategy. The need to run fans at higher-than-necessary speeds to compensate for pressure variations across the row is eliminated, and return air temperatures are maintained at the setpoint to optimize cooling unit efficiency. For more on this control strategy, see the Vertiv white paper, Overcoming the Challenges in Cooling Non-Raised Floor Data Centers.
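In outline, such a controller nudges fan speed until the measured differential reaches the setpoint. The sketch below shows the principle only - it is not Vertiv’s control logic, and the setpoint and gain values are invented for the example:

# Simplified proportional Delta T fan control - an illustration of the
# principle, not Vertiv's implementation.
SETPOINT_DELTA_T = 10.0  # target return-air minus supply-air temperature, K
GAIN = 0.05              # fan-speed correction per kelvin of error

def next_fan_speed(speed: float, supply_c: float, return_c: float) -> float:
    """If Delta T is above setpoint, racks need more airflow: speed up.
    If it is below, the fans are moving more air than needed: slow down."""
    error = (return_c - supply_c) - SETPOINT_DELTA_T
    return min(1.0, max(0.2, speed + GAIN * error))

speed = 0.6
speed = next_fan_speed(speed, supply_c=24.0, return_c=37.0)  # Delta T is 13K
print(round(speed, 2))  # 0.75 - fans speed up until Delta T falls back to 10K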
More Effective and Efficient Products for Slab Floor Data Center Cooling

Chilled water cooling systems offer a number of benefits to cloud and colocation providers developing or operating slab floor data centers. One of the most significant is the ability of chilled water systems to reduce direct and indirect greenhouse gas emissions compared to other cooling technologies. Reductions in direct emissions are enabled by a chiller’s ability to use low global warming potential (GWP) hydrofluoroolefin (HFO) refrigerants. Indirect emissions are reduced through the overall efficiency of these systems, which can achieve very low power usage effectiveness (PUE) values through the use of intelligent control systems. To learn more, read the Vertiv white paper, How Chilled Water Systems Meet Data Center Availability and Sustainability Goals.

To address the challenge of airflow distribution discussed previously, new chilled water cooling units have been engineered to meet the airflow requirements of slab floor data centers, including perimeter and thermal wall cooling units.

Perimeter units, for example, have been redesigned to relocate the fan to the top of the unit and create a larger surface area for air distribution. This allows these units to distribute more air at lower speeds, improving the ability to move air down the length of the row and reducing the risk of negative pressure at the beginning of the row.

New thermal wall units adapt the air handling unit (AHU) concept to the needs of slab floor data centers. Installed in the service corridor, they blow air horizontally to the server room, providing high volumes of air that move at low speeds. These systems are particularly well suited when high-density cooling units are required.

Both products can be integrated with the chilled water system manager, which optimizes the entire system by coordinating operation of external and internal units.

Optimizing Cooling in Slab Floor Data Centers

Developers and operators of slab floor data centers no longer have to accept compromises in cooling system performance to realize the cost and speed benefits enabled by eliminating the raised floor. By using control strategies and cooling technologies engineered specifically for slab floor data centers, they can leverage the environmental and operating benefits of chilled water cooling while effectively managing airflow and temperature across the facility. For more information on selecting the right cooling system for your data center, see the white paper, Chilled Water Data Center Cooling for Non-Raised Floor Applications.
An introduction to liquid cooling in the data center

Vlad-Gabriel Anghel
Head of Product at DCD>Academy

An overview of direct-to-chip and immersion cooling

With data center workloads ever increasing due to advanced analytics, AI, and the digitization of every process, the average rack power draw has shot up considerably. And, as we know, with more power draw comes more waste heat that needs to be removed from the rack and eventually the white space.

In the recent past, when racks consumed up to 20kW, air-based cooling methodologies could be relied on to keep the IT hardware operating safely and efficiently. But as some racks start to exceed 30kW or more, new cooling approaches need to be used.

This is in part due to the densification of IT hardware in general, with each new CPU generation packing more processing capacity into smaller and smaller die sizes. Workloads such as artificial intelligence (AI) and machine learning (ML) require floating point operations, which are usually delivered via a graphics processing unit. These GPUs are designed to have a normal operating temperature above 80°C (176°F) when fully utilized for a particular workload.

Although air-based cooling options exist for racks drawing more than 20kW, they are often cumbersome to install and maintain effectively, essentially passing the point of diminishing returns in terms of cooling capacity. As such, owners and operators of data centers are now cautiously looking towards liquid cooling for their new facility projects.

A short history of liquid cooling

Liquid cooling of IT equipment seems like a new technology, but that could not be further from the truth.

Liquids in general can be a great heat transfer medium, and with a little chemical engineering, boiling and condensation points can be tailored precisely, improving the heat transfer using dielectric fluids.

Various forms of liquid cooling have been around since the late 1800s, when they were used to insulate and cool extra high voltage transformers. The automotive industry is another ecosystem that relied, and still relies, on liquid cooling - the water in a typical auto radiator.

Liquid cooling entered the computer sector early in its history, when IBM released a series of enterprise-grade computers called System/360, in the early 1960s.

The System/360 has been one of the most enduring lines of commercially available computers. While the original hardware is now retired, S/360 code written in the early 1960s is still found in new mainframes today. It was also the first computer to have a unified instruction set, making upgrades or changes to the mainframe easier than ever.

The System/360 was also cooled with a hybrid approach using both air and liquid cooling. This was quite big and cumbersome to install, but IBM developed the hybrid model to accommodate increased heat loads. With these systems, as much as 50 percent of the heat dissipated was removed from the cooling air via water-cooled heat exchangers.

[Figures: “Interboard water-based heat exchangers” and “Layout of hybrid air/liquid approach in System/360.” Source: Exploring Innovative Cooling Solutions for IBM’s SuperComputing Systems: A Collaborative Trail Blazing Experience, Dr. Richard C. Chu, IBM Fellow; Academician, Academia Sinica, ROC; Member, National Academy of Engineering, USA]

Today, liquid cooling is present in pretty much every desktop PC - and the concept has essentially remained the same. The cooling process is made up of three distinct parts: the heat plate, the supply and return pipes, and the radiators and fans.

The heat plate is essentially a metal plate that covers the whole CPU die, with a small reservoir on top. The plate is engineered to be as thermally conductive as possible. Any heat generated by the chip will be transferred to the reservoir on top.

The liquid in this closed loop will travel via the supply and return pipes to the radiators, where heat will be pushed out of the PC enclosure through the radiator fins - these fins being actively cooled by fans.

Consumer-grade liquid cooling options originally only dealt with CPU heat, but now almost every component of a modern-day PC can be liquid-cooled.

That is the consumer-grade option of liquid cooling - but what about larger-scale deployments and enterprise-grade solutions? We’ll look at these next in the context of the data center.

Enterprise-Grade Liquid Cooling Solutions

When analyzing liquid cooling options for enterprise-grade IT hardware, there are essentially two main categories of liquid cooling - direct-to-chip liquid cooling (sometimes called conductive or cold plate liquid cooling) and immersive liquid cooling.

When considering the phases (what state the fluid is in - either liquid or gas) that the coolant goes through, we have five distinct types of liquid cooling, as seen in the accompanying figure. [Figure: “Liquid Cooling Technologies”]

Direct-to-Chip – Single-Phase

This method of cooling requires delivering the liquid coolant directly to the hotter components of a server - CPU or GPU - with a cold plate placed directly on the chip. The electric components are never in direct contact with the coolant.

With this method, fans are still required to provide airflow through the server to remove the residual heat. While the air-cooling infrastructure is greatly reduced, one is still required for the correct operation of this liquid cooling method.

Coolants can be either water or dielectric fluids, but water carries a downtime risk from leakage; however, Leak Prevention Systems (LPS) are available. Single-phase refers to the fact that the coolant does not change state - i.e. from a liquid to a gas.

This is also the same method used in the previous desktop PC example.
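The appeal of pumping liquid rather than blowing air comes straight from the sensible-heat relation Q = m·cp·ΔT, which sizes the coolant flow a cold plate loop needs. A back-of-envelope sketch, with an illustrative rack load, coolant, and temperature rise (not figures from this article):

# Coolant flow for a single-phase water cold plate loop: Q = m_dot * cp * dT
Q_KW = 30.0       # heat to remove from the rack, kW (illustrative)
CP_WATER = 4.186  # specific heat of water, kJ/(kg*K)
DT = 10.0         # coolant temperature rise across the cold plates, K

m_dot = Q_KW / (CP_WATER * DT)  # kg/s of water
print(round(m_dot, 2), "kg/s =", round(m_dot * 60), "L/min")  # 0.72 kg/s = 43 L/min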
Direct-to-Chip – Two-Phase

The two-phase direct-to-chip liquid cooling method works like the previous single-phase method, the only difference being that the liquid coolant changes state - from a liquid to a gas and vice-versa - as it completes the cooling loop. These systems will always use an engineered dielectric fluid.

In terms of heat rejection, two-phase systems are better than single-phase systems, and have a lower risk of leakage due to the coolant’s state-changing nature. They do, however, require additional controls, which will increase maintenance costs over the lifetime of the system.
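The heat-rejection advantage comes from latent heat: boiling a fluid absorbs far more energy per kilogram than merely warming it, per Q = m·h_fg. A rough comparison, using assumed, typical-order fluid properties rather than any specific product:

# Sensible vs latent heat removal for the same 30kW load - a sketch.
Q_KW = 30.0

# Single-phase dielectric: warm the liquid by 10K (cp ~1.1 kJ/(kg*K), assumed)
m_single = Q_KW / (1.1 * 10.0)
# Two-phase: boil the liquid, absorbing latent heat (h_fg ~100 kJ/kg, assumed)
m_two = Q_KW / 100.0

print(round(m_single, 2), "vs", round(m_two, 2), "kg/s")  # 2.73 vs 0.3 kg/s
# The two-phase loop moves roughly a ninth of the mass for the same heat.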
Immersive Liquid Cooling – IT-Chassis Single-Phase

This cooling approach uses a single-phase dielectric fluid that is in direct contact with the IT components. Servers are fully or partially immersed in this non-conductive liquid within the chassis, effectively removing all sources of heat.

The cooling can happen either passively, via conduction, or actively, with a pump. Both heat exchangers and pumps can be found inside the chassis, or in a side arrangement where the heat is transferred from the liquid to a water loop. [Figure: “IT Chassis Single-Phase”]

Immersion Cooling – Open Tub – Single-Phase

Sometimes referred to as an “open bath,” this immersive liquid cooling method involves the IT equipment being completely submerged in fluid. Essentially, it is a rack turned on its back, filled with dielectric fluid - instead of mounting servers horizontally, they are now mounted vertically.

These systems are usually fitted with centralized power supplies, and the dielectric fluid is cooled off through a heat exchanger, using a pump which can be installed either inside or outside the tub, or by natural convection. [Figure: “Immersion Cooling – Open Tub: Single-Phase”]

Immersion Cooling – Open Tub – Two-Phase

As with single-phase, in this method the IT equipment is completely submerged in fluid vertically within a tank. But, importantly, with this approach the dielectric fluid must be capable of changing state from liquid to gas as it heats up.

In such a system, submerged and exposed parts will create heat, turning the liquid into a gas, which rises to the surface and condenses on a coil, falling naturally back down once it cools off enough by turning back into a liquid state.

This approach also involves no fans, so its operation is nearly silent (0 dB). In contrast, some air-cooled facilities can reach upwards of 80 dB in the data hall, with workers requiring hearing protection for longer exposures. [Figure: “Immersion Cooling – Open Tub: Two-Phase”]

What are dielectric fluids?

Dielectric liquids are used as electrical insulators in high voltage applications, e.g. transformers, capacitors, high-voltage cables, and switchgear (namely high voltage switchgear).

Their functions are to provide electrical insulation, suppress corona and arcing, and serve as a coolant. Generally, they are split into two categories: fluorochemicals and hydrocarbons.

Fluorochemical fluids, generally with a lower boiling point, are predominantly used for two-phase immersion cooling. Hydrocarbons are typically not used for two-phase immersion cooling systems, as most hydrocarbons are combustible and/or flammable. Therefore, hydrocarbons are typically only used in single-phase applications.

Both fluorochemicals (or fluorocarbons) and hydrocarbons (e.g., mineral oils, synthetic oils, natural oils) can be used for single-phase immersion cooling. Fluids with a higher boiling point (above the maximum temperature of the system) are necessary to ensure the fluid remains in the liquid phase.

Considerations when deciding among the various fluorochemicals and hydrocarbons include heat transfer performance (stability and reliability over time, etc.), ease of IT hardware maintenance, fluid hygiene and replacement needs, material compatibility, electrical properties, flammability or combustibility, environmental impact, safety-related issues, and total fluid cost over the lifetime of the tank or data center.

Current Adoption

While far from mainstream, liquid cooling is positioning itself as the cooling solution for high-performance computing. Its mainstream adoption will, however, depend on advances in technology and chip designs.

Retrofitting already existing data centers is costly for some forms of liquid cooling, while the weight of immersion tanks makes it impractical for many current raised floor facilities.


Plant-based immersion cooling

Peter Judge
Executive Editor

Dealing with the hidden problems of dielectrics

Immersion-based cooling is currently perceived as one of the greenest technologies in data centers, as it reduces the energy needed to cool a facility, while extracting heat in a quiet and efficient manner.

But there could be a problem. Data center cooling systems from the likes of Asperitas or Submer consist of large tanks of fluid in which electronics are submerged. Generally, that fluid is a synthetic oil composed of various hydrocarbons, ultimately derived from petroleum.

That might not be a big issue, because immersion cooling systems don’t burn their cooling fluid: it circulates within the tanks until it needs replacement.

But the hydrocarbon-based fluid will eventually need to be disposed of, and will reach the environment.

Could there be an alternative?
Cool vegetable oil

US food giant Cargill thinks there is. The company is one of the largest in the US, starting 150 years ago as a salt distributor, and is now best-known for egg-based products. But it also works with grain and vegetable oil, and a few years back, it quietly branched out. Into data center cooling.

“We saw an opportunity for the renewables aspect, and the environmental aspect,” explains Kristin Anderson, Cargill’s business development manager for cooling solutions. “We’re really excited about this product and the environmental opportunities.”

The product, NatureCool, is at least 90 percent based on soy oil, and designed to replace petroleum-based immersion coolants in data centers and cryptomining facilities. Because it comes from plants that have naturally trapped carbon, it can be said to be CO2 neutral - although it uses land that would otherwise be used for food.

Environmentally friendly products can involve a trade-off on performance, but Cargill believes that doesn’t apply here. It claims the fluid has a 10 percent higher heat capacity than leading synthetic immersion cooling fluids.

It also passes safety standards, with a high flash point of 325°C (617°F). Unlike some other immersion fluids, it can’t self-ignite, because its flames will go out after the heat source is removed.

And there are other benefits in its practical use, Cargill claims. The company says that synthetic spills require expensive remediation, using solvents which then need to be cleaned up, using techniques which are highly regulated. By contrast, Cargill says NatureCool spills just need soap and water.

In fact, when the fluid is outside the tanks, Cargill says it can biodegrade quickly and easily, within ten days - even though it is stable and long-lasting inside the system.

Recycling the product

Cargill has considered the lifecycle of the product and made it recyclable - not only the fluid but the packaging as well.

The company can supply fluids in tanker trucks, holding 5,800 gallons, but that doesn’t work for data centers, as the trucks can’t be driven into the facility and up to the tanks.

Instead, most customers use what are known as “totes,” 330-gallon containers made of heavy plastic with a metal cage reinforcement, that can be transported by forklift truck. Totes are about four-by-four-by-four (feet), and add to the price of the fluid.

Large facilities, with up to 500 tanks, can get through a surprising amount of fluid, sources suggest. Individual tanks hold from 250 to 500 gallons and, while the fluid lasts a long time, it will have to be replaced at some point. There are anecdotal stories of data centers placing orders for 25,000 gallons of coolant at one time, which amounts to around 60 totes.

What appears to be happening is that operators are choosing to replace their immersion cooling fluid after perhaps five or six years.

When this happens, if customers dispose of the fluid it will still have effectively zero carbon emissions, since it will be releasing captured carbon. However, there is a possibility that customers might still manage to reuse that fluid, if it can be processed to make it suitable for bio-diesel use.

Given that NatureCool is 90 percent soy oil, the other 10 percent might need to be removed, in some sort of processing, leaving an oil which can be safely burnt in diesel generators.

The reuse doesn’t end there, as the totes themselves are a potential source of waste. Industry practice is generally to discard them, but Cargill recycles totes. Cargill’s cooling customers all get virgin totes rather than second-use containers, but customers are encouraged to send them back so they can be cleaned and reused or sold on the secondary market.

Market potential

All too often, the recycled option is an expensive niche product, but Cargill appears to want to take a substantial share of the market.

The product was initially conceived around 2017, and started out in tests with small partners. It has been available commercially for four and a half years, finding a market amongst early adopters of immersion cooling.


This year, in 2022, the company has built up enough momentum to hire a team to market the product and make a formal launch.

With its massive food volumes, Cargill can produce large volumes of NatureCool to meet potential demand. However, forecasting could be an issue, as the company’s initial market is mostly in the unpredictable cryptomining sector.

In the general data center sector, immersion cooling is still a small niche, as most operators are dealing with a huge installed base of air-cooled systems. It is difficult to get those data centers to consider converting to immersion cooling: it would involve junking their air conditioning systems, and investing in different sorts of support infrastructure and staff.

High-performance computing (HPC) has moved further towards immersion cooling, but it’s clear that cryptomining is the current opportunity. Crypto operators are not tied to existing installed hardware, and they simply want to run equipment as fast and cheaply as possible. They routinely overclock equipment to get maximum performance, creating higher demands for heat removal, which immersion cooling can deliver.

What about food?

Taken in context with the rest of Cargill’s business, immersion cooling is obviously a good opportunity, because it provides a higher-margin outlet for vegetable oil.

However, there are already a variety of synthetic oils in competition, so there will be pressure on Cargill to keep the fluid cheap. Certainly, operating in the price-sensitive crypto sector will require that.

One might wonder if immersion cooling in data centers could expand so quickly that the facilities start to take raw materials away from the food sector, perhaps pushing up prices, but there’s no current danger of that.

The fluid is available internationally, shipped in ISO-standard shipping container tanks. Cargill is a large enough organization to have an entire transportation team that will handle this task, and also cover the minutiae of the international shipping process, including customs and VAT.

Partners and channels

Users buying immersion cooling systems do not want to have to buy the coolant separately. In the event of any failure or incompatibility, this would mean finger-pointing and potentially a failure of the tank’s warranty. For this reason, Cargill aims to sell its product directly through the tank vendors, and it will be getting it certified as compatible with those tank products.

We can also expect marketing campaigns based around its environmental credentials, possibly linked to pending regulations on greenhouse gases and other chemicals with a global warming potential.

Cargill hopes that potential immersion cooling customers will demand a plant-based zero-emissions product in tanks, and ask vendors to endorse and supply it.

DCD has approached leading immersion cooling providers to ask if they are aware of NatureCool or have certified it, and the initial response seems to be favorable. While some vendors are staying quiet for now, Asperitas says it is “excited” by the development.

Asperitas says there don’t seem to be any issues with compatibility, but it will need to confirm this with OEMs. “We look forward to working with Cargill through a special OCP immersion cooling fluids group to assess performance using the newly published Figures of Merit (FOMs),” said a statement.

Cargill has joined the Open Compute Project (OCP), an industry group aiming to reduce the environmental footprint of data center hardware, and hopes to raise the profile of immersion coolants.

“Immersion cooling is the new frontier of technologies that allows for more efficient, higher performing systems that also help make the IT industry more sustainable,” said Kurtis Miller, the managing director of Cargill’s bio-industrial business, and a contributor to the OCP’s Requirements document for immersion cooling.
A New Wave
Of Cooling Solutions.
Cool the data center with non-raised floor solutions.
High-performing cooling in non-raised floor environments
is here and ready to meet growing capacity demand and
reduce operating costs. Ensure equipment reliability and
improve cooling system efficiency in the existing footprint.

Visit Vertiv.com to learn more.

