
System architectures and fluids for high heat density cooling solutions.

INTRODUCTION

According to ASHRAE's publication "Datacom Equipment Power Trends and Cooling Applications", computer and communications rack heat loads are projected to reach 15 to 48 kW per rack by 2010.

Driving this trend are advances in technology that allow more and more computing power to be placed into smaller and smaller packages. Other contributing factors include the trend of businesses to reduce capital costs by putting virtualized servers in smaller spaces, and the consolidation of multiple remote data centers into centralized mega data centers. This compaction increases power requirements, thereby generating more heat.

BASELINE STRATEGIES TO INCREASE COOLING EFFICIENCIES

Certain changes can be made to the physical infrastructure to increase the efficiency of the cooling system, which will help better manage the heat generated by high density equipment. These include properly sealing the data center and optimizing the air flow within the data center.

Seal the Data Center Environment

Cooling system efficiency is reduced when air leaks through floors, walls and ceilings, or when humidity is transferred to or from the outside environment. Therefore, the data center should be isolated from the general building and outside environment as much as possible.

Doors should be kept closed at all times and vapor seals should be used to isolate the data center atmosphere. The vapor seal is one of the most important methods for controlling the data center environment.

Without a good vapor seal, humidity will migrate into the data center during the hot summer months and escape during the cold winter months. In ASHRAE's publication "Design Considerations for Datacom Equipment Centers", the expanded recommended humidity range for Class 1 and Class 2 data center environments runs from a 41.9°F (5.5°C) dew point at the lower limit to 60% RH and a 59°F (15°C) dew point at the upper limit. Computer room precision air conditioners (CRACs) control humidity through humidification or dehumidification as required. An effective vapor seal reduces the amount of energy expended on humidification or dehumidification.
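To make the envelope above concrete, the short sketch below checks whether a given room condition falls within it. This is a minimal illustration, not ASHRAE's psychrometric method: the dew point is approximated with the common Magnus formula, and the example temperature and humidity values are assumptions.

import math

def dew_point_c(dry_bulb_c, rh_percent):
    # Approximate dew point (deg C) from dry-bulb temperature and relative
    # humidity using the Magnus formula (constants are a common approximation).
    b, c = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + (b * dry_bulb_c) / (c + dry_bulb_c)
    return (c * gamma) / (b - gamma)

def in_recommended_envelope(dry_bulb_c, rh_percent):
    # Expanded recommended range cited above: 5.5 deg C to 15 deg C dew point,
    # with relative humidity no higher than 60%.
    td = dew_point_c(dry_bulb_c, rh_percent)
    return 5.5 <= td <= 15.0 and rh_percent <= 60.0

# Assumed example condition: 22 deg C (71.6 deg F) air at 45% RH.
print(round(dew_point_c(22.0, 45.0), 1))        # ~9.5 deg C dew point
print(in_recommended_envelope(22.0, 45.0))      # True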

Optimize Air Flow

Once the room is sealed, the next step is to ensure efficient air movement. The goal is to move the maximum amount of heat away from the equipment while using a minimum amount of energy. Optimizing air flow requires an evaluation of how rack arrangement, CRAC placement/air distribution and cable management might be impacting the air flow in the room.

Rack Arrangement. Most equipment manufactured today is designed to draw in air through the front and exhaust it out the rear. This allows equipment racks to be arranged to create hot aisles and cold aisles. This approach positions racks so that rows of racks face each other, with the front of each opposing row of racks drawing cold air from the same aisle (the "cold" aisle). Hot air from two rows is exhausted into a "hot" aisle, raising the temperature of the air returning to the CRAC unit and allowing it to operate more efficiently (Figure 1). This principle is called a hot-aisle/cold-aisle configuration.

[FIGURE 1 OMITTED]

Blanking Panels/Racks. To implement an effective hot-aisle/cold-aisle configuration, it is vital that the hot air not mix with the cold air. Therefore, perforated floor tiles should be removed from hot aisles and used only in cold aisles. Blanking panels should be placed in the open spaces in racks to prevent hot air from being drawn back through the rack. Even empty spaces between racks should be filled with blanking panels or racks to prevent the mixing of hot and cold air.

Seal Raised Floor. Some type of cabling grommet/seal should also be used in the cable penetrations in the raised floor to prevent the cold air from entering the space through cable openings, which are typically at the rear of the rack. Also, the separation between the under-floor plenum and adjacent rooms should be sealed so cold air does not leak from the pressurized raised floor into adjacent rooms.

CRAC Placement. When using the hot-aisle/cold-aisle configuration, CRAC units should always be placed perpendicular to the hot aisle to reduce air travel and prevent hot air from being pulled down into the cold aisles as it returns to the air conditioner. If the CRAC units cannot be placed perpendicular to the hot aisle, the return ceiling plenum can be effective in minimizing the mixing of hot and cold air (Figure 2).

[FIGURE 2 OMITTED]

Cable Management. The growing number of servers that data centers need to support has created cable management challenges in many facilities. If not properly managed, cables can obstruct air flow through perforated floor tiles and prevent air from being properly exhausted out the rear of the rack. The under-floor plenum should be checked to determine if cabling (or piping) is obstructing air flow. Overhead cabling is becoming an increasingly popular means of eliminating the potential for obstruction. Deeper racks are also now available to allow for increased airflow, and sometimes existing racks can be equipped with expansion channels to add depth for cables and airflow.

It is also recommended to investigate the option of bringing high-voltage 3-phase power as close to the IT equipment as possible and increasing the voltage of the IT equipment. These steps will minimize the quantity and size of the power cable feeds under the floor. This can sometimes be accomplished by using high-voltage 3-phase managed power strips within the rack, but may also require the use of multiple-pole distribution panels or PDUs located within the row of IT equipment racks. If racks have extensive server cabling in the rear that obstructs the hot air exhaust from the servers, fans can be added to the rear of racks to help draw the hot air out of the rack. In a similar way, fans can be added to the front/bottom of the rack to improve the cold air distribution to the servers in the rack. However, it is important to remember that these fans consume energy and generate additional heat that must be removed from the room.
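The benefit of raising the distribution voltage can be seen with a simple line-current calculation. The sketch below is illustrative only; the 208 V and 415 V levels, the 20 kW load and the unity power factor are assumed example values, not figures from this article.

import math

def line_current_3ph(power_kw, line_voltage_v, power_factor=1.0):
    # Line current (A) for a balanced three-phase load:
    # I = P / (sqrt(3) * V_line * PF)
    return power_kw * 1000.0 / (math.sqrt(3) * line_voltage_v * power_factor)

load_kw = 20.0  # assumed load for one row of racks
print(round(line_current_3ph(load_kw, 208), 1))  # ~55.5 A at 208 V
print(round(line_current_3ph(load_kw, 415), 1))  # ~27.8 A at 415 V (smaller cables)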

HIGH HEAT DENSITY COOLING

In typical installations with 12 to 24 in. (0.3 to 0.6 m) raised floor heights, raised-floor cooling becomes less effective as rack densities exceed approximately 5 kW and load diversity across the room increases. At higher densities, equipment in the bottom of the rack may consume so much cold air that the remaining cold air is insufficient to cool equipment at the top of the rack. The height of the raised floor places a physical limit on the volume of air that can be efficiently distributed into the room, so adding more room air conditioners may not solve the problem. Adopting the baseline strategies described above is a good place to begin when faced with increasing heat loads in the data center. However, they may not be enough to effectively remove the heat generated by high density equipment. In that case, additional actions are recommended. These actions can generally be divided into two groups:

* Fluid--Bringing the cooling fluid (typically water, refrigerant or air) closer to the heat source.

* Architecture--Selecting an open, closed or semi-closed/open architecture.

In most cases, the best action is a combination of the two measures.
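To put the raised-floor airflow limitation noted at the start of this section in rough numbers, the sketch below estimates the airflow a rack needs as its heat load grows. It is a first-order estimate only; the 20°F air temperature rise is an assumed value, and the 1.08 factor is the common rule-of-thumb constant for standard air.

def required_cfm(rack_kw, delta_t_f=20.0):
    # Airflow (CFM) needed to carry rack_kw of heat at a given air-side
    # temperature rise: CFM = Btu/h / (1.08 * deltaT), with 3412 Btu/h per kW.
    return rack_kw * 3412.0 / (1.08 * delta_t_f)

for kw in (5, 15, 30):
    print(kw, "kW rack needs about", round(required_cfm(kw)), "CFM")
# 5 kW -> ~790 CFM, 15 kW -> ~2370 CFM, 30 kW -> ~4740 CFM; the demand grows
# linearly, which quickly outpaces what a 12 to 24 in. raised floor can deliver.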

Cooling Fluids

Higher density applications can benefit from liquid cooling brought closer to the heat loads to effectively remove the high concentrations of heat being generated. By bringing a cooling liquid closer to the heat source, the amount of energy typically required for air movement is reduced considerably. The capacity and efficiency of the cooling system are also increased because the temperature of the air entering the cooling coil is now higher.

The liquid choices available for cooling are mainly water, refrigerant and dielectric fluid. Table 1 highlights key thermal properties. Because dielectric fluid is substantially less efficient and more costly when compared to both water and refrigerant, it will not be considered further in this paper.
Table 1. Key Coolant Properties (ASHRAE Best Practices for Datacom
Facility Energy Efficiency)

Coolant                Freezing      Thermal          Specific Heat,   Density,        Latent Heat of
                       Point,        Conductivity,    Btu/lb·°F        lb/ft³          Vaporization,
                       °F (°C)       Btu/h·ft·°F      (J/kg·K)         (kg/m³)         Btu/lb (kJ/kg)
                                     (W/m·K)

Dielectric, FC-87      -175 (-115)   0.033 (0.057)    0.251 (1050)     103.6 (1659)    44 (102)
Water                  32 (0)        0.347 (0.6)      1.004 (4203)     62.3 (998)      1058 (2460)
Ethylene glycol/       -36 (-38)     0.215 (0.372)    0.788 (3299)     67.8 (1086)     --
water (50:50 v/v)
R-134a                 -154 (-103)   0.048 (0.083)    0.337 (1410)     76.4 (1223)     93 (216)
R-744                  -70 (-57)     0.049 (0.085)    0.815 (3412)     48.4 (775)      66 (153)


Water. Water has several positive attributes as a cooling fluid: it is low in cost, non-toxic, plentiful, and usable in virtually any size room. Water has also been used in data center cooling for many years. Conversely, water can introduce a host of issues to the data center, especially when it is distributed closer to the heat load. Water is a conductive liquid, so cooling system leaks can be electrically disastrous. It is also corrosive and requires careful engineering of the materials used in system construction. When water is used as a cooling fluid, it is typically not recommended for overhead piping or for cooling units located above the electronic equipment, even if the water circuit has controls that keep the water temperature above the dew point in the room.

Refrigerant. By contrast, refrigerants such as R-134a and R-744 (CO2) are non-conductive and exist in a vapor state at room conditions. They are non-toxic, non-flammable, environmentally friendly (Ozone Depletion Potential of zero) and fully approved for use as a coolant. However, at data center operating temperatures, R-744 has an operating pressure approximately 10 times higher than the typical operating pressure for R-134a. Therefore, the piping, connections and units in an R-744 based system must be designed for this considerably higher pressure.

R-134a provides very high heat transfer performance in two-phase operation. Required flow rates for water-based systems tend to be four to eight times higher than for two-phase R-134a, and pressure drops in the cooling system are significantly lower in refrigerant systems than in water systems (Hannemann 2007).
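The flow-rate difference follows from sensible versus latent heat transfer. The sketch below is a rough comparison using the Table 1 properties, not Hannemann's analysis; the 10 kW load, the 12°F water temperature rise and the assumption of complete evaporation of the refrigerant are illustrative choices.

Q_BTU_H = 10.0 * 3412.0              # assumed 10 kW heat load, in Btu/h

# Single-phase water: mass flow = Q / (cp * deltaT), with cp from Table 1
# and an assumed 12 deg F rise across the coil.
cp_water = 1.004                      # Btu/lb per deg F
water_lb_h = Q_BTU_H / (cp_water * 12.0)

# Two-phase R-134a: mass flow = Q / h_fg, assuming the liquid fully evaporates.
h_fg_r134a = 93.0                     # Btu/lb, latent heat from Table 1
r134a_lb_h = Q_BTU_H / h_fg_r134a

print(round(water_lb_h), "lb/h of water")     # ~2830 lb/h
print(round(r134a_lb_h), "lb/h of R-134a")    # ~370 lb/h
print(round(water_lb_h / r134a_lb_h, 1), "x more water mass flow")  # ~7.7x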

From an efficiency perspective, refrigerant performs better than water for high-density cooling because the greater heat absorption capacity of a two-phase refrigerant means lower fluid volumes are needed to remove comparable heat. Refrigerant for high heat density cooling can be used in either direct expansion or pumped versions. In the pumped refrigerant version, there is no compressor operating in the circuit, unlike a direct expansion refrigeration system (Figure 3). This allows the pumped refrigerant circuit to operate at a considerably lower pressure and, because no oil is needed in the pumped refrigerant circuit, oil traps and other oil-related issues are avoided.

[FIGURE 3 OMITTED]

In the pumped refrigerant version, the refrigerant is pumped through the piping system as a liquid, becomes a gas within the distributed cooling units as the heat from electronic equipment is transferred into the fluid circuit, and then is returned to either a pumping unit or a chiller. In the pumping unit/chiller, the heat is rejected from the fluid circuit as the gas is condensed back to a liquid before it is pumped back to the cooling unit. This phase change of the fluid contributes to greater system efficiency than that of water-based systems.

Since refrigerants are non-conductive and exist as a vapor at room conditions, refrigerant piping and cooling units can be placed above the racks, provided the controls keep the fluid temperature above the dew point in the room. This can save floor space.

Architecture

Cooling can be brought closer to the load through a closed, open or semi-closed/open architecture. The main advantage of the closed and semi-closed/open architectures is their ability to separate the hot and cold air and therefore increase the capacity and efficiency of the cooling system. Nevertheless, even in an open architecture environment the capacity and efficiency can be increased if the fluid is brought close to the heat source, so that the opportunity for hot and cold air to mix is minimized.

Open Architecture. By definition, the open architecture has the active cooling source outside the enclosure. Typically this means that the cooling units are placed at the perimeter of the room and supply cold air to the front of the racks via a raised floor (Figure 1). The open architecture utilizes the room air volume as thermal storage to ride through short power outages. In an open architecture for high heat density, where distributed cooling units are on or near racks but not part of an enclosure, room air is used as a buffer in the event of a failure, making it a safer alternative in many cases. An example of a high heat density cooling system with open architecture and distributed cooling is shown in Figure 4.

[FIGURE 4 OMITTED]

The ride-through time until an over-temperature limit is reached during a failure depends in general on the heat load, air volume, thermal mass, and initial conditions in the space. For an open architecture solution, the ride-through time is longer than for a closed architecture; typically it is several minutes. With large rooms and low heat densities, the time can be much longer; in some cases more than one hour. Figure 5 (Stahl 2001) shows full-scale tested ride-through times for different heat densities in an open architecture configuration in a relatively small room.

[FIGURE 5 OMITTED]
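A first-order sense of these ride-through times can be had from a simple energy balance on the room air. The sketch below is only an air-only lower bound with assumed room and load values, not the tested data of Figure 5; real rooms ride longer because racks, the floor slab and the walls also absorb heat.

AIR_DENSITY = 0.075   # lb/ft^3, standard air
AIR_CP      = 0.24    # Btu/lb per deg F

def air_only_ride_through_min(room_ft3, load_kw, allowable_rise_f):
    # Minutes until the room air alone warms by allowable_rise_f with no cooling.
    stored_btu = room_ft3 * AIR_DENSITY * AIR_CP * allowable_rise_f
    load_btu_h = load_kw * 3412.0
    return 60.0 * stored_btu / load_btu_h

# Assumed example: 20,000 ft^3 of room air, 15 deg F allowable temperature rise.
print(round(air_only_ride_through_min(20000, 100.0, 15.0), 1))  # ~0.9 min at 100 kW
print(round(air_only_ride_through_min(20000, 25.0, 15.0), 1))   # ~3.8 min at 25 kW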

In addition to providing better thermal ride-through in the event of a catastrophic failure, an open architecture allows greater flexibility to reconfigure as additional cooling capacity is needed.

Closed Architecture. Closed architecture fully encloses the rack, or a group of racks. The active cooling source can be located inside the actual rack (embedded cooling) or inside the closed architecture environment. An example of a high heat density cooling system with closed architecture is shown in Figure 6. Note that the rack in this figure has a fail-safe function that automatically opens the doors in case of a failure, converting the rack to the open architecture solution.

[FIGURE 6 OMITTED]

Using distributed cooling in a closed architecture, the electronic and cooling equipment are located together in a sealed environment. This approach provides high-capacity cooling at the expense of flexibility and fault tolerance if failure-mode precautions are not built in. Closed architecture cooling offers limited flexibility of rack combinations and often no back-up emergency cooling. If the cooling fails, racks are isolated from any room cooling.

For a closed architecture solution, the over-temperature limit can be reached very quickly in case of a failure; in extreme cases the ride-through time can be less than 60 seconds.

Semi-Closed/Open Architecture. Semi-closed/open architecture can have the active cooling source located inside or outside the space.

The semi-closed/open architecture approach can apply to both individual racks and groups of racks. When applied to a group of racks arranged in rows as in Figure 7, it is often called cold aisle containment. Sealing the cold aisle with doors and ceiling panels separates the cold and hot air, increasing the efficiency and capacity of the cooling system. Aisle containment can also be applied to the hot aisle. However, compared with hot aisle containment, where the focus is on containing the hot air, cold aisle containment focuses not only on separating hot and cold air, but also on delivering cold air to the cold aisle where the electronic equipment air inlets are located.

[FIGURE 7 OMITTED]

Aisle containment can be done with the aisle fully contained, or partially contained with only the end of the aisle closed off with doors or "curtains" as in Figure 8.

[FIGURE 8 OMITTED]

In the raised floor version of cold aisle containment, the active cooling is outside the containment, typically along the perimeter of the room. Alternatively, it can also be placed inside the containment, with or without a ceiling cover, as in Figure 9.

[FIGURE 9 OMITTED]

It should be noted that in many cases, when racks are connected with ducts on the inlet or exhaust side, additional fans are required to overcome the pressure drop the ducts add. These fans add to the total power draw and also generate additional heat that must be removed from the room.

Choosing Cooling Fluid and Architecture Solutions

Selecting the best cooling solution for a given data center is not easy. Because the contributing factors are typically complex and often competing, the best solution is, by nature, situation specific. Therefore, when choosing cooling fluid and architecture, it is important to structure the requirements for the desired cooling solution and compare how each available alternative meets them. This comparison can be done utilizing Table 2 and Table 3.
Table 2. Comparison of Cooling Fluids Based on Cooling Solution
Requirements

 Cooling Solution    Refrigerant Technology    Water Based Technology
    Requirement

                               ***                       **

Capacity to Cool     Phase changing of the     One-phase fluid in the
High Heat            fluid in the system       system can limit
Densities            yields higher capacities  capacity.
                     in limited space.

                               **                        *

Flexibility to       Pre-piped room and quick  Pre-piped room and quick
Equipment            connect couplings can     connect couplings can
Reconfiguration and  allow flexibility to      allow flexibility to
Changed Room         reconfigure.              reconfigure. However,
Layout                                         reconfiguration cannot
                                               be done without
                                               introducing
                                               water-related risks to
                                               the data center.

                               ***                       **

Energy Efficiency    Phase changing of the     Pumping water to the
                     fluid in the circuit      heat exchangers, located
                     yields very good energy   close to the heat
                     efficiency due to         source, yields good
                     smaller pumps and less    energy efficiency.
                     pressure drop in the
                     heat exchangers located
                     close to the heat
                     source.

                               ***                       **

Provide Thermal      Due to the phase          The water (one-phase
Ride Through in      changing of the fluid     fluid) contained in the
Case of a Failure    contained in the piping   piping circuit, can
                     circuit, thermal ride     yield some thermal ride
                     through time can be       through time.
                     achieved.

                               ***                       **

Floor Space          Refrigerant technology    With water based
Efficiency           enables floor             technology, non-overhead
                     space-saving overhead     solutions are typically
                     solutions.                used because of water
                                               related risks.

                                *                        *

Low Complexity of    Heat exchangers close to  Heat exchangers close to
Cooling Redundancy   the heat source increase  the heat source increase
                     complexity of cooling     complexity of cooling
                     redundancy.               redundancy.

                               ***                       *

Avoid Possibility    No water introduced in    Requires careful piping
for Water Leaks in   the middle of the data    layout, piping
the Data Center      center.                   containment/trays,
                                               detection and isolation
                                               to minimize the
                                               possibility of a water
                                               leak.

                                *                        *

Possibility to       Requires space for        Requires space for
Implement as         distribution piping (and  distribution piping (and
Retrofit             heat exchangers) to       heat exchangers) to
                     implement.                implement.

                               **                        **

Known and            Direct expansion          Water based cooling was
Comfortable          refrigerant technology    more common 20 years
Technology           has been well known for   ago. The technology is
                     many years. Pumped        slowly coming back into
                     refrigerant technology    use because of
                     is known but relatively   increasing heat
                     new as applied to data    densities.
                     center high heat
                     density cooling.

* Fair ** Good *** Excellent

Table 3. Comparison of Cooling System Architectures Based on Cooling
Solution requirements

Cooling Solution      Open           Closed       Semi Open/Closed
   Requirement    Architecture    Architecture      Architecture

                       **             ***                 **

Capacity to Cool  Can Cool High   Closed          Can Cool High Heat
High Heat         Heat            architecture    Densities
Densities         Densities       has potential
                                  to cool very
                                  high heat
                                  densities.

                      ***             **                  **

Flexibility to    Open cold and   Closed racks    Can limit
Equipment         hot aisle       limit           flexibility due to
Reconfiguration   architecture    flexibility     containment/ducts.
and Changed Room  increases       due to power
Layout            flexibility.    and cooling
                                  connections
                                  (duct/pipe)
                                  and size/
                                  weight of the
                                  rack.

                      **              ***                 **

Energy            Distributed     Yields very     Distributed cooling
Efficiency        cooling units   good energy     units can yield
                  can yield good  efficiency for  good energy
                  energy          the cooling     efficiency for the
                  efficiency for  system.         cooling system.
                  the cooling
                  system.

                      ***              *                  **

Provide Thermal   Due to the      For a closed    Due to the semi
Ride Through in   open            rack (without   open architecture,
Case of a         architecture    automatic door  the room typically
Failure           of both cold    opening or      can be utilized as
                  and hot aisle,  similar) the    a heat sink.
                  the room can    thermal ride
                  be utilized as  through time
                  a heat sink.    is very
                  Ride through    limited,
                  time depends    typically only
                  on many         a few minutes
                  factors but is  or less.
                  typically
                  several
                  minutes. At
                  low heat
                  densities, the
                  time can be
                  much longer.

                      ***              *                  **

Floor Space       Available       Cooling         Containment
Efficiency        overhead units  units/ducts/    parts/ducts can
                  and piping      fans in a       occupy floor
                  requires no     closed          space.
                  floor space.    architecture
                                  rack typically
                                  use premium
                                  floor space.

                      ***              *                  **

Low Complexity    One redundant   Requires one    One redundant
of Cooling        cooling unit    redundant       cooling unit can
Redundancy        can serve many  cooling unit    typically serve
                  racks, or all   per section of  many racks, or
                  racks, in a     closed racks    possibly all racks
                  room.           or for each     in a room.
                                  closed rack.

                      **               *                  **

Possibility to    Most existing   Requires space  Requires some space
Implement as      data centers    to implement.   to implement.
Retrofit          already have
                  an open
                  architecture.

                      ***             **                  **

Known and         Most existing   The use of      The use of semi
Comfortable       data centers    closed          open/closed
Technology        already have    architecture    architecture is
                  the open        is increasing,  increasing.
                  architecture.   especially for
                                  small and
                                  medium size
                                  data centers.

                      ***              *                  **

Flexibility       Open            Closed rack     Semi open/closed
Regarding Rack    architecture    limits          architecture
Types and         allows most     flexibility.    limits flexibility
Manufacturer      racks to be                     somewhat.
                  used.

                      **              **                  **

Environment for   Typically       Typically       Typically yields
Operation/        yields          comfortable     acceptable sound
Maintenance of    acceptable      but can yield   levels, air
Rack Equipment    sound levels,   uncomfortable   velocities and
                  air velocities  air velocities, temperatures.
                  and             air
                  temperatures.   temperatures
                                  and noise
                                  levels when the
                                  closed rack has
                                  to be opened
                                  for maintenance
                                  work.

                      ***              *                  **

Accessibility     Open            Closed cooling  Semi open/closed
for Operation/    architecture    architecture    architecture
Maintenance of    allows access   racks limit     can limit
Rack Equipment    to the front    access for      access
                  and rear of     operation/      somewhat.
                  racks.          maintenance
                                  work.

* Fair ** Good *** Excellent
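
One way to act on these comparisons is to convert the star ratings into a weighted score that reflects site-specific priorities. The sketch below is a hypothetical illustration, not a method prescribed in this article; the three requirements shown, their ratings (taken from Table 3) and the weights are example values only.

STARS = {"*": 1, "**": 2, "***": 3}   # Fair, Good, Excellent

# Three example requirements and their Table 3 ratings per architecture.
ratings = {
    "open":             {"capacity": "**",  "ride_through": "***", "retrofit": "**"},
    "closed":           {"capacity": "***", "ride_through": "*",   "retrofit": "*"},
    "semi_open_closed": {"capacity": "**",  "ride_through": "**",  "retrofit": "**"},
}

# Assumed site priorities: capacity matters most here, retrofit ease least.
weights = {"capacity": 3, "ride_through": 2, "retrofit": 1}

def score(architecture):
    # Weighted sum of the star ratings for one architecture.
    return sum(weights[req] * STARS[stars]
               for req, stars in ratings[architecture].items())

for arch in ratings:
    print(arch, score(arch))
# open 14, closed 12, semi_open_closed 12 with these particular weights;
# different priorities (e.g. weighting capacity even higher) change the ranking.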


SUMMARY

As heat densities continue to rise, the possibility of hot spots and overburdened cooling systems also increases. It is important that facility and data center managers periodically examine their existing cooling capabilities to ensure not only that current needs are being met, but also that the data center has the flexibility needed to meet future demands.

While there are a number of baseline steps that can be taken to optimize traditional cooling, high heat densities may require the installation of cooling technologies that are specifically designed to handle these applications. Only by fully understanding the cooling fluid and system architecture options available can one hope to make the most informed decision on the type of cooling technology that best meets the needs of the data center.

REFERENCES

ASHRAE. 2009. Best practices for datacom facility energy efficiency.

ASHRAE. 2005. Datacom equipment power trends and cooling applications. www.ashrae.org.

Hannemann, R. and H. Chu. 2007. Analysis of alternative data center cooling approaches. InterPACK '07, Paper InterPACK-1176. http://www.thermalformandfunction.com/documents/InterPACK-1176.pdf

Stahl, L. and C. Belady. 2001. Designing an alternative to conventional room cooling. IEEE, 23rd Telecommunications Energy Conference, Oct. 2001, pp. 109-115. http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=988519.

Lennart Stahl is senior marketing manager for Liebert Cooling Products, Emerson Network Power in McKinney, TX.