8/20/2019 ASHRAE Journal - ASHRAE Data Center Thermal Envelope Implications - Hydeman
30 ASHRAE Journal | ashrae.org | August 2010
About the Author: Mark Hydeman, P.E., is a principal and founding partner at Taylor Engineering in Alameda, Calif. He is a member of ASHRAE TC 9.9 and vice-chair of SSPC 90.1. He was lead developer of TC 9.9's Research Work Statement 1499 on humidity control.
Implications of Current Thermal Guidelines for Data Center Energy Use

By Mark Hydeman, P.E., Fellow ASHRAE
Most data centers are designed and operated according to existing industry guidelines for temperature and humidity control. These guidelines include ASHRAE Special Publication Thermal Guidelines for Data Processing Environments,1 ANSI/TIA-942-1 2005 Telecommunications Infrastructure Standard for Data Centers2 developed by the Telecommunication Industry Association (TIA), and the Network Equipment Building Systems (NEBS), which also is used by the telecommunication industry for telecom central office facilities. This article examines the impact of these guidelines on datacom facility efficiency and suggests ways to improve them.
Existing Thermal Guidelines
The current published recommendations for temperature and humidity control in data centers are shown in Figure 1. Thermal Guidelines defines recommended and allowable ranges for temperature and humidity as follows:
• Recommended. Facilities should be designed and operated to target the recommended range; and
• Allowable. Equipment should be designed to operate within the extremes of the allowable operating environment.
In addition, Thermal Guidelines addresses five classes of datacom facilities (Table 1). Class 1: Standard Data Center; Class 2: Information Technology (IT) Space, Office or Lab Environment; Class 3: Home or Home Office; Class 4: Manufacturing Environment or Store; NEBS: Telecommunications Central Office Facility.

As shown in Figure 1, the recommended temperature and humidity envelope expanded in 2008. This change is significant as it increased the number of hours that a typical data center can be cooled using an economizer or evaporative (noncompressor) cooling. However, as discussed later, the upper and lower humidity limits severely handicap the operation of these efficient cooling technologies.
These specifications are defined at the inlet to the IT or data communications (datacom) equipment. A server typically has discharge temperatures 20°F to 60°F (11°C to 33°C) higher than the inlet.
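The discharge rise quoted above follows from the standard-air sensible heat equation, Q [Btu/h] = 1.08 × cfm × ΔT [°F]. A minimal sketch, using hypothetical server wattages and airflows (the example numbers are mine, not from the article):

```python
# Illustrative sketch: estimate the discharge air temperature rise across
# a server from its power draw and airflow, using the standard-air
# sensible heat equation Q [Btu/h] = 1.08 * cfm * dT [degF].
# The wattage and airflow values below are hypothetical examples.

BTUH_PER_WATT = 3.412  # 1 W = 3.412 Btu/h

def discharge_rise_f(watts: float, cfm: float) -> float:
    """Temperature rise (degF) across a server at standard air density."""
    return (watts * BTUH_PER_WATT) / (1.08 * cfm)

if __name__ == "__main__":
    # Same server power, two airflow assumptions: less airflow, bigger rise.
    for watts, cfm in [(350, 30), (350, 18)]:
        print(f"{watts} W at {cfm} cfm -> dT = {discharge_rise_f(watts, cfm):.1f} degF")
```

The two hypothetical cases land near the ends of the 20°F to 60°F range the article quotes, showing how strongly the rise depends on server airflow.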
This article was published in ASHRAE Journal, August 2010. Copyright 2010 American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc. Posted at www.ashrae.org. This article may not be copied and/or distributed electronically or in paper form without permission of ASHRAE. For more information about ASHRAE Journal, visit www.ashrae.org.
Thermal Guidelines' Impact on HVAC Energy Use
The 2008 changes to Thermal Guidelines significantly broadened the temperature range for recommended operation of a datacom facility. Therefore, what is the impact of raising the temperature of the air supplied to the datacom equipment in a datacom facility? The answer depends on the characteristics of the HVAC system. In general, HVAC system cooling capacity and efficiency increase. This is particularly true of systems that have air economizers, water economizers or evaporative cooling. Even without these technologies, a direct-expansion (DX) system uses less energy per unit cooling at a warmer supply air temperature setpoint due to a decrease in the compressor lift.

To simplify this discussion, I initially ignore the impact of elevated equipment inlet temperatures on the power draw of datacom equipment. As described later (in the section "Effect of Server Inlet Temperature on Server Energy"), elevated inlet temperatures can increase the server fan speed (and energy) and also create leakage currents. The server energy and airflow do not increase significantly if we keep the server inlets at or below ~75°F (~23.9°C).
Figure 1: ASHRAE temperature & humidity guidelines for data centers and telecom facilities.

Figure 2 shows the impact of the outdoor air dry-bulb temperature (OAdb) on the cooling capacity and COP of an air-cooled DX unit.3 Although a decrease in outdoor air dry bulb is not precisely the same as an increase in supply air temperature, they are directly related as they both reduce compressor lift.* A decrease in outdoor air dry-bulb temperature reduces the operating condenser pressure, and an increase in the supply air temperature increases the operating evaporator pressure. As shown in Figure 2, a 10°F (5.6°C) decrease in outdoor air dry bulb provides approximately a 5% increase in unit cooling capacity and an 11% increase in COP.

To emphasize the impact of supply temperature on the energy efficiency of data centers, I analyzed bin data for four of the 16 ASHRAE Standard 90.1 climate zones: Washington, D.C., in Climate Zone 4A (mild and moist); Salt Lake City in Climate Zone 5B (cold and dry); Houston in Climate Zone 2A (hot and moist); and San Jose, Calif., in Climate Zone 3C (moderate and marine). According to the U.S. Department of Energy, these four ASHRAE climate zones represent approximately 91.4% of all commercial building new construction and renovations in the period 2003–2007.4

Table 2 presents the energy savings from air-side economizers on air-cooled direct-expansion CRAC units and chilled water CRAH units served by water-cooled chillers in the four climates for our study. This data came from computer simulations that were done for the U.S. Department of Energy's DC-PRO profiling tool for data centers (http://tinyurl.com/34qavoz). The simulations were calibrated to a manufacturer's CRAC and CRAH units and followed the baseline model assumptions from Chapter 11 of Standard 90.1 (Energy Cost Budget Method) for the chilled water plant equipment. The simulations presented in this table did not include humidification.

The data in the table shows the percent of the HVAC energy (including fans, pumps and compressors) that was saved through the addition of an air-side economizer. For the air-cooled CRAC units, the economizers saved between 19% and 63% of the HVAC energy. The largest savings were in Climate Zones 5B (cold and dry) and 3C (moderate and marine), and the smallest savings were in Climate Zone 2A (hot and moist). For the chilled water CRAH units served by a water-cooled chilled water plant, the air-side economizers saved between 12% and 37% of the HVAC energy. The greater savings for air-cooled DX reflect that this equipment is far less efficient than chilled water CRAH units with a water-cooled chilled water plant.
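The compressor-lift effect behind these savings can be illustrated with an idealized (Carnot) refrigeration cycle. The saturated temperatures below are illustrative assumptions of mine, and the ideal cycle overstates the ~11% per 10°F gain measured on real DX equipment, but the direction and rough magnitude agree:

```python
# Idealized (Carnot) sketch of why reducing compressor lift helps.
# Real DX equipment shows roughly an 11% COP gain per 10 degF drop in
# outdoor dry bulb (Figure 2); the ideal cycle exaggerates this, but the
# trend is the same. The saturated temperatures are illustrative guesses.

def f_to_kelvin(t_f: float) -> float:
    return (t_f - 32.0) * 5.0 / 9.0 + 273.15

def carnot_cop(evap_f: float, cond_f: float) -> float:
    """Ideal refrigeration COP = T_evap / (T_cond - T_evap), absolute temps."""
    te, tc = f_to_kelvin(evap_f), f_to_kelvin(cond_f)
    return te / (tc - te)

base = carnot_cop(50.0, 125.0)    # hot day: high condensing temperature
cooler = carnot_cop(50.0, 115.0)  # a 10 degF cooler outdoor condition
print(f"ideal COP gain from 10 degF less lift: {100.0 * (cooler / base - 1.0):.0f}%")
```

Raising the supply (evaporator) temperature instead of lowering the condensing temperature shrinks the same lift term, which is why both moves improve efficiency.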
Class | Allowable Dry Bulb | Recommended Dry Bulb | Allowable RH/Dew Point | Recommended RH/Dew Point
1     | 59°F – 90°F       | 64°F – 81°F          | 20% – 80%              | 42°F DP to 60% RH and 59°F DP
2     | 50°F – 95°F       | 64°F – 81°F          | 20% – 80%              | 42°F DP to 60% RH and 59°F DP
3     | 41°F – 95°F       | N/A                  | 8% – 80%               | N/A
4     | 41°F – 104°F      | N/A                  | 8% – 80%               | N/A
NEBS  | 41°F – 104°F      | 64°F – 81°F          | Maximum of 85%         | Maximum of 55%
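As a rough illustration of applying these limits, here is a minimal sketch that tests a server-inlet condition against the 2008 Class 1 and 2 recommended envelope (64.4°F to 80.6°F dry bulb; dew point between roughly 42°F and 59°F, with RH at or below 60%). The Magnus saturation-pressure approximation and the function names are my own choices, not part of the guideline:

```python
import math

def sat_vp_hpa(t_c: float) -> float:
    """Magnus approximation for saturation vapor pressure over water (hPa)."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def f_to_c(t_f: float) -> float:
    return (t_f - 32.0) * 5.0 / 9.0

def in_2008_recommended(db_f: float, dp_f: float) -> bool:
    """True if dry bulb/dew point fall inside the 2008 Class 1/2
    recommended envelope (approximate bounds in degF)."""
    if not (64.4 <= db_f <= 80.6):          # dry-bulb window
        return False
    if not (41.9 <= dp_f <= 59.0):          # dew-point window
        return False
    # RH is the ratio of vapor pressure at dew point to that at dry bulb.
    rh = sat_vp_hpa(f_to_c(dp_f)) / sat_vp_hpa(f_to_c(db_f))
    return rh <= 0.60

print(in_2008_recommended(75.0, 55.0))   # mid-envelope point
print(in_2008_recommended(66.0, 62.0))   # dew point above the upper limit
```

A check like this, run against hourly weather data, is essentially how the bin analyses later in the article count economizer hours.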
[Figure 1 is a psychrometric chart (dry-bulb temperature, °F, vs. humidity ratio, pounds moisture per pound of dry air; wet-bulb temperature lines slanted, dew-point temperature lines horizontal) overlaying: the new (2008) Class 1 & 2 recommended range; the previous Class 1 & 2 recommended range; generally accepted practice for telecommunications central offices (lower RH limit not defined in telecom); and the allowable Class 1 operating environment (for reference only).]
*Compressor lift is the difference between the evaporator and condenser refrigerant pressures.
Table 1: Thermal guidelines for each of the five classes of datacom facilities.
Percent of HVAC Energy Savings

Climate             | Air-Cooled DX CRAC Units | Chilled Water CRAH Units With Water-Cooled Chillers
Washington, D.C. 4A | 43%                      | 25%
Salt Lake City 5B   | 54%                      | 31%
Houston 2A          | 19%                      | 12%
San Jose, Calif. 3C | 63%                      | 37%

Table 2: Analysis of air-side economizer savings.
[Table: dry-bulb temperature (Tdb) bin data for Washington, D.C. 4A; Salt Lake City 5B; Houston 2A; and San Jose, Calif. 3C — values lost in extraction.]
Table 4: Wet-bulb bin analysis for the four ASHRAE climates.
[Table: wet-bulb temperature (Twb) bin data for Washington, D.C. 4A; Salt Lake City 5B; Houston 2A; and San Jose, Calif. 3C — values lost in extraction.]
Table 5: Coil analysis for impact on CHWR from change in supply air temperature.

…significant: A recent ASHRAE Transactions paper showed ~30% HVAC energy savings in El Paso, Texas, (Climate Zone 3B) and almost 50% in Helena, Mont., (Climate Zone 6B).8 The key issue is that a warmer supply air temperature increases the savings from a water-side economizer as it increases the chilled water return temperature from the CRAH units.
Using an NTU (number of transfer units)9 effectiveness coil model (or a manufacturer's coil selection program), you can establish the impact of supply air temperature on the chilled water flow and return temperature. The results using an NTU effectiveness model are shown in Table 5. The assumptions in this model include:
• 330,000 Btu/h (97 kW) chilled water coil;
• 44°F (6.7°C) chilled water supply; and
• 25°F (14°C) ΔT on the air.

Using the same coil for supply air temperatures of 70°F or 80°F (21°C or 27°C) results in leaving water conditions of around 60°F (16°C) for the 70°F (21°C) supply air temperature and around 65°F (18°C) for the 80°F (27°C) supply air temperature.

As discussed in the symposium paper,8 a warmer chilled water temperature yields more savings from the water-side economizer. In addition to the increased use of the water-side economizer, the pumping energy is reduced as the coils require less flow at the warmer supply air temperature (almost a 25% decrease in flow in the example in Table 5).
Potential Barriers to Energy Savings

I have established that raising the supply air temperature in a data center theoretically can lead to significant energy savings. However, to successfully raise the supply air temperature and realize the savings in a real data center, there are several important issues to address. These include the need for air management; the impact of warmer temperatures on server energy use; the impact of the humidity limits on free cooling; and the application of demand-based controls.
           | 70°F SAT Case | 80°F SAT Case
Dry-Side   |               |
Airflow    | 12,000 cfm    | 12,000 cfm
RAT        | 95°F          | 105°F
SAT        | 70°F          | 80°F
Q_tot      | 330,000 Btu/h | 330,000 Btu/h
Q_sen      | 330,000 Btu/h | 330,000 Btu/h
Wet-Side   |               |
Water Flow | 41.25 gpm     | 30.75 gpm
CHWS       | 44°F          | 44°F
CHWR       | 60°F          | 65°F
Q_tot      | 330,000 Btu/h | 330,000 Btu/h
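The wet-side rows of Table 5 can be sanity-checked with the water-side energy balance Q ≈ 500 × gpm × ΔT (the 500 bundles water density, specific heat and unit conversions). A small sketch; the small mismatch with the tabulated 30.75 gpm reflects coil-model detail the simple balance ignores:

```python
# Sketch: check Table 5's wet-side flows against the water-side energy
# balance Q [Btu/h] ~= 500 * gpm * dT [degF], where dT is the chilled
# water temperature rise (CHWR - CHWS).

def gpm(q_btuh: float, chws_f: float, chwr_f: float) -> float:
    """Water flow (gpm) implied by coil load and water-side dT."""
    return q_btuh / (500.0 * (chwr_f - chws_f))

print(f"70F SAT case: {gpm(330_000, 44.0, 60.0):.2f} gpm")  # matches 41.25 gpm
print(f"80F SAT case: {gpm(330_000, 44.0, 65.0):.2f} gpm")  # close to 30.75 gpm
```

The warmer supply air case needs roughly a quarter less water flow, which is the pumping-energy saving noted above.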
Air Management
Lawrence Berkeley National Laboratory (LBNL) performed benchmarking of dozens of data centers.10 One metric of performance it has tracked is the ratio of fan supply airflow to the airflow through the servers, found by measuring the ΔT across the servers and across the CRAC or CRAH units. In most of the data centers that were evaluated, the servers were running about a 25°F (14°C) ΔT while the CRAC or CRAH units averaged around 10°F (5.6°C). This indicates a 2.5-to-1 ratio in airflow and an avoidable excess use of fan energy.
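The 2.5-to-1 figure follows directly from a sensible heat balance: the same heat passes through the servers and the cooling units, so the airflow ratio is the inverse of the ΔT ratio. A one-line sketch:

```python
# Sketch: infer the CRAH-to-server airflow ratio from measured dT values.
# The same heat Q = 1.08 * cfm * dT flows through servers and CRAH units,
# so cfm_crah / cfm_servers = dT_servers / dT_crah.

def airflow_ratio(dt_servers_f: float, dt_crah_f: float) -> float:
    """Ratio of cooling-unit airflow to server airflow."""
    return dt_servers_f / dt_crah_f

print(airflow_ratio(25.0, 10.0))  # the benchmarked 25 degF vs 10 degF case
```

This is why ΔT measurements alone, with no flow instrumentation, are enough to diagnose oversupplied air.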
The reason for this overflow is generally attributed to several factors: designers being overly conservative; the IT load in the criteria for the data center design not being realized in the field; and facility operators addressing hot spots by speeding up HVAC fans (if using variable speed fans) or turning on standby units.

These issues are effectively addressed through airflow management, including: adding blanking plates to racks to prevent recirculation of air from the hot aisles to the cold aisles; plugging leaks in the floor and between racks; rebalancing floor tiles; and adding hot or cold aisle containment.11 In a demonstration by LBNL at one of its supercomputer facilities (Figure 5), LBNL was able to reduce fan power by 75% and reduce variation in rack inlet temperatures by as much as 26°F (14°C).12
In Figure 5 you see the three rack inlet temperature sensors: the yellow sensor is near the top of the rack, the pink sensor is in the middle of the rack and the blue sensor is located near the bottom of the rack. Horizontally, the figure is divided into three regions: on the left is the baseline or as-found condition; in the middle is a period of adjustment as they set up the cold aisle containment; and to the right is the period of the cold aisle containment test. In the baseline condition the temperatures range from a low of approximately 50°F (10°C) to a high of around 76°F (24°C). By contrast, the temperatures of the three sensors are much closer together during the containment demonstration: they range between 50°F and 60°F (10°C and 16°C). As shown in this graph, the baseline measurements did not meet the ASHRAE recommended temperature limits. By contrast, the cold aisle containment test (Alternate 1) could have increased its supply temperature by approximately 20°F (11°C) and still have been within the 2008 ASHRAE recommended thermal guidelines. Cold aisle containment decreased the spread in temperatures between the three rack inlet sensors.

In addition to getting better temperature control at the racks, the researchers also were able to reduce the operating speed and power of the CRAH units.
Figure 6 shows that they were able to reduce the CRAH power by ~75% during the containment test.
Effect of Server Inlet Temperature on Server Energy
Another practical issue of warmer server inlet temperatures has to do with the way that servers are designed. Servers with either variable speed or two-speed fans demonstrate an increase in energy as the inlet temperatures increase. This increase in server energy is most pronounced as the server inlet temperatures go above approximately 77°F (25°C).13 In addition to the fans, "leakage currents" also can increase server power draw at elevated temperatures.13 In 2009 APC and Dell tested three types of servers in an environmental chamber to demonstrate the server power increase as a function of inlet temperature.13 Their test data corresponds to the data in Figure 7 from a field test by LBNL of server power as the server inlet temperature increases. As shown in Figure 7, the server energy increased nearly 14% as the inlet temperature increased from 70°F to 90°F (21°C to 32°C). Although this energy increase is significant, it is far less than the reduction in facility energy use from using air-side economizers, water-side economizers or evaporative cooling.

Server thermal designers can change the power response to inlet temperature through design. The fact that this equipment was traditionally designed for an inlet temperature of ~75°F doesn't mean that it can't be designed to safely operate at higher temperatures. Energy was relatively inexpensive when many of these server thermal designs were developed. Manufacturers of servers are now creating products that can operate in higher temperature environments. In the spring of last year, one manufacturer announced servers that are designed to operate in environments up to 104°F (40°C).14 At these temperatures it will be possible to cool most data centers completely with either unconditioned air or evaporative cooling.

Server components are temperature sensitive, and thermal stress is a function both of absolute temperature and the rate of change. As we increase the temperatures in data centers to save energy, we are cutting into the safety margins that the manufacturers have provided to protect their equipment.
Humidity Controls

As shown in Table 2, the imposition of the upper and lower humidity limits greatly reduces the effectiveness of air-side economizers in data centers. In the 2008 ASHRAE TC 9.9 publication on the expanded thermal guidelines, the authors defended the upper humidity limit on concerns of conductive anodic filament (CAF) growth, and the lower humidity limit on concerns about electrostatic discharge (ESD).15 Both of these issues are addressed in the March 2010 ASHRAE Journal article,
Figure 5: Rack inlet temperature distributions during containment tests.
[Chart: temperature (°F, 40 to 90) vs. time (6/13/06 12:00p through 6/15/06 12:00p); traces for the low, middle and high rack inlet sensors; regions marked Baseline, Setup and Alternate 1.]
Figure 6: CRAH power and speed during containment tests.
[Chart: CRAH power (kW, 0 to 25) vs. time over the same period; regions marked Baseline, Setup and Alternate 1; fan speed drops from 60 Hz in the baseline to 36 to 40 Hz during Alternate 1.]
Figure 7: Field test of server energy as a function of inlet temperature from LBNL.
[Chart: percent inlet temperature kW (85% to 100%) vs. temperature (70°F to 90°F), with polynomial fit y = 0.0002x² − 0.0214x + 1.4492, R² = 0.9951.]
Humidity Controls for Data Centers: Are They Necessary?16 As established in that article, the current humidity limits were set through industry consensus without the benefit of research or a formal review of failure analysis. Citing existing published research and industry standards, the article establishes that ESD is not effectively addressed by the use of humidity controls and that CAF formation does not occur below 80% RH, and only then if a number of coincidental environmental conditions are present.
Variations in Server Loads and Airflow Requirements

A data center is a dynamic environment: the servers have variable loads and, in many cases, variable speed fans. To maintain inlet conditions and to maximize efficiency, the HVAC cooling capacity and distribution have to change with the loads. The industry practice is moving from temperature sensors representing general conditions (often located in the CRAC or CRAH return) to distributed sensors representing rack inlet temperature (the criterion established in ASHRAE and EIA/TIA guidelines). These sensors are either external to the servers (often mounted on the doors at the rack inlet) or are internal to the servers and accessed through IT network protocols like SNMP.
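As a sketch of what demand-based control driven by such rack-inlet sensors might look like, the following proportional loop trims CRAH fan speed from the warmest inlet reading. The setpoint, gain and speed limits are hypothetical assumptions of mine, not from the article, and a real system would poll the sensors over SNMP or a door-mounted sensor network:

```python
# Hypothetical sketch of demand-based CRAH fan control from rack inlet
# temperatures: a simple proportional loop on the warmest inlet reading.
# The setpoint, gain and speed limits are illustrative assumptions.

SETPOINT_F = 78.0            # keep the warmest rack inlet at or below this
GAIN = 0.05                  # fan speed fraction added per degF of error
MIN_SPEED, MAX_SPEED = 0.4, 1.0

def next_fan_speed(current: float, inlet_temps_f: list[float]) -> float:
    """One control step: adjust fan speed toward holding the warmest inlet."""
    error = max(inlet_temps_f) - SETPOINT_F
    speed = current + GAIN * error
    return min(MAX_SPEED, max(MIN_SPEED, speed))

speed = 0.8
for temps in [[72.0, 75.0, 77.0], [74.0, 79.0, 81.0], [70.0, 72.0, 74.0]]:
    speed = next_fan_speed(speed, temps)
    print(f"warmest inlet {max(temps):.0f}F -> fan speed {speed:.2f}")
```

Keying the loop to the warmest rack inlet, rather than a CRAH return sensor, is what lets the supply temperature float upward without starving the hottest rack.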
Conclusions

There are considerable energy savings to be had from operating data centers at the upper end of the thermal envelope. For DX and chilled water systems, the warmer temperatures increase the capacity and efficiency of the cooling systems. And, in chilled water systems, there are additional savings from the increase in coil ΔT that results in a reduction in pumping energy. For systems with air-side economizers, water-side economizers and evaporative cooling, the warmer temperatures increase the hours that these technologies can completely carry the load.

ASHRAE Technical Committee 9.9, Mission Critical Facilities, Technology Spaces and Electronic Equipment, took an important first step by expanding the upper end of the recommended thermal envelope from 77°F to 80.6°F (25°C to 27°C) in 2008. The next step should be the expansion or elimination of the humidity requirements. As established in this article, these greatly impair the potential savings from air-side economizers. We can achieve significant energy savings in data centers by operating closer to the upper recommended temperature limits, but only if we are willing to violate the recommended humidity limits.
We would achieve even more savings if we could work together with the IT and data communications manufacturers to design their
products and thermal management implementations for higher inlet temperatures. In an effort to realize these savings we need both research and cross-industry collaboration. It is important to have a public forum for these broadly disparate industries to discuss and collaborate on these issues. It is the author's belief that the formation of a formal ANSI/ASHRAE standard for these thermal guidelines would force this collaboration and would provide needed balance in the concerns of all of the affected parties including the IT equipment manufacturers, facility owners and operators, HVAC engineers, the research community, energy efficiency advocates, utility companies and HVAC manufacturers. Given the tremendous amount of energy used by datacom facilities worldwide, this is an issue of concern to all of us.

References
1. ASHRAE. 2008. Thermal Guidelines for Data Processing Environments. Developed by ASHRAE Technical Committee 9.9.
2. ANSI/TIA-942-1 2005, Telecommunications Infrastructure Standard for Data Centers.
3. Data from a Carrier 25 ton Gemini II split system DX unit. Selection made using a standard three-row coil at 12,500 cfm (5900 L/s) and an entering evaporator coil temperature of 80°F (27°C) db and 62°F (17°C) wb.
4. Liu, B. 2010. DOE preliminary determination of Standard 90.1-2010, presented by Bing Liu of PNNL to the Standing Standards Project Committee 90.1. Construction activity data obtained from McGraw-Hill Construction, http://construction.com/.
5. Sorell, V. 2007. "OA economizers for data centers." ASHRAE Journal 49(12):32–37.
6. Product testing reports available from www.etcc-ca.org. Results cited are from the PG&E tests reported by Robert A. Davis.
7. Scofield, M., T. Weaver. 2008. "Data center cooling: using wet-bulb economizers." ASHRAE Journal 50(8):52–58.
8. Stein, J. 2009. "Waterside economizing in data centers: design and control considerations." ASHRAE Transactions 115(2):192–200.
9. 2009 ASHRAE Handbook—Fundamentals, Chapter 4.
10. LBNL. "Benchmarking: Data Centers." High-Performance Buildings for High-Tech Industries. http://hightech.lbl.gov/benchmarking-dc.html.
11. Sullivan, R., L. Strong, K. Brill. 2004. "Reducing Bypass Airflow is Essential for Eliminating Computer Room Hot Spots." Uptime Institute.
12. Silicon Valley Leadership Group. 2008. "Case Study: Lawrence Berkeley National Laboratory Air Flow Management." http://dcee.svlg.org/case-studies-08.html.
13. Moss, D., J. Bean. 2009. "Energy Impact of Increased Server Inlet Temperature." APC White Paper #138. http://tiny.cc/iduhv.
14. Miller, R. 2009. "Rackable CloudRack Turns Up the Heat." Data Center Knowledge. http://tinyurl.com/dfvgkr.
15. ASHRAE Environmental Guidelines for Datacom Equipment—Expanding the Recommended Environmental Envelope. 2008. Developed by ASHRAE TC 9.9.
16. Hydeman, M., D.E. Swenson. 2010. "Humidity Controls for Data Centers: Are They Necessary?" ASHRAE Journal 52(3):48–55.