Page 1: A Sustainable Data Center with Heat-Activated Cooling

Thermodynamic Feasibility

Anna Haywood, Jon Sherbeck, Patrick Phelan, Georgios Varsamopoulos, Sandeep K. S. Gupta

Page 2: Introduction

Thermal architecture side: II-EN: BlueTool: Infrastructure for Innovative Cyberphysical Data Center Management Research

NSF-funded award #0855277

In response to an acknowledged problem: the increasing amount of data center energy use, currently at ~3% of all US energy consumption, roughly 50% of which is used for cooling the data center.

Page 3: The BlueTool Project

Design and on-site assessment services.

BlueWeb: online data and simulation services
• Measurement archive
• Profile archive
• Model archive
• Energy calculator
• Backend data and service access for authorized users

Research at ASU:
• Model development
• Scheduling testing
• Design methodology development
• Alternative, eco-friendly cooling technologies

Consulting services:
• Energy and efficiency assessment
• Design and online solutions
• Expert advising

BlueCenter: experimental testbed
BlueSense: on-site monitoring

http://impact.asu.edu/BlueTool/

Page 4: Objective

The overall objective of the thermal project is to reduce the grid power consumption of the cooling system for data centers.

HOW?
• Use a heat-driven LiBr absorption chiller to reduce the cooling load on a typical Computer Room Air Conditioner (CRAC).
• The heat to drive the chiller will originate from the data center itself.

Page 5: Challenges

1. Generating enough high-temperature heat from the blade components inside the data center. Target heat source: the CPUs.
2. Capturing and transporting that heat effectively and efficiently to a LiBr heat-activated absorption unit, despite the dynamic, fluctuating heat output of the CPUs.

Page 6: Overall Concept: Capture 90% of CPU Heat and Send to Chiller

• CPUs dissipate most of the heat on the board, giving a high heat fraction (HHF).
• We assume a capture fraction (CF) of 0.90, with low transport loss.
• $Q_{HiT,S}$: high-temperature CPU heat captured and sent to the LiBr heat-activated chiller; temperature budget = 70–95 °C.
• $Q_{CRAC}$: remaining heat load handled by the CRAC.

Page 7: High Heat Fraction

[Figure 1. IT server equipment: a 42U rack holding 5 chassis (7U each), 10 blade servers per chassis, with the CPUs on each blade highlighted.]

Dell DataCenter Capacity Planner Tool: 103 W/CPU, 294 W/blade, and 19.78 kW/rack.

HHF = CPU heat / blade heat = 206 W / 294 W = 0.7
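As a quick check of the arithmetic above, here is a minimal Python sketch; the note about chassis overhead is our inference, since 50 blades × 294 W falls short of the quoted 19.78 kW/rack:

```python
# Per-component powers from the Dell DataCenter Capacity Planner Tool (this slide).
W_CPU = 103.0            # W per CPU
CPUS_PER_BLADE = 2
W_BLADE = 294.0          # W per blade (CPUs plus memory, disks, fans, ...)
BLADES_PER_CHASSIS = 10
CHASSIS_PER_RACK = 5

# High heat fraction: share of blade power dissipated by the CPUs.
HHF = (W_CPU * CPUS_PER_BLADE) / W_BLADE
print(f"HHF = {HHF:.2f}")                # -> 0.70

# Blade power alone; the quoted 19.78 kW/rack presumably includes chassis overhead.
blade_power_per_rack = W_BLADE * BLADES_PER_CHASSIS * CHASSIS_PER_RACK
print(f"Blade power/rack = {blade_power_per_rack/1000:.2f} kW")  # -> 14.70 kW
```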

Page 8: Target Heat Source

• Dell PE 1855 blades with Intel® Xeon® Nocona processors at 3.20 GHz
• 2 CPUs/server blade; 103 W/CPU = 206 W/blade
• TDP rated at 72 °C

Page 9:

How much heat is required from the CPUs to run the chiller at its best performance?

Cooling capacity: 10 tons = 35.2 kW, with COP_C = 0.7.

$$\mathrm{COP}_C = \frac{\dot{Q}_e}{\dot{Q}_g} = \frac{\text{Cooling out}}{\text{Heat in}} \quad\Rightarrow\quad \dot{Q}_g = \frac{35.2\ \mathrm{kW}}{0.7} = 50.3\ \mathrm{kW}$$

Goal: 50.3 kW of driving heat, which translates into 269 server blades of the Dell PE 1855 with dual Xeon Nocona CPUs.
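The same sizing as a short Python sketch; the blade-count step assumes 206 W of CPU heat per blade and the 0.90 capture fraction, and lands near (not exactly on) the slide's 269 blades, so the slide's rounding or loss assumptions differ slightly:

```python
# Sizing the driving heat for the absorption chiller (this slide's numbers).
Q_e = 35.2            # kW cooling capacity (10 tons)
COP_C = 0.7           # chiller coefficient of performance

# COP_C = Q_e / Q_g  =>  Q_g = Q_e / COP_C
Q_g = Q_e / COP_C
print(f"Required driving heat Q_g = {Q_g:.1f} kW")    # -> 50.3 kW

# Rough blade count at 206 W CPU heat/blade with a 0.90 capture fraction.
CF = 0.90
blades = Q_g / (0.206 * CF)
print(f"~{blades:.0f} blades of CPU heat needed")     # -> ~271 (slide quotes 269)
```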

Page 10: Steady-State System Analysis

Apply a steady-state system analysis:
• Lay out the data center + cooling system.
• Apply the system equations to the PUE metric to gauge system performance.

Analyze the power effectiveness of the data center using the PUE (Power Usage Effectiveness) metric: the ratio of the power delivered to the facility to the power used exclusively for the IT equipment.

Page 11: System Diagram

[System diagram: work and heat flow paths. Contributors: Dr. Phelan, Anna Haywood, Jon Sherbeck, Phani Domalapally.]

Page 12: PUE Definition

PUE is traditionally defined assuming conventional electric supply configurations:

$$\mathrm{PUE} = \frac{\text{Total Electric Power}}{\text{IT Electric Power}}$$

Industry-benchmarked PUE values (GreenGrid 2009):

PUE   Level of Efficiency
3.0   Very Inefficient
2.5   Inefficient
2.0   Average
1.5   Efficient
1.2   Very Efficient

For non-conventional configurations, using alternative sources or reusing heat, PUE may fall below 1.0.
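As a concrete reading of the ratio (hypothetical numbers, not from the slide):

$$\mathrm{PUE} = \frac{2.4\ \mathrm{MW\ (facility)}}{1.2\ \mathrm{MW\ (IT)}} = 2.0,$$

i.e., "Average": every watt delivered to the IT equipment carries another watt of cooling and overhead.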

Page 13: PUE Applied to Our System Diagram

$$\mathrm{PUE} = \frac{W_{TOT}}{W_{IT}} = \frac{W_{IT} + W_{CRAC} + W_{CP} + W_{Lights} + W_{Loss}}{W_{IT}}$$

The CRAC term relates the electric power for its compressor to the heat load on the CRAC:

$$W_{CRAC} = \frac{Q_{CRAC}}{\mathrm{COP}_{CRAC}}$$

Substituting, with the chiller's evaporator cooling ($Q_{EVAP}$) and the CPU heat load removal ($Q_{HiT,S}$) subtracted from the total heat load:

$$\mathrm{PUE} = \frac{W_{IT} + \dfrac{Q_{TOT} - Q_{EVAP} - Q_{HiT,S}}{\mathrm{COP}_{CRAC}} + W_{Lights} + W_{Loss} + W_{CP}}{W_{IT}}$$
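A minimal Python rendering of this bookkeeping (argument names mirror the slide's symbols; the helper itself is ours):

```python
def pue(W_IT, W_CP, W_Lights, W_Loss, Q_TOT, Q_EVAP, Q_HiT_S, COP_CRAC):
    """PUE per this slide: the CRAC only removes the heat left over after
    the chiller's cooling (Q_EVAP) and the CPU heat diverted to storage
    (Q_HiT_S) are subtracted from the total heat load (Q_TOT)."""
    W_CRAC = (Q_TOT - Q_EVAP - Q_HiT_S) / COP_CRAC
    return (W_IT + W_CRAC + W_Lights + W_Loss + W_CP) / W_IT
```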

Page 14: Equations Relating PUE to HHF, CF, and Q_EXT

The total heat flow from the data center acts as a load on the cooling equipment:

$$Q_{TOT} = W_{IT} + W_{Loss} + W_{Lights}$$

where
• $Q_{EVAP} = \mathrm{COP}_C\,(\mathrm{HHF}\cdot\mathrm{CF}\cdot W_{IT} + Q_{EXT})$ is the chiller's cooling capacity, which reduces the heat load on the CRAC ($Q_{EXT}$ is external heat supplied to the chiller);
• $Q_{HiT,S} = \mathrm{HHF}\cdot\mathrm{CF}\cdot W_{IT}$ is the heat extraction from the CPUs to the storage, the portion of rack heat driving the chiller.

Substituting into the PUE expression:

$$\mathrm{PUE} = \frac{W_{IT} + \dfrac{(W_{IT} + W_{Loss} + W_{Lights}) - \mathrm{COP}_C(\mathrm{HHF}\cdot\mathrm{CF}\cdot W_{IT} + Q_{EXT}) - \mathrm{HHF}\cdot\mathrm{CF}\cdot W_{IT}}{\mathrm{COP}_{CRAC}} + W_{Lights} + W_{Loss} + W_{CP}}{W_{IT}}$$

Page 15: Rearranging Terms and Simplifying

$$\mathrm{PUE} = 1 + \frac{1}{\mathrm{COP}_{CRAC}}\left[1 + \frac{W_{Loss}}{W_{IT}} + \frac{W_{Lights}}{W_{IT}} - \mathrm{COP}_C\left(\mathrm{HHF}\cdot\mathrm{CF} + \frac{Q_{EXT}}{W_{IT}}\right) - \mathrm{HHF}\cdot\mathrm{CF}\right] + \frac{W_{Loss}}{W_{IT}} + \frac{W_{Lights}}{W_{IT}} + \frac{W_{CP}}{W_{IT}}$$

PUE can become less than one. The cooling portion (heat removal) is divided by $\mathrm{COP}_{CRAC}$ to represent electric power, i.e., $W_{Cool,in} = Q_L/\mathrm{COP}_{CRAC}$.

The term

$$\frac{\mathrm{COP}_C\left(\mathrm{HHF}\cdot\mathrm{CF} + Q_{EXT}/W_{IT}\right)}{\mathrm{COP}_{CRAC}}$$

suggests that external heating can generate excess cooling that can be "exported," i.e., used to cool adjacent rooms or facilities (power out).
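The simplified expression as a small Python sketch (a transcription under the slide's symbols, with all W-terms expressed as ratios to W_IT):

```python
def pue_simplified(loss, lights, cp, hhf, cf, q_ext_ratio, cop_c, cop_crac):
    """Simplified PUE. loss, lights, cp are W_Loss/W_IT, W_Lights/W_IT,
    W_CP/W_IT; q_ext_ratio is Q_EXT/W_IT. A negative bracket means the
    chiller over-covers the CRAC load and cooling is exported."""
    crac_load = 1 + loss + lights - cop_c * (hhf * cf + q_ext_ratio) - hhf * cf
    return 1 + crac_load / cop_crac + loss + lights + cp
```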

Page 16: Calculated Values for Our Data Center with 6 Racks

Electric and heat power:

Quantity              Value         Note
W_IT (6 racks)        118.70 kWe
Q_HiT,S (CPUs)        74.77 kWth    = HHF · CF · W_IT
W_Loss                0.55 kWe
W_Lights              0.50 kWe
W_CP                  0.21 kWe
W_CRAC                -1.88 kWe     negative: the chiller over-covers the CRAC load

Coefficients of performance and fractions:

COP_CRAC   3.9    typical COP_R
COP_C      0.7    optimum
HHF        0.70
CF         0.90   expected

Result: PUE = 0.99
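Plugging the table's values into the PUE expression from the earlier slides reproduces both the negative CRAC term and the headline figure; a minimal, self-contained check:

```python
# Slide values (kWe unless noted).
W_IT, W_Loss, W_Lights, W_CP = 118.70, 0.55, 0.50, 0.21
HHF, CF, COP_C, COP_CRAC = 0.70, 0.90, 0.7, 3.9
Q_EXT = 0.0                                # no solar assist in this case

Q_HiT_S = HHF * CF * W_IT                  # CPU heat to storage: 74.77 kWth
Q_EVAP = COP_C * (Q_HiT_S + Q_EXT)         # chiller cooling: ~52.3 kW
Q_TOT = W_IT + W_Loss + W_Lights           # heat load: 119.75 kW

# Negative: excess cooling, i.e., the CRAC term becomes a credit.
W_CRAC = (Q_TOT - Q_EVAP - Q_HiT_S) / COP_CRAC
PUE = (W_IT + W_CRAC + W_Lights + W_Loss + W_CP) / W_IT
print(f"W_CRAC = {W_CRAC:.2f} kWe")        # -> about -1.89 (slide: -1.88, rounding)
print(f"PUE = {PUE:.2f}")                  # -> 0.99
```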

Page 17: PUE "Very Efficient" for Our Data Center

[Figure: PUE vs. high-temperature capture fraction (%), plotted for COP_C = 0.5, 0.6, and 0.7; the PUE axis spans roughly 0.97–1.27, falling as the capture fraction rises. At our expected CF = 90% with COP_C = 0.7, PUE = 0.99.]

Page 18: PUE Even Better with Solar Source Added

[Figure: PUE vs. capture fraction (0.5–1.0) with solar heat supplementing the captured CPU heat, for COP_C = 0.5, 0.6, and 0.7; the PUE axis spans roughly 0.80–1.05. With the solar source added, PUE = 0.81.]

Page 19: Conclusion

The potential exists to utilize some of the waste heat generated by data centers to drive absorption chillers, which would then relieve some of the cooling load on the conventional computer room air conditioner (CRAC).

By reusing data center waste heat and supplementing the high-temperature heat captured from the CPUs with an external source of heating, such as solar energy, it is theoretically possible to achieve a PUE (Power Usage Effectiveness) ratio of less than one.

Page 20: Extra Material

Page 21: Example Using Reused Heat

Take an initial PUE of 1.2, where 83% of the facility power goes to the servers:

$$\mathrm{PUE} = \frac{5\ \mathrm{MW}}{4.15\ \mathrm{MW}} = 1.2$$

If 30% of the dissipated heat can be utilized (0.30 × 4.15 MW ≈ 1.24 MW), the PUE drops to 0.9:

$$\mathrm{PUE} = \frac{5\ \mathrm{MW} - 1.24\ \mathrm{MW}}{4.15\ \mathrm{MW}} = 0.9$$
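The same arithmetic as a quick script (numbers from this slide):

```python
total_power = 5.0                  # MW delivered to the facility
it_power = 0.83 * total_power      # 4.15 MW goes to the servers

pue_initial = total_power / it_power
reused = 0.30 * it_power           # ~1.24 MW of dissipated heat put back to work
pue_with_reuse = (total_power - reused) / it_power

print(f"initial PUE = {pue_initial:.2f}")                 # -> 1.20
print(f"PUE with 30% heat reuse = {pue_with_reuse:.2f}")  # -> 0.90
```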