SAN DIEGO SUPERCOMPUTER CENTER at the UNIVERSITY OF CALIFORNIA, SAN DIEGO
Cyberinfrastructure Services Division

Greening Data Centers
Dallas Thornton, SDSC Division Director, Cyberinfrastructure Services
March 2, 2011

SDSC-McGill study on how relocating data centers dramatically reduces costs


Page 2

Data Centers Are Enormous Users of Power

[Bar chart: US data center electricity use (terawatt-hours per year) — values shown: 27, 61, 125 — compared against US televisions (248 million units).]

Sources: Report to Congress on Server and Data Center Energy Efficiency, Public Law 109-431; U.S. Environmental Protection Agency ENERGY STAR Program, August 2, 2007; Kaufman, Ron. Television's Hidden Agenda. TurnOffYourTV.com, 2004.

Page 3

Measuring Data Center Facility Efficiency

• The most common measure is Power Usage Effectiveness (PUE):

  PUE = [Total Datacenter Electrical Load] / [Datacenter IT Equipment Electrical Load]

Source: Green Grid
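As a quick worked example of the ratio (the 1,500/1,000 kW figures are illustrative, not from the slides):

```python
# PUE (Power Usage Effectiveness), per the Green Grid definition:
# total facility power divided by IT equipment power. A PUE of 1.0
# would mean every watt delivered goes to IT gear.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Example: a facility drawing 1,500 kW overall to run 1,000 kW of IT equipment.
print(pue(1500.0, 1000.0))  # 1.5
```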

Page 4

PUE Tabletop Reference

PUE   Level of Efficiency
3.0   Very Inefficient
2.5   Inefficient
2.0   Average
1.5   Efficient
1.2   Very Efficient
1.0   Ideal

• Typical Server Rooms: from office conversions (worst) to basic hot/cold aisle legacy data centers (better)
• Optimized Data Centers: hot/cold aisle containment, HVAC throttling based on loads, and high-efficiency UPSes
• Greenfield Design in Canada: all of the above + innovative climate-leveraging technologies and designs

Sources: Green Grid; 2008 UC NAM Data Center Audit; 2009 UCSD/SDSC NAM Data Center Audit; 2010 SDSC/McGill University Joint Data Center Design
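The reference table can be read as a simple lookup. A minimal sketch (bucketing readings that fall between the table's rows at the next-better label is my own assumption):

```python
# Maps a measured PUE to the efficiency labels in the table above.
# Thresholds are the table's row values.
def pue_label(pue: float) -> str:
    bands = [
        (1.0, "Ideal"),
        (1.2, "Very Efficient"),
        (1.5, "Efficient"),
        (2.0, "Average"),
        (2.5, "Inefficient"),
    ]
    for threshold, label in bands:
        if pue <= threshold:
            return label
    return "Very Inefficient"

print(pue_label(1.06))  # Very Efficient
print(pue_label(2.8))   # Very Inefficient
```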

Page 5

SDSC Data Center Overview

• ~19,000 sq. ft., 13 MW of on-site power
• Regional co-location data center for UC system
  • 100+ projects from 6 campuses
  • Energy-efficient alternative to server closets, offices, etc.
• Home of SD-NAP
  • Many 10 Gb and 1 Gb connections to other organizations and networks: CENIC, Cox, Time Warner, Salk Institute, Scripps Research Institute, SDSC, etc.

Page 6

Optimizing Features

• Aisle Thermal Containment
  • 15° ΔT from top to bottom of rack → 1° ΔT
  • 10°–15° increase in return temperatures
  • Cold aisle and hot aisle options
  • Fire code considerations

Page 7

Optimizing Features (Cont.)

• Increased Supply Temperatures
  • Move to near top of ASHRAE spec (80°F)
  • Drives AHU return temperatures higher, allowing more cooling from chilled water
• VFD Fans on AHUs
  • Allows for fan energy savings… IF accurate controls can be put in place
• Adaptive Controls
  • Address redundancy and inefficient cooling
  • Allow 'big picture' control of cooling, throttling based on real-time loads
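The fan affinity laws explain why VFD throttling pays off: fan power scales roughly with the cube of fan speed, so a modest speed reduction yields a large energy cut. A minimal sketch (the 70% speed figure is an illustrative assumption, not from the slides):

```python
# Fan affinity law: power varies as the cube of speed.
# Running a fan at 70% speed uses ~34% of full-speed power,
# which is why accurate load-based throttling matters so much.
def fan_power_fraction(speed_fraction: float) -> float:
    return speed_fraction ** 3

print(round(fan_power_fraction(0.7), 3))  # 0.343
```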

Page 8

Optimizing Features (Cont.)

• Rack Blanking Panels
  • Cost-effective solutions: Coroplast
• Floor Brushes
  • Conveyor belt brush: sold in varying lengths
• Efficient Electrical Systems
  • 480V/277V or (even better) 400V/240V power
  • Efficient UPS and generator configs

Page 9

SDSC/McGill Data Center Conceptual Design

• Goal: Most efficient Class One data center in North America
• Optimize cooling systems for Quebec climate
  • Evaporative free cooling – primary cooling
  • Seasonal ice storage – top-up cooling
  • No compressor-based cooling
• 1.06 PUE means UC could achieve full CapEx recovery in less than 10 years with energy cost savings
• Lower-cost, green hydro power
  • $0.045/kWh vs. $0.08–$0.15/kWh in California
• Design funded by grants from Canada-California Strategic Innovation Partnerships (CCSIP) and CLUMEQ

Page 10

Free Cooling Analysis with 65°F CHWS

[Psychrometric chart: Humidity Ratio (lbs H2O per lb dry air) vs. Dry Bulb Temperature (°F). Regions: Full Free Cooling – 7,228 hrs/yr; Partial Free Cooling – 1,380 hrs/yr; Auxiliary Cooling – 152 hrs/yr.]

Data Source: Government of Canada – National Climate Data & Information Archive. Data Set: WMO #71627, Montreal/Pierre Elliott Trudeau Airport, Typical Year. Elevation: 118 feet; Air Pressure: 14.633 psia.
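A quick check on the chart's numbers: the three operating regions account for every hour of a typical year, with free cooling (full plus partial) covering nearly all of it:

```python
# Cooling-mode hours from the Montreal free-cooling analysis above.
hours = {"full_free": 7228, "partial_free": 1380, "auxiliary": 152}

total = sum(hours.values())
print(total)  # 8760 -- a full year

free_pct = 100 * (hours["full_free"] + hours["partial_free"]) / total
print(round(free_pct, 1))  # 98.3 -- percent of the year with some free cooling
```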

Page 11

Supplemental Cooling:Seasonal Ice Storage Pond System

Page 12

Supplemental Cooling:Seasonal Ice Storage Pond System

Page 13

Backup

• Pay for rental chillers only when (if) you ever need them
• Design for portable chillers to connect in an emergency

Page 14

Results

Load split: 10% air cooled / 90% water cooled

Supply Temperatures:
  Air Cooled: 29.4°C (85.0°F)
  Water Cooled: 23.9°C (75.0°F)

Hours of Free Cooling: 8,532 hrs/yr (97% of yr)
PUE: 1.06
Annual Energy Use: 74,543,000 kWh/yr
Energy Cost ($0.058/kWh): $4,323,000

Mechanical Cooling Needed: 228 hours per year; additional load at extreme weather (wet bulb = 68.7°F): 0 tons

Water Usage:
  Evaporation + carry over: 33,200,000 gallons
  Blowdown: 8,100,000 gallons
  Cost ($5.52/1,000 gal): $228,000

Page 15

Potential Facility-Related Cost Savings

Assumptions:
• 5 MW IT Load
• 24x7 Operation

Typical Local DC:
• 2.0 PUE
• 10 MW Consumption
• $0.10/kWh Power Costs
• $8.8M Power Bill

Efficient Local DC:
• 1.35 PUE
• 6.75 MW Consumption
• $0.10/kWh Power Costs
• $5.9M Power Bill

Ultra-Efficient DC:
• 1.06 PUE
• 5.3 MW Consumption
• $0.05/kWh Power Costs
• $2.3M Power Bill

Potential cost savings of 74% and energy savings of 47% through facility changes alone!
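The power bills above follow from IT load × PUE × hours × rate. A small sketch reproducing the slide's arithmetic (the function name is my own):

```python
HOURS_PER_YEAR = 8760  # 24x7 operation

# Annual facility power bill given IT load (MW), PUE, and energy rate.
# Facility draw = IT load * PUE; bill = draw (kW) * hours * $/kWh.
def annual_power_bill(it_load_mw: float, pue: float, usd_per_kwh: float) -> float:
    facility_kw = it_load_mw * 1000 * pue
    return facility_kw * HOURS_PER_YEAR * usd_per_kwh

typical = annual_power_bill(5, 2.0, 0.10)
ultra = annual_power_bill(5, 1.06, 0.05)
print(round(typical))  # 8760000 -- the slide's ~$8.8M
print(round(ultra))    # 2321400 -- the slide's ~$2.3M
```

The ~74% cost saving combines the lower PUE (47% less energy) with Quebec's roughly halved power rate.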

Page 16

“Anyone who knows all the answers most likely misunderstood the questions.”