Commodity Data Center Design
James Hamilton
2007-10-08
[email protected]
http://research.microsoft.com/~jamesrh
Containerized Products
1/21/2007
• Nortel Steel Enclosure – containerized telecom equipment
• Sun Project Black Box – 242 systems in a 20' container
• Rackable Systems Concentro – 1,152 systems in a 40' container (9,600 cores / 3.5 PB)
• Rackable Systems container cooling model
• Caterpillar – portable power
• Datatainer; ZoneBox
• Google WillPower (Will Whitted); Petabox (Internet Archive / Brewster Kahle)
Cooling, Feedback, & Air Handling Gains
• Tighter control of air flow increases delta-T and overall system efficiency
• Expect increased use of special enclosures, variable-speed fans, and warm machine rooms
• CRACs closer to servers for a tighter temperature-control feedback loop
• Containers take this one step further: very little air in motion, variable-speed fans, & tight feedback between CRAC and load
(Images: Intel, Intel, Verari)
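The air-handling gains above can be sketched numerically. A minimal sketch, assuming air's specific heat (~1005 J/kg·K) and the fan affinity law (fan power scales roughly with the cube of flow); the 200 kW container load and the two delta-T values are illustrative assumptions, not figures from the deck.

```python
# Sketch (assumed numbers): how raising delta-T cuts air-handling power.
CP_AIR = 1005.0  # J/(kg*K), specific heat of air at constant pressure

def airflow_kg_per_s(heat_load_w, delta_t_k):
    """Mass flow of air needed to remove heat_load_w at a given delta-T."""
    return heat_load_w / (CP_AIR * delta_t_k)

def relative_fan_power(flow_ratio):
    """Fan affinity law: fan power scales with the cube of flow."""
    return flow_ratio ** 3

load = 200_000.0                          # W, hypothetical container heat load
flow_10k = airflow_kg_per_s(load, 10.0)   # conventional room: ~10 K delta-T
flow_20k = airflow_kg_per_s(load, 20.0)   # tightly ducted container: ~20 K delta-T

ratio = flow_20k / flow_10k               # doubling delta-T halves the airflow
saving = 1.0 - relative_fan_power(ratio)  # cubic law: ~87% less fan power
```

Under these assumptions, doubling delta-T halves the required airflow, and the cube-law fan power drops by roughly 87%, in the same range as the 80% air-handling reduction cited on the next slide.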
Shipping Container as Data Center Module
• Data Center Module
– Contains network gear, compute, storage, & cooling
– Just plug in power, network, & chilled water
• Increased cooling efficiency
– Variable water & air flow
– Better air-flow management (higher delta-T)
– 80% air-handling power reductions (Rackable Systems)
• Bring your own data center shell
– Just central networking, power, cooling, security & admin center
– Can be stacked 3 to 5 high
– Fewer regulatory issues (e.g., no building permit)
– Avoids (for now) building floor-space taxes
• Political/social issues
– USA PATRIOT Act concerns & regional restrictions
• Move resources closer to the customer (CDN mini-centers)
• Single customs clearance on import
• Single FCC compliance certification
• Distributed, incremental, fast-built mini-centers
Manufacturing & H/W Admin. Savings
• Factory racking, stacking & packing much more efficient
– Robotics and/or inexpensive labor
• Avoid layers of packaging
– Systems → packing box → pallet → container
– Saves materials cost, wastage, and labor at the customer site
• Data center power & cooling require expensive consulting contracts
– Data centers are still custom-crafted rather than prefab units
– Move the skill set to the module manufacturer, who designs power & cooling once
– Installation designed to meet module power, network, & cooling specs
• More space efficient
– Power densities in excess of 1,250 W/sq ft
– Rooftop or parking-lot installation acceptable (with security)
– Stack 3 to 5 high
• Service-free
– H/W admin contracts can exceed 25% of system cost
– Sufficient redundancy that the module just degrades over time
• At end of service, return for remanufacture & recycling
– 20% to 50% of system outages are caused by admin error (A. Brown & D. Patterson)
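The service-free point above can be illustrated with a simple survival model. A sketch under assumed numbers: the 3% annual failure rate and 3-year service life are illustrative, the 1,152-server count echoes the Concentro figure from the earlier slide, and failures are assumed independent with no on-site repair.

```python
# Sketch (assumed failure rates): capacity of a sealed, service-free
# container that is never repaired on-site, only returned at end of life.
def surviving_fraction(annual_failure_rate, years):
    """Expected fraction of servers still running after `years`,
    assuming independent failures and no repair."""
    return (1.0 - annual_failure_rate) ** years

servers = 1152   # hypothetical container population (Concentro-sized)
afr = 0.03       # assumed 3% annual server failure rate

for year in range(1, 4):
    alive = servers * surviving_fraction(afr, year)
    print(f"year {year}: ~{alive:.0f} servers still in service")
```

With enough redundancy in the fleet, losing a few percent of servers per year degrades capacity gracefully until the whole module is swapped out for remanufacture.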
Systems & Power Density
• Estimating datacenter power density is difficult (15+ year horizon)
– Power is 40% of DC costs
• Power + mechanical: 55% of cost
– Shell is roughly 15% of DC cost
– Cheaper to waste floor than power
• Typically 100 to 200 W/sq ft
• Rarely as high as 350 to 600 W/sq ft
– Modular DC eliminates the impossible shell-to-power trade-off
• Add modules until the power is absorbed
• 480VAC to the container
– High-efficiency DC distribution within
– High voltage to the rack can save >5% over 208VAC
• Over 20% of entire DC cost is in power redundancy
– Batteries able to supply up to 12 min at some facilities
– N+2 generation at over $2M each
• Instead, use more, smaller, cheaper data centers
• Eliminate redundant power & bulk of shell costs
• Resource equalization
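Why it is "cheaper to waste floor than power" follows from the cost shares above. A sketch: the 55% power+mechanical share and 15% shell share are the slide's figures, while the $200M facility cost, 20 MW critical load, and 150,000 sq ft of floor are hypothetical round numbers chosen for illustration.

```python
# Sketch: marginal cost of stranded power vs. wasted floor,
# using the slide's cost shares plus assumed facility numbers.
TOTAL_COST = 200e6      # USD, hypothetical facility build-out
POWER_SHARE = 0.40      # power distribution & redundancy (slide figure)
MECH_SHARE = 0.15       # mechanical, so power + mech = 55% (slide figure)
SHELL_SHARE = 0.15      # building shell (slide figure)

CAPACITY_W = 20e6       # W of critical load, hypothetical
FLOOR_SQFT = 150_000    # sq ft of floor, hypothetical

cost_per_watt = TOTAL_COST * (POWER_SHARE + MECH_SHARE) / CAPACITY_W
cost_per_sqft = TOTAL_COST * SHELL_SHARE / FLOOR_SQFT

# Stranding 1 MW of provisioned power vs. leaving 1,000 sq ft empty:
stranded_power_cost = 1e6 * cost_per_watt
wasted_floor_cost = 1_000 * cost_per_sqft
```

Under these assumptions a stranded megawatt represents about $5.5M of capital while 1,000 sq ft of empty floor represents about $0.2M, which is why a modular design sizes floor generously and simply adds containers until the power budget is absorbed.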
Where do you Want to Compute Today?
10/08/2007
Slides posted soon to: http://research.microsoft.com/~JamesRH