Commodity Data Center Design
James Hamilton 2007-10-08
JamesRH@microsoft.com
http://research.microsoft.com/~jamesrh
Containerized Products
• Nortel steel enclosure: containerized telecom equipment
• Caterpillar Portable Power
• Rackable Systems Concentro: 1,152 systems in a 40' container (9,600 cores / 3.5 PB)
• ZoneBox Datatainer
• Rackable Systems container cooling model
• Google WillPower (Will Whitted)
• Internet Archive Petabox (Brewster Kahle)
• Sun Project Blackbox: 242 systems in a 20' container
Cooling, Feedback, & Air Handling Gains
• Tighter control of air flow increases delta-T and overall system efficiency
• Expect increased use of special enclosures, variable-speed fans, and warm machine rooms
• CRACs placed closer to servers give a tighter temperature-control feedback loop
• Containers take this a step further: very little air in motion, variable-speed fans, & tight feedback between CRAC and load
[Photos: Verari, Intel]
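To see why a higher delta-T translates into air-handling savings, here is a minimal sketch (my illustration; the 200 kW load and the delta-T values are assumptions, not figures from the talk). The airflow needed to remove a fixed heat load follows Q = rho * V * cp * dT, so flow scales as 1/delta-T, and by the fan affinity laws fan power scales roughly with the cube of flow.

```python
# Illustrative sketch: airflow and fan power vs. delta-T for a fixed heat load.
# Assumptions (not from the slides): 200 kW IT load per container, standard
# air properties, and fan power following the cube-law fan affinity relation.

RHO_AIR = 1.2      # air density, kg/m^3
CP_AIR = 1005.0    # specific heat of air, J/(kg*K)

def airflow_m3s(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow needed to remove heat_load_w at a given delta-T.
    From Q = rho * V * cp * dT  =>  V = Q / (rho * cp * dT)."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_k)

load = 200_000.0                 # 200 kW container heat load (assumed)
base = airflow_m3s(load, 10.0)   # conventional room: ~10 K delta-T (assumed)
tight = airflow_m3s(load, 25.0)  # tightly ducted container: ~25 K (assumed)

# Fan affinity law: fan power scales roughly with the cube of flow.
fan_power_ratio = (tight / base) ** 3
print(f"airflow: {base:.1f} -> {tight:.1f} m^3/s ({tight/base:.0%} of baseline)")
print(f"fan power: ~{fan_power_ratio:.0%} of baseline")
```

At these assumed numbers, 2.5x the delta-T needs only 40% of the airflow and, under the cube law, well under a tenth of the fan power, which is directionally consistent with the 80% air-handling reduction cited on the next slide.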
Shipping Container as Data Center Module
• Data center module
  • Contains network gear, compute, storage, & cooling
  • Just plug in power, network, & chilled water
• Increased cooling efficiency
  • Variable water & air flow
  • Better air-flow management (higher delta-T)
  • 80% air-handling power reduction (Rackable Systems)
• Bring your own data center shell
  • Site provides only central networking, power, cooling, security & admin
  • Can be stacked 3 to 5 high
  • Fewer regulatory issues (e.g., no building permit)
  • Avoids (for now) building floor-space taxes
• Political/social issues
  • USA PATRIOT Act concerns & regional restrictions
  • Move resources closer to customers (CDN mini-centers)
• Single customs clearance on import
• Single FCC compliance certification
• Distributed, incrementally deployed, fast-to-build mini-centers
Manufacturing & H/W Admin. Savings
• Factory racking, stacking & packing is much more efficient
  • Robotics and/or inexpensive labor
• Avoids layers of packaging
  • Systems -> packing box -> pallet -> container
  • Saves materials cost, waste, and labor at the customer site
• Data center power & cooling currently requires expensive consulting contracts
  • Data centers are still custom-crafted rather than prefab units
  • Move the skill set to the module manufacturer, who designs power & cooling once
  • Installation is designed to meet module power, network, & cooling specs
• More space efficient
  • Power densities in excess of 1,250 W/sq ft
  • Rooftop or parking-lot installation acceptable (with security)
  • Stack 3 to 5 high
• Service-free (a simple degradation model is sketched below)
  • H/W admin contracts can exceed 25% of system cost
  • Sufficient redundancy that the module just degrades over time
  • At end of service life, return for remanufacture & recycling
  • 20% to 50% of system outages are caused by admin error (A. Brown & D. Patterson)
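A minimal sketch of the service-free model (my illustration, with assumed failure rates, not numbers from the talk): treat node failures as independent with a constant annual rate, never repair in the field, and ask what fraction of a container's capacity survives to the end-of-life return point.

```python
# Illustrative sketch of the "service-free" container model (assumed numbers):
# nodes fail independently at a constant annual rate and are never repaired;
# the module is returned for remanufacture at end of its service life.
import math

NODES = 1152                  # e.g., a Concentro-class 40' container
ANNUAL_FAILURE_RATE = 0.04    # assumed 4%/year per-node failure rate
SERVICE_LIFE_YEARS = 3        # assumed service interval

for year in range(SERVICE_LIFE_YEARS + 1):
    # Exponential survival under a constant hazard rate.
    surviving = NODES * math.exp(-ANNUAL_FAILURE_RATE * year)
    print(f"year {year}: ~{surviving:.0f} nodes "
          f"({surviving / NODES:.1%} of capacity)")
```

Under these assumed rates the container still delivers roughly 89% of its original capacity at the 3-year return point, so provisioning modest headroom lets software route around dead nodes instead of dispatching technicians.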
Systems & Power Density
• Estimating datacenter power density is difficult (15+ year horizon)
  • Power is 40% of DC costs
  • Power + mechanical: 55% of cost
  • Shell is roughly 15% of DC cost
  • Cheaper to waste floor space than power
  • Typically 100 to 200 W/sq ft
  • Rarely as high as 350 to 600 W/sq ft
• Modular DC eliminates the impossible shell-to-power trade-off
  • Add modules until the power is absorbed (see the sketch below)
• 480VAC to the container
  • High-efficiency DC distribution within
  • High voltage to the rack can save >5% over 208VAC
• Over 20% of total DC cost is in power redundancy
  • Batteries able to supply up to 12 min at some facilities
  • N+2 generation at over $2M each
• Instead, use more, smaller, cheaper data centers
  • Eliminate redundant power & the bulk of shell costs
  • Resource equalization
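To make "add modules until the power is absorbed" concrete, here is a small arithmetic sketch (my numbers, not Hamilton's): given a site's critical power budget and an assumed per-container draw, the module count and the resulting floor-space density fall out directly.

```python
# Illustrative capacity-planning arithmetic for a modular DC (assumed numbers).
SITE_POWER_W = 10_000_000     # assumed 10 MW critical load available at site
CONTAINER_POWER_W = 400_000   # assumed 400 kW draw per 40' container
CONTAINER_SQFT = 320          # 40' x 8' container footprint

modules = SITE_POWER_W // CONTAINER_POWER_W
density_w_per_sqft = CONTAINER_POWER_W / CONTAINER_SQFT
print(f"modules supported: {modules}")
print(f"density: {density_w_per_sqft:.0f} W/sq ft "
      f"vs. 100-200 W/sq ft typical raised floor")
```

With these assumptions the site absorbs 25 containers at 1,250 W/sq ft, matching the density figure on the previous slide; power, not floor space, becomes the only binding constraint, which is exactly the trade-off the modular design removes.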
Where Do You Want to Compute Today?
Slides posted soon to: http://research.microsoft.com/~JamesRH