
Oak Ridge National Laboratory (ORNL) Data Centers


Presentation Transcript


  1. Oak Ridge National Laboratory (ORNL) Data Centers July 29, 2009

  2. Current Data Centers • Computational Sciences Building (CSB) • LEED™ Certified Facility • Two 20,000-SF rooms with 3-ft raised floor • 20 MW technical power connected • Metered total-building average hourly peak power of 14 to 16 MW • 6600-ton chilled water plant dedicated to computing • Emergency ties to two other plants, ~1500 tons • Three 1200-ton and two 1500-ton centrifugal chillers • One 750-kW and one 1500-kW back-up generator • One 500-kVA double-conversion UPS with battery back-up • One 1000-kVA double-conversion UPS with flywheel back-up

  3. Computational Sciences Building (CSB), Building 5600 • Home to Jaguar XT5 (ranked #2), Kraken XT5 (#6), Jaguar (#12), and Athena (#21) on the June 2009 TOP500 list. Several business and networking systems are also located in CSB.

  4. The Jaguar Cray XT5 System • 200 Cray XT5 Cabinets (25x8) • 48 Liebert XDP Cooling Systems • 37 kW/cabinet – 7.5 MW total • 184 ECOPhlex-cooled Cabinets • 16 Air-cooled Cabinets • 4500 square feet (system min. only) • 1.38 petaflops peak with 4-core processors • 3.62 terabytes of memory • 10.7 petabytes of disk space • 200+ gigabytes/second disk bandwidth • 2.0 petaflops peak expected at the same power level with an upgrade to 6-core processors

  5. The Kraken XT5 System • 88 Cray XT5 Cabinets (22x4) • 20 Liebert XDP Cooling Systems • 37kW/cabinet – 3.3MW Total • Complete ECOPhlex-cooled Solution • 4-cabinet/XDP and 5-cabinet/XDP Configurations • 1800 square feet (system min. only)
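The power totals quoted for Jaguar and Kraken follow directly from the per-cabinet figure; a quick arithmetic check using only values taken from the two slides above:

```python
# Sanity check of the quoted system power totals from the per-cabinet figure.
# 37 kW/cabinet and the cabinet counts come straight from the slides.
KW_PER_CABINET = 37

jaguar_cabinets = 200
kraken_cabinets = 88

jaguar_mw = jaguar_cabinets * KW_PER_CABINET / 1000  # quoted as "7.5 MW total"
kraken_mw = kraken_cabinets * KW_PER_CABINET / 1000  # quoted as "3.3 MW total"

print(f"Jaguar: {jaguar_mw:.1f} MW")  # 7.4 MW
print(f"Kraken: {kraken_mw:.1f} MW")  # 3.3 MW
```

Both round to the slide figures, so the quoted totals are consistent with the 37 kW/cabinet design point.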

  6. Current Data Centers: Multiprogram Research Facility (MRF) • LEED™ Gold Certified Facility • 32,000 SF 3-ft raised floor • 8200-ton chilled water plant • Four 1500-ton centrifugal chillers with VFDs • One 1200-ton and one 950-ton centrifugal chiller • Emergency ties to two chilled water plants • 20 MW technical power connected • Metering plan in progress • One 750-kW and one 2.25-MW back-up generator

  7. Current Data Centers (cont.) • One 300-kVA double-conversion UPS with battery back-up • One 3250-kW bypass-type UPS with battery back-up • S&C 480-V UPS rated at 98.5%–99% efficiency

  8. Power comes from multiple TVA generating sources • Power flows from the external grid through local distribution to the facility • Ft. Loudoun plant: 10 miles • Bull Run plant: 9 miles • Kingston plant: 13.5 miles • Computing facility served by a 280-MW substation

  9. Features and Lessons Learned • Oil and dry-type 13.8 kV/480V and 480V/208V power transformers • Meet or exceed FEMP-recommended energy efficiency standards • Reliability studies performed on the electrical and chilled water systems to identify approaches that maximize availability within limited budgets and footprint • Computer vendors provide power supplies with ride-through capability (CBEMA or SEMI F47) that greatly improves availability without additional investment in UPSs/generators • Detailed data gathered from the power monitoring system used to analyze power quality events • Thorough investigation of the impact power quality events have on various systems to determine equipment performance boundaries

  10. Features and Lessons Learned (cont.) • Locate medium-voltage power transformers physically closer to loads • Life-cycle cost analysis as the basis for acquisition of high-energy-use equipment (chillers, UPSs, and transformers) • Cooling towers with less drift and less fan horsepower • Optimized BAS control schemes maximize efficiency across various loads and outside conditions • VFD chillers sized to operate in their most efficient load range • Computer cooling systems designed to operate at higher chilled water temperatures than conventional facility HVAC systems • Chillers with the capability to operate with lower-temperature condenser water to improve efficiency • Meet or exceed efficiency recommendations of FEMP and other standards

  11. Features and Lessons Learned (cont.) • Bring cooling liquid as close as possible to the final heat load, and move as little air as possible (liquid moves about three orders of magnitude more thermal energy than air for the same power consumption) • Single-loop variable-primary chilled water system using variable-frequency pump motors • Reduced filtering on AHUs in computer rooms to reduce required fan horsepower • AHUs in computer rooms with plug fans and electronically commutated motors, with plug fans located below the access floor for greater efficiency where possible • Control air flow of computer room AHUs to and from cabinets • Computer room AHU water valves and variable-speed fan motors controlled to maintain constant supply air temperature • Expanded building automation system, with increased use of data and data logging to further enhance efficiency • Lowered condenser water supply temperature for better chiller operating efficiency
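The "three orders of magnitude" claim for liquid versus air can be sanity-checked from volumetric heat capacities; the property values below are standard textbook figures, not ORNL measurements:

```python
# Volumetric heat capacity comparison: water vs. air at roomish conditions.
# Property values are standard textbook figures (assumptions, not measurements).
rho_water, cp_water = 1000.0, 4186.0   # kg/m^3, J/(kg*K)
rho_air,   cp_air   = 1.2,    1005.0   # kg/m^3, J/(kg*K), ~20 C at sea level

vol_heat_water = rho_water * cp_water  # J per m^3 per kelvin
vol_heat_air   = rho_air * cp_air

ratio = vol_heat_water / vol_heat_air
print(f"Water carries ~{ratio:.0f}x more heat per unit volume per kelvin")
# ~3470x: moving the same heat with air takes vastly more volume, hence fan power
```

The ratio lands around 3.5×10³, consistent with the slide's "3 orders of magnitude" framing.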

  12. Features and Lessons Learned (cont.) • Installed automatic flow control valves and/or throttled back water flow on computer room AHUs to limit maximum chilled water flow, increase delta-T across the chillers, and allow chillers to reach capacity • Use a central AHU for humidity control to eliminate competing/wasteful reheat and optimize room humidity control • Optimize BAS for continued use in tracking/trending power loads • Ventilate electrical-equipment-room waste heat instead of mechanically cooling the area • Specify large computer systems to operate at 480VAC instead of 208VAC
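The first bullet's point, that limiting flow raises delta-T for a fixed heat load, falls out of Q = ρ·V̇·cp·ΔT. A minimal sketch with illustrative numbers; the 3300 kW load is borrowed from the Kraken slide and assumed here, for simplicity, to go entirely to water:

```python
def chw_flow_m3s(load_kw, delta_t_k, cp=4.186, rho=1000.0):
    """Chilled-water volumetric flow for a given load and delta-T.
    Q [kW] = rho [kg/m^3] * V [m^3/s] * cp [kJ/(kg*K)] * dT [K],
    so V = Q / (rho * cp * dT)."""
    return load_kw / (rho * cp * delta_t_k)

load = 3300  # kW; illustrative, borrowed from the Kraken slide
for dt in (5, 10, 15):
    v = chw_flow_m3s(load, dt)
    print(f"dT = {dt:2d} K -> {v * 1000:.0f} L/s")
```

Doubling delta-T halves the required flow, which is why throttling AHU water flow both cuts pumping power and lets the chillers reach rated capacity.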

  13. Monitoring and Control – Power: PowerNet • Power distribution metering and control system • Manage energy cost, analyze harmonics, view waveforms of transient events, evaluate power quality, trend data, obtain metering data for billing, maximize use of available capacity, etc. • 567 monitored points • Integrated metering from the 161-kV to the 208-V end-user level • One of the largest PowerNet installations in the Southeast • Cyber security issues

  14. Monitoring and Control – Power (cont.) • Hosted on three servers for data logging and archiving: one for each data center and one for medium-voltage power distribution • Used as an engineering, design, and operations tool • Real-time data monitoring for more efficient infrastructure design • Internal power billing • Power quality analysis and system monitoring • System operation verification during medium-voltage switching • Ethernet interface to capture real-time data and system conditions • Real-time calculation of performance metrics • Calculation of complicated billing formulas for cost allocation
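As a rough illustration of interval-metered billing: energy is integrated from average-kW interval readings, then shared costs are pro-rated by metered energy. This is a deliberate simplification — the slides call the real formulas "complicated" and give no details, and every name and number below is hypothetical:

```python
# Hypothetical sketch of interval-metered cost allocation. The real ORNL
# billing formulas are not described in the slides; this only shows the idea.
def energy_kwh(readings_kw, interval_h=0.25):
    """Integrate average-kW interval readings (e.g. 15-minute) into kWh."""
    return sum(readings_kw) * interval_h

def allocate_cost(tenant_kwh, total_kwh, total_cost):
    """Pro-rate a shared power bill by each tenant's metered energy."""
    return total_cost * tenant_kwh / total_kwh

readings = [14000, 15000, 16000, 15000]  # one hour of 15-min average kW
print(energy_kwh(readings))              # 15000.0 kWh for the hour
```

Real tariffs typically add demand charges on the peak interval, time-of-use rates, and loss factors, which is where the "complicated" part comes in.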

  15. Monitoring and Control – Mechanical • Johnson Controls Metasys controls mechanical systems • HVAC and chiller infrastructure automation • HVAC equipment • Power equipment (alarming) • Process equipment • Web-based • Multiple secure, privileged access levels, from view-only to full command, control, and sequencing

  16. Monitoring and Control – Mechanical (cont.) • Compatible with HVAC and chiller equipment • Supervisory control • Continued local control during an outage • Real-time graphics • Expandable, but we are approaching its limits • Data acquisition, logging, and trending • Cyber security issues

  17. Organized for efficient and reliable operations • The electrical power distribution, chilled water, potable water, and monitoring and control systems are maintained and operated by a central utility organization • Three-shift operation, 24x7

  18. Engineering • Electrical & Mechanical Engineering Support • New computer installations • Retrofitting/upgrading current infrastructure systems • Operations and maintenance • Systems • Outage Planning and Management • Analysis and Modeling • TileFlow for air flow analysis in data center rooms • Pipe-Flo for chilled water flow analysis • SKM Electrical System Analysis: Design, Arc Flash, Coordination, Fault Current, Voltage Drop, and Harmonic Current Studies • Fast incorporation of Lessons Learned

  19. Engineering (cont.) • Flexibility to incorporate new ideas or change designs • Reduce operating costs • Meet tight installation schedules • Incorporate latest technology • Energy Management • Power usage and analysis • System operations improvements and enhancements • 2007: CSB PUE ≈ 1.31 • 2008: PUE ≈ 1.2–1.3
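PUE is total facility energy divided by IT equipment energy. A minimal sketch with illustrative kWh figures chosen to land in the 1.2–1.3 range quoted above — these are not actual CSB meter readings:

```python
# Power usage effectiveness: total facility energy / IT equipment energy.
# The kWh values are illustrative only, picked to fall in the quoted range.
def pue(total_facility_kwh, it_kwh):
    return total_facility_kwh / it_kwh

print(round(pue(15000, 12000), 2))  # 1.25: 0.25 kWh of overhead per IT kWh
```

A PUE near 1.2–1.3 in 2008 was strong for the era, when typical data centers ran closer to 2.0.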

  20. MCDC Features: Sustainable Construction • LEED Gold, planned • SCADA systems more PLC-based vs. building management systems • Master planning • Sustainable design for buildings and landscape • A flexible structure that anticipates phased program growth • Programmatic centers integrated in a greater campus • Objectives: • Water use and quality • Minimize water use in towers • Maintain clean water for chiller operating efficiency • Energy management • Electrical distribution and supply systems • Mechanical air and water systems • Quality indoor air

  21. MCDC to apply: Mechanical Sustainable Technologies • Install chillers with environmentally friendly refrigerant • Provide select chillers with VFDs for less-than-full-load operating efficiency • Select chillers with efficient power draw over their entire operating range • Use 2-way valves and a variable-primary pumping scheme for chilled water systems • Operate chilled water systems with maximum differential temperature • Install chilled water automatic flow control valves on all equipment for reduced over-pumping horsepower and maximized chiller capacity • Control towers to minimum condenser water temperature with VFD fans for optimal chiller efficiency • Separate chilled water loop for critical systems • Separate chilled water loop for dew point control • Push computer vendors to develop systems that can use tower water directly

  22. MCDC to apply: Mechanical Sustainable Technologies (cont.) • Central AHU to control dew point • Control AHU supply and return air flow for reduced recirculation, increased AHU efficiency, reduced fan power, and better computer cooling • Cooling air provided locally to reduce flow path and required fan horsepower • Provide AHUs with minimal filtration for reduced fan power • Provide AHUs with variable-flow plug fans • Incorporate tower make-up water treatment for increased cycles and reduced water usage • Incorporate tower blow-down filtration for increased cycles and reduced water usage • Install equipment that is ENERGY STAR rated and exceeds ASHRAE 90.1 by 30% • Optimize control algorithms for minimal energy consumption

  23. MCDC to apply: Electrical Sustainable Technologies • Transformers meet or exceed FEMP standards • Medium-power transformers located close to loads for reduced line loss • Collaborate with computer manufacturers to use 480VAC power instead of 208VAC to minimize line losses, electrical equipment footprint, and infrastructure cost • Collaborate with computer manufacturers to operate chilled water at higher temperatures and eventually use cooling tower water to cool computers without chillers • Use UPSs with higher efficiency
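The preference for 480 VAC over 208 VAC comes from I²R feeder losses: at fixed power, current scales inversely with voltage, so resistive loss scales with the square of the voltage ratio. A sketch with an assumed per-conductor feeder resistance — not an ORNL value:

```python
import math

def line_loss_w(power_w, volts, r_ohm, pf=1.0, phases=3):
    """Resistive loss in a balanced 3-phase feeder.
    Line current I = P / (sqrt(3) * V_line * pf); loss = phases * I^2 * R."""
    i = power_w / (math.sqrt(3) * volts * pf)
    return phases * i**2 * r_ohm

load = 37_000  # W; one 37 kW cabinet, per the earlier slides
r = 0.01       # ohms per conductor -- an assumed feeder resistance
loss_208 = line_loss_w(load, 208, r)
loss_480 = line_loss_w(load, 480, r)
print(f"208 V loss: {loss_208:.0f} W, 480 V loss: {loss_480:.0f} W")
print(f"ratio: {loss_208 / loss_480:.1f}x")  # (480/208)^2, about 5.3x
```

Beyond the roughly 5x loss reduction, the lower current also shrinks conductor sizes and PDU transformer count, which is the footprint and infrastructure-cost argument on the slide.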

  24. MCDC Technologies • Provide generator backup for cooling as well as UPSs, including cooling of UPS and other critical electrical equipment • Where a data center has multiple large computer systems, configure the electrical distribution and mechanical support systems so that a single-point failure in the electrical system will not take parts of several systems down, but will instead take one or more complete systems down along with their associated cooling systems • Specify 480VAC service for technical loads with more than 25kW per cabinet • Specify delta-connected power supplies instead of wye to minimize harmonic currents • Use dual-corded technical equipment to the extent possible, to allow electrical-system maintenance without taking technical systems out of service and to improve reliability • Use standby UPSs instead of double-conversion units to reduce losses
