Vette LiquiCool™ Solution
Rob Perry, Executive Manager
Arlene Allen, Director, Information Systems & Computing, University of California Santa Barbara
Data Center Trends – Staggering Energy Consumption and Cost of Energy
• Energy unit price has increased an average of 4% YOY in the USA and 11% YOY globally
• Data Center energy consumption is growing by 12% annually
Source: EPA 2007 Report to Congress
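The impact of these two trends compounds, since both the unit price and the consumption grow independently. A minimal sketch of the arithmetic (the $1M baseline annual bill is a hypothetical placeholder, not a figure from the slides):

```python
def projected_energy_cost(base_cost, price_growth, usage_growth, years):
    """Project an annual energy bill when unit price and consumption
    each compound independently year over year."""
    return base_cost * (1 + price_growth) ** years * (1 + usage_growth) ** years

# Hypothetical $1M annual bill, 4% US price growth, 12% consumption growth
five_year_bill = projected_energy_cost(1_000_000, 0.04, 0.12, 5)
print(f"${five_year_bill:,.0f}")
```

At these rates the combined bill more than doubles in five years, which is the "staggering" growth the slide refers to.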
Data Center Trends – Operating Expense Exceeds Capital Expense in Less Than 1 Year
• Data Center facility costs are growing 20% annually vs. IT spend growth of 6%
• Operating costs over the lifetime of a server are ~4X the original purchase cost
• Cooling infrastructure can consume up to 55% of Data Center energy
Source: Belady, C., "In the Data Center, Power and Cooling Costs More than IT Equipment it Supports", Electronics Cooling Magazine (Feb 2007)
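What the ~4X operating-cost multiple and the 55% cooling share mean in dollars can be sketched directly (the $5,000 server price and $100k energy bill below are hypothetical examples, not figures from the slides):

```python
def lifetime_tco(purchase_cost, opex_multiple=4.0):
    """Total cost of ownership when lifetime operating cost is
    ~4x the purchase price (Belady, 2007)."""
    return purchase_cost * (1 + opex_multiple)

def cooling_energy_cost(total_energy_cost, cooling_fraction=0.55):
    """Upper-bound share of the energy bill attributable to cooling,
    using the slide's up-to-55% figure."""
    return total_energy_cost * cooling_fraction

# Hypothetical $5,000 server: ~$25,000 total over its life
print(round(lifetime_tco(5_000)))
# Hypothetical $100k annual energy bill: up to ~$55k is cooling
print(round(cooling_energy_cost(100_000)))
```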
UCSB – "The Problem"
• UCSB's existing Data Center is being renovated for research computing, forcing the corporate/miscellaneous IT equipment into a new space.
• This new space is not designed to be a Data Center: the footprint is small, the power is limited by existing building wiring, and a traditional air-cooling topology is not feasible.
• The new space's limitations require the load density to increase from a typical density of 6kW or less to a higher density of 10–16kW per rack.
LiquiCool – "The Solution"
LiquiCool™ – a complete cooling solution for the consolidation and scale-out of compute infrastructure in today's sustainable Data Centers
LiquiCool – How Does It Work?
• Based on IBM IP & technology licenses (>30 years of water-cooling experience)
• Rear Door Heat Exchanger (RDHx) replaces the existing rear door of the IT enclosure
• RDHx has chilled-water Supply & Return quick connections at the bottom OR top
• Raised floor becomes optional
• Chilled water circulates through the tube+fin coil from the Supply connection
• Equipment exhaust air passes through the coil and is cooled before re-entering the room
[Diagram: fin + tube heat exchanger on the rear of the enclosure; cold supply water in, heated water out]
LiquiCool System
• Passive RDHx provides 100% sensible cooling
• No condensation, no need for reheat or humidification
• CDU creates a fully isolated, temperature-controlled Secondary Loop
• Chilled-water source: city water, building chilled water, packaged chiller…
[Diagram callouts – Secondary Loop: temperature 10–17°C (50–63°F), water pressure 30–70 psi; Primary Loop: temperature 7°C (45°F), water pressure 100–200 psi]
RDHx – External View
• Passive: no electrical connections, no moving parts, no fans, no power, no noise
• Attaches to the rear: no need to rearrange racks; does not consume valuable floor space, adds 4–6" to the rear
• Close-coupled: neutralizes heat at the source
[Photos: top-feed and bottom-feed connections]
RDHx – Internal View
[Photo callouts: protective barrier, air-bleed valves, bottom-feed hose connections and drain valve, tube & fin coil]
RDHx Cooling in Action
Temperature readings taken at the rear of a fully populated enclosure:
• Door open: server leaving temperature 102°F (38.9°C)
• Door closed: server leaving temperature 74°F (23.5°C)
• RDHx reduces the leaving temperature by 28°F (15.4°C)!
RDHx Is Compatible with Most Major IT Enclosures
• Industry-standard enclosure mount
• Transition frame (if needed)
• Remove the existing rack rear door & hinges
RDHx General Specifications
• Max. Cooling Capacity: 33kW
• Coolant: chilled water (above dew point)
• Dimensions: 76.6" H x 4.6" D x 23.6" W (1945mm x 117mm x 600mm)
• Weight (empty): 63lbs (29kg)
• Liquid Volume: 1.5 gallons (5.7 liters)
• Liquid Flow Rate: 6–10 GPM (23–38 L/min)
• Head Loss: 7 psi (48 kPa) at 10 GPM (38 L/min)
• System Input Power: none required
• Noise: none
• Couplings: drip-free stainless steel quick-connects
• Connection Location: bottom or top feed
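As a sanity check, the 33kW rating is consistent with the rated flow via the sensible-heat relation Q = ṁ·cp·ΔT. A minimal sketch (the ~12.5°C water-side temperature rise is inferred from the other two numbers, not quoted in the specs):

```python
def water_cooling_kw(flow_l_per_min, delta_t_c, cp_kj_per_kg_k=4.186):
    """Sensible heat carried away by the chilled water:
    Q = mass flow * specific heat * temperature rise.
    Water density taken as ~1 kg/L, so L/min -> kg/s is /60."""
    mass_flow_kg_s = flow_l_per_min / 60.0
    return mass_flow_kg_s * cp_kj_per_kg_k * delta_t_c

# At the rated 38 L/min (10 GPM), a ~12.5 C water-side rise
# lands at roughly the door's 33 kW maximum capacity:
print(round(water_cooling_kw(38, 12.5), 1))
```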
Coolant Distribution Unit (CDU)
• Power Consumption: 2.6 kW
• Pump Capacity: 63 GPM at 30 psi (240 L/min at 207 kPa)
• Primary Head Loss: 10.2 psi at 63 GPM (70 kPa at 240 L/min)
• Minimum Approach Temperature (100% load):
  • 120kW unit – 12°F (6.7°C)
  • 150kW unit – 8°F (4.4°C)
• 63 GPM (240 L/min) on primary and secondary
• Water-to-water heat exchanger with pumps, controls, and chilled-water valve
• Creates an isolated secondary cooling loop
• 100% sensible cooling, no condensation
• Small water volume (tens of gallons), easier to control water quality
• Redundant, fault-tolerant design
• 120kW or 150kW capacity; supports 6–12 RDHx
• Optional internal manifold for quick expansion
• SNMP & ModBus communications
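The approach temperature sets the coolest secondary-loop supply the CDU can deliver: primary chilled-water temperature plus the approach. A minimal sketch using the 7°C primary supply from the system diagram:

```python
def secondary_supply_c(primary_supply_c, approach_c):
    """Coolest achievable secondary-loop supply temperature:
    primary chilled-water temperature plus the CDU's approach."""
    return primary_supply_c + approach_c

# 150 kW CDU at 100% load: 7 C primary + 4.4 C approach
print(round(secondary_supply_c(7.0, 4.4), 1))
# 120 kW CDU at 100% load: 7 C primary + 6.7 C approach
print(round(secondary_supply_c(7.0, 6.7), 1))
```

Both results fall inside the 10–17°C secondary-loop range shown on the LiquiCool System slide, which is why the secondary loop can stay above dew point while the primary runs at 7°C.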
Floor-Mount CDU Internal – Front
[Photo callouts: controller, brazed-plate heat exchanger, inverter drive, redundant valves, reservoir tank, redundant variable-speed pumps, casters and drain]
Floor-Mount CDU Internal – Rear
[Photo callouts: optional secondary-loop distribution manifold, primary-side water filter, primary supply and return connections, optional secondary-loop flex tails]
Hose Kits & External Manifolds
• Connect to flex tails on the CDU secondary side
• ISO B or sweated connections
• Standard & custom configurations
• Each Vette hose kit consists of a flexible Supply hose and a Return hose
• Factory assembled and tested to IBM specifications and standards
• Quick-connect drip-free couplings on one end OR both ends
• Straight hoses for raised-floor environments, right-angle hoses for non-raised-floor environments
• Standard lengths from 3ft. to 50ft.
Water Treatment
Treatment of cooling water – potential effects of non-treatment:
• Loss of heat transfer
• Reduced system efficiency
• Reduced equipment life
• Equipment failures or leaks
• De-ionized water without inhibitors is corrosive!
[Diagram: scale, fouling, microbiological growth, corrosion]
Scenario I – Out of Space
• Add RDHx – double your load per rack
• Eliminate CRAC units
• 56% recovery of white space!
Scenario II – Out of Power/Capacity
• Add RDHx
• Remove (2) CRAC units
• Reduced cooling energy consumption frees up capacity for growth
Scenario III – High Density
• CRAC units can typically provide efficient environmental control for rack densities of up to 5kW per rack
• Adding RDHx allows 8X the compute power!
Reference Sites
• Warwick University, Coventry, UK
• National Center for HPC, Taiwan
Reference Sites
• Georgia Tech Super Computing Facility – 12 racks at ~24kW each
[Photos: front view, rear view]
Silicon Valley Leadership Group Case Study - Modular Cooling Systems
SVLG "Chill Off" Results
Vette's LiquiCool™ solution led the field in cooling capacity and in cooling efficiency!
LiquiCool – Conclusive Savings for Energy, Space & Cost
• The largest share of Data Center OPEX growth is power- and cooling-related
• The cost of energy for cooling is a large (and growing) cost component
• Data Center consolidation, virtualization, and advanced hardware technology are driving higher power densities per rack and associated white-space constraints
• Traditional air-cooling is increasingly infeasible at these densities
• Purchasing decisions can no longer be made solely on CAPEX; TCO is not just a consideration but core to the decision
Value Summary:
• Reduces white-space requirements by more than 55%
• Cuts cooling energy consumption by 50% or more compared to traditional air-cooled Data Centers
• Allows 8X the amount of compute power in a typical IT enclosure
• Lowers carbon footprint by 50% or more vs. air-cooling
• Bottom line: payback in less than 1 year compared to traditional computer room air conditioning
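The sub-one-year payback claim reduces to a simple ratio of incremental capital cost to annual cooling-energy savings. A minimal sketch (both dollar figures below are hypothetical placeholders, not from the slides):

```python
def payback_years(incremental_capex, annual_opex_savings):
    """Years to recover the extra up-front cost of liquid cooling
    from the yearly cooling-energy savings."""
    return incremental_capex / annual_opex_savings

# Hypothetical: $80k extra capital vs. $100k/yr of cooling energy saved
print(round(payback_years(80_000, 100_000), 2))
```

Any site where the yearly savings exceed the incremental capital cost pays back in under a year, which is the TCO argument the slide is making.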