Adaptive Airflow Management "Cool Green"
Wally Phelps, Product Manager, AdaptivCool™
wally.phelps@degreec.com
Partners in Thermal Management
Agenda
• The Thermal Bottleneck
• Room Scale Intelligent Cooling
• Airflow and Humidity
• Best Practices and Their Limits
• Adaptive Airflow in Action
• Case Studies
• Summary
The Real Culprit
• Chip density: exponential
• Packaging density: exponential
• IT demand: exponential
• Rack density: exponential
• Junction temperature: UNCHANGED (reliable silicon gate operation)
The IT thermal bottleneck is created.
Legacy Data Centers
• Designed for 1-3 kW racks
• Perimeter CRACs
• Raised floor supports cooling, piping, and cabling
• Low ceiling heights
• Poor airflow distribution
• Mixing and low ΔT
• Cannot support today's density
Room Scale Intelligent Cooling
A data center cooling solution that:
• Uses tightly managed ACTIVE airflow to…
• Bring cool air to racks dynamically: where and when it is needed, in the right amount
• Help return hot air to CRACs
• Minimize mixing (cooler racks and energy-efficient cooling)
• Work with or without containment panels (no fire code issues)
• Gain typically 30% in cooling efficiency
Legacy data centers CAN and DO support today's density: we have customers supporting 200 W/sq ft in sites designed for 100 W/sq ft.
System Block Diagram
• CFD analysis
• Underfloor supply air movers
• Overhead returns
• Sensors
• Cooling Resource Manager
• Web-based monitoring
[Diagram: Cooling Resource Manager linking intelligent supply, intelligent return, CRAC units, and the sensor network over an RS485 network, with 24x7 remote monitoring via VPN and interfaces to LonWorks, BACnet, SNMP, e-mail, and SMS]
EPA Report, August 2007
• 30% improvement in infrastructure energy efficiency from improved airflow management
• Airflow is the single infrastructure improvement that can be made without disruption
(Page 9 of the EPA final report)
But I Have Enough Tonnage!
• The typical data center is overcooled by 2.6x (Uptime Institute study)
• Is cooling delivered in the right place, at the right time, and in the right amount?
• Can the heat return to the CRAC?
• Are the CRACs fighting each other (and reducing capacity)?
Data Center Cooling Rudiments
• Air IS the cooling medium from chip to cooling coils
• Air is lazy; mixing is the result
• Mixing creates hotspots and wastes cooling capacity
• Typical response: turn down setpoints and overcool
• ±1°F of setpoint equates to roughly ±4% efficiency (see the sketch below)
• Humidity control costs $$$
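A back-of-the-envelope sketch of that setpoint rule of thumb. The ~4% per °F factor is the slide's own rule of thumb, not a measured constant, and the annual energy figure below is hypothetical:

```python
# Estimate cooling-energy savings from raising CRAC setpoints, using the
# slide's rule of thumb: +/-1 degF of setpoint ~= +/-4% efficiency.
# All inputs are illustrative, not site data.

EFFICIENCY_PER_DEG_F = 0.04  # ~4% per degF (rule of thumb from the slide)

def setpoint_savings(annual_cooling_kwh: float, setpoint_raise_f: float) -> float:
    """Estimated annual kWh saved by raising the setpoint by setpoint_raise_f degF."""
    return annual_cooling_kwh * EFFICIENCY_PER_DEG_F * setpoint_raise_f

if __name__ == "__main__":
    annual_kwh = 500_000  # hypothetical yearly cooling energy for a small site
    for raise_f in (1, 2, 3):
        saved = setpoint_savings(annual_kwh, raise_f)
        print(f"Raise setpoint {raise_f} degF -> ~{saved:,.0f} kWh/yr saved")
```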
Common Airflow Problems
• Mixing
• Recirculation (best practices or distribution)
• Short circuiting (best practices)
• Leakage (best practices)
• Poor return path (distribution)
• Humidity management (distribution)
• Underfloor obstructions (distribution)
• Venturi reversal (distribution)
• Vortex generation (distribution)
• Legacy racks (best practices)
Most data centers have at least three of these issues.
1st Step: CFD
• High air movement = turbulence
• Intuition does not work
• Unintended airflow paths
• Use CFD to visualize and problem-solve in 3D
Airflow Problem Examples
[Figure: Vortex generation underfloor (CRACs slightly offset or at right angles)]
[Figure: Mixing (cool and warm air mix before server intakes)]
Airflow Problem Examples
[Figure: Venturi reversal (racks too close to CRACs; low or NEGATIVE flow)]
[Figure: Underfloor obstructions (restrict airflow, cause uneven distribution)]
Data Center Humidity Control
The cooling process (diagram courtesy of Liebert):
• Sensible cooling reduces the air temperature
• Latent cooling condenses water vapor out of the return air
Electronics are 100% dependent on sensible cooling (see the sketch below).
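To make the sensible-cooling point concrete, a minimal sketch using the standard HVAC relation Q ≈ 1.08 × CFM × ΔT (a textbook formula, not from the deck); the rack numbers are illustrative:

```python
# Standard HVAC sensible-heat relation at sea-level standard conditions:
#   Q [BTU/hr] ~= 1.08 * CFM * deltaT [degF]
# Sensible cooling is what carries heat off the electronics; latent cooling
# (condensing water vapor) removes none of the IT load.

def sensible_btu_per_hr(cfm: float, delta_t_f: float) -> float:
    """Sensible heat carried by an airstream."""
    return 1.08 * cfm * delta_t_f

def kw_removed(cfm: float, delta_t_f: float) -> float:
    """Same quantity expressed in kW (3412 BTU/hr per kW)."""
    return sensible_btu_per_hr(cfm, delta_t_f) / 3412.0

if __name__ == "__main__":
    # e.g. a rack moving 1600 CFM with a 20 degF rise across the servers:
    print(f"~{kw_removed(1600, 20):.1f} kW of IT load removed")
```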
Data Center Humidity Control
[Figure: water phase diagrams, 20-80°F, showing liquid and vapor regions (courtesy of Liebert)]
Humidity control means two phase changes, each costing ≈970 BTU per lb of water:
• Dehumidification: cooling coils condense vapor to liquid; the heat of condensation goes to the refrigerant
• Re-humidification: IR heaters vaporize liquid water back into the air
Phase changes are energy intensive! Humidity control can be >30% of cooling energy. (Worked numbers below.)
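A rough sketch of why those two phase changes are expensive, using the slide's ≈970 BTU/lb figure; the water throughput and electricity price are assumptions:

```python
# Rough cost of "fighting humidity": condensing water vapor at the coil and
# re-vaporizing the same water at the humidifier, at ~970 BTU/lb (slide's
# figure). Throughput and electricity price below are illustrative.

BTU_PER_LB_WATER = 970          # approximate latent heat of vaporization
KWH_PER_BTU = 1.0 / 3412.0      # unit conversion

def latent_energy_kwh(lbs_water_per_day: float) -> float:
    """kWh/day spent condensing AND re-vaporizing the same water (2 phase changes)."""
    return 2 * lbs_water_per_day * BTU_PER_LB_WATER * KWH_PER_BTU

if __name__ == "__main__":
    lbs_per_day = 100.0  # hypothetical dehumidification rate
    kwh = latent_energy_kwh(lbs_per_day)
    print(f"{lbs_per_day:.0f} lb/day of water cycled -> ~{kwh:.0f} kWh/day")
    print(f"~${kwh * 0.10:.2f}/day at a hypothetical $0.10/kWh")
```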
Data Center Humidity Control
Comparison: 20-ton CRAC
[Chart: sensible vs. latent cooling (kBtu/hr) across 40-55% RH]
• Airflow balance and separation affect the dehumidification rate
• At 20 tons, 75°F return, 47°F ECWT, the difference between running at 42% RH and 50% RH is a 67.8 kBtu/hr shift from sensible into latent cooling
• That delta is worth $4-6K/yr in chilled water cost (exclusive of re-humidification); see the estimate below
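A sanity check of the $4-6K/yr figure under assumed chiller COP and electricity prices; neither assumption comes from the deck:

```python
# Sanity-check of the slide's "$4-6K/yr" chilled-water figure for a
# 67.8 kBtu/hr latent-cooling delta running year-round. Chiller COP and
# electricity price are assumptions, not slide data.

DELTA_BTU_PER_HR = 67_800
HOURS_PER_YEAR = 8_760
KWH_PER_BTU = 1.0 / 3412.0

def annual_chilled_water_cost(cop: float, dollars_per_kwh: float) -> float:
    thermal_kwh = DELTA_BTU_PER_HR * HOURS_PER_YEAR * KWH_PER_BTU  # ~174,000 kWh/yr
    electric_kwh = thermal_kwh / cop  # chiller work needed to remove that heat
    return electric_kwh * dollars_per_kwh

if __name__ == "__main__":
    for cop, price in ((5.0, 0.12), (4.0, 0.12), (5.0, 0.15)):
        cost = annual_chilled_water_cost(cop, price)
        print(f"COP {cop}, ${price}/kWh -> ~${cost:,.0f}/yr")  # all land in $4-6K
```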
Airflow Balance Example
[Diagrams: hot aisle/cold aisle vs. "school room" layout; in the school-room case some CRACs dehumidify while others simultaneously humidify]
Hot aisle / cold aisle:
• Separated supply and return
• Cooler (or more) IT equipment
• Balanced CRAC loading
• Lower humidity control costs
• Improved energy efficiency
School room layout:
• Mixing and hotspots
• Unbalanced CRAC loads
• Poor humidity control
• Wasted energy and capacity
Best Practices
• Hot aisle / cold aisle
• CRACs perpendicular to rows
• Lower-density racks at the ends of rows
• Long rows with no space between racks
• No perforated tiles in hot aisles
• Minimize cable cutouts (especially in hot aisles)
• Load servers from bottom to top, with no spaces
• Use blanking panels
• Use lower humidity
• Prevent infiltration of unconditioned air
Best Practices Only (Example)
Model: mixed 88 W/sq ft load, 310 kW, 130 tons of cooling, 34% cooling CFM margin
• 20" raised floor with no restrictions (not typical)
• Best practices employed
• This model is better than 90% of data centers built before ~2004
• Hotspots are still present
• Those hotspots prevent energy savings from raising setpoints or turning off CRACs
Best Practices Only (Example)
The previous slide's model, re-run with CRAC failures:
• A 9% CFM margin still exists
• Serious overheating
• Servers shutting down
Best Practices Can't Solve
• Underfloor obstructions
• Poor return path
• Venturi effect
• Vortex generation
• Severe legacy placement issues
• Difficult site envelope issues
It's a fundamental airflow DISTRIBUTION problem!
Typical airflow demand: U servers ~160 CFM/kW, blades ~120 CFM/kW (see the sketch below).
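A quick sketch applying those CFM/kW rules of thumb; the rack sizes are illustrative, and the 1200 CFM mover figure comes from the "Do IT Yourself" slide later in the deck:

```python
# Check whether a perforated tile or underfloor air mover can feed a rack,
# using the deck's rules of thumb: ~160 CFM/kW for U servers, ~120 CFM/kW
# for blades. Rack sizes below are illustrative.

CFM_PER_KW = {"u_server": 160.0, "blade": 120.0}

def required_cfm(rack_kw: float, kind: str = "u_server") -> float:
    """Airflow a rack needs so servers aren't starved into recirculating hot air."""
    return rack_kw * CFM_PER_KW[kind]

if __name__ == "__main__":
    for kw in (3, 5, 10):
        print(f"{kw:>2} kW U-server rack needs ~{required_cfm(kw):,.0f} CFM")
    # A 1200 CFM underfloor unit (per the DIY slide) matches a 10 kW blade rack:
    print(f"10 kW blade rack needs ~{required_cfm(10, 'blade'):,.0f} CFM")
```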
Solving Airflow Distribution
Design requirements for data center applications:
• Overcome fundamental DISTRIBUTION issues
• Robust, reliable, user friendly
• Non-intrusive, non-disruptive: no downtime to install
• Dynamically adjust to the changing data center
• Modular, scalable, reconfigurable
Room Scale Intelligent Cooling (RSIC) Example
The previous example with Room Scale Intelligent Cooling installed:
• Location and quantity of underfloor air movers chosen from CFD and rack density
• Hotspots eliminated
• Margin available to raise setpoints
• Possibility of shutting down CRACs
Room Scale Intelligent Cooling (RSIC) Example
The RSIC example, re-run with CRAC failures:
• 9% CFM margin
• Hotspots now in check
Comparison
[Side-by-side thermal plots: traditional cooling vs. Room Scale Intelligent Cooling]
Cooling Resource Manager
• Easy-to-use dashboard
• Real-time environmentals
• Zone of influence
• Cooling management
• CRAC shedding (energy saving)
• CRAC failure management
• Web enabled
• 3 tiers of redundancy
• Trending, alarms, history
• Interfaces to BACnet™, LonWorks™, SNMP, and others (an SNMP polling sketch follows)
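The deck lists SNMP among the manager's interfaces. As a purely illustrative sketch of that kind of integration, here is a minimal temperature poll using the pysnmp 4.x HLAPI; the OID, host, and value scaling are hypothetical placeholders, since the deck does not document AdaptivCool's MIB:

```python
# Illustrative only: polling a temperature value over SNMP v2c.
# The OID below is a PLACEHOLDER, not AdaptivCool's actual MIB.
# Requires: pip install pysnmp (4.x-style HLAPI shown)
from pysnmp.hlapi import (
    getCmd, SnmpEngine, CommunityData, UdpTransportTarget,
    ContextData, ObjectType, ObjectIdentity,
)

PLACEHOLDER_TEMP_OID = "1.3.6.1.4.1.99999.1.1.0"  # hypothetical sensor OID

def read_temperature(host: str, community: str = "public") -> float:
    """Fetch one sensor reading via SNMP GET (tenths of degF assumed here)."""
    error_indication, error_status, _, var_binds = next(
        getCmd(
            SnmpEngine(),
            CommunityData(community, mpModel=1),   # SNMP v2c
            UdpTransportTarget((host, 161)),
            ContextData(),
            ObjectType(ObjectIdentity(PLACEHOLDER_TEMP_OID)),
        )
    )
    if error_indication or error_status:
        raise RuntimeError(f"SNMP error: {error_indication or error_status}")
    return int(var_binds[0][1]) / 10.0

if __name__ == "__main__":
    print(f"Rack-top temp: {read_temperature('192.0.2.10'):.1f} degF")
```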
Results: Brokerage Firm
[Floor plan with rack-top temperatures, 68-78°F, around UPS gear]
Client requirements:
• Reduce server temperatures
• SNMP alarms
• Planning for expansion
• Reduced energy costs
• Allow CRAC maintenance
Solution:
• Thermal analysis
• AdaptivCool™ RSIC
Benefits:
• Average rack-top temperature reduced 8°F
• 24% lower energy use
• CRAC maintenance now possible
• Integrated SNMP alarms and trending
"The site is miles ahead of where we were before AdaptivCool started the project."
Results: Electronics Firm
[Floor plan with rack-top temperatures, 70-78°F]
Client requirements:
• Eliminate server overheating
• Allow for expansion
• Reduced energy costs
• Allow CRAC maintenance
Solution:
• Thermal analysis
• AdaptivCool™ RSIC
Benefits:
• Average rack-top temperature reduced 4°F
• Server hotspots eliminated
• 24% lower energy use
• CRACs balanced; maintenance now possible
"Our main data center with the AdaptivCool solution installed has shown significant improvements in cooling. Everything AdaptivCool promised us came true."
Results: Co-Location Site
[Floor plan: CRAC setpoints raised from 69°F to 70-72°F]
Client requirements:
• Eliminate hotspots
• Recover lost capacity
• Support blades in co-location
• Real-time monitoring
Solution:
• Thermal analysis
• AdaptivCool™ RSIC
Benefits:
• Hotspots eliminated
• CRAC setpoints raised 1-3°F
• Trending and alarms
• Improved redundancy
• 22% lower energy use
"With AdaptivCool managing our cooling, we now support (and charge for) 200 W/sq ft customers in a space designed for only 100 W/sq ft."
Results: Manufacturing Firm
Client requirements:
• Eliminate hotspots
• Manage summer heat
• Reduce energy usage
Solution:
• Thermal analysis
• AdaptivCool™ RSIC
Benefits:
• Hotspots eliminated
• CRAC setpoints raised 1-2°F
• Thermal margin improved
• 18% lower energy use
"We calculated we are saving 18% on cooling costs and will have less to worry about in the warmer months."
SaaS Model
RSIC offered under an SLA:
• Remote monitoring
• Monthly summary reports: thermal, energy, alarms, etc.
• Quarterly updates: capacity, trends, critical items
• CFD updates
• On-call experts
Services
New-site and expansion consulting:
• Optimal design considerations
• Close-coupled, in-row, downflow, upflow
• Phased implementation roadmaps
• Minimized capex
• Minimized opex
Do IT Yourself Solution
• 30-minute hotspot solution
• Off the shelf; self-install
• Same 1200 CFM per underfloor unit; supports 10 kW racks
• 4 models
• HT-500 model upgradeable to RSIC
Thermal Peace of Mind™
• Solve hotspots at the room, row, and rack level
• Better manage the cool air you have: where and when it is needed, in the right amount
• Cooling energy savings of 20-40% or more
• More IT equipment in the same facility
• Monitoring and on-call expertise
• Extend the useful life of existing data centers
www.AdaptivCool.com
Thank You www.AdaptivCool.com