Project Milestone Meeting City of Gresham, Oregon May 30, 2014 IT Aire Data Center Cooling System Evaluation & Results Babak Lajevardi, Ph.D. Candidate Joseph F. Junker, P.E. Karl R. Haapala, Ph.D.
Project Objective and Timeline • Objective: To compare the energy efficiency and performance of the City of Gresham data center before and after installation of the IT Aire cooling system. • Timeline • Summer 2013: Monitoring system implementation • Fall 2013–Spring 2014: Monitoring of cooling systems • Winter 2014–Spring 2014: Analysis of results • Spring 2014: Technical article and white paper
Data Center Energy Consumption • Worldwide data center energy use [EPA, 2010] • About 240 billion kWh annually • Roughly 1.3% of the world total • U.S. data center energy use [EPA, 2010] • About 80 billion kWh annually • Roughly 2% of the U.S. total
Strategies for Data Center Energy Reduction • Technology development and implementation • Optimized rack layout • Hot aisle-cold aisle containment • New cooling system equipment solutions • Energy and thermal measurement and monitoring • Allows evaluation of energy efficiency performance • Gives insight into thermal management inefficiencies • Gives insight into strategic investment for cost reduction
Data Center Air Flow [Figure: partitioned hot aisle/cold aisle layout around an IT rack, with airflow (Q), temperature (T), and relative humidity (RH) measured at the supply, intake, exit, and return points]
Data Center Monitoring • Gresham City Hall wireless monitoring network • Dry bulb temperature • Relative humidity (RH) • Current transducers • Data loggers • Readings recorded at one-minute intervals • A copy saved to an OSU FTP address every 24 hours [Figure: monitoring network layout — roof-mounted receiver and data-room nodes measuring outdoor air, AC unit power, rack power, cold aisle supply/rack intake air, rack exit air, and hot aisle return air]
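The logging scheme above (one-minute samples, buffered and shipped out daily) can be sketched as follows. This is an illustrative sketch only, assuming the loggers produce timestamped temperature/RH/current readings; the field names and helper functions are mine, not the project's, and the FTP upload step is left out.

```python
import csv
import io

def append_sample(buffer, timestamp, temp_f, rh, amps):
    """Append one reading (taken at one-minute intervals in the deployed system)."""
    buffer.append({"timestamp": timestamp, "temp_f": temp_f, "rh": rh, "amps": amps})

def flush_to_csv(buffer):
    """Serialize and clear the day's buffer as CSV text; the deployed
    system would then upload this file to the OSU FTP server."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["timestamp", "temp_f", "rh", "amps"])
    writer.writeheader()
    writer.writerows(buffer)
    buffer.clear()
    return out.getvalue()
```

Buffering locally and flushing once per day keeps network traffic low while preserving the full one-minute resolution for later analysis.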
Current Data Center Efficiency Metrics • Power Usage Effectiveness (PUE) [The Green Grid, 2007]
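The slide names PUE without showing the equation. Per The Green Grid's definition, PUE is total facility power divided by IT equipment power; a minimal sketch (function name is mine):

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness (The Green Grid, 2007):
    total facility power / IT equipment power.
    1.0 is the ideal, meaning all power reaches the IT load."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw
```

For example, a facility drawing 120 kW to deliver 100 kW to IT equipment has a PUE of 1.2, i.e., 20% overhead for cooling, power distribution, and lighting.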
PUE as a Function of Wet Bulb Temperature [Figure: PUE regions plotted on a psychrometric chart (dry bulb temperature °F vs. humidity ratio, pounds of moisture per pound of dry air, with wet bulb temperature lines and the ASHRAE envelope). IT Aire: 1.0–1.1 vs. old system: 1.2–1.3 in one region; IT Aire: 1.1–1.2 vs. old system: 1.3–1.4 in another]
Current Data Center Efficiency Metrics • Rack Cooling Index (RCI) [M. K. Herrlin, 2005] • ASHRAE Guideline [ASHRAE, 2011] • Recommended: 64°F < T_intake < 77°F • Allowable: 59°F < T_intake < 90°F
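The deck does not reproduce the RCI equation, so here is a sketch of the high-side index reconstructed from Herrlin's published definition, using the ASHRAE thresholds on the slide (77°F recommended maximum, 90°F allowable maximum). The function name and defaults are mine; treat this as an illustration rather than the project's exact implementation.

```python
def rci_hi(intake_temps_f, t_max_rec=77.0, t_max_all=90.0):
    """Rack Cooling Index, high side, after Herrlin (2005):
    100% means no rack intake exceeds the recommended maximum;
    the index falls as intakes overshoot toward the allowable maximum."""
    n = len(intake_temps_f)
    over = sum(max(t - t_max_rec, 0.0) for t in intake_temps_f)
    return (1.0 - over / ((t_max_all - t_max_rec) * n)) * 100.0
```

A complementary low-side index (RCI_LO) penalizes intakes below the recommended minimum (64°F) in the same way.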
Data Center Air Flow [Figure: hot aisle/cold aisle layout annotated with air recirculation and air bypass paths; airflow (Q), temperature (T), and relative humidity (RH) measured at the supply, intake, exit, and return points]
Current Data Center Efficiency Metrics • Return Temperature Index (RTI) [Herrlin, 2007] • Supply and Return Heat Indices (SHI, RHI) [Sharma and Bash, 2002]
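The equations for these airflow metrics are not shown on the slide; the sketch below reconstructs them from the cited sources (Herrlin, 2007; Sharma and Bash, 2002), using single representative temperatures rather than the per-rack averaging a real analysis would do. Function names are mine.

```python
def rti(t_return, t_supply, t_exit, t_intake):
    """Return Temperature Index (Herrlin, 2007): AHU temperature rise over
    equipment temperature rise, as a percentage. 100% indicates balanced
    airflow; above 100% suggests recirculation, below 100% suggests bypass."""
    return (t_return - t_supply) / (t_exit - t_intake) * 100.0

def shi(t_intake, t_exit, t_supply):
    """Supply Heat Index (Sharma and Bash, 2002): fraction of the rack's
    temperature rise caused by hot air recirculating into the cold aisle."""
    return (t_intake - t_supply) / (t_exit - t_supply)

def rhi(t_intake, t_exit, t_supply):
    """Return Heat Index: RHI = 1 - SHI. Higher RHI means less bypass air
    is diluting the hot return stream."""
    return 1.0 - shi(t_intake, t_exit, t_supply)
```

With ideal airflow the rack intake equals the supply temperature, so SHI is 0 and RHI is 1; any recirculation raises the intake above supply and pushes SHI up.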
Return Heat Index (RHI) [Figure: RHI results chart] • Higher RHI indicates less bypass air
Next Steps • Continued system monitoring (through Spring 2015) • Elucidate seasonal variation effects • Build a large data repository for higher-resolution PUE prediction • Enables statistical models for efficiency improvements • Provides in-depth insights into performance • Upcoming milestones • Presentation at IIE ISERC, May 31–June 4, 2014 • Presentation at SME NAMRC, June 9–13, 2014 • Whitepaper, June 30, 2014 (draft mid-June for review)
Acknowledgments We gratefully acknowledge the funding support for this project from the Portland Development Commission (PDC), Oregon Best, and Energy Trust through the Oregon Best Commercialization Program. We also wish to express our appreciation for the assistance we received from the City of Gresham and IT Aire.
References • R. Sharma, C. Bash, and C. Patel, “Dimensionless Parameters for Evaluation of Thermal Design and Performance of Large-scale Data Centers,” in 8th AIAA/ASME Joint Thermophysics and Heat Transfer Conference, 2002, pp. 1–11. • R. Brown, “United States Environmental Protection Agency Energy Star Program, Report to Congress on Server and Data Center Energy Efficiency, Public Law 109-431,” 2008. • Y. Joshi and P. Kumar, Energy Efficient Thermal Management of Data Centers. Springer, 2012. • “The Green Grid Opportunity: Decreasing Data Center and Other IT Energy Usage Patterns,” The Green Grid, Technical Report, 2007. • L. Stahl and C. Belady, “Designing an alternative to conventional room cooling,” presented at the IEE Conference Publication, 2001, pp. 109–115. • K. Brill, “Heat Density Trends in Data Processing, Computer Systems, and Telecommunications Equipment,” Uptime Institute, White Paper, 2005. • C. Bash, C. D. Patel, and R. K. Sharma, “Dynamic thermal management of air cooled data centers,” presented at Thermal and Thermomechanical Phenomena in Electronics Systems, 2006, pp. 445–452. • G. Koutitas and P. Demestichas, “Challenges for Energy Efficiency in Local and Regional Data Centers,” Journal of Green Engineering, vol. 1, p. 32. • W. Tschudi, E. Mills, S. Greenberg, and P. Rumsey, “Measuring and Managing Data Center Energy Use,” Ernest Orlando Lawrence Berkeley National Laboratory, Berkeley, CA (US), 2005. • S. Law, “Gresham startup says it can slash energy costs at power-thirsty server farms,” Portland Tribune, 2013.
References • V. Avelar, Azevedo, and A. French, “PUE™: A Comprehensive Examination of the Metric,” The Green Grid, White Paper 49, 2012. • J. Bruschi, P. Rumsey, R. Anliker, L. Chu, and S. Gregson, “Best Practices Guide for Energy-Efficient Data Center Design,” U.S. Department of Energy, Energy Efficiency & Renewable Energy, Federal Energy Management Program, Washington, DC, 2010. • M. K. Herrlin, “Rack cooling effectiveness in data centers and telecom central offices: The rack cooling index (RCI),” Transactions-American Society of Heating, Refrigerating and Air-Conditioning Engineers, vol. 111, p. 725, 2005. • ASHRAE TC 9.9, “Thermal Guidelines for Data Processing Environments – Expanded Data Center Classes and Usage Guidance,” ASHRAE whitepaper, 2011. • M. K. Herrlin, “Improved data center energy efficiency and thermal performance by advanced airflow analysis,” presented at the Digital Power Forum, 2007, pp. 10–12. • M. K. Herrlin, “Airflow and cooling performance of data centers: two performance metrics,” ASHRAE Transactions, vol. 114, part 2, 2008. • M. K. Patterson, “Energy efficiency metrics,” in Energy Efficient Thermal Management of Data Centers, Springer, 2012, pp. 237–271. • A. Vijayaraghavan and D. Dornfeld, “Automated energy monitoring of machine tools,” CIRP Annals-Manufacturing Technology, vol. 59, no. 1, pp. 21–24, 2010. • J. W. Gardner and V. K. Varadan, Microsensors, MEMS and Smart Devices. John Wiley & Sons, Inc., 2001. • C.-Y. Chong and S. P. Kumar, “Sensor networks: evolution, opportunities, and challenges,” Proceedings of the IEEE, vol. 91, no. 8, pp. 1247–1256, 2003. • J. Yick, B. Mukherjee, and D. Ghosal, “Wireless sensor network survey,” Computer Networks, vol. 52, no. 12, pp. 2292–2330, 2008.