Data Centre Capacity and Electricity Use
How the EU Code of Conduct can help your data centre electricity bill
Chris Cartledge, Independent Consultant
C.Cartledge@sheffield.ac.uk
Summary
• ICT Electricity Use Footprint
• Data Centre Electricity Use
• The Electricity Bill
• PUE and DCiE
• ASHRAE 2008
• European Code of Conduct
University of Sheffield: ICT Electricity Use (2008)
• More than £1M/year
• ~20% of total institution use
• PCs dominate
• Servers: 31% (including HPC & departmental)
University of Sheffield: Data Centres Electricity
• Servers, network, PABX
• Over 40% of ICT use: £400,000 p.a.
• Including departmental & remote cabinets
Data Centre Study
Half a dozen universities in the North of England

Primary room:
• 50 cabinets
• 120 kW
• Old, but possibly refurbished
• Was a mainframe room

Secondary room:
• 25 cabinets
• 75 kW
• Recent (since 2000), built to a price
• Was a plant room
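As a quick check on these figures, the implied average rack densities can be worked out in a couple of lines (a minimal Python sketch; the simple division ignores variation between cabinets):

    # Implied average rack density in the two rooms surveyed (figures from the study).
    primary_kw_per_cabinet = 120 / 50    # 2.4 kW per cabinet
    secondary_kw_per_cabinet = 75 / 25   # 3.0 kW per cabinet

Both values sit around the "typically 3 kW/cabinet" figure reported for university rooms below.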
Typical University Data Centre Room
• UPS, but no generator
• Conventional aircon
• Dark, usually with the lights off...
• Open plan: no aisle containment
• Low density – typically 3 kW/cabinet, 1.5 kW/m²
• Up to about 10 kW/cabinet for HPC
• Often no hot aisle/cold aisle separation – cooling not efficient
• ALMOST FULL
Electricity
Typical bill: £350,000

Estates:
• Building and plant
• Pays the electricity bill
• Meter data often limited
• No input on IT spend
• Major projects, CDM, M&E, PABX, etc.

Computing Services:
• Must deliver the IT service
• No knowledge of the bill
• Unable to monitor use
• Buys equipment blind
• VMware, thin client, SAN, PoE, IPT, etc.

* Limited communication and understanding *
Power Usage Effectiveness (PUE)
The preferred measure of data centre efficiency; some also quote Data Centre infrastructure Efficiency (DCiE), its reciprocal.

PUE = Total data centre power / IT power

or, better, measured over a set period:

PUE = Total data centre energy consumption / IT energy consumption

For example, 160 kW of IT equipment (two almost-full data centres):

Typical cooling, power conditioning, etc. overhead: 130 kW
PUE = (160 + 130) / 160 = 1.81
Annual electricity cost @ 14p/unit = £356,000

Best Practice overhead: 35 kW
PUE = (160 + 35) / 160 = 1.22
Cost @ 14p/unit = £239,000

Potential saving: £116,500
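The worked example can be reproduced with a short script (a minimal sketch: the 8,760 hours/year and the flat 14p/kWh tariff are taken from the example above; real tariffs and loads vary):

    # PUE, DCiE and annual electricity cost for the worked example above.
    def pue_and_cost(it_kw, overhead_kw, pence_per_kwh=14):
        total_kw = it_kw + overhead_kw
        pue = total_kw / it_kw
        dcie = 100 / pue                  # DCiE (%) is the reciprocal of PUE
        pounds = total_kw * 8760 * pence_per_kwh / 100
        return pue, dcie, pounds

    typical = pue_and_cost(160, 130)      # PUE 1.81, ~£356,000/year
    best = pue_and_cost(160, 35)          # PUE 1.22, ~£239,000/year
    print(f"Potential saving: £{typical[2] - best[2]:,.0f}")  # ~£116,500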
Data Centre Set Point
• Reported average: 21.5°C; values from 20°C to 25°C
• ASHRAE recommendation now: 18°C to 27°C (was 20°C to 25°C up to late 2008)
• 4% saving claimed for each 1°C rise (rough arithmetic sketched below)
• Up to 20% aircon saving by raising to 26°C?
• But at what risk to service? Less safe time in the event of aircon failure
• What actual saving? Fans may work harder; aircon performance is not simple
• Who initiates/manages such a change?
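To get a feel for the numbers, the claimed 4% per °C can be applied to the cooling overhead from the PUE example (a rough sketch only: it assumes the saving scales linearly, and that the 130 kW overhead and 14p/kWh tariff carry over; as noted above, harder-working fans can erode the gain):

    # Rule-of-thumb saving from raising the set point, per the ~4%/°C claim.
    def setpoint_saving_pounds(cooling_kw, degrees_raised,
                               saving_per_degree=0.04, pence_per_kwh=14):
        saved_kw = cooling_kw * saving_per_degree * degrees_raised
        return saved_kw * 8760 * pence_per_kwh / 100

    # Raising the reported 21.5°C average to 26°C:
    print(f"£{setpoint_saving_pounds(130, 26 - 21.5):,.0f} per year")  # ~£28,700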
PUE and Plant: not the whole story!
• Is the dark machine room really dark – lights out?
• Is obsolete equipment actually switched off?
• Is idle equipment actually switched off?
• Are the most efficient servers being purchased?
• Is storage being used efficiently (SAN)?
• Is virtualisation used whenever practicable?
• Is equipment in a hot aisle/cold aisle arrangement?
• Is power consumption part of software evaluation?
European Code of Conduct
• European Code of Conduct on Data Centres Energy Efficiency
• Best Practice Guidelines to enable change
• A plan for data centre management
• About 120 good practices, covering all aspects
• Unlikely to become compulsory: HEFCE is mindful of university independence
• But institutions can sign up – for brownie points
• Real savings to be made
• A standard plan for managing data centre capacity
Best Practice Guidelines
• Easy to read, clearly presented
• A complete project plan, with a description of the project team
• Practices logically categorised, e.g. deployment of new IT services
• Practices sequenced by time of implementation, e.g. on new IT equipment or on plant refit
• Practices scored 1–5 in terms of likely value (one way to track them is sketched below)
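One way to make the roughly 120 practices actionable locally is to hold each one as a small record and filter by score. This is a hypothetical sketch: the field names and the example entries' scores are illustrative, not taken from the Guidelines document itself.

    # Illustrative record for tracking Code of Conduct practices locally.
    from dataclasses import dataclass

    @dataclass
    class Practice:
        name: str
        category: str   # e.g. "Deployment of new IT services"
        phase: str      # e.g. "On new IT equipment", "On plant refit"
        score: int      # 1-5 likely value, as scored in the Guidelines

    practices = [
        Practice("Buy energy efficient IT devices", "IT equipment",
                 "On new IT equipment", 5),
        Practice("Separate cold air from heated return air", "Cooling",
                 "On plant refit", 5),
    ]
    top_rated = [p for p in practices if p.score >= 4]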
Group Involvement
• Establish a cross-disciplinary change board
  - Consider impacts, ensure effective solutions
  - Definition of standard IT hardware
  - M&E implications of new services
• Audit existing equipment
  - Optimise and consolidate where possible: virtualisation, set point
• Identify and deal with little-used and unused services
Some Top Rated Practices
• Buy energy efficient IT devices
• Use virtualised servers and storage
• Switch off hardware for unused services
• Virtualise little-used services
• Separate cold air from heated return air
• Use free or economised cooling
• Increase temperature set points
Conclusion
There is a lot that can be done to:
• Improve the quality of provision
• Reduce electricity consumption and costs
• Meet the wider agenda
Good guidance, documentation and training are now available. There are issues:
• Split responsibilities
• Costs are currently hidden
• Investment may be needed to make progress
References
European Code of Conduct on Data Centres Energy Efficiency:
http://re.jrc.ec.europa.eu/energyefficiency/html/standby_initiative_data_centers.htm

ASHRAE (The American Society of Heating, Refrigerating and Air-Conditioning Engineers), Extended Environmental Envelope, August 2008:
http://tc99.ashraetcs.org/documents/ASHRAE_Extended_Environmental_Envelope_Final_Aug_1_2008.pdf

The Green Grid, PUE nomenclature and supporting information:
http://www.thegreengrid.org/en/Global/Content/white-papers/Usage%20and%20Public%20Reporting%20Guidelines%20for%20PUE%20DCiE

Savings reference (Data Center Knowledge, "HVAC Group Says Data Centers Can Be Warmer"):
http://www.datacenterknowledge.com/archives/2009/01/29/hvac-group-says-data-centers-can-be-warmer/