Progress Energy Corporate Data Center
Rob Robertson, February 17, 2010
of North Carolina
Agenda
• Background
• Systems
• Infrastructure
• Disaster Recovery
• Processes
• Future Plans
Background
• Data Center is 24 years old (built 1986)
• Downtown Raleigh, NC
• 42,000 sq. ft. (14,400 sq. ft. of computer room)
• 90+ employees
Systems
• IBM z10 Mainframe
  • Applications: Customer Information, Supply Chain
  • >3,500 mainframe batch jobs and 3 million transactions daily
  • Shark DASD
  • VTS and 12-frame tape library
  • Virtual Tape Library incoming
Systems Cont’d
• Open Systems
  • 400+ Applications
  • Windows, Linux, some UNIX
  • 900+ operating system instances
  • 450+ Virtual Servers
  • <400 Physical Servers
  • 20+ C7000 Blade Enclosures
  • HP Servers, some IBM
• Storage
  • Hitachi, NetApp
  • ~900 TB, growing daily
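The deck notes ~900 TB of storage "growing daily" but gives no rate. As an illustration only, a minimal capacity-projection sketch; the 1 TB/day growth figure below is an assumption, not from the presentation:

```python
def project_capacity_tb(current_tb: float, daily_growth_tb: float, days: int) -> float:
    """Linearly project storage capacity needs.

    daily_growth_tb is an assumed figure; the deck states ~900 TB
    today but does not quantify the growth rate.
    """
    return current_tb + daily_growth_tb * days

# Illustrative: ~900 TB today, assuming 1 TB/day of growth, one year out.
projected = project_capacity_tb(900, 1.0, 365)  # 1265 TB
```

A linear model like this is only a first cut; real storage growth is often closer to exponential once new applications come online.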
Systems Cont’d
• Networking
  • Hub for the company’s telecommunications infrastructure
  • 250+ routers, 2,100+ hubs/switches, and 2,020 miles of fiber
  • Recently upgraded to 10 Gb
  • New switches: Cisco 6509, 4948, 2950
  • New cabling (Cat6A)
• Voice systems
  • ~100 systems serving Florida and the Carolinas
• 900 MHz radio network
  • Florida: 45 tower sites
  • Carolinas: 65 tower sites
• Provides enterprise network security operations
Infrastructure
• Water and DX HVAC
• Dual input power feeds
• Dual UPS
  • GE UPS, installed 2007
  • New Liebert UPS, installed 2010
  • 25-year gel-cell battery strings
• 1,500 kW generator
• 19 floor power distribution units
  • Redundancy to server power supplies
• New fire system: FM-200
• New EPO
• Tier 3 by mid-2010
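Dual input feeds and dual UPS units are a classic parallel-redundancy arrangement: the power path fails only if both units fail at once. A minimal sketch of that reasoning; the 99.9% per-unit availability below is an illustrative assumption, not a measured figure for this site:

```python
def parallel_availability(unit_availability: float, n: int = 2) -> float:
    """Availability of n independent redundant units in parallel.

    The system is down only when all n units are down simultaneously,
    so unavailability multiplies: A = 1 - (1 - a)^n.
    Assumes independent failures, which real sites only approximate.
    """
    return 1 - (1 - unit_availability) ** n

# Illustrative only: assume each UPS is 99.9% available on its own.
single = 0.999
dual = parallel_availability(single, n=2)  # ~0.999999 ("six nines")
```

This is why Tier 3 designs emphasize fully independent distribution paths: correlated failure modes (shared breakers, shared EPO circuits) erode the independence the formula assumes.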
Disaster Recovery
• Extended Data Center
  • Originally hosted at SunGard (mainframe only)
  • Then a hosted solution at Inflow (which was in turn bought by SunGard)
  • Decided to build a new data center
• Completed in 2008
  • 4,000 sq. ft., with a 2,100 sq. ft. computer room
  • z10 mainframe, Open Systems (Tier 1 apps), storage
  • Unmanned facility
Processes
• ITIL
  • Problem, Knowledge, Configuration, and Incident Management structures
• Change Management
  • Outage windows
• Data Center Best Practices
  • Rack management, blank panels, etc.
  • Cold aisle / hot aisle
  • Air flow analysis
  • Raised-floor hole plugs
Processes Cont’d
• Management focus on “Green”
• Projects
  • HVAC replacement
  • Lighting replacement
  • Blades, virtualization, AMD processors
  • Storage consolidation
• 14% savings in 2008
• 9.5% savings in 2009
• 5% savings goal for 2010
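If each year's percentage is measured against the previous year's already-reduced consumption (an assumption about how the figures were reported, not stated in the deck), the savings compound. A sketch of the cumulative effect:

```python
def cumulative_reduction(yearly_savings: list[float]) -> float:
    """Total fraction saved versus the original baseline when each
    year's saving applies to the prior year's (already reduced) usage.

    Assumes the deck's percentages are year-over-year, which is an
    interpretation, not something the slides confirm.
    """
    remaining = 1.0
    for s in yearly_savings:
        remaining *= (1 - s)
    return 1 - remaining

# 14% (2008), 9.5% (2009), 5% goal (2010):
total = cumulative_reduction([0.14, 0.095, 0.05])  # ~0.2606, i.e. ~26% below baseline
```

Note the compounding: the naive sum (14 + 9.5 + 5 = 28.5%) overstates the result, because later savings apply to a smaller base.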
Future Plans
• Complete Tier 3 requirements
  • 100% redundancy in computer room
  • Meet all security requirements
• Major projects (e.g., Smart Grid)
• “The Incredible Shrinking Data Center”
  • 14,400 sq. ft. down to ~9,000 sq. ft.
  • Cooling and power savings analysis
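The planned shrink from 14,400 to ~9,000 sq. ft. is the input to the cooling and power savings analysis the last bullet mentions. A back-of-envelope sketch; the 50 W/sq. ft. power density and 50% cooling overhead are assumptions for illustration, not figures from the presentation:

```python
def floor_space_savings(old_sqft: float, new_sqft: float,
                        watts_per_sqft: float,
                        cooling_overhead: float = 0.5) -> tuple[float, float]:
    """Estimate power avoided by shrinking the raised floor.

    watts_per_sqft (IT load density) and cooling_overhead (extra
    cooling watts per IT watt) are assumed parameters.
    Returns (freed square footage, estimated kW avoided).
    """
    freed_sqft = old_sqft - new_sqft
    it_kw = freed_sqft * watts_per_sqft / 1000
    total_kw = it_kw * (1 + cooling_overhead)
    return freed_sqft, total_kw

freed, kw = floor_space_savings(14_400, 9_000, watts_per_sqft=50)
# 5,400 sq. ft. freed; at the assumed density and overhead, ~405 kW avoided
```

In practice the savings depend on whether the vacated floor's CRAC units and PDUs can actually be powered down, which is why an explicit analysis is planned rather than a straight pro-rata estimate.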