Data Center Remodeling Presented by Jim Rarus, Wayne RESA MAEDS 48 October 24, 2012
Agenda • Brief history • Overview of our facility • Challenges • Positives • Remodeling Process • Remodeling in pictures • Hardware highlights • Summary
Wayne RESA • One of 57 regional educational support centers • Established by Legislation in 1962 – 50th anniversary • Supports 34 school districts in Wayne County • 110 Public school academies • 313,000 students • Largest in the state – 13th nationwide • Services • Consolidating technical services • Teacher training • Special education and vocational services
Computer Services Consortium • Services to 72 districts, ISDs and other agencies statewide • Business services – payroll, finance • Student Services – grade reporting, attendance, transportation • Technical Services • Countywide WAN • Connectivity to application servers • Internet – Merit affiliate since 1994 • Network management and consulting • 62 staff members
Facility • Education Center built in 1976 • Computer Services wing • 10,000 square feet of raised floor • Original data center – 4,000 square feet • 1970s and 80s needed all the space • The “big one” – IBM 3033 water-cooled mainframe
Systems Operation • Equipment and people • Tapes • Disk • Printing • Communications • Adjust modems
Changes • Hardware from multiple vendors • New functions • Modems • Gateway PCs • Not rackable • Early networking • We survived dial-up Internet • Endless number of modem power bricks
Use of space evolves • Mainframe downsizes • One end of the data center • More communications equipment and file servers • Other end of the data center • Use the middle for office space (more consultant staff) • Partial walls help block noise • Air flow under the floor keeps space too cool • Lots of abandoned cables and power under the floor • Hard to clean – done only once
Connectivity – more in less space • Prior to 1994 • Leased analog circuits to mainframe and minicomputers • Dial-up to timesharing minicomputers • 1994 to early 2000s • Digital circuits • 56 Kb and 1.5 Mb (T1) • Early 2000s • Fiber optic connections – up to 1 Gb possible • Direct connect or AT&T services (tiered GigaMAN to Opt-E-MAN) • LAN-like performance over a wide area
Improved Connectivity leads to… • Increase in application servers vs. file service • Offer more than just web service • Client-server applications • Wayne RESA starts migration of administrative services • Need for rack space increases • Minimal amount of cabling • Limited bandwidth and redundancy • All gear still not rack mountable – shelves • Especially early application servers • Cabling relatively modest
Life after AV (After Virtualization) • Modest start which GROWS • Wayne RESA installs VMware 3.0 on release 7/2006 • 2 HP AMD host systems with Fibre Channel SAN • Add a few more host systems • Move to iSCSI storage – Dell EqualLogic in 2008 • Cabling becomes more complex • Trunking Ethernet connections to host computers and storage • Becomes clear • Future expansion will lead us to a cabling migraine
Other Factors • Space showing its age • HP installer in 2006 comments on poor quality of the facility – only the electrical earns a grade of “A” • Several minor electrical problems in 2010 (Cisco 6509 power supply “pops”) • Internet demand closing in on 1 Gb by spring 2010 • AT&T GigaMAN at 100 Mb and AT&T Opt-E-MAN at 600 Mb • Other building wings at RESA remodeled • Recent remodel of Media and Technology area • Waiting for payroll application to migrate off the mainframe
Server room space limited • Server racks against office wall
Limited Expansion • Office space behind air handler (CRAC unit)
Facility concerns • Audit issues • No swipe card access to machine room • Push code lock • No fire suppression system • No under floor leak detection • Cooling issues on extremely hot days • Limited space behind racks perhaps the reason • People and equipment same space - don’t mix well • Temperature and noise • Cleanliness of area • Security
Factors in our favor • Cooling • Replaced glycol system in 2007 • Two new CRAC units • Power • UPS replaced in 2009 • Plenty of excess capacity • Diesel generator installed in early 2000s • 10,000 kVA • Powers the entire building • Board of Education and administrative team support
Connectivity limits • Cisco 6509 installed in 2001 • 1 Gb max per interface • Upgraded supervisor and added FWSM (firewall) • Trunking multiple gig connections from servers • Some through old gigabit aggregation switches • Potential reliability issues • Didn’t want to trunk Merit connections • Packet shaping with Exinda unit – messy routing • Server storage connections likewise limited to trunked gig connections through stackable switches
Solving the connectivity challenge • Any upgrade would be expensive • Remodeling discussions made it easier to also discuss new network hardware • New place – new gear – made sense • Considered an upgrade to a Cisco Nexus 7000 • No firewall blade • Very expensive • More suitable to data center-only use – not WAN
Connectivity improves 6/2011 • Decided on a mixed upgrade • Upgrade Cisco 6509 chassis to an E series • Backplane supports 10Gb modules • Use existing supervisors • FWSM used to firewall server VLANs • Purchase a Nexus 5000 unit for storage aggregation • Use TOR (Top Of Rack) switches to connect both servers and storage over trunked 10Gb • Highly reliable and scalable at a relatively low cost • Use 6509E for all routing • Purchase 2 Cisco ASA 5585 firewall and IPS units • Multi-gigabit firewall and IPS capable
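The trunked 10Gb TOR connections described above could look roughly like the following Cisco IOS-style sketch. This is illustrative only, not Wayne RESA’s actual configuration: the interface names, channel-group number, and VLAN IDs are hypothetical, and NX-OS syntax on a Nexus 5000 differs slightly (e.g. `interface Ethernet1/1`).

```
! Hypothetical TOR switch config: two 10Gb links bundled into one
! logical LACP trunk carrying server and storage VLANs.
interface Port-channel10
 description Trunk to virtualization host (illustrative)
 switchport mode trunk
 switchport trunk allowed vlan 100,200   ! 100 = servers, 200 = iSCSI (hypothetical IDs)
!
interface TenGigabitEthernet1/1
 channel-group 10 mode active            ! "active" = negotiate via LACP
!
interface TenGigabitEthernet1/2
 channel-group 10 mode active
```

Bundling the links this way gives both added bandwidth and link-level redundancy, which is what made the TOR approach scalable at relatively low cost compared to a chassis-only upgrade.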
Process starts • Preliminary meeting with Plante Moran 11/2010 • Information-gathering phase and budget discussions • Divide the project into 2 teams • Facilities • Headed by Rick Crosby, RESA Building Manager • Technical team members as needed • Technical • Headed by Jim Rarus, LAN and Network Manager • What about the fiber optics? • Facilities or Technical? • Added to the Facilities budget • Decided to re-run cabling to 7 building IDF closets • Previous cable path ran down the middle of the computer room
Complications • Drawings assembled by Barton Malow and IDS • RFP released • General contractor selected • Electrical challenge • UPS was large enough but “over-tapped” • Complete power-down needed to provide service to new space • Just power down for the day! (Sure – last time was 7/09) • Wire the new space • Tap into the UPS as first step of the move
Technical Team • Learning strategies • Take advantage of Merit professional learning • Global Knowledge Data Center Design webinar – 3 days • Several annual meetings • MJTS sessions • Trusted vendors • Cisco and resellers • JEM Computer • Dell Computers • Data Strategy • Site visit – Steelcase • Personal research • Challenge to staff – you know what’s wrong with our current infrastructure, here is your chance to correct it! Don’t come back later with a “should have done…”
Fun Begins • Formal project kick-off meeting – 7/27/11 • Team meetings at least once per month • Establish space requirements – 20 racks over 2 aisles plus 1 aisle for future growth (1,200 square feet) with a work area (450 square feet) • Tech team previously selected rack vendor • Ordered 6509E, Nexus, ASA firewalls, 3 HP DL585 host systems and 2 shelves of Dell EqualLogic storage • Connect in new racks in existing space • Demand for computer resources can’t wait for construction
Timeline • Technical team continues to upgrade hardware as needed to meet demands • Short-term electrical work • Construction to start Spring Break (4/2012) • Wall off construction area from existing server area • Set up temporary cooling • 3 units needed for server area • 2 needed for mainframe • Shutdown and move to start Friday evening, July 6, 2012 • Date set in February
Steps along the way • Upgrades • Need to support 10Gb connectivity by 9/2011 • Downtime needed – Sunday 7/24/11 – close to start of school • Implement Nexus and servers/storage as time allowed in new racks • No downtime • Bring 10Gb servers and storage online over the next few months • Firewall over winter break, 2011 • Re-rack most servers and storage • 3 days over winter break, 2011 • Lots of pre-wiring • Major reason why the final July move was so smooth!
Construction work proceeds • Starts with Spring Break after my staff relocates • New office space in another wing – a concern! • Computer room will be a “lights out” operation • Temporary cooling • Works fine – uses lots of flowing water to remove heat • Construction wall • Door left open several times between areas – Oops! • Perhaps plastic with no door a better solution • More of everything • More wiring, removal of more plumbing, still under budget