System Hardware (Draft), 08/09/2000
What hardware will we need for:
• Development of DSpace
• Initial rollout of DSpace to customers
• DSpace usage through ~06/2002
Proposed Hardware, FY 2000
[Architecture diagram: production and development environments]
Production Environment:
• Web requests enter through an IP switch that spreads load across three app servers.
• "Big Brother": N-Class production server.
• "Little Brother": N-Class staging server, upgradable to carry the production load if necessary while upgrades and/or maintenance occur on Big Brother (see the switchover sketch after this slide).
• RAID storage: mirrored metadata, unmirrored data silo, on each of two partitions (staging & regression; production).
• WS / app server front ends; backup server.
Development Environment:
• "TrailBlazer": N-Class dev server, with WS / app server and workstations (WS).
• Tape backup system.
Legend: WS = workstation; components are marked as either new in FY 2000 or existing.
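The key property of this layout is that Little Brother can stand in for Big Brother during maintenance. As a rough illustration only (the role names below are hypothetical and the real cutover happens at the IP switch, not in software), the swap amounts to exchanging which machine the switch treats as "production":

```python
# Hypothetical role table for the front-end switch; the real device does this in hardware.
roles = {
    "production": "big-brother",     # N-Class production server
    "staging": "little-brother",     # N-Class staging / regression server
}

def swap_roles(roles):
    """Send production traffic to the staging box while Big Brother is maintained."""
    roles["production"], roles["staging"] = roles["staging"], roles["production"]
    return roles

swap_roles(roles)
assert roles["production"] == "little-brother"
```

Because the swap is symmetric, the same step moves traffic back to Big Brother once maintenance finishes.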
Proposed Hardware, FY 2001
[Architecture diagram: the FY 2000 infrastructure plus production capacity]
Production Environment:
• Web requests enter through two IP switches spreading load across four app servers plus a video server.
• Big Brother and Little Brother N-Class servers each gain extra CPU / DRAM.
• RAID and tape backup systems upgraded to production capacity; backup server; WS / app server.
Development Environment:
• TrailBlazer N-Class dev server, WS / app server, tape backup system, workstations (WS).
Legend: WS = workstation; components are marked as either new in FY 2001 or existing.
Rationalization (1)
• Need distinct development and production environments
  • Must develop & improve DSpace while customers are using it.
  • Set up this infrastructure sooner rather than later.
• Need a staging server distinct from the production server
  • Primary problem solved: regression testing before production installation.
  • Secondary problem solved: prod/staging servers are switchable (no installation- or upgrade-induced downtime).
• Need app server web front ends
  • For scaling traffic; start with two to prove the approach works (a dispatch sketch follows this list).
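The app server front ends behave as a load-balanced pool behind the IP switch: each incoming request goes to the next available front end, and capacity grows by adding servers to the pool. A minimal sketch of that idea, assuming hypothetical hostnames and a crude TCP health check (the actual balancing is done by the Local Director hardware, not in software like this):

```python
import itertools
import socket

# Hypothetical app-server front ends behind the IP switch (names are illustrative only).
APP_SERVERS = ["app1.dspace.example", "app2.dspace.example"]

def healthy(host, port=80, timeout=2.0):
    """Crude health check: can we open a TCP connection to the app server?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def dispatch(requests):
    """Round-robin each incoming request to the next healthy front end."""
    pool = itertools.cycle(APP_SERVERS)
    for request in requests:
        for _ in range(len(APP_SERVERS)):
            host = next(pool)
            if healthy(host):
                yield (request, host)
                break
        else:
            raise RuntimeError("no healthy app servers available")
```

Scaling to a third front end is just one more entry in the pool; that is the property the switch provides in hardware and the reason the plan starts with two servers to prove it out.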
Rationalization (2)
• Need mirrored metadata (to avoid having to reconstruct a database).
• Do NOT need a mirrored data silo (tape backup with < 24-hour latency will suffice; a restore-window check is sketched below).
• Buy only enough CPUs, disk, and tape to get by in FY 2000.
  • Get the entire server infrastructure in place, but…
  • Prices of CPUs and disks will fall with time, so wait until FY 2001 to buy and install production capacity.
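The "< 24-hour latency" figure is effectively a recovery-point target for the unmirrored data silo: data lost between the last tape run and a failure must be less than a day old. A minimal sketch of checking that target, assuming the last-backup timestamp comes from some hypothetical backup catalog:

```python
from datetime import datetime, timedelta

# Recovery-point target for the unmirrored data silo: at most one day of loss.
RPO = timedelta(hours=24)

def within_rpo(last_backup, now=None):
    """True if the most recent tape backup is recent enough to meet the target."""
    now = now or datetime.now()
    return now - last_backup <= RPO

# Example: a backup taken 18 hours ago still satisfies the 24-hour window.
assert within_rpo(datetime.now() - timedelta(hours=18))
```

Metadata is a different case: reconstructing the database from tape would be slow and error-prone, which is why it is mirrored rather than covered only by this window.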
What to Buy, FY 2000
• (1) Local Director IP switch: prod load balance
• (1) N-Class, 2-way, 4 GB RAM, 200 GB RAID: Big Brother
• (1) N-Class, 2-way, 4 GB RAM: Little Brother
• (3) LpR, 2-way, 1 GB RAM: App servers
• (1) N-Class, 2-way, 4 GB RAM: Dev server
• (1) A-Class, ?? 1-way, 1 GB RAM ??: Backup server
• (1) Tape system: Backup storage
What to Buy, FY 2001
• +1 Local Director IP switch: staging load balance
• CPU/DRAM upgrades: Big/Little Brother
• +1 LpR: app server
• Disk upgrades: prod server
• Prod backup latency upgrades: backup server/storage
• (1) Streaming video server
• Software licenses