Virtual Server Self-Service Provisioning
Juraj Sucik, System Architect, CERN
What does « CERN » stand for?
1952: Conseil Européen pour la Recherche Nucléaire (European Council for Nuclear Research)
1954: Organisation Européenne pour la Recherche Nucléaire (European Organization for Nuclear Research), also known as the European Laboratory for Particle Physics
The largest particle physics lab in the world
People: 2415 staff, 730 fellows and associates, 200 students, 9133 users, 2000 external firm personnel
Annual budget in 2007: 982 MCHF (610 MEUR), plus external funding for experiments
Twenty Member States: Austria, Belgium, Bulgaria, Czech Republic, Denmark, Finland, France, Germany, Greece, Italy, Hungary, Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland, United Kingdom
Eight Observer States: European Commission, USA, Russian Federation, India, Israel, Japan, Turkey, UNESCO
Do fundamental research
By answering questions like: what is the structure of matter?
From the 4th-5th century BC, to the end of the 19th century, the beginning of the 20th century, and the 1960s
Checking existing theories: the Standard Model
Ordinary matter: leptons (electron neutrino, electron; muon neutrino, muon; tau neutrino, tau) and quarks (up, down; charm, strange; top, bottom)
4 forces and their carriers: gluons (strong force), photons (electromagnetic force), bosons (weak force), gravitons (gravity)
Images: www.particlezoo.net
Answering fundamental questions…
• Higgs boson: how do we explain that particles have mass? Newton could not explain it; neither can we…
• What is 96% of the Universe made of? We can only see 4% of its estimated mass!
• Why isn't there antimatter in the Universe? Nature should be symmetric…
• What was the state of matter just after the « Big Bang »? Travelling back to the earliest instants of the Universe would help…
Bringing nations together and educating
• Hundreds of physics institutes
• Half of the world's particle physicists
• Biggest international scientific collaboration
• Various student programmes
• Over 100 countries
At incredible levels of energy!
• E = mc²
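To get a feel for what E = mc² implies, here is an illustrative calculation (the numbers are mine, not from the slides): converting a single gram of mass entirely into energy yields

```latex
E = mc^2 = (10^{-3}\,\mathrm{kg}) \cdot \left(3\times 10^{8}\,\mathrm{m/s}\right)^2
         = 9\times 10^{13}\,\mathrm{J} \approx 25\,\mathrm{GWh}
```

which is why even the tiny masses of accelerated particles correspond to enormous collision energies.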
The largest particle accelerators
• 27 km (17 mile) long tunnel
• Thousands of superconducting magnets
• Ultra-high vacuum: 10x emptier than on the Moon
• Coldest place in the Universe: −271 °C
• In safe conditions!
The biggest and most sophisticated detectors
• Cathedrals of science 100 m underground
• 600 million collisions per second, detected by hundreds of millions of sensors
• Thousands of collaborators for each detector
• In safe conditions!
Practical applications: the World Wide Web
• Developed in the framework of the LHC project in 1989!
• Freely given to the World!
Practical applications: cancer treatment • For both detection and cure of cancers • PET Scans • Hadron Therapy
Practical applications: detectors • Scanning trucks in less than one hour without unloading them!
Practical applications: using the Grid
• Ultra high-speed processing of satellite imagery in the case of natural disasters
And of course… some Nobel prizes!
Georges Charpak: "for his invention and development of particle detectors, in particular the multiwire proportional chamber"
Carlo Rubbia (with Simon van der Meer): "for their decisive contributions to the large project, which led to the discovery of the field particles W and Z, communicators of weak interaction"
IT Infrastructure at CERN
• General purpose computing environment
• Administrative computing services
• Physics and engineering computing
• Consolidation, coordination and standardization of computing activities
• Physics applications (e.g., for data acquisition & offline analysis)
• Accelerator design and operations
LHC Data Every Year
• 40 million collisions per second
• After filtering, 100 collisions of interest per second
• > 1 Megabyte of data digitized per collision, giving a recording rate > 1 Gigabyte/sec
• 10¹⁰ collisions recorded each year, giving stored data > 15 Petabytes/year
• Analysis requires computing power equivalent to ~100,000 of today's fastest processors
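The yearly volume quoted above can be sanity-checked with simple arithmetic; a minimal sketch, assuming ~1.5 MB per recorded collision (the slide only says "more than 1 Megabyte"):

```python
# Back-of-the-envelope check of the LHC yearly data volume.
# Assumption (not from the slides): ~1.5 MB digitized per recorded collision.
COLLISIONS_PER_YEAR = 10**10   # 10^10 collisions recorded each year
BYTES_PER_COLLISION = 1.5e6    # ~1.5 MB, hedged estimate
PETABYTE = 1e15

stored_pb = COLLISIONS_PER_YEAR * BYTES_PER_COLLISION / PETABYTE
print(f"Stored data: ~{stored_pb:.0f} PB/year")
```

With that per-event size, the result lands on the ~15 PB/year figure the slide quotes.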
Computing power available at CERN • High-throughput computing based on reliable “commodity” technology • More than 35’000 CPUs in about 6000 boxes (Linux) • 14 Petabytes on 14’000 drives (NAS Disk storage) • 34 Petabytes on 45’000 tape slots with 170 high speed drives Nowhere near enough!
Computing for LHC
• Problem: even with the Computer Centre upgrade, CERN can provide only a fraction of the necessary resources
• Solution: data centres, which were isolated in the past, will be connected, uniting the computing resources of particle physicists worldwide
Users of CERN: in Europe, 267 institutes and 4603 users; outside Europe, 208 institutes and 1632 users
What is the Grid? • The World Wide Web provides seamless access to information that is stored in many millions of different geographical locations • In contrast, the Grid is an emerging infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe
The most extensive scientific computing grid
• 15 Petabytes (15 million GB) of data every year
• 100'000 processors
• 200 computer centres around the planet
• Expected to run 100 million jobs
• Used by 5000 scientists in 500 institutes
Why virtual? • Steady flow of requests for dedicated servers in the CERN computer centre • Excellent network connectivity • Reliable power supply, cooling • 24x365 monitoring with operator’s presence • Daily tape backup • Use the hardware without owning the responsibility (maintenance, procurement) • Focus on application without sharing the resources • Improve the CPU utilization of grid nodes • Optimize TCO
Infrastructure as a Service (IaaS)
Owning physical hardware: extra effort to procure and maintain HW; delivery time of several weeks; lack of flexibility; not easy to adapt to dynamic patterns
With IaaS: ready in ~minutes; highly flexible; efficient capacity planning
Experience since 2006 • Server Self Service Center (S3C) • Choose your server from a set of predefined images • Take resources from the pool of available HW • Available within minutes
Requirements have evolved • New requirements identified • Flexibility of resource allocation • Higher performance • High-availability model adapted to customers • Larger scale • Efficient management Source: Gartner (August 2008)
Why Hyper-V?
• A built-in component of the operating system
• Creates powerful VMs: 64-bit support for guests, Linux support
• High availability and quick migration
• Manageability
• High performance, reliability, security
• VHD compatibility
Why SCVMM?
• One solution to centrally manage the entire virtual infrastructure
• Windows PowerShell API
• V2V and P2V capabilities
• Web portal
• Intelligent placement
• Library and templates
• Delegated management roles
• Job history
• Support for highly available VMs
• VM migration
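SCVMM's "intelligent placement" rates candidate hosts and recommends the best one for a new VM. A minimal sketch of the idea in Python (the host fields and the scoring rule here are illustrative assumptions, not SCVMM's actual algorithm):

```python
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    free_cpu_cores: int
    free_ram_gb: float

def placement_score(host: Host, vm_cores: int, vm_ram_gb: float) -> float:
    """Illustrative rating: leftover headroom after placing the VM."""
    if host.free_cpu_cores < vm_cores or host.free_ram_gb < vm_ram_gb:
        return -1.0  # host cannot fit the VM at all
    # Weight remaining CPU and RAM equally (an assumption, not SCVMM's rule).
    return (host.free_cpu_cores - vm_cores) + (host.free_ram_gb - vm_ram_gb)

def best_host(hosts, vm_cores, vm_ram_gb):
    ranked = sorted(hosts, key=lambda h: placement_score(h, vm_cores, vm_ram_gb),
                    reverse=True)
    top = ranked[0]
    return top if placement_score(top, vm_cores, vm_ram_gb) >= 0 else None

hosts = [Host("hv01", 2, 4.0), Host("hv02", 8, 16.0)]
print(best_host(hosts, vm_cores=4, vm_ram_gb=8.0).name)  # hv02
```

The real feature also factors in things like network and disk load; the point is only that placement reduces to ranking hosts by a fitness function.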
System Architecture
The CERN Virtual Infrastructure web interface (covering application management, OS maintenance and backups) calls SOAP services, which drive Microsoft System Center Virtual Machine Manager through its Windows PowerShell API; the Virtual Machine Manager admin console and database connect over the LAN.
Challenges
• Console access from Linux clients
• Missing .NET API for SCVMM
• Time synchronization issues in guests
Experiences
• Cost-efficient, customized cloud computing infrastructure
• Maintenance with limited downtime
• Disaster recovery of VMs within minutes
• Improved performance compared to Virtual Server 2005
Source: http://blogs.msdn.com/modonovan
Experiences
• 172 running virtual machines
• 17 templates
• Scalable to a large number of VMs
• Expiration handling
• Green computing
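Expiration handling keeps the pool from filling up with forgotten machines: each VM carries an expiry date, and a periodic job flags the ones past it for notification or removal. A minimal sketch under assumed field names (not the actual S3C implementation):

```python
from datetime import date, timedelta

# Hypothetical VM records: name -> lease expiration date.
vms = {
    "build-server": date.today() + timedelta(days=30),
    "lhc-demo":     date.today() - timedelta(days=2),   # already expired
}

def expired_vms(vms, today=None):
    """Return names of VMs whose lease has run out."""
    today = today or date.today()
    return [name for name, expires in vms.items() if expires < today]

for name in expired_vms(vms):
    print(f"VM '{name}' has expired; notify owner / schedule removal")
```

In practice such a job would run daily and give owners a grace period to renew before the VM is reclaimed.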
Real life use cases
• Video streaming for LHC First Beam Day: 6 virtual machines needed for ~1 week
• Terminal servers for engineering apps: terminal servers installed with older versions of the apps
• Oracle application servers
Real life use cases
• CERN Media Archive
• CERN Alerter web server: formerly a physical server with 2x CPU and 4 GB RAM; an upgrade was necessary because of an OS driver issue, so a virtual server was set up "on demand", with resources limited to 1x CPU and 2 GB RAM
• Physics analysis running in VMs
• Etc.
Future work
• Upgrade to Hyper-V 2.0 & SCVMM 2008 R2
• Use the new "Cluster Shared Volumes" feature
• Use the new "Rapid Provisioning" feature
• VDI functionality
Conclusion
• Innovative physics research laboratory
• Pushing the latest technology to its limits
• Moving services to the cloud
Visit our websites:
Information: www.cern.ch
CERN TV: www.youtube.com/cern
Recruitment: www.cern.ch/jobs