The LHC Computing Grid
Visit of Professor Jerzy Szwed, Under Secretary of State, Ministry of Science and Higher Education, Poland
Tuesday 23rd February 2010
Frédéric Hemmer, IT Department Head
The ATLAS experiment
7,000 tons, 150 million sensors generating data 40 million times per second, i.e. about a petabyte per second.
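As a back-of-the-envelope check (assuming roughly one byte per sensor reading, a figure not given on the slide), the quoted numbers do land at the petabyte-per-second scale:

```python
# Order-of-magnitude check of the ATLAS raw data rate.
# ASSUMPTION (not on the slide): ~1 byte of payload per sensor reading.
SENSORS = 150e6   # 150 million sensors
RATE_HZ = 40e6    # read out 40 million times per second

raw_bytes_per_s = SENSORS * RATE_HZ * 1  # 1 byte per reading, assumed
print(f"~{raw_bytes_per_s / 1e15:.0f} PB/s before zero-suppression")  # ~6 PB/s
```

Zero-suppression discards the vast majority of empty channels, which is why the slide quotes "a petabyte/s" as the effective order of magnitude.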
A collision at the LHC
The Data Acquisition
Tier 0 at CERN: Acquisition, First-Pass Processing, Storage & Distribution
1.25 GB/sec (ions)
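For scale, converting the quoted ion recording rate into a daily volume (a derived figure, not stated on the slide):

```python
# Converting the 1.25 GB/s ion recording rate into a daily volume.
rate_gb_s = 1.25
tb_per_day = rate_gb_s * 24 * 3600 / 1000
print(f"~{tb_per_day:.0f} TB per day of ion running")  # ~108 TB/day
```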
The LHC Data Challenge
The accelerator will run for 10-15 years.
The experiments will produce about 15 million gigabytes (15 petabytes) of data each year, roughly 20 million CDs' worth.
LHC data analysis requires computing power equivalent to ~100,000 of today's fastest PC processors.
This requires many cooperating computer centres, as CERN alone can provide only ~20% of the capacity.
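The CD comparison checks out, assuming a standard 700 MB CD (the capacity is not stated on the slide):

```python
# Checking the "about 20 million CDs" claim against the annual volume.
annual_gb = 15e6   # 15 million gigabytes per year
cd_gb = 0.7        # 700 MB per CD, assumed

print(f"~{annual_gb / cd_gb / 1e6:.0f} million CDs per year")  # ~21 million
```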
Solution: the Grid
• The World Wide Web provides seamless access to information stored in millions of different geographical locations.
• The Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe.
The idea: use the Grid to unite the computing resources of particle physics institutes around the world.
Tier 0 – Tier 1 – Tier 2
• Tier-0 (CERN): data recording, initial data reconstruction, data distribution
• Tier-1 (11 centres): permanent storage, re-processing, analysis
• Tier-2 (~130 centres): simulation, end-user analysis
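A minimal sketch of the tier model above; the site counts and roles come from the slide, while the data structure itself is purely illustrative:

```python
# Illustrative summary of the WLCG tier hierarchy described on the slide.
TIERS = {
    "Tier-0 (CERN)":         ["data recording", "initial reconstruction",
                              "data distribution"],
    "Tier-1 (11 centres)":   ["permanent storage", "re-processing", "analysis"],
    "Tier-2 (~130 centres)": ["simulation", "end-user analysis"],
}

for tier, roles in TIERS.items():
    print(f"{tier}: {', '.join(roles)}")
```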
The CERN Tier-0
• 24x7 operator support and system administration services to support 24x7 operation of all IT services
• Hardware installation & retirement (~7,000 hardware movements/year)
• Management and automation framework for large-scale Linux clusters
• Installed capacity:
• 6,300 systems, 39,000 processing cores (CPU servers, disk servers, infrastructure servers); tenders planned or in progress: 2,400 systems, 16,000 processing cores
• 13,900 TB usable on 42,600 disk drives; tenders planned or in progress: 19,000 TB usable on 20,000 disk drives
• 34,000 TB on 45,000 tape cartridges (56,000 slots), 160 tape drives
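The installed-capacity figures imply the following average per-unit capacities (derived numbers, not stated on the slide):

```python
# Average per-unit capacities implied by the installed-capacity figures.
disk_tb, drives = 13_900, 42_600
tape_tb, carts = 34_000, 45_000

print(f"~{disk_tb / drives * 1000:.0f} GB per disk drive")     # ~326 GB
print(f"~{tape_tb / carts * 1000:.0f} GB per tape cartridge")  # ~756 GB
```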
The European Network Backbone
• LCG working group with Tier-1s and national/regional research network organisations
• New GÉANT2 research network backbone
• Strong correlation with major European LHC centres
• Swiss PoP at CERN
Overall summary
• November: ongoing productions, cosmics data taking
• November – December: beam data and collisions, productions + analysis
• December – February: ongoing productions, cosmics
• The WLCG service has been running according to the defined procedures
• Reporting and follow-up of problems at the same level
• Middleware process (updates & patches) as planned
2009 Physics Data Transfers
• Final readiness test (STEP'09)
• Preparation for LHC startup
• LHC physics data: nearly 1 petabyte/week
• More than 8 GB/s peak transfers from CASTOR file servers at CERN
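Putting the sustained weekly volume next to the quoted peak (the sustained average is derived, not stated on the slide):

```python
# Comparing the sustained weekly transfer volume with the quoted peak rate.
week_s = 7 * 24 * 3600
avg_gb_s = 1e6 / week_s  # 1 PB/week expressed in GB/s
print(f"1 PB/week ~ {avg_gb_s:.2f} GB/s sustained, vs >8 GB/s peaks")  # ~1.65
```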
Grid Computing Now
Impact of the LHC Computing Grid in Europe
• LCG has been the driving force behind the European multi-science Grid EGEE (Enabling Grids for E-sciencE)
• EGEE is now a global effort, and the largest Grid infrastructure worldwide
• Co-funded by the European Commission (cost ~170 M€ over 6 years, of which ~100 M€ funded by the EU)
• EGEE is already used for >100 applications, including archaeology, astronomy, astrophysics, civil protection, computational chemistry, earth sciences, finance, fusion, geophysics, high-energy physics, life sciences, multimedia, materials science, …
In numbers: >250 sites, 48 countries, >50,000 CPUs, >20 petabytes, >10,000 users, >150 VOs, >150,000 jobs/day
[Slide: example Grid applications in medical imaging and life sciences: Similarity Search, Measurement of Pulmonary Trunk, Temporal Modelling, RV and LV Automatic Modelling, Visual Data Mining, Surgery Planning, Genetics Profiling, Treatment Response, Personalised Simulation, Inferring Outcome, Semantic Browsing, Biomechanical Models, Tumor Growth Modelling]
Sustainability
• Need to prepare for a permanent Grid infrastructure
• Ensure a high quality of service for all user communities
• Independent of short project funding cycles
• Infrastructure managed in collaboration with National Grid Initiatives (NGIs)
• European Grid Initiative (EGI)
For more information about the Grid:
www.cern.ch/lcg
www.gridcafe.org
www.eu-egi.org/
www.eu-egee.org
Thank you for your kind attention!