GridPP: Building a UK Computing Grid for Particle Physics
Professor Steve Lloyd, Queen Mary, University of London
Chair of the GridPP Collaboration Board
Outline • Why? – The CERN LHC and the Data Deluge • What? - GridPP and the Grid • What is the Grid? • Applications and Middleware • Tier-1 and Tier-2 Regional Centres • How? - GridPP Management • Summary and Challenges Public Service Summit - 22 September 2004
What is GridPP? • 19 UK Universities, CCLRC (RAL & Daresbury) and CERN • Funded by the Particle Physics and Astronomy Research Council (PPARC) • GridPP1 (2001-2004, £17m): "From Web to Grid" • GridPP2 (2004-2007, £15m): "From Prototype to Production"
The CERN LHC • The world's most powerful particle accelerator, due to start up in 2007 • 4 Large Experiments
LHC Experiments Searching for the Higgs Particle and exciting new Physics, e.g. in ATLAS: starting from a raw event, looking for the Higgs 'signature' • > 10^8 electronic channels • 8x10^8 proton-proton collisions/sec • 2x10^-4 Higgs per sec • 10 Petabytes of data a year (10 Million GBytes = 14 Million CDs)
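The figures on this slide can be checked with a little back-of-the-envelope arithmetic. The sketch below assumes a standard ~700 MB data CD; everything else comes from the slide.

```python
# Back-of-the-envelope check of the LHC data figures quoted above.
collisions_per_sec = 8e8   # proton-proton collisions per second
higgs_per_sec = 2e-4       # expected Higgs production rate
data_per_year_pb = 10      # Petabytes recorded per year

gb_per_year = data_per_year_pb * 1e6   # 1 PB = 1,000,000 GB (decimal)
cd_capacity_gb = 0.7                   # ~700 MB per CD (assumed)
cds_per_year = gb_per_year / cd_capacity_gb

print(f"Data per year:  {gb_per_year:.0f} GB")            # 10000000 GB
print(f"CDs per year:   {cds_per_year / 1e6:.1f} million") # ~14.3 million
print(f"Collisions per Higgs: {collisions_per_sec / higgs_per_sec:.1e}")
```

So only about one collision in 4x10^12 produces a Higgs, which is why so much data must be recorded and filtered.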
What is the Grid? On a single PC, your programs (Word/Excel, games, email/web) run on an operating system that manages the local CPU and disks. On the Grid, your program runs on middleware (a User Interface machine, a Resource Broker and an Information Service) that manages CPU and disk clusters spread across many sites. Middleware is the Operating System of a distributed computing system
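The roles named on this slide can be sketched as a toy program. This is illustrative only, not the real EDG/LCG middleware APIs: the class names, the site names and the "least-loaded cluster" policy are all assumptions made for the sketch.

```python
# Toy sketch of the Grid middleware roles described above (illustrative
# only): an Information Service publishes the state of each cluster, and
# a Resource Broker matches a job to a suitable, lightly loaded cluster.
from dataclasses import dataclass


@dataclass
class Cluster:
    name: str
    free_cpus: int
    free_disk_tb: float

    def submit(self, job_name: str) -> str:
        self.free_cpus -= 1
        return f"{job_name} running on {self.name}"


class InformationService:
    """Keeps a view of all resources on the Grid."""

    def __init__(self, clusters):
        self.clusters = clusters

    def query(self, min_cpus=1, min_disk_tb=0.0):
        return [c for c in self.clusters
                if c.free_cpus >= min_cpus and c.free_disk_tb >= min_disk_tb]


class ResourceBroker:
    """Matches a job to a resource, the way an OS scheduler picks a CPU."""

    def __init__(self, info_service):
        self.info = info_service

    def run(self, job_name, disk_tb=0.0):
        candidates = self.info.query(min_disk_tb=disk_tb)
        if not candidates:
            raise RuntimeError("no matching resource on the Grid")
        best = max(candidates, key=lambda c: c.free_cpus)  # least loaded
        return best.submit(job_name)


broker = ResourceBroker(InformationService([
    Cluster("RAL-Tier1", free_cpus=700, free_disk_tb=80.0),
    Cluster("ScotGrid", free_cpus=120, free_disk_tb=10.0),
]))
print(broker.run("atlas-simulation", disk_tb=1.0))  # picks RAL-Tier1
```

The point of the analogy: the user never names a site, just as a desktop program never names a CPU core; the middleware makes that choice.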
What is the Grid? [Before-and-after illustration]
• EU DataGrid (EDG) 2001-2004: Middleware Development Project, International Collaboration • LHC Computing Grid (LCG): Grid Deployment Project for LHC • EU Enabling Grids for e-Science in Europe (EGEE) 2004-2006: Grid Deployment Project for all disciplines • US and other Grid projects: Interoperability
The LCG Grid
Grid Snapshot
GridPP1 Areas • Grid Application Development: LHC and US Experiments + Lattice QCD • UK Tier-1/A Regional Centre: Hardware and Manpower • Management, Travel etc. • LHC Computing Grid Project (LCG): Applications, Fabrics, Technology and Deployment • European DataGrid (EDG): Middleware Development
GridPP2 Areas • LHC Computing Grid Project (LCG): Manpower • Management, Travel etc. • UK Tier-1/A Regional Centre: Hardware • Middleware, Security and Networking: Manpower • UK Tier-1/A: Manpower • Grid Application Development: LHC and US Experiments + Lattice QCD, Phenomenology and Generic Portal • UK Tier-2 Regional Centres: Manpower
Application Development • AliEn → ARDA • BaBar • GANGA • Lattice QCD • SAMGrid • CMS
Middleware Development • Network Monitoring • Configuration Management • Grid Data Management • Storage Interfaces • Information Services • Security
UK Tier-1/A Centre • High quality data services • National and International Role • UK focus for International Grid development • 700 dual-CPU machines • 80 TB Disk • 60 TB Tape (Capacity 1 PB) • Grid Operations Centre
UK Tier-2 Centres • ScotGrid: Durham, Edinburgh, Glasgow • NorthGrid: Daresbury, Lancaster, Liverpool, Manchester, Sheffield • SouthGrid: Birmingham, Bristol, Cambridge, Oxford, RAL PPD, Warwick • LondonGrid: Brunel, Imperial, QMUL, RHUL, UCL • Mostly funded by HEFCE
GridPP in Context [Diagram, not to scale: GridPP sits within the UK Core e-Science Programme, linking CERN LCG and EGEE with the UK Tier-1/A, the Tier-2 Centres, the Institutes and Experiments, Application Development and Integration, Middleware, Security and Networking, supported by the Grid Support Centre]
Management [Organisation chart: the Collaboration Board and the Project Management Board oversee the Project Leader, Project Manager, Production Manager and Dissemination Officer, with a User Board, Deployment Board, Tier-1 Board, Tier-2 Board, Deployment Team, Project Map, Risk Register, EGEE Leader and CERN LCG Liaison]
Summary 2001 → 2004 → 2007: from separate experiments, resources and multiple accounts, through prototype Grids, to 'one' Production Grid • Experiment Grids: BaBarGrid (BaBar), SAMGrid (CDF, D0), GANGA (ATLAS, LHCb), plus ALICE and CMS, with EDG and ARDA feeding into EGEE and LCG • CERN Computer Centre → CERN Prototype Tier-0 Centre → CERN Tier-0 Centre • RAL Computer Centre → UK Prototype Tier-1/A Centre → UK Tier-1/A Centre • 19 UK Institutes → 4 UK Prototype Tier-2 Centres → 4 UK Tier-2 Centres
Challenges [Illustration: a CD stack with 1 year of LHC data (~20 km) vs Concorde (15 km); we are here (1 km)] • Scaling to full size: ~10,000 → 100,000 CPUs • Stability, Robustness etc. • Security • Sharing resources (in an RAE environment!) • International Collaboration • Continued funding beyond the start of LHC!
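The CD-stack comparison in the illustration follows from the earlier 14-million-CD figure. The sketch below assumes a disc thickness of 1.2 mm (the standard CD thickness); the CD count comes from the earlier slide.

```python
# Rough check of the CD-stack height quoted in the illustration.
cds = 14e6              # CDs per year of LHC data (from earlier slide)
cd_thickness_mm = 1.2   # thickness of one disc (assumed, standard CD)

stack_km = cds * cd_thickness_mm / 1e6   # mm -> km
print(f"CD stack height: ~{stack_km:.0f} km")  # ~17 km, i.e. of order 20 km
```

That is comfortably above Concorde's 15 km cruising altitude, which is the point the illustration makes.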
Further Info: http://www.gridpp.ac.uk