Onward and Upward: Grid Futures and GridPP
GridPP 4th Collaboration Meeting, Manchester, 10 May 2002
John Gordon, eSC, RAL
HEP Grids
[slide figure: LCG among the HEP grid projects]
The Future?
• DataGrid
• LCG
• DataTAG
• InterGrid
• UK eScience Core Programme
• US experiments
• GridPP Grid?
DataGrid Future
• Testbed 1.2 integration starts next Monday (13th)
  • Experiments let loose on the 20th
  • Decision on production on the 24th
• Make this a stable release
  • Special effort by WP1
  • Pressure not to progress to TB1.3 until it is stable
  • Experiments have high expectations
• TB2.0 September 2002
  • Several major releases of software due
• TB3.0 March 2003
  • First feasible release using OGSA, even if early adopters start now
DataGrid Future
• EDG already adopted by other Grids
  • Pressure to deliver their requirements too
• Pressure to expand testbeds or integrate with others
  • DataTAG, CrossGrid, GLUE
• As an early Grid, pressure to demonstrate
  • Already interfering with the testbed
• EDG has gelled, mustn't get pulled apart
• How much UK involvement?
  • In the short term: as many sites as possible, to prove their grid credentials
  • In the longer term: sites may prefer experiment grids
The LHC Computing Grid Project
Goal: prepare and deploy the LHC computing environment
• applications: tools, frameworks, environment, persistency
• computing system services
• cluster → automated fabric
• collaborating computer centres → grid
• CERN-centric analysis → global analysis environment
• foster collaboration and coherence of the LHC regional computing centres
• central role of data challenges
This is not yet another grid technology project – it is a grid deployment project
The LHC Computing Grid Project: Two Phases
Phase 1: 2002-05
• Development and prototyping
• Approved by CERN Council 20 September 2001
• Funded by special contributions from member and observer states, including GridPP
Phase 2: 2006-08
• Installation and operation of the full worldwide initial production Grid
• Costs (materials + staff) included in the LHC cost-to-completion estimates
LCG Funding
• The LCG Project has two facets
• Management of the resources at CERN
  • CERN base budget and special pledges
  • materials and people
• Coordination of the overall activities
  • important external resources (materials and personnel)
    • regional centres
    • grid projects
    • experiment applications effort
Funding at CERN – preliminary planning, will evolve as the requirements and priorities are defined
[slide figure: "mountain vs. molehill" funding chart]
Special-funding staff status:
• arrived: 10
• on contract: 18
• in process: 4-5
Recruitment under way:
• FZK: 8
• PPARC: up to 15
• EPSRC: up to 5
• Italy: ~5
• Funded by EU DataGrid: 7
Area Coordination
• Applications – Torre Wenaus
  • common frameworks, libraries
  • general support for applications
• Computing Fabrics – Wolfgang von Rüden
  • basic computing systems and technology
  • CERN Tier 0+1
  • automated system management
• Grid Technology – Fabrizio Gagliardi
  • ensuring that the appropriate middleware is available
• Grid Deployment
  • Regional Centre & Grid Deployment Policy – Mirco Mazzucato
    • authentication, authorisation, formal agreements, computing rules, sharing, reporting, accounting, … (GDMB)
  • Data Challenge & Grid Operation – open post
    • a stable, reliable, manageable Grid for Data Challenges and regular production work
Grid Technology and the LCG
• The LCG is a Grid deployment project
• So LCG is a consumer of Grid technology, rather than a developer
• There are many Grid technology projects, with different objectives, timescales and spheres of influence
Collaboration with Grid Projects
• Datagrid, iVDGL, PPDG, DataTAG, GriPhyN, CrossGrid, …
• LCG must deploy a GLOBAL GRID
  • essential to have compatible middleware and grid infrastructure
  • better: have identical middleware
• Coordination
  • HICB, HIJTB for agreement on common approaches and interoperability (e.g. GLUE)
  • and a great deal of work by Fab
• The R&D grid projects cannot take on commitments for implementation, support and operation of a grid for LHC
Multiple Testbeds
[slide figure: developers' and development testbeds (iVDGL, Datagrid) feeding the LCG Production Grid]
Grid projects:
• responsible for testbeds on which developers can build and test their software
• testbeds for demos, beta testing, scheduled data challenges
LCG:
• responsible for operating a production Grid
  • 24 x 7 service
  • "as easy to use as LXBATCH"
LCG Grid Deployment (DRAFT)
• ~April 03 – LCG-1 Global Grid Service
  • deploy a sustained 24 x 7 service
  • based on the "converged" toolkit emerging from GLUE and delivered by Datagrid and iVDGL
  • ~10 sites, including sites in Europe, Asia and North America
  • > 2 times the CERN prototype capacity (= ~1,500 processors, ~1,500 disks)
  • permanent service for all experiments
• ~Oct 03
  • reliability & performance targets
  • wider deployment
• This Grid service evolves slowly through 2005
  • new middleware: functionality, performance
  • grows with more sites and users
  • provides continuous service for the LHC experiments
Short-term
• Work closely with Datagrid and iVDGL during 2002
• Get experience; work on "productisation"
• Agree on which areas are the responsibility of the Grid projects, and which are the long-term responsibility of LCG
2002 – Grow LCG Production Grid Services, Leveraging Grid Project Experience
[slide figure: activities mapped to the parties responsible]
• Stabilise middleware; middleware maintenance and release process; Datagrid testbed & data challenges; iVDGL testbed & data challenges; tests of the converged testbed
  → Datagrid, iVDGL; DataTAG, iVDGL
• Integrated distribution package; middleware/apps support and experiment environment; infrastructure operation; CA organisation; information services; helpdesk; bug management; licence management; user rules; installation guide; operations manual; user guide; VO management; LCG Global Grid operation
  → share infrastructure and operation with the Grid projects: LCG (Regional Centres, Institutes, CERN, …)
Dependencies on Datagrid, DataTAG and iVDGL
• Summer 2002: demonstrate that the testbed can do real work for the LHC experiments, i.e. used by the data challenges
• Sept 2002: GLUE recommendations available
• Oct 2002: agree on the middleware for LCG-1 (based on GLUE)
• December 2002: middleware for LCG-1 implemented and available
UK Interest
• PPARC funding, UK staff, UK representation
  • Leverage as much back to the UK as possible
• RTAG#6 – defining Tier Centres
  • Give us targets
• Participation in the prototype production grid
  • RAL + ? – more limited than EDG
  • More sites after October 2003
DataTAG
• Mainly a networking research/deployment project
• Will affect the networks we run over during Grid production…
  • …but probably no effect on production networks during the lifetime of GridPP
• Well summarised yesterday by Richard
• Let's support them by trying their testbed networks…
  • …and get some useful bandwidth in the process
  • …to see what might be possible in future
• But then there is WP4
WP4 Interoperability
Task 4.1 Networked resource discovery system (months 1-24)
• Join Information Services between Grid domains
• In the simple case, the software and architecture are the same on both sides
• In the harder case, they are different
Task 4.2 Networked resource access policy, authorization and security (months 1-24)
• Interoperability across authentication domains
• Investigate CAS
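As an illustration of what "joining Information Services" means at the protocol level, here is a minimal sketch of querying two Globus-era GIIS endpoints over LDAP and merging the entries. The hostnames, port and base DN are assumptions made for the example, not actual GridPP or DataTAG values.

```python
# Minimal sketch: query two GIIS (LDAP) information services and merge entries.
# Hostnames, port and base DN below are illustrative assumptions only.
from ldap3 import Server, Connection, SUBTREE

GIIS_ENDPOINTS = [
    ("giis.example-gridpp.ac.uk", 2135),   # hypothetical UK-domain GIIS
    ("giis.example-datatag.org", 2135),    # hypothetical partner-domain GIIS
]
BASE_DN = "Mds-Vo-name=local, o=grid"      # conventional MDS base DN (assumed)

def query_giis(host, port):
    """Return (dn, attributes) pairs published by one domain's GIIS."""
    conn = Connection(Server(host, port=port), auto_bind=True)  # anonymous bind
    conn.search(search_base=BASE_DN,
                search_filter="(objectClass=*)",
                search_scope=SUBTREE,
                attributes=["*"])
    return [(e.entry_dn, e.entry_attributes_as_dict) for e in conn.entries]

def merged_view():
    """Join the two domains into one flat resource view, keyed by DN."""
    view = {}
    for host, port in GIIS_ENDPOINTS:
        for dn, attrs in query_giis(host, port):
            view[dn] = attrs   # the harder case: schemas may differ per domain
    return view

if __name__ == "__main__":
    for dn in sorted(merged_view()):
        print(dn)
```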
WP4 Interoperability
Task 4.3 Interworking between domain-specific collective Grid services (months 1-24)
• One way to build large-scale Grids is to join a set of smaller independent Grids. To enable effective interoperability between sub-Grids, at least two conditions should be met:
  1. a relatively light Grid interface exists which abstracts all the underlying Grid services;
  2. the interface is based on standard protocols.
• This task addresses both of the above points.
• GENIUS?? SL?
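To make the idea of a "light Grid interface" concrete, the sketch below shows one possible shape for such an abstraction. The class and method names are invented for illustration and do not correspond to any DataTAG deliverable.

```python
# Illustrative sketch only: a possible "light" interface abstracting the
# services of an underlying sub-Grid. All names are invented for this example.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class JobSpec:
    executable: str
    arguments: list[str]
    input_files: list[str]

class SubGrid(ABC):
    """Minimal contract each participating sub-Grid would implement."""

    @abstractmethod
    def list_resources(self) -> list[dict]:
        """Return available compute resources as attribute dictionaries."""

    @abstractmethod
    def submit(self, job: JobSpec) -> str:
        """Submit a job and return a grid-wide job identifier."""

    @abstractmethod
    def status(self, job_id: str) -> str:
        """Return a coarse job state, e.g. 'queued', 'running', 'done'."""

class JoinedGrid:
    """Collective layer that fans requests out over several sub-Grids."""

    def __init__(self, subgrids: list[SubGrid]):
        self.subgrids = subgrids

    def list_resources(self) -> list[dict]:
        return [r for g in self.subgrids for r in g.list_resources()]

    def submit(self, job: JobSpec) -> str:
        # Naive policy: send the job to the sub-Grid advertising the most resources.
        target = max(self.subgrids, key=lambda g: len(g.list_resources()))
        return target.submit(job)
```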
WP4 Interoperability
Task 4.4 Test applications (months 1-24)
• Run applications across domains
• In practice, implement the EDG RB and other components based on the GLUE schema
• Also working on adding EDG components to SAM??
• CMS, CDF, D0 + …
• So there may be (is) interest from UK physicists in joining
• Not so obvious where the UK effort comes from (SL + ?)
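As a toy illustration of the kind of matchmaking a resource broker performs over GLUE-style compute-element attributes, here is a small sketch. The attribute names follow the GLUE CE conventions of the time, but the entries and the selection policy are invented for this example.

```python
# Toy resource-broker step: pick a compute element from GLUE-style attributes.
# The sample entries and the ranking policy are invented for illustration.
SAMPLE_CE_ENTRIES = [
    {"GlueCEUniqueID": "ce1.example.ac.uk:2119/jobmanager-pbs-short",
     "GlueCEStateFreeCPUs": 12, "GlueCEStateWaitingJobs": 3},
    {"GlueCEUniqueID": "ce2.example.org:2119/jobmanager-lsf-long",
     "GlueCEStateFreeCPUs": 40, "GlueCEStateWaitingJobs": 25},
]

def select_ce(entries, min_free_cpus=1):
    """Filter CEs with enough free CPUs, then rank by shortest waiting queue."""
    candidates = [e for e in entries if e["GlueCEStateFreeCPUs"] >= min_free_cpus]
    if not candidates:
        raise RuntimeError("no matching compute element")
    return min(candidates, key=lambda e: e["GlueCEStateWaitingJobs"])

if __name__ == "__main__":
    print(select_ce(SAMPLE_CE_ENTRIES)["GlueCEUniqueID"])
```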
InterGrid
• Higher-level project
• Much like DataTAG WP4
  • But with more political clout
• BaBar is the UK-designated application
• GLUE came from InterGrid
  • but has been adopted by DataTAG
UK Core Programme
• NeSC, regional centres, Grid Support Centre, GnT
• They have contact and overlap with GridPP, but don't seem over-keen to adopt the EDG architecture
  • They implemented their own national GIIS
  • A much higher percentage of sites is alive on their map
  • Strong network contacts
  • Their database project overlaps with Spitfire
• GridPP needs to work with them to demonstrate we are not a closed community
• There are other UK grids too, but the Core Programme is Tony Hey's
US Experiments
• Likely they will develop into working grids carrying real data
• They mustn't be impacted or held back by EDG or LCG
• Before long they will be demanding real stability
• All strongly driven by the UK now
  • Hopefully this will continue, but others will catch on/up
As Yet Unknown Grids
• Framework 6
  • Should we be bidding (alone or with others) for a UK national grid?
  • Which international projects?
• DataGrid 2
• GridPP 2
The GridPP Grid?
• So what will it become?
GridPP Deployment
[slide figure: GridPP roles]
• Provide architecture and middleware
• Future LHC experiments: use the Grid with simulated data
• Running US experiments: use the Grid with real data
• Build Tier-A/prototype Tier-1 and Tier-2 centres in the UK and join the worldwide effort to develop middleware for the experiments
Monarc Model of Regional Centres
[slide figure: CERN Tier-0 linked at 622 Mbps / 2.5 Gbps to Tier-1 centres (RAL, IN2P3, INFN, FNAL, …), Tier-1 linked at 155/622 Mbps to Tier-2 centres (universities, labs), down to departments and desktops]
LHC Computing Grid
[slide figure (les.robertson@cern.ch): CERN Tier-0/Tier-1 connected to Tier-1 centres in the UK, USA (FermiLab, Brookhaven), France, Italy, Germany, NL, …, with Tier-2 centres, labs, universities, physics departments and desktops below]
Planned RAL Testbed Use
• Testbeds
  • EDG testbeds 1, 2, 3
  • EDG development testbed
  • DataTAG/GRIT/GLUE
  • LCG testbeds
  • other UK testbeds
• Data Challenges
  • Alice, Atlas, CMS and LHCb have confirmed they will use RAL
• Production
  • BaBar and others
Building the GridPP Grid
• EDG, LCG, BaBar, CDF and D0 will deliver a lot of working grid clusters
• In the short term, our task is to build them
• In the medium term, our challenge is to make them inter-working clusters: a GridPP grid
  • Not just for a GridPP identity
  • Also for flexibility, re-use and critical mass
• Momentum and inertia can work for or against you
Components
• Information Services – yes
• Resource Brokers – yes
• VO services – yes
• Network Monitoring – yes
• Fabric Services – no
• Operations – no
• We should build UK expertise and capability in each of these – action?
• And move to a production-quality service
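For the VO-services component above, one concrete (if simplified) building block of that era was the Globus grid-mapfile mapping certificate DNs to local accounts; the sketch below parses one. The file path and any entries are illustrative assumptions, not GridPP configuration.

```python
# Minimal sketch: read a Globus-style grid-mapfile that maps certificate
# subject DNs to local accounts. Path and entries are illustrative only.
import shlex

def load_gridmap(path="/etc/grid-security/grid-mapfile"):
    """Return {certificate DN: local account} parsed from a grid-mapfile."""
    mapping = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            # Each entry is a quoted DN followed by comma-separated accounts.
            parts = shlex.split(line)
            if len(parts) < 2:
                continue
            dn, accounts = parts[0], parts[1].split(",")
            mapping[dn] = accounts[0]
    return mapping

if __name__ == "__main__":
    # Hypothetical entry, for illustration:
    #   "/C=UK/O=eScience/OU=CLRC/L=RAL/CN=some user" pheno001
    for dn, account in load_gridmap().items():
        print(f"{dn} -> {account}")
```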
A GridPP Grid will be good publicity
• The identity will aid the project and its successors
• But having the overview and a common base will allow us to change things quickly and to share resources
• As many sites will be in several Grids, they can use this flexibility just for themselves
Conclusion
• Continue to build our grids, and productionize them
• But keep an eye on the future and plan the uberGrid