GridPP Presentation to PPARC Grid Steering Committee
Steve Lloyd, Tony Doyle, John Gordon
26 July 2001
Outline
• Resource Allocation and Funding Scenarios
• Int'l Financial Comparisons
• Int'l Grid Collaborations - see addendum
• Grid Architecture
• Links with Industry - see addendum
  1. VISTA and GridPP
  2. GridPP monitoring page
• Summary
e-Science Presentation
GridPP Proposal
• GridPP = vertically integrated programme
• = Component Model…
• Provides input for £15-20M funding scenarios…
GridPP Workgroups
Technical work is broken down into ten workgroups, with broad overlap with the EU DataGrid:
A - Workload Management: provision of software that schedules application processing requests amongst resources
B - Information Services and Data Management: provision of software tools to provide flexible, transparent and reliable access to the data
C - Monitoring Services: all aspects of monitoring Grid services
D - Fabric Management and Mass Storage: integration of heterogeneous resources into a common Grid framework
E - Security: security mechanisms, from Certification Authorities to low-level components
F - Networking: network fabric provision through to integration of network services into middleware
G - Prototype Grid: implementation of a UK Grid prototype tying together new and existing facilities
H - Software Support: services to enable the development, testing and deployment of middleware and applications at institutes
I - Experimental Objectives: responsible for ensuring that the development of GridPP is driven by the needs of UK PP experiments
J - Dissemination: ensure good dissemination of developments arising from GridPP into other communities, and vice versa
[Pie chart: allocation of the £21M project (Components 1-4) — CERN staff 27.0%, UK capital 15.3%, Experiment Objectives (I) 11.9%, Prototype Grid (G) 9.7%, CERN hardware 6.8%, Software Support (H* 5.4%, H 3.2%), Dissemination (J) 2.6%, UK managers 1.9%, with the remainder spread across Workgroups A-F.]
Starting Points
• PPARC's EU DataGrid commitments are built in (£2.4M, noted in Table 3) and cannot be reduced
• The CERN component (1/3) scales with the total
• UK Capital (£3.2M, the largest single item) assessed
• then address workgroup allocations…
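The scenario arithmetic above can be sketched as follows. This is an illustrative toy, not the actual GridPP allocation model: the only rules encoded are the ones stated on this slide (the £2.4M EU DataGrid commitment is fixed, the CERN component is one third of the total, and whatever remains is shared between UK capital and the workgroups).

```python
# Illustrative sketch of the funding-scenario arithmetic (assumption:
# only the two rules stated on the slide; the real allocation model
# had many more line items).

EU_DATAGRID = 2.4  # £M, fixed PPARC commitment (cannot be reduced)

def allocate(total):
    """Return an illustrative breakdown (in £M) for a given total budget."""
    cern = total / 3.0                      # CERN component scales as 1/3
    remainder = total - cern - EU_DATAGRID  # left for UK capital + workgroups
    return {
        "EU DataGrid (fixed)": EU_DATAGRID,
        "CERN (1/3 of total)": round(cern, 2),
        "UK capital + workgroups": round(remainder, 2),
    }

for total in (21, 20, 17, 15):
    print(f"£{total}M scenario:", allocate(total))
```

Running this makes the squeeze visible: the fixed commitment and the scaling CERN share mean every £1M cut falls disproportionately on the UK capital and workgroup lines.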
[Pie chart: £20M project scenario — CERN share labelled 96.3%; CERN component reduced from £7.1M to £6.7M and UK capital from £3.2M to £2.9M; workgroup shares as in the £21M chart.]
[Pie chart: £17M project scenario — CERN share labelled 90.0%; Experiment Objectives (I) cut from £2.49M to £1.2M, CERN component £7.1M → £6.7M → £6.0M, UK capital £3.2M → £2.9M → £2.45M.]
Experiment Objectives: 50% reduction? (23 SY)
• Still a vertically integrated programme?
• Broken component model…
• Cut specific experiments, or an overall reduction?
• To be determined by the Experiments Board
Workload/Data Management: 10% reduction? (1.2 SY)
• Reduced long-term programme? e.g. scheduler optimisation (WG A), query optimisation (WG B)
• … or an overall reduction?
[Pie chart: £15M project scenario — Experiment Objectives (I) eliminated (£2.49M → £0), CERN component £5M, UK capital £2.45M.]
Conclusions
• Even a £21M to £20M reduction is not trivial…
• A £17M budget cuts hard into the project, especially the User layer of the Grid architecture: Experimental Objectives (50% reduction)
• Effect of a 10% reduction on other groups, e.g. WG A/B
• Evaluation based on the Component Model: Components 1-4 correspond to Grid exploitation
• A £15M budget is impossible within the Component Model
International Comparisons: PP Grids under development
• US (CMS):
  • Tier-1 at FNAL and 5 Tier-2 centres
  • Prototype built during 2000-04, with full deployment during 2005-07
  • Staff estimates for the Tier-1 centre are 14 FTE by 2003, reaching 35 FTE in 2007
  • Integrated costs to 2006 are $54.7M, excluding GriPhyN and PPDG
• US (ATLAS):
  • Plans very similar to CMS, with costs foreseen to be the same
  • Tier-1 at Brookhaven
• France:
  • Tier-1 RC for all 4 LHC experiments at CC-IN2P3 in Lyon
  • BaBar Tier-A
  • An LHC prototype starting now
  • National Core Grid (2M€/year)
• Italy:
  • INFN National Grid based on EU DataGrid
  • Tier-1 RC and a prototype starting now at CNAF, Bologna
  • 15.9M€ allocated during 2001-03 for Tier-1 hardware alone
  • Tier-1 staff rising to 25 FTE by 2003
  • 10 Tier-2 centres at 1M€/year
• Germany:
  • Tier-1 starting up at Karlsruhe
  • BaBar Tier-B at Karlsruhe
  • Tier-2 for ALICE at Darmstadt
  • No National Grid - project led
International Comparisons Summary - different countries, different models
• France & Germany budget for hardware and assume staff
• Italy - significant investment in both hardware and staff
• US - funds split between Tier-1/2 centres, universities, infrastructure, and R&D
• Italy > UK ~ France ~ Germany ~ US (GriPhyN, PPDG and iVDGL characteristics within GridPP: a single UK programme)
GridPP Architecture
• GGF architecture via EU DataGrid, incorporating PPDG and GriPhyN developments
• EU DataGrid Architecture status: Version 2 (2 July 2001) - "The DataGrid Architecture, Version 2", German Cancio (CERN), Steve M. Fisher (RAL), Tim Folkes (RAL), Francesco Giacomini (INFN), Wolfgang Hoschek (CERN), Dave Kelsey (RAL), Brian L. Tierney (LBL/CERN)
• Key elements:
  • Evolutionary capability
  • Services via protocols and client APIs
  • Representation using UML (TogetherSoft)
  • Defines responsibilities of Work Packages
• Based on PP use cases - direct application to GridPP
• GridPP (and PPDG) project management using Xproject = implementation to defined timescales
GridPP and VISTA
• Astrogrid will federate VISTA data with other large databases
  • requires that VISTA data has already been processed, and that catalogues and images are available
• VISTA has a proposal (e-VPAS) that concentrates on producing the databases on which the Astrogrid tools will work. This work has much in common with GridPP:
  • a similar timescale
  • large data flows from one remote site
  • many distributed users
  • reprocessing of data
  • utilisation of distributed computing resources
• GridPP has started discussions with VISTA and EPCC in order to collaborate and share expertise and middleware
GridPP Monitoring Page
• Various sites now set up with UK Globus certificates
• Grid monitoring:
  • Polls Grid test-bed sites via the globus-job-run command
  • Runs a basic script producing XML-encoded status information
  • Load average and timestamp information is retrieved
  • Current status and archived load information is plotted…
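The polling pipeline described above might look like the following sketch. The XML element names (`load`, `timestamp`), the remote script path, and the host name are assumptions for illustration — the actual GridPP monitoring scripts are not shown in this presentation; `globus-job-run` is the real Globus command for executing a job on a remote gatekeeper.

```python
# Sketch of the monitoring poll: run a status script on a remote
# test-bed site via globus-job-run, then parse its XML output.
# The script path and XML schema are hypothetical.
import subprocess
import xml.etree.ElementTree as ET

def fetch_status(host, script="/opt/gridpp/grid-status.sh"):
    """Run the (hypothetical) status script on a remote gatekeeper and
    return its XML output as text."""
    out = subprocess.run(["globus-job-run", host, script],
                         capture_output=True, text=True, timeout=60)
    return out.stdout

def parse_status(xml_text):
    """Extract the load average and timestamp from an XML status report."""
    root = ET.fromstring(xml_text)
    return {"load": float(root.findtext("load")),
            "timestamp": root.findtext("timestamp")}

# Example of the kind of status document a site script might emit:
sample = ("<status><load>0.42</load>"
          "<timestamp>2001-07-26T12:00:00</timestamp></status>")
print(parse_status(sample))
```

Separating the remote call (`fetch_status`) from the parsing (`parse_status`) keeps the parser testable offline; the parsed records would then be archived and plotted as the slide describes.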
Summary
• A balanced exploitation programme costs £21M
• £20M-£17M-£15M 3-year funding scenarios examined:
  • £20M = maintains a balanced programme
  • £17M = significantly reduced experimental objectives
  • £15M = eliminates experimental objectives
• Programme optimisation depends on the funding allocation
• International comparisons: Italy > UK ~ France ~ Germany ~ US (GriPhyN, PPDG and iVDGL characteristics within GridPP: a single UK programme)
• Individual contributions to GGF architecture via a leading role in EU DataGrid, combined with strong links to GriPhyN, PPDG and iVDGL
• Industry links: emphasis on partnership
• Exploring a common development programme with VISTA
• Demonstrators in development