DataTAG / WP4 meeting, CERN, 29 January 2002: Agenda
• Start at 10.30
• Project introduction, Olivier Martin
• WP4 introduction, Antonia Ghiselli
• Security/authorization: interoperability issues, Roberto Cecchini
• Relationship with applications, testbed and EU-US InterGrid initiative (task 4.4), Mirco Mazzucato
• Discussion on:
• WP4 organization: man-power, EU-US testbed sites, DataGrid middleware and applications contacts
• Proposal of DataTAG/WP4 to US InterGrid partners
• End at 13.30
A.Ghiselli, INFN-CNAF
DataTAG – WP4, A.Ghiselli, CERN, 29 January 2002
A.Ghiselli, INFN-CNAF
WP4 objective
• Interoperability between EU and US Grid services from DataGrid, GriPhyN and PPDG, in collaboration with iVDGL, over an EU-US testbed.
A.Ghiselli, INFN-CNAF
iVDGL
• International Virtual-Data Grid Laboratory
• A place to conduct Data Grid tests at scale
• Concrete manifestation of world-wide Grid activity
• Continuing activity that will drive Grid awareness
• Scale of effort
• For national and international scale Data Grid tests and operations
• Computation and data intensive computing
• Who
• Initially US-UK-Italy-EU; Japan, Australia
• & Russia, China, Pakistan, India, South America?
• StarLight and other international networks vital
A.Ghiselli, INFN-CNAF
U.S. Co-PIs: Avery, Foster, Gardner, Newman, Szalay
iVDGL map, circa 2003-2004
[Figure: world map of iVDGL sites. Legend: Tier0/1, Tier2 and Tier3 facilities; 10 Gbps, 2.5 Gbps, 622 Mbps and other links.]
A.Ghiselli, INFN-CNAF
U.S. Co-PIs: Avery, Foster, Gardner, Newman, Szalay
DataGrid and Grid architecture
• Applications Layer: from traditional batch applications up to Grid-enabled applications (WP8)
• Collective Layer (application-oriented services): workload management (WP1), data management / Data Mover (WP2), monitoring tools and application monitoring (WP3), ...
• Resource and Connectivity Layer (resource- and application-independent services): resource management (GRAM), security (GSI), information, data access, fault detection, communication protocols (Globus Toolkit)
• Grid Fabric Layer: PC-cluster schedulers (WP4), data servers (WP5), network services (QoS), optical technology, remote monitoring
(A sketch of the layering follows.)
A.Ghiselli, INFN-CNAF
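To make the layering concrete, here is a minimal, hypothetical Python sketch of how a request traverses the layers above; every name in it is invented for illustration and does not correspond to any real DataGrid or Globus API.

```python
# Hypothetical sketch of the Globus-style layered model on this slide.
# All names are invented; this is not a real DataGrid/Globus API.

LAYERS = {
    "applications": ["batch and Grid-enabled applications (WP8)"],
    "collective": ["workload management (WP1)",
                   "data management / Data Mover (WP2)",
                   "monitoring (WP3)"],
    "resource+connectivity": ["GRAM", "GSI", "information",
                              "data access", "fault detection"],
    "fabric": ["PC-cluster schedulers (WP4)", "data servers (WP5)",
               "network services (QoS)"],
}

def submit(job: str) -> None:
    """Trace a job request down through the layers, top to bottom."""
    print(f"submitting {job!r} through the stack:")
    for layer in ("applications", "collective",
                  "resource+connectivity", "fabric"):
        print(f"  {layer:>22}: {', '.join(LAYERS[layer])}")

submit("reconstruction job")
```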
DataGrid Release 1 services
[Figure: Release 1 service layout. Components shown: User Interface (UI), Resource Broker (RB), Information Index (II), Logging & Bookkeeping (LB), MDS, Replica Catalogue, Storage Element with GDMP (SE), Computing Element / Worker Nodes on a PC cluster (CE/WN), LCFG server, VO user LDAP server, CA LDAP tree, mkgridmap, CVS, Grid resources, AUP.]
(A sketch of the mkgridmap flow follows.)
A.Ghiselli, INFN-CNAF
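As a rough illustration of the mkgridmap step in the figure, the sketch below queries a VO membership LDAP server and emits grid-mapfile entries. It assumes the third-party ldap3 Python library; the host, base DN and the attribute holding the certificate subject are invented, since the real VO-LDAP schema may differ, and the real mkgridmap tool does considerably more (AUP checks, multiple VOs, exclusion lists).

```python
# Minimal sketch of a mkgridmap-like step: pull member certificate
# subjects from a VO LDAP server and write grid-mapfile lines.
# Assumptions: ldap3 library; the host, base DN and the 'description'
# attribute are hypothetical placeholders.
from ldap3 import Server, Connection

VO_LDAP_HOST = "ldap://grid-vo.example.org"           # hypothetical
VO_BASE_DN = "ou=testbed1,o=alice,dc=example,dc=org"  # hypothetical
LOCAL_ACCOUNT = ".alice"  # leading dot = map to a pool account

server = Server(VO_LDAP_HOST)
conn = Connection(server, auto_bind=True)  # anonymous bind
conn.search(VO_BASE_DN, "(objectClass=*)", attributes=["description"])

with open("grid-mapfile", "w") as mapfile:
    for entry in conn.entries:
        for subject in entry.description:
            # grid-mapfile format: "<certificate subject>" <account>
            mapfile.write(f'"{subject}" {LOCAL_ACCOUNT}\n')
```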
Present Grid Testbed 1 from the VO perspective
[Figure: Testbed 1 layout. VOs: Alice, Atlas, CMS, EO, INFN. Sites: CERN, CNAF, Nikhef, RAL, Lyon, Padova (PD), Torino (TO), Catania (CT), Roma. Components: RB, II, RC, UI, CA.]
• RBs with different resource scope (2 Grid domains)
• Some VOs share all CEs and SEs of the 2 domains
• Other VOs have access to only one RB (1 Grid domain)
A.Ghiselli, INFN-CNAF
VO real framework for the LHC experiments
[Figure: per-VO view with Tier0/1/2 centres (CERN/T0,T1; multiple T1 and T2 sites), per-VO RB/II/LB stacks, UIs, Replica Catalogues and VO authorization services for Alice, Atlas, CMS and EO.]
• CE, SE partially shared by all the VOs
• CE, SE not shared
• RBs per VO, with VO resource scope
• UIs per VO
• VO authorization service
• Resource discovery
(A sketch of the VO resource-scope rule follows.)
A.Ghiselli, INFN-CNAF
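A minimal Python sketch of the resource-scope idea on this slide: each VO's broker only matches resources that the VO is authorized to use, whether shared by several VOs or VO-private. The data model and all hostnames are invented for illustration.

```python
# Sketch of per-VO resource scope: a VO's RB only "sees" the CEs/SEs
# whose authorization lists include that VO. Data model invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class Resource:
    name: str
    vos: frozenset  # VOs authorized on this CE/SE

RESOURCES = [
    Resource("ce.cnaf.example.org", frozenset({"alice", "atlas", "cms"})),
    Resource("ce.cern.example.org", frozenset({"alice", "atlas", "cms", "eo"})),
    Resource("se.t2-alice.example.org", frozenset({"alice"})),  # VO-private
]

def resource_scope(vo: str) -> list:
    """Return the resources the given VO's broker may match against."""
    return [r.name for r in RESOURCES if vo in r.vos]

print(resource_scope("alice"))  # shared resources plus the Alice-private SE
print(resource_scope("eo"))     # only the CE on which EO is authorized
```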
WP4 tasks
Assuming the same basic Grid services (GRAM, GSI, GRIS) across the different Grid projects, the main issues are:
• 4.1 Resource discovery (INFN 2 FTE, PPARC 0.5 FTE)
• 4.2 Security/authorization (INFN 4 FTE, 2 new; UvA 1 FTE)
• 4.3 Interworking of collective services (INFN 5 FTE, 3 new; PPARC 0.5 FTE) on the EU-US testbed
• 4.4 Test applications (INFN 4-5 FTE, 2 new; PPARC unfunded; ESA/ESRIN unfunded)
(A resource-discovery sketch follows.)
A.Ghiselli, INFN-CNAF
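For task 4.1, resource discovery in Globus 2.x meant querying the LDAP-based MDS: each resource ran a GRIS (conventionally on port 2135 under the base DN "mds-vo-name=local, o=grid"), optionally aggregated by a GIIS. The sketch below issues such a query with the ldap3 Python library; the hostname is hypothetical, and a wildcard filter is used because the exact object classes and attributes depend on the installed MDS schema.

```python
# Sketch of task 4.1 style resource discovery against a Globus 2.x
# GRIS/GIIS (LDAP-based MDS). The host is hypothetical; port 2135 and
# the "mds-vo-name=local, o=grid" base were the MDS conventions.
from ldap3 import Server, Connection

GRIS_HOST = "ldap://ce.example.org:2135"  # hypothetical CE
MDS_BASE = "mds-vo-name=local, o=grid"

conn = Connection(Server(GRIS_HOST), auto_bind=True)  # anonymous bind
conn.search(MDS_BASE, "(objectClass=*)", attributes=["*"])

for entry in conn.entries:
    # Each entry describes one aspect of the resource (host, CPU, queue, ...).
    print(entry.entry_dn)
```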
WP4 steps
• LHC applications: extend existing use cases (EDG WP8) to test interoperability of the Grid components.
• Set up a GriPhyN and PPDG lab environment in the EU, and a DataGrid lab environment in the US, to test and analyze their services as different Grid domains.
• Define a few testbed sites and the man-power.
• Identify the concrete Grid components that will be the targets of interoperability development efforts.
• The previous steps provide input to an architectural schema collecting all the services, their relationships, interfaces and protocols, following the Globus layered model.
• Produce an assessment of interoperability solutions.
• EU-US integrated Grid deployment.
A.Ghiselli, INFN-CNAF
iVDGL - DataTAG - DataGrid map, 2002-2003
[Figure: combined site map. Legend: Tier0/1, Tier2 and Tier3 facilities; 10 Gbps, 2.5 Gbps, 622 Mbps and other links; the DataTAG link.]
A.Ghiselli, INFN-CNAF
The objectives of this meeting
• To establish a working relationship with DataGrid:
• middleware WPs, WP6 and the ATF
• applications willing to run on the EU-US testbed (sites definition)
• Definition of the initial EU work program to discuss with the US counterpart (e.g. step 1: configure EU sites with GriPhyN and PPDG software, and US sites with DataGrid software).
• Define the US counterpart for DataTAG.
• To agree on the technical coordination with the EU-US InterGrid initiative.
A.Ghiselli, INFN-CNAF
DataTAG WP4 meeting outcomes
• Agreement on the WP4 work-program steps (slide 11)
• Relationship with DataGrid:
• WPs and the ATF
• applications: WP8
• US counterpart for DataTAG: iVDGL
• Implement direct contacts between WP4 and iVDGL
• Agree on the technical coordination with the EU-US InterGrid initiative.
A.Ghiselli, INFN-CNAF