The experience with LCG GRID in Russia
E.Lyublev, A.Selivanov, B.Zagreev (ITEP, Moscow)
November 3, 2005
ITEP history
• founded on December 1, 1945
• the heavy-water reactor started operation in 1949
• in 1961 the 7-GeV proton synchrotron started operating; it was the first Russian proton accelerator to use the strong-focusing principle
• today ITEP is a Russian scientific center devoted to nuclear physics and the physics of elementary particles
• the Institute occupies the grounds of the old eighteenth-century "Cheremushki" estate
ITEP – Alikhanov Institute for Theoretical and Experimental Physics, a Russian Federation State Scientific Center
ITEP in winter
Research program
Particle and nuclear physics:
• theoretical studies
• experimental research:
  • at the ITEP accelerator
  • at CERN, FNAL, DESY, KEK and other international centers
  • 2β (double-beta) decay (Ge, Mo, Xe …)
Research program (continued)
• Low-energy physics and chemistry
• Accelerator techniques
• Nuclear power facilities
• Medical physics
Details: www.itep.ru
International collaboration
• DESY (Hamburg) – ARGUS, H1, HERA-B
• CERN (Geneva) – AMS, CHORUS, L3, ATLAS, ALICE, CMS, LHCb
• FNAL (Batavia) – D0, E781 (SELEX)
• GSI (Darmstadt) – CBM
Russian participation in EGEE/LCG: RDIG – Russian Data Intensive Grid
RDIG member institutes: PNPI, JINR, KIAM, ITEP, SINP, RRC KI, IHEP, IMPB
ITEP EGEE/LCG production cluster
ITEP EGEE/LCG hardware
• UI – user interface
• CE – computing element
• SE – storage element
• WNs – worker nodes for the batch system
• Mon – R-GMA monitoring server
• VO Box – server of the Virtual Organization
• RDIG user support server
• LFC – LCG File Catalog
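To make the division of labour between these components concrete, here is a minimal sketch of a job submission from the UI, assuming the standard LCG-2 command-line tools are installed; the JDL contents and file names are illustrative only:

```python
import subprocess

# Illustrative JDL for the ALICE VO; the attribute names follow the
# standard LCG-2 JDL syntax, the job itself is a trivial placeholder.
JDL = '''Executable          = "/bin/hostname";
StdOutput           = "std.out";
StdError            = "std.err";
OutputSandbox       = {"std.out", "std.err"};
VirtualOrganisation = "alice";
'''

# Write the JDL on the UI and submit it; the Resource Broker matches the
# job to a CE, whose local batch system runs it on a WN.
with open("hostname.jdl", "w") as f:
    f.write(JDL)

result = subprocess.run(["edg-job-submit", "hostname.jdl"],
                        capture_output=True, text=True)
print(result.stdout)  # on success this contains the job identifier
```

The identifier printed by edg-job-submit can then be passed to edg-job-status to follow the job through the CE and batch system.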
ITEP LCG parameters
• OS – SLC 3.0.5
• MW – LCG-2.6.0-9
• Batch system – PBS with Maui
• WNs – P4 (HT) 2.4 GHz, 1 GB RAM, 80 GB disk
• SE:
  • Classic SE – 1 TB
  • dCache/SRM – 4 TB
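The parameters a site publishes can be cross-checked from any UI; a minimal sketch, assuming the standard lcg-infosites tool is available (the exact output format varies between releases):

```python
import subprocess

# Query the information system for the CEs and SEs published for the
# ALICE VO; the ITEP resources should appear with the capacities above.
for resource in ("ce", "se"):
    out = subprocess.run(["lcg-infosites", "--vo", "alice", resource],
                         capture_output=True, text=True)
    print(out.stdout)
```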
Network
• ITEP network backbone – 1 Gbit Ethernet
• ITEP LAN – 100 Mbit Ethernet, plus wireless
• WAN – 1 Gbit channel (RAS), 100 Mbit channel (MSU)
Application SW
• ALICE – AliEn 2_4 (VO Box) + AliRoot, ROOT, xrootd …
• ATLAS – VO-atlas-release-10.0.4
• CMS – OSCAR_3_6_5, ORCA_8_7_1, CMKIN_4_4_0_dar
• LHCb – Gaudi v15r5, DaVinci v12r11
Monitoring & statistics
• GOC
• GridICE
• MonALISA
• Farm statistics
• Network statistics
GridICE
RDIG Monitoring
RDIG User Support
ALICE DC04 statistics
DC04 Summary
• About 7000 jobs were successfully completed at the Russian AliEn sites in 2004, ~4% of the total ALICE statistics; job efficiency was about 75%
• Clearly visible participation in the ALICE and LHCb Data Challenges: ITEP's share ~70%, SE ~1.7 TB
DC05 – to be continued…
Timeline of PDC05/SC3 (Aug–Dec 2005)
• Phase 1 – event production, with job submission through the LCG interface
• SC3 – start of the service phase
• Phase 2 – ALICE data 'push': reserved/shared bandwidth, test of FTS
• Phase 3 – prototype data analysis
Participating in ALICE SC3 with all experiment-specific SW
AliEn (ALICE Environment)
• The AliEn framework has been developed as the ALICE user entry point into the Grid world, shielding users from its underlying complexity and heterogeneity. Through interfaces it can transparently use the resources of different Grids (the LCG and INFN Grids). In the future, this cross-Grid functionality will be extended to cover other Grid flavours.
• The system is built around Open Source components and uses a Web Services model and standard network protocols. Less than 5% is native AliEn code (Perl).
• No other Grid flavour provides a complete solution for the ALICE computing model, and each offers a different user interface and a diverse spectrum of functionality.
• Therefore some of the AliEn services will continue to be used as ALICE's single point of entry to the computing resources of other Grids and as a complement to their functionality. Foreign Grids will be accessed via interfaces.
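The "single entry point behind which the Grid flavours are hidden" idea can be illustrated with a toy sketch; all class and method names here are hypothetical, and the real AliEn is written in Perl and far more elaborate:

```python
# Toy illustration of AliEn's role as a single user entry point that
# hides the differences between Grid flavours behind one interface.

class GridBackend:
    """Common interface behind which each Grid flavour is hidden."""
    def submit(self, jdl: str) -> None:
        raise NotImplementedError

class NativeAliEnBackend(GridBackend):
    def submit(self, jdl: str) -> None:
        # would place the job in the central AliEn task queue
        print("AliEn: job queued in the central task queue")

class LCGBackend(GridBackend):
    def submit(self, jdl: str) -> None:
        # would hand the job to the LCG Resource Broker via the LCG UI
        print("LCG: job passed to the Resource Broker")

def submit(jdl: str, backend: GridBackend) -> None:
    """The user makes one call; the Grid flavour stays behind the interface."""
    backend.submit(jdl)

submit('Executable = "/bin/hostname";', LCGBackend())
```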
AliEn services structure (diagram)
• Central services: catalogue, task queue, job optimization, etc.
• Site services: AliEn CE/SE at each site, handling file registration in the catalogue
• Job submission path: LCG UI → LCG RB → LCG CEs, with data stored on the LCG SE/SRM
ALICE interface to LCG
• Through a VO-Box provided at the site
• Full LCG UI mapping
• AliEn services (Cluster Monitor, CE, SES, MonALISA, PackMan, xrootd)
• VO-Box requirements published at: https://uimon.cern.ch/twiki/pub/LCG/ALICEResourcesAndPlans/alice_vobox_requirements.doc
ITEP LCG site as a Tier2 in SC3 (ALICE)
• LCG 2.6
• FTS client
• dCache SE with SRM
• LFC
• xrootd protocol
• AliEn 2_4
Connectivity with the Tier1 centers is an issue!
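As a sketch of the data path exercised in SC3, a file can be copied to the SRM-enabled SE and registered in the LFC in one operation using the lcg_util tools; the SE host name and logical file name below are hypothetical placeholders:

```python
import subprocess

# Copy a local file to the dCache/SRM storage element and register it in
# the LCG File Catalogue under a logical file name, in a single operation.
cmd = [
    "lcg-cr", "--vo", "alice",
    "-d", "se.itep.ru",                        # destination SE (hypothetical)
    "-l", "lfn:/grid/alice/itep/test/file1",   # logical name in the LFC
    "file:/tmp/file1",                         # local source file on the UI
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout)  # on success: the GUID of the new catalogue entry
```

A replica can then be fetched back by logical name with lcg-cp, which resolves the LFN through the LFC.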
Non-LHC GRID activity
• The Russian VO PHOTON was organized in 2005 for SELEX colleagues
• A regional centre for AMS (VO in preparation)
• Collaboration with the CBM project (GSI, Darmstadt)
• The ITEP theory department is very interested
Summary & plans
• Ready for PDC05/SC3
• Further support of the LHC experiments' Data Challenges is becoming a routine task that runs automatically
• Significantly increase the power of the ITEP farm in 2006 (the current installation occupies only ~5% of the available infrastructure)
• Concentrate on distributed analysis; connectivity with the Tier1 centers is an issue!