Design and implement the data and computing infrastructure for the cutting-edge ELI-NP facility in Romania, enabling frontier fundamental physics research. Serve 800-1000 users annually, produce 2-3 PB of data, and ensure secure data storage and transfer using innovative technologies.
Data and Computing for ELI-NP
Mihai Ciubancan, ELI-NP, Romania
EGI Workshop, Amsterdam, May 2019
Outline • Background • Users • Current Status • Discussions
Background • The ELI-NP facility will consist of two components: • A very high intensity laser system, with two 10 PW laser arms able to reach intensities of 10²³ W/cm² and electric fields of 10¹⁵ V/m • A very intense, brilliant γ beam with narrow bandwidth and Eγ of up to 19.5 MeV, obtained by incoherent Compton back-scattering of laser light off a very brilliant, intense, classical electron beam. This infrastructure will create a new European laboratory with a broad range of science covering frontier fundamental physics, new nuclear physics and astrophysics, as well as applications in nuclear materials, radioactive waste management, material science and life sciences. The first experiments are expected to run next year.
Background • I am in charge of the design and implementation of the data and computing infrastructure for ELI-NP • 15 years of experience in WLCG • Involved in the EGEE and SEE-GRID projects • In charge of a Tier-2 grid site dedicated to the ALICE, ATLAS and LHCb experiments, part of the LHCONE network • Responsible for the VOMS server dedicated to ELI-NP (eli-np.eu VO)
Users • ELI-NP is foreseen to host 800-1000 users per year, with beam time of up to 250 days/year • Around 2-3 PB/year of data will be produced (raw, simulation, etc.) • A copy of the data will be kept locally • The data will be archived, preserving it for an indefinite period of time • User authentication will be based on X.509 certificates, together with VOMS for authorization, following the Virtual Organization (VO) paradigm • For validation purposes we envisage integrating the cluster into the EGI community
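A quick back-of-envelope check of what these figures imply for storage and data ingest; this is a sketch using the upper estimates from the slide (3 PB/year over 250 beam days), with decimal units and illustrative variable names:

```python
# Back-of-envelope estimate of the sustained ingest rate implied by the
# ELI-NP data volume figures (2-3 PB/year over up to 250 beam days/year).
# Assumes decimal units (1 PB = 10**15 bytes); all names are illustrative.

PB = 10**15                      # bytes per petabyte (decimal)
data_per_year_pb = 3             # upper estimate from the slides
beam_days_per_year = 250         # maximum beam time per year

bytes_per_year = data_per_year_pb * PB
seconds_of_beam = beam_days_per_year * 24 * 3600

ingest_rate_mb_s = bytes_per_year / seconds_of_beam / 10**6
print(f"Average sustained ingest rate: {ingest_rate_mb_s:.0f} MB/s")

# Keeping a local copy doubles the raw storage footprint per year:
total_storage_pb = 2 * data_per_year_pb
print(f"Storage needed per year (with local copy): {total_storage_pb} PB")
```

Averaging over beam days only (rather than the whole year) gives a conservative figure for the rate the storage system must sustain while experiments run.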
Users • Non-local users will be able to access and transfer their data from outside the facility
Current Status • As mentioned before, we have experience with tools provided by the WLCG, EGI and NorduGrid communities • We have experience with storage systems such as DPM and EOS, which are in production and serve the LHC experiments • An HPC cluster for the ELI-NP users is in production • Users are authenticated through the ELI-NP VO (eli-np.eu), hosted on a local VOMS server • We plan to deploy and integrate an EOS instance dedicated to the HPC cluster, and an FTS3 server for file transfer tests • We also plan to deploy a CVMFS server as a software repository
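Once the CVMFS server is deployed, client nodes need only a few lines of configuration to mount the software repository. A minimal sketch of a client-side configuration file, assuming a hypothetical repository name and proxy host (the keys shown are standard CVMFS client settings):

```
# /etc/cvmfs/default.local on a client node (sketch; the repository name
# "software.eli-np.example" and the proxy host are hypothetical)
CVMFS_REPOSITORIES=software.eli-np.example
CVMFS_HTTP_PROXY="http://squid.example:3128"
CVMFS_QUOTA_LIMIT=20000   # local cache size in MB
```

With this in place, the repository appears read-only under /cvmfs/ on every node, so the same software stack is visible to the whole HPC cluster without local installation.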
Discussions • Could EGI share their experience with DIRAC? • Does EGI have experience with FTS3? If so, can EGI share it? • Is EGI interested in offering support for setting up a test bed (for data transfers)? • Discussions on EGI AAI solutions • Does the EGI community have experience with HPC? More precisely, does it have experience with storage systems for HPC?
Acknowledgements EUROPEAN UNION GOVERNMENT OF ROMANIA Structural Instruments 2007-2013 I would like to acknowledge the support from the Extreme Light Infrastructure Nuclear Physics (ELI-NP) Phase II, a project co-financed by the Romanian Government and the European Union through the European Regional Development Fund - the Competitiveness Operational Programme (1/07.07.2016, COP, ID 1334). I would also like to acknowledge the BMBF (05P18PKEN9) for partially supporting this work.