Data and Computing for ELI-NP Mihai Ciubancan ELI-NP, Romania Amsterdam | EGI workshop May, 2019
Outline • Background • Users • Current Status • Discussions
Background The ELI-NP facility will consist of two components: • A very high intensity laser system, with two 10 PW laser arms able to reach intensities of 10²³ W/cm² and electrical fields of 10¹⁵ V/m • A very intense, brilliant γ beam with narrow bandwidth and Eγ up to 19.5 MeV, obtained by incoherent Compton backscattering of laser light off a very brilliant, intense, classical electron beam. This infrastructure will create a new European laboratory with a broad range of science covering frontier fundamental physics, new nuclear physics and astrophysics, as well as applications in nuclear materials, radioactive waste management, material science and life sciences. The first experiments are expected to run next year.
Background • I’m in charge of the design and implementation of the data and computing infrastructure for ELI-NP • 15 years of experience in WLCG • Involved in the EGEE and SEE-GRID projects • In charge of a Tier-2 grid site dedicated to the ALICE, ATLAS and LHCb experiments, part of the LHCONE network • Responsible for the VOMS server dedicated to ELI-NP (eli-np.eu VO)
Users • ELI-NP is foreseen to host 800-1000 users per year, with beam time of up to 250 days/year • Around 2-3 PB/year of data (raw, simulation, etc.) will be produced • A copy of the data will be kept locally • The data will be archived, preserving it for an indefinite period of time • User authentication will be based on x509 certificates, together with VOMS for authorization, following the Virtual Organization (VO) paradigm • For validation purposes we envisage integrating the cluster into the EGI community
Users • Non-local users will be able to access and transfer their data from outside the facility
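The x509 + VOMS workflow described above is the standard grid one; a minimal command-line sketch of what a user session might look like (assuming a valid user certificate in ~/.globus and the eli-np.eu VOMS endpoint already configured on the client — this is an illustration, not a tested recipe) is:

```shell
# Create a short-lived proxy certificate carrying eli-np.eu
# VO membership; the VOMS server adds the VO attributes (FQANs)
voms-proxy-init --voms eli-np.eu

# Inspect the resulting proxy: subject, remaining lifetime,
# and the VO attributes granted by the VOMS server
voms-proxy-info --all
```

Services such as the storage elements and the FTS3 server then authorize requests against the VO attributes embedded in this proxy, rather than maintaining per-user accounts.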
Current Status • As mentioned before, we have experience with tools provided by the WLCG, EGI and NorduGrid communities • We have experience with storage systems such as DPM and EOS, which are in production and serve the LHC experiments • An HPC cluster for the ELI-NP users is in production • Users are authenticated through the ELI-NP VO (eli-np.eu), hosted on a local VOMS server • Planned: deploy and integrate an EOS instance dedicated to the HPC cluster, and an FTS3 server for file transfer tests • Planned: deploy a CVMFS server as a software repository
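The planned FTS3 and CVMFS pieces could be exercised along these lines. This is a sketch only: the FTS3 endpoint, storage URLs, squid proxy and repository name below are placeholders for illustration, not real ELI-NP services:

```shell
# Submit a managed third-party transfer through FTS3
# (requires a valid VOMS proxy; endpoint and URLs are hypothetical)
fts-rest-transfer-submit \
  -s https://fts3.example.org:8446 \
  root://eos-source.example.org//eos/elinp/raw/run001.dat \
  root://eos-dest.example.org//eos/elinp/archive/run001.dat

# Client-side CVMFS configuration on a worker node, e.g. in
# /etc/cvmfs/default.local (repository name and proxy assumed)
# CVMFS_REPOSITORIES=software.eli-np.eu
# CVMFS_HTTP_PROXY="http://squid.example.org:3128"
```

FTS3 then queues, retries and reports the transfer asynchronously, which is what makes it suitable for the bulk data movement and test-bed scenarios raised in the next slide.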
Discussions • Could EGI share their experience with DIRAC? • Does EGI have experience with FTS3? Could EGI share it? • Is EGI interested in offering support for setting up a test bed (for data transfers)? • Discussions on EGI AAI solutions • Does the EGI community have experience with HPC? More precisely, with storage systems for HPC?
Acknowledgements EUROPEAN UNION GOVERNMENT OF ROMANIA Structural Instruments 2007-2013 I would like to acknowledge the support from the Extreme Light Infrastructure Nuclear Physics (ELI-NP) Phase II, a project co-financed by the Romanian Government and the European Union through the European Regional Development Fund - the Competitiveness Operational Programme (1/07.07.2016, COP, ID 1334). I would like to acknowledge also the BMBF (05P18PKEN9) for partially supporting this work.
THANK YOU