University user perspectives of the ideal computing environment and SLAC's role
Bill Lockman

Outline:
• View of the ideal computing environment
• ATLAS Computing Structure
• T3 types and comparisons
• Scorecard
My view of the ideal computing environment
• Full system support by a dedicated professional
  • hardware and software (OS and file system)
• High-bandwidth access to the data at the desired level of detail
  • e.g., ESD, AOD, summary data and conditions data
• Access to all relevant ATLAS software and grid services
• Access to compute cycles equivalent to purchased hardware
• Access to additional burst cycles
• Access to ATLAS software support when needed
• Conversationally close to those in the same working group
  • preferably face to face

These are my views, derived from discussions with Jason Nielsen, Terry Schalk (UCSC), Jim Cochran (Iowa State), Anyes Taffard (UCI), Ray Frey, Eric Torrence (Oregon), Gordon Watts (Washington), Richard Mount, Charlie Young (SLAC)

SLUO/LHC workshop Computing Session, Bill Lockman
ATLAS Computing Structure
• ATLAS has a world-wide tiered computing structure in which ~30 TB of raw data/day from ATLAS is reconstructed, reduced and distributed to end users for analysis:
  • T0: CERN
  • T1: 10 centers world-wide (US: BNL); no end-user analysis
  • T2: some end-user analysis capability at 5 US centers, 1 located at SLAC
  • T3: end-user analysis at universities and some national labs
• See the ATLAS T3 report: http://www.pa.msu.edu/~brock/file_sharing/T3TaskForce//final/TierThree_v1_executiveFinal.pdf
Data Formats in ATLAS
[Slide figure: table of ATLAS data formats, including Derived Physics Data, with approximate per-event sizes of ~25 kB/event, ~30 kB/event and ~5 kB/event; which size belongs to which format is not recoverable from the text.]
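To see what per-event sizes on this scale imply for local storage, a back-of-envelope estimate can help. This sketch is illustrative only: the event counts and the assignment of a size to a particular derived format are assumptions, not figures from the slide.

```python
def dataset_size_tb(n_events: float, kb_per_event: float) -> float:
    """Total dataset size in TB (using 1 TB = 1e9 kB) for
    n_events events at kb_per_event kilobytes per event."""
    return n_events * kb_per_event / 1e9

# Hypothetical example: one billion events in a ~5 kB/event derived format
print(dataset_size_tb(1e9, 5))    # 5.0 TB

# Hypothetical example: 200 million events at ~25 kB/event
print(dataset_size_tb(2e8, 25))   # 5.0 TB
```

Either scenario lands at roughly 5 TB, i.e. comfortably within the disk capacity of a small university cluster, which is part of the case for T3-scale analysis on reduced formats.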
Possible data reduction chain
(a possible scenario for the "mature" phase of the ATLAS experiment)
T3g
• T3g: Tier3 with grid connectivity (a typical university-based system):
  • Tower- or rack-based
  • Interactive nodes
  • Batch system with worker nodes
  • ATLAS code available (in kit releases)
  • ATLAS DDM client tools available to fetch data (currently dq2-ls, dq2-get)
  • Can submit grid jobs
  • Data storage located on worker nodes or dedicated file servers
• Possible activities: detector studies from ESD/pDPD, physics/validation studies from D3PD, fast MC, CPU-intensive matrix element calculations, ...
A university-based ATLAS T3g
• Local computing is key to producing physics results quickly from reduced datasets
• Analyses/streams of interest at the typical university:
• CPU and storage needed for the first 2 years:
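A CPU estimate of the kind behind the sizing above can be sketched as follows. The per-event processing time and core count here are hypothetical placeholders, and perfect parallel scaling is an idealization:

```python
def cpu_days(n_events: float, sec_per_event: float, n_cores: int) -> float:
    """Wall-clock days to process n_events at sec_per_event seconds each
    on n_cores, assuming perfect parallel scaling (an idealization)."""
    return n_events * sec_per_event / n_cores / 86400.0

# Hypothetical example: 1e9 events at 1 s/event on a 100-core T3g
print(round(cpu_days(1e9, 1.0, 100), 1))   # ~115.7 days
```

The point of such an estimate is to check whether a planned analysis pass fits in the group's time budget, or whether burst cycles from a larger facility are needed.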
A university-based ATLAS T3g
• Requirements are matched by a rack-based system from the T3 report:
  • 10 kW heat, 320 kSI2K processing
• The university has a 10 Gb/s network to the outside; the group will locate the T3g near the campus switch and interface directly to it
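The value of that 10 Gb/s link can be made concrete with a transfer-time estimate. This is a sketch under stated assumptions: the dataset size is illustrative, and the efficiency factor (protocol overhead, competing traffic) is an assumed parameter, not a measured figure:

```python
def transfer_hours(size_tb: float, link_gbps: float,
                   efficiency: float = 0.5) -> float:
    """Hours to move size_tb terabytes over a link_gbps link.
    efficiency models protocol overhead and competing traffic (assumed)."""
    seconds = size_tb * 1e12 * 8 / (link_gbps * 1e9 * efficiency)
    return seconds / 3600.0

# Hypothetical example: fetching a 5 TB derived dataset over 10 Gb/s
print(round(transfer_hours(5, 10), 1))        # ~2.2 hours at 50% efficiency
print(round(transfer_hours(5, 10, 1.0), 1))   # ~1.1 hours at the ideal rate
```

Even at 50% efficiency, a full reduced dataset arrives in hours rather than days, which is why siting the T3g directly on the campus switch matters.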
Tier3 AF (Analysis Facility)
• Two sites have expressed interest and have set up prototypes:
  • BNL: interactive nodes, batch cluster, PROOF cluster
  • SLAC: interactive nodes and batch cluster
• T3AF: university groups can contribute funds / hardware
  • Groups are granted priority access to the resources they purchased (e.g., purchased batch slots)
  • The rest of ATLAS may use these resources when not in use by their owners
• SLAC-specific case: details covered in Richard Mount's talk
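The slot-ownership policy above can be sketched in a few lines. This is a toy illustration of the idea, not the actual batch configuration (real facilities implement it with a production scheduler's fair-share and preemption machinery); all group and slot names are invented:

```python
def assign_slot(job_group, free_slots):
    """Pick a slot for a job from job_group out of free_slots, a list of
    (slot_id, owner) pairs. Owned slots are preferred; otherwise the job
    borrows any idle slot. Returns the chosen pair, or None if none free.
    Note: this toy ignores preemption when an owner's job arrives later."""
    owned = [s for s in free_slots if s[1] == job_group]
    pick = owned[0] if owned else (free_slots[0] if free_slots else None)
    if pick is not None:
        free_slots.remove(pick)
    return pick

# Hypothetical slot pool owned by two university groups
slots = [("s1", "ucsc"), ("s2", "uci"), ("s3", "ucsc")]
print(assign_slot("uci", slots))    # ('s2', 'uci'): owner preference
print(assign_slot("iowa", slots))   # ('s1', 'ucsc'): borrows an idle slot
```

The policy gives purchasers predictable turnaround while keeping the hardware busy for the wider collaboration when it would otherwise sit idle.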
University T3 vs. T3AF
Some groups will site disks and/or worker nodes at the T3AF, with interactive nodes at the university
Qualitative score card
• Cost is probably the driving factor in the hardware siting decision
• Hybrid options are also possible
• A T3AF at SLAC will be an important option for university groups considering a T3
Extra