US ATLAS Operations Program
SLUO ATLAS Workshop, July 16, 2009
Mike Tuts, Columbia University
Topics
• What is the US ATLAS Operations Program (OP)?
  • Management
  • Activities
  • Funding – Core Program vs Operations Program
• Who is US ATLAS?
  • Who we are
  • What we are involved in
• How does the US ATLAS OP interact with overall ATLAS?
• How do you get involved?
Some History
• The US ATLAS construction project has been completed
  • $165M
  • US contributed to all major systems
    • Inner detector (SCT, Pixel, TRT)
    • Calorimeters (LAr, TileCal)
    • Muon system (MDTs, End Caps, CSC, Alignment)
    • Trigger and Data Acquisition (ROIB, HLT, DAQ)
    • Core software, Computing
  • About a 20% contribution to overall ATLAS
• As we moved to the operations phase, our responsibilities evolved out of these construction-era hardware responsibilities
US ATLAS Operations Program
• Host laboratory is Brookhaven National Lab
• Three principal components:
  • WBS 2 – Software & Computing (S&C)
    • Computing infrastructure: Tier 1, Tier 2
    • Core software, computing support to facilitate physics
  • WBS 3 – Maintenance & Operations (M&O)
    • Pre-operations, commissioning, long-term M&O
  • WBS 4 – Upgrade R&D
    • To handle future luminosity increases and prepare for an upgrade construction project
  • [WBS = Work Breakdown Structure]
• The Operations Program supports technical personnel, equipment, and shared computing
US ATLAS Organization Chart
Funding Targets
What is not supported by the US ATLAS Operations Program?
• For personnel, the Ops Program does not support physicists' salaries (grad students, postdocs, scientists, faculty)
  • Our test is whether the person is on a "physics" career path
• It does not support physicist travel or COLA while at CERN
• It does not support local computing resources ("Tier 3")
• All of the above we judge to be the responsibility of the core ("base") program
US ATLAS Institutions
• ATLAS is an international collaboration
  • Individual institutes are voted in by the ATLAS Collaboration Board (CB)
• Currently 44 US institutes (from 22 states):
  • Albany, ANL, Arizona, UT Arlington, Berkeley LBL and UC, Boston, Brandeis, BNL, Chicago, Columbia, UT Dallas, Duke, Fresno State#, Hampton, Harvard, Indiana, U Iowa#, Iowa State, UC Irvine, Louisiana Tech*, Louisville#, Massachusetts, MIT, Michigan, MSU, New Mexico, NIU^, NYU, Ohio State, Oklahoma, Oklahoma State, Oregon, Pennsylvania, Pittsburgh, UC Santa Cruz, SLAC, SMU, South Carolina*, SUNY Stony Brook, Tufts, Illinois Urbana, Washington, Wisconsin, Yale
  • Corresponding to 39 voting institutions
  • * = affiliated with BNL; # = affiliated with SLAC; ^ = affiliated with ANL
• US share of ATLAS as of Sept 30, 2008 (used to determine our "dues" to ATLAS; figures in brackets updated 7/15/09) – share arithmetic sketched after this slide:
  • 38/169 voting institutions (22%)
  • 395/1817 "current M&O authors" ≈ PhDs (22%) – for cat A/B, the shared operating costs for the detector [428/1889 – 22.7%]
  • 592/2800 M&O authors + those in the process of qualifying + students (21%) [634/2893 – 21.9%]
  • 497.75/2347.25 operations tasks share (students count 0.75) (21%) [551.75/2536.25 – 21.8%]
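A minimal sketch of the share arithmetic quoted above, using only the numbers on this slide (the 0.75 weight for students in the operations-task share is from the slide; the function and variable names are purely illustrative):

```python
# Illustrative sketch, not an official US ATLAS tool: reproduce the US share
# percentages quoted on the institutions slide.
def share(us, total):
    """US fraction of an ATLAS-wide total, in percent."""
    return 100.0 * us / total

print(f"Voting institutions:    {share(38, 169):.1f}%")         # ~22%
print(f"Current M&O authors:    {share(395, 1817):.1f}%")        # ~22% (cat A/B dues base)
print(f"Authors + qualifying + students: {share(592, 2800):.1f}%")  # ~21%
# Operations-task share: each student counts as 0.75 of a full share
print(f"Operations tasks share: {share(497.75, 2347.25):.1f}%")  # ~21%
```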
US ATLAS Demographics
• FY10 projection based on our personnel database (snapshot July 2009)
• Imperfect, but useful: ~110 FTE technical personnel
ATLAS Layout
• Inner Detector (ID): Pixels; Silicon Strip (SCT); Transition Radiation Tracker (TRT)
• Calorimeters: EM – liquid argon / lead; HAD – scintillator tile
• Muon System: Monitored Drift Tubes (MDT); Cathode Strip Chambers (CSC); Resistive Plate Chambers (RPC); Thin Gap Chambers (TGC)
• Magnets: 8 barrel toroids; central solenoid; end cap toroids
• Dimensions: diameter 25 m, length 46 m, weight 7,000 tons
3.1 Silicon Trackers
• US institutes (3.4 FTE): LBNL, UCSC, OSU, Oklahoma, SLAC, UT Dallas, Columbia, Louisiana Tech, Iowa, Iowa State, UNM, Wisconsin
• Areas of responsibility (derived from construction project) – under Alex Grillo (UCSC)
  • Pixel: disks; front end readout; hybrids; end caps; opto boards; support structures; services
  • SCT: front end readout; modules; electronics – design oversight, grounding, shielding
  • ROD: readout driver cards, firmware, software
• Figures: cut-away view of the inner detector; barrel pixels during construction
3.2 Transition Radiation Tracker (TRT)
• US institutes (1.4 FTE): Duke, Hampton, Indiana, Penn, Yale
• Areas of responsibility (derived from construction project) – under Harold Ogren (Indiana)
  • Barrel modules; electronics
• Figures: 3D representation of the inner detector's TRT (at outer radius); inner silicon detector in the assembly hall during the cosmic ray test
3.3 Liquid Argon Calorimeter (LAr)
• US institutes (14 FTE): Columbia, SMU, Pittsburgh, Stony Brook, BNL, Arizona
• Areas of responsibility (derived from construction project) – under Ryszard Stroynowski (SMU)
  • Mechanical: barrel cryostat; feedthroughs; cryogenics; FCal
  • Electronics: front end boards; links; cables; crates; L1 trigger; low voltage power supplies
• Figures: liquid argon barrel and end caps; barrel calorimeters in the detector hall, surrounded by the barrel toroids
3.4 Tile Calorimeter
• US institutes (3.6 FTE): ANL, Chicago, NIU, MSU, UTA, UIUC
• Areas of responsibility (derived from construction project) – under Larry Price (ANL)
  • Extended barrel; modules; electronics design; PMTs; intercryostat scintillator
• Figures: TileCal barrel and extended barrel; TileCal extended barrel in the hall during assembly
3.5 Muon System
• US institutes (7.8 FTE): Arizona, BU, BNL, Brandeis, Duke, Harvard, Illinois, MIT, Michigan, SLAC, South Carolina, Stony Brook, Tufts, UCI, UMass, Washington
• Areas of responsibility – under Frank Taylor (MIT)
  • End cap Monitored Drift Tubes (MDT)
  • All chamber electronics
  • Cathode Strip Chambers (CSC)
  • Alignment
  • Maintain configuration database
  • Calibration center operation
• Figures: 3D cut-away view of the muon system; muon end-cap big wheel during assembly in the hall, surrounded by TGC chambers
3.6 TDAQ
• US institutes (10.9 FTE): UCI, MSU, ANL, Wisconsin, NYU, BNL, Oregon, SLAC, UTA, Chicago, UIUC, Penn, Yale
• Areas of responsibility – under Andy Lankford (UCI) / Reiner Hauser (MSU)
  • Hardware & software commissioning & maintenance
  • Detector & TDAQ support
  • Operations support
• Figure: schematic of the TDAQ system – first-level trigger and RoI Builder feed ~1600 read-out links from the detector (UX15) into the Read-Out Drivers (RODs) and Read-Out Subsystems (ROSs, ~150 PCs) in USA15; the LVL2 farm, Event Builder (SFIs), Event Filter (EF) and local storage (SFOs) sit in SDX1 on Gigabit Ethernet; event rate to storage ~200 Hz (dataflow sketch after this slide)
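As a rough illustration of the dataflow summarized in the schematic above, the sketch below estimates the rate to storage implied by the ~200 Hz output quoted on the slide; the ~1.5 MB event size is an assumption for illustration, not a number from the slide.

```python
# Rough, illustrative sketch only (not an official TDAQ figure).
event_rate_hz = 200     # from the slide: event rate accepted to local storage (SFOs)
event_size_mb = 1.5     # ASSUMED nominal event size, for illustration only

throughput_mb_s = event_rate_hz * event_size_mb
print(f"Approximate rate to local storage (SFOs): {throughput_mb_s:.0f} MB/s")  # ~300 MB/s
```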
2. Software & Computing
• Analysis to be done on a worldwide computing grid of ~26k CPUs (WLCG)
  • WLCG is the Worldwide LHC Computing Grid
  • Open Science Grid (OSG) is the US grid and part of WLCG
• In the US we have
  • Tier 1 computing center (T1) @ BNL
  • Five Tier 2s (T2) ($600k/yr from OP)
    • NET2 – BU/Harvard
    • MWT2 – Chicago/Indiana
    • GLT2 – Michigan/MSU
    • SWT2 – UTA/Oklahoma
    • WT2 – SLAC
  • Local Tier 3s (T3) at universities
    • Stimulus funding may prove to be a source of funding – many proposals submitted
• We manage the T1 & T2s (& some T3s) in such a way as to provide a seamless computing fabric (tier sketch after this slide)
• US institutes (48 FTE): BNL, ANL, LBNL, Iowa State, SLAC, Chicago, Indiana, BU, Harvard, UTA, Oklahoma, Michigan, MSU, Arizona, Duke, Louisiana Tech, Oregon, UMass
• Computing management under Jim Shank (BU) / Srini Rajagopalan (BNL)
• Figure: ATLAS Tier 1 computing sites
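For reference, a minimal sketch of the US tier structure described above, expressed as a simple data structure; the site groupings are taken from the slide, and the dictionary layout itself is purely illustrative.

```python
# Illustrative only: US ATLAS computing tiers and sites as listed on the slide.
US_ATLAS_TIERS = {
    "Tier 1": ["BNL"],
    "Tier 2": {
        "NET2": ["BU", "Harvard"],
        "MWT2": ["Chicago", "Indiana"],
        "GLT2": ["Michigan", "MSU"],
        "SWT2": ["UTA", "Oklahoma"],
        "WT2":  ["SLAC"],
    },
    "Tier 3": "local university clusters (not centrally managed by the OP)",
}

for tier, sites in US_ATLAS_TIERS.items():
    print(tier, "->", sites)
```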
T1 & T2 Resources
• US T1 and T2 facility anticipated pledges
• From the Feb 09 review – changing with the LHC schedule
4. Upgrade R&D
• The Operations Program supports the R&D needed for construction of an upgraded ATLAS
• Phase 1 (~2014/15 install) envisions ~3×10^34 cm^-2 s^-1 and a few hundred fb^-1 (rough integrated-luminosity arithmetic after this slide)
  • Inner B-Layer (IBL) silicon replacement
  • Muon and TDAQ
• Phase 2 (2018/19??): luminosity goal of 10^35 cm^-2 s^-1 and enough integrated luminosity that the inner detector requires full replacement; Calorimeters, Muon, TDAQ
• Upgrade activities will be presented at this meeting by Abe Seiden and others
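As a back-of-the-envelope check on how the Phase 1 luminosity relates to "a few hundred fb^-1", the sketch below assumes a nominal ~10^7 seconds of effective running per year (an assumption, not a number from the slide) and the standard conversion 1 fb^-1 = 10^39 cm^-2.

```python
# Rough illustration only. Instantaneous luminosities are from the slide;
# the effective running time per year is an assumed nominal figure.
SECONDS_PER_YEAR = 1.0e7     # ASSUMED effective LHC running time per year
FB_INV_PER_CM2 = 1.0e-39     # unit conversion: 1 fb^-1 = 1e39 cm^-2

for label, lumi_cm2_s in [("Phase 1", 3.0e34), ("Phase 2", 1.0e35)]:
    integrated_fb = lumi_cm2_s * SECONDS_PER_YEAR * FB_INV_PER_CM2
    print(f"{label}: ~{integrated_fb:.0f} fb^-1 per year at {lumi_cm2_s:.0e} cm^-2 s^-1")
# Phase 1: ~300 fb^-1 per year; Phase 2: ~1000 fb^-1 per year
```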
Analysis Support
• While the Operations Program does not manage physics, it does provide support for analysis activities – under Jim Cochran (Iowa State)
• US ATLAS goal is to support entry into overall ATLAS physics activities
• 3 Analysis Support Centers: BNL, ANL, LBNL (and SLAC)
  • Host "Jamborees" on various analysis topics, tutorials
• Analysis Support Group
  • Team of "virtual" experts that can answer your questions
• Physics & Detector Performance Forums
  • A more informal venue in which to discuss analyses and issues, in the US time zone
  • Venue for practice talks for US conferences
US ATLAS and DOE & NSF
• The US ATLAS Operations Program is jointly funded and managed by DOE & NSF
• The US LHC OP is reviewed annually by DOE/NSF with outside consultants
• The Joint Oversight Group (M. Procario (DOE) + M. Goldberg (NSF)) meets twice yearly
• The BNL Program Office handles budgets; we get input from the Executive Committee on resource priorities
• DOE funding is distributed via BNL subcontracts & NSF funding via Columbia subcontracts
• The agencies seek advice from US ATLAS OP management on core program needs and priorities
  • DOE "help" funds (has been $600k–$850k)
  • We review core program needs individually with each of the 44 institutes in US ATLAS
US ATLAS and ATLAS
• The US ATLAS Operations Program management interacts closely with overall ATLAS management
  • Coordinating resource priorities with overall ATLAS priorities
  • Serving as "National Contact Physicists" for the US
  • Supporting the "per capita" operating costs for overall ATLAS (called cat A and B costs, ~$10k/year per PhD collaborator; rough total sketched after this slide)
  • Advising and serving as a point of contact for ATLAS management regarding new US groups
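As a rough, order-of-magnitude illustration only (not an official budget figure), combining the ~$10k/year per-PhD cat A/B cost quoted here with the ~395 current US M&O authors from the institutions slide gives a sense of the scale of the US per-capita contribution.

```python
# Order-of-magnitude illustration; both inputs are approximate slide numbers.
cost_per_phd_per_year = 10_000   # ~$10k/year cat A/B cost per PhD collaborator
us_mo_authors = 395              # "current M&O authors" from the institutions slide

print(f"Approximate US cat A/B contribution: "
      f"${cost_per_phd_per_year * us_mo_authors / 1e6:.1f}M/year")  # ~$4.0M/year
```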
US ATLAS – Conclusions
• US activities on ATLAS in Computing & Software, Maintenance & Operations, and Upgrade R&D are managed and funded (DOE & NSF) centrally through the US ATLAS Operations Program
  • The Operations Program supports technical personnel & equipment
  • The Operations Program pays the per capita costs for US PhD physicists
• ATLAS is an international collaboration – after being accepted by ATLAS you are automatically in US ATLAS
  • New, smaller institutions have typically joined by affiliating with an existing US group until they establish themselves and build up group strength
• We are anxiously awaiting first beam and the start of the physics program – we hope you will join us!
  • If you are considering joining ATLAS we can help interface you to the overall ATLAS management
• At this workshop you will hear more details of some of the ongoing activities