This talk provides an overview of the commissioning process of the ATLAS High Level Trigger, including system tests, single beam and cosmic tests, successes, and lessons learned.
Commissioning of the ATLAS High Level Trigger
John Baines
Overview of Talk
• ATLAS & LHC parameters
• The ATLAS Trigger
• UK & RAL involvement
• Commissioning: system tests, single beam, cosmics
• Successes & lessons learned
• Commissioning in 2009
• Summary
Material taken from conference talks by: S. Farrington, C. Padilla, R. Hauser, F. Winklmeier, W. Wiedenmann, R. Goncalo, A. Ventura
LHC Parameters
Parameters at full luminosity (L = 10³⁴ cm⁻²s⁻¹):
• Bunch crossing interval: 25 ns (40 MHz)
• No. of overlapping events per crossing: 23 ⇒ interaction rate ~1 GHz
• Average no. of particles: 1400
• About 10⁸ channels to read out
  ➔ Event size: 1.5 MByte (larger during special runs: >15 MByte)
  ➔ Tier-0/Reconstruction/Grid/Storage output limit: about 200 Hz / 300 MByte/s
• Example signal & background rates:
  • 100 GeV Higgs: ~0.1 Hz
  • SUSY: <1 Hz
  • W: ~500 kHz
  • Z: ~80 kHz
  • Background: inelastic ~1 GHz, jets >1 kHz
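These headline numbers are simple products of the quoted rates and sizes; a minimal back-of-envelope sketch (using only the figures on this slide) reproduces them:

```python
# Back-of-envelope LHC/ATLAS data rates, using the numbers quoted above.
bunch_crossing_rate_hz = 40e6      # 25 ns bunch spacing
overlapping_events = 23            # pile-up interactions per crossing

interaction_rate_hz = bunch_crossing_rate_hz * overlapping_events
print(f"Interaction rate: {interaction_rate_hz:.1e} Hz")    # ~9.2e8, i.e. ~1 GHz

output_rate_hz = 200               # Tier-0/storage output limit
event_size_mbyte = 1.5
print(f"Storage bandwidth: {output_rate_hz * event_size_mbyte:.0f} MByte/s")  # 300 MByte/s

# Overall rejection the trigger must provide: 40 MHz in, 200 Hz out.
print(f"Required rejection: {bunch_crossing_rate_hz / output_rate_hz:.0e}")   # ~2e5
```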
Trigger Architecture
p-p collisions at 40 MHz
• Level 1 (hardware: FPGA, ASIC)
  • Input from the calorimeters and muon detectors (or TRT fast-OR)
  • Identifies Regions of Interest (RoIs) for the HLT
  • Latency: 2.5 μs; output: 75 kHz
• Level 2 (software trigger on commodity PCs)
  • Seeded by Level-1 RoIs; full detector granularity
  • Requests data in the RoIs from the Read-Out Buffers
  • ~40 ms per event on a 2 GHz CPU; output: 2-3 kHz
• Event Filter (software trigger on commodity PCs)
  • Seeded by Level 1 & Level 2; has access to the entire event
  • ~4 s per event on a 2 GHz CPU; output: 200 Hz
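A compact way to read the architecture is as three stages, each trading latency for rejection. A minimal sketch (class and variable names are illustrative; the rates and latencies are the slide's):

```python
from dataclasses import dataclass

@dataclass
class TriggerLevel:
    name: str
    input_rate_hz: float
    output_rate_hz: float
    latency_s: float

    @property
    def rejection(self) -> float:
        # Factor by which this level thins out the event stream.
        return self.input_rate_hz / self.output_rate_hz

levels = [
    TriggerLevel("Level 1",      40e6, 75e3, 2.5e-6),
    TriggerLevel("Level 2",      75e3,  3e3, 40e-3),
    TriggerLevel("Event Filter",  3e3,  200, 4.0),
]

for lvl in levels:
    print(f"{lvl.name}: rejection x{lvl.rejection:.0f} "
          f"within {lvl.latency_s * 1e3:g} ms per event")
```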
ATLAS Trigger & DataFlow
(Diagram of the trigger and data-flow system; Level-2 latency ~40 ms, Event Filter ~4 s.)
ATLAS UK HLT
Institutes: Manchester, Oxford, Royal Holloway, RAL, UCL
RAL: Fred Wickens, Monika Wielers, Dmitry Emeliyanov, Julie Kirk, Bill Scott, John Baines; student: Rudi Apolle
Areas: Trigger Selection Software, Inner Detector Trigger, Electron/photon Trigger, B-Physics Trigger, Trigger Release Coordination, Trigger Validation, Trigger Hardware & Farms
Level-1
Three sub-systems:
• L1 Calorimeter
• L1 Muon
• Central Trigger Processor (CTP)
Signature identification:
• e/γ, τ/h, jets, μ
• Multiplicities per pT threshold
• Isolation criteria
• Missing ET, total ET, jet ET
CTP:
• Receives and synchronizes the trigger information
• Generates the Level-1 trigger decision (L1A)
• Delivers the L1A to the other sub-detectors
• Sends the Regions of Interest to the Level-2 trigger
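As an illustration of the CTP's job, here is a toy sketch that forms an L1 accept from threshold multiplicities. The item menu and names are invented for illustration, not the real ATLAS Level-1 menu:

```python
# Toy CTP: combine per-threshold multiplicities into Level-1 items and an L1A.
multiplicities = {"EM10": 1, "MU6": 0, "J40": 2}   # objects above each pT threshold

l1_items = {
    "L1_EM10": lambda m: m["EM10"] >= 1,   # at least one e/gamma above 10 GeV
    "L1_MU6":  lambda m: m["MU6"] >= 1,    # at least one muon above 6 GeV
    "L1_2J40": lambda m: m["J40"] >= 2,    # at least two jets above 40 GeV
}

fired = [name for name, condition in l1_items.items() if condition(multiplicities)]
l1_accept = bool(fired)                    # L1A = OR of all (unprescaled) items
print(fired, l1_accept)                    # ['L1_EM10', 'L1_2J40'] True
```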
The HLT Farm
• Ultimately: 2300 processors (L2 + EF)
• Now: ~1600 processors
Multi-core Processors
• Resource requirements are multiplied by the number of process instances:
  • Memory: ~1-1.5 GByte per application
  • File descriptors
  • Network sockets
• Number of controlled applications:
  • ~7k presently
  • ~20k in the final system
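The application counts follow from the rates and latencies quoted earlier via Little's law (concurrency = input rate × processing time). A rough sketch, assuming the slide numbers:

```python
# Rough HLT farm sizing: concurrent applications = input rate x processing time.
l2_apps = 75e3 * 40e-3    # 75 kHz into L2 at ~40 ms each  -> ~3000 concurrent L2PUs
ef_apps = 3e3 * 4.0       # ~3 kHz into EF at ~4 s each    -> ~12000 concurrent PTs
total = l2_apps + ef_apps
print(f"L2: {l2_apps:.0f}, EF: {ef_apps:.0f}, total: {total:.0f}")
# ~15k applications, consistent with the ~20k final system once headroom is added.

# Aggregate memory at ~1-1.5 GByte per application:
print(f"Memory: {total * 1.0 / 1e3:.0f}-{total * 1.5 / 1e3:.1f} TByte across the farm")
```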
HLT Framework
• Level 2 (75 kHz → 2-3 kHz):
  • The HLT selection software runs in the Level-2 Processing Unit (L2PU)
  • Selection algorithms run in a worker thread
• Event Filter (3 kHz → 200 Hz):
  • Independent Processing Tasks (PTs) run the selection software on the Event Filter (EF) farm nodes
• The HLT Event Selection Software is based on the ATLAS Athena offline framework
• The HLT framework interfaces the HLT event selection algorithms to the online system:
  • Driven by the run-control and data-flow software
  • Event loop managed by the data-flow software
  • Allows HLT algorithms to run unchanged in the trigger and offline environments
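The key design point is that the event loop belongs to the data-flow software while the algorithm stays framework-agnostic, which is what lets the same code run online and offline. A minimal sketch of that inversion of control; all names here are hypothetical, not the real Athena/L2PU/PT API:

```python
# Minimal sketch: the same selection algorithm runs under two different
# event-loop owners, mirroring how Athena algorithms run unchanged online.
class ElectronSelector:
    """Framework-agnostic selection algorithm (toy)."""
    def execute(self, event) -> bool:
        return event.get("em_cluster_et", 0.0) > 10.0   # toy 10 GeV cut

def offline_event_loop(algorithm, events):
    # Offline: the framework iterates over a file of events.
    return [algorithm.execute(e) for e in events]

def dataflow_event_loop(algorithm, next_event):
    # Online: the data-flow software hands events in one at a time and
    # acts on each accept/reject decision immediately.
    while (event := next_event()) is not None:
        yield algorithm.execute(event)
```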
HLT Selection Software
• LVL2: reduces the rate from up to 75 kHz to 2-3 kHz in ~40 ms on average
  • Custom algorithms with some offline components
• EF: reduces the rate from 2-3 kHz to 200-300 Hz in ~4 s on average
  • Offline algorithms run from HLT-specific wrappers
• HLT strategy:
  • Processing in Regions of Interest: only ~a few % of the event is processed
  • At LVL2, data for those few % of the event are requested over the network
  • Early rejection: stepwise processing minimizes the execution time spent on rejected events (see the e/γ example and sketch below)
RoI-based, Stepwise Processing: e/γ Example
Event rejection is possible at each step:
• Level 1: a Region of Interest is found and its position in the EM calorimeter is passed to Level 2 (EMROI)
• Level 2 (seeded by Level 1; fast reconstruction algorithms, run within the RoI):
  • L2 calorimeter: electromagnetic clusters — cluster?
  • L2 tracking — track? match?
• Event Filter (seeded by Level 2; offline reconstruction algorithms with refined alignment and calibration):
  • EF calorimeter, EF tracking — track?
  • e/γ reconstruction — e/γ OK?
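A minimal sketch of this stepwise early rejection; the step names follow the diagram above, while the event content and cuts are invented for illustration:

```python
# Stepwise RoI processing with early rejection: stop at the first failed
# step, so rejected events cost as little CPU as possible.
def l2_calo(roi):    return roi.get("em_cluster", False)   # cluster?
def l2_track(roi):   return roi.get("l2_track", False)     # track? match?
def ef_calo(roi):    return roi.get("ef_cluster", False)
def ef_track(roi):   return roi.get("ef_track", False)     # track?
def ef_egamma(roi):  return roi.get("is_electron", False)  # e/gamma OK?

EGAMMA_CHAIN = [l2_calo, l2_track, ef_calo, ef_track, ef_egamma]

def run_chain(roi, chain=EGAMMA_CHAIN) -> bool:
    for step in chain:
        if not step(roi):
            return False      # early rejection: later steps never execute
    return True

# An event rejected at L2 never pays the ~4 s Event Filter cost:
print(run_chain({"em_cluster": True, "l2_track": False}))   # False, after 2 steps
```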
Trigger Menus
• The trigger menu defines chains of processing steps starting from a LVL1 RoI
• The menu is specified in terms of signatures, e.g. mu6, e10, 2j40_xe30, etc.
• Chains can be prescaled at Level 1 or in the HLT
• Signatures are assigned to inclusive data streams: egamma, jetTauEtmiss, muons, minbias, LAr and express
(Slide shows an example of electron signatures.)
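Conceptually a menu is just data: chain name, L1 seed, prescale, and target stream. A toy sketch (the entries and seeding scheme are illustrative, not a real ATLAS menu):

```python
import random

# Toy trigger menu: each chain has an L1 seed, a prescale, and a stream.
MENU = [
    {"chain": "e10",       "l1_seed": "L1_EM10", "prescale": 1,  "stream": "egamma"},
    {"chain": "mu6",       "l1_seed": "L1_MU6",  "prescale": 10, "stream": "muons"},
    {"chain": "2j40_xe30", "l1_seed": "L1_2J40", "prescale": 1,  "stream": "jetTauEtmiss"},
]

def streams_for_event(fired_l1_items, hlt_passes):
    """Return the streams an event is written to after seeding + prescale."""
    streams = set()
    for entry in MENU:
        if entry["l1_seed"] not in fired_l1_items:
            continue                                   # chain not seeded
        if random.random() >= 1.0 / entry["prescale"]:
            continue                                   # prescaled away
        if hlt_passes(entry["chain"]):
            streams.add(entry["stream"])               # inclusive streaming
    return streams

print(streams_for_event({"L1_EM10"}, hlt_passes=lambda chain: True))  # {'egamma'}
```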
Commissioning
• System tests with simulated & previously recorded cosmic data:
  • Data downloaded to the Read-Out Buffers
  • Allows tests with collision events
  • Exercises the system at the maximum LVL1 rate
• Cosmic tests:
  • Individual detectors ("slice weeks")
  • Combined runs ⇒ expose the algorithms to real detector noise, data errors etc.
• Beam:
  • Single beam
  • Collisions
Single Beam – 10:19, 10/9/2008
• Single-beam configuration: injection-energy protons circulating in the LHC
• On collision with a collimator, a spray of particles entered the detector
(Slide shows the same event in the online and offline displays.)
Level-1 Commissioning in Single Beam
Each trigger component needs to be synchronised with the beam pick-up.
(Plots show the trigger timing relative to the beam pick-up, in units of bunch crossings.)
Differences in Cosmic vs. Beam Running
• No beam clock:
  • The muon trigger chambers provide the timing
  • Phase issues in the read-out of the TRT (straw detector) & the Muon Drift Tube (MDT) chambers
• No beam / no interaction point (IP):
  • Tracks are distributed over d0, z0
  • The dedicated L2 algorithms for fast muon reconstruction (in the MDTs) and the fast inner-detector tracking algorithms are optimized for trajectories pointing towards the beam line
• Muons in the HLT:
  • The r-z view could not be fully reconstructed at L2, because the algorithms are designed for pointing tracks and data are requested in trigger towers pointing to the IP
  • Pointing requirements can be relaxed to study rejection/efficiency
  • Timing issues cause a percent-level loss
• Tracking:
  • Level-2 algorithms are optimized for tracks from the IP
Calorimeter in e/γ & τ Triggers
(Plots: study of the performance of the clustering algorithm in the tau trigger.)
e/γ
Example plot from the e/γ FEX algorithms comparing L2 and EF: the shower shape in the 2nd EM sampling, Rη = E(3×7)/E(7×7).
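For reference, Rη is a simple ratio of cell-energy sums in the second EM sampling: narrow electromagnetic showers give values close to 1, jets give lower values. A sketch assuming a 7×7 cell-energy array centred on the cluster (the array layout is hypothetical):

```python
import numpy as np

def r_eta(cells: np.ndarray) -> float:
    """Shower-shape ratio R_eta = E(3x7) / E(7x7) in the 2nd EM sampling.

    `cells` is a 7x7 (eta x phi) array of cell energies centred on the
    hottest cell of the cluster.
    """
    e_7x7 = cells.sum()
    e_3x7 = cells[2:5, :].sum()        # central 3 eta rows x all 7 phi columns
    return e_3x7 / e_7x7 if e_7x7 > 0 else 0.0

# Toy example: an isolated EM shower concentrated in the core cells.
cells = np.full((7, 7), 0.1)           # low-energy halo
cells[3, 3] = 40.0                      # shower core
cells[2:5, 2:5] = np.maximum(cells[2:5, 2:5], 2.0)
print(f"R_eta = {r_eta(cells):.2f}")    # ~0.95, electron-like
```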
Muon Trigger
(Plots: fitted resolutions σ = 17 mrad and σ = 0.007.)
Muons in the Tile Calorimeter
(Plot: Δφ between the tile-calorimeter cluster and the ID track.)
Commissioning the Inner Detector Trigger
• Goal: commission the LVL2 collision algorithms with cosmics
• But the speed optimisation of the Level-2 algorithms makes them inefficient for tracks more than ~5 mm from the nominal beam position
• Three strategies:
  • Use only the small fraction of events that pass close to the IP
  • Loosen the cuts in the pattern recognition (not possible for all algorithms)
  • Shift the points (next slide, and the sketch below)
Commissioning Level-2 Tracking
Add an initial step that applies a shift to all the space-points, so that the track appears to come from the Interaction Point.
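A minimal sketch of the idea, assuming a first-pass cosmic reconstruction supplies the track's transverse offset; the geometry is simplified to 2D and the names are illustrative:

```python
import numpy as np

def shift_points_to_ip(points: np.ndarray, track_d0: float,
                       track_phi0: float) -> np.ndarray:
    """Translate space-points so a cosmic track appears to pass through the IP.

    `points` is an (N, 2) array of (x, y) hit positions; `track_d0` is the
    track's transverse impact parameter and `track_phi0` its azimuth, e.g.
    from a first-pass cosmic reconstruction. Shifting every point by the
    offset perpendicular to the track lets the IP-optimised pattern
    recognition run unchanged.
    """
    # Unit vector perpendicular to the track direction in the x-y plane.
    normal = np.array([-np.sin(track_phi0), np.cos(track_phi0)])
    return points - track_d0 * normal

hits = np.array([[10.0, 7.0], [20.0, 7.0], [30.0, 7.0]])  # track parallel to x, 7 mm off
shifted = shift_points_to_ip(hits, track_d0=7.0, track_phi0=0.0)
print(shifted)   # y-coordinates now 0: the track points at the beam line
```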
Level-2 ID Efficiency
(Plots: efficiency measured with respect to tracks reconstructed offline.)
Cosmics for ID Alignment
The HLT was used to select events passing through the ID; these were sent to the IDCosmic stream and used for offline alignment.
Commissioning with Cosmics
• 216 million events
• 453 TB of data
• 400k files
• Several streams
Online Handling of Time-Out Events
• Time-out events go to the DEBUG stream
• These events are re-processed and streamed as if they had been processed online; the only difference is the file name
• The files are registered in the corresponding offline DB and processed normally, producing ESD, AOD, etc., but are kept separate and carry the "recovered" tag
Successes & Lessons Learned
Some highlights:
• Trigger ready for first beam:
  • Single-beam events triggered with LVL1, with HLT streaming based on Level 1
  • HLT run offline on the CERN Analysis Farm (CAF)
• Trigger, including the HLT algorithms, exercised in cosmic running:
  • ~2 months of running, 220 million events, incl. long runs of >2M events
  • Events successfully streamed, incl. the IDCosmic stream used for alignment
  • Exercised the processing of events from the DEBUG stream
• Exercised the procedures for evaluating new menus & code fixes on the CAF prior to online deployment
• Successfully exercised release management in data-taking conditions:
  • Deployed patch releases for P1 and the HLT
Successes & Lessons Learned: Improvements for 2009 Running
• The ability to change LVL1 prescales during a run was invaluable:
  • Infrastructure put in place so that HLT prescales can also be updated during a run
• A change of magnetic field required a menu change:
  ⇒ Algorithms can now configure the magnetic field automatically, based on the magnet current
• Problems with calculating the online Level-2 & EF trigger rates:
  • The old system was too susceptible to problems collecting information from the farm nodes
  • Improvements made to the rate calculation and to the collection of information from the nodes
• Removal of detectors from the readout caused errors in the HLT ⇒ events in the debug stream:
  • Algorithms can now access a mask saying which detectors are in the run ⇒ error response modified
• Problems with noisy detectors:
  • Consolidate the procedures for making noisy-detector masks available online
  • Improve monitoring, especially detector & trigger information displayed side-by-side
Plans for 2009/10
• Peak luminosity: ~2×10³² cm⁻²s⁻¹
• Integrated luminosity: ~200 pb⁻¹
From Cosmics to Collisions
• Cosmics
• Cosmics with combined L1 muon triggers
• First-beam menu: cosmics + beam-pickup trigger
• Bunch groups commissioned (requires clock commissioning)
• High Level Trigger performs streaming; HLT algorithms run offline
• Add the HLT one piece at a time, in tagging mode
• Switch on HLT rejection once the algorithms are validated online
• Full 10³¹ menu
• Collisions
Conclusion
• The trigger was successfully commissioned in single-beam and cosmic running in Autumn 2008
• The data have been analysed to validate the trigger operation
• Improvements have been made in the light of experience from these runs
Eagerly awaiting collisions!