
Commissioning of the ATLAS High Level Trigger with Single Beam and Cosmic Rays

This presentation describes the commissioning of the ATLAS High Level Trigger (HLT) with single beam and cosmic rays. It covers the event selection software, the trigger configuration, online and offline monitoring, and the main HLT achievements, as well as the challenges faced by the ATLAS Trigger/DAQ system in handling the high interaction rates at the LHC.


Presentation Transcript


  1. Commissioning of the ATLAS High Level Trigger with Single Beam and Cosmic Rays. Alessandro Di Mattia, Michigan State University, on behalf of the ATLAS Collaboration. CHEP ’09, March 21-27, 2009

  2. Outline • Introduction to the LHC and the ATLAS Trigger/DAQ • The Event Selection Software • The Trigger Configuration • Online and Offline Monitoring • Experience with beam and cosmic rays • Main HLT achievements A. Di Mattia, MSU

  3. The LHC challenge to ATLAS Trigger/DAQ • Challenge to the ATLAS Trigger/DAQ: • interaction rate 10^9 Hz, while offline computing can handle O(10^2) Hz; • cross sections of physics processes vary over many orders of magnitude: • inelastic: 10^9 Hz • W → lν: 10^2 Hz • tt̄ production: 10 Hz • Higgs (100 GeV): 0.1 Hz • Higgs (600 GeV): 10^-2 Hz • ATLAS has O(10^8) read-out channels → average event size ~1.6 MByte. LHC: proton-proton collisions @ E_CM = 14 TeV, L = 10^34 cm^-2 s^-1, 23 collisions per bunch crossing at 25 ns intervals; one year at L = 10^34 cm^-2 s^-1 gives ∫L dt ≈ 100 fb^-1. Muon System Endcap: trigger chambers: TGC; precision chambers: MDT, CSC. Muon System Barrel: trigger chambers: RPC; precision chambers: MDT. Inner Detector: Transition Radiation Tracker (TRT), Silicon strip detector (SCT), Pixel. A. Di Mattia, MSU

  4. ATLAS: the Trigger/DAQ system [Diagram: dataflow through the Trigger/DAQ — CALO, MUON and the other detectors feed the CTP; Level-1 (latency 2.5 μs) sends RoIs via the RoI Builder (ROIB) to the Level-2 supervisors (L2SV) and processors (L2P, ~500 farm nodes, ~40 ms/event); the Event Builder (DFM, SFI, ~100 farm nodes) assembles accepted events for the Event Filter (EFP, ~1600 farm nodes, ~4 s/event), and the SFO writes them to storage; the aggregate data flow drops from ~1 PB/s off the detector to ~120 GB/s into Level-2, ~3 GB/s at the Event Builder and ~320 MB/s to storage.] Details given by T. Pauly - The ATLAS Level-1 Central Trigger System in Operation. Level-1 (hardware, FPGA/ASIC): • analyzes coarse-granularity data from the CALO and MUON detectors; • identifies the Regions of Interest (RoI) that seed Level-2. Level-2 (software based): • accesses full-granularity data within the RoI (~2% of the total event size); • uses algorithms optimized for fast rejection. Event Filter (software based): • uses offline algorithms; • has potential full event access; • exploits the seed from Level-2. Dataflow: 40 MHz bunch-crossing rate at input; Level-1 accept at 75 kHz; RoI requests fetch ~2% of the event data from the ROBs over the data collection network; Level-2 accept at ~3 kHz; Event Filter accept at ~200 Hz. Typical event size ~1.6 MByte, up to 14 MByte for CALO calibration. A. Di Mattia, MSU
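As a rough illustration of the rate reduction and bandwidth figures quoted above, the following minimal Python sketch recomputes the per-level rejection factors and the storage throughput from the nominal rates and the ~1.6 MB event size. It is back-of-the-envelope arithmetic, not part of the ATLAS software.

```python
# Back-of-the-envelope check of the trigger rate reduction and data throughput.
# Numbers are the nominal figures quoted on the slide; event size in megabytes.
EVENT_SIZE_MB = 1.6

levels = [
    ("Bunch crossing", 40_000_000.0),  # 40 MHz input to Level-1
    ("Level-1 accept", 75_000.0),      # 75 kHz, hardware trigger
    ("Level-2 accept", 3_000.0),       # ~3 kHz, software, RoI-based
    ("Event Filter accept", 200.0),    # ~200 Hz written to storage
]

for (name_in, rate_in), (name_out, rate_out) in zip(levels, levels[1:]):
    rejection = rate_in / rate_out
    print(f"{name_in} -> {name_out}: rejection factor ~{rejection:,.0f}")

# Output bandwidth to mass storage at the Event Filter accept rate.
ef_rate = levels[-1][1]
print(f"Storage throughput ~{ef_rate * EVENT_SIZE_MB:.0f} MB/s")  # ~320 MB/s
```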

  5. ATLAS Trigger/DAQ: the resources DAQ (ROS, EB, SFO): 100% of the final system available • delivered twice the design rate of event throughput with 5 SFOs. Trigger farms (L2P, EFP): 35% of the final system available • ~850 nodes in 27 racks; • 8 cores per node (2 x Harpertown quad-core @ 2.5 GHz), 2 GByte memory per core; • sufficient for the early data-taking period (see the back-of-the-envelope sketch below). Homemade resource monitoring based on Nagios. Flexible resource assignment to DAQ/HLT: the configuration allows the workload to be redistributed among Level-2 / Dataflow / Event Filter within a day, to cope with unexpected increases of data throughput (calo calibration runs, bad detector conditions, etc.). Reconfiguration successfully exercised in the 2008 runs! Details given by A. Zaytsev - System Administration of ATLAS TDAQ Computing Environment, and by R. Sjoen - Monitoring Individual Traffic Flows in the ATLAS TDAQ Network. Details of the Dataflow given by W. Vandelli – ATLAS DataFlow Infrastructure: recent results from ATLAS cosmic and first-beam data-taking. A. Di Mattia, MSU
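To make the "sufficient for the early data-taking period" statement more concrete, here is a minimal sketch relating the quoted farm size to the nominal Level-2 and Event Filter time budgets. This is my own illustrative arithmetic under an assumed core split between the levels, not an official capacity estimate.

```python
# Rough capacity estimate for the HLT farm described above:
# ~850 nodes with 8 cores each, against the nominal per-event time budgets.
NODES = 850
CORES_PER_NODE = 8
cores = NODES * CORES_PER_NODE

L2_BUDGET_S = 0.040   # ~40 ms average Level-2 decision time
EF_BUDGET_S = 4.0     # ~4 s average Event Filter processing time

# If a fraction of the cores is assigned to each level (the split is
# configurable, as noted on the slide), the sustainable input rates are:
for l2_fraction in (0.3, 0.5):
    l2_cores = cores * l2_fraction
    ef_cores = cores * (1.0 - l2_fraction)
    l2_rate = l2_cores / L2_BUDGET_S   # events/s Level-2 can absorb
    ef_rate = ef_cores / EF_BUDGET_S   # events/s the Event Filter can absorb
    print(f"L2 share {l2_fraction:.0%}: L2 ~{l2_rate / 1000:.0f} kHz, EF ~{ef_rate:.0f} Hz")
```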

  6. The HLT Selection Software [Diagram: HLT Selection Software architecture — the HLT Core Software (Steering, Monitoring Service, HLT Algorithms, Data Manager, Event Data Model, MetaData Service) runs inside the L2PU application (with the ROB Data Collector) and the Event Filter processing tasks; it imports the Athena/Gaudi framework, the Event Data Model and the offline reconstruction algorithms via the StoreGate interface, and sits on top of the HLT Data Flow Software.] HLT Selection Software: • framework: ATHENA/GAUDI; • reuses offline components; • common to Level-2 and EF. Offline algorithms are used in the EF. Performance and functionality tested in technical runs and combined detector runs. Details given by W. Wiedenmann - The ATLAS Online High Level Trigger Framework: Experience reusing Offline Software Components in the ATLAS Trigger. A. Di Mattia, MSU

  7. The HLT Steering [Diagram: example Level-2 muon chain (L2_MU20), seeded from LVL1 — a Muon RoI feeds muon tracking, producing a Muon Feature; inner-detector tracking and a combination step produce a Combined Feature (combined track); Trigger Elements mark each selection step, and accepted events are passed on to the EF selection.] Algorithms for extracting features of physics objects from the data produce feature-extraction results (FEX); Trigger Elements (TE) mark the atomic selection steps (Sequences); hypothesis (Hypo) algorithms apply a selection on the FEX, thus confirming the TE. The HLT steering manages the execution of the selection code (a small sketch of this logic is shown below): • algorithms are configurable by parameters; • it applies early rejection: the full chain is aborted as soon as a selection step fails; • it applies prescale and passthrough factors; • it caches the full history of TEs and FEX and writes them into the HLT result, which: • allows navigation through the steps of the trigger decision; • avoids multiple execution of the same feature extraction; • allows offline re-running of the trigger selection with different Hypo cuts. The collection of Chains implements the trigger menu • written in Python or XML and recorded in the Trigger Configuration Database. The steering was used to select events in the 2008 cosmic data taking. A. Di Mattia, MSU
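The steering concepts above (sequences of FEX and Hypo algorithms, early rejection, prescale and passthrough handling, caching of features) can be illustrated with a small Python sketch. The class and function names below are hypothetical, chosen only to mirror the concepts on the slide; they are not the actual HLT steering interfaces.

```python
import random

class Step:
    """One selection step: a feature-extraction (FEX) algorithm followed by a
    hypothesis (Hypo) algorithm that confirms or rejects the trigger element."""
    def __init__(self, name, fex, hypo):
        self.name, self.fex, self.hypo = name, fex, hypo

class Chain:
    """A trigger chain: ordered steps, plus prescale and passthrough factors."""
    def __init__(self, name, steps, prescale=1, passthrough=0.0):
        self.name, self.steps = name, steps
        self.prescale, self.passthrough = prescale, passthrough

    def execute(self, roi, cache):
        # Prescale: only 1 in N seeded chains is actually processed.
        if random.randrange(self.prescale) != 0:
            return False
        features = []
        for step in self.steps:
            # Cache FEX results so a feature shared between chains is
            # extracted only once per event (keyed by step name and RoI id).
            key = (step.name, id(roi))
            if key not in cache:
                cache[key] = step.fex(roi, features)
            feature = cache[key]
            features.append(feature)
            if not step.hypo(feature):
                # Early rejection: abort the chain at the first failing step,
                # unless the chain is forced through for monitoring.
                return random.random() < self.passthrough
        return True

# Minimal usage example with toy FEX/Hypo functions (purely illustrative).
muon_chain = Chain(
    "L2_MU20",
    steps=[
        Step("muFast", fex=lambda roi, _: {"pt": roi["pt"]},
             hypo=lambda f: f["pt"] > 20.0),
        Step("muComb", fex=lambda roi, fs: {"pt": 0.5 * (roi["pt"] + fs[0]["pt"])},
             hypo=lambda f: f["pt"] > 20.0),
    ],
)
cache = {}
print(muon_chain.execute({"pt": 25.0}, cache))  # True: both hypos pass
```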

  8. Trigger Configuration • The trigger configuration comprises: • the active trigger Chains, algorithm parameters, prescale factors and passthrough fractions. • Relational database (TriggerDB) with no duplication of objects: • four database keys: LVL1 & HLT menu, L1 prescales, HLT prescales, bunch number; • user interface (TriggerTool); • reads and writes menus in XML format; • menu consistency checks. • 2-month cosmic commissioning: over 3k chains, 6k components (algorithms, tools, services), 5k parameters (counting all versions of all objects). • After a run, the trigger configuration becomes conditions data. • Run control can change the complete menu at any run stop/start, and prescales/passthroughs at any luminosity-block boundary. • A database proxy mechanism is in place to avoid direct connections from every application. The database was exploited in the 2008 data taking both online and offline. A. Di Mattia, MSU
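As an illustration of how a menu key plus a prescale key might be resolved into the active configuration, here is a minimal Python sketch. The dictionary layout, key values and chain names are invented for the example and do not reproduce the TriggerDB schema.

```python
# Toy model of resolving a trigger configuration from database keys.
# The slide lists four keys: LVL1+HLT menu, L1 prescales, HLT prescales and
# bunch number.  The contents below are invented for illustration.
MENUS = {
    1: {"chains": ["L2_MU20", "L2_e10", "EF_mu20"]},
}
HLT_PRESCALE_SETS = {
    7: {"L2_MU20": 1, "L2_e10": 50, "EF_mu20": 1},
}

def resolve_configuration(menu_key, hlt_prescale_key):
    """Combine a menu with a prescale set into the active chain list.
    A chain missing from the prescale set (or with a non-positive prescale)
    is treated as disabled in this toy model."""
    menu = MENUS[menu_key]
    prescales = HLT_PRESCALE_SETS[hlt_prescale_key]
    active = {}
    for chain in menu["chains"]:
        ps = prescales.get(chain, -1)
        if ps > 0:
            active[chain] = ps
    return active

# Prescales can change at a luminosity-block boundary by switching only the
# prescale key, while the menu key stays fixed for the whole run.
print(resolve_configuration(menu_key=1, hlt_prescale_key=7))
```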

  9. The Online monitoring Details given by A. Corso Radu, Y. Ilchenko, P. Renkel, A. Dotti (OHP tool, trigger presenter). Trigger Presenter (TriP): • provides rate information and farm status by accessing the online Information System; • displays instantaneous trigger rates per selection chain / level and history plots; • allows for fast reaction to unexpected beam/detector conditions. Algorithm online monitoring: • algorithms produce histograms for shifters and experts; • statistics from the Processing Units are collected by the Online Histogram Presenter (OHP); • automatic checks are performed on a subset of histograms for data-quality assessment. Exercised by the shifter crews; also provided feedback to detector experts during the early phases of the 2008 data taking. [Plot: distance between the track from LVL1 and the muon hits in the precision chambers selected by Level-2 — allows checking the time synchronization between the muon trigger and the muon precision detectors.] A. Di Mattia, MSU
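The automatic checks on a subset of histograms mentioned above can be sketched as a simple comparison of an online histogram against a reference shape. The chi-square-like test and threshold below are illustrative choices, not the actual OHP / data-quality algorithms.

```python
# Minimal sketch of an automatic histogram check for shifter-level data
# quality: compare bin-by-bin contents of an online histogram with a reference.
def chi2_per_bin(observed, reference):
    """Pearson-like chi2 per bin between two equal-length bin-content lists."""
    assert len(observed) == len(reference)
    chi2, nbins = 0.0, 0
    for o, r in zip(observed, reference):
        if r > 0:
            chi2 += (o - r) ** 2 / r
            nbins += 1
    return chi2 / max(nbins, 1)

def check_histogram(observed, reference, threshold=2.0):
    """Return 'OK' or 'ALERT' depending on the agreement with the reference."""
    return "OK" if chi2_per_bin(observed, reference) < threshold else "ALERT"

# Toy example: a single hot bin (as in the hot-cell case discussed on a later
# slide) drives the check into the ALERT state.
reference = [100, 102, 98, 101, 99]
online    = [103, 100, 420, 97, 102]   # third bin is "hot"
print(check_histogram(online, reference))  # ALERT
```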

  10. The Offline monitoring Tier0 [1600 cores, 2 GB/core, CERN batch workers]: designed to reconstruct all events (~200 Hz) from ATLAS within 1 day; allows review of the saved trigger quantities (used extensively) and comparison with offline (some tools still in development). CAF (CERN Analysis Facility) [400 cores, 64 for trigger]: designed to rerun ~10% of the collected events for calibration and commissioning. • Deployment of new code to the HLT farm: separate patch branch of the trigger code with its own nightlies; tests with real data at the CAF. • Checks of the HLT decision: run on the minimum-bias stream and on events taken in passthrough mode; deep monitoring of the algorithm functionality. • Handling of the debug stream: events with HLT errors and timeouts (~2.5% of the total events collected in the Sept-Oct run, but well below 0.1% from Oct onwards; expected to be much lower for real collisions). A. Di Mattia, MSU

  11. First experience with LHC beam [Diagram: LHC beam instrumentation around ATLAS — beam loss monitors; tertiary collimators at 140 m, producing beam-splash events when closed; BPTX beam pick-ups at 175 m.] • Stability and reliability were the priority for the first beam. • Simple trigger configuration (LVL1 decision only). • Crucial to have the LVL1 triggers (BPTX & MBTS) well timed in. • HLT used only for tagging events and routing them to data streams. • The HLT was re-run offline on events having Muon and Calo RoIs in time with BPTX or MBTS: • little statistics available (fewer than 1k events), due to the short operation and non-pointing tracks. Details given by C. Ohm - The ATLAS beam pick-up based timing system, and by T. Pauly - The ATLAS Level-1 Central Trigger System in Operation. Minimum Bias Trigger Scintillator (MBTS): 32 sectors on the LAr cryostats. Operating conditions: • Pixel off; • Muon System and Silicon Detector at reduced HV; • other detectors on. A. Di Mattia, MSU

  12. Cosmic running and events collected • HLT provided streaming and event selection for the detectors. • Track selection for the ID and Muon systems, needed for alignment and calibration. • Fast turnaround exercised to accommodate detector requirements. [Plot: events collected — few cosmic runs before Sept. 2008; first time the Pixel Detector was included in a global run; events mostly selected by the LVL1 Muon trigger.] A. Di Mattia, MSU

  13. LVL1 Muon algorithm Searches for TGC (endcap) or RPC (barrel) hit patterns compatible with tracks coming from the Interaction Point, using coincidence windows. [Diagram: barrel and endcap muon trigger layouts — RPC trigger chambers and MDT precision chambers in the barrel, TGC trigger chambers and MDT in the endcap.] A. Di Mattia, MSU
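The coincidence-window idea — requiring hits compatible with a track pointing back to the interaction point — can be illustrated with a short sketch. The window sizes, plane layout and strip coordinates below are placeholders, not the real RPC/TGC geometry or logic.

```python
# Minimal sketch of a coincidence-window trigger decision: given a hit in the
# pivot plane, require confirming hits within a window in the other planes.
# Positions are abstract strip coordinates; the window widths (low-pT wider,
# high-pT narrower) are illustrative only.
def coincidence(pivot_hit, confirm_hits, window):
    """True if at least one confirming hit lies within +/- window of the
    position extrapolated from the pivot hit (straight-line pointing)."""
    return any(abs(h - pivot_hit) <= window for h in confirm_hits)

def lvl1_muon_decision(pivot_hit, low_pt_plane_hits, high_pt_plane_hits):
    low_pt = coincidence(pivot_hit, low_pt_plane_hits, window=4.0)
    high_pt = low_pt and coincidence(pivot_hit, high_pt_plane_hits, window=2.0)
    return {"low_pt": low_pt, "high_pt": high_pt}

# A nearly pointing track fires both thresholds; a displaced (cosmic-like)
# pattern fails the tighter high-pT window.
print(lvl1_muon_decision(10.0, low_pt_plane_hits=[9.5, 30.0],
                         high_pt_plane_hits=[11.0]))
print(lvl1_muon_decision(10.0, low_pt_plane_hits=[13.5],
                         high_pt_plane_hits=[16.0]))
```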

  14. Issues on cosmic event data No tracks from the Interaction Point: the selected tracks are distributed over d0 and z0 • the track selection was unbiased in the r-z view for most of the runs. No beam clock: the timing is provided by the Muon Trigger Chambers • phase issues in the read-out/calibration of the trigger and precision muon chambers (MDT), the transition radiation tracker (TRT), etc. [Diagram: Muon Spectrometer RPC trigger setup — pivot plane, low-pT confirm plane and high-pT confirm plane.] Muon algorithms: the r-z view could not be fully reconstructed at L2, because the algorithms are designed for pointing tracks and data access happens in trigger towers pointing to the Interaction Point; it is possible to relax the pointing requirement to study efficiency / rejection. Inner Detector tracking: significant modifications were needed to get the tracks required for Inner Detector alignment. A. Di Mattia, MSU

  15. Cosmic run: use of the physics menu Details on the Tau trigger given by M. Dam - The ATLAS Tau Trigger. • Despite the low expected statistics, a full physics menu ran in parallel to the cosmic chains: • eγ, jets/missing ET, τ, μ, minimum bias… • RoIs with eγ, τ, etc. signatures are not very common with cosmics, and it is rarer still for events to reach the end of the chains. • A few thousand events were selected. • Both L2 and EF algorithms were exercised successfully. [Plot: example from the eγ FEX algorithms comparing L2 and EF — shower shape in the 2nd EM sampling, Rη = E(3×7)/E(7×7).] A. Di Mattia, MSU
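For reference, the shower-shape variable Rη = E(3×7)/E(7×7) quoted above is the ratio of the energy in a 3×7 cell window to that in a 7×7 window of the second EM sampling. The sketch below shows the calculation on a toy cell grid; the grid size, indexing and energies are illustrative only.

```python
# Minimal sketch of the shower-shape variable R_eta = E(3x7) / E(7x7):
# energy in a 3x7 (eta x phi) cell window around the hottest cell, divided by
# the energy in a 7x7 window.  'cells' is a 2D list of cell energies in the
# second EM sampling; the grid and indexing here are illustrative.
def window_energy(cells, i0, j0, n_eta, n_phi):
    """Sum cell energies in an n_eta x n_phi window centred on (i0, j0)."""
    total = 0.0
    for di in range(-(n_eta // 2), n_eta // 2 + 1):
        for dj in range(-(n_phi // 2), n_phi // 2 + 1):
            i, j = i0 + di, j0 + dj
            if 0 <= i < len(cells) and 0 <= j < len(cells[0]):
                total += cells[i][j]
    return total

def r_eta(cells, i0, j0):
    e77 = window_energy(cells, i0, j0, 7, 7)
    return window_energy(cells, i0, j0, 3, 7) / e77 if e77 > 0 else 0.0

# Toy example: a narrow electromagnetic-like cluster gives R_eta close to 1.
grid = [[0.0] * 9 for _ in range(9)]
grid[4][4], grid[3][4], grid[5][4] = 50.0, 10.0, 10.0
print(round(r_eta(grid, 4, 4), 3))  # 1.0 for this fully contained toy cluster
```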

  16. LVL2 Calo: HLT feedback to the detector • Hot cells in the eta region around 0.475 are seen both by the HLT monitoring and by the detector monitoring (hardware issue addressed during the shutdown). Plots normalized to the counts in the bin at 0.475. • Cross-checking is possible: the Calo trigger is functional and may help identify hot detector regions. [Plots: HLT online monitoring vs detector monitoring per partition (half of the EM eta space); energies in MeV; ATLAS preliminary.] Details given by D. Damazio – ATLAS High Level Calorimeter Trigger Software Performance for Cosmic Ray Events. A. Di Mattia, MSU

  17. HLT muon: commissioning with cosmics [Event display of a cosmic event, run 90272.] Description of the muon trigger algorithms given by A. Ventura - The Muon High Level Trigger of the ATLAS experiment. The high statistics available exercised several algorithms: • at L2: μFast, μIso, TileRODMu; • at EF: TrigMuonEF. μFast has operated since the very beginning, serving data for the Muon Spectrometer calibration and for online Data Quality (for details on remote MDT calibration see A. De Salvo - ATLAS MDT remote calibration centers). Re-run on data for cross checks: • basic distributions (track position, calo noise / m.i.p. signal) against Monte Carlo predictions; • track-finding efficiency studies at L2; • studies of the muon system alignment at L2. A. Di Mattia, MSU

  18. LVL2 muon: μFast MDT cluster finding [Plot: MDT cluster residual (w.r.t. the TGC seed) vs number of TGC hits used as seed.] • Cluster-finding efficiency: 93% (design goal: 99%): • 4.7% inefficiency due to missing MDT data (noisy chamber EMS5A14 missing); • 2% inefficiency due to bad MDT calibration (drift-space-from-time conversions giving bad, unphysical radii). A. Di Mattia, MSU
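The drift-space-from-time step mentioned above converts the measured MDT drift time into a drift radius through an r-t relation before the segment fit. The sketch below uses a simple linear r-t relation as a stand-in for the real per-chamber calibration, just to show where a bad calibration or an unphysical conversion enters the cluster-finding inefficiency.

```python
# Sketch of the drift-time -> drift-radius conversion used before the MDT
# segment fit.  A linear r-t relation is used here as a stand-in for the real
# per-chamber calibration; the tube radius and maximum drift time below are
# ballpark MDT figures, not calibration constants.
TUBE_RADIUS_MM = 14.6
MAX_DRIFT_TIME_NS = 700.0

def drift_radius(drift_time_ns, t0_ns=0.0):
    """Convert a drift time (after subtracting the tube t0) to a radius.
    Returns None for unphysical conversions, which would count towards the
    cluster-finding inefficiency quoted on the slide."""
    t = drift_time_ns - t0_ns
    if t < 0.0 or t > MAX_DRIFT_TIME_NS:
        return None                       # bad conversion (unphysical)
    return TUBE_RADIUS_MM * t / MAX_DRIFT_TIME_NS

# A wrong t0 (bad calibration) shifts or invalidates the measured radius.
print(drift_radius(350.0))                # ~7.3 mm with a good t0
print(drift_radius(350.0, t0_ns=400.0))   # None: unphysical after a bad t0
```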

  19. L2 muon: calorimeter algorithm (ATLAS Preliminary, cosmic Monte Carlo) • Algorithm implemented in the CALO ROD DSP. • Poor efficiency (<< 1%) due to the lack of pointing. • A back-to-back distribution is seen. • The energy deposition agrees with that of a m.i.p. [Plots: run 91060 — energy deposition and angular distributions of muon tracks in the Tile Calorimeter.] A. Di Mattia, MSU

  20. Muon Event Filter The TrigMuonEF algorithm was exercised on cosmic data. • Solenoidal and toroidal fields on. • Resolutions measured with respect to Offline. Angular resolutions: σ(η) = 0.007, σ(φ) = 17 mrad. [Plots: φ resolution and η resolution.] A. Di Mattia, MSU

  21. L2 ID Tracking The earliest tracking possible at L2 (the TRT can be read out at LVL1 for cosmics). • Three L2 tracking algorithms: • SiTrack: combinatorial search for track seeds in the innermost Si layers and their extension into tracks in the outer Si layers; a Si algorithm with TRT extension. • IDSCAN: uses histogramming techniques to find the z-position of the IP and identify tracks originating from there (a simple sketch of the histogramming idea is shown below); a Si algorithm with TRT extension. • TRTSegFinder: a TRT-only algorithm looking for segments in the TRT. Goal: record as many ID tracks as possible, do not introduce biases in the selection, and keep the rate at acceptable levels. Secondary goal: to the extent possible, use the machinery, setup, algorithms, etc. that are used for collisions. A. Di Mattia, MSU
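The histogramming idea behind the IDSCAN-style z-vertex finding can be sketched in a few lines: pairs of silicon space points are extrapolated to the beam axis, the resulting z values are histogrammed, and the most populated bin gives the z position of the interaction point. The binning, hit coordinates and selection below are illustrative, not the real algorithm parameters.

```python
# Minimal sketch of histogram-based z-vertex finding (IDSCAN-like idea):
# extrapolate pairs of (r, z) silicon space points to r = 0, histogram the
# resulting z values, and take the most populated bin as the vertex position.
def z_at_beamline(hit_inner, hit_outer):
    """Extrapolate the straight line through two (r, z) space points to r = 0."""
    (r1, z1), (r2, z2) = hit_inner, hit_outer
    return z1 - r1 * (z2 - z1) / (r2 - r1)

def find_z_vertex(hit_pairs, z_range=(-200.0, 200.0), bin_width=5.0):
    nbins = int((z_range[1] - z_range[0]) / bin_width)
    histogram = [0] * nbins
    for pair in hit_pairs:
        z = z_at_beamline(*pair)
        b = int((z - z_range[0]) / bin_width)
        if 0 <= b < nbins:
            histogram[b] += 1
    peak = max(range(nbins), key=histogram.__getitem__)
    return z_range[0] + (peak + 0.5) * bin_width

# Toy event: most hit pairs point back to z ~ 40 mm, one pair is random.
pairs = [((50.0, 45.0), (100.0, 50.0)),
         ((40.0, 44.0), (90.0, 49.0)),
         ((60.0, 47.0), (120.0, 54.0)),
         ((50.0, 10.0), (100.0, 90.0))]
print(find_z_vertex(pairs))  # close to 40 mm
```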

  22. L2 ID Tracking: performance Details on ID tracking given by M. Sutton - Commissioning the ATLAS Inner Detector Trigger. Details on the data streams given by B. Pinto - Alignment data streams for the ATLAS Inner Detector. • A trigger chain starting from all L1-accepted events, accepting the event if any (OR) of the L2 tracking algorithms finds tracks. • Allowed the collection of a good fraction of the cosmic muons passing through the inner detector, with no significant biases. Performance: • uniform event efficiency of >99% for “golden Si” tracks; • fake rates 0.01%-1%; • the algorithms are complementary. Rerun 1 month later: HLT tracking worked out of the box, despite some changes in the detector configuration! A. Di Mattia, MSU

  23. Conclusions The HLT system was fully exercised: all of the HLT infrastructure (steering, monitoring, data streaming, L2 & EF algorithms) was shown to work under actual data-taking conditions, and the physics menu ran in parallel to the cosmic slice. The HLT performed event selection: L2 ID tracking provided the data for detector alignment, and L2 muon served data for online detector Data Quality and Monitoring. HLT operation was robust: HLT commissioning was performed while serving the subsystems, providing a good balance between stability and responsiveness to detector conditions and requests. A lot of understanding was gained, driving the commissioning work further. A. Di Mattia, MSU

  24. Backup (not to be shown at CHEP) A. Di Mattia, MSU

  25. 10^31 menu HLT timing • Timing studies were done in “technical runs”, where MC events are injected into the HLT. [Plots: L2 and EF processing-time distributions.] A. Di Mattia, MSU

  26. First beam !!! A. Di Mattia, MSU

  27. The TriggerTool One tool for the shifters, experts and offline users. Offline users can easily get read-only access using Java Web Start. The trigger shifter can modify prescales and passthroughs. Experts can modify all aspects of the trigger configuration. A. Di Mattia, MSU

  28. Definitions [Diagram: definition of the muon track angles — the angles α and θ at the IP are measured with respect to the η = 0 direction and with respect to the pointing direction, shown in the r-z and x-y views.] A. Di Mattia, MSU

  29. LVL2 muon: μFast cross check on alignment [Diagram: TGC and MDT Middle (M) and Outer (O) stations.] Cross check of the alignment: match the α angle measurement from the TGC and from the MDT. With a Middle-only fit: • poor match, due to noise and bad MDT calibration. With a Middle+Outer fit: • good match, with the performance limited by the alignment. A. Di Mattia, MSU

  30. LVL2 muon: μFast track reconstruction No magnetic field: straight tracks, used to cross check the pattern recognition and reconstruction. MDT α angle: angle of the MDT fit slope (in the Middle station) w.r.t. the pointing direction. The algorithm assumes tracks pointing to the IP. With 2 out of 3 segments: • high efficiency on non-pointing tracks; • poor performance on the sagitta reconstruction. With 3 out of 3 segments: • low efficiency on non-pointing tracks; • good sagitta reconstruction, with the performance limited by the alignment. A. Di Mattia, MSU

  31. ID Tracking: offline reconstructed tracks • d0 ~ 18 mm • would be rejected by the standard cut of the LVL2 ID tracking (efficient only up to d0 = 1 mm). A. Di Mattia, MSU

  32. L2 ID tracking: adjustments for cosmics • A simple, independent pattern recognition for the Si detectors: • start with hits in the outermost layers and define a cigar-shaped road; if the road contains enough hits, compute the impact parameter for the road and apply a shift in the x-y plane to all Si hits in the detector (a minimal sketch of this road-based idea is shown below). A. Di Mattia, MSU
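Here is a minimal sketch of the road-based cosmic adjustment described above: seed a straight line from hits in the outermost layers, count the hits inside a fixed-width road around it, and, if the road is populated, derive the impact parameter that would be used to shift the Si hits in the x-y plane. All widths, thresholds and hit coordinates are illustrative, not the actual trigger settings.

```python
import math

# Sketch of a cigar-shaped-road pattern recognition for cosmics in the x-y
# plane: take two hits from the outermost layers as a seed line, collect hits
# within a fixed road width around that line, and if the road has enough hits
# compute the line's impact parameter (closest approach to the origin).
def point_line_distance(p, a, b):
    """Perpendicular distance of point p from the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    return abs(dy * (px - ax) - dx * (py - ay)) / math.hypot(dx, dy)

def build_road(seed_a, seed_b, hits, road_width=5.0, min_hits=4):
    """Return (impact_parameter, hits_in_road) or None if the road is sparse."""
    in_road = [h for h in hits if point_line_distance(h, seed_a, seed_b) < road_width]
    if len(in_road) < min_hits:
        return None
    d0 = point_line_distance((0.0, 0.0), seed_a, seed_b)
    return d0, in_road

# Toy cosmic track crossing the detector off-centre: the road picks up the
# aligned hits, and its d0 would be used to shift all Si hits in x-y.
hits = [(-300.0, 250.0), (-150.0, 175.0), (0.0, 100.0),
        (150.0, 25.0), (300.0, -50.0), (80.0, -220.0)]  # last hit is noise
result = build_road(hits[0], hits[4], hits)
if result is not None:
    d0, road_hits = result
    print(f"d0 ~ {d0:.1f} mm, {len(road_hits)} hits in road")
```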

  33. The MDT A. Di Mattia, MSU

  34. Assembling MDT+TGC for the Endcap A. Di Mattia, MSU

  35. The HLT Event Selection Software [Diagram: same HLT Selection Software architecture as slide 6 — the HLT Core Software (Steering, Monitoring Service, HLT Algorithms, Data Manager, Event Data Model, MetaData Service) runs inside the L2PU application and the Event Filter processing tasks, importing Athena/Gaudi, the Event Data Model and the offline reconstruction algorithms via the StoreGate interface.] HLT Selection Software: • framework: ATHENA/GAUDI; • reuses offline components; • common to Level-2 and EF. Integration into the online environment proceeds through steps that assess the same offline physics performance: • run a single-node online emulator (check the RoI-based data access); • run a multi-node partition (full check of the online infrastructure). Performance and functionality tested in technical runs and combined detector runs. A. Di Mattia, MSU

  36. The HLT Steering [Diagram: example of a Level-2 muon chain (L2_MU20), seeded from LVL1 — a Muon RoI feeds μFast (MS reconstruction), producing a Muon Feature checked by the μFast Hypo (Trigger Element A); IDSCAN (ID reconstruction) and μComb (combined reconstruction) produce a Combined Feature checked by the Comb Hypo; μIso (isolation) computes an Isolation Feature checked by the Iso Hypo (Trigger Element B); accepted events are passed on to the EF selection.] Algorithms for extracting features of physics objects from the data produce feature-extraction results (FEX); Trigger Elements (TE) mark the atomic selection steps (Sequences); hypothesis (Hypo) algorithms apply a selection on the FEX, thus confirming the TE. The HLT steering manages the execution of the selection code: • algorithms are configurable by parameters; • it applies early rejection: the full chain is aborted as soon as a selection step fails; • it applies prescale and passthrough factors; • it caches the full history of TEs and FEX and writes them into the HLT result, which: • allows navigation through the steps of the trigger decision; • avoids multiple execution of the same feature extraction; • allows offline re-running of the trigger selection with different Hypo cuts. The collection of Chains implements the trigger menu, written in Python or XML and recorded in the Trigger Configuration Database. The steering was used to select events in the 2008 cosmic data taking. A. Di Mattia, MSU

  37. Cosmic events recorded • Cosmic events recorded as a function of the run number. The plot also shows the status of the magnetic field. A. Di Mattia, MSU
