The ATLAS Level-1 Central Trigger Processor

On behalf of P. Borrego Amaral 1), N. Ellis 1), P. Farthouat 1), P. Gallno 1), J. Haller 1), A. Krasznahorkay 1)2), T. Maeno 1), T. Pauly 1), H. Pessoa Lima Jr. 3)4), I. Resurreccion Arcas 1), G. Schuler 1), J. M. de Seixa 3), R. Spiwoks 1), R. Torga Teixeira 1), T. Wengler 1)

1) CERN, Switzerland  2) University of Debrecen, Hungary  3) Federal University of Rio de Janeiro, Brazil  4) Brazilian Center for Physics Research, Brazil

Presented by Thilo Pauly, CERN-PH
The ATLAS Experiment
• General-purpose experiment at CERN’s Large Hadron Collider.
• Observes collisions of two pulsed high-energy, high-intensity proton beams.
• Bunch crossings every 25 ns (40 MHz).
• Trigger system: reduce the rate while keeping the interesting events.
The Trigger-DAQ System
• LVL1 (this presentation): custom hardware operating at the 40 MHz bunch-crossing rate; latency < 2.5 μs, during which event data are held in pipeline memories in the detector read-out electronics. LVL1 accept rate: 75 kHz. Defines Regions of Interest (RoIs).
• LVL2: event selection in software on RoI data requested from the Readout System (ROS); ~10 ms per event; LVL2 accept rate ~3 kHz.
• Event Building and Event Filter: full event data; ~ seconds per event; EF accept rate ~200 Hz to storage.
The Level-1 Trigger System
• Synchronous, pipelined processing at 40 MHz.
• Calorimeter path: Pre-processor, Cluster Processor (e/γ and τ/h), Jet/Energy Processor.
• Muon path: Barrel and End-cap Muon Triggers, combined in the Muon-CTP-Interface.
• Both paths feed the Central Trigger Processor (CTP); results go to the detector front-ends/read-out.
• LTP = Local Trigger Processor (standard sub-detector interface to the CTP).
• TTC = Timing, Trigger & Control; Busy = tree of Busy modules.
• ATLAS has about 40 TTC partitions.
Central Trigger Processor – Functionality
• Trigger inputs:
• Multiplicities from the Calorimeter and Muon Triggers for e/γ, τ/hadron, jets, and muons
• Energy flags from the Calorimeter Trigger: ∑ET, ETmiss, ∑ETjet
• Calibration requests from sub-detectors
• Specialized triggers: beam pick-ups, etc.
• Up to 160 trigger inputs in total at any one time
• Internal triggers from the CTP Core Module:
• Random triggers
• Pre-scaled clock
• Bunch-crossing groups
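The internally generated triggers can be modelled as decisions taken once per 25 ns bunch crossing. The following is a toy sketch only; the bunch-group contents, prescale value, and random rate are invented for illustration and are not real ATLAS settings.

```python
import random

# Toy model of the CTP's internal triggers, evaluated once per bunch
# crossing (identified by its BCID). All values below are illustrative.

BUNCH_GROUP = {1, 2, 3, 100, 101, 102}   # BCIDs with colliding bunches (invented)
CLOCK_PRESCALE = 1000                    # prescaled clock: fire every Nth BC
RANDOM_RATE = 1e-5                       # per-BC firing probability (invented)

def internal_triggers(bcid, rng=random.random):
    """Return which internal triggers fire for this bunch crossing."""
    return {
        "random": rng() < RANDOM_RATE,
        "prescaled_clock": bcid % CLOCK_PRESCALE == 0,
        "bunch_group": bcid in BUNCH_GROUP,
    }

# BCID 100 is in the (invented) bunch group, so that trigger fires:
print(internal_triggers(100, rng=lambda: 1.0)["bunch_group"])  # → True
```

Passing `rng` explicitly makes the random trigger deterministic for testing; in a real system the random trigger provides unbiased monitoring samples independent of the physics selections.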
CTP – Functionality (2)
• Level-1 Accept (L1A):
• Derived from the trigger inputs according to the trigger menu:
• Up to 256 trigger items are made from combinations of up to 256 conditions on the trigger inputs, e.g. 1EM10 = “at least one e/γ with ET ≥ 10 GeV”, XE20 = “missing energy of at least 20 GeV”
• Each trigger item has a mask, a priority, and a pre-scale factor
• Example: 2EM10 AND XE20, mask=ON, priority=LOW, prescale=100
• L1A = OR of all trigger items
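The item logic above can be sketched in software. This is a simplified model, not the actual CTP firmware (which evaluates items in FPGA logic every 25 ns); the class and field names are illustrative.

```python
# Simplified software model of CTP trigger-item evaluation: each item
# combines a condition on the trigger inputs with a mask and a prescale
# factor, and the Level-1 Accept is the OR of all items. Illustrative only.

class TriggerItem:
    def __init__(self, condition, mask_on=True, prescale=1):
        self.condition = condition   # function: trigger inputs -> bool
        self.mask_on = mask_on       # masked-off items never fire
        self.prescale = prescale     # keep 1 out of N passing events
        self._count = 0

    def fires(self, inputs):
        if not self.mask_on or not self.condition(inputs):
            return False
        self._count += 1             # count events passing the condition
        return self._count % self.prescale == 0

def level1_accept(items, inputs):
    # L1A = OR of all items (evaluate every item so prescale counters advance)
    return any([item.fires(inputs) for item in items])

# Example menu: "2EM10 AND XE20" prescaled by 100, plus an unprescaled muon item
menu = [
    TriggerItem(lambda x: x["n_em10"] >= 2 and x["xe20"], prescale=100),
    TriggerItem(lambda x: x["n_mu6"] >= 1),
]
print(level1_accept(menu, {"n_em10": 2, "xe20": True, "n_mu6": 1}))  # → True
```

Note the deliberate list inside `any(...)`: a generator would short-circuit and stop advancing the prescale counters of later items once one item fires.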
CTP – Functionality (3)
• Additional functionality:
• Trigger Type word (8 bits) accompanying every L1A
• Dead-time generation to prevent front-end buffers from overflowing
• Information for the Level-2 Trigger
• Event data for the read-out and monitoring
• Scalers for monitoring
• Constraints:
• Trigger latency budget: 100 ns (4 clock ticks) from trigger input to L1A output
• The trigger menu changes with physics/beam/detector conditions.
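The dead-time function can be illustrated with a leaky-bucket model, a common scheme for bounding the number of accepts in any time window and so protecting finite front-end buffers. The bucket depth and drain rate below are invented for illustration, not the actual CTP configuration.

```python
# Leaky-bucket dead-time sketch: each L1A adds a token to a bucket of
# limited depth, and the bucket drains one token at a fixed interval.
# When the bucket is full, further accepts are vetoed (dead-time), which
# bounds the burst rate seen by the front-end buffers. Illustrative only.

class LeakyBucketDeadtime:
    def __init__(self, depth=8, drain_interval_bc=400):
        self.depth = depth                      # max accepts "in flight"
        self.drain_interval_bc = drain_interval_bc
        self.level = 0
        self.bc = 0

    def tick(self, l1a_requested):
        """Advance one bunch crossing; return True if the L1A is issued."""
        self.bc += 1
        if self.bc % self.drain_interval_bc == 0 and self.level > 0:
            self.level -= 1                     # bucket leaks one token
        if l1a_requested and self.level < self.depth:
            self.level += 1
            return True
        return False                            # vetoed: dead-time

dt = LeakyBucketDeadtime(depth=2, drain_interval_bc=10)
accepted = sum(dt.tick(True) for _ in range(20))  # request an L1A every BC
print(accepted)  # → 4: two immediate accepts, then one per drain interval
```

Even under a continuous trigger request, the accept rate settles to the drain rate once the bucket is full, which is exactly the buffer-protection behaviour wanted.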
CTP Design
The CTP is housed in a single 9U VME64x crate with custom backplanes: the PIT bus (pattern-in-time), the CAL bus (calibration requests), and the COM bus (common signals). Modules on the VME bus:
• CTP_MI: machine interface; receives the bunch clock (BC) and orbit signals.
• 3× CTP_IN: receive the trigger inputs (4× SPD [30..0] each) and drive the PIT bus.
• CTP_MON: bunch-to-bunch monitoring of the PIT bus.
• CTP_CORE: applies the trigger menu; provides the Level-2 and read-out data.
• 4× CTP_OUT: trigger fan-out/busy fan-in, 5× LTP links each.
• CTP_CAL: calibration requests via a patch-panel; beam pick-up and other additional inputs.
CTP – Trigger Path
• CTP_IN modules receive, synchronize and align1) the trigger inputs, and route them to the PIT bus.
• CTP_CORE receives the PIT signals, applies the trigger menu to generate the Level-1 Accept (L1A), and sends the L1A to the COM bus.
• CTP_OUT modules receive the L1A from the COM bus and fan it out to the sub-detector LTPs (a common solution for all sub-detectors).
1) See Poster #1010 presented by Ralf Spiwoks, “The ATLAS Level-1 Trigger Timing Setup”.
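The trigger path above is a fixed-latency synchronous pipeline clocked at the 40 MHz bunch-crossing rate. The sketch below models it with invented per-stage latencies; only the 4-tick (100 ns) total corresponds to the budget quoted earlier, and the actual split of the budget across modules is not given here.

```python
from collections import deque

# Toy fixed-latency pipeline for the CTP trigger path:
# CTP_IN -> PIT bus -> CTP_CORE -> COM bus -> CTP_OUT,
# one tick = one 25 ns bunch crossing. Stage latencies are illustrative.

class Stage:
    def __init__(self, name, latency_bc):
        assert latency_bc >= 1
        self.name = name
        self.fifo = deque([None] * latency_bc)  # pipeline registers

    def tick(self, value):
        # One bunch crossing: shift this stage's pipeline by one slot.
        self.fifo.append(value)
        return self.fifo.popleft()

def pipeline_tick(stages, value):
    for stage in stages:
        value = stage.tick(value)
    return value

stages = [Stage("CTP_IN", 1), Stage("CTP_CORE", 2), Stage("CTP_OUT", 1)]
outputs = []
for t in range(6):
    outputs.append(pipeline_tick(stages, "L1A" if t == 0 else None))
print(outputs.index("L1A"))  # → 4 ticks, i.e. 100 ns at 40 MHz
```

Because every stage has a fixed depth, an input pattern presented in bunch crossing N always produces its L1A in bunch crossing N+4, which is what allows the front-end pipeline memories to be read out at a known offset.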
The CTP at the Combined Testbeam
Combined test-beam in 2004, with periods of 25 ns structured particle beam, to test prototypes and final modules of all ATLAS sub-detectors with the full trigger and data-acquisition chain.
The CTP at the Combined Testbeam
Trigger inputs to the CTP (CTP_MI, CTP_IN, CTP_MON, CTP_CORE, CTP_OUT modules):
• Calorimeter Trigger (Common Merger Modules): 4× 3-bit e/γ multiplicities, 4× 3-bit jet multiplicities, 1 bit total ET
• Muon Trigger (Barrel and End-cap Sector Logic via the Muon-CTP-Interface): 6× 3-bit muon multiplicities
• Test-beam-specific scintillators: 3× 1 bit
The LTPs fan the trigger out to the sub-detectors.
The CTP at the Combined Testbeam
(Crate view: CTP_MI, CTP_IN, CTP_MON, CTP_CORE and CTP_OUT modules.)
Some Results
• Trigger generation:
• 46 signals from the PIT bus were used to form 18 trigger items
• Prescaling and masking verified to work correctly
• L1A used as the trigger for read-out of the combined sub-detectors
• Latency measurements:
• Measured the latency between the reference scintillator and the L1A as it arrives at the Muon Trigger read-out. Projection (cable-length and time-of-flight corrections) for the final ATLAS latency: 2.13 μs (budget: 2.5 μs)
• CTP latency (budget: 100 ns):
• 125 ns at the testbeam (non-optimized timing)
• 95 ns in the lab after further optimization
Conclusions
• The CTP was successfully tested during the test-beam, generating triggers from 46 trigger inputs and 18 trigger items.
• The CTP latency is measured to be 95 ns.
• Work is continuing in the laboratory (read-out, additional firmware, software, monitoring, GPS-based time-stamp).
• The CTP will be available for ATLAS commissioning in September 2005.
Backup Slides
CTP – Timing & Control Signal Path
• The CTP_MI module receives timing signals (bunch clock, orbit) from the LHC, generates additional timing signals, and sends them all to the COM bus.
• CTP_OUT modules receive busy signals from the sub-detector LTPs and send them to the COM bus; they also receive the clock signal from the COM bus and fan it out to the sub-detectors.
• All CTP modules receive their timing signals from the COM bus.
CTP – Read-out and Monitoring
• CTP_MON produces bunch-by-bunch histograms of the signals on the PIT bus.
• CTP_CORE sends Region-of-Interest (RoI) information to the Level-2 Trigger and event data to the Read-out System.
• All CTP modules provide monitoring data over the VME bus.
CTP – Calibration Requests
• CTP_OUT modules receive calibration requests from the sub-detector LTPs and send them to the CAL bus.
• CTP_CAL time-multiplexes the calibration requests, and also receives additional trigger inputs (e.g. beam pick-up) which it forwards to a CTP_IN module.