Commissioning and initial experience with the ALICE on-line
V. Altini, T. Anticic, F. Carena, W. Carena, S. Chapeland, V. Chibante Barroso, F. Costa, E. Dénes, R. Divià, U. Fuchs, I. Makhlyueva, F. Roukoutakis, K. Schossmaier, C. Soós, T. Kiss, P. Vande Vyvre, B. von Haller
For the ALICE Collaboration
CHEP 2009, Prague, March 2009
Outline
• ALICE experiment
• ALICE online systems
• Detector Commissioning
• Global Commissioning
• Lessons from 2008
• Goals for 2009
• Conclusion
ALICE @ LHC Point 2
[Map: the ALICE site at LHC Point 2, CERN, CH]
ALICE: A Large Ion Collider Experiment at CERN-LHC
• General-purpose heavy-ion detector
• Study of strongly interacting matter and the quark-gluon plasma in nucleus-nucleus collisions at the LHC
• Detector designed to cope with the highest particle multiplicities anticipated for Pb-Pb reactions
• A complex experiment for the online systems:
  • 18 detectors (the last one being installed now)
  • 10 detectors providing input to the trigger
  • 2 types of beams: pp and Pb-Pb
  • A 3-level hardware trigger and a firmware/software High-Level Trigger
  • Multiple partitions and multiple detector clusters
• Not surprisingly: complex commissioning!
ALICE: 18 detectors, 5 online systems. Size: 16 x 26 meters. Weight: 10,000 tonnes.
Trigger – DAQ – HLT ’08
[Architecture diagram] The CTP sends the L0, L1a and L2 trigger signals through the LTUs and the TTC system to the detector front-end electronics (FERO), with BUSY fed back to the CTP and a Rare/All flag returned from the storage side. Event fragments leave the detectors over 360 DDLs into 430 D-RORCs hosted in 125 detector LDCs; 120 DDLs carry a copy to the HLT farm (H-RORC/FEP), whose output returns over 10 DDLs into 10 D-RORCs in 10 HLT LDCs. The LDCs assemble sub-events and ship them, load-balanced by the EDM, over the event-building network to 30 GDCs, which build complete events; 20 DA/DQM and 18 DSS machines provide monitoring and services. Event files pass through the storage network to 25 TDS servers and are archived on tape (PDS) in the Computing Centre (Meyrin).
Trigger – DAQ – HLT ’09
[Same architecture, with upgraded capacity: 90 GDCs (was 30), 30 TDSM nodes (was 10), 40 DA/DQM machines (was 20), 75 TDS servers (was 25).]
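To make the fragment / sub-event / event chain above concrete, here is a minimal, hypothetical Python sketch of the two building roles. The function names and data layout are invented for illustration; this is not the actual ALICE DAQ software (the DATE suite).

```python
# Hypothetical sketch of the event-building chain described above.
from collections import defaultdict

def build_sub_events(fragments):
    """LDC role: group event fragments arriving on local DDLs by event ID."""
    sub_events = defaultdict(list)
    for event_id, ddl_id, payload in fragments:
        sub_events[event_id].append((ddl_id, payload))
    return sub_events

def build_events(sub_events_per_ldc):
    """GDC role: merge sub-events shipped by all LDCs into full events."""
    events = defaultdict(list)
    for ldc_id, sub_events in sub_events_per_ldc.items():
        for event_id, frags in sub_events.items():
            events[event_id].append((ldc_id, frags))
    return events  # one entry per fully built event, ready for the TDS

# Example: two LDCs, each seeing fragments of events 1 and 2
ldc_a = build_sub_events([(1, "ddl-01", b"..."), (2, "ddl-01", b"...")])
ldc_b = build_sub_events([(1, "ddl-02", b"..."), (2, "ddl-02", b"...")])
full_events = build_events({"ldc-a": ldc_a, "ldc-b": ldc_b})
```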
Control Logical Model
ECS: Experiment Control System
DCS: Detector Control System
CTP: Central Trigger Processor
DAQ: Data Acquisition
HLT: High Level Trigger
[Diagram: the ECS sits on top of, and coordinates, the DCS, CTP, DAQ and HLT.]
Detector in Standalone Mode
DCA: Detector Control Agent
LTU: Local Trigger Unit
[Diagram, TPC example: under the ECS, the TPC DCA drives the TPC DCS (HV, LV, gas), the TPC LTU, the DAQ Run Control (LDC 1, LDC 2, …, LDC 216) and the HLT TPC farm.]
Global Partition
DCA: ECS Detector Control Agent
PCA: ECS Partition Control Agent
[Diagram: one PCA coordinates the per-detector DCAs (SPD, TPC, Muon); each DCA drives its own detector's DCS, LTU, DAQ and HLT.]
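A toy sketch of this two-level control hierarchy, assuming simplified states. The class names and states are invented for illustration; the real ECS is built from SMI++ state machines, not plain classes like these.

```python
# Toy PCA/DCA hierarchy; names and states are illustrative only.
class DCA:
    """Detector Control Agent: drives one detector's DCS, LTU, DAQ, HLT."""
    def __init__(self, detector):
        self.detector = detector
        self.state = "STANDALONE"

    def prepare(self):
        # Configure DCS, LTU, DAQ and HLT for this detector.
        self.state = "READY"
        return self.state

class PCA:
    """Partition Control Agent: coordinates the DCAs of one partition."""
    def __init__(self, dcas):
        self.dcas = dcas

    def start_run(self):
        # A global run can start only when every detector is READY.
        if all(dca.prepare() == "READY" for dca in self.dcas):
            return "RUNNING"
        return "ERROR"

pca = PCA([DCA("SPD"), DCA("TPC"), DCA("Muon")])
print(pca.start_run())  # -> RUNNING
```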
Data Quality Monitoring
The DQM framework is ready; it is not yet used routinely by all detectors.
eLogBook
Electronic logbook: run statistics with filtering on all fields; online status display of TRG and DAQ running conditions.
eLogBook: from replacing the paper logbook to a powerful data-mining tool.
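As an illustration of the "filter on all fields" data mining mentioned above, a small hypothetical Python sketch. The run schema and field names are invented; the real eLogBook is a database-backed web application, not this in-memory filter.

```python
# Hypothetical eLogBook-style filtering over run statistics (invented schema).
runs = [
    {"run": 38795, "detectors": ["SPD"], "trigger": "L0", "duration_h": 2.0},
    {"run": 40000, "detectors": ["TPC", "TOF"], "trigger": "cosmic", "duration_h": 12.5},
]

def filter_runs(runs, **criteria):
    """Keep runs whose fields match all of the given criteria."""
    return [r for r in runs
            if all(r.get(k) == v for k, v in criteria.items())]

print(filter_runs(runs, trigger="cosmic"))  # -> the 12.5 h cosmic run
```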
ALICE Commissioning 2007–09
[Timeline, Dec 2007 – Mar 2009: installation & commissioning, then 24/7 operation, then installation & upgrade]
• 1st global run: 10–21 Dec 2007
• 2nd global run: 4 Feb – 9 Mar 2008
• 3rd global run: 5 May – 20 Oct 2008
• First particles from the machine: 15 Jun 2008
• Injection tests: 8 Aug and 24 Aug 2008
• 1st circulating beam: 10 Sep 2008
• Helium leak incident: 19 Sep 2008
Detector Commissioning
• Standalone commissioning
  • Services: cabling, cooling, power supplies, etc.
  • Detector hardware, firmware, software
  • Interfaces to the online systems
• Online commissioning process
  • Control procedure up to the state "Ready for Data Taking"
  • Exposure to a variety of triggers (cosmic, pulser up to 40 MHz, random)
  • Data-taking stability tests
  • Calibration procedure
  • Integration into the global partition
Global Commissioning
• Exercise stability
  • Detectors, services, online systems
  • 23 systems (18 detectors and 5 online systems): even if each system is available 97% of the time, all 23 are ready at once less than half the time, since 0.97²³ < 0.5 (see the sketch below)
• Measure performance
  • Detector readout time and event size
  • Online systems
• Take data
  • Cosmic data for detector alignment
  • Detector calibration (e.g. TPC calibration with krypton or laser)
• Organize operation
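The availability figure in the list above can be checked directly, assuming (as the slide does) independent systems each ready 97% of the time:

```python
# The slide's availability argument: 23 independent systems at 97% each.
p_single = 0.97          # assumed per-system availability
n_systems = 23           # 18 detectors + 5 online systems
p_all = p_single ** n_systems
print(f"P(all 23 ready) = {p_all:.3f}")  # ~0.496, i.e. below 50%
```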
ALICE Control Room
[Photos: day… and night]
Cosmic Run I (Dec ’07)
Global runs over 2 weeks; 14 detectors participating; 10–60 hours of data taking; 1–20 × 10⁶ triggers
Cosmic Run I
• TOF: comparison of the hit time distribution for ACORDE and random triggers
• SPD: comparison of the number of clusters per event in the 2 SPD layers
Cosmic Run II (Feb–Mar ’08)
Global runs over 5 weeks; 13 detectors participating; 60–150 hours of data taking; 2–50 × 10⁶ triggers
Cosmic Run II
• TPC: cluster distribution; resolution over drift length
• Muon Tracking: online data monitoring vs offline analysis
Cosmic Run III (May–Oct ’08)
Global runs over 23 weeks; 16 detectors participating; 60–1250 hours of data taking; 5–3500 × 10⁶ triggers
Cosmic Run III
[Plots] SPD: alignment, before vs after; SSD: p-n charge correlation; SDD: drift speed calibration vs position
Global Runs
Global runs with all detectors are still difficult to achieve.
System Stability
The stability of global runs improved substantially during the cosmic runs.
ALICE SPD: the first LHC "event"
On 15 June 2008 the ALICE SPD, in self-triggering mode (L0), saw one of the first "signs of life" of the LHC during the beam injection test in TI2. Run 38795, time 18:10.
Lessons from 2008
• Services
• Detectors:
  • Work in progress on firmware and software: readout time and zero suppression (event size)
  • Noise and grounding
• Online systems:
  • Control scalability: some systems isolated and the load distributed
  • CPU needed for data formatting: the late decision to format data into a reconstruction-ready format required more CPU than anticipated; the price to pay for allowing single-pass offline analysis
  • Spurious triggers: fixed during the cosmic run
  • Global sequences of detector control still being defined; not yet fully automated
• But overall: a successful commissioning of the whole experiment, ready for startup in September ’08
Goals for 2009… and after
• Ready for a nominal data-taking year: 10 months pp + 1 month HI
• Full deployment of the DAQ system: from 40% to 100% performance
• Increase of HLT CPU power
• Improve feedback to the shift crew (DQM deployment)
• Reduce the size of shift crews:
  • Group detectors
  • Automation of atomic operations (configuration, calibration)
  • Automation of global planning
• Central system for the configuration/archiving of all the trigger detectors and the corresponding trigger processors
Conclusion
• ALICE became a reality after almost 15 years of design and installation
• Commissioning of the detectors and online systems lasted from Dec ’07 to Sep ’08: work-intensive!
• The online systems contributed to detector commissioning, alignment, and calibration
• The experiment was ready to start with beam in September ’08
• It will be ready again in September ’09!