
Online DAQ System: From Detector to Tape



Presentation Transcript


  1. Online DAQ System: From Detector to Tape. T. Yasuda, Fermilab. NIU Workshop

  2. Overview • Hardware • Control System • Primary/Secondary Data Path • DAQ Applications • DAQ in Action • Conclusions

  3. Overview • The DØ DAQ system is divided into two components: • Trigger system • Level 1 hardware trigger components • Level 2 specialized processors • Level 3 crate readout and software trigger components • Online or Host system • Detector controls • Data Logging • Monitoring • Control room applications

  4. Overview • DAQ Architecture • Event data rate and operational redundancy are achieved by a high degree of parallelism • Level 3 • Host • Capability for multi-user, multi-stream operation • with a central resource configuration manager • Network-centric Host design

  5. DAQ Components [Block diagram: Detector; Trigger and Readout; Controls; NT Level 3; Control Room PCs; Linux PCs; UNIX Servers; FCC]

  6. Hardware Description • 3 Compaq/Digital Alpha Servers • d0ola: Alpha Server 4000, 1 processor, 466 MHz, 500 MB memory • d0olb: Alpha Server 4000, 2 processors, 600 MHz, 500 MB memory • d0olc: to be specced out by Aug 3; probably an Alpha Server ES40, 4 processors, 667 MHz • Clustered / redundant • 500 GB shared RAID disks for online apps and database (mirrored) • 500+ GB local ‘data buffer’ disks, Fibre Channel based (40 MB/s)

  7. Hardware Description • Linux/NT nodes • Buying 6 nodes with dual PIII, 600 MHz, 500 MB memory, 2 graphics cards • 3 Linux nodes and 3 NT nodes exist • Will run VMware on the Linux nodes • Control system embedded 68Ks and PowerPCs (VxWorks) • Network • Cisco 6509 Gigabit Ethernet switch for all FCH nodes • Satellite 100 Mb/s switches in MCH • Gigabit fiber to FCC • Security • Access control filter to online machines • Kerberos-authenticated ssh sessions only

  8. Hardware Database [Diagram: detector control and monitoring via 1553 and Vertical Interconnect, crate readout over Ethernet; EPICS clients (Low Voltage, High Voltage, Rack Monitor, 1553 Devices, SMT Monitor, FT Monitor, etc.); EPICS DB Generator fed from ORACLE; Control Room PCs and UNIX Servers]

  9. Control System • Built upon the EPICS control system • A ‘standard’ toolkit upon which we’ve built DØ extensions • Many user-community-supplied tools • ORACLE Hardware Database • Extract EPICS db from ORACLE • Web-based and batch interfaces • Hardware Control • Dedicated GUI applications for Low Voltage, High Voltage, etc. • Downloading • Registers, pedestals, etc. • Significant Event (alarm) System • Interface to Accelerator and Cryogenics control systems
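
For concreteness, here is a minimal sketch of talking to EPICS from Python over Channel Access. It uses the modern pyepics package rather than any tool shown in the talk, and the process-variable names are invented placeholders.

```python
# Minimal sketch of reading/writing EPICS process variables over
# Channel Access with pyepics (an assumption; not the DO tooling).
from epics import caget, caput

# Read a high-voltage setpoint and its readback (hypothetical PV names)
setpoint = caget("D0:CAL:HV:CRATE01:SETPOINT")
readback = caget("D0:CAL:HV:CRATE01:READBACK")
print(f"HV setpoint={setpoint} V, readback={readback} V")

# Write a new setpoint and wait for the put to complete
caput("D0:CAL:HV:CRATE01:SETPOINT", 1500.0, wait=True)
```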

  10. Status of Control System • Calorimeter • Preamp PS, BLS PS, ADC PS controls exist • Pulser control (in progress) • SMT • EPICS records for Sequencer, Sequencer control, VRB, VRBC, Emulator (in progress) exist and are used in the test stands • Muon • Tested communication over 1553 with PDT, MDT, SRC cards • FPD • Used RM support to control motors

  11. Status of Control System • Luminosity • Scalers and FE processing results communicated to the Accelerator via ACNET • Cryo • Communicated with the DMAX system • Common Tools • Generic 1553 support • Generic VME support • HV used in SMT, Muon, Lum • V1 running for months, V2 work starting • Diagnostic support for buses • Standard operator interface (GUI)

  12. Hardware Database • Describes control aspects of the electronics • Based on ORACLE • 2 instances of the database (dev, user testing) • Web-based interface for entering, modifying, and deleting records • Python script for batch entries exists • Calorimeter records are in the database
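
A hedged sketch of what such a batch-entry script might look like. The table, columns, connection string, and credentials are all invented, and cx_Oracle stands in for whatever Oracle interface the real Python script used.

```python
# Sketch of a batch loader for the hardware database.
# Schema and DSN below are hypothetical.
import csv
import cx_Oracle

conn = cx_Oracle.connect("hwdb_writer", "secret", "d0ora-dev")
cur = conn.cursor()

# Each CSV row describes one electronics record (hypothetical columns)
with open("cal_adc_cards.csv") as f:
    rows = list(csv.DictReader(f))

cur.executemany(
    "INSERT INTO hw_records (crate, slot, card_type, serial_no) "
    "VALUES (:crate, :slot, :card_type, :serial_no)",
    rows,
)
conn.commit()
conn.close()
```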

  13. Hardware Database

  14. Significant Event (Alarm) System • System to detect alarm conditions and state changes in the DAQ system • Server with DAQ components as clients • COOR sends alarm and run control messages • CR, DL, DD send alarm messages • Version 1 display exists • Working on version 2 display (summer student) • Need to integrate EPICS alarms into the system (Fall 2000) • On the IOC, EPICS alarms -> ITC client • ITC client sends alarms to the server on the host • EPICS Alarm Handler can be used for now
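
A sketch of what a significant-event client could send. The real system uses DØ's ITC layer, whose API is not shown in the talk; a plain TCP socket carrying one JSON line per message stands in for it, and the host, port, and message fields are assumptions.

```python
# Hypothetical significant-event client; ITC replaced by a raw socket.
import json
import socket
import time

SES_HOST, SES_PORT = "d0ol-ses", 5710  # hypothetical server address

def send_alarm(name, severity, text):
    """Deliver one alarm message; real clients also send heartbeats."""
    msg = {"type": "alarm", "name": name, "severity": severity,
           "text": text, "time": time.time()}
    with socket.create_connection((SES_HOST, SES_PORT)) as s:
        s.sendall((json.dumps(msg) + "\n").encode())

send_alarm("CAL/HV/crate01", "MAJOR", "HV trip on channel 12")
```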

  15. Front-End Significant Event System [Diagram: front-end filters (F) pass filtered SE messages and periodic heartbeats to the Significant Event Server, which serves the Display, the Archiver, an HV Control Fault Watcher, and Run Control (COOR) for run suspend]

  16. Configuration & Run Control [Diagram: Detector with 1553 controls and Vertical Interconnect; trigger and readout crates (L1, L2, TCC) with data cables to L3 VRC, L3 Supervisor, and L3 Filter on NT Level 3; Run Control Client on Control Room PCs; Ethernet to Collector/Router, COOR, Data Logger, COMICS, Data Distributor, EXAMINE, DSM, and RIP on UNIX servers and Linux PCs; disk in FCC]

  17. Software Description • Configuration Management and Run Control • Coordination (COOR) • User interface (TAKER) • Download manager (COMICS) • Primary event path • DAQ State Manager (DSM) • Collector / Router • Data Logger • Event metadata manager (SAM) • Event data manager (enstore) • Secondary event path • Secondary DAQ Supervisor • Data Merger

  18. Software Description • Event monitoring • Data Distributor • Analysis applications (EXAMINE) • DAQ Monitoring • Client/Server access to DAQ flow statistics, trigger rates, etc • Detector Monitoring • Front End active & parasitic monitors • Calibration • Client/Server interface to database • Infrastructure • Databases (ORACLE) • Task-to-task communication (ITC)

  19. Secondary DAQ Data Flow [Diagram: detector controls/readout crates (1553 bus, VME bus) with an EPICS CA server; Data Merge as EPICS CA client over a CA link; Collector/Router to Data Logger (disk, RIP) and Data Distributor to EXAMINE and Examine GUI; Monitor GUI; ITC connections spanning Control Room PCs, Linux PCs, UNIX servers, and FCC]

  20. Secondary DAQ System • Alternative data path • Mainly used for monitoring and calibration • Takes advantage of powerful front-end processors • Uses the same data path as the primary path after Data Merger

  21. DAQ Monitor • Monitors the status of DAQ subsystems (L1/2, CR, DL, DD) • Collects statistics information from the subsystems • C++ ITC server with Python display clients
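
A sketch of a display client polling the monitor server for flow statistics. Again the real transport is ITC; a simple request/response over TCP returning one JSON line stands in, and the host, port, and field names are invented.

```python
# Hypothetical DAQ-monitor display client; polls and prints a rate.
import json
import socket
import time

def get_stats(host="d0ol-daqmon", port=5720):  # hypothetical address
    with socket.create_connection((host, port)) as s:
        s.sendall(b"GET STATS\n")
        return json.loads(s.makefile().readline())

prev = get_stats()
while True:
    time.sleep(5.0)
    cur = get_stats()
    # Event rate from the change in the logged-event counter
    rate = (cur["events_logged"] - prev["events_logged"]) / 5.0
    print(f"Data Logger rate: {rate:.1f} events/s")
    prev = cur
```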

  22. DAQ Monitor

  23. Event Monitoring: EXAMINE • Samples and reconstructs events based on stream IDs and trigger IDs • Clients of the Data Distributor • Network and file event transfer modes work • Calorimeter EXAMINE • used for preamp testing • CFT EXAMINE • Getting ready for raw data unpacking • MC packed data? • SMT EXAMINE • used for SiDet data • Muon EXAMINE • used for commissioning
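
An illustrative sketch of the selection an EXAMINE client requests from the Data Distributor: only events from a given stream that fired one of the requested triggers. The dict-based event record is a made-up stand-in for the real raw-data format.

```python
# Hypothetical stream-ID / trigger-ID event selection.
def select_events(events, stream_id, trigger_ids):
    """Yield events in `stream_id` that fired any of `trigger_ids`."""
    wanted = set(trigger_ids)
    for evt in events:
        if evt["stream"] == stream_id and wanted & set(evt["triggers"]):
            yield evt

sample = [
    {"stream": "CAL", "triggers": [3, 17]},
    {"stream": "MUO", "triggers": [5]},
    {"stream": "CAL", "triggers": [8]},
]
for evt in select_events(sample, "CAL", trigger_ids=[17, 42]):
    print(evt)  # only the first sample event passes
```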

  24. Event Monitoring: EXAMINE [Diagram: detector trigger and readout crates (L1, L2, TCC) with data cables to L3 VRC, L3 Supervisor, and L3 Filter on NT Level 3; Ethernet to Collector/Router and Data Logger (disk) on UNIX servers; Data Distributor feeds EXAMINE and an express line to ROOT clients on Control Room PCs and Linux PCs]

  25. Event Monitoring: EXAMINE • Need: • L3 EXAMINE • Vertex EXAMINE • Preshower EXAMINE • Planned improvements • Histoscope -> ROOT after the NIU workshop • On-the-fly histograms • e-browser • Framework improvement • Name server for accessing only the histograms of interest
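
Since the move from Histoscope to ROOT is planned, here is a minimal PyROOT sketch of an on-the-fly histogram: book it, fill it as events arrive, and draw it. The histogrammed quantity is a made-up stand-in.

```python
# Minimal on-the-fly histogram with PyROOT.
import random
import ROOT

h_adc = ROOT.TH1F("h_adc", "Calorimeter ADC counts;ADC;events",
                  100, 0.0, 1000.0)

for _ in range(10000):                      # stand-in for the event loop
    h_adc.Fill(random.gauss(400.0, 50.0))   # fill as events arrive

c = ROOT.TCanvas("c_examine", "EXAMINE")
h_adc.Draw()
c.SaveAs("adc_counts.png")
```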

  26. Online Event Display

  27. Online Calibration • Perform electronics calibration of sub-detectors and insert results into the ORACLE database • Controlled by COOR via TAKER • Common server and database interface for all sub-detectors • Calibration results transmitted as special event messages through DAQ paths • Current status • Successfully ran SMT calibration at the 1% test stand and the NW test stand
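
A sketch of the arithmetic behind such an electronics calibration, assuming the usual definitions: the pedestal is the mean ADC response with the pulser off, and the gain is the least-squares slope of response versus pulser setting. All numbers below are invented.

```python
# Pedestal and gain from pulser-run data (hypothetical values).
from statistics import mean

def pedestal(samples):
    """Mean ADC count with the pulser off."""
    return mean(samples)

def gain(pulser_dac, responses, ped):
    """Least-squares slope of (response - pedestal) vs pulser setting."""
    mx = mean(pulser_dac)
    my = mean(r - ped for r in responses)
    num = sum((x - mx) * ((r - ped) - my)
              for x, r in zip(pulser_dac, responses))
    den = sum((x - mx) ** 2 for x in pulser_dac)
    return num / den

ped = pedestal([101, 99, 100, 102, 98])
g = gain([0, 50, 100, 150], [100, 350, 600, 850], ped)
print(f"pedestal = {ped:.1f} ADC, gain = {g:.2f} ADC/DAC count")
```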

  28. Calibration Manager [Diagram: TAKER configures the Calibration Manager and requests runs through COOR (start run, end run); the Calibration Manager requests COMICS downloads of pedestals and gains to the crates, processes crate data in the Calib. Data Processor, validates comparison results, and reads/writes pedestals and gains in the Calibration Database via the Database Interface; a Display shows results]

  29. Online DAQ in Action • Electronics/DAQ Commissioning • 2 VRB crates with 10 cards each, synchronized with SCL from TFW to L3 • L1 muon crate • 1 Muon Scint crate with 2 MRCs • 1 Muon PDT crate with 1 MRC • 1 Calorimeter crate • Combinations of 2 systems done • but not with MCH2+MCH3 • 2 simultaneous runs done • 3 simultaneous runs require one more L3 node or script runner

  30. Online DAQ in Action • SMT Test Stands • 1% and NW test stands • 1 HDI, 1 Sequencer, 1 VRB, 1 VRBC, 1 VBD • Download done with COMICS and database • 10% test stand • 3 HDIs, a few Interface Boards, a few Sequencers, a few VRBs, 1 VRBC, 1 1553 controller, 1 VBD + L3 • Download done by spreadsheet for now • Databases (Electronics) exist for all three stands • Calibration run performed at 1% and NW test stands

  31. Online DAQ in Action • Commissioning Run • Two detectors installed for the upcoming Commissioning Run • Run I Luminosity scintillation counters • Forward Proton Detector • Both detectors will be read out using the Run II Online system. • Data will be transferred to and from the Accelerator Controls System via the EPICS/ACNET Gateway.

  32. Conclusions • All of the DAQ components exist and function. • Improvements are implemented daily following user suggestions. • We have been intimately involved in daily commissioning activities for the past few months. • Bring in your sub-detectors!!
