XFEL Large Pixel Detector DAQ
Project Team
• Technical Team: STFC Rutherford DAQ, Glasgow University, Surrey University
• Science Team: UCL, Daresbury, Bath University, others …
Project Outline
• Phase 1: Develop a digitising pipelined XFEL detector (1k by 1k pixels); 3-year project given approval Dec 2007
• Phase 2: Construct complete XFEL instruments as required before 2013: mass-produce electronics, match the XFEL DAQ
Phase 1 Detector: 1M pixels, 4 × 4 super modules
XFEL ASIC
• New design matched to the XFEL: time structure, dynamic range, channel count, system interfaces, etc.
XFEL ASIC Blocks
• Preamplifiers and pixel resetting
• Dynamic range stages and overload control
• ADC stages
• Sequencing and control
• Deep pipeline memory (786,432 samples): store in the pipeline during the bunch train, read out during the long gap (a sketch of this cycle follows below)
• I/O to DAQ
• Power supplies and power supply conditioning
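The store-during-train / read-out-during-gap cycle can be pictured with a minimal software model. This is purely an illustrative sketch, not the ASIC design: the 512-samples-per-pixel depth is an assumption borrowed from the data-rate slide, and PixelPipeline is a hypothetical name.

```python
# Illustrative sketch only (not the ASIC design): samples are written into a
# deep pipeline memory during the bunch train and drained during the long
# inter-train gap. The 512-deep per-pixel buffer is an assumption taken from
# the "1M pixels x 512" data-rate slide; PixelPipeline is a hypothetical name.
from collections import deque

PIPELINE_DEPTH = 512  # assumed samples stored per pixel per train

class PixelPipeline:
    def __init__(self, depth=PIPELINE_DEPTH):
        self.store = deque(maxlen=depth)  # oldest samples overwritten if full

    def sample(self, adc_value):
        """Store one sample per bunch during the train."""
        self.store.append(adc_value)

    def readout(self):
        """Drain the memory during the long gap between trains."""
        data = list(self.store)
        self.store.clear()
        return data

# One 10 Hz cycle: fill during the ~600 us train, drain during the gap.
pipe = PixelPipeline()
for bunch in range(PIPELINE_DEPTH):
    pipe.sample(adc_value=bunch % 4096)
train_data = pipe.readout()
assert len(train_data) == PIPELINE_DEPTH and not pipe.store
```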
XFEL Structure
• Electron bunch trains: up to 3000 bunches in 600 µs (200 ns bunch spacing), repeated 10 times per second, producing 100 fs X-ray pulses (up to 30,000 bunches per second).
• Each 100 ms cycle is therefore a 600 µs train followed by a 99.4 ms gap: ~30,000 bunches/s, but the machine is empty 99.4% of the time.
• Data are sampled to memory during the train, then serialised and transmitted to the DAQ during the gap (the arithmetic is checked in the sketch below).
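The numbers on this slide are self-consistent, as a few lines of arithmetic confirm (all inputs are from the slide; the variable names are ours):

```python
# Check the XFEL time structure quoted above (all inputs from the slide).
train_bunches  = 3000     # bunches per train (up to)
train_length_s = 600e-6   # 600 us bunch train
cycle_period_s = 100e-3   # 10 Hz repetition -> 100 ms cycle

print(train_length_s / train_bunches * 1e9)   # 200.0   ns bunch spacing
print(train_bunches * 10)                     # 30000   bunches per second
gap_s = cycle_period_s - train_length_s
print(gap_s * 1e3)                            # ~99.4   ms inter-train gap
print(gap_s / cycle_period_s * 100)           # ~99.4   % of the cycle empty
```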
Multi-gain Concept
• Provides the required dynamic range compression
• Builds on experience with calorimetry at CERN
• Relaxes ADC requirements
• Fits with CMOS complexity (see the gain-selection sketch below)
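A minimal sketch of how multi-gain dynamic range compression works, assuming three illustrative gain ratios and a 12-bit ADC (neither is specified on this slide): each sample is captured at several gains and the highest unsaturated gain is kept, so a wide input range fits a modest ADC.

```python
# Sketch of the multi-gain idea, not the LPD ASIC's actual scheme.
# The gain ratios and the 12-bit ADC full scale are illustrative assumptions.
GAINS = [100.0, 10.0, 1.0]   # high, medium, low gain (assumed ratios)
ADC_MAX = 4095               # assumed 12-bit ADC full scale

def select_gain(signal):
    """Return (gain_index, adc_code) using the highest unsaturated gain."""
    for idx, gain in enumerate(GAINS):
        code = int(signal * gain)
        if code <= ADC_MAX:
            return idx, code  # gain flag + ADC code ~ "2 bytes" per sample
    return len(GAINS) - 1, ADC_MAX  # overload: clamp at lowest gain

# A small signal keeps high gain (fine resolution); a large one falls
# through to low gain (extended range).
print(select_gain(3.5))      # (0, 350)
print(select_gain(2000.0))   # (2, 2000)
```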
Detector Summary
• Phase 1: common super module design, economic mass production, eases test and maintenance, scalable DAQ
• Phase 2: mechanical design, large-scale replication, industrial technology
LHC Example: tracker view showing a Higgs decay to four muons
The CMS Tracker
• ~210 m² of silicon, 10M channels
• 75,000 FE chips, 40,000 optical links
• 15,000 modules mass-produced using automatic assembly techniques
• Hybrids and assembly at CERN; FE ASIC (APV25) design at RAL
• Radiation environment: ~10 Mrad ionising, ~10¹⁴ hadrons cm⁻²
[Photos: inner barrel layer, rod insertion, APV25, CERN assembly, petal assembly]
The CMS Tracker DAQ
• Collaboration with Imperial College and CERN
• Massively parallel processing: ~30 VME crates, >500 cards, >20,000 BGAs
• 10 Terabits/s
• 15 exabytes of raw input per year!
Project Management
• One overall project manager, reporting to XFEL and providing information as required
• Each work package run as a subproject in our QA system
• Approved ISO 9000 system: formal design review processes, drawing and record control
The First 3 Years
• Included: sensor proving tests at LCLS, XFEL ASIC development, build and test of the 1 Mpixel system
• Excluded: off-detector DAQ, sensor R&D
Work Packages
• WP1: Sensors
• WP2: Front End Electronics
• WP3: Mechanical Design
• WP4: On Detector Electronics
• WP5: Data Acquisition
• WP6: Software, controls and integration
WP5: Data Acquisition • Leader – John Coughlan
DAQ Approach
• Up to now, effort has been concentrated on the Mechanics / Sensors / ASICs work packages rather than readout.
• We intend to exploit Commercial Off The Shelf (COTS) equipment where practical (e.g. FPGA development boards, vendor/commercial FPGA cores):
• Xilinx Virtex-5 system-on-chip: RocketIO serial data links, embedded Ethernet cores, embedded PowerPCs, fast memory interfaces, Embedded Development Kit
• Industry-standard interface protocols (GEthernet, PCIe, sFPDP …) rather than custom
• Develop a scalable system that carries through to the final detectors.
DAQ XFEL Integration • Stay Flexible to adapt to FE ASIC and common XFEL DAQ Architecture specifications. • DAQ/Timing/Trigger Interface Standards & Protocols need to be agreed with XFEL DAQ group. • STFC/Rutherford has a long history of successful partnerships with DESY on Particle Physics projects (e.g. H1 DAQ)
DAQ Experience
• STFC has a large established base of DAQ hardware and firmware expertise across a wide variety of projects (particle physics, X-ray, neutron).
• STFC has a proven track record in the delivery and long-term support of large-scale readout systems (e.g. H1 DAQ, CMS Silicon Tracker).
DAQ On Detector
• Module Support Cards (MSCs): FPGA gain selection
• FEMs: firmware for data formatting, sample selection and traffic shaping; switch inputs / farm FPGA
• COTS: FPGA development boards (Xilinx Virtex-5 + EDK, PPC); FPGA-to-PC cores (Qx UDP)
• Electrical LVDS links
• The 3-year plan includes only elements local to the detector.
Data Rate Challenge
• 1M pixels × 512 samples × ~2 bytes × 10 Hz ≈ 10 GB/s
• Protocol overheads; fixed-length fragments? Data selection, sparsification?
• => 1 terabyte recorded every ~2 minutes! (the arithmetic is spelled out below)
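Spelling the slide's arithmetic out (all inputs from the slide; variable names are ours):

```python
# The data-rate arithmetic from the slide, spelled out.
pixels         = 1_000_000   # 1 Mpixel detector
samples        = 512         # samples stored per pixel per train
bytes_per_samp = 2           # ~2 bytes per sample (ADC code + gain bits)
trains_per_s   = 10          # 10 Hz bunch trains

rate_bytes = pixels * samples * bytes_per_samp * trains_per_s
print(rate_bytes / 1e9)        # ~10.2 GB/s raw, before protocol overheads
print(1e12 / rate_bytes / 60)  # ~1.6 minutes per terabyte recorded
```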
Modularity (1 Mpixel)
• Each FEM fragment = 128 KB per sample (64 MB per train)
• 128 KB × 512 × 10 Hz = 640 MB/s per FEM
• At 80 MB/s per link: 8 links / FEM (checked in the sketch below)
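The same check for the per-FEM link budget (all numbers from the slide):

```python
# Link-count arithmetic from the modularity slide.
fragment_kb  = 128    # FEM fragment per sample
samples      = 512    # samples per train
trains_per_s = 10     # 10 Hz
link_mb_s    = 80     # per-link throughput quoted on the slide

fem_mb_s = fragment_kb * samples * trains_per_s / 1024   # KB -> MB
print(fem_mb_s)                       # 640.0 MB/s out of each FEM
print(fem_mb_s / link_mb_s)           # 8.0 links needed per FEM
# Per train each FEM buffers 128 KB x 512 before the gap readout:
print(fragment_kb * samples / 1024)   # 64.0 MB
```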
DAQ EoI: Off-Detector Event Builder
• Advanced Telecommunications (ATCA) architecture: COTS carriers + AMC mezzanines, or MicroTCA
• E.g. ATCA crate for surface and nuclear science, AGATA (Daresbury & Padova): ATCA card with FPGA-to-PC PCIe readout, 6 TB sFPDP disk storage and servers
WP6: Software, controls and integration • Leader – Tim Nicholls
Software, Controls & Integration
• Delivery and operation of the detector in beam-line environments
• System operation, timing and controls
• Software development and integration with beam-line scientists
• Real-time data monitoring and analysis
• Team of integration engineers; the leader has experience on the DESY H1 experiment
Summary
• Phase 1: Develop a digitising pipelined XFEL detector (1k by 1k pixels); 3 years starting Jan 2008
• Phase 2: Construct complete XFEL instruments as required before 2013: mass-produce electronics, match the XFEL DAQ architecture