Fysikalisk Systemteknik
Christian Bohm

Overview:
• About the group
• Overview of projects
• What is an FPGA?
• Our major projects
Fysikalisk Systemteknik
Personnel:
• Professor Christian Bohm
• Lecturer Sam Silverstein
• Part-time lecturer Magnus Engström
• Adjunct lecturer Eddie Ahlestedt
• Research engineer (emeritus) Hans Eriksson
• PhD students: Jonas Klereborn, Abdelkader Bousselham, Attila Hidvegi, Florian Bauer (external)

New instrumentation physics: experimental physics + technology. We collaborate with experimental physics groups, focusing on the development of new instruments. The aim is to make it easier to develop and maintain useful engineering skills while retaining an active grasp of the relevant physics, to use experience from earlier projects to solve new problems, and to look for general solutions and methodologies that are easier to carry over to new problems.
Different instrumentation projects:
• In collaboration with particle physics, SU: RD-16 FERMI and RD-27 first-level trigger; digitizing electronics for ATLAS TileCal; the Jet/Energy-sum processor for the ATLAS first-level trigger
• In collaboration with physicists at KI: development of a SPECT camera; development of a PET camera
• In collaboration with molecular physics, SU: frequency stabilisation of semiconductor lasers for laser trapping and cooling of atoms
• In collaboration with astroparticle physics, SU: participation in the development of IceCube
What is a Field Programmable Gate Array?
The firmware is the content of a configuration memory, which controls the logic and the data-path switches.
[Figure: FPGA architecture – a configuration memory (a long bit pattern) controls the Configurable Logic Blocks, the programmable interconnect and the programmable I/O blocks.]
[Figure: the configuration memory pattern defines the circuit – one bit pattern maps the generic fabric onto a specific network of gates and flip-flops (signals a–h, clock clk).]
[Figure: modifying the memory content changes the circuit – a different bit pattern maps the same fabric onto a different logic network.]
FPGAs have been around since the mid-1980s. Early components were programmed at the bit level using graphical editors. Increasing complexity required better methods: high-level hardware description languages (VHDL) or schematic specifications.
When designing complex circuits with FPGAs one has to consider:
• Does the design fit?
• Is it fast enough?
• Is it too expensive?
FPGA design process:
• High-level description (VHDL)
• Functional simulation (no timing)
• Select FPGA type
• Synthesize – translate to simple primitives (compilation)
• Simple timing simulation
• Place and route
• Full timing simulation
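As an illustration of the first step, a minimal VHDL sketch (a hypothetical example, not taken from any of the group's designs) of the kind of registered logic shown in the FPGA figures above:

```vhdl
-- Minimal illustrative VHDL: a small combinational function registered on a clock,
-- the kind of description synthesis maps onto LUTs and CLB flip-flops.
library ieee;
use ieee.std_logic_1164.all;

entity small_logic is
  port (
    clk        : in  std_logic;
    a, b, c, d : in  std_logic;
    e          : out std_logic
  );
end entity small_logic;

architecture rtl of small_logic is
  signal f : std_logic;
begin
  f <= (a and b) or (c and not d);   -- combinational logic (mapped to LUTs)

  process (clk)
  begin
    if rising_edge(clk) then
      e <= f;                        -- registered output (CLB flip-flop)
    end if;
  end process;
end architecture rtl;
```

Synthesis maps the combinational expression onto lookup tables and the clocked process onto a flip-flop; the later steps in the flow then verify timing on the placed-and-routed result.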
State-of-the-art FPGAs
• Very complex (Xilinx, Altera)
• ~8 million gates
• ~800 I/O pins (400 differential)
• Flexible interconnect – ~7 metal layers
• High cost – ~50 kSEK
• Multiple clocks – 8
• Embedded memories – 4 MB
• Embedded multipliers – 200
• Embedded processors – 4 PowerPC cores
• High-speed I/O – 16 x 3 Gb/s serial links
Efficient tools are required:
• Re-use of previously developed code blocks – Intellectual Property (IP) blocks
• IP-blocks can be in-house developed, commercially available, or freely available (open-core)
• Part of the design can be accomplished by assembling compatible IP-blocks: processors (embedded or IP), memories, busses, interfaces, etc. (see the instantiation sketch below)
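A hypothetical sketch of what assembling a design from an IP-block looks like in VHDL: the block (here an invented sync_fifo component, not a real core) is declared with the interface supplied by its provider and is simply instantiated and wired up.

```vhdl
-- Illustrative only: "sync_fifo" is an invented IP-block name, not a real core.
library ieee;
use ieee.std_logic_1164.all;

entity top is
  port (
    clk, rst    : in  std_logic;
    din         : in  std_logic_vector(7 downto 0);
    wr, rd      : in  std_logic;
    dout        : out std_logic_vector(7 downto 0);
    empty, full : out std_logic
  );
end entity top;

architecture structural of top is
  component sync_fifo is               -- interface as supplied by the IP provider
    generic (width : integer; depth : integer);
    port (
      clk, rst    : in  std_logic;
      din         : in  std_logic_vector(width-1 downto 0);
      wr, rd      : in  std_logic;
      dout        : out std_logic_vector(width-1 downto 0);
      empty, full : out std_logic
    );
  end component;
begin
  -- The designer only configures and wires up the block; its internals are reused as-is.
  u_fifo : sync_fifo
    generic map (width => 8, depth => 512)
    port map (clk => clk, rst => rst, din => din, wr => wr, rd => rd,
              dout => dout, empty => empty, full => full);
end architecture structural;
```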
When designing complex FPGA modules one must decide:
• What to implement in logic
• What to implement in processor software
• Hardware/software co-design
• VHDL, SystemC or Handel-C
ATLAS data flow
LHC physics looks for rare events – about 1 in 10^14 – which requires high event rates combined with high selectivity. About 100 million channels deliver new data every 25 ns (40 MHz).
• Level 1: data from the entire detector, but with low spatial resolution and reduced dynamic range, from the calorimeters and the muon detector. Since all data must be stored while waiting for the L1 decision, the L1 processing must be quick (the L1 latency is about 2.5 µs). Output rate 75–100 kHz, a reduction of roughly 1 in 500.
• Level 2: data from the regions of interest (ROIs) with high spatial resolution and full dynamic range from all subdetectors; about 1 event in 100 is kept. Output rate ~1 kHz.
• Third level (event filter): the entire detector with high spatial resolution and full dynamic range from all subdetectors; again about 1 event in 100 is kept. Output rate ~10 Hz.
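A quick consistency check on the buffering this implies, using the 2.5 µs L1 pipeline depth quoted for the TileCal digitizer later in the talk: with a new bunch crossing every 25 ns, each channel must hold 2.5 µs / 25 ns = 100 events in its pipeline while the L1 decision is being formed.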
ATLAS first-level trigger (collaboration with particle physics, SU)
The first-level trigger looks for typical features for event selection. The calorimeter trigger and the muon trigger (Italy) send their results to the Central Trigger (CERN), which issues the L1 accept; ROI information is passed on to the level-2 trigger.
The calorimeter trigger is a Birmingham-London-Rutherford-Stockholm-Heidelberg-Mainz collaboration. Starting from the 64x64x2 analog input signals:
• Preprocessor (Heidelberg): digitizes, determines amplitudes and pulse starts.
• Electron/Tau processor (GBR): receives 64x64x2 8-bit data; looks for isolated clusters resembling single electrons/hadrons in the ECAL and HCAL.
• Jet (Stockholm) and missing-energy (Mainz) processor: receives 32x32x2 8-bit data; looks for energy clusters and for energy balance.
Real-time data path:
• Calorimeter (LAr, Tile): analog 0.1 x 0.1 trigger towers, summed and sent to the Pre-Processor.
• Pre-Processor: timing alignment, 10-bit FADC, FIFO, bunch-crossing identification (BCID) via a lookup table, BC-MUX and 2x2 summation; readout through the Pre-Processor RODs (DAQ).
• Cluster Processor (e/gamma and tau/hadron): cluster finding on 0.1 x 0.1 towers; cluster counts go to the Level-1 CTP, ROI data to the Region-of-Interest Builder (L2).
• Jet/Energy-Sum Processor: works on 0.2 x 0.2 jet elements; produces jet counts and the ET, EX, EY sums (total and missing ET) for the Level-1 CTP.
• CP/JEP RODs (DAQ): read out the processor results.
The Jet/Energy-sum trigger
• Looks for 0.4x0.4, 0.6x0.6 and 0.8x0.8 energy clusters centered around a local 0.4x0.4 maximum.
• Forms global sums of total ET and missing ET.
• Processes ~1024 0.2x0.2 jet elements in parallel, all requiring neighborhood information.
• 32 processor boards with large FPGAs for jet and missing-energy processing, sharing overlapping environment data.
• Latency (processing time): 200 ns.
• Many different module types – standardized modules.
The window-summing idea is sketched below.
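A much-simplified sketch (invented names, not the actual JEM firmware) of the basic operation behind the jet algorithm: summing a 2x2 block of 0.2x0.2 jet elements into a 0.4x0.4 window sum and comparing it against a threshold. One such sum, replicated across the grid, is the building block for the larger windows and the local-maximum test.

```vhdl
-- Illustrative only: one 2x2 window sum with threshold compare, pipelined on one clock.
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity window_sum_2x2 is
  port (
    clk                : in  std_logic;
    e00, e01, e10, e11 : in  unsigned(9 downto 0);   -- 0.2x0.2 jet elements
    threshold          : in  unsigned(11 downto 0);
    et_sum             : out unsigned(11 downto 0);  -- 0.4x0.4 window sum
    hit                : out std_logic               -- sum above threshold
  );
end entity window_sum_2x2;

architecture rtl of window_sum_2x2 is
begin
  process (clk)
    variable s : unsigned(11 downto 0);
  begin
    if rising_edge(clk) then
      s := resize(e00, 12) + resize(e01, 12)
         + resize(e10, 12) + resize(e11, 12);
      et_sum <= s;
      if s > threshold then
        hit <= '1';
      else
        hit <= '0';
      end if;
    end if;
  end process;
end architecture rtl;
```

The real firmware additionally has to exchange the overlapping edge elements with neighboring modules, which is exactly what drives the dense backplane described on the next slide.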
The Jet trigger
We have built an 18-layer backplane with >20 000 pins for the Jet and ET processors. It carries VME, communication with neighboring modules, and reporting of results.
The Jet trigger
We have participated in the design of the JEM processor board and have developed firmware for the algorithms and control functions.
Experiences from the trigger project:
• Large-scale system design
• Massive pipelined parallel processing
• Reliability
• Large FPGA designs (>1 Mgates)
• Drawing on experience from an earlier bit-serial trigger project to do pipelined processing of multiplexed data -> more efficient use of logic and interconnects!
The ATLAS TileCal Digitizer (collaboration with particle physics, SU)
Many prototypes, test-beam tests, one of the earliest ATLAS subsystems, lots of firsts, production experience – 2000 boards this year.
The ATLAS TileCal Digitizer
Task: to digitize pre-amplified PMT pulses and to transfer data selected by the L1 trigger to the higher-level triggers.
• 16-bit dynamic range with limited precision
• L1 buffer memory – 2.5 µs
• Storage of selected data
• Format data and send to level 2
• Physical layout
• Noise control
• Radiation tolerance
• Reliability (physical chain – electrical star)
• We also made an optical link with matching reliability
[Block diagram: the digitizer board. The analog part delivers hi-gain and lo-gain signals to 10-bit ADCs (commercial off-the-shelf components, COTS). The digital part is a radiation-tolerant custom ASIC – no FPGA – containing the L1 buffers, event storage, and the formatting and sending of data to the second-level trigger, all driven by the Trigger and Timing Circuit (system clock and level-1 accept).]
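The two gains are what turn 10-bit ADCs into the quoted 16-bit dynamic range: assuming a hi/lo gain ratio of 64 (an illustrative assumption – the ratio is not stated on the slide), 10 bits + log2 64 = 16 bits of range, at the cost of limited precision on large pulses. A minimal sketch of the gain-selection idea (invented names, not the actual ASIC logic):

```vhdl
-- Illustrative only: pick the low-gain sample when the high-gain channel saturates.
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity gain_select is
  port (
    clk      : in  std_logic;
    hi_adc   : in  unsigned(9 downto 0);   -- high-gain 10-bit sample
    lo_adc   : in  unsigned(9 downto 0);   -- low-gain 10-bit sample
    sample   : out unsigned(9 downto 0);   -- selected sample
    gain_bit : out std_logic               -- '1' = low gain was chosen
  );
end entity gain_select;

architecture rtl of gain_select is
  constant SAT_LEVEL : unsigned(9 downto 0) := to_unsigned(1000, 10); -- near full scale
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if hi_adc > SAT_LEVEL then
        sample   <= lo_adc;
        gain_bit <= '1';
      else
        sample   <= hi_adc;
        gain_bit <= '0';
      end if;
    end if;
  end process;
end architecture rtl;
```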
Experiences from the digitizer project:
• Large-scale system design
• System aspects – timing and grounding
• Reliability
• Radiation-tolerant design
• Production
Even though we did not use FPGAs in the digitizer itself, we used them extensively when building prototypes and test benches.
SU – SPECT (collaboration with Karolinska hospital)
The design of a SPECT camera with an innovative cylindrical crystal: 72 PMTs around the crystal, with position determination via light sharing. An earlier design based on transputers was discontinued.
• Pulse detection + sampling ADCs
• Digital pulse processing + digital trigger
• FireWire network
• Xilinx FPGAs
• Texas Instruments DSP – TMS320C6000 family
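The slides do not spell out the position algorithm, but a standard light-sharing (Anger-type) estimate, given here purely as an illustration, is a signal-weighted centroid over the PMTs, \( \hat{x} = \sum_i x_i S_i \,/\, \sum_i S_i \), where \( S_i \) is the integrated pulse seen by the PMT at position \( x_i \); the same idea applies along and around the cylindrical crystal.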
ICE-CUBE (collaboration with the astroparticle physics group at SU)
80 strings with 60 modules per string; the strings start about 1400 m below the surface and are instrumented over about 1000 m, giving a detector volume of roughly 1 km3.
Digital Optical Module (designed by D. Nygren):
• Self-triggers on each pulse
• Captures waveforms
• Time-stamps each pulse
• Digitizes waveforms
• Performs feature extraction
• Buffers data
• Responds to the surface DAQ
• Sets PMT HV, threshold, etc.
• Noise rate in situ: ≤500 Hz
ICE-CUBE: the DOM circuit board
• 2 DOMs share 1 twisted pair for power supply and communication.
• 2 ATWDs (4-channel transient waveform recorders), 300 MHz sampling, 256 samples, 2 channels – hi and lo gain from the PMT.
• Symmetric timing pulses between hub and DOM, sampled at 20 MHz with 10 bits.
• Supports a highly stable local clock, 3.3 ns rms.
• FPGA and CPU combined in a new Altera FPGA.
• Our part: feature extraction.
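A toy sketch (invented names, not the actual DOM firmware) of the kind of feature extraction our part covers: streaming the digitized waveform through the FPGA and pulling out the peak amplitude and the sample index where the pulse first crosses a threshold.

```vhdl
-- Illustrative only: extract peak amplitude and leading-edge sample index
-- from a digitized waveform streamed one sample per clock.
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity feature_extract is
  port (
    clk        : in  std_logic;
    start      : in  std_logic;                -- marks the first sample of a waveform
    sample_in  : in  unsigned(9 downto 0);     -- 10-bit waveform sample
    threshold  : in  unsigned(9 downto 0);
    peak       : out unsigned(9 downto 0);     -- maximum amplitude seen so far
    edge_index : out unsigned(7 downto 0)      -- sample number of first crossing
  );
end entity feature_extract;

architecture rtl of feature_extract is
  signal idx     : unsigned(7 downto 0) := (others => '0');
  signal max_val : unsigned(9 downto 0) := (others => '0');
  signal found   : std_logic := '0';
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if start = '1' then                      -- reset state at the start of a waveform
        idx        <= (others => '0');
        max_val    <= (others => '0');
        found      <= '0';
        edge_index <= (others => '0');
      else
        idx <= idx + 1;
        if sample_in > max_val then            -- running maximum
          max_val <= sample_in;
        end if;
        if found = '0' and sample_in > threshold then
          edge_index <= idx;                   -- latch the first threshold crossing
          found      <= '1';
        end if;
      end if;
      peak <= max_val;
    end if;
  end process;
end architecture rtl;
```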
ICE-CUBE experimental requirements:
• Time resolution: <5 ns rms
• Waveform capture: >250 MHz for the first 500 ns, ~40 MHz for 5000 ns
• Dynamic range: >200 PE / 15 ns, >2000 PE / 5000 ns
• Dead time: <1%
• OM noise rate: <500 Hz (40K in the glass sphere)
[Figure: proposed IceCube DAQ network architecture – DOM pairs feed per-string DOM hubs (60 DOMs, ~20 kB/s each); string processors on a 100BaseT string LAN handle all hits (~0.6 MB/s) plus string-coincidence and lookback messages (~170 kB/s); the event builder and global trigger sit on a 100BaseT event LAN (~1.6 MB/s total traffic, built events ~1 MB/s from all event builders); an online LAN (~1 MB/s) feeds satellite transfer and offline data handling (disk and tape storage).]
Digital Laser Control (collaboration with Anders Kastberg's group; Frexghi Habte)
Aim: to design a simple laser control that can manage a large number of units.
Setup: the laser is frequency-modulated and monitored through an absorption cell and a detector; locking on the maximum of the absorption line corresponds to locking the low-frequency component of the detector signal to zero.
Our solution: an FPGA-based lock-in module. Mixing the detector signal A sin(ωt + φ) with the reference cos(ωt) gives (A/2)[sin(2ωt + φ) + sin φ]; the low-pass filter removes the 2ω term, leaving (A/2) sin φ as the error signal.
A CORDIC algorithm produces the sine and cosine waveforms, and a second-order Butterworth low-pass filter (4 Hz) does the filtering. The hardware design is based on the SPECT module.
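A minimal sketch of the lock-in core (invented names; the CORDIC reference generator is assumed to exist elsewhere in the design, and a simple first-order low-pass stands in for the 4 Hz Butterworth filter):

```vhdl
-- Illustrative only: the core of a digital lock-in detector.
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity lockin_core is
  port (
    clk       : in  std_logic;
    adc_in    : in  signed(11 downto 0);   -- modulated detector signal A*sin(wt+phi)
    ref_cos   : in  signed(11 downto 0);   -- reference cos(wt), e.g. from a CORDIC block
    error_out : out signed(23 downto 0)    -- low-passed product ~ (A/2)*sin(phi)
  );
end entity lockin_core;

architecture rtl of lockin_core is
  signal product : signed(23 downto 0) := (others => '0');
  signal acc     : signed(23 downto 0) := (others => '0');
begin
  process (clk)
  begin
    if rising_edge(clk) then
      product   <= adc_in * ref_cos;                     -- mixing stage
      -- first-order IIR low-pass: acc += (product - acc) / 2**10
      acc       <= acc + shift_right(product - acc, 10);
      error_out <= acc;
    end if;
  end process;
end architecture rtl;
```

In the real module the CORDIC block supplies the reference and a proper second-order Butterworth replaces the single-pole filter; the point of the sketch is that the mixing-plus-filtering structure maps directly onto a few multipliers and registers in the FPGA.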
Future
Our involvement in the ATLAS projects will eventually decrease – the digitizer during 2004 and the trigger during 2006 – but right now both are quite intense. The SPECT camera project should terminate in its present form this year. We are participating in an EU application coordinated by Anders Brahme at KI; our part would be the development of a high-resolution whole-body positron camera, with Lars Eriksson from KI and CPS as a partner in the project. There will surely be other exciting new projects coming up.