VELO Software Overview & Shutdown Planning • Organisation • Milestones • 3 Critical Areas
Areas & Responsibilities • Overall Co-ord: CP • PVSS: Stefano De Capua • DAQ Recipes: Karol Hennessy • Timing & Gain: Kazu Akiba • Error Bank Analyses: Ann van Lysebetten • Online Monitoring: Kurt Rinnert • Data Quality: Eduardo Rodrigues • Simulation & Reconstruction: Tomasz Szumlak • Tracking: David Hutchcroft • Alignment: Silvia Borghi • Closing Strategy: Malcolm John • > 20 people contributing • Milestones defined for each, with one person responsible and priorities assigned
Organisation • Weekly Monday commissioning meeting • Report on previous week's milestones • News from all, forum for discussing issues • Work plans for the week • Integration Days: Thursdays • Integrate weekly releases (if any) at the pit • Release: PVSS, recipes, Vetra • Brief report at the bi-weekly Friday meeting • Report progress to the whole group, no details • Specific presentations on items of general interest • Shutdown progress logged on the milestone Twiki page: https://lbtwiki/bin/view/VELO/SoftwareMilestones
Milestone Progress • Proceeding close to schedule • Some delays due to FEST’08 production
Critical Path • In September we identified three key areas where progress is needed before we start running this year • Timing • TELL1 Parameter Uploading • Monitoring
Timing studies • Set up timing for sampling of pulse train and for optimal analogue signal height • Automated timing scans implemented and being tested • Firmware release being tested
Delayed sampling: the digitisation delay shifts the sampling phase along the delayed pulse (ADC counts versus time in clock cycles, ARx clock).
Analogue sampling delay scan: points sampled on the delayed pulse for a given clock phase (ADC counts versus time in ns; first to fourth time samples, spaced by the 25 ns Beetle clock).
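As a concrete illustration of the automated delay scans described above, here is a minimal Python sketch of how an optimal sampling delay could be picked from scan data. It is not the PVSS/TELL1 implementation; the helper read_mean_adc (mean ADC of a test-pulse channel at a given delay) and the 1 ns step are hypothetical.

```python
# Minimal sketch of picking the best digitisation delay from a delay scan.
# read_mean_adc(delay_ns) is a hypothetical helper returning the mean ADC
# value of a test-pulse channel for a given delay setting.
import numpy as np

def find_optimal_delay(read_mean_adc, step_ns=1, clock_period_ns=25):
    """Scan the delay across one Beetle clock period and return the delay
    giving the largest pulse height, plus the full scan."""
    delays = np.arange(0, clock_period_ns, step_ns)
    amplitudes = np.array([read_mean_adc(d) for d in delays])
    return delays[int(np.argmax(amplitudes))], delays, amplitudes

# Example with a fake pulse shape peaking at 14 ns:
if __name__ == "__main__":
    fake_pulse = lambda d: 100.0 * np.exp(-0.5 * ((d - 14.0) / 6.0) ** 2)
    best, _, _ = find_optimal_delay(fake_pulse)
    print(f"Optimal sampling delay: {best} ns")
```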
TELL1 Data Processing • Processing chain (RAW → CLUSTERS): Pedestal Following → Beetle Cross-talk Correction → Cable Cross-talk Filter (FIR) → Common Mode Suppression (MCMS) → Beetle baseline shift → Reordering → Common Mode Suppression (LCMS) → Clusterization (the cross-talk corrections are marked lower priority) • VELO data processing raw → clusters in the TELL1 • Requires 1M parameters • Optimisation critical for data quality (see TED data talk) • Pedestal & clusterisation thresholds most important • Bit-perfect emulation of the algorithms in the full LHCb software framework (a simplified sketch of the chain follows)
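To make the chain above concrete, here is a deliberately simplified Python sketch of the main steps: pedestal subtraction, a crude per-link common-mode suppression, and threshold-based clusterization. It is an illustration only, not the bit-perfect Vetra/TELL1 emulation; the 32-channel link size and the thresholds are assumptions.

```python
# Simplified, illustrative version of the TELL1 processing steps.
import numpy as np

def subtract_pedestals(raw, pedestals):
    """Per-channel pedestal subtraction."""
    return np.asarray(raw, dtype=float) - np.asarray(pedestals, dtype=float)

def suppress_common_mode(data, link_size=32):
    """Subtract the per-event mean of each analogue link (crude common-mode suppression)."""
    out = np.asarray(data, dtype=float).copy()
    for start in range(0, out.size, link_size):
        link = out[start:start + link_size]
        link -= link.mean()
    return out

def clusterize(data, seed_thr=10.0, incl_thr=5.0):
    """Form clusters around seed strips above seed_thr, adding
    neighbouring strips above incl_thr (hypothetical thresholds)."""
    clusters, used = [], set()
    for seed in np.where(data > seed_thr)[0]:
        if seed in used:
            continue
        strips = [int(seed)]
        for nb in (seed - 1, seed + 1):
            if 0 <= nb < data.size and data[nb] > incl_thr:
                strips.append(int(nb))
        used.update(strips)
        clusters.append({"strips": sorted(strips),
                         "adc": float(data[strips].sum())})
    return clusters

# Example: one sensor's worth of fake raw data with an injected 2-strip hit
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pedestals = np.full(2048, 512.0)
    raw = pedestals + rng.normal(0, 2, 2048)
    raw[100:102] += [30, 12]
    corrected = suppress_common_mode(subtract_pedestals(raw, pedestals))
    print(clusterize(corrected))
```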
Vetra – TELL1 Emulation • Parameter uploading achieved for the first time in December • Firmware fixes made and used (November) • Testing & evaluation underway
Pedestal Processing: raw data, pedestal-corrected data, and pedestal-correction monitoring plots; the baseline sits at zero after pedestal correction. A pedestal-following sketch is shown below.
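A minimal sketch of what pedestal following means in practice: the per-channel pedestal estimate is nudged towards each new event so that it tracks slow baseline drifts. The update weight here is an illustrative value, not the firmware one.

```python
# Sketch of pedestal "following" as an exponentially-weighted running average.
import numpy as np

def update_pedestals(pedestals, event_adc, weight=1.0 / 1024):
    """Update the per-channel baseline estimate with one NZS event."""
    pedestals = np.asarray(pedestals, dtype=float)
    event_adc = np.asarray(event_adc, dtype=float)
    return pedestals + weight * (event_adc - pedestals)

# Usage: start from a rough estimate and call once per event
# pedestals = update_pedestals(pedestals, event_adc)
```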
Beetle X-talk correction: the Beetle cross-talk effect means the first channel of each analogue link is affected. Monitoring plots (noise after pedestal correction) compare the measured noise in the first channel before and after the correction with the average noise measured in unaffected channels. A monitoring-style check is sketched below.
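A simple check along the lines of the monitoring just described: compare the noise of the first channel of each analogue link with the average of the remaining channels, before and after the correction is applied. The 32-channel link size is an assumption for illustration.

```python
# Sketch of a Beetle cross-talk monitoring check on per-channel noise values.
import numpy as np

def first_channel_noise_excess(noise, link_size=32):
    """noise: per-channel noise values, length a multiple of link_size.
    Returns (noise of the first channel of each link,
             average noise of the remaining channels of each link)."""
    noise = np.asarray(noise, dtype=float).reshape(-1, link_size)
    return noise[:, 0], noise[:, 1:].mean(axis=1)
```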
Effect of tuning: ADC spectra with a constant pedestal only versus with all parameters tuned. Non-zero-suppressed data are critical so that the tuning parameters can be obtained; a procedure to take them automatically during data taking, one module at a time, is under test.
Monitoring • Monitoring package • Package for “high-level” (= ZS) data: monitoring based on clusters and tracks • Package for NZS data: noise calculation, time alignment studies, Beetle pulse shape, … • Scripts and macros are being developed to analyse data • Wiki pages with documentation and HowTos • Review of monitoring status in February
Online monitoring • Running since August • Implementation of several plots • New features to be exploited (online presenter shown)
Cluster Monitoring • Cluster information: • Cluster ADC value • Active chip links • Number of strips in a cluster • Cluster ADC value versus sampling • Number of clusters per event • More… • Some of these distributions are also shown versus sensor number and/or sensor strip
Track Monitoring • Tracks: • Number of tracks • Pseudo-rapidity • Azimuthal angle • Pseudo-efficiency • Biased and unbiased residuals versus sensor number • Total number of R clusters per track • Vertex information • Hit distributions in xy and xz • Mean and sigma of residuals versus sensor number • More…
Track monitoring example (J/() Ks sample): plots of pseudo-rapidity, azimuthal angle, biased residuals versus R sensor number, and hits in x (cm) versus z (mm).
Scripts and Macros • Analysis of the data for the evaluation of: • Time alignment studies • Noise calculation (a minimal sketch follows below) • High voltage scans • Beetle pulse shape • More…
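For the noise calculation item, a minimal sketch of a per-channel noise estimate from pedestal-subtracted NZS data, with an optional crude common-mode subtraction (matching the with/without common mode comparison shown by the GUI macros below). The array shapes and the 32-channel link size are assumptions; the real analysis macros are more detailed (channel masking, proper common-mode suppression, and so on).

```python
# Sketch of a per-channel noise calculation from NZS data.
import numpy as np

def channel_noise(events, subtract_common_mode=False, link_size=32):
    """events: array of shape (n_events, n_channels) of pedestal-subtracted ADC.
    Returns the per-channel RMS (noise)."""
    data = np.asarray(events, dtype=float)
    if subtract_common_mode:
        n_events = data.shape[0]
        links = data.reshape(n_events, -1, link_size)
        data = (links - links.mean(axis=2, keepdims=True)).reshape(n_events, -1)
    return data.std(axis=0)
```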
Noise monitoring macros, example of the GUI: noise shown with and without common mode subtraction.
Noise performance (plots HP1–HP4) • Common mode pickup from the beam requires beam data to evaluate • At the pit and in previous testbeams the parameters are highly stable
Noise – individual / whole system • No evidence that operation of the full system induces more noise than operating single sensors
Noise versus voltage, compared with the expected signal / noise.
IV - scans • PVSS recipes available to automate IV scans • Set initial voltage, target voltage, step, single or set of sensors • A data file produced per sensor containing channel number, voltage, current, sensor temperature • Analysis scripts for plotting IV scan data
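A small plotting sketch for the per-sensor IV-scan files described above. The file layout assumed here (whitespace-separated columns: channel, voltage, current, temperature) and the file name are illustrative assumptions; units depend on the actual files.

```python
# Sketch of an IV-scan plotting script for one per-sensor data file.
import numpy as np
import matplotlib.pyplot as plt

def plot_iv_scan(filename):
    """Plot current versus bias voltage from one sensor's IV-scan file."""
    channel, voltage, current, temperature = np.loadtxt(filename, unpack=True)
    plt.plot(voltage, current, "o-")
    plt.xlabel("Bias voltage [V]")
    plt.ylabel("Current")
    plt.title(f"IV scan: {filename}")
    plt.grid(True)
    plt.show()

# Usage (hypothetical file name):
# plot_iv_scan("iv_scan_sensor_42.txt")
```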
Software Commissioning Summary • All baseline algorithms completed for summer ’08 • Commissioning software: milestones for data readiness in April 2009 • 3 critical areas all proceeding according to plan • TED data are the VELO ‘cosmics’: tremendous success of the first tracks, and the sample has been very useful for commissioning • TED data this summer will allow us to: optimise timing, test and tune the FPGA algorithms, increase alignment accuracy