ALICE Off-line Computing: Status and Plans
P. Hristov for the ALICE Collaboration
14/09/2007, NEC'2007, Varna
History
• NEC'2001: AliRoot for simulation
• NEC'2003: Reconstruction with AliRoot
• NEC'2005: AliRoot for analysis
• NEC'2007: Still no LHC data ⇒ status and plans of the ALICE offline software
Outline
• ALICE Experiment
• Simulation
• Reconstruction
• Analysis
• First physics
• Conclusions
ALICE detectors (figure): ITS, TPC, TRD, TOF, HMPID, PHOS and PMD inside the L3 magnet (B < 0.5 T); MUON spectrometer with dipole magnet and absorber.
Offline framework
• AliRoot in development since 1998
• Directly based on ROOT
• Used since the detector TDRs for all ALICE studies
• Few packages to install: ROOT, AliRoot, G3/G4/FLUKA, AliEn client
• Ported to the most common architectures/OS: Linux i686, ia64 and x86_64, Mac OS X, Digital Tru64, Solaris x86, …
• Distributed development: over 50 developers and a single CVS repository; 2/3 of the code developed outside CERN
• Integration with DAQ (data recorder) and HLT (same code base)
• Abstract interfaces and a "restricted" subset of C++ used for maximum portability
Software management
• Regular release schedule: major release every two months, minor release (tag) every month
• Nightly builds produce UML diagrams, code listings, coding-rule violations and test results; single repository with all the code
• No version-management software (we have very few packages!)
• Advanced code tools under development (with IRST/Trento): "smell" detection, aspect-oriented programming tools, automated genetic testing
• Documentation for AliRoot and AliEn produced
• Intensive training ongoing
The Simulation (VMC diagram): user code talks to the Virtual Monte Carlo (VMC) interface, which drives the G3, G4 or FLUKA transport engines; the framework also provides generators, a geometrical modeller, visualisation and the link to reconstruction.
SIMULATION chain:
• Event generation: kinematics
• Particle transport: hits (energy deposition at a given point, MC label)
• Detector response: summable digits (low ADC threshold, no noise, MC label)
• Detector response: digits (noise + normal threshold, MC label)
• Raw data formats: DDL files, DATE file, "rootified" raw
• Event merging (MC + MC): summable digits of the signal + summable digits of the background; needed to reduce the simulation time
• Event embedding (MC + raw): summable digits of the signal + raw data converted to SDigits; used for studies of the reconstruction efficiency
• Event mixing: tracks or clusters from one event are combined with tracks or clusters from another (but "similar") event
The output feeds reconstruction.
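The merging step above can be sketched as follows — a minimal, illustrative Python example (not AliRoot code): per-channel amplitudes of signal and background summable digits are added before the noise/threshold digitization. The channel IDs and threshold value are hypothetical.

```python
# Illustrative sketch of MC+MC event merging at the summable-digit level.
# A "summable digit" map here is simply {channel_id: amplitude}.

def merge_sdigits(signal, background):
    """Add per-channel amplitudes of two summable-digit maps."""
    merged = dict(background)
    for channel, amplitude in signal.items():
        merged[channel] = merged.get(channel, 0.0) + amplitude
    return merged

def digitize(sdigits, noise=0.0, threshold=3.0):
    """Apply noise and a normal ADC threshold to produce digits."""
    return {ch: amp + noise for ch, amp in sdigits.items()
            if amp + noise > threshold}

signal = {101: 5.0, 102: 1.0}          # hypothetical signal SDigits
background = {102: 2.5, 103: 4.0}      # hypothetical background SDigits
digits = digitize(merge_sdigits(signal, background))
# channel 102 passes the threshold only because signal and background overlap
print(sorted(digits))
```

The point of merging at the summable-digit stage is that the expensive transport of the underlying background event is done once and reused for many signal events.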
Visualisation
Plans – AliRoot
• Framework
  • Improvements in the CPU and memory consumption
  • Ongoing redesign of the ESD classes
  • Finalize common AOD classes
  • Data-size reduction: ESD, AOD
  • Optimize access to the OCDB and metadata selection
  • Simplification of the I/O management
  • RAW data: take into account the latest modifications in the hardware
• Simulation
  • Urgent – document and implement the ALICE geometry as built
  • Material in the tracking region and forward direction (services, cables)
  • Getting TFluka and G4 production-ready
• Visualization
  • Preparation for event scanning and visual quality assurance
RECONSTRUCTION – barrel tracking:
• Local detector reconstruction: clusterization, vertexing
• Seeding: 2 clusters in the TPC + primary vertex
• Kalman filter, forward propagation: TPC → ITS
• Kalman filter, backward propagation: ITS → TPC → TRD → TOF
• Kalman filter, refit inward: TRD → TPC → ITS
Detailed flow: TPC seeding and forward tracking; TPC V0 and kink finder; ITS forward tracking with a combinatorial Kalman filter; ITS V0 finder; TPC tracking with update of V0s and kinks; TRD seeding, tracking and PID; TOF PID; propagation to PHOS, EM, HMPID; inward refit with TRD, TPC and ITS tracking + PID, updating V0s and kinks.
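The predict/update cycle at the heart of the Kalman-filter tracking above can be illustrated with a deliberately simplified one-dimensional toy (the real filter propagates a 5-parameter track state with a full covariance matrix through the magnetic field and material; all numbers here are made up):

```python
# Minimal 1D Kalman filter: predict to the next "layer", then update
# the state with the measured cluster position on that layer.

def kalman_step(x, P, measurement, Q=0.01, R=1.0):
    """One prediction + update for a static state model.
    x, P : current state estimate and its variance
    Q    : process noise picked up while propagating to the next layer
    R    : measurement (cluster position) variance
    """
    x_pred, P_pred = x, P + Q                 # predict
    K = P_pred / (P_pred + R)                 # Kalman gain
    x_new = x_pred + K * (measurement - x_pred)
    P_new = (1.0 - K) * P_pred                # update
    return x_new, P_new

x, P = 0.0, 100.0                             # vague initial seed
for cluster in [1.2, 0.9, 1.1, 1.0]:          # clusters layer by layer
    x, P = kalman_step(x, P, cluster)
print(abs(x - 1.0) < 0.2, P < 1.0)            # estimate converges, variance shrinks
```

Each additional cluster tightens the estimate, which is why the barrel tracking runs the filter three times (forward, backward, refit): every pass refines the state using the full set of clusters attached so far.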
Tracking efficiency (TPC alone vs. all detectors)
• A challenge in the high particle-density environment
• Efficiency normalized to the number of "reconstructable" tracks
• For realistic particle densities, dN/dy = 2000–4000, the combined efficiency is well above 90% and the fake-track probability below 5%
Combined efficiency (central Pb–Pb and pp, TPC)
• Efficiency normalized to the number of generated particles at the primary vertex within the central acceptance |η| < 0.9
• Losses: protons – absorption; kaons and pions – decays
• At lower pT, including the TRD drops the efficiency due to the unavoidable material between TRD sectors
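The two normalizations used on these slides differ only in the denominator. A hedged sketch (the counts are invented, and the MC-label matching is reduced to plain integers):

```python
# Two efficiency definitions plus the fake-track probability.

def efficiencies(generated, reconstructable, matched, reconstructed):
    """
    generated:       MC primaries within the acceptance |eta| < 0.9
    reconstructable: subset leaving enough clusters to be found at all
    matched:         reconstructed tracks carrying a correct MC label
    reconstructed:   all reconstructed tracks (matched + fakes)
    """
    eff_norm = matched / reconstructable   # normalized to "reconstructable"
    eff_comb = matched / generated         # combined, generator-level
    fake_rate = (reconstructed - matched) / reconstructed
    return eff_norm, eff_comb, fake_rate

# hypothetical counts for one simulated central event sample
eff_norm, eff_comb, fake = efficiencies(1000, 900, 840, 870)
print(f"{eff_norm:.2f} {eff_comb:.2f} {fake:.3f}")
```

The generator-level (combined) efficiency is necessarily lower, since it also charges the tracker for particles lost to absorption and decay before they could leave a reconstructable signal.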
Plans – Reconstruction
• Alignment-aware reconstruction: survey-data usage; development and tests of the alignment algorithms
• Calibration-aware reconstruction: ongoing detector-by-detector review; development of detector algorithms
• Reconstruction parameters in separate classes; factorization of reconstruction features
• Magnetic field: ongoing analysis of the measurements, close to completion
• PID: consistent treatment of the mismatch between detectors
Alignment (diagram): in simulation, a misalignment is applied to the ideal geometry; reconstruction starts from the ideal geometry and applies the alignment procedure, which uses the file from survey and the raw data.
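The "misalignment" box above can be pictured as applying small random offsets to each sensitive module of the ideal geometry — a toy sketch only (module names, the shift scale and the Gaussian model are all hypothetical; the real framework uses full 3D transformation matrices):

```python
# Toy misalignment: shift every module's ideal position by a small
# Gaussian offset, reproducibly for a fixed seed.
import random

def misalign(ideal_positions, sigma_cm=0.01, seed=42):
    rng = random.Random(seed)
    return {module: tuple(c + rng.gauss(0.0, sigma_cm) for c in xyz)
            for module, xyz in ideal_positions.items()}

ideal = {"ITS/Module3": (3.9, 0.0, 0.0),      # hypothetical module names
         "TPC/Sector5": (84.5, 12.0, 0.0)}
shifted = misalign(ideal)
max_shift = max(abs(a - b)
                for m in ideal
                for a, b in zip(shifted[m], ideal[m]))
print(0.0 < max_shift < 0.05)                  # shifts stay at the 100 µm scale
```

Simulating with the shifted geometry while reconstructing with the ideal one is exactly what makes the alignment procedure testable: it must recover the injected offsets from the survey file and the raw data.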
Calibration: the Shuttle (diagram). The ECS triggers the SHUTTLE via DIM; the Shuttle reads the run logbook, collects data from the DAQ, DCS and HLT File Exchange Servers (FXS, each with its own DB) and the DCS archive DB, runs the detector preprocessors (TPC, TRD, MPH, SPD, …), and publishes the results to the OCDB in the Grid file catalog. There is no alternative system to extract data (especially online calibration results) between data-taking and the first reconstruction pass!
High-level dataflow (diagram). Online: the LDCs and HLT feed the GDCs over the DAQ network; data files are published in AliEn by the publish agent and written via xrootd to the 240 TB CASTOR cache; the Shuttle collects condition files from the DAQ, DCS and HLT FXS, with run info from the DAQ logbook DB and the DCS DB. Offline: the ALICE file catalogue maps each lfn and guid to its storage elements; data are served to the CAF and to monitoring/calibration, archived in CASTOR, and exported to the T1s via FTS/SRM.
ALICE Analysis: Basic Concepts
• Analysis models: prompt data processing (calibration, alignment, reconstruction, analysis) at CERN with PROOF; batch analysis using the Grid infrastructure; local analysis; interactive analysis with PROOF + Grid
• User interface: access to the Grid via the AliEn or ROOT UIs; PROOF/ROOT as the enabling technology for the CAF; Grid API class TAlien
• Analysis Object Data (AOD) contain only the data needed for a particular analysis
• Analysis à la PAW: ROOT + at most a small library
• Work on the distributed infrastructure has been done by the ARDA project
CAF (diagram): worker nodes, each running PROOF and xrootd, under a PROOF master – the whole CAF becomes an xrootd cluster, staging data from CASTOR via xrootd.
• For prompt and pilot analysis, calibration and alignment, fast simulation and reconstruction
• Working with the PROOF team on optimisation
• Software version handling is now in PROOF
• Disk quotas are already partly implemented and under test
• Data distribution at the moment "by hand"; a fair-share algorithm is under development
First Physics Studies: Measuring the Pseudorapidity Density of Primary Charged Particles using the TPC (ALICE-INT-2007-005)
• Event and track quality selection: e.g. events with a reconstructed vertex; require the track origin in proximity to the primary vertex
• Corrections:
  • Track-to-particle (track level), f(η, z_vtx, pT): geometrical acceptance, reconstruction efficiency, decay, feed-down
  • Vertex reconstruction efficiency (event level), f(z_vtx, multiplicity)
  • Trigger efficiency (event level), f(z_vtx, multiplicity): corrects to INEL, NSD
• Studies of systematics: track selection, influence of secondaries, particle composition, pT cutoff, relative frequency of processes, beam-gas and pile-up events, misalignment, bias from the vertex reconstruction
• PDC06 data: 400K events (corrections), 100K events (analysis) – different event samples for corrections and analysis!
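The track-level correction step can be sketched as a bin-wise multiplication: raw track counts per η bin are scaled by MC-derived correction factors, then normalized to the bin width and the number of events. The bin edges, factor values and event count below are all hypothetical:

```python
# Hedged sketch of turning raw per-bin track counts into dN/deta per event.

def correct_spectrum(raw_counts, correction, n_events):
    """raw_counts, correction: {(eta_lo, eta_hi): value} maps."""
    corrected = {}
    for (eta_lo, eta_hi), n in raw_counts.items():
        width = eta_hi - eta_lo
        factor = correction[(eta_lo, eta_hi)]   # acceptance x efficiency etc.
        corrected[(eta_lo, eta_hi)] = n * factor / (width * n_events)
    return corrected

raw = {(-0.5, 0.0): 42000, (0.0, 0.5): 41800}   # hypothetical raw counts
corr = {(-0.5, 0.0): 1.15, (0.0, 0.5): 1.14}    # hypothetical MC correction map
dn_deta = correct_spectrum(raw, corr, n_events=10000)
print({k: round(v, 2) for k, v in dn_deta.items()})
```

Deriving the correction map from one event sample (400K events) and applying it to a statistically independent one (100K events), as the slide notes, avoids biasing the result with the same fluctuations twice.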
Conclusions
• The ALICE offline software is making steady progress towards readiness
• Well prepared for real data and first physics studies
• Optimization of the code performance (CPU, memory, file size)
• Last chance to add missing functionality
• Extensive tests during the detector commissioning
• The area between online and offline is the most demanding at the moment
• Contingency is small but positive