Learn about the importance and structure of Online and Offline QA, including monitoring frameworks, structures, purposes, alarms, and tasks in data simulation, reconstruction, and analysis. Understand the requirements, creation steps, archiving, and QA analysis tasks for efficient data quality assurance.
Where do we need QA?
• Online
• Offline
Online QA
• Raw data: within the DAQ monitoring framework
• Reconstructed data: within the HLT monitoring framework
• Structure of the framework
  • Collect and calculate parameters, fill histograms while taking data
  • Live display of histograms and out-of-range check or comparison with reference histograms stored in the OCDB (see the sketch below)
• Purpose
  • Check the integrity of the data
  • Alert the person on shift to a malfunctioning detector
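The out-of-range check against a reference histogram can be pictured with plain ROOT calls. This is a minimal sketch, not the actual DAQ/HLT monitoring code: the file names, the histogram name hADC and the 0.05 alarm threshold are assumptions.

```cpp
#include "TFile.h"
#include "TH1F.h"
#include <cstdio>

void CheckAgainstReference()
{
  TFile ref("reference.root");    // stands in for the reference histograms from the OCDB
  TFile mon("monitoring.root");   // histograms filled while taking data
  TH1F *hRef = (TH1F*)ref.Get("hADC");
  TH1F *hMon = (TH1F*)mon.Get("hADC");
  if (!hRef || !hMon) { printf("histograms missing\n"); return; }

  Double_t prob = hMon->KolmogorovTest(hRef);  // shape comparison against the reference
  if (prob < 0.05)                             // assumed alarm threshold
    printf("ALARM: hADC deviates from the reference (p = %.3f)\n", prob);
  else
    printf("hADC OK (p = %.3f)\n", prob);
}
```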
Offline QA
• Simulation: raw data, digits (same as online)
• Reconstruction: all intermediate steps up to the ESD
• Purpose
  • Check for software bugs
  • Provide crude alarms
  • Provide QA histograms which users check visually by comparing them with (running) reference histograms stored in the OCDB
Offline QA structure
• Structure of the framework: job level (1)
  • During simulation, calculate parameters and fill histograms, kept on the local disk of the WN (root session 1)
  • Run the QA tasks after the simulation is done and update the log file (root session 2)
  • Parse the simulation log; if error: set the job status, update the Tags and optionally stop
  • During reconstruction, calculate parameters and fill histograms, kept on the local disk of the WN (root session 3)
  • Run the QA tasks after the reconstruction is done and update the log file (root session 4)
  • Parse the reconstruction log; if error: set the job status, update the Tags and optionally stop
  • Run the QA on the ESD, produce histograms, make the tests and update the log file (root session 5)
  • Parse the ESD QA log; if error: set the job status and update the Tags
  • Check all QA and optionally stop the run
Offline QA structure
• Structure of the framework: job level (2)
  • Run the simulation (root session 1)
  • Run the QA tasks, fill the QA histograms from RAW and Digits, and update the log file (root session 2)
  • Parse the simulation log; if error: set the job status, update the Tags and optionally stop (see the log-parsing sketch below)
  • Run the reconstruction (root session 3)
  • Run the QA tasks, fill the QA histograms and update the log file (root session 4)
  • Parse the reconstruction log; if error: set the job status, update the Tags and optionally stop the job
  • Run the QA on the ESD, produce histograms, make the tests and update the log file (root session 5)
  • Parse the ESD QA log; if error: set the job status and update the Tags
  • Check all QA and optionally stop the run
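The "parse the log, if error: set the job status" step used in both schemes can be illustrated with a small helper. This is a minimal sketch only: the error patterns ("E-", "Error", "W-"), the log file name sim.log and the status codes are assumptions, not the actual production scripts.

```cpp
#include <fstream>
#include <string>
#include <cstdio>

enum EJobStatus { kOk = 0, kWarning = 1, kError = 2 };

EJobStatus ParseLog(const char *logName)
{
  std::ifstream log(logName);
  std::string line;
  EJobStatus status = kOk;
  while (std::getline(log, line)) {
    if (line.find("E-") != std::string::npos ||
        line.find("Error") != std::string::npos)
      status = kError;                              // a single error flags the whole job
    else if (status == kOk && line.find("W-") != std::string::npos)
      status = kWarning;
  }
  return status;
}

void CheckSimLog()
{
  if (ParseLog("sim.log") == kError)
    printf("simulation failed: update the Tags and stop the job\n");
}
```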
Offline QA structure
• Structure of the framework: ≥ event level, ≤ run level
  • Run simulations and reconstructions and the associated QA tasks until the required statistics are reached
  • Merge the QA histograms (simulation, reconstruction, ESD), as in the merging sketch below
  • Run the QA tasks
  • Parse the log; if error: set the job status and update the Tags
  • Check all QA and optionally stop the production
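The merging step is essentially what ROOT's TFileMerger (or the hadd utility) provides. A minimal sketch, assuming per-job QA files under hypothetical jobNNN/ directories; the file names are illustrative.

```cpp
#include "TFileMerger.h"
#include "TString.h"

void MergeQA(Int_t nJobs = 10)
{
  TFileMerger merger;
  merger.OutputFile("QA.merged.root");            // merged QA file for the production
  for (Int_t i = 0; i < nJobs; ++i)
    merger.AddFile(Form("job%03d/QA.root", i));   // per-job QA output (assumed layout)
  merger.Merge();                                 // histograms with the same name are summed
}
```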
Offline QA requirements
• Requirements
  • QA objects per detector, produced either during sim/rec or after
  • QA tasks per detector (simulation, reconstruction)
  • QA tasks per detector, per PWG4 (ESD QA)
  • The same QA tasks for real and simulated RAW data
  • Keep the QA histograms in the AliEn catalog
  • New input in the Tags
  • The framework must be able to set/update the job status
  • The framework must be able to decide what to do in case of errors
QA: what exists?
• QA analysis tasks in $ALICE_ROOT/ESDCheck
  • Input = ESD; output per detector = histograms (root file) + plots + log
  • Detectors: EMCal, FMD, HMPID, MUON, PHOS, PMD, T0, TOF, TRD, VZERO
  • Missing: ITS, TPC
• Train: $ALICE_ROOT/ESDCheck/ana.C
• Steering macro: $ALICE_ROOT/ESDCheck/CheckESD.sh
  • launches the train,
  • parses the log,
  • provides a summary status
QA continued (August 9, 2007)
Summary
• The QA object:
  • one bit map per detector (see the sketch below)
  • signalling problems with different levels of severity
  • in different tasks among: simulation (online data), reconstruction, ESDs, analysis
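The bit-map idea can be sketched as follows; the task list, the severity levels and the bit layout are assumptions for illustration and do not reproduce the actual AliQualAss encoding.

```cpp
#include <cstdio>

enum ETask     { kSim = 0, kRec, kESD, kAna, kNTasks };
enum ESeverity { kInfo = 0, kWarning, kError, kFatal, kNSeverities };

class DetectorQABits {
public:
  void Set(ETask task, ESeverity sev)         { fBits |= 1u << Index(task, sev); }
  bool IsSet(ETask task, ESeverity sev) const { return fBits & (1u << Index(task, sev)); }
private:
  static unsigned Index(ETask task, ESeverity sev) { return task * kNSeverities + sev; }
  unsigned int fBits = 0;   // 4 tasks x 4 severities = 16 bits used
};

void QABitsExample()
{
  DetectorQABits phos;
  phos.Set(kRec, kWarning);              // flag a reconstruction warning for this detector
  printf("rec warning set: %d\n", phos.IsSet(kRec, kWarning));
}
```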
Creation of QA
• Three steps are needed
  • Creation and archiving of reference data
    • Valid for a set of runs
  • Creation and archiving of input data
    • One data set per run (sim ≡ several events; data ≡ an ALICE run)
  • Performing the QA test and archiving the test result
    • Separate job (different from the jobs which create the data)
    • Can be performed on a subset of data larger than a run
Requirements
• Data must be mergeable
• Archiving is done in AliEn using the standard directory structure (not the OCDB); local storage is possible too
  • Reference: …..QA/Ref/ (no idea yet how to deal with versions)
  • Data and QA: ….Run/ (no idea yet how to deal with versions)
• Data are built in the event loop of the appropriate task (SDigitizer, Digitizer, Tracker)
• It is the responsibility of the detectors to implement the QA control in their tasks
Requirements continued
• QA results for all detectors and all tasks in a single root file with a directory structure
• The framework provides
  • AliQualAss :: TNamed
    • Singleton (see the sketch below)
    • One bit array per detector
    • Naming conventions (detectors, tasks, files, …)
    • Setting, checking, storing
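A hedged sketch of the singleton with one bit word per detector and a naming-convention helper; the class and method names (QASingleton, SetDetBit, FileName) and the file-name pattern are simplified stand-ins, not the real AliQualAss interface.

```cpp
#include "TNamed.h"
#include "TString.h"

class QASingleton : public TNamed {            // stand-in for AliQualAss
public:
  static QASingleton *Instance() {             // singleton access
    if (!fgInstance) fgInstance = new QASingleton;
    return fgInstance;
  }
  void   SetDetBit(Int_t det, Int_t bit)         { fDetBits[det] |= 1u << bit; }
  Bool_t CheckDetBit(Int_t det, Int_t bit) const { return fDetBits[det] & (1u << bit); }
  // naming-convention helper: one QA file per detector and task (assumed pattern)
  static TString FileName(const char *det, const char *task)
    { return TString::Format("%s.QA.%s.root", det, task); }
private:
  QASingleton() : TNamed("QA", "quality assurance"), fDetBits() {}
  static QASingleton *fgInstance;
  UInt_t fDetBits[20];                          // one bit word per detector (20 assumed)
};

QASingleton *QASingleton::fgInstance = nullptr;
```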
Requirements continued
• The framework provides
  • AliQualAssDataMaker :: TNamed
    • Base class
    • Creation of the directory structure for the QA results: Init()
    • Saving of the results: Finish()
    • Steering of the processing: Exec()
• The detectors provide
  • AliDetQualAssDataMaker :: AliQualAssDataMaker
    • Implement the initialisation of the data objects (TH1, …)
    • Implement the filling of the data objects
    • Steer the QA data making from each task producing data (see the sketch below)
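The split between the framework base class (steering Init/Exec/Finish) and the detector implementation (booking and filling histograms) could look roughly like this; the names QADataMakerBase and PHOSQADataMaker and the example histogram are illustrative, not the actual AliQualAssDataMaker code.

```cpp
#include "TH1F.h"
#include "TFile.h"
#include "TList.h"

class QADataMakerBase {
public:
  virtual ~QADataMakerBase() {}
  void Init()   { fList = new TList; InitHistos(); }   // create the result container and book objects
  void Exec(const TObject *data) { Fill(data); }       // called once per event
  void Finish(const char *fileName) {                  // save the results
    TFile out(fileName, "RECREATE");
    fList->Write();                                    // write all booked objects into the QA file
  }
protected:
  virtual void InitHistos() = 0;                       // detector books its histograms
  virtual void Fill(const TObject *data) = 0;          // detector fills them
  TList *fList = nullptr;
};

class PHOSQADataMaker : public QADataMakerBase {       // hypothetical detector implementation
protected:
  void InitHistos() override {
    fHits = new TH1F("hPHOSHitEnergy", "hit energy", 100, 0., 10.);
    fList->Add(fHits);
  }
  void Fill(const TObject * /*hits*/) override {
    fHits->Fill(1.);                                   // placeholder: real code loops over the hit container
  }
private:
  TH1F *fHits = nullptr;
};
```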
Requirements continued
• The framework provides
  • AliQualAssDataChecker :: TNamed
    • Steers the QA check for all detectors and all tasks
    • Opens the Ref and data files (setters in a QA.C macro)
    • Does the QA for all objects stored in the QA data file
    • Updates the QA result
    • Saves the QA results file
  • AliQualAssDataCheckerBase :: TNamed
    • Called by the preceding
    • Initialises the QA object and sets the appropriate detector and task
    • Retrieves/saves the QA result file
    • Does the check
Requirements continued
• The framework provides
  • AliQualAssDataCheckerBase :: TNamed
    • Called by the preceding
    • Initialises the QA object and sets the appropriate detector and task
    • Retrieves/saves the QA result file
    • Comparison methods (Diff(), Kolmogorov, Chi2, …)
  • A simple checker
    • Implements a simple Check (Diff of all objects (TH1) in the data file); see the sketch below
• The detectors provide
  • AliQualAssDetChecker :: AliQualAssDataCheckerBase
    • Overloads the comparison Check()
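A minimal sketch of the simple-checker idea: loop over every TH1 in the QA data file, compare it with the histogram of the same name in the reference file, and report a per-histogram test value. The file names and the choice of Chi2Test instead of Diff/Kolmogorov are assumptions.

```cpp
#include "TFile.h"
#include "TKey.h"
#include "TH1.h"
#include <cstdio>

void SimpleCheck(const char *dataName = "QA.root",
                 const char *refName  = "QA.Ref.root")
{
  TFile data(dataName), ref(refName);
  TIter next(data.GetListOfKeys());
  TKey *key;
  while ((key = (TKey*)next())) {
    TH1 *h = dynamic_cast<TH1*>(key->ReadObj());
    if (!h) continue;                              // only check histograms
    TH1 *hRef = (TH1*)ref.Get(h->GetName());
    if (!hRef) { printf("%s: no reference\n", h->GetName()); continue; }
    Double_t p = h->Chi2Test(hRef, "UU");          // could also be a Kolmogorov test
    printf("%s: Chi2 probability %.3f\n", h->GetName(), p);
  }
}
```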
Implementation
• Detectors
  • AliDetxxxxx (SDigitizer, Digitizer, etc.)
    • Data member: AliDetQualAssDataMaker *fqa
    • Ctor: fqa->Init(AliQualAss::kHITS (or kSDIGITS));
    • Event loop: fqa->SetData(hits (or sdigits)); fqa->Exec();
  • It is different for the ESDs, which are the responsibility of the framework
• AliReconstruction
  • In the event loop, same as for the other tasks (see the sketch below)
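The calling sequence from a detector task (Init in the constructor, SetData + Exec in the event loop) could be wired up as in this sketch; DetQADataMaker and DetSDigitizer are stand-ins for AliDetQualAssDataMaker and the detector task, not the real AliRoot classes.

```cpp
#include "TClonesArray.h"

class DetQADataMaker {                       // stand-in for AliDetQualAssDataMaker
public:
  enum ETask { kHITS, kSDIGITS };
  void Init(ETask task)       { fTask = task; /* book the histograms for this task */ }
  void SetData(TObject *data) { fData = data; }
  void Exec()                 { /* fill the histograms from fData */ }
private:
  ETask    fTask = kHITS;
  TObject *fData = nullptr;
};

class DetSDigitizer {                        // stand-in for a detector task, e.g. an SDigitizer
public:
  DetSDigitizer() { fqa = new DetQADataMaker; fqa->Init(DetQADataMaker::kSDIGITS); }  // ctor: Init
  void ProcessEvent(TClonesArray *sdigits) {
    // ... produce the SDigits ...
    fqa->SetData(sdigits);                   // event loop: hand the event data to the QA maker
    fqa->Exec();                             // fill the QA histograms
  }
private:
  DetQADataMaker *fqa = nullptr;             // QA data maker as a data member
};
```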
Implementation
• aliroot -b -q QA.C
  • AliQualAssChecker qac;
  • qac.SetRefDir("…/…/…/");
  • qac.SetOutDir("…/…/…/");
  • qac.Run();
• To be run after sim.C and rec.C
• The QA data file is on the local dir of the WN
To be done (framework)
• Implement the AliEn file access
• Versioning scheme for the reference data and for the QA results data
• Granularity of the QA data
• Correct all the misconceptions and illegal implementations which will be discovered by Peter
• What to do with the QA results
  • Publish
  • Test and action
• Implement the QA online
  • QA data made online in MOOD
  • At the end of run, ship the data to the FXS
  • Preprocessor algorithm to make the QA
  • Ship the QA results and QA data to AliEn with the Shuttle
  • Have the QA test running in MOOD