
Where do we need QA ?

Learn about the importance and structure of online and offline QA, including the monitoring frameworks, their purposes, alarms, and QA tasks in data simulation, reconstruction, and analysis. Understand the requirements, creation steps, archiving, and QA analysis tasks needed for efficient data quality assurance.

smathias


Presentation Transcript


  1. QA

  2. Where do we need QA ? • Online • Offline

  3. Online QA • Online • Raw data : within the DAQ monitoring framework • Reconstructed data : within the HLT monitoring framework • Structure of the framework • Collect and calculate parameters, fill histograms while taking data • Live display of histograms and out-of-range checks or comparison with reference histograms (stored in the OCDB) • Purpose • Check the integrity of the data • Alert the person on shift to a malfunction of the detectors

  4. Offline QA ? • Online • Offline • Simulation: raw data, digits (same as online) • Reconstruction: all intermediate steps until ESD • Purpose • Check for software bugs • Provide crude alarms • Provide QA histograms which users control (visual examination) by comparing with (running) reference histograms (stored in OCDB)

  5. Offline QA structure • Online • Offline • Structure of the framework: job level (1) • During simulation, calculate parameters and fill histograms, kept on local disk of WN (root session 1) • Run the QA tasks after simulation is done and update log file (root session 2) • Parse the simulation log, if error: set the job status, update Tags and optionally stop • During reconstruction, calculate parameters and fill histograms, kept on local disk of WN (root session 3) • Run the QA tasks after reconstruction is done and update log file (root session 4) • Parse the reconstruction log, if error: set the job status, update Tags and optionally stop • Run the QA on ESD, produce histograms, make tests and update the log file (root session 5) • Parse the ESD QA log, if error: set the job status and update Tags • Check all QA and optionally stop the run

  6. Offline QA structure • Online • Offline • Structure of the framework: job level (2) • Run simulation (root session 1) • Run the QA tasks, fill QA histograms from RAW and Digits, and update log file (root session 2) • Parse the simulation log, if error: set the job status, update Tags and optionally stop • Run reconstruction (root session 3) • Run the QA tasks, fill QA and update log file (root session 4) • Parse the reconstruction log, if error: set the job status, update Tags and optionally stop the job • Run the QA on ESD, produce histograms, make tests and update the log file (root session 5) • Parse the ESD QA log, if error: set the job status and update Tags • Check all QA and optionally stop the run

  7. Offline QA structure • Online • Offline • Structure of the framework: ≥ event level, ≤ run level • Run simulations and reconstructions and the associated QA tasks until the required statistics are reached • Merge the QA histograms (simulation, reconstruction, ESD) • Run QA tasks • Parse the log, if error: set the jobs status and update Tags • Check all QA and optionally stop the production

  8. Offline QA requirements • Online • Offline • Requirements • QA objects per detector, produced either during sim/rec or after • QA tasks per detector (simulation, reconstruction) • QA tasks per detector, per PWG4 (ESD QA) • Same QA tasks for real and sim RAW • Keep QA histograms in the AliEn catalog • New input in Tags • The framework must be able to set/update the job status • The framework must be able to decide what to do in case of errors

  9. QA: what exists ? • QA analysis tasks in $ALICE_ROOT/ESDCheck • Input = ESD; Output per detector = histograms (root file) + plots + log • Detectors: EMCal, FMD, HMPID, MUON, PHOS, PMD, T0, TOF, TRD, VZERO • Missing: ITS, TPC • Train: $ALICE_ROOT/ESDCheck/ana.C • Steering macro: $ALICE_ROOT/ESDCheck/CheckESD.sh • launch the train, • parse the log, • provide summary status

  10. QA continued August 9, 2007

  11. Summary • The QA object : • one bit map per detector • Signalling problems with different levels of severity • In different tasks among: simulation (online data), reconstruction, ESDs, analysis

  12. Creation of QA • Three steps are needed • Creation and archiving of reference data • Valid for a set of runs • Creation and archiving of input data • One data set per run (sim ≡ several events; data ≡ an ALICE run) • Performing the QA test and archiving the test result • Separate job (different from the jobs which create the data) • Can be performed on a data set larger than a run

  13. Requirements • Data must be mergeable • Archiving is done in AliEn using the standard directory structure (not the OCDB); local storage is possible too • Reference : …..QA/Ref/ (no idea yet how to deal with versions) • Data and QA: ….Run/ (no idea yet how to deal with versions) • Data are built in the event loop of the appropriate task (SDigitizer, Digitizer, Tracker) • It is the responsibility of the detectors to implement the QA control in their tasks

  14. Requirements continued • QA results for all detectors and all tasks in a single root file with a directory structure • The framework provides • AliQualAss::TNamed • Singleton • One bit array per detector • Naming conventions (detectors, tasks, files, …) • Setting, checking, storing

  15. Requirements continued • The framework provides • AliQualAssDataMaker::TNamed • Base class • Creation of the directory structure for QA results: Init() • Saving of the results: Finish() • Steering of the processing: Exec() • The detectors provide • AliDetQualAssDataMaker::AliQualAssDataMaker • Implement initialisation of the data objects (TH1, ….) • Implement filling of the data objects • Steer the QA data making from each task producing data

  16. Requirements continued • The framework provides • AliQualAssDataChecker::TNamed • Steers the QA check for all detectors and all tasks • Opens the Ref and data files (setters in a QA.C macro) • Does the QA for all objects stored in the QA data file • Updates the QA result • Saves the QA results file • AliQualAssDataCheckerBase::TNamed • Called by the preceding • Initialises the QA object and sets the appropriate detector and task • Retrieves/saves the QA result file • Does the check

  17. Requirements continued • The framework provides • AliQualAssDataCheckerBase::TNamed • Called by the preceding • Initialises the QA object and sets the appropriate detector and task • Retrieves/saves the QA result file • Comparison methods (Diff(), Kolmogorov, Chi2, …) • A simple checker • Implements a simple check (Diff of all objects (TH1) in the data file) • The detectors provide • AliQualAssDetChecker::AliQualAssDataCheckerBase • Overloads the comparison Check()

  18. Implementation • Detectors • AliDetxxxxx (SDigitizer, Digitizer, etc.) • Data member : AliDetQualAssDataMaker * fqa • Ctor: fqa->Init(AliQualAss::kHITS (or kSDIGITS)); • Event loop: fqa->SetData(hits (or sdigits)); fqa->Exec(); • It is different for ESDs, which are the responsibility of the framework • AliReconstruction • In the event loop, same as for the other tasks

  19. Implementation • aliroot -b -q QA.C • AliQualAssChecker qac; • qac.SetRefDir("…/…/…/"); • qac.SetOutDir("…/…/…/"); • qac.Run(); • To be run after sim.C and rec.C • The QA data file is on the local dir of the WN

  20. To be done (framework) • Implement the AliEn file access • Versioning scheme for the reference data and the QA results data • Granularity of the QA data • Correct all the misconceptions and illegal implementations which will be discovered by Peter. • What to do with the QA results • Publish • Test and action • Implement the QA online • QA data made online in MOOD • At end of run, ship the data to the FXS • Preprocessor algorithm to make the QA • Ship the QA results and QA data to AliEn with the Shuttle • Have the QA test running in MOOD
