Offline software for ECAL test beam
• The pre-processing model
• The offline software framework
• Reading test beam data into ORCA
• OSCAR: Monte Carlo
• The physics/software/protocol interface
ECAL testbeam Workshop
Preprocessing
• Advantages:
  • Time-consuming calculations etc. are done once, centrally
    • Pulse reconstruction
    • Drift chamber analysis
  • Everyone uses the same results (for those calculations)
  • Puts the data into a convenient format (ROOT)
• Disadvantages:
  • All the input has to be right before running the preprocessing
  • If there is a mistake in the input, the preprocessing has to be repeated
• Input (see the sketch below):
  • Pulse reconstruction constants etc.
  • Pedestal values
  • Relative gain between ranges
  • Beam position measurement constants (2002: drift chambers; 2003: SF Hodo)
• Delays result in many people working directly from the Zebra data, outside any framework…
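The role of the per-channel constants listed above can be illustrated with a minimal sketch of the calibration step applied during preprocessing: pedestal subtraction followed by a relative-gain correction across readout ranges. The data structure and constant names are illustrative assumptions, not the actual ORCA or preprocessing interfaces.

```cpp
// Minimal sketch (illustrative only): subtract the per-range pedestal from each
// ADC time sample, then scale by the relative gain of the selected range so that
// all samples are expressed on a common scale before pulse reconstruction.
#include <vector>

struct RawSample {
    int adc;    // raw ADC count
    int range;  // gain range chosen by the multi-range readout
};

std::vector<double> calibrateSamples(const std::vector<RawSample>& raw,
                                     const double pedestal[],   // per-range pedestal values (assumed)
                                     const double rangeGain[])  // relative gain between ranges (assumed)
{
    std::vector<double> out;
    out.reserve(raw.size());
    for (const RawSample& s : raw)
        out.push_back((s.adc - pedestal[s.range]) * rangeGain[s.range]);
    return out;
}
```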
Can we do without it?
• It ought to be possible to do both the beam position and the pulse reconstruction sufficiently rapidly to make pre-processing unnecessary
• Write out the raw data in ROOT format (see the sketch below)
  • This helps with the requirement to keep test beam raw data available for checking for the lifetime of CMS
• The raw data format needs to be fixed: the ‘new electronics’ imply readout of blocks of 25 crystals
  • How many crystals are read out for beam data (4×25?)
  • A different number for laser runs, pedestals, etc.
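If the raw data are written out in ROOT format, one straightforward approach is a flat TTree with one entry per event. The branch layout below (one block of 25 crystals with a fixed number of time samples) is only an illustrative assumption, not the agreed raw data format.

```cpp
// Minimal sketch of writing raw test beam data into a ROOT TTree.
// The branch layout (25 crystals x 10 time samples per event) is an
// illustrative assumption, not the agreed raw data format.
#include "TFile.h"
#include "TTree.h"

int main() {
    int run = 0, event = 0;
    int adc[25][10];   // raw ADC counts for one readout block

    TFile file("rawdata.root", "RECREATE");
    TTree tree("RawData", "ECAL test beam raw data");
    tree.Branch("run",   &run,   "run/I");
    tree.Branch("event", &event, "event/I");
    tree.Branch("adc",   adc,    "adc[25][10]/I");

    // ... loop over the DAQ records, filling run, event and adc, then:
    tree.Fill();

    tree.Write();
    file.Close();
    return 0;
}
```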
Offline software
• We need to develop a framework in which the tools and code people develop can be incorporated and used generally/publicly
  • At present, the same problems/tasks are being solved many times over
  • Everybody has to invent their own version of the wheel
• We need a comprehensive breakdown of the “preparatory tasks”, with timescales and defined responsibilities:
  • Beam position reconstruction constants
  • Pedestal values
  • Pulse reconstruction code and constants
  • Gain constants (relating the ranges), etc.
• We should try to make maximum use of CMS CCS resources: manpower, computing resources, tools, etc.
  • Easier said than done… but the effort is worthwhile
  • Try to use standard CMS methods where reasonable
  • David Stickland will soon be asking for estimates of the data volumes for 2003 and 2004
• We need to agree on the direction, then design and put together the framework before the 2003 test beam run
  • Using the 2002 data as “fake data” to test it
Reading data into ORCA
• From the PRS Annual Review Report, 2002:
  • “The positive experience from the Tracker community in the use of the CMS OO software for their beam tests is extremely encouraging. It is hoped that the other sub-detectors will follow suit and start to incorporate at least some of the tools into their future beam tests, even though it may appear to be ‘overkill’ at first. At the very least the review panel feels it is mandatory for the sub-detectors to make comparisons of their test beam data with Monte-Carlo within the OSCAR/GEANT4 framework as soon as possible and they are encouraged to use the ORCA framework for their analysis.”
  • Recommendation: “All future beam-tests must include the usage of the OO framework; in particular the analysis and comparisons of data with MC should be made using the CMS software.”
Need for reading data into ORCA
• “Detector response”: final validation of the pulse reconstruction simulation
  • Despite the additional problem of the clock phase in test beam data
• Final validation of the geometry description and the shower MC model (i.e. GEANT4/OSCAR)
• Familiarization with the CMS software environment for a wider ECAL community
• Bridging the gap between the test beam perspective and the reconstruction work
  • e.g. the concepts used in position measurement
OSCAR/GEANT4
• Fully functional super-modules will provide the final validation of the geometry and the shower MC
• Issues:
  • Shower shape
  • Containment ratios 1/9 and 1/25 (relevant to calibration; see the sketch below)
  • Gaps and material between crystals
  • Highly ‘stepped’ crystals at large η
  • Effect of the angle of incidence
• Much is already known from PRS reconstruction studies, but it needs to be verified with data
• Work is underway using OSCAR to simulate the test beam setup
  • See Dan Holmes, http://cmsdoc.cern.ch/Physics/egamma/transparencies/m86-5.pdf, and also the talk at this workshop by Sasha Nikitenko
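The containment ratios referred to above compare the energy in the central crystal with the energy summed over the surrounding 3×3 and 5×5 matrices. A minimal sketch of how they could be computed from a 5×5 matrix of crystal energies is shown here; the array layout is an illustrative assumption, not the test beam analysis code.

```cpp
// Minimal sketch: containment ratios E1/E9 and E1/E25 from a 5x5 matrix of
// crystal energies centred on the hit crystal (illustrative layout only).
#include <cstdio>

int main() {
    double e[5][5] = {{0}};   // crystal energies (GeV), to be filled from data or MC

    double e1  = e[2][2];     // central crystal
    double e9  = 0.0;         // 3x3 sum around the centre
    double e25 = 0.0;         // full 5x5 sum
    for (int i = 0; i < 5; ++i) {
        for (int j = 0; j < 5; ++j) {
            e25 += e[i][j];
            if (i >= 1 && i <= 3 && j >= 1 && j <= 3) e9 += e[i][j];
        }
    }
    if (e9 > 0.0 && e25 > 0.0)
        std::printf("E1/E9 = %.3f, E1/E25 = %.3f\n", e1 / e9, e1 / e25);
    return 0;
}
```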
The physics/software/protocol interface
• There are a number of seemingly trivial small details that can have a large impact
  • Maybe we can only learn by tripping up, although we should try to exercise forethought
• We need tools for rapid evaluation in H4
• We took 25 time samples; with hindsight we can see that this was a real waste of DAQ bandwidth, disk space, access time and pre-processing time…
• We must have coherent noise measurement code that is rapidly usable (see the sketch below)
• We must have an analysis tool available to determine the offset of the table
  • The table was misaligned in one dimension; we must use the best dead-reckoning followed by offset tuning
• The available data acquisition rate, and the details of the experimental protocol (autoscan, position scan, etc.), can make an enormous difference to the time needed to take the data required to achieve a particular goal
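One common way to measure coherent noise, sketched below under stated assumptions, is to compare the per-channel pedestal RMS with the RMS of the event-by-event sum over all channels in a block: a fully correlated (coherent) component grows quadratically with the number of channels summed, while the incoherent part grows only linearly. The event container and channel count are illustrative, not the actual test beam code or method.

```cpp
// Minimal sketch (one common method, not the test beam code): separate coherent
// and incoherent noise from pedestal events using
//   <var(channel)>        = sigma_incoh^2 + sigma_coh^2
//   var(sum over N chan.) = N*sigma_incoh^2 + N^2*sigma_coh^2
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    const int nCh = 25;                           // channels in one readout block (assumed)
    std::vector<std::vector<double>> pedEvents;   // pedestal events: pedEvents[event][channel]
    // ... fill pedEvents from a pedestal run ...
    if (pedEvents.empty()) return 1;

    const double nEv = pedEvents.size();
    std::vector<double> mean(nCh, 0.0), var(nCh, 0.0);
    double sumMean = 0.0, sumVar = 0.0;

    for (const auto& ev : pedEvents)
        for (int c = 0; c < nCh; ++c) mean[c] += ev[c] / nEv;
    for (int c = 0; c < nCh; ++c) sumMean += mean[c];

    for (const auto& ev : pedEvents) {
        double s = 0.0;
        for (int c = 0; c < nCh; ++c) {
            var[c] += (ev[c] - mean[c]) * (ev[c] - mean[c]) / nEv;
            s += ev[c];
        }
        sumVar += (s - sumMean) * (s - sumMean) / nEv;
    }

    double avgVar = 0.0;
    for (int c = 0; c < nCh; ++c) avgVar += var[c] / nCh;
    double cohVar = (sumVar - nCh * avgVar) / (double(nCh) * nCh - nCh);
    if (cohVar < 0.0) cohVar = 0.0;
    std::printf("incoherent = %.2f, coherent = %.2f (ADC counts)\n",
                std::sqrt(avgVar - cohVar), std::sqrt(cohVar));
    return 0;
}
```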
Summary
• M0’ was a big step forwards, but the steps needed to get to 10 super-modules in 2004 are even bigger…
• The discipline of a collaborative software and analysis effort is a burden: it is painful, restrictive and irritating… but without it we will not achieve our goal