
News from PandaRoot: what happened in the last months and what is going to happen


  1. News from PandaRoot: what happened in the last months and what is going to happen for the computing groups. Stefano Spataro, ISTITUTO NAZIONALE DI FISICA NUCLEARE, Sezione di Torino. Tuesday, 17th May 2011.

  2. The PandaRoot framework
  • based on Virtual Monte Carlo (2.7b → r537) and ROOT (5.26 → 5.29, May '11)
  • dynamic data structure (based on ROOT Trees and Folders)
  • use of many ROOT applications (TGeo, EVE, TMVA, TMemStat)
  • same geometry/code for Geant3 and Geant4 (9.2 → 9.4)
  • compiled and running on more than 10 Linux platforms + Mac OS X
  • AliEn2-based GRID
  • Tracking TDR data challenge
  • collaboration with external developers: CERN/ALICE, FAIR/HADES-CBM-R3B
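
  For orientation, a steering macro in the FairRoot style that PandaRoot is built on looks roughly like the sketch below. This is a minimal illustration, not a macro from the talk: the module name PndCave and all file names are placeholder assumptions.

    // sim_sketch.C - minimal PandaRoot-style simulation macro (illustrative sketch).
    // The FairRunSim steering class lets the same geometry run under Geant3 or
    // Geant4; detector and file names below are placeholders, not from the talk.
    void sim_sketch(Int_t nEvents = 10)
    {
      FairRunSim* run = new FairRunSim();
      run->SetName("TGeant3");            // switch to "TGeant4" without touching the geometry
      run->SetOutputFile("sim.root");
      run->SetMaterials("media_pnd.geo"); // media definitions shared by both engines

      FairModule* cave = new PndCave("CAVE");   // passive module (name assumed)
      cave->SetGeometryFileName("pndcave.geo");
      run->AddModule(cave);

      FairPrimaryGenerator* primGen = new FairPrimaryGenerator();
      run->SetGenerator(primGen);

      run->Init();
      run->Run(nEvents);
    }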

  3. Central Tracker TDR
  Deadline: June 2011
  • MVD: Bonn, Torino, Jülich
  • STT: Pavia, Jülich
  • GEM: GSI
  • TPC: Munich

  4. GRID Data Challenge 4
  Test of massive data production for the tracking TDR:
  • Stability of the PandaRoot code
  • GRID performance
  • Computing time for full reconstruction
  • Disk space required for full reconstruction

  5. Simulation Data Sample
  DPM background events, θmin = 2° (reduced Coulomb scattering)
  • 3 energies: pbar p @ 1.64 GeV/c, @ 4 GeV/c, @ 15 GeV/c
  • only Barrel Spectrometer
  • TPC and STT setups (150 cm), 3 GEM planes
  • 1 million events for each setup/energy, total ~ 6 Mevents
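
  For reference, attaching the DPM background generator in such a steering macro might look like the fragment below (continuing the sketch above). The PndDpmDirect class existed in PandaRoot at the time, but the exact argument convention shown here is an assumption.

    // Illustrative fragment: plug a DPM background generator into the run.
    // The PndDpmDirect constructor arguments (beam momentum in GeV/c and a
    // mode flag selecting elastic/inelastic events) are assumed, not quoted.
    FairPrimaryGenerator* primGen = new FairPrimaryGenerator();
    PndDpmDirect* dpm = new PndDpmDirect(4.0, 1); // pbar p @ 4 GeV/c (assumed signature)
    primGen->AddGenerator(dpm);
    run->SetGenerator(primGen);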

  6. Geometry
  Geant3, ALICE settings for the TPC
  Full Target Spectrometer, no Forward Spectrometer:
  • Pipes (pipe + beam)
  • MVD
  • TPC/STT (150 cm)
  • GEM (3 planes)
  • DIRC
  • DISC
  • EMC
  • MDT
  • Coils
  • Yoke + Muon Filter

  7. Full Reconstruction
  STT: Geant3 simulation, full digitization, tracking (MVD+STT), correlation for PID, available PID algorithms
  TPC: Geant3 (ALICE) simulation, full digitization, LHE tracking (MVD+TPC+GEM), correlation for PID, available PID algorithms
  Status legend (colour-coded on the slide): ready – minor fixes / something to improve / something missing
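
  As a sketch of how such a chain is steered, PandaRoot reconstruction runs through the FairRoot analysis driver shown below. The task class names are hypothetical stand-ins for the actual digitizer, tracker, and PID tasks mentioned on the slide.

    // reco_sketch.C - illustrative digitization/reconstruction steering macro.
    // FairRunAna executes a list of FairTask objects event by event; the task
    // names below are hypothetical placeholders, not real PandaRoot classes.
    void reco_sketch()
    {
      FairRunAna* run = new FairRunAna();
      run->SetInputFile("sim.root");        // output of the simulation macro
      run->SetOutputFile("reco.root");

      run->AddTask(new MyDigitizerTask());  // hypothetical digitization task
      run->AddTask(new MyTrackingTask());   // hypothetical pattern recognition/tracking
      run->AddTask(new MyPidTask());        // hypothetical PID correlation task

      run->Init();
      run->Run(0, 0);                       // process all events in the input
    }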

  8. The GRID
  Site status (map legend): Down / Ready / No Jobs
  • Dubna busy
  • GSI limited
  • Torino off (local shutdown)
  • Jobs submitted manually
  Not the full GRID capabilities.

  9. Maximum of 305 jobs running in parallel

  10. Evaluation
  Very high success rate!
  • Average DONE rate: 95% before subjob resubmission, 100% after resubmission
  • Good files (STT): 99%
  • Good files (TPC): 98%
  http://panda-wiki.gsi.de/cgi-bin/view/Computing/DataChallenge4

  11. CPU Time
  Average events per day: 2.5M

  Momentum      STT                        TPC
  1.64 GeV/c    500 events ~ 60 min        500 events ~ 30 min
  4.00 GeV/c    500 events ~ 70 min        500 events ~ 40 min
  15.0 GeV/c    1000 events ~ 3 h 30 min   1000 events ~ 1 h 40 min

  The STT pattern recognition is slower than the TPC LHE tracking; the TPC pattern recognition will need more time.
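
  As a rough cross-check (my arithmetic, not from the slide), the per-job rates are consistent with the quoted average once the ~305 parallel jobs are folded in, e.g. for STT at 4 GeV/c:

    \frac{500~\text{events}}{70~\text{min}} \times 1440~\frac{\text{min}}{\text{day}}
      \approx 1.0\times10^{4}~\frac{\text{events}}{\text{day}\cdot\text{job}},
    \qquad
    1.0\times10^{4} \times 305~\text{jobs} \approx 3.1\times10^{6}~\frac{\text{events}}{\text{day}}

  The slower 15 GeV/c samples and resubmissions pull this toward the quoted 2.5M events/day.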

  12. File Sizes

  Momentum      STT              TPC
  1.64 GeV/c    ~ 40 kB/event    ~ 200 kB/event
  4.00 GeV/c    ~ 50 kB/event    ~ 250 kB/event
  15.0 GeV/c    ~ 60 kB/event    ~ 300 kB/event

  Variations of 10-20% between different runs.

  13. “Differential” File Size (pbar p @ 4 GeV/c)

  Stage            STT           TPC
  Parameters       3 MB/file     2 MB/file
  Simulation       19 kB/evt     133 kB/evt
  Digitization     9 kB/evt      6 kB/evt
  Reconstruction   1 kB/evt      < 58 kB/evt
  Pid              2.2 kB/evt    2.3 kB/evt
  Total            ~ 40 kB/evt   ~ 200 kB/evt

  (The slide also groups the stages into the MC / ESD / AOD data tiers.)
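
  A back-of-envelope total for the whole sample (my estimate from the slide 12 averages, not stated in the talk): three 1M-event STT samples at ~40-60 kB/event and three 1M-event TPC samples at ~200-300 kB/event give

    3\times10^{6} \times 50~\text{kB} \approx 150~\text{GB (STT)},
    \qquad
    3\times10^{6} \times 250~\text{kB} \approx 750~\text{GB (TPC)},
    \qquad
    \text{total} \approx 0.9~\text{TB}.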

  14. Do we really have something useful?
  STT setup, 445k events, pbar p @ 4 GeV/c (DPM)

  15. [Event display: STT setup, 445k events, pbar p @ 4 GeV/c (DPM); labelled detectors: EMC, MVD, DIRC, DISC]

  16. Simulation of Benchmark Channels
  Still not assigned:
  • Realistic reconstruction (Central Tracker + MVD + GEM)
  • Event mixing

  17. Reconstruction @ May
  TPC:
  • dE/dx: something (?)
  • PR TPC: big improvements
  • PR TPC+MVD: no news
  • PR MVD+TPC+GEM: no news
  • Event time mixing
  STT:
  • dE/dx: done
  • PR with GEM: done
  • Event time mixing: done
  Still several points to fix before starting production.

  18. Event Time Mixing (G. Boca)

  19.-20. [Event time mixing plots, G. Boca]

  21. Event Time Mixing
  • STT: mixing with a maximum of 10 events; pattern recognition 6 times slower
  • TPC: mixing with 1000 events; pattern recognition ~ 30 min per event
  Room for improvement in the pattern recognition code.

  22. First test of the TPC dE/dx algorithm
  [Plots: TPC dE/dx vs momentum; SUM]
  Very primitive; it requires experts!
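
  For context, a common dE/dx estimator in TPCs (not necessarily the one tested here) is the truncated mean of the per-sample charges, which suppresses the Landau tail:

    \left\langle \frac{dE}{dx} \right\rangle_{\text{trunc}}
      = \frac{1}{\lfloor \alpha N \rfloor} \sum_{i=1}^{\lfloor \alpha N \rfloor} q_{(i)},
    \qquad q_{(1)} \le q_{(2)} \le \dots \le q_{(N)},\ \ \alpha \approx 0.6\text{-}0.7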

  23. Tracking
  [Momentum resolution plots per particle; slide annotations: "fine resolution", "something wrong"]

  24. Electron ID
  Ronald's Bayes-based algorithm in PandaRoot
  [Plots: positive particles, negative particles]
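
  For reference, a Bayesian PID assignment of this kind computes, for each track with measured PID variables x, the posterior probability of the electron hypothesis (standard Bayes' theorem; the concrete likelihoods and priors are PandaRoot's and are not shown in the talk):

    P(e \mid \vec{x}) =
      \frac{p(\vec{x} \mid e)\, C_{e}}
           {\sum_{h \in \{e,\mu,\pi,K,p\}} p(\vec{x} \mid h)\, C_{h}}

  with C_h the prior abundance of hypothesis h.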

  25. Muon ID
  [Efficiency plots for muons and pions: BARREL+EC, EC+MF, BARREL]

  26. Muon ID
  [BARREL plots at 1 and 3 GeV/c, presumably for μ and π (particle symbols lost in transcription)]

  27. Trying to Summarize
  Ongoing activities for the tracking TDR:
  • The GRID is going to be used for data production
  • Reconstruction is going to be finalized: STT code almost ready, TPC needs a bit more time
  For the analysis:
  • Electron ID possible
  • Muon ID possible
  • Studies on analysis tools
