Diphoton+MET 2015: Overview of Path towards First Results
A living document…
Bruce Schumm, SCIPP
18 May 2015
Timelines I
• Is this still the best guess?
• 50 nsec running
  • Few pb-1 by end of May (but trigger in commissioning)
  • As much as 1 fb-1 by end of June
  • 2-3 fb-1 by end of August (can use for a CONF note)
• 25 nsec running
  • 10 fb-1 by end of October (for journal publication)
• Proposal: push/optimize for the 2-3 fb-1 result
Timelines II
• If we push for an “August” result…
  • Analysis walkthrough beginning of June
    • ~2 hr process, with much discussion
    • Expected to present the unblinding case during the walkthrough, up to necessary lacunae associated with data-driven studies
  • Editorial Board formed at that point
  • Draft of support note expected at that point
• Seems unlikely: grids probably not even ready. Instead, just present status.
Tasks Overview
• Code/Infrastructure [some done]
  • xAOD
  • Derivations
  • Higher-level infrastructure (Ryan’s package)
  • Event variables (MET with photons, etc.)
• Event selection
  • Preliminary studies
  • Optimization
• Backgrounds
  • QCD
  • Electroweak
  • Irreducible
  • Overlap (?)
• Models [largely done]
  • SM samples
  • Strong & EW signal
  • Full vs. fast sim?
Models
• SM samples defined; generation underway. Much overlap with other groups; much hard work by the Milano group!
• Gluino and wino grids defined
  • All BF and decay-length issues resolved
  • Fast Sim sufficiently validated
  • Of order ~2M events at 10k/point, more or less approved
  • Final validation step (generator-level filter to ensure two binos in each event; sketched below) underway
• Need this soon! (optimization)
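A minimal sketch of what the two-bino filter could look like, assuming a generic truth-particle interface (not the actual xAOD truth API):

```python
# Hedged sketch of a generator-level two-bino filter. The truth-particle
# interface (objects with a pdg_id attribute) is a placeholder; in practice
# one would also select the appropriate status code to avoid double-counting
# intermediate copies of the same particle.

BINO_PDG_ID = 1000022  # lightest neutralino, the bino in these models

def has_two_binos(truth_particles) -> bool:
    """Keep only events with exactly two binos in the truth record."""
    n_binos = sum(1 for p in truth_particles if abs(p.pdg_id) == BINO_PDG_ID)
    return n_binos == 2
```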
Backgrounds - QCD
• Prior approach was to assume real diphotons are 75 ± 25% of the low-MET background
  • Diphoton MC used to estimate the high-MET contribution
  • Pseudophoton control sample, scaled to the remainder of the low-MET events, used to estimate the γ-jet contribution
• Using 8 TeV data to explore a new approach (ABCD method with pseudophotons and relaxed isolation; the arithmetic is sketched below); preliminary results expected soon
• If this doesn’t work, will need to fall back to the old pseudophoton control-sample technique
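A minimal sketch of the ABCD arithmetic, with invented region counts: taking photon ID (tight vs. pseudophoton) and isolation (nominal vs. relaxed) as the two axes, the QCD yield in the signal-like region A is N_B · N_C / N_D, provided the two axes are uncorrelated for QCD.

```python
# Sketch of an ABCD background estimate (all counts are hypothetical).
# Region A = tight + isolated (signal-like); B, C, D invert one or both axes.
# If ID and isolation are uncorrelated for QCD, N_A ~ N_B * N_C / N_D.

def abcd_estimate(n_b: float, n_c: float, n_d: float) -> float:
    """Predicted QCD yield in region A from the three control regions."""
    if n_d <= 0:
        raise ValueError("Region D must be populated")
    return n_b * n_c / n_d

# Example with made-up control-region counts:
print(abcd_estimate(n_b=120.0, n_c=85.0, n_d=640.0))  # ~15.9 events
```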
Backgrounds - EW
• Estimate with an eγ control sample scaled by the e→γ fake rate (a sketch of the scaling follows below)
  • Need to select the eγ control sample
  • Tag-and-probe study of the e→γ fake rate underway [Giacomo]
• MAYBE:
  • Wγ MC suggests that ~25% of the EW background doesn’t arise from e→γ fakes
    • Some of this may be accounted for in the QCD background
    • Some of the QCD background may include e→γ fake events
  • Prior approach was to include a 25% systematic error on the EW background
  • Perform a QCD/EW background-overlap study?
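For concreteness, a sketch of the scaling step (the fake rate and control-sample count are placeholders):

```python
# Sketch of the EW background estimate: weight the e-gamma control sample
# by the e->gamma fake rate. Values below are placeholders; whether the
# weight is f or f/(1-f) depends on how the tag-and-probe rate is defined.

def ew_background(n_egamma: float, fake_rate: float) -> float:
    """Predicted electron-faking-photon contribution to the diphoton sample."""
    return n_egamma * fake_rate

print(ew_background(n_egamma=200.0, fake_rate=0.02))  # 4.0 events
```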
Backgrounds - Irreducible
• Wγγ contribution estimated via an lγγ control sample and a simultaneous fit with the SR
  • Question about comparison with the VBFNLO expectation
  • Need to develop the control sample and explore
• Zγγ contribution from Sherpa, scaled to VBFNLO (via MadGraph) in the relevant kinematic region (a sketch of the scaling appears below)
  • Big difference between VBFNLO and Sherpa not understood (Sherpa much larger)
  • Need to revisit
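A trivial sketch of the scaling step (cross-section numbers are invented; per the bullet above, Sherpa currently comes out much larger than VBFNLO, so k < 1 here):

```python
# Sketch: rescale the Sherpa Z-gamma-gamma yield to the VBFNLO cross section
# in the relevant kinematic region. All numbers are placeholders.

def k_factor(sigma_vbfnlo: float, sigma_sherpa: float) -> float:
    """Multiplicative correction applied to the Sherpa prediction."""
    return sigma_vbfnlo / sigma_sherpa

n_sherpa = 2.4                       # hypothetical Sherpa yield in the SR
k = k_factor(sigma_vbfnlo=1.1, sigma_sherpa=3.0)
print(k, n_sherpa * k)               # scaled expectation
```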
TASKS POTENTIALLY COMMON TO ALL GROUPS
• SM MC samples
• xAOD framework issues (not completely sure what I mean by this…)
• Code snippets reflecting agreed-upon object definitions (a hedged sketch follows below)
  • Isolation variable code
  • Pseudophoton object definition code
• e→γ fake rate study
• Δφ studies (one-sided or two-sided?)
• MAYBE?
  • Common derivation (probably not, since different triggers)?
  • Common systematics tasks?
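A hedged sketch of what shared object-definition snippets might look like; the Photon container, the topoetcone40 threshold, and the pT cut are all placeholders pending the agreed-upon definitions:

```python
# Placeholder photon-object definitions for the common code snippets.
# "Pseudophoton" here means a loose-not-tight candidate, as used in the
# QCD control samples. All cut values are illustrative only.

from dataclasses import dataclass

@dataclass
class Photon:
    pt: float            # GeV
    topoetcone40: float  # calorimeter isolation energy, GeV
    is_tight: bool       # passes tight photon ID
    is_loose: bool       # passes loose photon ID

def is_isolated(ph: Photon, max_iso: float = 4.0) -> bool:
    """Placeholder isolation requirement on topoetcone40."""
    return ph.topoetcone40 < max_iso

def is_signal_photon(ph: Photon, pt_min: float = 75.0) -> bool:
    """Tight, isolated photon above the (to-be-optimized) pT cut."""
    return ph.is_tight and is_isolated(ph) and ph.pt > pt_min

def is_pseudophoton(ph: Photon, pt_min: float = 75.0) -> bool:
    """Loose-not-tight candidate used to model the QCD background."""
    return ph.is_loose and not ph.is_tight and ph.pt > pt_min
```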
Code/Infrastructure
• xAOD-based analysis: TokyoTech, UCSC need to catch up
• Derivations followed through upon by Milano (status?)
• Higher-level statistics and plotting utility (Ryan…)
• Past quantities that have required study (do we need to look into these?)
  • MET
  • Isolation definition
  • ???
Event Selection: Preliminary Studies
• In the past, formal optimization was the last step, considering only Meff (or HT) and MET
• Individual, preliminary studies used to establish:
  • Photon pT cut; see e.g. https://indico.cern.ch/event/165989/contribution/0/material/slides/0.pdf
  • Δφ(γ, MET): make use of it or not; cut value. Should we also cut on (Δφ(γ, MET) - π)?
  • Δφ(jet, MET): cut value. Should we also cut on (Δφ(jet, MET) - π)? (One-sided vs. two-sided; sketched below)
• For 8 TeV, used the Meff vs. MET visualization plane (see below)
• Will need signal grid points for this already!
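A minimal sketch of the one-sided vs. two-sided Δφ cuts (cut values are placeholders):

```python
# One-sided cut: the object must not be aligned with MET.
# Two-sided cut: additionally reject events where the object is exactly
# back-to-back with MET, i.e. also require (pi - dphi) above threshold.

import math

def delta_phi(phi1: float, phi2: float) -> float:
    """Wrapped |phi1 - phi2| in [0, pi]."""
    dphi = abs(phi1 - phi2) % (2.0 * math.pi)
    return 2.0 * math.pi - dphi if dphi > math.pi else dphi

def passes_one_sided(phi_obj: float, phi_met: float, cut: float = 0.5) -> bool:
    return delta_phi(phi_obj, phi_met) > cut

def passes_two_sided(phi_obj: float, phi_met: float, cut: float = 0.5) -> bool:
    d = delta_phi(phi_obj, phi_met)
    return d > cut and (math.pi - d) > cut
```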
Optimization: 8 TeV Approach
• Last step done by inspection of the Meff (or HT) vs. MET plane
• Can be confounded by statistics; also look at background and signal stats over the same plane
• See 8 TeV backup note
[Figure: WP2 optimization in the Meff vs. MET plane, with accepted/rejected (YES/NO) regions marked]
Optimization: The Conundrum
• How to estimate backgrounds for optimization when the final data-driven estimates are not yet available?
• For the 8 TeV analysis optimization, backgrounds were estimated by:
  • QCD: scaling the 1-tight + 1-non-isolated-pseudophoton sample to the 2-tight-photon sample, with no Meff (HT) cut, for 0 < MET < 60 GeV (DATA)
  • EW: scaling the eγ sample by a uniform 2% e→γ scale factor (DATA)
  • Wγγ, Zγγ from MC
• SUSY group will accept leaving the final data-driven step and a quick reoptimization until just before unblinding. Or, pre-optimize as a function of one to-be-determined background value (a sketch follows below)
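One way to realize the pre-optimization option, sketched with hypothetical yields: leave the QCD normalization s_qcd free, tabulate the preferred cuts for a range of assumed values, and read off the answer once the data-driven estimate is final.

```python
# Sketch of pre-optimizing as a function of one to-be-determined background.
# All yields and the figure of merit are placeholders.

import math

def zn(s: float, b: float, rel_unc: float = 0.3) -> float:
    """Crude s/sqrt(b + (rel_unc*b)^2) figure of merit (placeholder)."""
    var = b + (rel_unc * b) ** 2
    return s / math.sqrt(var) if var > 0 else float("inf")

def best_cuts(sig, qcd_shape, ew, s_qcd):
    """sig, qcd_shape, ew: dicts keyed by (meff_cut, met_cut) in GeV."""
    return max(sig, key=lambda c: zn(sig[c], s_qcd * qcd_shape[c] + ew[c]))

sig = {(1000, 150): 8.0, (1500, 200): 5.0}
qcd = {(1000, 150): 0.6, (1500, 200): 0.1}  # shape only; normalization free
ewk = {(1000, 150): 1.2, (1500, 200): 0.3}
for s_qcd in (0.5, 1.0, 2.0):
    print(s_qcd, best_cuts(sig, qcd, ewk, s_qcd))
```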
What SRs to Create?
• For the 8 TeV analysis:
  • Strong production: high Meff; backgrounds near 0
  • EW production: intermediate HT; backgrounds of 1-2 events
  • Low-mass bino and high-mass bino for each: SP1, SP2, WP1, WP2
• Also: model-independent SR (MIS), no Meff (HT) cut. Based on choosing the MET cut at which the EW and QCD backgrounds are about the same (~1 event each)
Model-Independent SR (?)
• 8 TeV analysis: at MET = 250 GeV, with no Meff cut, the EW and QCD backgrounds are about the same (a toy version of this crossing-point choice is sketched below)
• Question: should we rethink? What do we really want to do to minimize the chance that we miss a signal?
• Hmmm… How do we think about this?
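A toy version of the crossing-point recipe (the yield curves are invented placeholders):

```python
# With no Meff cut, pick the MET cut at which the steeply falling QCD yield
# drops to meet the more slowly falling EW yield (~1 event each at 8 TeV).

def crossing_met_cut(met_cuts, ew_yields, qcd_yields):
    """Return the first MET cut where the EW yield meets or exceeds QCD."""
    for met, ew, qcd in zip(met_cuts, ew_yields, qcd_yields):
        if ew >= qcd:
            return met
    return None

mets = [150, 200, 250, 300]
ew   = [6.0, 2.5, 1.1, 0.5]   # falls slowly with the MET cut
qcd  = [20.0, 5.0, 1.0, 0.2]  # falls steeply with the MET cut
print(crossing_met_cut(mets, ew, qcd))  # -> 250
```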
What Physics Could Hide a Signal with Dominant BF into Photons and DM?
• Degenerate SUSY scenarios? No: the energy has to go somewhere, and we would see it in photons and/or MET. The photons will not be soft, because the decaying state will be either high-mass or boosted.
• Low photonic BF? Would need to accelerate single-photon analyses. Not really practical.
• Long-lived scenarios? Need to re-create non-pointing photon reconstruction. Probably no competition from CMS here anyway.
• Perhaps the most likely scenario is a lower-than-expected cross section from a non-SUSY process. Probably best addressed by what was done before, or perhaps just use no Meff or HT cut together with the lower MET cut of the other, model-dependent SRs. Could perhaps also maintain a low photon ET cut, but that could be a “can of worms”.
Wrap Up
• I haven’t mentioned limit setting within HistFitter
• Immediate motivation is to unblind before, or simultaneously with, CMS
• I’m not assuming we’ll necessarily be setting limits!
• Our work is cut out for us. Thoughts?
• We should start writing the skeleton of the backup note. If anyone is itching to do this, by all means. Otherwise, I’m very happy to do that.