Reviewing the NS Inspiral S2 paper
Inspiral review committee: V. Kalogera, W. Kells, A. Weinstein (chair), A. Wiseman
Inspiral Review, LSC meeting, Mar. 2004
What we are reviewing
• We have a rather mature first draft of a BNS S2 paper: “Upper limit on the coalescence rate of Galactic and extragalactic binary neutron stars established from LIGO observations”, 3/11/04
• There is extensive documentation of the details in the e-log notebook
• The Inspiral Analysis Group has plans for more papers in the coming months:
  (i) S2 Binary neutron star paper (present at March LSC)
  (ii) S2 Binary black hole search (present at June LSC)
  (iii) S3 Binary neutron star incl. GEO (present at June LSC)
  (iv) S2/S3 MACHO binaries (present at Aug. LSC)
  (v) S2 LIGO–TAMA search (present at Aug. LSC)
  (vi) S3 Binary black hole incl. GEO (present at Aug. LSC)
Some things to look closely at
• The BIG PICTURE: is the IAG addressing the right (astrophysical) questions? Is it covering all the questions it can and should? Is it using appropriate approaches? Is it organizing the papers and analysis sub-groups sensibly?
• Astrophysical motivations: are they adequately articulated in the paper? Are there flaws or concerns in the argument that should be addressed?
• Astrophysical models: are the templates modeled correctly? Is the bank tiled sensibly? (See the match sketch below.)
• Model dependence: effect of spin, higher-order terms, astrophysical effects such as orbital decay?
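As an aside on bank tiling: a minimal sketch of the “match” that drives the minimal-match tiling criterion, assuming a one-sided frequency-domain convention. This is illustrative numpy, not the LAL filtering code used in the search.

import numpy as np

def match(h, t, delta_f, psd):
    """Overlap of two frequency-domain waveforms h and t, maximized over
    coalescence time (inverse FFT) and phase (complex magnitude), and
    normalized so identical waveforms give 1. Templates are spaced so that
    no signal in the target space loses more than the minimal match."""
    def norm(a):
        return np.sqrt(4.0 * delta_f * np.real(np.sum(a * np.conj(a) / psd)))
    # Complex overlap as a function of relative time shift:
    z = 4.0 * delta_f * len(h) * np.fft.ifft(h * np.conj(t) / psd)
    return np.abs(z).max() / (norm(h) * norm(t))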
things to look closely at (2)
• Observation time: is it well understood? Effect of “chunks”, “segments”, chunk edges, overlaps, etc. (See the live-time sketch below.)
• Data handling: any data drop-outs due to bombed jobs? Due to incorrect overlapping? Forgotten cuts? What assurances / tests do we have?
• Data conditioning, filtering, line removal… Sensitivity of the result to data conditioning. Effect of spectral features and non-stationarity on the result.
• Data quality: science segments, quality flags, vetoes. Granularity of PSD and calibration calculations. What assurances / tests that these are understood and handled correctly?
• Safety of vetoes and cuts – assurances that no loud GWs have vetoed themselves.
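To make the live-time bookkeeping concrete, a toy calculation with invented segment lists and padding (the real accounting lives in the pipeline and the e-log): analyzable time is the science segments minus vetoed intervals, with filter start-up transients trimmed from each chunk edge.

def subtract_vetoes(segments, vetoes):
    """Remove (start, end) veto intervals from (start, end) science segments."""
    out = []
    for s, e in segments:
        cur = s
        for vs, ve in sorted(vetoes):
            if ve <= cur or vs >= e:
                continue  # veto lies outside what is left of this segment
            if vs > cur:
                out.append((cur, vs))
            cur = max(cur, ve)
        if cur < e:
            out.append((cur, e))
    return out

def live_time(segments, vetoes, edge_pad=0.0):
    """Total analyzable time: science minus vetoes, with edge_pad seconds
    trimmed from each end of each surviving chunk."""
    return sum(max(0.0, (e - s) - 2.0 * edge_pad)
               for s, e in subtract_vetoes(segments, vetoes))

# Made-up example: two science segments, one veto, 64 s of edge padding.
print(live_time([(0, 2048), (3000, 5048)], [(100, 130)], edge_pad=64.0))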
things to look closely at (3)
• Hardware injections – understand what they were used for and the checks / assurances they do and do not provide.
• Calibration – how well is it understood, quantitatively? Checks & assurances. (See the toy response sketch below.)
• Analysis pipeline and event tuning – review of procedure, examination of code, searching for flaws, bugs, and mechanisms for data to be lost or double-counted. Sensitivity to different tunings?
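On calibration, a deliberately toy single-pole response model, just to show the mechanism by which calibration errors propagate: the recovered strain is the error signal scaled by a response R(f), so a fractional error in |R(f)| becomes the same fractional error in amplitude, effective distance, and reach. The gain and pole frequency below are placeholders, not the S2 response function.

import numpy as np

def calibrated_strain(err_fft, freqs, gain=1.0, cavity_pole_hz=85.0):
    """Scale an error-signal spectrum by a toy response R(f)."""
    response = gain * (1.0 + 1j * freqs / cavity_pole_hz)  # placeholder model
    return response * err_fft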
things to look closely at (4)
• Single-IFO cuts: SNR, chisq. Tuning.
• Coherent (multi-detector) cuts: dt, matching of masses and D_eff. Handling of H1/H2 vs. L1. (See the coincidence sketch below.)
• Efficiencies, Monte Carlo simulations – review of procedure, examination of code, searching for flaws, bugs…
• Calculation of source reach. Model dependence: how well are source populations modeled?
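A hedged illustration of the coincidence step; the windows below are invented for the example, not the tuned S2 values.

def coincident(trig_a, trig_b, dt_max=0.011, dm_max=0.03, ddeff_frac=None):
    """trig = (end_time_s, chirp_mass, d_eff). Two single-IFO triggers pass
    if end times and chirp masses agree within windows. An effective-distance
    consistency test only makes sense for the co-located H1/H2 pair, so it
    is optional here."""
    t1, m1, d1 = trig_a
    t2, m2, d2 = trig_b
    if abs(t1 - t2) > dt_max or abs(m1 - m2) > dm_max:
        return False
    if ddeff_frac is not None and abs(d1 - d2) / min(d1, d2) > ddeff_frac:
        return False
    return True

# An H1-L1 pair allows light travel time (10 ms) plus timing slop in dt_max;
# an H1-H2 pair would use a much tighter dt_max and a finite ddeff_frac.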
things to look closely at (5)
• Background estimation. Statistics, cut dependence.
• Handling and evaluation of loudest triggers.
• Statistical analysis for the upper limit. (See the worked sketch below.)
• The paper: clear, readable, accurate, complete, succinct?
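A worked sketch of the zero-background loudest-event rate limit that underlies the statistical analysis; the inputs are placeholders, not the paper's numbers, and the full method also accounts for background and the loudest trigger's significance.

import math

def rate_upper_limit(T_obs_yr, efficiency, n_mweg, cl=0.90):
    """Rate limit per year per Milky Way equivalent galaxy (MWEG):
    R_CL = -ln(1 - CL) / (eps * T * N_G); at 90% CL, -ln(0.1) = 2.303."""
    return -math.log(1.0 - cl) / (efficiency * T_obs_yr * n_mweg)

# Placeholder inputs: ~0.1 yr of live time, 50% efficiency, ~1.1 MWEG.
print(rate_upper_limit(0.1, 0.5, 1.1))  # events / yr / MWEG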
The review process
• Over the next 2–3 weeks, the committee will examine all aspects of the analysis and the paper, formulate questions and request responses, and prepare a summary report
• Expect to meet a couple of times without the IAG, and maybe once or twice with IAG chairs and members
• We welcome advice, comments, and suggestions from all LSC members: ajw@caltech.edu