Data Analysis Techniques for Gravitational Wave Observations
S. V. Dhurandhar
IUCAA, Pune, India
• Great strides taken by experimentalists in improving the sensitivity of GW detectors
• Technology driven to its limits
• Gravitational wave data analysis: an important component of GW observation
Signals with parametrizable waveforms
• Deterministic: binary inspirals (modelled on the Hulse-Taylor binary pulsar), continuous wave sources
• Stochastic: stochastic background
Unmodelled sources
• Bursts and transients
Strain amplitudes: h ~ 10^-23 to 10^-27
Source Strengths: amplitude estimates for binary inspirals, periodic sources, and the stochastic background (formulas shown on the slide)
Detector sensitivity for the S2 run (noise curves shown on the slide; data from http://www.ligo.caltech.edu/~lazz/distribution/Data)
Data Analysis Techniques
Techniques depend on the type of source:
• Binary inspirals: matched filtering
• Continuous wave signals: Fourier transforms after applying Doppler/spin-down corrections
• Stochastic background: optimally weighted cross-correlation of data from independent detectors
• Unmodelled sources (bursts): time-frequency methods, excess power statistics
Inspiraling compact binary
Waveform well modelled:
• PN approximations (Damour, Blanchet, Iyer)
• Resummation techniques: Pade, effective one body – extend the validity of the PN formalism (Damour, Iyer, Sathyaprakash, Buonanno, Jaranowski, Schafer)
Waveform h(t), noise PSD S_h(f).
The matched filter, optimal for stationary Gaussian noise, correlates the data with the template weighted by the inverse noise PSD: <x, h> = 4 Re ∫ x~(f) h~*(f) / S_h(f) df (integral over positive frequencies).
Detection probability is maximised for a given false alarm rate.
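A minimal numerical sketch of this matched-filter inner product (my illustration, not the LIGO search code): it assumes hypothetical arrays `data` and `template` sampled at rate `fs`, and a one-sided PSD array `psd` given on the rfft frequency grid.

```python
import numpy as np

def matched_filter_snr(data, template, psd, fs):
    """Matched-filter SNR of `template` against `data` at a fixed arrival time.

    Implements rho = <x, h> / sqrt(<h, h>) with
    <a, b> = 4 Re sum_{f>0} a~(f) conj(b~(f)) / S_h(f) * df,
    where `psd` is the one-sided noise PSD on the rfft frequency grid.
    """
    n = len(data)
    df = fs / n
    x_f = np.fft.rfft(data) / fs      # approximate continuous Fourier transform
    h_f = np.fft.rfft(template) / fs
    xh = 4.0 * np.real(np.sum(x_f * np.conj(h_f) / psd)) * df
    hh = 4.0 * np.real(np.sum(np.abs(h_f) ** 2 / psd)) * df
    return xh / np.sqrt(hh)
```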
Detection Strategy
Signal depends on many parameters: amplitude, t_a (time of arrival), φ_a (initial phase), m1, m2, spins.
Strategy: maximum likelihood method. Spinless case:
• Amplitude: use normalised templates
• t_a: maximise via FFT over the arrival time
• Initial phase φ_a: quadratures – only 2 templates needed, for phases 0 and π/2
• Masses → chirp times τ0, τ3: a template bank is required
For each template the maximised statistic is compared with a threshold set by the false alarm rate (see the sketch below). (SVD and Sathyaprakash)
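To make the t_a and φ_a maximisations concrete, here is a hedged sketch (again my own illustration, not the actual pipeline): the complex correlation built from the one-sided spectrum contains both quadratures, and its inverse FFT evaluates the statistic at every arrival time. `data`, `template` (placed at t_a = 0, phase 0), `psd` and `fs` are assumed as in the previous sketch.

```python
import numpy as np

def snr_vs_arrival_time(data, template, psd, fs):
    """SNR maximised over initial phase, evaluated at every arrival time via FFT.

    The real and imaginary parts of the complex correlation z(t) are the
    correlations with the 0 and pi/2 phase templates, so |z(t)| maximises
    over the initial phase; the inverse FFT handles the arrival time.
    """
    n = len(data)
    df = fs / n
    x_f = np.fft.rfft(data) / fs
    h_f = np.fft.rfft(template) / fs
    hh = 4.0 * np.real(np.sum(np.abs(h_f) ** 2 / psd)) * df

    # one-sided integrand, zero-padded to the full FFT length
    integrand = np.zeros(n, dtype=complex)
    integrand[: n // 2 + 1] = x_f * np.conj(h_f) / psd
    z = 4.0 * np.fft.ifft(integrand) * n * df   # complex correlation at lags k / fs
    return np.abs(z) / np.sqrt(hh)
```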
Thresholding, false alarm & detection: detection probability as a function of threshold (plot shown on the slide)
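One simple way to connect the threshold to the false alarm rate: in Gaussian noise the phase-maximised statistic ρ² follows a χ² distribution with 2 degrees of freedom, so the per-trial false alarm probability is exp(-ρ*²/2). The trial counts below are purely illustrative, not the S2 numbers.

```python
import numpy as np

def threshold_from_false_alarm(n_independent_trials, total_false_alarm_prob):
    """Threshold rho* such that the chance of any noise-only trial exceeding it
    is ~ total_false_alarm_prob, assuming P(rho > rho*) = exp(-rho*^2 / 2)
    per trial (chi^2 with 2 dof for the phase-maximised statistic)."""
    per_trial = total_false_alarm_prob / n_independent_trials
    return np.sqrt(-2.0 * np.log(per_trial))

# e.g. ~10^4 templates times ~10^9 independent time samples, one false alarm tolerated:
print(threshold_from_false_alarm(1e4 * 1e9, 1.0))   # rho* ~ 7.7
```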
Parameter space for the mass range 1 – 30 solar masses (plot shown on the slide)
Hexagonal tiling of the parameter space
• LIGO I PSD
• Minimal match: 0.97
• Number of templates: ~10^4
• Online speed: ~3 GFlops
Inspiral Search (contd.)
• Reduced lower mass limit 0.2 M_sun, f_s ~ 10 Hz: online speed ~300 GFlops
• Hierarchical search required
• 2-step search with 2 banks, coarse & fine (Mohanty & SVD):
  – Step I: coarser bank – fewer templates, low threshold, high false alarm rate
  – Step II: follow up the false alarms with a fine search
• Extended hierarchical search: over t_a and masses (Sengupta, SVD, Lazzarini) (Tanaka & Tagoshi)
Hierarchical search frees up CPU for searching over more parameters
• LIGO I PSD, mass range 1 to 30 solar masses
• 92% of the signal power is accumulated below f_c = 256 Hz
• Truncating there gives a factor of 4 saving in FFT cost
Relative size of templates in the 2 stages of the hierarchy (plot shown on the slide). Total gain: a factor of ~60 over the flat search.
Multi-detector search for GW signals
• GEO: 0.6 km
• VIRGO: 3 km
• LIGO-LHO: 2 km, 4 km
• LIGO-LLO: 4 km
• TAMA: 0.3 km
• AIGO: (?) km
Inspiral search with a network of detectors
• Coincidence analysis: event lists, windows in parameter space (S. Bose) – see the sketch below
• Coherent search: phase information used (Pai, Bose, SVD) (S. Finn)
  – Full data from all detectors are necessary to carry out the analysis
  – A single network statistic is constructed and compared with a threshold
  – Analytical maximisation over amplitude, initial phase, orientation of the binary orbit
  – FFT over the time of arrival
  – Direction search: time-delay window
  – Filter bank over the intrinsic parameters (masses) – the metric depends on extrinsic parameters
• Computational costs soar when searching over time delays (~ ×10^3 for LIGO-VIRGO)
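A toy version of the coincidence step (event lists plus windows): the tuple layout, window sizes and the function itself are hypothetical illustrations, not the actual network pipeline.

```python
def coincident_events(events_1, events_2, dt_max=0.010, dm_max=0.02):
    """Keep pairs of single-detector events whose arrival times agree within the
    inter-site light travel time dt_max (seconds) and whose chirp masses agree
    within dm_max (solar masses).  events_*: lists of (t_arrival, chirp_mass, snr)."""
    pairs = []
    for t1, m1, s1 in events_1:
        for t2, m2, s2 in events_2:
            if abs(t1 - t2) <= dt_max and abs(m1 - m2) <= dm_max:
                pairs.append(((t1, m1, s1), (t2, m2, s2)))
    return pairs
```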
Spin
(Diagram on slide: orbital angular momentum L and spins S1, S2)
• The orbital plane precesses due to spin-orbit coupling, which modulates the waveform (Blanchet, Damour, Iyer, Will, Wiseman, Jaranowski, Schafer)
• Too many parameters – high computational cost (Apostolatos)
• Detection template families – for detection only (Buonanno, Chen, Vallisneri)
  – few physical parameters model the modulation well (FF > 0.97)
  – automatic search over several (extrinsic) parameters – no template bank needed
• For searching single-spin binaries: 7 M_sun < m1 < 12 M_sun, 1 M_sun < m2 < 3 M_sun
• Templates in just 3 parameters: S1, m1 and m2
• ~76000 templates needed at 0.97 (average) match – LIGO I sensitivity
Periodic Sources
• Target sources: slowly varying instantaneous frequency, e.g. rapidly rotating neutron stars; h ~ 10^-25 – 10^-26
• Integration time: months, years – the motion of the detector phase-modulates the signal
• Doppler modulation depends on the direction of the GW: Δf = (n · v) f0 / c
• A 1 kHz wave gets spread over ~a million Fourier bins in 1 year of observation time (see the sketch below)
• Intrinsic: spin-down
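The order-of-magnitude arithmetic behind the "million Fourier bins" statement, with rough illustrative numbers (Earth's orbital speed ~10^-4 c, one year of data):

```python
f0 = 1.0e3           # signal frequency, Hz
v_over_c = 1.0e-4    # Earth's orbital speed / speed of light
T_obs = 3.0e7        # ~1 year of observation, s

doppler_band = 2 * v_over_c * f0   # full width of the Doppler-smeared band, Hz
bin_width = 1.0 / T_obs            # frequency resolution of a coherent transform, Hz
print(doppler_band / bin_width)    # ~ 6e6 bins: of order a million
```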
Computational cost in searching for periodic sources
• Parameters: f0, θ, φ, spin-down parameters
• Targeted search (known pulsar): window in parameter space, heterodyne
• `All sky, all frequency' search – A CHALLENGE: f0 is also a parameter
• Number of Doppler corrections (patches in the sky): ~10^13, spin-down parameters not included (Brady et al 1998)
• Parameter space is large: typical T_obs ~ 10^7 s for a weak source
• Effective GW telescope size ~ 2 AU, thus resolution λ/D ~ 0.2 arcsec (see the sketch below)
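And the corresponding back-of-the-envelope estimate for the sky resolution and patch count quoted above (illustrative numbers only):

```python
import numpy as np

c = 3.0e8          # m/s
f0 = 1.0e3         # Hz
AU = 1.5e11        # m
D = 2 * AU         # effective aperture: diameter of Earth's orbit

wavelength = c / f0                     # ~300 km
resolution = wavelength / D             # ~1e-6 rad
n_patches = 4 * np.pi / resolution**2   # sky area / patch area

print(resolution * 206265)   # ~0.2 arcsec
print(n_patches)             # ~10^13 patches
```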
Hierarchical Searches
• Alternate between coherent & incoherent stages
• Hough transform (Schutz, Papa, Frasca)
  – short-term Fourier transforms
  – look for patterns of peaks in the time-frequency plane that correspond to parameter values
  – histogram in parameter space – do a fully coherent search around the peak
• Stack and slide search (Brady & T. Creighton) – see the sketch below
  – given fixed computing power, look for an efficient search algorithm
  – divide the data into N stacks, compute power spectra, slide and then sum
• Results: gain of 2–4 in sensitivity + 20–60% hierarchical, 99% confidence
• Classes of pulsars: fmax = 1 kHz, τ = 40 yr; fmax = 200 Hz, τ = 1000 yr
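A toy sketch of the stack-and-slide idea: split the data into stacks, compute each power spectrum, shift by the frequency offsets a candidate Doppler/spin-down model predicts, and sum. The function and its arguments are illustrative, not the Brady-Creighton implementation.

```python
import numpy as np

def stack_slide_power(data, n_stacks, bin_offsets):
    """Sum of power spectra of n_stacks data segments, each shifted ("slid") by the
    integer frequency-bin offset predicted for that segment by a candidate
    Doppler / spin-down model (here simply supplied as bin_offsets)."""
    segments = np.array_split(np.asarray(data, dtype=float), n_stacks)
    length = min(len(s) for s in segments)       # equalise segment lengths
    total = np.zeros(length // 2 + 1)
    for seg, offset in zip(segments, bin_offsets):
        power = np.abs(np.fft.rfft(seg[:length])) ** 2
        total += np.roll(power, -int(offset))    # slide, then stack
    return total
```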
Stochastic Background
• Cannot distinguish instrumental noise from the signal with one detector
• Cross-correlate the outputs of two detectors: S = ∫∫ x1(t) x2(t') Q(t − t') dt dt', where Q is the optimal filter (Allen & Romano) (E. Flanagan)
Stochastic Background (contd.)
• Overlap reduction function γ(f): accounts for non-coincident & non-aligned detectors
• SNR: a functional of γ(f), Ω_GW(f), P1(f), P2(f)
• LIGO detector pair, T_obs = 4 months, P_F = 5%, P_D = 95%:
  – Initial: Ω_GW ~ 10^-5 – 10^-6
  – Advanced: Ω_GW ~ 10^-10 – 10^-11
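A hedged sketch of the optimally filtered cross-correlation (Allen & Romano form, with schematic normalisation): `x1`, `x2` are the two detectors' time series, and `p1`, `p2`, `gamma_f`, `omega_f` are the noise PSDs, overlap reduction function and Ω_GW(f) template sampled on the rfft frequency grid `freqs`.

```python
import numpy as np

def cross_correlation_statistic(x1, x2, p1, p2, gamma_f, omega_f, freqs, fs):
    """Cross-correlation of two detectors with the optimal filter
    Q(f) ~ gamma(f) * Omega_GW(f) / (f^3 * P1(f) * P2(f)); normalisation schematic."""
    n = len(x1)
    df = fs / n
    x1_f = np.fft.rfft(x1) / fs
    x2_f = np.fft.rfft(x2) / fs
    q = np.zeros_like(freqs)
    positive = freqs > 0                       # avoid the divergent DC bin
    q[positive] = (gamma_f[positive] * omega_f[positive]
                   / (freqs[positive] ** 3 * p1[positive] * p2[positive]))
    return np.real(np.sum(np.conj(x1_f) * q * x2_f)) * df
```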
Unmodeled sources
• Burst sources: supernovae, hypernovae, binary mergers, ring-downs of binary black holes
• Excess power statistic: sum the power in a time-frequency window (Anderson, Brady, J. Creighton, Flanagan) – see the sketch below
• E is distributed as χ² if there is no signal and the noise is Gaussian, and as a non-central χ² if a signal is present
• Q: How to distinguish non-Gaussianity from the signal? (the statistic can detect non-Gaussianity)
• Network of detectors: autocorrelation vs cross-correlation; slope statistic
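A minimal sketch of the excess power statistic, for whitened data (so that, roughly, E is χ²-distributed with 2 × N_bins degrees of freedom under noise alone); the windowing and normalisation details of the real search are omitted.

```python
import numpy as np

def excess_power(data, fs, t_start, duration, f_low, f_high):
    """Sum of spectral power of `data` in the time-frequency window
    [t_start, t_start + duration] x [f_low, f_high]."""
    i0 = int(t_start * fs)
    segment = np.asarray(data)[i0 : i0 + int(duration * fs)]
    spectrum = np.fft.rfft(segment)
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / fs)
    band = (freqs >= f_low) & (freqs <= f_high)
    return np.sum(np.abs(spectrum[band]) ** 2)
```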
Coherent detection of bursts with a network of detectors (J. Sylvestre)
• Linearly combine the data with time delays and antenna pattern functions for a given source direction
• Polarisation plane: the signal lies in the plane spanned by h+(t) and h×(t)
• Y: data from a single synthetic detector, with P = ||Y||²; P = z + h, ρ² = z / E(h); maximise ρ²
• Only 2 parameters needed in addition to the source direction: a length ratio and an angle
• The direction to the source can be found: LHV network ~ 1° – 10°
• A source model is required!
Dealing with real data
• Algorithms and codes are working and yielding sensible results
• Real detector noise is neither stationary nor Gaussian
  – algorithms have been developed for Gaussian & stationary noise
  – need to adapt the algorithms to the real world
• Vetoes:
  – excess noise level veto
  – instrumental vetoes
• For inspirals: time-frequency veto (Bruce Allen et al)
Veto for inspirals (Allen et al)
• Based on the fact that, irrespective of the masses, the expected distribution of signal power over frequency is known
• Divide the frequency domain into p sub-bands such that the signal has equal power in each sub-band k, and compute
  χ² = p Σ_k (ρ_k − ρ/p)²,
  where ρ_k is the SNR in sub-band k (normalised templates) and ρ the total SNR – see the sketch below
• Compare the value of χ² with a threshold when deciding detection
• Better vetoes: follow the ambiguity function
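The χ² discriminator written out as a small function (a sketch of the formula above; the real veto works with the complex SNR in each sub-band):

```python
import numpy as np

def chisq_veto(rho_subbands, rho_total):
    """chi^2 = p * sum_k (rho_k - rho_total / p)^2 for p sub-bands chosen so that a
    true chirp deposits equal power in each: small for real signals, large for many
    instrumental glitches."""
    rho_subbands = np.asarray(rho_subbands, dtype=float)
    p = len(rho_subbands)
    return p * np.sum((rho_subbands - rho_total / p) ** 2)
```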
Clustering of triggers for real events
• Condensing the `cloud of events' – graph theory?
Setting upper limits
• Although at this early stage no detection can be announced, we can place upper limits, for example on the inspiral event rate
• The S1 data from the LIGO detectors give such a limit: a rate greater than the quoted value means there is more than 90% probability that at least one inspiral event would have been observed with SNR greater than the highest SNR seen in the S1 data (gr-qc/0308069) – see the sketch below
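The logic of such a rate upper limit can be sketched as follows (a simplified loudest-event style calculation with made-up numbers, not the actual S1 analysis of gr-qc/0308069):

```python
import numpy as np

def rate_upper_limit(t_obs, efficiency, confidence=0.90):
    """If the true rate were R, the chance of at least one event louder than the
    loudest observed one is 1 - exp(-R * efficiency * t_obs); solving for R at the
    requested confidence gives the upper limit (same units as 1 / t_obs)."""
    return -np.log(1.0 - confidence) / (efficiency * t_obs)

# e.g. ~0.04 yr of analysed data, 50% detection efficiency above the loudest event
# (numbers illustrative only):
print(rate_upper_limit(0.04, 0.5))   # ~115 events per year (per source population unit)
```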
Setting upper limits (contd.)
Upper limits can be set for other types of sources:
• Stochastic background: Ω_GW < 23 from S1 data (L1–H2 pair)
• Continuous wave sources: h for a given source
  – Source: PSR J1939+2134 (the fastest known rotating neutron star), located 3.6 kpc from Earth, f_GW ~ 1283.86 Hz
  – Best upper limit from S1 data (L1): h ~ 10^-22
Data analysis as a diagnostic tool
• Detector characterisation
• Understanding of instrumental couplings to the GW channel
• Calibration
• Line removal techniques – adaptive methods
LISA: an ESA & NASA project – a space-based detector for low-frequency GW
Laser Interferometer Space Antenna (LISA)
• LISA is an unequal-arm interferometer in a triangular configuration
• LISA will observe low-frequency GW in the band 10^-5 Hz – 1 Hz
• Six Doppler data streams
• Unequal arms: laser frequency noise does not cancel
• Suitably delayed data streams form combinations that cancel the laser frequency noise (Tinto, Estabrook, Armstrong)
• Polynomial vectors in time-delay operators (SVD, Vinet, Nayak, Pai)
• Coherent detection
LISA data analysis
• Polynomial vectors in 3 time-delay operators – module of syzygies
• 4 generators: α, β, γ, ζ – linear combinations generate the module
• There are optimal combinations which perform better than the Michelson LISA sensitivity curve
• The ζ combination can be used to `switch off' GW – useful for calibration
• Current effort: generalise to a moving LISA, changing arm-lengths etc. (Tinto, SVD, Vinet, Nayak)
(A toy illustration of laser-noise cancellation with delayed data streams is sketched below.)
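A toy numerical illustration of the idea that suitably delayed data streams cancel the laser frequency noise. This is a simplified unequal-arm Michelson model with integer-sample delays, not the actual α, β, γ, ζ generators.

```python
import numpy as np

def delay(x, n):
    """Delay a sampled stream by n samples (zero-padded at the start)."""
    y = np.zeros_like(x)
    y[n:] = x[: len(x) - n]
    return y

rng = np.random.default_rng(0)
N, L1, L2 = 100_000, 30, 47        # unequal arm delays, in samples
C = rng.normal(0.0, 1e3, N)        # large laser frequency noise, common to both arms
n1 = rng.normal(0.0, 1.0, N)       # small secondary noises
n2 = rng.normal(0.0, 1.0, N)

# Round-trip Michelson-type signals: each contains C(t) - C(t - 2 L_i)
s1 = C - delay(C, 2 * L1) + n1
s2 = C - delay(C, 2 * L2) + n2

# Delaying each stream by the *other* arm's round trip makes the laser-noise
# terms identical in the two brackets, so they cancel in the difference.
X = (s1 - delay(s1, 2 * L2)) - (s2 - delay(s2, 2 * L1))

print(np.std(s1 - s2))   # ~1400: dominated by laser noise
print(np.std(X))         # ~2: laser noise cancelled, only secondary noise left
```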
Summary
• Data analysis: an important aspect of GW observation
• Different types of sources need different data analysis strategies
• Algorithms must be computationally efficient – sophisticated analysis is required
• Algorithms and codes are now being tested on real data
• LISA data analysis: combining data streams for optimal performance