New data analysis for AURIGA
Lucio Baggio, INFN and University of Trento, Italy
The (new) AURIGA data analysis
Since 2001 the AURIGA data analysis for burst search has been rewritten from scratch (G. Vedovato), in parallel with the major upgrades taking place on the detector. The main goals and specifications were:
• Be flexible and modular, with easy adaptation to new algorithms
• Adopt the VIRGO/LIGO frame format for data storage and exchange (new data acquisition system)
• Recycle software (and be recyclable): open source project, C++, widespread use of supported and well-known libraries:
  ROOT (http://root.cern.ch)
  VEGA (http://wwwlapp.in2p3.fr/virgo/vega)
  FrameLib (http://wwwlapp.in2p3.fr/virgo/FrameL)
  FFTW (http://www.fftw.org)
  LAL (http://www.lsc-group.phys.uwm.edu/lal)
  MKFilter (http://www-users.cs.york.ac.uk/~fisher/mkfilter)
• And, of course, develop new algorithms (highlight: Karhunen-Loève decomposition)
Overview (see poster)
[Flow diagram: DAQ/DAQS, FME, FW, EVT, DQ, MTC]
The analysis of raw or simulated data for burst search divides into a series of tasks:
1. Estimate the parameters of the analytic part of the noise model (Full Model Estimate, FME)
2. Remove the noise correlation (Full Whitening, FW)
3. Perform a matched template filtering and event search (EVT)
4. Define epoch vetoes based on Gaussianity monitors (Data Quality, DQ)
5. Compute the distribution of errors in the event parameter estimators (Monte Carlo, MTC)
Event search (1) (see poster)
[Block diagram (EVT): raw data, noise model and template bank → optimal filter & coarse interpolation → max-hold event search → fine interpolation]
Within this task the whitened data are optimally filtered in the frequency domain for a specified template signal. Then the filtered time series is passed to the event search algorithm.
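As an illustration only (not the AURIGA production code), the following minimal C++ sketch shows a frequency-domain matched filtering of the kind described above, using FFTW; the function name, the single-template interface and the normalization convention are assumptions.

```cpp
// Minimal sketch: frequency-domain correlation of whitened data with one
// template, using FFTW (http://www.fftw.org). Interface and normalization
// are illustrative assumptions, not the AURIGA implementation.
#include <fftw3.h>
#include <algorithm>
#include <complex>
#include <vector>

std::vector<double> matched_filter(const std::vector<double>& data,
                                   const std::vector<double>& templ) {
  const int n  = static_cast<int>(data.size());   // template is zero-padded to n
  const int nc = n / 2 + 1;                       // size of the r2c spectrum

  std::vector<double> in(data), tp(n, 0.0), out(n);
  std::copy(templ.begin(), templ.end(), tp.begin());

  std::vector<std::complex<double>> D(nc), T(nc);
  fftw_plan pd = fftw_plan_dft_r2c_1d(n, in.data(),
                   reinterpret_cast<fftw_complex*>(D.data()), FFTW_ESTIMATE);
  fftw_plan pt = fftw_plan_dft_r2c_1d(n, tp.data(),
                   reinterpret_cast<fftw_complex*>(T.data()), FFTW_ESTIMATE);
  fftw_execute(pd);
  fftw_execute(pt);

  // Correlation in the frequency domain: data spectrum times the complex
  // conjugate of the template spectrum (data are already whitened upstream).
  for (int k = 0; k < nc; ++k) D[k] *= std::conj(T[k]);

  fftw_plan pi = fftw_plan_dft_c2r_1d(n,
                   reinterpret_cast<fftw_complex*>(D.data()), out.data(),
                   FFTW_ESTIMATE);
  fftw_execute(pi);
  for (double& x : out) x /= n;                   // undo FFTW's unnormalized transform

  fftw_destroy_plan(pd);
  fftw_destroy_plan(pt);
  fftw_destroy_plan(pi);
  return out;                                     // filter output vs. candidate arrival time
}
```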
Event search (2)
• The filtered time series is downsampled to a convenient sampling rate
• The absolute value of the downsampled time series is searched for local maxima (max-hold algorithm with a given dead time); when a maximum exceeds a proper threshold, a candidate event trigger is issued
• For each event trigger, the exact time of arrival and amplitude are computed after fine interpolation of the samples, along with the sum of squared residuals (for a χ² test), the Karhunen-Loève components, etc. (a sketch of the trigger logic follows)
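A minimal sketch of a max-hold trigger of this kind, assuming a fixed threshold, a dead time expressed in samples, and a plain |x| statistic; the names, the Trigger struct and the threshold handling are illustrative, not the AURIGA code.

```cpp
// Minimal sketch of a max-hold trigger with dead time on the downsampled,
// filtered series. Threshold, dead time and the Trigger struct are assumptions.
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

struct Trigger {
  std::size_t sample;   // index of the local maximum (coarse time of arrival)
  double amplitude;     // |filter output| at the maximum
};

std::vector<Trigger> max_hold_search(const std::vector<double>& x,
                                     double threshold,
                                     std::size_t dead_time) {
  std::vector<Trigger> triggers;
  std::size_t i = 0;
  while (i < x.size()) {
    if (std::fabs(x[i]) > threshold) {
      // Hold the maximum over the dead-time window following the crossing.
      std::size_t best = i;
      std::size_t end  = std::min(x.size(), i + dead_time);
      for (std::size_t j = i + 1; j < end; ++j)
        if (std::fabs(x[j]) > std::fabs(x[best])) best = j;
      triggers.push_back({best, std::fabs(x[best])});
      i = best + dead_time;        // no new trigger within the dead time
    } else {
      ++i;
    }
  }
  // Fine interpolation of time and amplitude around each maximum, the
  // chi-square residuals and the K-L components would be computed here.
  return triggers;
}
```

The dead time prevents a single burst from producing multiple triggers around the same maximum.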
Event statistics from Monte Carlo (MTC) (see poster)
[Block diagram (MTC): template injection into the whitened data; event search and coarse interpolation against the template bank (Phase 1 / Phase 2)]
The goal of this task is to estimate numerically the distributions of the time-of-arrival and amplitude errors for a bank of filter templates, possibly not exactly matched to the input signal. Software signal injection takes place in the time domain, by adding a chosen template (properly rescaled in amplitude and time-shifted) to the actually measured white noise of the system. Template injection and search are automatically cycled over specified time and amplitude increments, and can be repeated for independently specified signal and filter templates. A sketch of such an injection loop follows.
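A minimal sketch of the injection-and-search cycle described above; the helper names, the Recovered struct and the grid parameters are placeholders, not the actual MTC interface.

```cpp
// Minimal sketch of the MTC injection cycle: rescale and time-shift a signal
// template, add it to the measured noise, and pass the result to the event
// search. The EventSearch callable and all parameter names are placeholders.
#include <cstddef>
#include <functional>
#include <vector>

struct Recovered { double amplitude; double time; };   // output of the event search
using EventSearch = std::function<Recovered(const std::vector<double>&)>;

// Add A * template, shifted by 'shift' samples, to a copy of the noise stream.
static std::vector<double> inject(const std::vector<double>& noise,
                                  const std::vector<double>& templ,
                                  double A, std::size_t shift) {
  std::vector<double> data(noise);
  for (std::size_t i = 0; i < templ.size() && i + shift < data.size(); ++i)
    data[i + shift] += A * templ[i];
  return data;
}

// Cycle injections over the amplitude and time grids; the recovered amplitudes
// and times of arrival are collected to build the error distributions.
std::vector<Recovered> monte_carlo(const std::vector<double>& noise,
                                   const std::vector<double>& signal_templ,
                                   const std::vector<double>& amplitudes,
                                   std::size_t t_step, std::size_t n_shifts,
                                   const EventSearch& search) {
  std::vector<Recovered> results;
  for (double A : amplitudes)                     // amplitude increments
    for (std::size_t k = 0; k < n_shifts; ++k)    // time increments
      results.push_back(search(inject(noise, signal_templ, A, k * t_step)));
  return results;
}
```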
Karhunen-Loève Decomposition (1)
(see http://www.ligo.caltech.edu/docs/P/P010019-01.pdf)
Idea: a templateless, suboptimal energy estimate (KLD), to be compared with the optimally filtered amplitude obtained with a template (Wiener filter, built from the inverse autocorrelation matrix $R^{-1}$).
• Input: signal plus noise, $h + n = \sum_k h_k \phi_k + \sum_k n_k \phi_k$, with $n_k \sim \mathrm{Gauss}(0,\sigma_k)$
• Karhunen-Loève eigenfunctions: orthonormal $\{\phi_k\}_{k=1,\dots,N}$ with $R\,\phi_k = \sigma_k^2\,\phi_k$
• Whitened data: $f = R^{-1}(h+n) = \sum_k \sigma_k^{-2} h_k \phi_k + \sum_k \sigma_k^{-2} n_k \phi_k$
• Define: $A_{\mathrm{KL}}^2 = f \cdot R f = \sum_k \sigma_k^{-2} h_k^2 + \sum_k \sigma_k^{-2} n_k^2 + 2\sum_k \sigma_k^{-2} h_k n_k$
• Without signal: $A_{\mathrm{KL}} \sim \mathrm{Chi}(N)$
• With signal: $A_{\mathrm{KL}} = \big(\sum_k \sigma_k^{-2} h_k^2\big)^{1/2} + \big(\sum_k \sigma_k^{-2} h_k n_k\big)\big(\sum_j \sigma_j^{-2} h_j^2\big)^{-1/2} + O(n^2/A_{\mathrm{KL}})$, i.e. $\mathrm{SNR}_h = \big(\sum_k \sigma_k^{-2} h_k^2\big)^{1/2}$ plus a $\mathrm{Gauss}(0,1)$ fluctuation
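As a purely illustrative sketch (assumed data structures, not the AURIGA code), the statistic $A_{\mathrm{KL}}$ follows directly from the projections of the data on the noise eigenfunctions and the corresponding eigenvalues:

```cpp
// Sketch: the KL amplitude A_KL from the projections x_k of the data on the
// noise eigenfunctions and the eigenvalues sigma_k^2 of the autocorrelation.
// A_KL^2 = f . R f = sum_k x_k^2 / sigma_k^2   (orthonormal basis assumed).
#include <cmath>
#include <cstddef>
#include <vector>

double kl_amplitude(const std::vector<double>& x_k,        // data projections
                    const std::vector<double>& sigma2_k) { // noise eigenvalues
  double a2 = 0.0;
  for (std::size_t k = 0; k < x_k.size(); ++k)
    a2 += x_k[k] * x_k[k] / sigma2_k[k];
  return std::sqrt(a2);  // ~ Chi(N) without signal, ~ SNR_h + Gauss(0,1) with signal
}
```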
Karhunen-Loève Decomposition (2)
[Plot: probability density of the recovered amplitude SNR (in units of SNR_h), linear filter with mismatched template vs. Karhunen-Loève decomposition]
• Pros: the signal-to-noise ratio obtained through KLD equals the maximum one achievable with knowledge of the template
• Cons: increased tail of fake events; how to define the event barycenter?
Summary (see also poster)
• Brand new code, rewritten from scratch in C++, running on standalone PCs
• Integrated ARMA noise simulator, generating stationary or time-varying correlated Gaussian noise, possibly polluted with power-line harmonics, periodic signals and bursts (a sketch of such a generator follows)
• Adaptive parametric noise model estimate
• Support for a non-parametric, frequency-dependent calibration function
• Support for template bank searches
• Embedded Monte Carlo and tools for measuring efficiency
To do:
• Re-implementation of data conditioning; study for the optimization of the (still) empirical vetoing rules; tuning on the forthcoming sensitivity and stability of the detector
• Make the analysis more robust with respect to heavy data corruption by spectral lines and transient disturbances
• Training on the templateless search: tuning of the time-interval size for the K-L decomposition, comparison with time-frequency methods
• Intensify the collaboration with other research groups, in order to share algorithms
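For illustration only, a minimal sketch of an ARMA(p,q) generator of correlated Gaussian noise of the kind mentioned above; the coefficients, the driving-noise variance and the random number generator are assumptions, not the actual AURIGA simulator.

```cpp
// Minimal ARMA(p,q) generator of correlated Gaussian noise:
//   x[n] = sum_i a[i]*x[n-1-i] + sum_j b[j]*w[n-j],   w ~ Gauss(0, sigma)
// Coefficients and driving-noise variance are illustrative placeholders.
#include <cstddef>
#include <random>
#include <vector>

std::vector<double> arma_noise(std::size_t n,
                               const std::vector<double>& a,   // AR coefficients
                               const std::vector<double>& b,   // MA coefficients (b[0] on w[n])
                               double sigma, unsigned seed = 0) {
  std::mt19937 rng(seed);
  std::normal_distribution<double> gauss(0.0, sigma);

  std::vector<double> w(n), x(n, 0.0);
  for (auto& wi : w) wi = gauss(rng);                 // white Gaussian driving noise

  for (std::size_t k = 0; k < n; ++k) {
    double v = 0.0;
    for (std::size_t i = 0; i < a.size() && i + 1 <= k; ++i) v += a[i] * x[k - 1 - i];
    for (std::size_t j = 0; j < b.size() && j <= k; ++j)      v += b[j] * w[k - j];
    x[k] = v;
  }
  return x;  // power-line harmonics, periodic signals or bursts can be added on top
}
```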
Parametric noise model estimator (FME)
[Block diagram: raw data → SDFT → periodograms FFT1 … FFTn → outlier removal; Phase 1: periodogram quality check, iterative fit and data conditioning; Phase 2: periodogram quality check, iterative fit and data conditioning, time series smoothing]
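By way of illustration only (a Welch-style averaged periodogram with a crude outlier cut, not the actual FME estimator), the kind of spectral averaging the diagram refers to could be sketched as follows; the segmenting, the missing window function and the median-based outlier criterion are assumptions.

```cpp
// Illustrative sketch: average the periodograms of consecutive data stretches,
// discarding outlier stretches whose total power deviates from the median.
#include <fftw3.h>
#include <algorithm>
#include <complex>
#include <cstddef>
#include <vector>

std::vector<double> averaged_periodogram(const std::vector<double>& x,
                                         std::size_t nfft, double cut = 3.0) {
  const std::size_t nbin = nfft / 2 + 1;
  const std::size_t nseg = x.size() / nfft;

  std::vector<double> in(nfft);
  std::vector<std::complex<double>> out(nbin);
  fftw_plan p = fftw_plan_dft_r2c_1d(static_cast<int>(nfft), in.data(),
                  reinterpret_cast<fftw_complex*>(out.data()), FFTW_ESTIMATE);

  std::vector<std::vector<double>> pxx;   // one periodogram per stretch
  std::vector<double> power;              // total power of each stretch
  for (std::size_t s = 0; s < nseg; ++s) {
    std::copy(x.begin() + s * nfft, x.begin() + (s + 1) * nfft, in.begin());
    fftw_execute(p);
    std::vector<double> P(nbin);
    double tot = 0.0;
    for (std::size_t k = 0; k < nbin; ++k) { P[k] = std::norm(out[k]) / nfft; tot += P[k]; }
    pxx.push_back(P);
    power.push_back(tot);
  }
  fftw_destroy_plan(p);

  // Crude outlier removal: drop stretches whose power exceeds cut * median.
  std::vector<double> sorted(power);
  std::sort(sorted.begin(), sorted.end());
  const double median = sorted.empty() ? 0.0 : sorted[sorted.size() / 2];

  std::vector<double> avg(nbin, 0.0);
  std::size_t kept = 0;
  for (std::size_t s = 0; s < nseg; ++s) {
    if (power[s] > cut * median) continue;          // outlier stretch
    for (std::size_t k = 0; k < nbin; ++k) avg[k] += pxx[s][k];
    ++kept;
  }
  for (double& v : avg) v /= std::max<std::size_t>(kept, 1);
  return avg;                                       // input to the parametric fit
}
```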