Lecture 3 • First, a bit more Python. • Then some noise statistics.
Python ins and outs • We're mostly going to read our data from FITS files, using a module called pyfits. • http://www.stsci.edu/resources/software_hardware/pyfits • We'll crunch up the data using a module called numpy. • http://numpy.scipy.org/ • For graphical output we'll use the module ppgplot, which is a (fairly crude) wrapper to a package called PGPLOT. • http://efault.net/npat/hacks/ppgplot/ • http://www.astro.caltech.edu/~tjp/pgplot/ • BREAKING NEWS! pylab is better for graphical output.
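A minimal sketch of that workflow, assuming the primary HDU of the file holds an image ('image.fits' is just a placeholder name, not one of the course files):

import pyfits
import pylab

# Open a FITS file and pull out the primary HDU's pixel data and header.
hdulist = pyfits.open('image.fits')
data = hdulist[0].data          # a numpy array
header = hdulist[0].header
hdulist.close()

# Crunch the numbers with numpy array methods...
print('mean = %g, std dev = %g' % (data.mean(), data.std()))

# ...and display the image with pylab.
pylab.imshow(data, origin='lower', interpolation='nearest')
pylab.colorbar()
pylab.show()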
Manuals and example code • You won't need to download manuals for pyfits, numpy, scipy or pylab. I'll do so and make them available from my home page. It'll be much quicker for you to copy (or simply access) them from there. • Some of these manuals are huge – I recommend you neither print nor read them in their entirety, but rather: • Read the bits you need using acroread. • Look at the example code I'll provide.
Random variables – probability density p(x). Units of p(x): probability per unit x. Average: μ = ∫ x p(x) dx. Variance: σ² = ∫ (x − μ)² p(x) dx.
Random variables – probability density p(x). Estimate of μ: the sample mean, m = (1/N) Σi xi. Estimate of σ²: the sample variance, s² = (1/(N−1)) Σi (xi − m)².
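A quick numpy check of these estimators (the parent distribution here is invented, just so the estimates can be compared with known values):

import numpy

# Draw N samples from a known parent distribution (Gaussian, mu = 10, sigma = 2).
x = numpy.random.normal(10.0, 2.0, 1000)

m  = x.mean()          # sample mean: sum(x_i) / N
s2 = x.var(ddof=1)     # sample variance: sum((x_i - m)**2) / (N - 1)

print('estimated mu = %g, estimated sigma^2 = %g' % (m, s2))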
Random variables – probability density p(x). [Figure: p(x) with a threshold value x0 marked on the x axis.]
Noise • The 2 most important distributions: • Gaussian: p(x) = exp(−(x − μ)²/(2σ²)) / (σ√(2π)) • …but not necessarily 'white'.
Noise • Poisson: p(N) = μ^N e^(−μ) / N!, so the variance equals the mean. • …but note the Central Limit Theorem.
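A quick numpy check (the means are arbitrary): the sample variance of Poisson counts tracks the mean, and for a large mean the distribution is close to Gaussian:

import numpy

for mean in (2.0, 50.0):
    counts = numpy.random.poisson(mean, 100000)
    print('mean %g: sample mean = %g, sample variance = %g'
          % (mean, counts.mean(), counts.var()))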
Combinations of random variables. Weighted average: m = Σi wi xi / Σi wi. Uncertainty in this: σm² = Σi wi² σi² / (Σi wi)². Best SNR when wi = 1/σi². Uncertainty propagation: if y = f(a,b), where a and b are independent, then σy² = (∂f/∂a)² σa² + (∂f/∂b)² σb².
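A minimal sketch of inverse-variance weighting (the measurements and uncertainties are made up):

import numpy

# Three measurements of the same quantity, with different uncertainties.
x     = numpy.array([10.3,  9.8, 10.9])
sigma = numpy.array([ 0.2,  0.5,  1.0])

# Inverse-variance weights give the best SNR.
w = 1.0 / sigma**2

m = numpy.sum(w * x) / numpy.sum(w)
# For these weights the general uncertainty formula reduces to 1/sum(w).
sigma_m = numpy.sqrt(1.0 / numpy.sum(w))

print('weighted mean = %g +/- %g' % (m, sigma_m))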
Filtering and correlation Uncorrelated, i.e. 'white', noise (with a Gaussian probability distribution).
Filtering and correlation Fourier transformed – looks the same – same power at all f – hence ‘white’.
Filtering and correlation. Power spectrum: P(ν) = |F(ν)|², where F is the Fourier transform of the signal f, e.g. F(ν) = ∫ f(t) e^(−2πiνt) dt. The autocorrelation function is R(τ) = ∫ f(t) f(t + τ) dt, which is the Fourier transform of the power spectrum (Wiener–Khinchin theorem).
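A sketch of these quantities for white Gaussian noise, using numpy's FFT (the array length is arbitrary):

import numpy

# White Gaussian noise...
f = numpy.random.normal(0.0, 1.0, 4096)

# ...its Fourier transform and power spectrum (flat, on average).
F = numpy.fft.fft(f)
power = numpy.abs(F)**2

# The autocorrelation function is the inverse FT of the power spectrum;
# for white noise it is a spike at zero lag.
acf = numpy.fft.ifft(power).real
acf = acf / acf[0]              # normalize so acf(0) = 1

print(acf[:5])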
Filtering and correlation Filtering operation correlates the noise. It is still Gaussian but no longer white.
Filtering and correlation FT is dominated in this case by low frequencies.
Filtering and correlation Autocorrelation function is broadened.
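A sketch of how this arises, using a boxcar smooth as a stand-in for the instrumental filter:

import numpy

# Start with white Gaussian noise.
white = numpy.random.normal(0.0, 1.0, 4096)

# Filter (smooth) it with a boxcar kernel - this correlates the noise.
kernel = numpy.ones(16) / 16.0
smooth = numpy.convolve(white, kernel, mode='same')

def acf(x):
    # Autocorrelation via the power spectrum, normalized to acf(0) = 1.
    p = numpy.abs(numpy.fft.fft(x))**2
    r = numpy.fft.ifft(p).real
    return r / r[0]

# The filtered noise is still Gaussian but no longer white: its power is
# concentrated at low frequencies and its autocorrelation is broadened.
print('white noise ACF, lags 0-4:    %s' % str(acf(white)[:5]))
print('filtered noise ACF, lags 0-4: %s' % str(acf(smooth)[:5]))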
Signals and noise • The same chain applies in general, and in particular to XMM-Newton and to interferometry: natural signal → + background → + uncorrelated noise → instrumental filtering/correlation → + uncorrelated noise.
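A toy 1-D version of that chain (all numbers are made up; a Gaussian profile stands in for the source and a boxcar for the instrument response):

import numpy
import pylab

n = 512
x = numpy.arange(n)

natural    = 5.0 * numpy.exp(-0.5 * ((x - 256.0) / 4.0)**2)   # natural signal (a source)
background = 2.0                                              # + background
sky_noise  = numpy.random.normal(0.0, 0.5, n)                 # + uncorrelated noise

sky = natural + background + sky_noise

# Instrumental filtering/correlation (a crude PSF), then detector noise.
psf = numpy.ones(8) / 8.0
measured = numpy.convolve(sky, psf, mode='same') + numpy.random.normal(0.0, 0.3, n)

pylab.plot(x, measured)
pylab.xlabel('pixel')
pylab.ylabel('counts')
pylab.show()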
Signal detection. [Figure: two example images – an obvious source, and one that is much harder to detect.]
Signal detection • Parent function; data; model. • Probability that the data result from a given model: computed from χ² = Σi (datai − modeli)² / σi². • Reduced χ²: divide by the number of degrees of freedom. • Recall that Psignal = 1 − Pno signal. • There are many types of signal, but only one 'no signal'. • Hence test the model with no source. • This is called the 'null hypothesis'.
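A sketch of the null-hypothesis test with numpy and scipy (the counts, background B and sigma are invented):

import numpy
from scipy import stats

# Hypothetical data: binned values with a known background B and noise sigma.
data  = numpy.array([ 9.8, 10.5,  9.1, 12.6, 10.2,  9.7])
B     = 10.0        # null-hypothesis model: background only, no source
sigma = 1.0

# Chi-squared of the data against the no-source model.
chi2 = numpy.sum((data - B)**2 / sigma**2)
ndof = len(data)            # no fitted parameters in the null model
chi2_red = chi2 / ndof

# Probability of a chi-squared this large or larger arising by chance
# if the null hypothesis (no source) is true.
p_null = stats.chi2.sf(chi2, ndof)
print('chi2 = %.2f, reduced chi2 = %.2f, P(chance) = %.3f' % (chi2, chi2_red, p_null))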
Signal detection • The null hypothesis test requires: • perfect knowledge of the background B; • a good estimate of σ. • Sparse sources in Gaussian noise: OK. • But… • What about Poisson bins with zero counts? Answer: maximum likelihood. • What about crowded fields? Answer: Bayes..? • Can we trust the distribution? Answer: Monte Carlo.
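A sketch of the Monte Carlo approach for the Poisson case (all numbers are made up): simulate many no-source datasets and ask how often chance alone produces something as bright as the observed bin.

import numpy

B = 0.8              # expected background counts per bin
nbins = 16
observed_max = 5     # suppose the brightest observed bin held 5 counts

# Simulate 100000 fake datasets containing background only.
ntrials = 100000
fake = numpy.random.poisson(B, (ntrials, nbins))
fake_max = fake.max(axis=1)

# Fraction of no-source datasets whose brightest bin is at least as bright
# as the observed one: the false-alarm probability.
p_false = numpy.mean(fake_max >= observed_max)
print('P(false alarm) = %g' % p_false)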