The Hidden Message: some useful techniques for data analysis. Chihway Chang, Feb 18, 2009
A famous example… • Hubble’s law: v = H0 d • Expansion of the universe
What do we learn? • Seemingly crappy data can lead to astonishing discoveries • Insight + imagination • The laws of nature are usually simple • Most things in our observable Universe are linear, spherically symmetric, Gaussian, or Poissonian • Data analysis should be easy! • …theoretically
Process of data analysis:
• Sampling: Central Limit Theorem, strategy of sampling
• Model fitting: linear regression, maximum likelihood, chi square
• Correlations
Or… we all know how this happens: stare at your data, collect lots of data, CLT, ???@#$
Outline: useful techniques in data analysis • Correlations: linear correlation, cross-correlation, autocorrelation • Principal Component Analysis (PCA)
Correlations • Linear correlation: data → standard scores → correlation coefficient (Pearson product-moment) → coefficient of determination (variance in common) • Correlation matrix
Example – Hubble’s law We have 24 data points and would like to know how v and d correlate
Standard scores, correlation coefficient, and coefficient of determination (formulas below)
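For reference, the standard definitions behind these quantities (a textbook statement added here for completeness; the original slide showed them graphically; s denotes the sample standard deviation with N − 1 in the denominator):

```latex
z_{x,i} = \frac{x_i - \bar{x}}{s_x}, \qquad
z_{y,i} = \frac{y_i - \bar{y}}{s_y}, \qquad
r = \frac{1}{N-1} \sum_{i=1}^{N} z_{x,i}\, z_{y,i}, \qquad
r^2 = \text{coefficient of determination (fraction of variance in common)}
```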
Example – Hubble’s law (continued) Applying this to the 24 data points gives corr(d, v) ≈ 0.79
Significance and likelihood • Using a one-tailed significance table • What is the probability that 24 pairs of random numbers show corr(X, Y) ≥ 0.79 purely by chance? • What if we only have 5 samples?
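A quick way to answer this kind of question numerically is a Monte Carlo estimate. The sketch below is illustrative only (Python/numpy, not from the original slides); the sample size 24 and threshold 0.79 come from the example above.

```python
import numpy as np

def chance_of_corr(n_samples=24, threshold=0.79, n_trials=100_000, seed=0):
    """Estimate the probability that n_samples independent random (x, y)
    pairs show a Pearson correlation >= threshold purely by chance."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_trials):
        x = rng.standard_normal(n_samples)
        y = rng.standard_normal(n_samples)
        if np.corrcoef(x, y)[0, 1] >= threshold:
            hits += 1
    return hits / n_trials

print(chance_of_corr())              # essentially zero for N = 24
print(chance_of_corr(n_samples=5))   # noticeably larger for only 5 samples
```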
Limitations • Captures only linear dependence • Sensitive to outliers • Affected by correlated errors
Cross-correlation • Signal processing: searching a long data series for a short feature signal • The cross-correlation (f*g)(t) of the data f with the template g peaks at the offset t where the feature best matches
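A minimal numpy sketch of this idea, using a made-up noisy series and template (illustrative only, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)

# Short feature signal g, hidden inside a long noisy series f at index 300
g = np.sin(np.linspace(0, 4 * np.pi, 50))
f = rng.normal(scale=0.5, size=1000)
f[300:350] += g

# The cross-correlation peaks at the offset where the feature matches
xcorr = np.correlate(f, g, mode="valid")
print("feature found near index", np.argmax(xcorr))   # ~300
```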
Autocorrelation • Finding repeating patterns • Identifying fundamental length or time scales in a noisy signal • Simply the cross-correlation of a signal with itself
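The same trick applied to the signal itself recovers a repetition period. Again an illustrative sketch with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(2)

# Noisy periodic signal whose period (40 samples) we pretend not to know
period = 40
t = np.arange(2000)
signal = np.sin(2 * np.pi * t / period) + rng.normal(scale=0.5, size=t.size)

# Autocorrelation = cross-correlation of the signal with itself
s = signal - signal.mean()
acf = np.correlate(s, s, mode="full")[s.size - 1:]   # keep lags >= 0
acf /= acf[0]                                        # normalize so acf(0) = 1

# The strongest peak within a plausible lag range marks the fundamental period
lag = 10 + np.argmax(acf[10:60])
print("estimated period:", lag)                      # close to 40
```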
Applications • Correlation coefficient: well, um… everywhere? • Auto- and cross-correlation: • Optics: laser coherence, spectrum measurements, ultra-short laser pulses • Signal processing: musical beats • Astronomy: pulsar frequencies • Correlation in space: 2-point (n-point) correlation functions & power spectra
Example: 2-point correlation in weak lensing • Assumption: galaxy shapes are intrinsically random, so the correlation of the shape parameter e should be ≈ 0 • Gravitational shear induces a correlation at length scales of ~arcmin • The atmosphere and systematics induce correlated noise
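As a toy illustration of what a 2-point correlation estimate looks like in code: the brute-force scalar sketch below uses made-up positions and a single made-up shape parameter, so it only shows that random shapes give a correlation consistent with zero; real weak-lensing estimators work with the two shear components and optimized pair counting.

```python
import numpy as np

def two_point_corr(pos, e, bins):
    """Crude estimator of <e_i * e_j> as a function of pair separation."""
    n = len(e)
    seps, prods = [], []
    for i in range(n):
        for j in range(i + 1, n):
            seps.append(np.hypot(*(pos[i] - pos[j])))
            prods.append(e[i] * e[j])
    seps, prods = np.asarray(seps), np.asarray(prods)
    idx = np.digitize(seps, bins)
    return np.array([prods[idx == k].mean() if np.any(idx == k) else np.nan
                     for k in range(1, len(bins))])

rng = np.random.default_rng(3)
pos = rng.uniform(0, 10, size=(500, 2))   # positions, arbitrary units
e = rng.normal(scale=0.3, size=500)       # purely random "shape parameter"
print(two_point_corr(pos, e, bins=np.linspace(0, 5, 11)))   # all bins ~0
```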
Typical 2-point correlation plots (panels at 1 arcmin and 5 arcmin scales): no shear, but with noise and systematics • The shear signal is at the 1% level • Controlling systematics is the key!
Principal Component Analysis • Reveals the internal structure of the data in the way that best explains its variance • Conceptually, it is a transformation of the coordinate system that rotates the data into its eigen-space, where the greatest variance of any projection of the data lies along the first coordinate • High-dimensional analysis
Mathematical operation • Recognize the important variance in the data: the Principal Components (PCs) • Reconstruct the data using only the low-order PCs, thus compressing the dimensionality of the data • Assumptions: • The data can be represented as a linear combination of some basis vectors • The data are Gaussian
Example – Hubble’s law again
1. Get the data {(xi, yi)}, dimension 24×2
2. Subtract the mean: {(Xi, Yi)} = {(xi − mean(x), yi − mean(y))}, 24×2
3. Calculate the covariance matrix C (2×2) = {(Xi, Yi)}T {(Xi, Yi)} / N
4. Calculate and normalize the 2 eigenvalues and 2 eigenvectors of C
5. The eigenvectors point along the 2 PCs and the eigenvalues indicate their relative weightings
Result: PC1 has eigenvalue 132087 and PC2 has eigenvalue 0.1503, so PC1 carries essentially all of the variance
6. Keep the important PC and ignore the others, forming a new basis {V} of compressed dimension (here 2×1)
7. Reconstruct the data by rotating back with this single eigenvector: {X'i, Y'i} = ({Xi, Yi} {V}) {V}T
8. Shift the mean back to get the final reconstructed data: {X, Y}reconstructed = {X'i + mean(x), Y'i + mean(y)}
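A compact numpy sketch of the whole procedure. The 24 points below are synthetic stand-ins (not Hubble’s actual measurements), so the eigenvalues will differ from the 132087 / 0.1503 quoted above:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic (distance, velocity)-like data: 24 strongly correlated pairs
x = rng.uniform(0, 2, size=24)
y = 500 * x + rng.normal(scale=100, size=24)
data = np.column_stack([x, y])                      # shape (24, 2)

# Steps 1-2: subtract the mean
mean = data.mean(axis=0)
centered = data - mean

# Steps 3-4: covariance matrix (2x2) and its eigendecomposition
C = centered.T @ centered / len(centered)
eigvals, eigvecs = np.linalg.eigh(C)                # columns are the PCs
order = np.argsort(eigvals)[::-1]

# Step 6: keep only the PC with the largest eigenvalue
V = eigvecs[:, order[:1]]                           # shape (2, 1)

# Steps 7-8: project onto PC1, rotate back, and shift the mean back in
reconstructed = (centered @ V) @ V.T + mean

print("eigenvalues:", eigvals[order])
print("first reconstructed points:\n", reconstructed[:3])
```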
Example – characterizing the shapes of CCD chips • Fit 27 chip shapes with 4th-order polynomials (15 coefficients each) • Data matrix of dimension 27×15 • 15 eigenvalues and 15 eigenvectors • Reconstruct the shapes using 15, 5, and 1 PCs
Applications • Pattern recognition (http://icg.cityu.edu.hk/private/PowerPoint/PCA.ppt) • Multi-dimensional data analysis • Noise reduction • Image analysis
Conclusion • Data are only useful if we know how to interpret them • Many statistical techniques have been developed for this • Correlation analysis and PCA are the two common techniques introduced today. “It can aid understanding reality, but it is no substitute for insight, reason, and imagination. It is a flashlight of the mind. It must be turned on and directed by our interests and knowledge; and it can help gratify and illuminate both. But like a flashlight, it can be uselessly turned on in the daytime, used unnecessarily beneath a lamp, employed to search for something in the wrong room, or become a play thing.” R.J. Rummel, Department of Political Science, University of Hawaii
References • A Tutorial on Principal Components Analysis, Lindsay I. Smith • Understanding Correlation, R. J. Rummel • …and yes, I learned everything from Wikipedia
FIN Thank you for your attention!