Agenda • Dimension reduction • Principal component analysis (PCA) • Multi-dimensional scaling (MDS) • Microarray visualization
Why Dimension Reduction • Computation: the complexity grows exponentially with the dimension. • Visualization: projection of high-dimensional data to 2D or 3D. • Interpretation: the intrinsic dimension may be small.
1. Dimension reduction • Principal component analysis (PCA) • Multi-dimensional Scaling (MDS)
Philosophy of PCA • PCA is concerned with explaining the variance-covariance structure of a set of variables through a few linear combinations. • We typically have a data matrix of n observations on p correlated variables x1, x2, …, xp. • PCA looks for a transformation of the xi into p new variables yi that are uncorrelated. • We want to represent x1, x2, …, xp with a few yi's without losing much information.
PCA • Looking for a transformation of the data matrix X (n×p) such that Y = aT X = a1X1 + a2X2 + … + apXp • where a = (a1, a2, …, ap)T is a column vector of weights with a1² + a2² + … + ap² = 1
Maximize the variance of the projection of the observations on the Y variable • Find a so that Var(aT X) = aT Var(X) a is maximal • Var(X) = Σ is the covariance matrix of the Xi variables
[Figure: two candidate projection directions compared, panels labeled "Good" and "Better"]
And so we find that • The direction of a is given by the eigenvector e1 corresponding to the largest eigenvalue of the matrix Σ. • The second vector, orthogonal (uncorrelated) to the first, is the one with the second-highest variance, which turns out to be the eigenvector corresponding to the second-largest eigenvalue. • And so on …
So PCA gives • New variables Yi that are linear combinations of the original variables xi: • Yi = ei1x1 + ei2x2 + … + eipxp ; i = 1, …, p • The new variables Yi are derived in decreasing order of importance; • they are called 'principal components'
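As a sketch of the eigendecomposition route described above (a minimal NumPy illustration on synthetic data, not code from the slides):

```python
import numpy as np

# Illustrative sketch: PCA via eigendecomposition of the sample
# covariance matrix, on a small synthetic correlated data set.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) @ np.array([[3.0, 0.0, 0.0],
                                          [1.0, 1.0, 0.0],
                                          [0.0, 0.0, 0.1]])  # correlated columns

Xc = X - X.mean(axis=0)                  # center each variable
S = np.cov(Xc, rowvar=False)             # p x p covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)     # eigh returns ascending order
order = np.argsort(eigvals)[::-1]        # sort descending by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

Y = Xc @ eigvecs                         # principal components Yi = ei^T x
# The components are uncorrelated and their variances are the eigenvalues:
print(np.allclose(np.cov(Y, rowvar=False), np.diag(eigvals)))
```

Each column of `Y` is one principal component, in decreasing order of variance, matching the slide's Yi = ei1x1 + … + eipxp.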
Scale before PCA • PCA is sensitive to scale • PCA should be applied to data that have approximately the same scale in each variable
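A minimal sketch of the scaling step, assuming z-score standardization (the usual choice; the data values are hypothetical):

```python
import numpy as np

# Standardize columns (z-scores) so variables measured on different
# scales contribute comparably to PCA.
X = np.array([[180.0, 0.070],    # e.g. height in cm vs. a small ratio
              [165.0, 0.050],
              [172.0, 0.065],
              [158.0, 0.040]])

Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
# After standardization every column has mean 0 and variance 1, so the
# covariance matrix of Z equals the correlation matrix of X:
print(np.allclose(np.cov(Z, rowvar=False), np.corrcoef(X, rowvar=False)))
```

Running PCA on standardized data is therefore the same as running PCA on the correlation matrix rather than the covariance matrix.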
Johnson RA and Wichern DW. Applied Multivariate Statistical Analysis. Pearson Education, 2003.
SVD (singular value decomposition) Johnson RA and Wichern DW. Applied Multivariate Statistical Analysis. Pearson Education, 2003.
SVD Berrar DP and Dubitzky GM. A Practical Approach to Microarray Data Analysis. Springer 2003.
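The PCA/SVD connection can be checked numerically. The NumPy sketch below (my own illustration on synthetic data) confirms that the squared singular values of the centered data matrix, divided by n - 1, equal the eigenvalues of the covariance matrix:

```python
import numpy as np

# PCA via SVD: if Xc = U S V^T is the SVD of the centered data matrix,
# the columns of V are the principal directions and the covariance
# eigenvalues are S^2 / (n - 1).
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))
Xc = X - X.mean(axis=0)

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
eigvals_svd = s**2 / (len(X) - 1)

w, V = np.linalg.eigh(np.cov(Xc, rowvar=False))   # ascending order
print(np.allclose(np.sort(w)[::-1], eigvals_svd))
```

In practice the SVD route is preferred numerically, which is why R's prcomp (mentioned in the software section) is SVD-based.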
PCA application: genomic study • Population stratification: allele frequency differences between cases and controls due to systematic ancestry differences, which can cause spurious associations in disease studies. • PCA can be used to infer underlying population structure.
Figure 2, Nature Genetics 38, 904-909 (2006). Principal components analysis corrects for stratification in genome-wide association studies. Alkes L Price, Nick J Patterson, Robert M Plenge, Michael E Weinblatt, Nancy A Shadick & David Reich
Chao Tian, Peter K. Gregersen and Michael F. Seldin. (2008) Accounting for ancestry: population substructure and genome-wide association studies.
1.1 PCA: Case study. Transcriptional regulation and function during the human cell cycle, Cho et al. (2001) Nature Genetics Vol 27, 48-54
-- Goal: to identify cell-cycle-regulated transcripts in human cells.
-- Primary fibroblasts prepared from human foreskin were grown to approximately 30% confluence and synchronized in late G1 using a double thymidine-block protocol. Cultures were then released from arrest, and cells were collected every 2 hours for 24 hours, covering nearly 2 complete cell cycles.
-- Messenger RNA was isolated, labeled and hybridized to sets of arrays containing probes for approximately 40,000 human genes and non-overlapping ESTs. The entire synchronization experiment was carried out in duplicate under identical conditions for 6,800 genes on Affymetrix arrays. The two data sets were averaged and analyzed using both supervised and unsupervised clustering of expression patterns.
1.1 PCA: Case study. [Figure: expression data from un-synchronized cells]
1.1 PCA. PCA projection: 387 genes in 13-dim space (time points) are projected to 2D space using the correlation matrix; gene phases: 1: G1; 4: S; 2: G2; 3: M
1.1 PCA. Variance in the data explained by the first n principal components
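The "variance explained by the first n principal components" is just the cumulative sum of normalized eigenvalues, the quantity shown in a scree plot. A sketch with hypothetical eigenvalues (not the values from the Cho et al. data):

```python
import numpy as np

# Fraction of total variance explained by the first n components.
eigvals = np.array([5.0, 2.0, 1.5, 0.5, 0.3, 0.2])   # hypothetical eigenvalues
explained = eigvals / eigvals.sum()                   # per-component fraction
cumulative = np.cumsum(explained)                     # first n components
print(np.round(cumulative, 3))
```

A common rule of thumb is to keep enough components to reach some threshold, e.g. 90% of total variance.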
1.1 PCA. The weights of the 13 principal directions
1.1 PCA. PCA projection: 13 samples (time points) in 387-dim space (genes) are projected to 2D space using the correlation matrix; each sample is labeled by its time point
1.1 PCA. Potential pitfalls of PCA: the leading principal components do not always capture the information that matters. In a PCA projection from 2D to 1D, cluster structure can be lost entirely.
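This pitfall is easy to reproduce. The sketch below (my own synthetic example) builds a 2D data set whose two clusters differ only along the low-variance axis, so the 1D projection onto the first principal component discards the cluster separation:

```python
import numpy as np

# Two clusters separated only along a low-variance axis: PC1 follows the
# high-variance axis, so a 2D -> 1D PCA projection loses the clusters.
spread = np.tile(np.linspace(-15.0, 15.0, 50), 2)  # high variance, identical in both clusters
labels = np.repeat([0, 1], 50)
offset = np.repeat([-1.0, 1.0], 50)                # the clusters differ only here
X = np.column_stack([spread, offset])

Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]                                   # 1-D PCA projection

# Cluster separation in the raw low-variance coordinate vs. in PC1:
sep_raw = abs(X[labels == 0, 1].mean() - X[labels == 1, 1].mean())  # 2.0
sep_pc1 = abs(pc1[labels == 0].mean() - pc1[labels == 1].mean())    # ~0
print(sep_raw, sep_pc1)
```

Maximum-variance directions are not the same thing as maximally informative directions; supervised methods (e.g. LDA) target class separation directly.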
1.2 Multidimensional scaling (MDS). Suppose we are given the distance structure of the following 10 cities, and we have no knowledge of the city locations or a map of the US. Can we map these cities to a 2D space that best presents their distance structure?
1.2 Multidimensional scaling (MDS) • MDS deals with the following problem: for a set of observed similarities (or distances) between every pair of N items, find a representation of the items in few dimensions such that the inter-item proximities "nearly match" the original similarities (or distances). • The numerical measure of how close the original distances are to the distances in the lower-dimensional representation is called stress.
1.2 MDS Mapping to 3D is possible but more difficult to visualize and interpret.
1.2 MDS • MDS attempts to map objects to a visible 2D or 3D Euclidean space. The goal is to best preserve the distance structure after the mapping. • The original data can be high-dimensional or even non-metric; the method only uses the distance (dissimilarity) structure. • The resulting mapping is not unique: any rotation or reflection of a mapping solution is also a solution. • It can be shown that the results of PCA are exactly those of classical MDS if the distances calculated from the data matrix are Euclidean.
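A minimal sketch of classical (metric) MDS via double centering, on toy 2D points of my own choosing (the slides' city-distance example would work the same way, starting from the road-distance table):

```python
import numpy as np

# Classical MDS: from a distance matrix D, form B = -1/2 J D^2 J
# (double centering), then embed using the top eigenvectors of B.
P = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0], [3.0, 4.0]])  # hidden "true" points
D = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)       # all we observe

n = len(D)
J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
B = -0.5 * J @ (D**2) @ J                # double-centered Gram matrix
w, V = np.linalg.eigh(B)
idx = np.argsort(w)[::-1][:2]            # top two eigenpairs -> 2D map
Y = V[:, idx] * np.sqrt(w[idx])          # embedding coordinates

D_hat = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
print(np.allclose(D, D_hat))             # Euclidean distances recovered exactly
```

Because the input distances here are Euclidean, the recovery is exact (up to rotation/reflection), illustrating the PCA-classical MDS equivalence stated above; with non-Euclidean dissimilarities one would instead minimize stress, e.g. with isoMDS.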
2. Microarray visualization. Data matrix. Data: X = {xij}n×d, an n (genes) × d (samples) matrix.
2. Microarray visualization. Heatmap: log-ratio of the target sample to the reference sample, log(target/reference). Gradient color: RED: positive; GREEN: negative; BLACK: 0; LIGHT GREY: missing value.
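The log-ratio transform can be sketched as follows (assuming base-2 logs, the usual microarray convention; the slide does not specify a base, and the intensity values are made up):

```python
import numpy as np

# Log-ratio transform for a two-color heatmap: positive values
# (target > reference) are drawn red, negative values green, zero black;
# missing values (NaN) would be drawn light grey.
target = np.array([[200.0, 50.0],
                   [100.0, 400.0]])
reference = np.array([[100.0, 100.0],
                      [100.0, 100.0]])

log_ratio = np.log2(target / reference)
print(log_ratio)   # [[ 1. -1.] [ 0.  2.]]
```

The symmetry of the log scale is the point: a 2-fold induction (+1) and a 2-fold repression (-1) get equal visual weight, which raw ratios (2 vs. 0.5) would not.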
2. Microarray visualization Treeview software developed by Mike Eisen
3. Software for dimension reduction & visualization
PCA in R:
• prcomp (stats): Principal Components Analysis (preferred)
• princomp (stats): Principal Components Analysis
• screeplot (stats): screeplot of PCA results
PCA in IMSL (a commercial C library)
MDS in R:
• isoMDS (MASS): Kruskal's Non-metric Multidimensional Scaling
• cmdscale (stats): Classical (Metric) Multidimensional Scaling
• sammon (MASS): Sammon's Non-Linear Mapping
MDS: various software and resources about MDS: http://www.granular.com/MDS/
Heatmap visualization: Treeview: http://rana.lbl.gov/EisenSoftware.htm