Face Recognition Using Eigenfaces Kenan Gençol Presented in the course Pattern Recognition, instructed by Asst. Prof. Dr. Kemal Özkan, Department of Electrical and Electronics Engineering, Osmangazi University
Agenda • Introduction • Principal Component Analysis (PCA) • Eigenfaces for Recognition
Introduction • A method introduced by Turk and Pentland from MIT in 1991. • Uses Principal Component Analysis (PCA) as its mathematical framework.
Principal Component Analysis (PCA) • What is it? • It is a powerful tool for analysing data. • Patterns can be hard to find in complex or high-dimensional data. • PCA reduces a complex data set to a lower-dimensional one • and identifies patterns in the data, highlighting their similarities and differences.
Principal Component Analysis (PCA) • The goal of PCA is to find the most meaningful basis to re-express a data set. • PCA asks: Is there another basis, which is a linear combination of the original basis, that best re-expresses our data set? • It uses variance and covariance for this goal.
PCA - Mathematical Foundations • The covariance measures the degree of the linear relationship between two variables. • If positive, positively correlated data. • If negative, negatively correlated data. • If zero, uncorrelated data. • The absolute magnitude of covariance measures the degree of redundancy.
PCA - Mathematical Foundations • The covariance matrix describes the pairwise relationships among all dimensions. • For n dimensions, it is an n×n matrix. • It is a square symmetric matrix. • The diagonal terms are the variances, and the off-diagonal terms are covariances. • Large magnitudes in the off-diagonal terms correspond to high redundancy.
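As an illustration beyond the slides, here is a minimal NumPy sketch of these properties; the synthetic data and the variable names (X, cov) are invented for the example:

```python
import numpy as np

# Synthetic data: 100 samples of 3 variables (rows = samples, columns = dimensions).
rng = np.random.default_rng(0)
x = rng.normal(size=100)
X = np.column_stack([x,                                        # dimension 0
                     2 * x + rng.normal(scale=0.1, size=100),  # dimension 1: redundant with 0
                     rng.normal(size=100)])                    # dimension 2: independent

cov = np.cov(X, rowvar=False)   # 3x3 square symmetric covariance matrix

print(np.diag(cov))             # diagonal terms: the variances
print(cov[0, 1], cov[0, 2])     # off-diagonal terms: the covariances
# cov[0, 1] has a large magnitude (dimensions 0 and 1 are redundant); cov[0, 2] is near zero.
```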
PCA - Mathematical Foundations • Our goals re-stated: • (1) minimize redundancy, measured by the magnitude of the covariance. • (2) maximize the signal, measured by the variance. • Diagonalize the covariance matrix! • This means: decorrelate the data!
PCA - Mathematical Foundations • The Diagonalization of the Covariance Matrix: • All off-diagonal terms should be zero, or, said another way, the data should be decorrelated. • Each successive dimension should be rank-ordered according to variance (large variances carry the important structure).
A little linear algebra... • Some crucial theorems from linear algebra that make PCA work: • A matrix is symmetric if and only if it is orthogonally diagonalizable. • A symmetric matrix is diagonalized by a matrix of its orthonormal eigenvectors.
PCA - Mathematical Foundations • So, finally: • Find the eigenvectors of the covariance matrix! • Order them by eigenvalue, highest to lowest (this gives the order of significance). • The eigenvector with the highest eigenvalue is the first principal component; the second, third, etc. follow in order. • Ignore the components of lesser significance.
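A minimal sketch of this recipe in NumPy (not part of the original slides; the function name pca and the argument k are illustrative choices):

```python
import numpy as np

def pca(X, k):
    """Project the rows of X (samples x dimensions) onto the top-k principal components."""
    X_centered = X - X.mean(axis=0)              # remove the mean of each dimension
    cov = np.cov(X_centered, rowvar=False)       # covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)       # eigh: the covariance matrix is symmetric
    order = np.argsort(eigvals)[::-1]            # order by eigenvalue, highest to lowest
    components = eigvecs[:, order[:k]]           # keep the k most significant eigenvectors
    return X_centered @ components               # data re-expressed in the new basis
```

Projecting onto these columns decorrelates the retained dimensions and discards the directions of least variance.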
PCA - Conclusion • Results: • The final data set has fewer dimensions than the original. • The data is aligned in a basis whose axes point along the directions of maximal variance. • Rank-ordering each basis vector according to its variance shows how 'principal' each direction is.
Discussion of PCA • Principal components with larger associated variances represent important, interesting structure, while those with lower variances represent noise. This is a strong, but sometimes incorrect, assumption. • The goal of the analysis is to decorrelate the data, or, in other terms, to remove second-order dependencies in the data. In data sets where higher-order dependencies exist, PCA is insufficient for revealing all the structure in the data.
Eigenfaces for Recognition • Simply think of it as a template matching problem.
Computation of the Eigenfaces • Let Γ be an N²×1 vector corresponding to an N×N face image Ι. • Step 1: obtain the face images Ι1, Ι2, ..., ΙM (training faces). • Step 2: represent every image Ιi as a vector Γi.
Computation of the Eigenfaces • Step 3: compute the average face vector Ψ = (1/M) Σi Γi. • Step 4: subtract the mean face: Φi = Γi − Ψ.
Computation of the Eigenfaces • Step 5: compute the covariance matrix C = (1/M) Σi Φi Φiᵀ = AAᵀ, where A = [Φ1 Φ2 ... ΦM] is an N²×M matrix. • Step 6: compute the eigenvectors ui of AAᵀ.
Computation of the Eigenfaces • The matrix AAᵀ is very large (N²×N²), so computing its eigenvectors directly is impractical! • Consider instead the matrix AᵀA (an M×M matrix) and compute its eigenvectors vi. • If AᵀA vi = μi vi, then ui = A vi is an eigenvector of AAᵀ with the same eigenvalue, so these give the best M eigenvectors of AAᵀ. • They correspond to the M EIGENFACES!! • Keep only the K eigenvectors corresponding to the K largest eigenvalues.
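A hedged NumPy sketch of Steps 1-6 together with the AᵀA trick; it assumes the training faces are already vectorized into an M×N² array called faces (that layout and the function name are assumptions, not from the slides):

```python
import numpy as np

def compute_eigenfaces(faces, k):
    """faces: M x N^2 array, one vectorized training image Gamma_i per row."""
    psi = faces.mean(axis=0)                  # Step 3: average face vector
    A = (faces - psi).T                       # Step 4: mean-subtracted faces, N^2 x M

    # Work with the small M x M matrix A^T A instead of the huge N^2 x N^2 matrix A A^T.
    eigvals, V = np.linalg.eigh(A.T @ A)
    order = np.argsort(eigvals)[::-1][:k]     # keep the K largest eigenvalues

    U = A @ V[:, order]                       # u_i = A v_i are eigenvectors of A A^T
    U /= np.linalg.norm(U, axis=0)            # normalize each eigenface to unit length
    return psi, U                             # mean face and N^2 x K matrix of eigenfaces
```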
Recognition using eigenfaces • Given an unknown face image Γ, follow these steps: • Step 1: normalize Γ: Φ = Γ − Ψ. • Step 2: project onto the eigenspace: wi = uiᵀ Φ, for i = 1, ..., K.
Recognition using eigenfaces • Step 3: represent Φ as the weight vector Ω = [w1, w2, ..., wK]ᵀ. • Step 4: find the minimum face distance er = minl ||Ω − Ωl||, where Ωl is the weight vector of the l-th training face. • Recognize Γ as face l from the training set!!
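A minimal sketch of these recognition steps, continuing the assumed layout above; train_weights is assumed to hold the precomputed weight vectors Ωl of the training faces:

```python
import numpy as np

def recognize(gamma, psi, U, train_weights):
    """gamma: unknown face vector (N^2,); train_weights: M x K array of Omega_l rows."""
    phi = gamma - psi                          # Step 1: subtract the mean face
    omega = U.T @ phi                          # Steps 2-3: weights w_i = u_i^T Phi
    dists = np.linalg.norm(train_weights - omega, axis=1)
    l = int(np.argmin(dists))                  # Step 4: minimum distance e_r
    return l, dists[l]                         # matched training face index and its distance
```

In practice a threshold on the returned distance decides whether the image should be accepted as a known face at all.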
Discussion: Eigenfaces • Performance is affected by: • Background • Lighting conditions • Scale (head size) • Orientation