PCA Method and Face Recognition CPE488 && CPE631 [ 2012 – KMUTT ] Presented by Miss Chayanut Petpairote
Outline • What is PCA? • PCA Method • What are Eigenfaces? • Face Recognition • Training Steps • Testing Steps • Experimental Results • Conclusion • Demo
What is PCA? (1) • Principal Component Analysis • Eigenvectors give the directions of the axes of an ellipsoid fitted to the data • Eigenvalues give the significance of the corresponding axes • The larger the eigenvalue, the greater the separation between the mapped data along that axis • For high-dimensional data, only a few eigenvalues are significant
What is PCA? (2) • Finding the eigenvalues and eigenvectors • Deciding which are significant • Forming a new coordinate system defined by the significant eigenvectors (the new coordinates have fewer dimensions) • Mapping the data to the new space
PCA Method (1) • Step 1: Get some data
PCA Method (2) • Step 2: Subtract the mean • Mean x = 1.81, mean y = 1.91
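Steps 1–2 can be sketched in a few lines of NumPy. The slide's data table is not reproduced here, so the ten points below are an assumption: a standard worked example whose means match the slide's stated values (mean x = 1.81, mean y = 1.91).

```python
import numpy as np

# Assumed example data: the slide's table is not shown, but these ten
# points have exactly the stated means (mean x = 1.81, mean y = 1.91).
data = np.array([
    [2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0],
    [2.3, 2.7], [2.0, 1.6], [1.0, 1.1], [1.5, 1.6], [1.1, 0.9],
])

mean = data.mean(axis=0)       # per-dimension mean
data_adjust = data - mean      # Step 2: subtract the mean
print(mean)                    # -> [1.81 1.91]
```

After this step every dimension of `data_adjust` has zero mean, which the covariance computation in Step 3 relies on.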
PCA Method (3) • Step 3: Calculate the covariance matrix • Covariance measures how two dimensions vary together (needed for data with more than one dimension) • Ex: for a 3-dimensional data set (x, y, z), compute cov(x,y), cov(x,z), cov(y,z); note that cov(x,x) = var(x), cov(y,y) = var(y), cov(z,z) = var(z)
PCA Method (4) • Step 3: Calculate the covariance matrix
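Step 3 on the same assumed example data is one matrix product; the division by n − 1 gives the unbiased sample covariance:

```python
import numpy as np

# Same assumed 10-point example data as in Step 2 (means 1.81, 1.91).
data = np.array([
    [2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0],
    [2.3, 2.7], [2.0, 1.6], [1.0, 1.1], [1.5, 1.6], [1.1, 0.9],
])
centered = data - data.mean(axis=0)

# Step 3: covariance matrix, using the unbiased estimate (divide by n-1).
cov = centered.T @ centered / (len(data) - 1)
# Shortcut giving the same result: np.cov(data, rowvar=False)
print(cov)
```

The diagonal entries are var(x) and var(y); the off-diagonal entry, cov(x,y), appears twice because the matrix is symmetric.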
PCA Method (5) • Step 4: Calculate the eigenvectors and eigenvalues of the covariance matrix • Let A be an n×n matrix; A acts as a linear operator on vectors in Cn • An eigenvector of A is a vector v ∈ Cn with Av = λv • where λ is the eigenvalue and v the corresponding eigenvector
PCA Method (6) • Step 4: Calculate the eigenvectors and eigenvalues of the covariance matrix • Find eigenvalues: solve det(A − λI) = 0 for λ
PCA Method (7) • Step 4: Calculate the eigenvectors and eigenvalues of the covariance matrix • Find eigenvectors: substitute each eigenvalue λ into (A − λI)v = 0 and solve for v
PCA Method (8) • Step 4: Calculate the eigenvectors and eigenvalues of the covariance matrix
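In practice Step 4 is a single library call. A sketch on the covariance matrix of the assumed example data: `np.linalg.eigh` is the routine for symmetric matrices and returns eigenvalues in ascending order, paired with unit-length eigenvectors.

```python
import numpy as np

# Covariance matrix of the assumed 10-point example data (Step 3 result).
cov = np.array([[0.61655556, 0.61544444],
                [0.61544444, 0.71655556]])

# Step 4: eigendecomposition of a symmetric matrix.
eigvals, eigvecs = np.linalg.eigh(cov)
print(eigvals)   # ascending order, approximately [0.0491, 1.2840]
```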
PCA Method (9) • Step 5: Choosing components and forming a feature vector • Keep the eigenvectors with the largest eigenvalues • This compresses the data and reduces its dimensionality
PCA Method (10) • Step 6: Deriving the new data set • Transformed data = DataAdjust * Eigenvectors
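Steps 5–6 together, sketched on the assumed example data: sort the eigenvectors by eigenvalue, keep the top one as the feature vector, then project the centered data.

```python
import numpy as np

# Assumed 10-point example data from the earlier steps.
data = np.array([
    [2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0],
    [2.3, 2.7], [2.0, 1.6], [1.0, 1.1], [1.5, 1.6], [1.1, 0.9],
])
data_adjust = data - data.mean(axis=0)
vals, vecs = np.linalg.eigh(np.cov(data, rowvar=False))
order = np.argsort(vals)[::-1]           # largest eigenvalue first
feature_vector = vecs[:, order[:1]]      # Step 5: keep only the top component

# Step 6: Transformed data = DataAdjust * Eigenvectors
transformed = data_adjust @ feature_vector
print(transformed.shape)                 # (10, 1): 2-D data reduced to 1-D
```

The variance of the transformed data equals the largest eigenvalue, which is exactly why the top component is the one worth keeping.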
PCA Method (11) • Step 7: Getting the old data back • Original data = (transformed data × eigenvectorsT) + original mean
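A quick check of Step 7: when every eigenvector is kept, inverting the projection and adding back the mean recovers the original data exactly, because the eigenvector matrix is orthogonal. A sketch on the same assumed example data:

```python
import numpy as np

data = np.array([
    [2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0],
    [2.3, 2.7], [2.0, 1.6], [1.0, 1.1], [1.5, 1.6], [1.1, 0.9],
])
mean = data.mean(axis=0)
data_adjust = data - mean
_, vecs = np.linalg.eigh(np.cov(data, rowvar=False))  # keep ALL eigenvectors

transformed = data_adjust @ vecs            # Step 6: project
restored = transformed @ vecs.T + mean      # Step 7: invert, re-add the mean
print(np.allclose(restored, data))          # True
```

Dropping the small-eigenvalue components before inverting gives an approximate reconstruction instead, which is the compression the slides describe.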
What are Eigenfaces? • Eigenfaces are the eigenvectors of the covariance matrix of the vector-space distribution of human face images • They are the 'standardized face components' derived from statistical analysis of many face pictures • In short, eigenfaces are the set of eigenvectors obtained by applying PCA to face images
Face Recognition • Training Steps • Training Image Set • Preprocessing • PCA / Eigenfaces • Dimensionality Reduction • Weight Calculation • Testing Steps • Testing Image • Preprocessing • Transform into Eigenface Components • Find the Minimum Euclidean Distance
Training Steps (1) • Training Image Set • Each image is Row × Col pixels • Training set: M = 15 images
Training Steps (2) • Preprocessing • Normalize the training image set • Reduce noise • Reduce lighting variation
Training Steps (3) • PCA / Eigenfaces • Each image is reshaped into a column vector of size N×1 (N = Row × Col) • Stacking the M vectors gives an N×M training set
Training Steps (4) • PCA / Eigenfaces • Find the Mean Image
Training Steps (5) • PCA / Eigenfaces • Find the difference between each training image and the mean image
Training Steps (6) • PCA / Eigenfaces • Find the covariance matrix C = AAT (size N×N), which is too large to diagonalize directly • Instead form the small matrix L = ATA (size M×M) and solve ATA vi = eigvali . vi • Multiply both sides by A: A ATA vi = eigvali . A vi, i.e. C (A vi) = eigvali . (A vi) • So the eigenvectors of C are uk = A vi
Training Steps (7) • PCA / Eigenfaces • Find the eigenvectors of the covariance matrix C • Each uk = A vi has size N×1; collecting all M of them gives an N×M matrix (N = Row × Col)
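The ATA trick above can be sketched with NumPy. The 15 face vectors here are random stand-ins (an assumption, since the real training images are not included), but the algebra is identical:

```python
import numpy as np

# Random stand-ins for M = 15 mean-subtracted face vectors of N pixels.
rng = np.random.default_rng(0)
N, M = 64 * 64, 15
A = rng.standard_normal((N, M))      # column i = face i minus the mean face

# C = A A^T is N x N (4096 x 4096): too big. Diagonalize L = A^T A (M x M).
L = A.T @ A
lam, v = np.linalg.eigh(L)           # eigenvalues in ascending order

# Map each small eigenvector v_i to an eigenvector u_i = A v_i of C.
U = A @ v
U /= np.linalg.norm(U, axis=0)       # normalize the eigenfaces
u, top = U[:, -1], lam[-1]           # the strongest eigenface

print(np.allclose(A @ (A.T @ u), top * u))   # True: C u = lambda u
```

This is why only an M×M eigenproblem ever has to be solved, no matter how many pixels each image has.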
Training Steps (8) • Dimensionality Reduction • Choose the eigenvectors with the largest eigenvalues: they capture the most significant relationships between the data dimensions
Training Steps (9) • Weight Calculation • The weights describe the contribution of each eigenface in representing a training image
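The weight calculation reduces to one matrix product. The eigenface matrix and face vectors below are random placeholders (assumptions); only the shapes match the slides (M = 15 images, K eigenfaces kept):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, K = 64 * 64, 15, 5
Phi = rng.standard_normal((N, M))    # columns: mean-subtracted training faces
Q, _ = np.linalg.qr(rng.standard_normal((N, K)))
U = Q                                # columns: K orthonormal "eigenfaces"

# Weight of eigenface k for image i: W[k, i] = u_k^T Phi_i
W = U.T @ Phi                        # (K, M): one weight vector per image
print(W.shape)                       # (5, 15)
```

Each column of `W` is the K-dimensional signature of one training face, which is all that needs to be stored for recognition.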
Testing Steps (1) • Testing Image • Preprocessing • Transformed into eigenface components • The weights describe the contribution of each eigenface in representing the testing image
Testing Steps (2) • Find the training weight vector that minimizes the Euclidean distance to the testing image's weight vector • A smaller distance means a smaller error between the testing image and a training image, so the minimum identifies the nearest face for recognition
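Recognition then reduces to a nearest-neighbour search over weight vectors. A minimal sketch with made-up 2-component weights (the values are illustrative, not from the slides):

```python
import numpy as np

# Hypothetical weight vectors: one row per training image, plus one test.
train_w = np.array([[0.9, 0.1],
                    [0.2, 0.8],
                    [0.4, 0.4]])
test_w = np.array([0.25, 0.75])

# Euclidean distance to each training weight vector; the smallest
# distance identifies the nearest (recognized) face.
dists = np.linalg.norm(train_w - test_w, axis=1)
match = int(np.argmin(dists))
print(match)   # 1  (closest to [0.2, 0.8])
```

In a full system a distance threshold is usually added as well, so that faces far from every training image can be rejected as unknown.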
Conclusion • Advantages • PCA can reduce the number of dimensions without much loss of information • PCA gives a high compression rate • Recognition performance remains good when some noise is present • PCA can be used for both face recognition and face reconstruction • The best low-dimensional space is spanned by the eigenvectors of the covariance matrix with the largest eigenvalues • Limitations • Face images must share the same scale; if the scale changes, recognition performance drops sharply • Face images must be reasonably clear and unoccluded
References • M. Turk and A. Pentland, “Eigenfaces for recognition”, Journal of Cognitive Neuroscience, 1991. • M. Turk and A. Pentland, “Face recognition using eigenfaces”, In Proc. of Computer Vision and Pattern Recognition, 1991. • “The ORL face database”, http://www.cl.cam.ac.uk/research/dtg/attarchive/facedatabase.html • http://www.pages.drexel.edu/~sis26/