
EigenFaces and EigenPatches

This model provides a useful framework for describing variation in a fixed-shape region. Originally developed for face recognition, it generalises to object location and recognition. Construction proceeds by marking the face region on a training set, sampling it, normalising, and performing statistical analysis. Interpolation is used to estimate pixel values at non-integer grid positions; each sampled region is represented as a vector and normalised for lighting variation by either a linear shift-and-scale or a non-linear histogram equalisation. The distribution of the normalised vectors is modelled by fitting a Gaussian, with Principal Component Analysis supplying the eigenvectors of the covariance matrix; this eigenvector decomposition both simplifies the normal distribution and enables dimensionality reduction. Applications include object location, object detection, and face recognition, with multi-resolution search used to make the search efficient and robust.


Presentation Transcript


  1. EigenFaces and EigenPatches • Useful model of variation in a region • Region must be a fixed shape (e.g. a rectangle) • Developed for face recognition • Generalised for • face location • object location/recognition

  2. Overview • Model of variation in a region

  3. Overview of Construction • Mark face region on training set → Sample region → Normalise → Statistical Analysis

  4. Sampling a region • Must sample at equivalent points across region • Place grid on image and rotate/scale as necessary • Use interpolation to sample image at each grid node

  5. Interpolation • Pixel values are known at integer positions • What is a suitable value at non-integer positions? [Figure: values known at integer positions; the value at a point in between must be estimated]

  6. Interpolation in 1D • Estimate a continuous function, f(x), that passes through the set of points (i, g(i)) [Figure: f(x) plotted against x]

  7. 1D Interpolation techniques • Nearest neighbour • Linear • Cubic [Figure: f(x) against x for each scheme] (the first two are sketched below)
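
A minimal sketch of the first two schemes in NumPy, assuming the values g(i) are stored in an array indexed by integer position (the function names are illustrative, not from the slides):

```python
import numpy as np

def interp_nearest(g, x):
    """Nearest neighbour: take the value at the closest integer position."""
    i = int(round(x))
    return g[min(max(i, 0), len(g) - 1)]

def interp_linear(g, x):
    """Linear: blend the two surrounding samples by their distance to x."""
    i = min(max(int(np.floor(x)), 0), len(g) - 2)
    t = x - i
    return (1 - t) * g[i] + t * g[i + 1]

g = np.array([1.0, 4.0, 2.0, 5.0])   # values g(i) known at i = 0, 1, 2, 3
print(interp_nearest(g, 1.3))        # 4.0
print(interp_linear(g, 1.3))         # 0.7*4.0 + 0.3*2.0 = 3.4
```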

  8. 2D Interpolation • Extension of the 1D case • Nearest neighbour • Bilinear: interpolate in y at x=0 and at x=1, then interpolate between the two results in x (see the sketch below)
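
A sketch of that recipe, assuming img[row, col] holds grey values at integer positions, with y indexing rows and x indexing columns (the helper name is hypothetical):

```python
import numpy as np

def interp_bilinear(img, x, y):
    """Interpolate in y at the two bracketing columns, then blend in x."""
    x0 = min(max(int(np.floor(x)), 0), img.shape[1] - 2)
    y0 = min(max(int(np.floor(y)), 0), img.shape[0] - 2)
    tx, ty = x - x0, y - y0
    left  = (1 - ty) * img[y0, x0]     + ty * img[y0 + 1, x0]      # y interp at x0
    right = (1 - ty) * img[y0, x0 + 1] + ty * img[y0 + 1, x0 + 1]  # y interp at x0+1
    return (1 - tx) * left + tx * right                            # then x interp

img = np.array([[0.0, 1.0],
                [2.0, 3.0]])
print(interp_bilinear(img, 0.5, 0.5))  # 1.5, the centre of the 2 x 2 cell
```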

  9. Representing Regions • Represent each region as a vector • Raster-scan the values: an n x m region gives a vector g of length nm

  10. Normalisation • Allow for global lighting variations • Common linear approach • Shift and scale so that • Mean of elements is zero • Variance of elements is 1 • Alternative non-linear approach • Histogram equalisation • Transforms the values so that each grey level occurs in roughly equal numbers (both are sketched below)
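
A sketch of both normalisation schemes, assuming g is a 1D NumPy array of sampled grey values; rank-based equalisation is one simple way to realise the non-linear approach:

```python
import numpy as np

def normalise_linear(g):
    """Shift and scale so the elements have zero mean and unit variance."""
    g = g - g.mean()
    return g / (g.std() + 1e-12)               # guard against a constant region

def normalise_histeq(g, levels=256):
    """Histogram equalisation: spread the value ranks evenly over the grey
    levels, so each level occurs a similar number of times."""
    ranks = np.argsort(np.argsort(g))           # rank of each element
    return np.floor(ranks * levels / len(g))
```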

  11. Review of Construction • Mark face region on training set → Sample region → Normalise → Statistical Analysis (the fun step)

  12. Multivariate Statistical Analysis • Need to model the distribution of the normalised vectors, so that we can • Generate plausible new examples • Test whether a new region is similar to the training set • Classify regions

  13. Fitting a Gaussian • The mean and covariance matrix of the data define a Gaussian model: μ = (1/N) Σ_i x_i, S = (1/N) Σ_i (x_i - μ)(x_i - μ)^T

  14. Principal Component Analysis • Compute the eigenvectors of the covariance matrix, S • Eigenvectors: the main directions of variation • Eigenvalue: the variance along its eigenvector (a fitting sketch follows)
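
A sketch combining slides 13 and 14, assuming X stacks one normalised region vector per row: fit the Gaussian, then eigen-decompose its covariance.

```python
import numpy as np

def fit_gaussian_pca(X):
    """Return the mean, eigenvalues and eigenvectors of the sample covariance."""
    mean = X.mean(axis=0)
    S = np.cov(X, rowvar=False)                  # d x d covariance matrix
    evals, evecs = np.linalg.eigh(S)             # eigh since S is symmetric
    order = np.argsort(evals)[::-1]              # largest variance first
    return mean, evals[order], evecs[:, order]   # eigenvectors as columns
```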

  15. Eigenvector Decomposition • If A is a square matrix then an eigenvector of A is a vector, p, such that A p = λ p for some scalar λ (the corresponding eigenvalue) • Usually p is scaled to have unit length, |p| = 1

  16. Eigenvector Decomposition • If S is an n x n covariance matrix, there exist n linearly independent eigenvectors, and all the corresponding eigenvalues are non-negative • We can decompose S as S = P Λ P^T, where P = (p_1 | p_2 | ... | p_n) holds the unit eigenvectors as columns and Λ = diag(λ_1, ..., λ_n)

  17. Eigenvector Decomposition • Recall that a normal pdf has exponent -(1/2) (x - μ)^T S^{-1} (x - μ) • The inverse of the covariance matrix is S^{-1} = P Λ^{-1} P^T (since P^T P = I)

  18. Fun with Eigenvectors • The normal distribution has the form p(x) = (2π)^{-n/2} |S|^{-1/2} exp( -(1/2) (x - μ)^T S^{-1} (x - μ) )

  19. Fun with Eigenvectors • Consider the transformation b = P^T (x - μ), which expresses each sample in the coordinate frame of the eigenvectors

  20. Fun with Eigenvectors • The exponent of the distribution becomes -(1/2) (x - μ)^T P Λ^{-1} P^T (x - μ) = -(1/2) b^T Λ^{-1} b = -(1/2) Σ_i b_i^2 / λ_i

  21. Normal distribution • Thus by applying the transformation b = P^T (x - μ) • The normal distribution is simplified to a product of independent 1D Gaussians, p(b) ∝ Π_i exp( -b_i^2 / (2 λ_i) ) (verified numerically in the sketch below)
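
A quick numerical check of this simplification on synthetic data (a sketch, not from the slides): the exponent computed with S^{-1} equals the diagonal form Σ_i b_i^2 / λ_i.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 3))  # correlated samples
mu, S = X.mean(axis=0), np.cov(X, rowvar=False)
lam, P = np.linalg.eigh(S)                 # S = P diag(lam) P^T

x = X[0]
b = P.T @ (x - mu)                         # the transformation b = P^T (x - mu)
full = (x - mu) @ np.linalg.inv(S) @ (x - mu)
diag = np.sum(b**2 / lam)
print(np.isclose(full, diag))              # True: the exponent is unchanged
```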

  22. Dimensionality Reduction • Co-ordinates are often correlated • Nearby points move together

  23. Dimensionality Reduction • The data lies in a subspace of reduced dimension • However, for some t, the eigenvalues λ_i are negligible for i > t, so almost all of the variance lies along the first t eigenvectors (a sketch for choosing t follows)
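
One common recipe for picking t (the 98% figure is an illustrative choice, not from the slides): keep the smallest t whose leading eigenvalues account for a fixed proportion of the total variance.

```python
import numpy as np

def choose_t(evals, keep=0.98):
    """Smallest t whose first t eigenvalues (sorted largest first) explain
    at least `keep` of the total variance."""
    frac = np.cumsum(evals) / np.sum(evals)
    return int(np.searchsorted(frac, keep) + 1)
```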

  24. Approximation • Each element of the data can then be written x ≈ μ + Σ_{i=1..t} b_i p_i = μ + P b, where P holds the first t eigenvectors and b = P^T (x - μ)

  25. Normal PDF • In the reduced space the model becomes a product of t independent 1D Gaussians: p(b) = Π_{i=1..t} (2π λ_i)^{-1/2} exp( -b_i^2 / (2 λ_i) )

  26. Useful Trick • If x is of high dimension, S is huge • If the number of samples N < dim(x), use the N x N matrix T = (1/N) D^T D instead, where D = (d_1 | ... | d_N) and d_i = x_i - μ • If T u = λ u then S (D u) = λ (D u), so each D u, rescaled to unit length, is an eigenvector of S with the same eigenvalue (see the sketch below)
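
A sketch of the trick, assuming X is N x d with N < d (the function name is mine):

```python
import numpy as np

def eigs_small_sample(X):
    """Eigenvectors of the d x d covariance S via the small N x N problem."""
    N = X.shape[0]
    D = (X - X.mean(axis=0)).T             # d x N, columns d_i = x_i - mean
    lam, U = np.linalg.eigh(D.T @ D / N)   # N x N instead of d x d
    keep = lam > 1e-10                     # centred data has rank <= N - 1
    lam, V = lam[keep], D @ U[:, keep]     # if T u = lam u, S (D u) = lam (D u)
    V /= np.linalg.norm(V, axis=0)         # rescale eigenvectors to unit length
    order = np.argsort(lam)[::-1]
    return lam[order], V[:, order]
```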

  27. Building Eigen-Models • Given examples {g_i} • Compute the mean ḡ and the eigenvectors of the covariance matrix • The model is then g ≈ ḡ + P b • P – the first t eigenvectors of the covariance matrix • b – the model parameters, b = P^T (g - ḡ) (a sketch follows)
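
Putting the pieces together as a sketch (function names are mine), with G stacking the normalised region vectors g_i as rows:

```python
import numpy as np

def build_eigen_model(G, t):
    """Mean g_bar plus the first t eigenvectors of the covariance."""
    g_bar = G.mean(axis=0)
    lam, P = np.linalg.eigh(np.cov(G, rowvar=False))
    order = np.argsort(lam)[::-1][:t]
    return g_bar, lam[order], P[:, order]

def parameters(g, g_bar, P):
    """Model parameters for a region: b = P^T (g - g_bar)."""
    return P.T @ (g - g_bar)

def reconstruct(b, g_bar, P):
    """Synthesise a region from parameters: g ~ g_bar + P b."""
    return g_bar + P @ b
```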

  28. Eigen-Face models • Model of variation in a region

  29. Applications: Locating objects • Scan a window over the target region • At each position: • Sample, normalise, evaluate p(g) • Select the position with the largest p(g) (a sketch of the loop follows)
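
A sketch of the search loop, scoring each window with the diagonal Gaussian of slide 25 (this evaluates only the within-subspace part of p(g), a common simplification; names and window handling are illustrative):

```python
import numpy as np

def log_p(b, lam):
    """Log of the product of independent 1D Gaussians over the parameters b."""
    return -0.5 * np.sum(b**2 / lam + np.log(2 * np.pi * lam))

def locate(image, g_bar, lam, P, h, w):
    """Slide an h x w window over the image; return the best top-left corner."""
    best, best_lp = None, -np.inf
    for r in range(image.shape[0] - h + 1):
        for c in range(image.shape[1] - w + 1):
            g = image[r:r + h, c:c + w].ravel().astype(float)  # sample
            g = (g - g.mean()) / (g.std() + 1e-12)             # normalise
            lp = log_p(P.T @ (g - g_bar), lam)                 # evaluate p(g)
            if lp > best_lp:
                best, best_lp = (r, c), lp
    return best, best_lp
```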

  30. Multi-Resolution Search • Train models at each level of the pyramid • Gaussian pyramid with step size 2 • Use the same points but different local models at each level • Start the search at a coarse resolution • Refine at progressively finer resolutions (a minimal pyramid sketch follows)
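
A minimal pyramid sketch, using a 2 x 2 mean as a simple stand-in for proper Gaussian smoothing before each step-2 subsample:

```python
import numpy as np

def gaussian_pyramid(image, levels):
    """List of images, full resolution first, halving in size at each level."""
    pyr = [image.astype(float)]
    for _ in range(levels - 1):
        im = pyr[-1]
        im = im[:im.shape[0] // 2 * 2, :im.shape[1] // 2 * 2]  # crop to even size
        pyr.append((im[0::2, 0::2] + im[1::2, 0::2] +
                    im[0::2, 1::2] + im[1::2, 1::2]) / 4.0)
    return pyr
```

Search then starts on the last (coarsest) level, and the best position, scaled up by 2 at each step, seeds the search at the next finer level.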

  31. Application: Object Detection • Scan the image to find points with the largest p(g) • If p(g) > p_min then the object is deemed present • Strictly one should use a background model and test the likelihood ratio p(g | object) / p(g | background) against a threshold • This only works if the PDFs are good approximations – often not the case

  32. Application: Face Recognition • Eigenfaces developed for face recognition • More about this later
