
Statistical Models of Appearance for Computer Vision


Presentation Transcript


  1. Statistical Models of Appearance for Computer Vision T.F. Cootes and C. J. Taylor July 10, 2000

  2. Computer Vision • Aim: image understanding using models • Challenge: deformable objects

  3. Deformable Models • Characteristics • General: able to represent any plausible example of the class • Specific: restricted to plausible examples

  4. Modeling Approaches • Cardboard Model • Stick Figure Model • Surface-Based • Volumetric • Superquadrics • Statistical Approach

  5. Why a Statistical Approach? • Widely applicable • Expert knowledge captured via the annotation of training examples • Compact representation • n-D space modeling • Few prior assumptions

  6. Topics • Statistical models of shape • Statistical models of appearance

  7. Subsections • Building the statistical models • Using the models to interpret new images

  8. Statistical Shape Models

  9. Shape • Invariant under certain transforms, e.g. in 2-3 dimensions: translation, rotation, scaling • Represented by a set of n points in d dimensions, i.e. an nd-element vector • s training examples give s such vectors

  10. Suitable Landmarks • Easy to detect • In 2-D: corners on the boundary • Consistent over images • Points between well-defined landmarks

  11. Aligning the Training Set • Procrustes Analysis: minimize D = Σᵢ|xᵢ − x̄|² • Constraints on the mean x̄ • Center • Scale • Orientation

  12. Alignment: Iterative Approach • Translate the training set so each shape is centred at the origin • Let x̄₀ be the initial estimate of the mean • "Align" all shapes with the current mean • Re-estimate the mean as x̄ • "Align" the new mean with x̄₀ and scale so that |x̄| = 1 • REPEAT from the align step until convergence (a numpy sketch follows)
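A minimal numpy sketch of this alignment loop, assuming the training set is an (s, n, 2) array of 2-D shapes and using the centre -> scale -> rotation variant of "Align" from the next slide; the function names are illustrative:

```python
import numpy as np

def align_to(shape, target):
    """Similarity-align a centred shape to a centred target."""
    # Kabsch/Procrustes: rotation maximising the correlation with target.
    u, _, vt = np.linalg.svd(shape.T @ target)
    d = np.sign(np.linalg.det(u @ vt))           # guard against reflection
    rt = u @ np.diag([1.0, d]) @ vt              # R.T for points as rows
    rotated = shape @ rt
    return rotated / np.linalg.norm(rotated)     # rescale so |x| = 1

def align_training_set(shapes, n_iters=10):
    """shapes: (s, n, 2) array of s shapes with n 2-D points each."""
    shapes = shapes - shapes.mean(axis=1, keepdims=True)  # centre at origin
    mean0 = shapes[0] / np.linalg.norm(shapes[0])         # initial mean x̄0
    mean = mean0
    for _ in range(n_iters):
        shapes = np.array([align_to(x, mean) for x in shapes])
        mean = shapes.mean(axis=0)                        # re-estimate x̄
        mean = align_to(mean, mean0)                      # constrain to x̄0
    return shapes, mean
```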

  13. What is "Align"? • Operations allowed • Center -> scale (|x| = 1) -> rotation • Center -> (scale + rotation) • Center -> (scale + rotation) -> projection onto the tangent space of the mean

  14. Tangent Space • All vectors x s.t. (x − x̄)·x̄ = 0, i.e. x·x̄ = 1 (since |x̄| = 1) • Method: project into the tangent space by scaling x by 1/(x·x̄)

  15. Modelling Shape Variation • Advantages • Generate new examples • Examine new shapes for plausibility • Form: x = M(b), where b is a vector of model parameters

  16. PCA • Compute the mean of the data: x̄ = (Σᵢ xᵢ)/s • Compute the covariance of the data: S = Σᵢ(xᵢ − x̄)(xᵢ − x̄)ᵀ/(s − 1) • Compute the eigenvectors φᵢ and corresponding eigenvalues λᵢ of S (sorted so that λᵢ ≥ λᵢ₊₁)

  17. Approximation using PCA • If Φ contains the t eigenvectors corresponding to the largest eigenvalues, then x ≈ x̄ + Φb, where Φ = (φ₁|φ₂|…|φₜ) and b is the t-dimensional parameter vector given by b = Φᵀ(x − x̄)
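A minimal numpy sketch of slides 16-17, assuming the aligned shapes are stored as an (s, nd) array with one flattened shape vector per row; names are illustrative:

```python
import numpy as np

def build_shape_model(shapes, t):
    """shapes: (s, nd) array, one flattened aligned shape per row."""
    x_bar = shapes.mean(axis=0)                      # mean shape x̄
    dev = shapes - x_bar
    cov = dev.T @ dev / (shapes.shape[0] - 1)        # covariance S
    eigvals, eigvecs = np.linalg.eigh(cov)           # ascending order
    order = np.argsort(eigvals)[::-1][:t]            # largest t first
    lam = eigvals[order]                             # λ1..λt
    phi = eigvecs[:, order]                          # Φ = (φ1|φ2|..|φt)
    return x_bar, phi, lam

def to_params(x, x_bar, phi):
    return phi.T @ (x - x_bar)                       # b = Φᵀ(x − x̄)

def from_params(b, x_bar, phi):
    return x_bar + phi @ b                           # x ≈ x̄ + Φb
```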

  18. Choice of Number of Modes t • Proportion of variance explained: choose the smallest t s.t. Σᵢ₌₁ᵗ λᵢ / Σᵢ λᵢ ≥ f_v • Accuracy with which each training example must be approximated • Test in a miss-one-out (leave-one-out) manner
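A hypothetical helper for the variance criterion; the fraction f_v = 0.98 is only an example, not a value fixed by the slides:

```python
import numpy as np

def choose_t(eigvals, f_v=0.98):
    """Smallest t whose modes explain a fraction f_v of total variance."""
    lam = np.sort(eigvals)[::-1]
    cum = np.cumsum(lam) / lam.sum()
    return int(np.searchsorted(cum, f_v) + 1)
```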

  19. Uses of PCA Principal Components Analysis (PCA) exploits the redundancy in multivariate data, enabling us to: • Pick out patterns (relationships) in the variables • Reduce the dimensionality of our data set without a significant loss of information

  20. Generating Plausible Shapes • Assumption: the bᵢ are independent and Gaussian • Options (sketched below) • Hard limits on each independent bᵢ (e.g. |bᵢ| ≤ 3√λᵢ) • Constrain b to lie within a hyperellipsoid (Σᵢ bᵢ²/λᵢ ≤ Mₜ)
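A sketch of the two options, assuming b and the eigenvalues lam come from the PCA sketch above; the limit k = 3 and the threshold m_t are illustrative choices:

```python
import numpy as np

def clamp_hard(b, lam, k=3.0):
    # Hard limits: |b_i| <= k * sqrt(lambda_i), applied independently
    lim = k * np.sqrt(lam)
    return np.clip(b, -lim, lim)

def clamp_hyperellipsoid(b, lam, m_t):
    # Constrain b so that sum(b_i^2 / lambda_i) <= m_t by rescaling
    m2 = np.sum(b**2 / lam)
    return b if m2 <= m_t else b * np.sqrt(m_t / m2)
```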

  21. Drawbacks • Inadequate for non-linear shape variations • Rotating parts of objects • Viewpoint changes • Other special cases, e.g. only 2 valid positions (a linear x = f(b) fails) • Only variations observed in the training set are represented

  22. Non-Linear Models of the PDF • Polar co-ordinates (Heap and Hogg) • Mixture of Gaussians • Drawbacks: • Choosing the number of Gaussians to use • Finding the nearest plausible shape

  23. Fitting a Model to New Points x = T(Xt,Yt,s,θ)(x̄ + Φb) • Aim: minimize |Y − x|² • Initialize the shape parameters b to 0 • Generate the model instance x = x̄ + Φb • Find the pose parameters (Xt, Yt, s, θ) which best map x to Y

  24. • Invert the pose parameters and use them to project Y into the model co-ordinate frame: y = T(Xt,Yt,s,θ)⁻¹(Y) • Project y into the tangent plane to x̄ by scaling by 1/(y·x̄) • Update the model parameters to match y: b = Φᵀ(y − x̄) • REPEAT until convergence (a sketch of the whole loop follows)
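A compact numpy sketch of the loop from slides 23-24, assuming a 2-D similarity transform for T and reusing the (x_bar, phi) model from the PCA sketch; the least-squares pose fit shown is one standard way to realise step 3, and shapes are flattened (x1, y1, x2, y2, …) vectors:

```python
import numpy as np

def similarity_fit(src, dst):
    """Least-squares (scale, rotation, translation) mapping src -> dst."""
    p, q = src.reshape(-1, 2), dst.reshape(-1, 2)
    pm, qm = p.mean(0), q.mean(0)
    pc, qc = p - pm, q - qm
    u, sv, vt = np.linalg.svd(pc.T @ qc)
    d = np.sign(np.linalg.det(u @ vt))           # guard against reflection
    rt = u @ np.diag([1.0, d]) @ vt              # R.T for points as rows
    scale = (sv[0] + d * sv[1]) / (pc**2).sum()
    t = qm - scale * pm @ rt
    return scale, rt, t

def invert_pose(y, scale, rt, t):
    return ((y.reshape(-1, 2) - t) @ rt.T / scale).ravel()

def fit_model_to_points(Y, x_bar, phi, n_iters=20):
    b = np.zeros(phi.shape[1])                   # 1. initialise b = 0
    for _ in range(n_iters):
        x = x_bar + phi @ b                      # 2. model instance
        pose = similarity_fit(x, Y)              # 3. best pose x -> Y
        y = invert_pose(Y, *pose)                # 4. Y in model frame
        y = y / (y @ x_bar)                      # 5. tangent projection (|x̄| = 1)
        b = phi.T @ (y - x_bar)                  # 6. update b
    return b, pose
```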

  25. Estimating p(shape) • dx = x − x̄ • Best approximation of dx is Φb • Residual error r = dx − Φb • p(x) = p(r)·p(b) • log p(r) = −0.5|r|²/σr² + const • log p(b) = −0.5 Σᵢ bᵢ²/λᵢ + const
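A direct transcription of slide 25, assuming the residual variance sigma_r2 has been estimated from the training set:

```python
import numpy as np

def log_p_shape(x, x_bar, phi, lam, sigma_r2):
    dx = x - x_bar
    b = phi.T @ dx                 # best approximation of dx is phi @ b
    r = dx - phi @ b               # residual not explained by the model
    log_p_r = -0.5 * (r @ r) / sigma_r2
    log_p_b = -0.5 * np.sum(b**2 / lam)
    return log_p_r + log_p_b       # up to an additive constant
```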

  26. Relaxing the Shape Model • Artificially add extra variation • Finite Element Method modes (M & K) • Perturbing the covariance matrix • Combining statistical and FEM modes • Decrease the allowed vibration modes as the number of examples increases

  27. Statistical Appearance Models

  28. Appearance • Shape • Texture: the pattern of intensities across the object

  29. Shape Normalization • Warp each image to match its control points to the mean image (triangulation algorithm) • Advantage • Removes spurious texture variation due to shape differences

  30. Intensity Normalization g = (gim − β·1)/α where α = gim·ḡ, β = (gim·1)/n
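A sketch of one normalization step, assuming ḡ (g_bar) is already a normalized mean texture; in practice the mean and the normalization are re-estimated iteratively:

```python
import numpy as np

def normalize_intensity(g_im, g_bar):
    n = g_im.size
    alpha = g_im @ g_bar           # scaling relative to the mean texture
    beta = g_im.sum() / n          # offset (mean intensity)
    return (g_im - beta) / alpha
```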

  31. PCA Model: g = ḡ + Pg·bg • ḡ = mean of the normalized data • Pg = set of orthogonal modes of variation • bg = set of grey-level parameters • gim = Tu(ḡ + Pg·bg)

  32. Combined Appearance Model • Shape parameters bs • Texture parameters bg • To capture the correlation between the two, concatenate: b = (Ws·bs ; bg) = (Ws·Psᵀ(x − x̄) ; Pgᵀ(g − ḡ))

  33. Applying PCA to b b = Qc, giving x = x̄ + Ps·Ws⁻¹·Qs·c, g = ḡ + Pg·Qg·c, where Q = (Qs ; Qg)
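A sketch of slides 32-33, assuming per-example shape and texture parameter vectors stacked row-wise and an (ns × ns) weight matrix Ws; all names are illustrative:

```python
import numpy as np

def build_combined_model(bs_rows, bg_rows, Ws, t):
    """bs_rows: (s, ns) shape params; bg_rows: (s, ng) texture params."""
    b_rows = np.hstack([bs_rows @ Ws.T, bg_rows])   # b = (Ws·bs ; bg)
    cov = np.cov(b_rows, rowvar=False)              # b has ~zero mean
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:t]
    return eigvecs[:, order]                        # Q, the combined modes

def synthesize(c, Q, ns, x_bar, Ps, Ws, g_bar, Pg):
    b = Q @ c                                       # b = Qc
    bs = np.linalg.solve(Ws, b[:ns])                # undo the shape weighting
    x = x_bar + Ps @ bs                             # x = x̄ + Ps·Ws⁻¹·Qs·c
    g = g_bar + Pg @ b[ns:]                         # g = ḡ + Pg·Qg·c
    return x, g
```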

  34. Choice of Ws • Displace each element of bs from its optimum value and observe the change in g • Ws = rI where r² is the ratio of the total intensity variation to the total shape variation • Results are relatively insensitive to the choice of Ws

  35. Example : Facial AM

  36. Approximating a New Image • Obtain bs and bg • Obtain b • Obtain c • Apply x = x̄ + Ps·Ws⁻¹·Qs·c, g = ḡ + Pg·Qg·c • Invert the grey-level normalization • Apply the pose to the points • Project the grey-level vector into the image

  37. Fitting a Model to New Points x = T(Xt,Yt,s,θ)(x̄ + Φb) • Aim: minimize |Y − x|² • Initialize the shape parameters b to 0 • Generate the model instance x = x̄ + Φb • Find the pose parameters (Xt, Yt, s, θ) which best map x to Y

  38. • Invert the pose parameters and use them to project Y into the model co-ordinate frame: y = T(Xt,Yt,s,θ)⁻¹(Y) • Project y into the tangent plane to x̄ by scaling by 1/(y·x̄) • Update the model parameters to match y: b = Φᵀ(y − x̄) • REPEAT until convergence

  39. Example

  40. Active Shape Models

  41. Problem Statement • Given a rough starting approximation, how do we fit an instance of a model to the image?

  42. Iterative Approach • Examine a region of the image around each point Xᵢ to find the best nearby match Xᵢ′ • Update the parameters (Xt, Yt, s, θ, b) to best fit the newly found points X′ • REPEAT until convergence

  43. In Practice

  44. Modeling Local Structure • Sample the derivative along a profile, k pixels on either side of a model point, to get a vector gᵢ of 2k + 1 values • Normalize • Repeat for each training image at the same model point to get {gᵢ} • Estimate the mean ḡ and covariance Sg • Fit quality is the Mahalanobis distance f(gs) = (gs − ḡ)ᵀSg⁻¹(gs − ḡ)
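A sketch of the profile model, assuming `profiles` holds one normalized derivative profile of length 2k + 1 per training image for a single model point; the pseudo-inverse is an assumption for numerical stability, not something the slides specify:

```python
import numpy as np

def build_profile_model(profiles):
    g_bar = profiles.mean(axis=0)        # mean profile ḡ
    S_g = np.cov(profiles, rowvar=False) # covariance Sg
    S_g_inv = np.linalg.pinv(S_g)        # pseudo-inverse for stability
    return g_bar, S_g_inv

def fit_quality(g_s, g_bar, S_g_inv):
    # Mahalanobis distance: lower means a better match
    d = g_s - g_bar
    return d @ S_g_inv @ d
```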

  45. Using the Local Structure Model • Sample a profile of m pixels either side of the current point (m > k) • Test the quality of fit at the 2(m − k) + 1 possible positions • Choose the one which gives the best match (lowest f), as sketched below
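A sketch of this search, sliding the (2k + 1)-long model window along a sampled profile of 2m + 1 values; it returns the offset of the best match for one model point:

```python
import numpy as np

def best_profile_match(long_profile, g_bar, S_g_inv, k):
    m = (len(long_profile) - 1) // 2
    best_offset, best_f = 0, np.inf
    for offset in range(-(m - k), m - k + 1):    # 2(m-k)+1 candidates
        centre = m + offset
        g_s = long_profile[centre - k:centre + k + 1]
        d = g_s - g_bar
        f = d @ S_g_inv @ d                      # Mahalanobis fit quality
        if f < best_f:
            best_offset, best_f = offset, f
    return best_offset                           # suggested point shift
```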

  46. Multi-Resolution ASM

  47. Advantages • Speed • Less likely to get stuck on the wrong image structure

  48. Complete Algorithm • Set L = Lmax • While L ≥ 0: • Compute the model point positions in the image at level L • Evaluate the fit at ns points along the profile of each model point • Update the pose and shape parameters to fit the model to the new points • Repeat at this level until more than pclose of the points satisfy the convergence criterion, then move to level L − 1 • Return the result when level 0 converges
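A structural sketch of this multi-resolution search; search_fn and update_fn stand in for the profile search (slide 45) and the parameter update (slides 23-24), and the convergence test and default values are illustrative assumptions:

```python
import numpy as np

def multi_res_asm(pyramid, points, search_fn, update_fn, L_max,
                  p_close=0.9, n_max=25):
    """pyramid[L]: image at level L (each level half the resolution);
    search_fn(image, points) -> per-point offsets;
    update_fn(points, offsets) -> points after pose + shape update."""
    for L in range(L_max, -1, -1):                 # coarse -> fine
        for _ in range(n_max):
            offsets = search_fn(pyramid[L], points)
            points = update_fn(points, offsets)
            # converged when most points barely moved at this level
            if np.mean(np.abs(offsets) <= 1) > p_close:
                break
        if L > 0:
            points = points * 2.0                  # map to the finer level
    return points
```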
