Spectral Methods, Tutorial 6
© Maks Ovsjanikov, tosca.cs.technion.ac.il/book
Numerical geometry of non-rigid shapes, Stanford University, Winter 2009
Outline
• On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
 - Classic MDS and PCA review.
 - Metric MDS.
 - Kernel PCA, kernel trick, relation to Metric MDS.
 - Summary.
• Articulated Shape Matching by Robust Alignment of Embedded Representations, Mateus D. et al., Workshop on 3DRR, 2007
• Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation, Rustamov R., SGP, 2007
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
Classic MDS (classical scaling) recap.
Given a dissimilarity matrix D = (d_ij) arising from points in a normed vector space, we want to find coordinates of points that would give rise to D.
E.g. given the pairwise distances between cities on a map, find the locations of the cities.
Can only hope to recover the points up to rotation and translation.
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
Classic MDS (classical scaling).
Centering matrix H = I - (1/n) 1 1^T.
Define B = -1/2 H A H, where A_ij = d_ij^2.
Attention: only works for normed vector spaces!
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
• Classic MDS (classical scaling).
• Define B = -1/2 H A H, where A_ij = d_ij^2.
• Express B = (HX)(HX)^T to obtain the Gram matrix of the centered points.
• Note that if Y = XR, then (HY)(HY)^T = (HX)(HX)^T for any orthonormal R, so B cannot distinguish rotated copies of the data.
• Since B is symmetric and positive semi-definite, can find its eigendecomposition B = V Λ V^T and recover coordinates X̂ = V Λ^{1/2}.
Multivariate Analysis, Mardia K.V. et al., Academic Press, 1979
• Classic MDS (classical scaling).
• Although B is an n x n matrix, it has only k non-zero eigenvalues if X was sampled from R^k.
• Can project onto the first k eigenvectors by taking X̂ = V_k Λ_k^{1/2}.
• Optimality condition of classic MDS.
• Theorem: If X is a set of n points in R^p with distances d_ij, then for any k-dimensional orthonormal projection the distortion Σ_{i,j} (d_ij^2 - d̂_ij^2), where d̂_ij are the distances after projection, is minimized when X is projected onto its k principal directions.
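A minimal Python/NumPy sketch of classical scaling as described above; the function name classical_mds and the random test points are illustrative, not from the slides.

```python
import numpy as np

def classical_mds(D, k):
    """Classical scaling: recover k-dimensional coordinates (up to a rigid
    motion) from a matrix D of pairwise Euclidean distances."""
    n = D.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n       # centering matrix H = I - (1/n) 1 1^T
    A = D ** 2                                # squared distances
    B = -0.5 * H @ A @ H                      # B = (HX)(HX)^T when D is Euclidean
    evals, evecs = np.linalg.eigh(B)          # B is symmetric
    order = np.argsort(evals)[::-1][:k]       # keep the k largest eigenvalues
    L = np.sqrt(np.clip(evals[order], 0.0, None))
    return evecs[:, order] * L                # X_hat = V_k Lambda_k^{1/2}

# Example: planar points are recovered up to rotation, translation, reflection.
X = np.random.rand(10, 2)
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
X_hat = classical_mds(D, k=2)
```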
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
Classic MDS – Relation to PCA.
In standard Principal Component Analysis, one performs the eigendecomposition of the covariance matrix of the data, trying to find a more natural basis in which to express the points.
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
Classic MDS – Relation to PCA.
In standard Principal Component Analysis, one performs the eigendecomposition of the covariance matrix. Using the centering matrix, we can express it (up to the factor 1/n) as (HX)^T (HX).
For any eigenvalue λ of (HX)^T (HX) with eigenvector v we have (HX)^T (HX) v = λ v, which implies (HX)(HX)^T (HX v) = λ (HX v), i.e. B (HX v) = λ (HX v).
The non-zero eigenvalues of B and (HX)^T (HX) are the same, and the eigenvectors of B are given by u = HX v.
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
Classic MDS – Relation to PCA.
The non-zero eigenvalues of B = (HX)(HX)^T and (HX)^T (HX) are the same, and the eigenvectors are related by u = HX v.
The matrix (HX)^T (HX) has the advantage that its size is p x p rather than n x n, and it is typically positive definite rather than positive semi-definite, so its eigendecomposition is more stable.
However, if we are only given the pairwise distances, we cannot construct (HX)^T (HX) directly.
The two methods solve different problems!
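A small numerical check of this relation (illustrative sketch, not from the paper): the non-zero eigenvalues of the centered p x p matrix and of B coincide, and an eigenvector v of the former maps to the eigenvector HXv of B.

```python
import numpy as np

n, p = 50, 3
X = np.random.rand(n, p)
H = np.eye(n) - np.ones((n, n)) / n
Xc = H @ X                                    # centered data

C = Xc.T @ Xc                                 # p x p matrix (n times the covariance)
B = Xc @ Xc.T                                 # n x n matrix used by classical MDS

ev_C = np.sort(np.linalg.eigvalsh(C))[::-1]
ev_B = np.sort(np.linalg.eigvalsh(B))[::-1]
print(np.allclose(ev_C, ev_B[:p]))            # the non-zero spectra agree

w, V = np.linalg.eigh(C)                      # eigenvector v of C ...
v = V[:, -1]
u = Xc @ v                                    # ... maps to the eigenvector HXv of B
print(np.allclose(B @ u, w[-1] * u))
```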
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
• Metric MDS.
• Suppose instead of minimizing the distortion (stress) directly, we want to minimize a derived stress: given pairwise distances d_ij and a function f, find a set of points x̂_i minimizing Σ_{i<j} (f(d_ij) - ||x̂_i - x̂_j||)^2.
• Even if the d_ij come from a Euclidean space, this problem is much more difficult.
• Resort to numerical optimization: differentiate with respect to the x̂_i to get the gradient (see the sketch below).
• Alternative: perform classical MDS on the derived distances f(d_ij), i.e. solve an eigensystem.
• Problem: the matrix B built from the derived distances is no longer guaranteed to be positive semi-definite.
Critchley F., Multidimensional Scaling: a short critique and a new algorithm, COMPSTAT, 1978
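A hedged sketch of metric MDS by direct stress minimization with a generic optimizer; the derived-distance function f = sqrt, the random initialization, and the function name metric_mds are illustrative choices, not the algorithms cited above.

```python
import numpy as np
from scipy.optimize import minimize

def metric_mds(D, k, f=np.sqrt, max_iter=500):
    """Find points x_i minimizing the derived stress
    sum_{i<j} (f(d_ij) - ||x_i - x_j||)^2 by numerical optimization."""
    n = D.shape[0]
    target = f(D)
    iu = np.triu_indices(n, 1)

    def stress(x):
        X = x.reshape(n, k)
        d_hat = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
        return np.sum((target[iu] - d_hat[iu]) ** 2)

    x0 = 0.1 * np.random.randn(n * k)         # random start; classical MDS would be a better one
    res = minimize(stress, x0, method="L-BFGS-B", options={"maxiter": max_iter})
    return res.x.reshape(n, k)
```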
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
Kernel PCA.
Basic Idea: represent a point x by its image Φ(x) in a feature space F. The two domains can be completely different!
Kernel Trick: in many applications we do not need to know Φ explicitly; we only need to operate with inner products, which is possible whenever the kernel k(x, y) = <Φ(x), Φ(y)> can be computed efficiently (e.g. F can be infinite dimensional).
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
Kernel PCA.
Could do PCA in the feature space: compute the covariance matrix of the feature vectors Φ(x_i) and perform its eigendecomposition.
However, instead of the feature-space covariance, one can use the n x n kernel matrix K with K_ij = <Φ(x_i), Φ(x_j)> = k(x_i, x_j).
If the dimension of the feature vectors is larger than n, this is more efficient!
To center the data, so that Σ_i Φ(x_i) = 0, can use the centering matrix and find the eigenvalues of H K H.
Schölkopf, B., et al., Nonlinear component analysis as a kernel eigenvalue problem, Neural Computation, 1998
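A minimal kernel PCA sketch, assuming a Gaussian kernel for illustration: build K, double-center it with H, and eigendecompose H K H instead of the feature-space covariance.

```python
import numpy as np

def kernel_pca(X, k, sigma=1.0):
    """Top-k kernel principal components with an isotropic Gaussian kernel
    k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    n = X.shape[0]
    sq = np.sum((X[:, None] - X[None, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2.0 * sigma ** 2))      # kernel matrix, K_ij = k(x_i, x_j)
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H                            # centering in the feature space
    evals, evecs = np.linalg.eigh(Kc)
    order = np.argsort(evals)[::-1][:k]
    L = np.sqrt(np.clip(evals[order], 0.0, None))
    return evecs[:, order] * L                # feature-space coordinates of the points
```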
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
Kernel PCA and Metric MDS.
Spherical (isotropic) kernel: depends only on the distance between the points, k(x_i, x_j) = r(d_ij) with d_ij = ||x_i - x_j||.
If we assume that r(0) = 1, then the feature-space distance is
||Φ(x_i) - Φ(x_j)||^2 = k(x_i, x_i) + k(x_j, x_j) - 2 k(x_i, x_j) = 2 (1 - r(d_ij)).
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
• Kernel PCA and Metric MDS.
• Suppose we're given a matrix of pairwise distances d_ij.
• If we set A_ij = -1/2 ||Φ(x_i) - Φ(x_j)||^2, then A_ij = r(d_ij) - 1 = K_ij - 1.
• In matrix form: A = K - 1 1^T, and moreover: H A H = H K H (since H 1 = 0).
• Thus, performing classical MDS on the feature-space distances is equivalent to performing kernel PCA on K.
• Classical MDS in the feature space attempts to approximate the distances sqrt(2 (1 - r(d_ij))), which is a nonlinear function of the original distance. So classical MDS in the feature space is metric MDS on the d_ij with f(d) = sqrt(2 (1 - r(d))).
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
Kernel PCA and Metric MDS.
Thus, performing classical MDS on the feature-space distances is equivalent to performing it on A, and hence on K.
Classical MDS here attempts to approximate sqrt(2 (1 - r(d_ij))), a nonlinear function of distance, so it is metric MDS on the original distances with f(d) = sqrt(2 (1 - r(d))).
Since H A H = H K H, the matrix is positive semi-definite whenever the kernel is chosen appropriately. This is not the case for arbitrary metric MDS functions f.
An advantage of doing Kernel PCA is that a new point can be quickly projected onto a pre-computed basis; this is difficult with numerical optimization.
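A quick numerical check of the equivalence, assuming a Gaussian (spherical) kernel for illustration: classical MDS on the derived distances 2(1 - r(d)) produces the same centered matrix, and hence the same spectrum, as kernel PCA on K.

```python
import numpy as np

n, sigma = 30, 0.5
X = np.random.rand(n, 3)
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
H = np.eye(n) - np.ones((n, n)) / n

# Kernel PCA side: spherical kernel r(d) = exp(-d^2 / (2 sigma^2)), so r(0) = 1.
K = np.exp(-D ** 2 / (2.0 * sigma ** 2))
spec_kpca = np.sort(np.linalg.eigvalsh(H @ K @ H))

# Classical MDS side: derived squared distances d~^2 = 2 (1 - r(d)).
A = -0.5 * 2.0 * (1.0 - K)                    # A_ij = -1/2 d~_ij^2 = r(d_ij) - 1
spec_mds = np.sort(np.linalg.eigvalsh(H @ A @ H))

print(np.allclose(spec_kpca, spec_mds))       # same spectra: same embedding
```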
On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
Summary:
If the distance matrix comes from points in a normed vector space, MDS reduces to an eigenvalue problem: classical scaling.
Classical MDS is also closely related to PCA, which computes the optimal basis when the point positions are known.
Kernel PCA maps the points to a feature space and uses the kernel trick to compute PCA in that space.
Metric MDS approximates derived distances f(d_ij), for some given function f.
If the kernel is spherical, then Kernel PCA is a special case of Metric MDS, with f(d) = sqrt(2 (1 - r(d))).
Outline
• On a Connection between Kernel PCA and Metric Multidimensional Scaling, Williams C., Advances in Neural Information Proc. Sys., 2001
 - Classic MDS and PCA review.
 - Metric MDS.
 - Kernel PCA, kernel trick, relation to Metric MDS.
 - Summary.
• Articulated Shape Matching by Robust Alignment of Embedded Representations, Mateus D. et al., Workshop on 3DRR, 2007
• Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation, Rustamov R., SGP, 2007
Articulated Shape Matching by Robust Alignment of Embedded Representations, Mateus D. et al., Workshop on 3DRR, 2007
Problem: Given 2 articulated shapes in different poses, find the point correspondences between them.
Many degrees of freedom, so rigid alignment cannot be applied.
Images by Q.-X. Huang et al. 08
Articulated Shape Matching by Robust Alignment of Embedded Representations, Mateus D. et al., Workshop on 3DRR, 2007
Approach:
Embed each shape into a feature space defined by the Laplacian.
The embedding is isometry invariant: Φ(x) = Φ(T(x)) for any isometric deformation T.
The embedding is only defined up to a rigid transform in the feature space.
Find the optimal rigid transform in the feature space to recover the correspondences.
Articulated Shape Matching by Robust Alignment of Embedded Representations, Mateus D. et al., Workshop on 3DRR, 2007
• Approach:
• The shape is given as a point cloud. Approximate the Laplacian with a weighted graph: W_ij encodes the affinity of nearby points (e.g. a Gaussian of their distance), D is the diagonal degree matrix, L = D - W.
• Solve the generalized eigenvalue problem L v = λ D v.
• Find the k most significant eigenvalues/eigenvectors.
• For each data point x_i, let Φ(x_i) = (v_1(i), ..., v_k(i)), where v_j is the j-th eigenvector of the problem above (see the sketch below).
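A sketch of the embedding step under common assumptions (a fully connected Gaussian-weighted graph and the generalized problem L v = λ D v); the exact weighting used in the paper may differ.

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_embedding(P, k, sigma=1.0):
    """Embed a point cloud P (n x 3) using the first k non-trivial generalized
    eigenvectors of a Gaussian-weighted graph Laplacian."""
    sq = np.sum((P[:, None] - P[None, :]) ** 2, axis=-1)
    W = np.exp(-sq / (2.0 * sigma ** 2))      # affinities between points
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))                # degree matrix
    L = D - W                                 # graph Laplacian
    evals, evecs = eigh(L, D)                 # generalized problem L v = lambda D v
    return evecs[:, 1:k + 1]                  # skip the constant eigenvector; row i embeds point i
```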
Articulated Shape Matching by Robust Alignment of Embedded Representations, Mateus D. et al., Workshop on 3DRR, 2007
• Approach:
• For each data point x_i, let Φ(x_i) = (v_1(i), ..., v_k(i)), where v_j is the j-th eigenvector.
• Would like to have Φ(x_i) = Φ'(y_i) for corresponding points. However:
• Reflection: each eigenvector is only defined up to a sign.
• Rotation: if two eigenvectors correspond to the same eigenvalue, then any rotation within their eigenspace is also an eigenvector.
• Points from the two point sets can therefore be aligned using Φ'(y) = R Φ(x), where R is orthogonal.
Articulated Shape Matching by Robust Alignment of Embedded Representations, Mateus D. et al., Workshop on 3DRR, 2007
Approach:
Given point correspondences, it is easy to obtain the optimal orthogonal matrix R: the SVD approach from optimal rigid alignment.
Let C = Σ_i Φ(x_i) Φ'(y_i)^T and compute its singular value decomposition C = U Σ V^T.
The optimal solution is given by R = V U^T.
With this step, can perform ICP in the feature space to find the optimal correspondences.
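The alignment step as a short sketch (standard orthogonal Procrustes; the helper name align_embeddings is illustrative):

```python
import numpy as np

def align_embeddings(Phi_x, Phi_y):
    """Optimal orthogonal R minimizing sum_i ||R Phi_x[i] - Phi_y[i]||^2
    for corresponding rows of the two embeddings."""
    C = Phi_x.T @ Phi_y                       # C = sum_i Phi(x_i) Phi'(y_i)^T
    U, S, Vt = np.linalg.svd(C)
    return Vt.T @ U.T                         # R = V U^T

# Apply as Phi_x_aligned = Phi_x @ R.T; inside ICP, alternate between
# nearest-neighbour correspondences in feature space and re-estimating R.
```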
Articulated Shape Matching by Robust Alignment of Embedded Representations Mateus D. et al., Workshop on 3DRR, 2007 Results:
Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation, Rustamov R., SGP, 2007
Main Goal: Find a good, isometry-invariant shape descriptor.
Good: efficient, easily computable, insensitive to local topology changes (unlike MDS).
Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation, Rustamov R., SGP, 2007
Main Idea: For every point p, define a Global Point Signature
GPS(p) = (φ_1(p)/sqrt(λ_1), φ_2(p)/sqrt(λ_2), ...),
where φ_i is an eigenfunction of the Laplace-Beltrami operator with eigenvalue λ_i.
GPS is a mapping of the surface into an infinite dimensional space; each point gets a signature.
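A sketch of a discrete GPS computation, assuming a symmetric discrete Laplacian matrix L is already available (the paper uses a specific mesh discretization of the Laplace-Beltrami operator, which is omitted here):

```python
import numpy as np
from scipy.linalg import eigh

def gps_signature(L, k):
    """Discrete Global Point Signature from a symmetric discrete Laplacian L:
    row i is (phi_1(i)/sqrt(lambda_1), ..., phi_k(i)/sqrt(lambda_k)),
    skipping the constant eigenvector with eigenvalue 0."""
    evals, evecs = eigh(L)                    # ascending eigenvalues
    lam = evals[1:k + 1]
    phi = evecs[:, 1:k + 1]
    return phi / np.sqrt(lam)                 # row i is the GPS of vertex i
```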
Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation, Rustamov R., SGP, 2007
• Properties of GPS:
• If GPS(p) = GPS(q), then p = q: the embedding has no self-intersections.
• GPS is isometry invariant (since the Laplace-Beltrami operator is).
• Given all eigenfunctions and eigenvalues, can recover the shape up to isometry (not true if only the eigenvalues are known).
• Euclidean distances in the GPS embedding are meaningful: K-means done on the embedding provides a segmentation.
Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation, Rustamov R., SGP, 2007
• Comparing GPS:
• Given a shape, determine its GPS embedding.
• Construct a histogram of pairwise GPS distances (note that although GPS is defined only up to sign flips, these distances are preserved).
• For any 2 shapes, compute the norm of the difference between their histograms (see the sketch below).
• For refined comparisons, use more than one histogram.
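A sketch of the histogram comparison; the bin count, the shared binning range, and the choice of norm are illustrative choices, not taken from the paper.

```python
import numpy as np

def gps_distance_histogram(G, bins, d_range):
    """Normalized histogram of pairwise distances between GPS signatures
    (rows of G); unaffected by per-eigenvector sign flips."""
    d = np.linalg.norm(G[:, None] - G[None, :], axis=-1)
    h, _ = np.histogram(d[np.triu_indices(len(G), 1)], bins=bins,
                        range=d_range, density=True)
    return h

def shape_distance(G1, G2, bins=64):
    """Norm of the difference between the two histograms, computed over a
    shared range so that the bins are comparable."""
    d_max = max(np.linalg.norm(G1[:, None] - G1[None, :], axis=-1).max(),
                np.linalg.norm(G2[:, None] - G2[None, :], axis=-1).max())
    h1 = gps_distance_histogram(G1, bins, (0.0, d_max))
    h2 = gps_distance_histogram(G2, bins, (0.0, d_max))
    return np.linalg.norm(h1 - h2)
```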
Laplace-Beltrami Eigenfunctions for Deformation Invariant Shape Representation Rustamov R., SGP, 2007 • Results:
Conclusions
• Kernel methods attempt to embed the shape into a feature space that can be manipulated more easily.
• The Laplacian embedding is useful because of its isometry invariance; it can be used for comparing non-rigid shapes under isometric deformations.
• Sign flipping and repeated eigenvalues can cause difficulties (there is no canonical way to choose them).
• Limitations:
 - Embeddings are not necessarily stable or mesh independent.
 - Difficult to compute for large meshes (millions of points).
 - Neither topological nor geometric stability is well understood.