
Graph Embedding: A General Framework for Dimensionality Reduction


Presentation Transcript


  1. Graph Embedding: A General Framework for Dimensionality Reduction Dong XU School of Computer Engineering Nanyang Technological University http://www.ntu.edu.sg/home/dongxu dongxu@ntu.edu.sg

  2. What is Dimensionality Reduction? Examples (2D space to 1D space): PCA, LDA
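The 2D-to-1D example on this slide can be sketched in a few lines of numpy. This is a minimal, hedged illustration (the correlated point cloud is an invented toy dataset, not data from the slides): PCA projects the points onto the direction of greatest variance.

```python
import numpy as np

# Hedged sketch of the slide's 2-D -> 1-D example using PCA: project
# a correlated 2-D point cloud onto its first principal direction.
# The synthetic data are an illustrative assumption.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
X = np.column_stack([x, 0.5 * x + 0.1 * rng.normal(size=200)])

Xc = X - X.mean(axis=0)                  # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)          # 2x2 sample covariance
eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues
w = eigvecs[:, -1]                       # top principal direction
y = Xc @ w                               # the 1-D embedding
```

For this strongly correlated cloud, the ratio y.var() / eigvals.sum() stays close to 1: almost all the variance survives the projection.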

  3. What is Dimensionality Reduction? Example (3D space to 2D space): ISOMAP, geodesic distance preserving (J. Tenenbaum et al., 2000)
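A hedged sketch of the ISOMAP idea referenced on this slide: approximate geodesic distances by shortest paths over a k-nearest-neighbor graph, then embed them with classical MDS. The helix dataset and the choice of k are illustrative assumptions, not from the slides.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.sparse.csgraph import shortest_path

# Hedged ISOMAP sketch (3-D -> 2-D): geodesic distances approximated
# by shortest paths over a k-NN graph, embedded with classical MDS.
n, k = 100, 5
t = np.linspace(0, 4 * np.pi, n)
X = np.column_stack([np.cos(t), np.sin(t), t])      # 3-D helix (toy data)

d = np.sqrt(((X[:, None] - X[None]) ** 2).sum(-1))  # Euclidean distances
W = np.zeros((n, n))                                # dense: 0 = no edge
nn = np.argsort(d, axis=1)[:, 1:k + 1]              # k nearest neighbors
for i in range(n):
    W[i, nn[i]] = d[i, nn[i]]
W = np.maximum(W, W.T)                              # symmetrize the graph

G = shortest_path(W, method='D', directed=False)    # geodesic estimates
J = np.eye(n) - np.ones((n, n)) / n
Bmds = -0.5 * J @ (G ** 2) @ J                      # classical MDS centering
vals, vecs = eigh(Bmds)
Y = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0))  # 2-D embedding
```

On this helix the dominant embedding coordinate tracks the arc-length parameter t, i.e. the curve is "unrolled", which is the geodesic-preserving behavior the slide illustrates.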

  4. Why Conduct Dimensionality Reduction? • Uncover intrinsic structure • Visualization • Feature Extraction • Computation Efficiency • Broad Applications: Face Recognition, Human Gait Recognition, CBIR. Illustration: LPP (He et al., 2003) applied to face images under expression variation and pose variation.

  5. Representative Previous Work PCA and LDA; ISOMAP: geodesic distance preserving (J. Tenenbaum et al., 2000); LLE: local neighborhood relationship preserving (S. Roweis & L. Saul, 2000); LE/LPP: local similarity preserving (M. Belkin, P. Niyogi et al., 2001, 2003)

  6. Hundreds of dimensionality reduction algorithms exist: statistics-based (e.g., PCA/KPCA, LDA/KDA), geometry-based (e.g., ISOMAP, LLE, LE/LPP), in both matrix and tensor forms. Is there any common perspective to understand and explain these dimensionality reduction algorithms? Or any unified formulation that is shared by them? Any general tool to guide the development of new algorithms for dimensionality reduction?

  7. Our Answers S. Yan, D. Xu, H. Zhang et al., CVPR 2005 and T-PAMI 2007 (Google citations: 174 until 15-Sep-2009). The framework covers four formulation types, each with example algorithms:
  Direct Graph Embedding (original formulation): PCA & LDA, ISOMAP, LLE, Laplacian Eigenmap
  Linearization: PCA, LDA, LPP
  Kernelization: KPCA, KDA
  Tensorization: CSA, DATER

  8. Direct Graph Embedding Data in the high-dimensional space and in the low-dimensional space (assumed to be 1D here): X = [x_1, x_2, ..., x_N], y = [y_1, y_2, ..., y_N]^T. Intrinsic graph: G = {X, S}; penalty graph: G^p = {X, S^p}. S, S^p: similarity matrices (graph edge weights) encoding similarity in the high-dimensional space. L, B: Laplacian matrices derived from S and S^p, i.e. L = D - S with D_ii = sum_{j != i} S_ij; B is typically the Laplacian of the penalty graph, L^p.

  9. Direct Graph Embedding -- Continued Criterion to preserve graph similarity: y* = argmin_{y^T B y = d} sum_{i != j} ||y_i - y_j||^2 S_ij = argmin_{y^T B y = d} y^T L y, solved as the generalized eigenvalue problem L y = lambda B y. Special case: B is the identity matrix (scale normalization). Problem: direct graph embedding only yields coordinates for the training points; it cannot handle new test data.
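The criterion above can be sketched numerically. In this hedged example (the toy data and Gaussian graph weights are assumptions, not from the slides), the Laplacian of an intrinsic graph is built and the embedding is obtained from the generalized eigenvalue problem L y = lambda B y with B = I:

```python
import numpy as np
from scipy.linalg import eigh

# Hedged sketch of direct graph embedding: build an intrinsic graph S
# over toy data, form its Laplacian L = D - S, and solve
# y* = argmin_{y^T B y = 1} y^T L y with B = I (scale normalization)
# as the generalized eigenvalue problem L y = lambda B y.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))                     # 30 points in 3-D

d2 = ((X[:, None] - X[None]) ** 2).sum(-1)       # pairwise squared distances
S = np.exp(-d2 / d2.mean())                      # Gaussian similarity weights
np.fill_diagonal(S, 0.0)

D = np.diag(S.sum(axis=1))
L = D - S                                        # graph Laplacian
B = np.eye(30)                                   # scale normalization

eigvals, eigvecs = eigh(L, B)                    # ascending eigenvalues
y = eigvecs[:, 1]                                # skip the trivial constant vector
```

The smallest eigenvalue is (numerically) zero with a constant eigenvector, so the first informative embedding is the second eigenvector, exactly as in Laplacian Eigenmaps.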

  10. Linearization Assume the low-dimensional representation comes from a linear mapping function: y = X^T w, i.e. y_i = w^T x_i. Objective function in linearization: w* = argmin_{w^T X B X^T w = d} w^T X L X^T w, again a generalized eigenvalue problem (X L X^T w = lambda X B X^T w) applied to the intrinsic and penalty graphs. New test data can now be embedded. Problem: a linear mapping function may not be enough to preserve the real nonlinear structure of the data.
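A hedged sketch of the linearized objective (toy data and graph are assumptions): the same graph criterion as before, but optimized over a projection direction w rather than over the coordinates y directly, which lets new samples be embedded afterwards.

```python
import numpy as np
from scipy.linalg import eigh

# Hedged sketch of linearization: restrict the embedding to a linear
# mapping y = X^T w, so y^T L y under y^T B y = d becomes
# w^T X L X^T w under w^T X B X^T w = d (X holds one sample per
# column). Data and graph weights are toy assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 30))                     # 3-D features, 30 samples

pts = X.T
d2 = ((pts[:, None] - pts[None]) ** 2).sum(-1)
S = np.exp(-d2 / d2.mean())                      # Gaussian similarity graph
np.fill_diagonal(S, 0.0)
L = np.diag(S.sum(axis=1)) - S                   # intrinsic-graph Laplacian
B = np.eye(30)                                   # scale normalization

eigvals, eigvecs = eigh(X @ L @ X.T, X @ B @ X.T)  # generalized eigenproblem
w = eigvecs[:, 0]                                # optimal projection direction
y = X.T @ w                                      # embedding of the training data

x_new = rng.normal(size=3)
y_new = x_new @ w                                # a new test point can now be embedded
```

The last two lines show the practical gain over direct graph embedding: the learned w applies to samples never seen during training.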

  11. Kernelization Nonlinear mapping phi: the original input space is mapped to another, higher-dimensional Hilbert space F. Kernel matrix: K_ij = k(x_i, x_j) = phi(x_i) . phi(x_j). Constraint: the projection direction lies in the span of the mapped samples, w = sum_i alpha_i phi(x_i). Objective function in kernelization: alpha* = argmin_{alpha^T K B K alpha = d} alpha^T K L K alpha, for the intrinsic and penalty graphs.
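The kernelized objective can be sketched as follows. This is a hedged toy example (the RBF bandwidth, the reuse of kernel values as graph weights, and the small ridge term are all assumptions for illustration):

```python
import numpy as np
from scipy.linalg import eigh

# Hedged sketch of kernelization: samples are mapped implicitly via an
# RBF kernel, the projection is expanded as w = sum_i alpha_i phi(x_i),
# and the criterion becomes alpha^T K L K alpha under
# alpha^T K B K alpha = d.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))

d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
K = np.exp(-d2 / d2.mean())                      # RBF kernel matrix (bandwidth assumed)

S = K.copy()
np.fill_diagonal(S, 0.0)                         # reuse kernel values as graph weights
L = np.diag(S.sum(axis=1)) - S                   # intrinsic-graph Laplacian
B = np.eye(30)                                   # scale normalization

A = K @ L @ K
C = K @ B @ K + 1e-8 * np.eye(30)                # small ridge for numerical stability
eigvals, alphas = eigh(A, C)                     # generalized eigenproblem
alpha = alphas[:, 0]
y = K @ alpha                                    # nonlinear embedding of the data
```

A new test point x can also be embedded as sum_i alpha_i k(x_i, x), since the mapping is expressed entirely through kernel evaluations.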

  12. Tensorization Each sample is a tensor X_i, and the low-dimensional representation is obtained as a multilinear projection: y_i = X_i x_1 w^1 x_2 w^2 ... x_n w^n, where x_k denotes the mode-k product. Objective function in tensorization: (w^1, ..., w^n)* = argmin sum_{i != j} ||y_i - y_j||^2 S_ij, subject to the corresponding penalty-graph constraint sum_{i != j} ||y_i - y_j||^2 S^p_ij = d; it is typically solved by alternating optimization over the w^k.
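A hedged sketch of tensorization for 2nd-order tensors (matrix-shaped samples such as images). The data, graph weights, ridge terms, and fixed iteration count are illustrative assumptions; the point is the alternating structure, where fixing one direction reduces the problem to a linearized generalized eigenproblem in the other.

```python
import numpy as np
from scipy.linalg import eigh

# Hedged sketch of tensorization for 2nd-order tensors: every sample
# X_i is embedded as y_i = w1^T X_i w2, with w1, w2 found by
# alternating generalized eigenproblems.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(20, 4, 5))                 # 20 samples of 4x5 matrices

flat = Xs.reshape(20, -1)
d2 = ((flat[:, None] - flat[None]) ** 2).sum(-1)
S = np.exp(-d2 / d2.mean())
np.fill_diagonal(S, 0.0)
L = np.diag(S.sum(axis=1)) - S                   # intrinsic-graph Laplacian
B = np.eye(20)                                   # scale normalization

w1, w2 = np.ones(4), np.ones(5)
for _ in range(5):                               # alternating optimization
    V = Xs @ w2                                  # fix w2: each sample becomes a 4-vector
    w1 = eigh(V.T @ L @ V, V.T @ B @ V + 1e-8 * np.eye(4))[1][:, 0]
    U = np.einsum('nij,i->nj', Xs, w1)           # fix w1: each sample becomes a 5-vector
    w2 = eigh(U.T @ L @ U, U.T @ B @ U + 1e-8 * np.eye(5))[1][:, 0]

y = np.einsum('i,nij,j->n', w1, Xs, w2)          # final 1-D tensor embedding
```

Note that the sample matrices are never vectorized during projection, which is the defining property of the tensor formulation.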

  13. Common Formulation Intrinsic graph S and penalty graph S^p: similarity matrices; L, B: Laplacian matrices derived from S and S^p. All four types minimize the same graph-preserving criterion y^T L y under the constraint y^T B y = d; they differ only in how y is parameterized. Direct graph embedding: y itself. Linearization: y = X^T w. Kernelization: y = K alpha. Tensorization: y_i = X_i x_1 w^1 x_2 w^2 ... x_n w^n.

  14. A General Framework for Dimensionality Reduction D: Direct Graph Embedding; L: Linearization; K: Kernelization; T: Tensorization

  15. New Dimensionality Reduction Algorithm: Marginal Fisher Analysis Important information for face recognition: 1) label information; 2) local manifold structure (neighborhood or margin). Intrinsic graph: S_ij = 1 if x_i is among the k1 nearest neighbors of x_j in the same class, 0 otherwise. Penalty graph: S^p_ij = 1 if the pair (i, j) is among the k2 shortest pairs between samples from different classes, 0 otherwise.
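The two MFA graphs described above can be sketched directly. This is a hedged toy example: the two-class Gaussian data and the values of k1 and k2 are assumptions chosen for illustration.

```python
import numpy as np

# Hedged sketch of the two Marginal Fisher Analysis graphs: the
# intrinsic graph links each sample to its k1 nearest same-class
# neighbors (intraclass compactness); the penalty graph links the k2
# shortest between-class pairs (interclass margin).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (10, 2)), rng.normal(3, 1, (10, 2))])
labels = np.array([0] * 10 + [1] * 10)
k1, k2 = 3, 15
n = len(X)

d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
np.fill_diagonal(d2, np.inf)                 # exclude self-pairs

S = np.zeros((n, n))                         # intrinsic (compactness) graph
for i in range(n):
    same = np.where(labels == labels[i])[0]
    nn = same[np.argsort(d2[i, same])[:k1]]  # k1 same-class nearest neighbors
    S[i, nn] = S[nn, i] = 1.0                # symmetrize

Sp = np.zeros((n, n))                        # penalty (separability) graph
between = np.where(labels[:, None] != labels[None, :], d2, np.inf)
order = np.argsort(between, axis=None)[:2 * k2]  # each pair appears twice
for pos in order:
    i, j = divmod(pos, n)
    Sp[i, j] = Sp[j, i] = 1.0
```

Plugging S and Sp into the linearized criterion w^T X L X^T w / w^T X L^p X^T w then yields the MFA projection.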

  16. Marginal Fisher Analysis: Advantage Unlike LDA, it makes no assumption that the data of each class follow a Gaussian distribution.

  17. Experiments: Face Recognition

  18. Summary • An optimization framework that unifies previous dimensionality reduction algorithms as special cases. • A new dimensionality reduction algorithm: Marginal Fisher Analysis.
