Class-oriented Regression Embedding
Presenter: Chen Yi
August 25, 2011
Outline
1. Background
2. Related Works
2.1 Linear Regression-based Classification
2.2 Neighborhood Preserving Embedding & Sparsity Preserving Projections
3. Class-oriented Regression Embedding
4. Experiments
Background
The minimum reconstruction error criterion is widely used in recent work on subspace classification, for example in SRC and LRC.
J. Wright, A. Yang, S. Sastry, Y. Ma, Robust face recognition via sparse representation, IEEE Trans. Pattern Anal. Mach. Intell. 31 (2), 210–227, 2009.
I. Naseem, R. Togneri, M. Bennamoun, Linear regression for face recognition, IEEE Trans. Pattern Anal. Mach. Intell., 2010.
A Brief Review
SRC: $\hat{x} = \arg\min_x \|x\|_1$ subject to $y = Ax$, where $A$ stacks all training samples.
LRC: $\hat{\beta}_i = (X_i^T X_i)^{-1} X_i^T y$, where $\hat{\beta}_i$ is the coefficient vector of the $i$th class.
Classification rule: $\mathrm{identity}(y) = \arg\min_i \|y - X_i \hat{\beta}_i\|_2$.
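As a rough illustration (not the authors' code), the SRC rule above can be sketched with an l1-regularized solver; here scikit-learn's Lasso stands in for the l1-minimization step, and `A`, `y`, `labels`, and `alpha` are hypothetical names:

```python
import numpy as np
from sklearn.linear_model import Lasso

def src_classify(A, y, labels, alpha=0.01):
    """Sketch of SRC: sparse-code y over all training samples, then
    assign the class whose coefficients give the smallest residual.
    A: (m, n) matrix, columns are training samples; labels: (n,) class ids."""
    # l1-regularized least squares as a proxy for min ||x||_1 s.t. y = Ax
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    lasso.fit(A, y)
    x = lasso.coef_
    residuals = {}
    for c in np.unique(labels):
        x_c = np.where(labels == c, x, 0.0)     # keep only class-c coefficients
        residuals[c] = np.linalg.norm(y - A @ x_c)
    return min(residuals, key=residuals.get)
```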
Nearest Subspace Classifiers
Definition: assign a given sample to its nearest class subspace.
Measurement: reconstruction error.
S. Z. Li, Face recognition based on nearest linear combinations, CVPR, 839–844, 1998.
LRC
Linear subspace assumption: samples of each class lie in a linear subspace spanned by that class's training samples.
Least squares solution: $\hat{\beta}_i = (X_i^T X_i)^{-1} X_i^T y$.
Reconstruction by the $i$th class: $\hat{y}_i = X_i \hat{\beta}_i$.
The sample is assigned to the class with the minimum reconstruction error: $\mathrm{identity}(y) = \arg\min_i \|y - \hat{y}_i\|_2$.
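A minimal sketch of the LRC rule above, assuming class-wise training matrices with samples as columns (illustrative; it uses NumPy's least-squares solver rather than the explicit inverse):

```python
import numpy as np

def lrc_classify(class_matrices, y):
    """Sketch of LRC: reconstruct y from each class's training samples
    and return the class with the smallest reconstruction error.
    class_matrices: dict mapping class id -> (m, n_i) matrix of columns."""
    best_class, best_err = None, np.inf
    for c, X_i in class_matrices.items():
        # least-squares coefficients: beta = (X_i^T X_i)^{-1} X_i^T y
        beta, *_ = np.linalg.lstsq(X_i, y, rcond=None)
        err = np.linalg.norm(y - X_i @ beta)    # reconstruction error
        if err < best_err:
            best_class, best_err = c, err
    return best_class
```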
NPE & SPP
Objective function: $\min_W \sum_i \|W^T x_i - \sum_j c_{ij}\, W^T x_j\|^2$, where $C = [c_{ij}]$ is the reconstruction coefficient matrix.
The difference between NPE and SPP lies in the reconstruction strategy:
NPE: each sample is reconstructed from its k nearest neighbors.
SPP: each sample is reconstructed by a global sparse representation over all training samples.
X. He, D. Cai, S. Yan, H.-J. Zhang, Neighborhood preserving embedding, ICCV, 1208–1213, 2005.
L. Qiao, S. Chen, X. Tan, Sparsity preserving projections with applications to face recognition, Pattern Recognition 43 (1), 331–341, 2010.
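A sketch of the NPE-style coefficient matrix $C$, using a simplified unconstrained least-squares fit (NPE's original LLE-style weights additionally constrain each sample's weights to sum to one; `k` is a hypothetical neighborhood size):

```python
import numpy as np

def knn_reconstruction_weights(X, k=5):
    """Sketch: reconstruct each column x_i, in the least-squares sense,
    from its k nearest neighbors. X: (m, n), samples as columns.
    Returns C of shape (n, n) with nonzeros only on neighbor entries."""
    n = X.shape[1]
    C = np.zeros((n, n))
    # pairwise squared Euclidean distances between samples
    d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
    for i in range(n):
        nbrs = np.argsort(d2[:, i])[1:k + 1]    # skip the sample itself
        w, *_ = np.linalg.lstsq(X[:, nbrs], X[:, i], rcond=None)
        C[nbrs, i] = w                          # coefficients for x_i
    return C
```

SPP would replace the k-neighbor least-squares step with a global l1 solve over all other samples, e.g. the Lasso step from the SRC sketch above.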
Assumption of SRC and LRC
A given sample belongs to the class with the minimum reconstruction error.
Problem: does this assumption hold well in real-world applications?
Examples (figures): the training face images, and reconstruction examples.
Motivation
LRC uses downsampled images directly for classification, which is not optimal.
We aim to find a subspace that conforms to the assumption: in this low-dimensional subspace, a sample is best represented by its intra-class samples.
Algorithm
Objective function: $\min_W \sum_i \|W^T x_i - W^T X a_i\|^2 = \min_W \operatorname{tr}(W^T X M X^T W)$, where $a_i$ holds the intra-class regression coefficients of $x_i$ (zero outside its own class).
To avoid degenerate solutions, we constrain $W^T X X^T W = I$.
Then we have the generalized eigenvalue problem $X M X^T w = \lambda X X^T w$, where $M = (I - A)(I - A)^T$ and $A = [a_1, \dots, a_n]$.
Example (figure): the reconstruction strategy of CRE compared with those of NPE and SPP.
SSS Problem
$X X^T$ is singular in the small-sample-size (SSS) case. We apply PCA to reduce the dimensionality of the original samples to avoid the SSS problem.
Ridge Regression-based Classification (RRC)
Linear subspace assumption, as in LRC, but $X_i^T X_i$ may be singular.
Ridge (regularized least squares) solution: $\hat{\beta}_i = (X_i^T X_i + \lambda I)^{-1} X_i^T y$.
Reconstruction by the $i$th class: $\hat{y}_i = X_i \hat{\beta}_i$; the sample is assigned to the class with the minimum reconstruction error.
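The same classification loop as the LRC sketch, with the least-squares step swapped for a ridge solve (`lam` is a hypothetical regularization parameter):

```python
import numpy as np

def rrc_classify(class_matrices, y, lam=1e-3):
    """Sketch of RRC: like LRC, but with ridge-regularized coefficients,
    so the solve stays well-posed even when X_i^T X_i is singular."""
    best_class, best_err = None, np.inf
    for c, X_i in class_matrices.items():
        G = X_i.T @ X_i + lam * np.eye(X_i.shape[1])  # regularized Gram matrix
        beta = np.linalg.solve(G, X_i.T @ y)          # (X^T X + lam I)^{-1} X^T y
        err = np.linalg.norm(y - X_i @ beta)
        if err < best_err:
            best_class, best_err = c, err
    return best_class
```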
Steps
Input: column sample matrix $X$.
Output: transform matrix $W$.
Step 1: project the training samples onto a PCA subspace: $X \leftarrow W_{PCA}^T X$.
Step 2: construct the global reconstruction coefficient matrix $A$ using intra-class regression.
Step 3: solve the generalized eigenvectors of $X M X^T w = \lambda X X^T w$ corresponding to the first $d$ smallest eigenvalues; $W = [w_1, \dots, w_d]$.
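Putting the steps together, a rough NumPy sketch of CRE under the formulation above (illustrative only: `pca_dim`, `d`, the jitter term, and the leave-one-out per-class least squares are assumptions, not the authors' exact implementation):

```python
import numpy as np
from scipy.linalg import eigh

def cre_fit(X, labels, pca_dim=100, d=30):
    """Sketch of Class-oriented Regression Embedding.
    X: (m, n) matrix, columns are training samples; labels: (n,) class ids.
    Returns (W_pca, W); a new sample y maps to W.T @ W_pca.T @ y."""
    n = X.shape[1]
    # Step 1: PCA to avoid the small-sample-size problem
    Xc = X - X.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(Xc, full_matrices=False)
    W_pca = U[:, :pca_dim]
    Z = W_pca.T @ Xc
    # Step 2: intra-class least-squares reconstruction coefficients
    A = np.zeros((n, n))
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        for i in idx:
            others = idx[idx != i]          # same-class samples, excluding x_i
            if others.size == 0:
                continue
            w, *_ = np.linalg.lstsq(Z[:, others], Z[:, i], rcond=None)
            A[others, i] = w
    # Step 3: generalized eigenproblem Z M Z^T w = lambda Z Z^T w
    M = (np.eye(n) - A) @ (np.eye(n) - A).T
    L = Z @ M @ Z.T
    R = Z @ Z.T + 1e-8 * np.eye(pca_dim)    # small jitter for numerical stability
    vals, vecs = eigh(L, R)                 # eigenvalues in ascending order
    W = vecs[:, :d]                         # d smallest eigenvectors
    return W_pca, W
```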
Experiments on the YALE-B database
Comparisons of recognition rates using CRE plus NNC/LRC/SRC on the YALE-B database with 10 and 20 training samples per class, respectively.
Comparisons of recognition rates using 5 methods plus LRC on the YALE-B database with 10 and 20 training samples per class, respectively.
The recognition rates of CRE plus LRC, SPP plus SRC, and direct LRC on the YALE-B database with 20 training samples per class.
Comparisons of recognition rates using CRE plus NNC/LRC/SRC on the FERET database with 5 and 6 training samples per class, respectively.
A comparison of recognition rates using 5 methods plus LRC on the FERET database with 6 training samples per class.
The recognition rates of CRE plus LRC, SPP plus SRC, and direct LRC on the FERET database with 6 training samples per class.
The recognition rate curves of CRE plus RRC/SRC/NNC versus dimension on the CENPARMI handwritten numeral database.
The recognition rate curves of PCA, LDA, NPE, SPP, and LSPP plus RRC on the CENPARMI handwritten numeral database.
Thank you!
Presenter: Chen Yi
August 25, 2011