Learn how Principal Component Analysis and Discriminant Analysis reduce the dimension of the feature space for better data representation and more efficient classification.
Pattern Classification
All materials in these slides were taken from Pattern Classification (2nd ed.) by R. O. Duda, P. E. Hart and D. G. Stork, John Wiley & Sons, 2000, with the permission of the authors and the publisher.
3.8 Component Analysis and Discriminants
• Combine features to reduce the dimension of the feature space
• Linear combinations are simple to compute
• Project high-dimensional data onto a lower-dimensional space (a minimal projection sketch follows after this list)
• Two classical approaches for finding "optimal" linear transformations:
  • PCA (Principal Component Analysis): the projection that best represents the data in a least-squares sense
  • MDA (Multiple Discriminant Analysis): the projection that best separates the data in a least-squares sense
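As a minimal sketch of the idea (not taken from the book), a linear projection onto a k-dimensional subspace is just a matrix multiplication y = Wᵀx. Here W is an arbitrary illustrative matrix; PCA and MDA below differ only in how they choose W.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))   # 100 samples in a 10-dimensional feature space

# Any d x k matrix W defines a linear projection onto a k-dimensional subspace.
# W is arbitrary here; PCA and MDA choose it "optimally" by different criteria.
W = rng.normal(size=(10, 2))
Y = X @ W                        # projected data: 100 samples, 2 dimensions
print(Y.shape)                   # (100, 2)
```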
3.8.1 Principal Component Analysis (PCA)
• Finds the directions that best represent the data in the least-squares sense
• Solving the least-squares optimization problem leads to the so-called scatter matrix, which is a constant times the covariance matrix
• So the best directions are simply the eigenvectors corresponding to the largest eigenvalues of the covariance matrix (see the sketch after this list)
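A minimal PCA sketch along these lines, assuming NumPy (function and variable names are illustrative, not the book's): center the data, form the scatter matrix, and keep the eigenvectors with the largest eigenvalues.

```python
import numpy as np

def pca(X, k):
    """Project X (n samples x d features) onto its k principal components."""
    mean = X.mean(axis=0)
    Xc = X - mean                                # center the data
    scatter = Xc.T @ Xc                          # scatter matrix: a constant times the covariance
    eigvals, eigvecs = np.linalg.eigh(scatter)   # symmetric matrix, so eigh applies
    order = np.argsort(eigvals)[::-1]            # sort eigenvalues in decreasing order
    W = eigvecs[:, order[:k]]                    # directions with the largest eigenvalues
    return Xc @ W, W

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Y, W = pca(X, 2)
print(Y.shape)   # (200, 2)
```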
3.8.2 Fisher Linear Discriminant
• Whereas PCA seeks directions that are efficient for representation, discriminant analysis seeks directions that are efficient for discrimination
• This is the classical discriminant analysis (a two-class sketch follows after this list)
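For the two-class case, Fisher's discriminant direction is proportional to S_W⁻¹(m₁ − m₂), where S_W is the within-class scatter matrix and m₁, m₂ are the class means. The snippet below is an illustrative implementation of that formula, not code from the book.

```python
import numpy as np

def fisher_direction(X1, X2):
    """Fisher's linear discriminant direction for two classes X1 and X2."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter: sum of the per-class scatter matrices.
    S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    w = np.linalg.solve(S_W, m1 - m2)   # w proportional to S_W^{-1} (m1 - m2)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
X1 = rng.normal(loc=0.0, size=(50, 3))
X2 = rng.normal(loc=2.0, size=(50, 3))
w = fisher_direction(X1, X2)
print(X1 @ w)   # one-dimensional projections used for discrimination
```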
3.8.3 Multiple Discriminant Analysis
• Generalization of Fisher's linear discriminant
• Seeks the optimum subspace with the greatest separation of the projected distributions
• Defines within-class and between-class scatter matrices
• Because we want small within-class scatter and large between-class scatter, the transformation maximizes the ratio of the between-class scatter to the within-class scatter (see the sketch after this list)
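As a hedged sketch of the multi-class case (NumPy; function and variable names are illustrative): build S_W and S_B from the class means and the overall mean, then keep the leading eigenvectors of S_W⁻¹ S_B (at most c − 1 of them for c classes) as the projection matrix.

```python
import numpy as np

def mda(X, y, k):
    """Multiple discriminant analysis: project X onto k <= c-1 directions."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    d = X.shape[1]
    S_W = np.zeros((d, d))                   # within-class scatter
    S_B = np.zeros((d, d))                   # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_W += (Xc - mc).T @ (Xc - mc)
        diff = (mc - overall_mean).reshape(-1, 1)
        S_B += len(Xc) * (diff @ diff.T)
    # Directions maximizing between-class relative to within-class scatter:
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(S_W, S_B))
    order = np.argsort(eigvals.real)[::-1]
    W = eigvecs[:, order[:k]].real           # keep the k leading directions
    return X @ W

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=i, size=(30, 4)) for i in range(3)])
y = np.repeat([0, 1, 2], 30)
print(mda(X, y, 2).shape)   # (90, 2)
```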