Prof NB Venkateswarlu Head, IT, GVPCOE Visakhapatnam venkat_ritch@yahoo.com www.ritchcenter.com/nbv
First, let me say a hearty welcome to you all.
Also, let me congratulate the Chairman, the Secretary/Correspondent,
the Principal, Prof. Ravindra Babu (Vice-Principal),
and the other organizers for planning such a nice workshop with excellent themes.
My Talk: Feature Extraction/Selection
A Typical Image Processing System contains • Image Acquisition • Image Pre-Processing • Image Enhancement • Image Segmentation • Image Feature Extraction • Image Classification • Image Understanding
Two Aspects of Feature Extraction • Extracting useful features from images or any other measurements.
• Identifying transformed variables which are functions of the original variables and have some desirable characteristics.
Feature Selection: selecting the important variables from the original set.
Shape-Based Features • Contour based • Area based • Transform based • Projections • Signature • Problem specific
Classification/Pattern Recognition • Statistical • Syntactic/Linguistic • Discriminant function • Fuzzy • Neural • Hybrid
Dimensionality Reduction • Feature selection (i.e., attribute subset selection): • Select a minimum set of features such that the probability distribution of the different classes given the values of those features is as close as possible to the original distribution given the values of all features • reduces the number of attributes in the patterns, making them easier to understand • Heuristic methods (needed due to the exponential number of choices): • step-wise forward selection (sketched below) • step-wise backward elimination • combining forward selection and backward elimination • decision-tree induction
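As a concrete illustration, here is a minimal sketch of step-wise forward selection in Python. The dataset, the classifier, and the helper `score_subset` are illustrative choices, not part of the original slides:

```python
# Step-wise forward selection: greedily add the feature that most
# improves a cross-validated score. Names and scorer are illustrative.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
d = X.shape[1]

def score_subset(features):
    """Cross-validated accuracy of a classifier on the chosen columns."""
    model = LogisticRegression(max_iter=1000)
    return cross_val_score(model, X[:, features], y, cv=5).mean()

selected, remaining = [], list(range(d))
best_score = 0.0
while remaining:
    # Try each remaining feature and keep the best addition.
    trial = max(remaining, key=lambda f: score_subset(selected + [f]))
    trial_score = score_subset(selected + [trial])
    if trial_score <= best_score:      # stop when no feature helps
        break
    selected.append(trial)
    remaining.remove(trial)
    best_score = trial_score

print("Selected features:", selected, "score:", round(best_score, 3))
```

Step-wise backward elimination is the mirror image: start from all d features and repeatedly drop the one whose removal hurts the score least.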
Example of Decision Tree Induction • Initial attribute set: {A1, A2, A3, A4, A5, A6} • [Tree figure: internal nodes test A4?, A6?, and A1?; leaves are labeled Class 1 and Class 2] • Reduced attribute set: {A1, A4, A6}
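The same idea can be tried with any tree learner; the sketch below (dataset and learner are illustrative, not from the slides) induces a tree and reads off the attributes it actually splits on, which form the reduced attribute set:

```python
# Decision-tree induction as attribute selection: attributes that never
# appear in the induced tree can be dropped. Dataset is illustrative.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# feature_importances_ is zero for attributes the tree never splits on.
used = np.nonzero(tree.feature_importances_)[0]
print("Reduced attribute set:", set(used))
```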
Heuristic Feature Selection Methods • There are 2^d possible feature subsets of d features • Several heuristic feature selection methods: • Best single features under the feature independence assumption: choose by significance tests (sketched below). • Best step-wise feature selection: • The best single feature is picked first • Then the next best feature conditioned on the first, ... • Step-wise feature elimination: • Repeatedly eliminate the worst feature • Best combined feature selection and elimination • Optimal branch and bound: • Uses feature elimination and backtracking
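For the "best single features" strategy, one common significance test is the ANOVA F-test; the slides do not name a specific test, so the choice below is an assumption:

```python
# Rank each feature on its own with a univariate significance test
# (ANOVA F-test) and keep the top-k. The test choice is illustrative.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)
selector = SelectKBest(score_func=f_classif, k=2).fit(X, y)
print("F-scores per feature:", selector.scores_.round(2))
print("Top-2 features:", selector.get_support(indices=True))
```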
Why Do We Need It? • A classifier's performance depends on • the number of features • feature distinguishability • the number of groups • the groups' characteristics in multidimensional space • the needed response time • memory requirements
Feature Extraction Methods We will find transformed variables which are functions of the original variables. A good example: though we may conduct more than one test (K-D), final grading is done based on total marks (1-D).
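The grading analogy is itself a linear feature extractor: the total is a dot product of the K-dimensional marks vector with a weight vector. A tiny sketch with invented numbers:

```python
# Grading as feature extraction: K test scores (K-D) are reduced to a
# single total (1-D) by a fixed linear combination. Marks are invented.
import numpy as np

marks = np.array([[78, 65, 90],    # one row per student, one column per test
                  [55, 80, 70]])
w = np.ones(marks.shape[1])        # equal weight per test; any weights work
totals = marks @ w
print(totals)                      # [233. 205.]
```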
Principal Component Analysis • Given N data vectors in k dimensions, find c <= k orthogonal vectors that can best be used to represent the data • The original data set is reduced to one consisting of N data vectors on c principal components (reduced dimensions) • Each data vector is a linear combination of the c principal component vectors • Works for numeric data only • Used when the number of dimensions is large
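A minimal PCA sketch along these lines, using the eigenvectors of the sample covariance matrix (the data here is synthetic):

```python
# PCA sketch: project N k-dimensional vectors onto the top-c eigenvectors
# of the sample covariance matrix. Data is synthetic.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))          # N=100 vectors in k=5 dimensions
Xc = X - X.mean(axis=0)                # center the data

cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov) # eigh: symmetric matrix, ascending order

c = 2                                  # keep c <= k principal components
top = eigvecs[:, ::-1][:, :c]          # columns = leading eigenvectors
Y = Xc @ top                           # N x c reduced representation
print(Y.shape)                         # (100, 2)
```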
[Figure: Principal Component Analysis — original axes X1, X2 and principal axes Y1, Y2]
Principal Component Analysis Aimed at finding a new co-ordinate system with certain desirable characteristics.
M = [4.5, 4.25]
Covariance matrix = [[2.57, 1.86], [1.86, 6.21]]
Eigenvalues = 6.99, 1.79
Eigenvectors = [0.387, 0.922] and [-0.922, 0.387]
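These numbers can be verified directly; a quick check of the eigendecomposition (eigenvector signs may differ):

```python
# Verify the slide's eigendecomposition of the 2x2 covariance matrix.
import numpy as np

cov = np.array([[2.57, 1.86],
                [1.86, 6.21]])
eigvals, eigvecs = np.linalg.eigh(cov)
print(eigvals[::-1])       # approx [6.99 1.79], largest first
print(eigvecs[:, ::-1])    # columns approx [0.387, 0.922], [-0.922, 0.387]
```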
However, in some cases PCA does not work well.
Canonical (Fisher) discriminant analysis: unlike PCA, which uses the global mean and covariance, this method uses the between-group and within-group covariance matrices and then calculates the canonical axes.
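A sketch of this canonical discriminant computation from within-group and between-group scatter matrices; the two-class synthetic data is illustrative:

```python
# Canonical discriminant axes: eigenvectors of inv(Sw) @ Sb, where Sw is
# within-group and Sb between-group scatter. Data is synthetic.
import numpy as np

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(2, 1, (50, 3))])
y = np.array([0] * 50 + [1] * 50)

mean_all = X.mean(axis=0)
Sw = np.zeros((3, 3))
Sb = np.zeros((3, 3))
for g in np.unique(y):
    Xg = X[y == g]
    mg = Xg.mean(axis=0)
    Sw += (Xg - mg).T @ (Xg - mg)            # within-group scatter
    d = (mg - mean_all).reshape(-1, 1)
    Sb += len(Xg) * (d @ d.T)                # between-group scatter

eigvals, eigvecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
order = np.argsort(eigvals.real)[::-1]
print(eigvecs.real[:, order[:1]])            # leading canonical axis
```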
Standard Deviation – A Simple Indicator • Correlation Coefficient
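Both indicators are one-liners in NumPy; a small illustration with invented data (a feature with near-zero standard deviation carries little information, and a pair with correlation near ±1 is redundant):

```python
# Standard deviation flags low-variance features; the correlation
# coefficient flags redundant feature pairs. Data is invented.
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
y = 2 * x + np.array([0.1, -0.2, 0.0, 0.3, -0.1, 0.2, 0.0, -0.3])

print(np.std(x, ddof=1))        # sample standard deviation
print(np.corrcoef(x, y)[0, 1])  # close to 1: y is nearly redundant given x
```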