
Introduction to Pattern Recognition & Classification Techniques

This lecture provides an introduction to pattern recognition and classification techniques, including the minimum distance classifier, optimum statistical classifier, and neural network classifier. It also covers topics such as k-means clustering, k-NN classifier, and validation.


Presentation Transcript


  1. ENT363/4 Machine Vision Systems (Core Course) Lecture 7: Introduction to Pattern Recognition & Classification Techniques

  2. Outline • Pattern & Pattern Classes • Minimum Distance Classifier • Optimum Statistical Classifier • Neural Network Classifier • k-Means Clustering • k-NN Classifier • Validation

  3. Introduction • The approaches to pattern recognition are divided into two principal areas: decision-theoretic and structural • The first category deals with patterns described by quantitative descriptors, such as length, area, and texture • The second category deals with patterns best described by qualitative descriptors, such as relational descriptors.

  4. Pattern & Pattern Classes • Feature – a descriptor • Pattern – an arrangement of descriptors • Pattern Class – a family of patterns that share common properties, normally denoted ω1, ω2, …, ωW, where W is the number of classes • Pattern Recognition – the technique of assigning patterns to their respective classes

  5. FISHER’S IRIS DATA

  6. Pattern & Pattern Classes

  7. Minimum Distance Classifier • Suppose that we define the prototype of each pattern class to be the mean vector of the patterns of that class: $m_j = \frac{1}{N_j} \sum_{x \in \omega_j} x, \quad j = 1, 2, \ldots, W$ (1) • Using the Euclidean distance to determine closeness reduces the problem to computing the distance measures $D_j(x) = \|x - m_j\|, \quad j = 1, 2, \ldots, W$ (2)

  8. Minimum Distance Classifier • Selecting the smallest distance is equivalent to evaluating the functions $d_j(x) = x^T m_j - \frac{1}{2} m_j^T m_j, \quad j = 1, 2, \ldots, W$ (3) and assigning x to the class whose $d_j(x)$ is largest • The decision boundary between classes $\omega_i$ and $\omega_j$ for a minimum distance classifier is $d_{ij}(x) = d_i(x) - d_j(x) = x^T(m_i - m_j) - \frac{1}{2}(m_i - m_j)^T(m_i + m_j) = 0$ (4)
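Equations (1) and (2) can be sketched in Python as follows (a hypothetical NumPy illustration; the function names and toy data are assumptions, not from the lecture):

```python
import numpy as np

def fit_means(X, y):
    """Compute the prototype (mean vector) m_j of each class, Eq. (1)."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def classify_min_distance(x, means):
    """Assign x to the class whose mean is nearest in Euclidean distance, Eq. (2)."""
    return min(means, key=lambda c: np.linalg.norm(x - means[c]))

# Two toy classes in 2-D feature space
X = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0], [6.0, 5.0]])
y = np.array([0, 0, 1, 1])
means = fit_means(X, y)
print(classify_min_distance(np.array([0.5, 0.2]), means))  # → 0
```

Evaluating the equivalent decision functions of Eq. (3) would give the same assignment, since maximizing $d_j(x)$ is algebraically the same as minimizing the Euclidean distance to $m_j$.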

  9. Minimum Distance Classifier • Advantages: 1. Simple and direct 2. Can handle rotation 3. Robust to intensity variation 4. With suitably chosen features, can handle mirroring 5. Colour can be chosen as a feature, which handles colour variation

  10. Minimum Distance Classifier • Disadvantages: 1. Collecting samples takes time, yet many samples are needed for high accuracy (more samples, more accuracy) 2. Sensitive to displacement 3. With only two features, accuracy is lower than other methods 4. Sensitive to scaling

  11. Minimum Distance Classifier

  12. Optimum Statistical Classifier

  13. Optimum statistical classifiers • The probability that a particular pattern x comes from class $\omega_i$ is denoted $p(\omega_i/x)$ • If the pattern classifier decides that x came from $\omega_j$ when it actually came from $\omega_i$, it incurs a loss, denoted $L_{ij}$

  14. Optimum statistical classifiers • The average loss incurred in assigning x to class $\omega_j$ is $r_j(x) = \sum_{k=1}^{W} L_{kj}\, p(\omega_k/x)$ • From basic probability theory, we know that $p(A/B) = p(A)\,p(B/A)/p(B)$, so $r_j(x) = \frac{1}{p(x)} \sum_{k=1}^{W} L_{kj}\, p(x/\omega_k)\, P(\omega_k)$

  15. Optimum statistical classifiers • Thus the Bayes classifier assigns an unknown pattern x to class $\omega_i$ if $r_i(x) < r_j(x)$ for $j = 1, 2, \ldots, W;\ j \neq i$, that is, to the class with the smallest average loss

  16. Optimum statistical classifiers • With the usual 0–1 loss, the Bayes classifier then assigns a pattern x to class $\omega_i$ if $p(x/\omega_i)P(\omega_i) > p(x/\omega_j)P(\omega_j), \quad j = 1, 2, \ldots, W;\ j \neq i$ • or, equivalently, if $d_i(x) > d_j(x)$ for the decision functions $d_j(x) = p(x/\omega_j)\,P(\omega_j)$

  17. Optimum statistical classifiers • Bayes Classifier for Gaussian Pattern Classes • Let us consider a 1-D problem (n=1) involving two pattern classes (W=2) governed by Gaussian densities $p(x/\omega_j) = \frac{1}{\sqrt{2\pi}\,\sigma_j}\, e^{-\frac{(x - m_j)^2}{2\sigma_j^2}}, \quad j = 1, 2$

  18. Optimum statistical classifiers • In the n-dimensional case, the Gaussian density of the vectors in the jth pattern class has the form $p(x/\omega_j) = \frac{1}{(2\pi)^{n/2} |C_j|^{1/2}}\, e^{-\frac{1}{2}(x - m_j)^T C_j^{-1} (x - m_j)}$, where $m_j$ and $C_j$ are the mean vector and covariance matrix of class $\omega_j$
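The Gaussian Bayes rule above can be sketched in Python (a hypothetical illustration, not from the lecture; the identity covariances, equal priors, and toy means are assumptions):

```python
import numpy as np

def gaussian_density(x, mean, cov):
    """n-dimensional Gaussian density p(x/w_j) for one class."""
    n = len(mean)
    d = x - mean
    norm = 1.0 / ((2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(cov)))
    return norm * np.exp(-0.5 * d @ np.linalg.inv(cov) @ d)

def bayes_classify(x, params, priors):
    """Assign x to the class maximizing d_j(x) = p(x/w_j) P(w_j)."""
    scores = {j: gaussian_density(x, m, C) * priors[j]
              for j, (m, C) in params.items()}
    return max(scores, key=scores.get)

# Two toy Gaussian classes with identity covariance and equal priors
params = {0: (np.array([0.0, 0.0]), np.eye(2)),
          1: (np.array([3.0, 3.0]), np.eye(2))}
priors = {0: 0.5, 1: 0.5}
print(bayes_classify(np.array([0.5, 0.5]), params, priors))  # → 0
```

With equal priors and equal covariances this rule reduces to the minimum distance classifier of slides 7–8, which is why the two methods coincide in that special case.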

  19. Optimum statistical classifiers • Advantages: 1. It is often combined with other methods, which yields high accuracy • Disadvantages: 1. Estimating the class statistics from samples takes time 2. It has to be combined with other methods

  20. Neural Network Classifier Some History • History traces back to the 1950s, but the field became popular in the 1980s with work by Rumelhart, Hinton, and McClelland • A General Framework for Parallel Distributed Processing in Parallel Distributed Processing: Explorations in the Microstructure of Cognition • Peaked in the 1990s. Today: • Hundreds of variants • Less a model of the actual brain than a useful tool, but still some debate • Numerous applications • Handwriting, face, speech recognition • Vehicles that drive themselves • Models of reading, sentence production, dreaming • Debate for philosophers and cognitive scientists • Can human consciousness or cognitive abilities be explained by a connectionist model, or does it require the manipulation of symbols?

  21. An artificial neural network (ANN) is a mathematical model or computational model based on biological neural networks • It consists of an interconnected group of artificial neurons and processes information using a connectionist approach to computation • In most cases an ANN is an adaptive system that changes its structure based on external or internal information that flows through the network during the learning phase

  22. Neural Network Classifier How Does It Work – Perceptron • Inputs I1, I2, …, I5 are multiplied by weights w1, w2, …, w5 and summed: $\text{sum} = \sum_n w_n I_n$ • The output is obtained by applying the activation function f to the sum: $O = f(\sum_n w_n I_n)$
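The weighted-sum-plus-activation step can be sketched in a few lines (a hypothetical illustration, not code from the lecture; the step activation function and the specific input/weight values are assumptions):

```python
import numpy as np

def perceptron_output(inputs, weights, f=lambda s: 1.0 if s >= 0 else 0.0):
    """O = f(sum_n w_n * I_n): weighted sum of inputs, then activation f."""
    return f(np.dot(weights, inputs))

I = np.array([1.0, 0.5, -0.2, 0.3, 1.0])   # inputs I1..I5
w = np.array([0.4, -0.1, 0.6, 0.2, -0.3])  # weights w1..w5
print(perceptron_output(I, w))
```

Other activation functions (e.g. a sigmoid) can be passed in as `f`; the step function shown is the classic perceptron choice.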

  23. Neural Network Classifier NN Structures

  24. Neural Network Classifier How Does It Work – Multi-Layer Perceptron (MLP) • A pattern is fed forward through the input layer (I1–I5) and subsequent layers via the weights w • The error $e = d_1 - o_1$ (e: error, d: desired output, o: network output) is fed back to adjust the weights w

  25. Neural Network Classifier

  26. Neural Network Classifier Supervised vs Unsupervised Learning • Supervised learning • Supervision: the training data (observations, measurements, etc.) are accompanied by labels indicating the class of the observations • New data are classified based on the training set • Unsupervised learning • The class labels of the training data are unknown • Given a set of measurements, observations, etc., the aim is to establish the existence of classes or clusters in the data

  27. k-Means Clustering

  28. k-Means Clustering

  29. k-Means Clustering

  30. K-NN Classifier • Non-parametric pattern classification • Consider a two-class problem where each sample consists of two measurements (x, y) • For a given query point q, assign the class of the nearest neighbour (k = 1), or compute the k nearest neighbours and assign the class by majority vote (e.g. k = 3) • Type of distance metric normally used: Euclidean distance
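The majority-vote rule can be sketched as follows (a hypothetical illustration assuming Euclidean distance and toy 2-D data; not code from the lecture):

```python
import numpy as np
from collections import Counter

def knn_classify(q, X, y, k=3):
    """Find the k nearest neighbours of q (Euclidean distance), vote by majority."""
    dists = np.linalg.norm(X - q, axis=1)      # distance from q to every sample
    nearest = np.argsort(dists)[:k]            # indices of the k closest samples
    return Counter(y[nearest]).most_common(1)[0][0]

# Two toy classes, three samples each
X = np.array([[0, 0], [1, 1], [0, 1], [5, 5], [6, 5], [5, 6]])
y = np.array(['a', 'a', 'a', 'b', 'b', 'b'])
print(knn_classify(np.array([0.8, 0.9]), X, y, k=3))  # → a
```

With k = 1 the call reduces to the nearest-neighbour rule described in the slide.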

  31. K-means Algorithm: • Place K points into the space represented by the objects that are being clustered. These points represent initial group centroids. • Assign each object to the group that has the closest centroid. • When all objects have been assigned, recalculate the positions of the K centroids. • Repeat Steps 2 and 3 until the centroids no longer move. This produces a separation of the objects into groups from which the metric to be minimized can be calculated.
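The four steps above can be sketched as a minimal NumPy implementation (the random-sample initialization and the toy data are assumptions for illustration, not from the lecture):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    # Step 1: place K points into the space as initial group centroids
    # (here: k distinct samples chosen at random, one common choice)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Step 2: assign each object to the group with the closest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = np.argmin(dists, axis=1)
        # Step 3: recalculate the positions of the K centroids
        new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        # Step 4: repeat until the centroids no longer move
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

# Two obvious groups in 2-D, mirroring the worked example that follows
X = np.array([[0., 0.], [0., 1.], [1., 0.], [9., 9.], [9., 10.], [10., 9.]])
labels, centroids = kmeans(X, k=2)
print(labels)
```

The walk-through on the next slides (choose centroids, assign, recalculate, repeat) is exactly the loop body here.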

  32. K-Mean Example: Random Dataset

  33. We randomly choose 2 group centroids!

  34. We assign each point to the group that has the closest centroid.

  35. We recalculate the positions of the centroids.

  36. We assign each point to the group that has the closest centroid.

  37. We recalculate the positions of the centroids.

  38. No matter how many more times the algorithm is executed, the centroids will no longer move, so the clustering is over!

  39. K-NN Classifier • Advantage • Nonparametric architecture • Simple • Powerful • Requires no training time • Disadvantage • Memory intensive • Classification/estimation is slow

  40. More Classification Algorithm • Artificial neural networks • Perceptron • Multi layer perceptron • Naive Bayes classifier • Probabilistic classification • Bayesian networks • Temporal classification • Hidden Markov models • Dynamic Bayesian networks • AND MANY MORE!! • Discriminant analysis • Linear / Quadratic discriminant • Logistic regression • Support vector machine • Decision tree • Density estimation • k-nearest neighbor • k-means clustering

  41. VALIDATION • After developing the classification algorithm, we surely want to check how 'good' our classification is. • Consider these: • (1) The new sample taken is virginica; when put into the classification algorithm, it is grouped in the virginica group. • (2) The new sample taken is versicolor; when put into the classification algorithm, it is grouped in the virginica group. • Do we know how many of case (1) will occur? And how many samples fall into case (2)? • Mathematically, the accuracy is the fraction of samples that, like case (1), are assigned to their true class.
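Counting cases like (1) and (2) can be sketched with a small confusion-count helper (a hypothetical illustration; the iris class names come from Fisher's data mentioned earlier, but the sample labels here are made-up assumptions):

```python
from collections import Counter

def confusion_counts(actual, predicted):
    """Count (actual, predicted) pairs; ('versicolor', 'virginica') is case (2)."""
    return Counter(zip(actual, predicted))

def accuracy(actual, predicted):
    """Fraction of samples whose predicted class matches the actual class."""
    correct = sum(a == p for a, p in zip(actual, predicted))
    return correct / len(actual)

actual    = ['virginica', 'virginica', 'versicolor', 'versicolor']
predicted = ['virginica', 'virginica', 'virginica',  'versicolor']
print(confusion_counts(actual, predicted))
print(accuracy(actual, predicted))  # → 0.75
```

Tabulating all (actual, predicted) pairs in this way yields the confusion matrix, from which accuracy and per-class error rates can be read off.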

  42. VALIDATION

  43. VALIDATION

  44. VALIDATION
