Dive into the world of pattern recognition through statistical and neural methods in this lecture at Nanjing University. Learn about discriminant functions, weight spaces, potential function approaches, and decision boundaries using examples and algorithms.
Nanjing University of Science & Technology
Pattern Recognition: Statistical and Neural
Lonnie C. Ludeman
Lecture 18, Oct 21, 2005
Lecture 18 Topics
1. Example – Generalized Linear Discriminant Function
2. Weight Space
3. Potential Function Approach – 2-class case
4. Potential Function Example – 2-class case
5. Potential Function Algorithm – M-class case
Classes Not Linearly Separable

[Figure: samples from C1 and C2 plotted in the (x1, x2) plane; no straight line separates the two classes.]

Q. How can we find decision boundaries?
Answers: (1) Use generalized linear discriminant functions. (2) Use nonlinear discriminant functions.
Example: Generalized Linear Discriminant Functions

Given samples from 2 classes.

[Figure: samples from C1 and C2 plotted in the (x1, x2) plane.]
Find a generalized linear discriminant function that separates the classes.

Solution:
d(x) = w1 f1(x) + w2 f2(x) + w3 f3(x) + w4 f4(x) + w5 f5(x) + w6 f6(x) = wT f(x)

which is linear in the f space,
where the fi(x) are nonlinear functions of x in the original pattern space. (The slide's specific fi are not reproduced here; since the resulting boundary is an ellipse, they are presumably the quadratic terms x1², x1x2, x2², x1, x2, and 1.)
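As a concrete illustration of the mapping, here is a minimal sketch of the standard quadratic feature map for 2-D patterns; the exact fi and their ordering are assumptions, since the slide's definitions were not reproduced.

```python
import numpy as np

def phi(x):
    """Map a 2-D pattern x = (x1, x2) into the 6-D f space.

    This is the usual quadratic feature map; the original slide's exact
    choice and ordering of f1..f6 are not reproduced, so treat this as
    illustrative only.
    """
    x1, x2 = x
    return np.array([x1**2, x1 * x2, x2**2, x1, x2, 1.0])

# A function linear in f space, d(x) = w^T f(x), is quadratic in the
# original pattern space.  The weights below are hypothetical.
w = np.array([1.0, 0.0, 1.0, 0.0, 0.0, -4.0])  # d(x) = x1^2 + x2^2 - 4
x = np.array([1.0, 1.0])
d = w @ phi(x)  # here: 1 + 1 - 4 = -2, so x is inside the circle d(x) = 0
```

With these hypothetical weights, d(x) = 0 is the circle of radius 2, showing how a linear boundary in f space becomes a quadratic one in pattern space.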
Use the perceptron algorithm in the f space.

[Table of iterations, listing iteration #, samples, action, and weights, not reproduced here.]
[The iterations continue until a full pass produces no weight changes; the iterations then stop, yielding the final d(x).]
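The iteration table itself was not reproduced, but the procedure it records can be sketched as follows: a fixed-increment perceptron run on the mapped samples. The feature map, correction increment, and toy data are assumptions, not the slide's actual values.

```python
import numpy as np

def phi(x):
    """Quadratic feature map into f space (illustrative choice)."""
    x1, x2 = x
    return np.array([x1**2, x1 * x2, x2**2, x1, x2, 1.0])

def perceptron_train(samples, labels, c=1.0, max_passes=100):
    """Fixed-increment perceptron in f space.

    Labels are +1 for C1 and -1 for C2.  A sample is misclassified when
    y * w^T f(x) <= 0, and the weights are corrected by c * y * f(x).
    Training stops after one full pass with no corrections.
    """
    w = np.zeros(6)
    for _ in range(max_passes):
        errors = 0
        for x, y in zip(samples, labels):
            fx = phi(x)
            if y * (w @ fx) <= 0:      # misclassified (or on the boundary)
                w = w + c * y * fx     # move w toward the correct side
                errors += 1
        if errors == 0:                # clean pass: the iterations stop
            break
    return w

# Toy data: C1 near the origin, C2 farther out (not the slide's samples).
X = [(0.0, 0.0), (0.5, 0.5), (3.0, 0.0), (0.0, 3.0)]
y = [1, 1, -1, -1]
w = perceptron_train(X, y)
```

Because the classes are separable in f space, the perceptron convergence theorem guarantees the corrections stop after finitely many passes.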
The discriminant function follows from the final weights. Setting d(x) = 0 gives the decision boundary; putting it in standard form, the boundary is an ellipse.
Decision Boundary in the Original Pattern Space

[Figure: the elliptical boundary d(x) = 0 separating the C1 samples from the C2 samples in the (x1, x2) plane.]
Weight Space

To separate two pattern classes C1 and C2 by a hyperplane, we must satisfy the conditions wTx > 0 for all samples x from C1 and wTx < 0 for all samples x from C2, where wTx = 0 specifies the boundary between the classes.
But since wTx = xTw, we can rewrite these inequalities in the weight space, with the samples acting as the coefficients. Each inequality then defines a hyperplane boundary in the weight space, and any weight vector on its positive side satisfies that inequality; the solution region is the intersection of all the positive half-spaces.
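The weight-space view can be made concrete with a small check: each sample defines a hyperplane xTw = 0, and a candidate w is a solution only if it lies on the required side of every one. The augmented samples below are hypothetical.

```python
import numpy as np

def in_solution_region(w, samples_c1, samples_c2):
    """Return True if w satisfies every sample inequality.

    In weight space each sample x defines the hyperplane x^T w = 0;
    w must lie on the positive side for C1 samples (x^T w > 0) and on
    the negative side for C2 samples (x^T w < 0).
    """
    ok_c1 = all(np.dot(x, w) > 0 for x in samples_c1)
    ok_c2 = all(np.dot(x, w) < 0 for x in samples_c2)
    return ok_c1 and ok_c2

# Augmented 2-D samples (a trailing 1 absorbs the threshold) -- illustrative.
c1 = [np.array([2.0, 1.0, 1.0])]
c2 = [np.array([-1.0, -2.0, 1.0])]
print(in_solution_region(np.array([1.0, 1.0, 0.0]), c1, c2))  # True
```

The set of all w for which this returns True is the solution region: the intersection of the half-spaces cut out by the sample hyperplanes.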
Potential Function Approach – motivated by electromagnetic theory: samples from C1 act as positive charges (+) and samples from C2 as negative charges (−) in the sample space.

[Figure: + samples from C1 and − samples from C2 in the sample space.]
Given samples x from two classes C1 and C2, let S1 be the set of samples from C1 and S2 the set of samples from C2. Define the total potential function

K(x) = ∑_{xk ∈ S1} K(x, xk) − ∑_{xk ∈ S2} K(x, xk)

The potential function decision boundary is K(x) = 0.
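The total potential function translates directly into code. The slides do not fix a particular kernel K(x, xk), so the Gaussian form below is one standard choice, and the sample sets are illustrative.

```python
import numpy as np

def k(x, xk, alpha=1.0):
    """One common potential function: K(x, xk) = exp(-alpha * ||x - xk||^2).

    Assumption: the lecture does not specify K, so this Gaussian form
    is just one standard choice.
    """
    diff = np.asarray(x, dtype=float) - np.asarray(xk, dtype=float)
    return np.exp(-alpha * (diff @ diff))

def total_potential(x, s1, s2):
    """K(x) = sum over S1 of K(x, xk) minus sum over S2 of K(x, xk)."""
    return sum(k(x, xk) for xk in s1) - sum(k(x, xk) for xk in s2)

# Classify by the sign of K(x); K(x) = 0 is the decision boundary.
s1 = [np.array([0.0, 0.0])]   # "positive charges" from C1 (illustrative)
s2 = [np.array([2.0, 2.0])]   # "negative charges" from C2 (illustrative)
print(total_potential([0.1, 0.1], s1, s2) > 0)  # True: closer to C1
```

A point near the C1 samples feels mostly positive potential and is assigned to C1; near the C2 samples the negative terms dominate, matching the electrostatic analogy above.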
Example – Using Potential Functions

Given the following patterns from two classes, find a nonlinear discriminant function, using potential functions, that separates the classes. [The patterns are not reproduced here.]
The algorithm converged in 1.75 passes through the data, giving the final discriminant function.
Potential Function Algorithm for M Classes

Reference (3): Tou and Gonzalez.
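The M-class algorithm itself is not reproduced in these notes, so the following is only a sketch of an error-correcting scheme in the spirit of the Tou and Gonzalez reference: each class accumulates signed potential terms, and a misclassified sample raises its own class's potential while lowering that of every class that outscored it. The Gaussian kernel, update rule, and stopping criterion are all assumptions.

```python
import numpy as np

def k(x, xk, alpha=1.0):
    """Gaussian potential function (an assumed choice of kernel)."""
    diff = np.asarray(x, dtype=float) - np.asarray(xk, dtype=float)
    return np.exp(-alpha * (diff @ diff))

def train_mclass_potential(samples, labels, n_classes, n_passes=3):
    """Error-correcting M-class potential-function training (sketch).

    Each class i keeps a list of (sign, sample) terms; its discriminant
    d_i(x) is the signed sum of potentials.  When a sample of class i is
    not the strict winner, K(x, .) is added to d_i and subtracted from
    every class that scored at least as high.
    """
    terms = [[] for _ in range(n_classes)]   # per-class (sign, xk) lists

    def d(i, x):
        return sum(s * k(x, xk) for s, xk in terms[i])

    for _ in range(n_passes):
        for x, ci in zip(samples, labels):
            scores = [d(j, x) for j in range(n_classes)]
            others = max(scores[j] for j in range(n_classes) if j != ci)
            if scores[ci] <= others:                   # error: correct it
                terms[ci].append((+1, x))              # raise the true class
                for j in range(n_classes):
                    if j != ci and scores[j] >= scores[ci]:
                        terms[j].append((-1, x))       # lower the offenders
    return terms, d

# Three well-separated toy samples, one per class (illustrative).
X = [np.array([0.0, 0.0]), np.array([3.0, 0.0]), np.array([0.0, 3.0])]
y = [0, 1, 2]
terms, d = train_mclass_potential(X, y, n_classes=3)
```

Classification then assigns x to the class with the largest d_i(x), reducing to the 2-class rule sign(d_1(x) − d_2(x)) when M = 2.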
Summary
1. Example – Generalized Linear Discriminant Function
2. Weight Space
3. Potential Function Approach – 2-class case
4. Potential Function Example – 2-class case
5. Potential Function Algorithm – M-class case