Nanjing University of Science & Technology Pattern Recognition: Statistical and Neural Lonnie C. Ludeman Lecture 13 Oct 14, 2005
Lecture 13 Topics
1. Multiple observation, multiple class example (review): sufficient statistic space and likelihood ratio space
2. Calculation of P(error) for the 2-class case: several special cases
3. P(error) calculation examples for the special cases (2-class case)
Example 1: Multiple observations, multiple classes. Given: the pattern vector x is composed of N independent observations of a Gaussian random variable X, with the class conditional densities specified for each component, and a zero-one cost function. Find: (a) the Bayes decision rule in a sufficient statistic space; (b) the Bayes decision rule in a space of likelihood ratios.
Solution: (a) Since the observations are independent, the joint conditional density is the product of the marginal densities, for i = 1, 2, 3 with component means mi = i. The Bayes decision rule is determined from a set of yi(x), defined for M = 3 by yi(x) = Σ Cij p(x | Cj) P(Cj), the sum running over j = 1, 2, 3.
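The product form of the class conditional density can be written out explicitly. This is a reconstruction (the slide's equation did not survive), assuming each of the N components is Gaussian with mean mi = i and a common variance σ²:

```latex
p(\mathbf{x}\mid C_i)=\prod_{n=1}^{N} p(x_n\mid C_i)
=\prod_{n=1}^{N}\frac{1}{\sqrt{2\pi\sigma^{2}}}
\exp\!\left(-\frac{(x_n-m_i)^{2}}{2\sigma^{2}}\right),
\qquad m_i=i,\quad i=1,2,3 .
```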
Substituting the given quantities, the region to decide C1 is found by setting up the inequalities y1(x) < y2(x) and y1(x) < y3(x). Therefore the region R1 to decide C1 reduces to the x that satisfy both inequalities.
Similarly the regions R2 and R3 follow. Substituting the conditional densities, taking the ln of both sides, and simplifying, the decision rule reduces to regions in a sufficient statistic space s as follows.
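A minimal sketch of the resulting rule. It assumes unit component variances, means mi = i, zero-one costs, and equal priors (assumptions beyond what survives on the slide); under these, the sufficient statistic is the sample mean s and the thresholds fall midway between adjacent class means:

```python
def classify_s(x):
    """Decide among C1, C2, C3 from the sufficient statistic s = sample mean.

    Assumes unit-variance Gaussian components with means 1, 2, 3,
    zero-one costs, and equal priors, so the boundaries sit at 1.5 and 2.5.
    """
    s = sum(x) / len(x)
    if s < 1.5:
        return 1   # region R1
    elif s < 2.5:
        return 2   # region R2
    return 3       # region R3
```

Each boundary is the midpoint between adjacent class means, which is what makes the result intuitively pleasing.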
[Figure: the three decision regions marked on the sufficient statistic axis s] An intuitively pleasing result!
(b) Bayes Decision Rule in Likelihood Ratio Space: M-class case derivation. We know that the Bayes decision rule for the M-class case is: if yi(x) < yj(x) for all j ≠ i, then decide x is from Ci, where yi(x) = Σ Cij p(x | Cj) P(Cj), the sum running over j = 1, …, M.
Dividing through by p(x | CM) gives sufficient statistics vi(x) in terms of the likelihood ratios Lj(x) = p(x | Cj) / p(x | CM); note that LM(x) = p(x | CM) / p(x | CM) = 1. Therefore the decision rule becomes a comparison in the likelihood ratio space.
Bayes Decision Rule in the Likelihood Ratio Space: the dimension of the likelihood ratio space is always one less than the number of classes (M − 1).
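A sketch of the rule in the likelihood ratio space for M = 3, assuming scalar observations, unit variances, means 1, 2, 3, and zero-one costs (assumptions not on the slide); with zero-one costs the rule reduces to maximizing Li(x) P(Ci), where L3(x) = 1 identically:

```python
import math

def gauss(x, m, var=1.0):
    """Gaussian density with mean m and variance var."""
    return math.exp(-(x - m) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def classify_lr(x, means=(1.0, 2.0, 3.0), priors=(1/3, 1/3, 1/3)):
    """Decide among M = 3 classes in the likelihood ratio space.

    L_i(x) = p(x|C_i) / p(x|C_3), so L_3(x) == 1 by construction.
    With zero-one costs, pick the class maximizing L_i(x) * P(C_i).
    """
    p3 = gauss(x, means[2])
    L = [gauss(x, m) / p3 for m in means]        # likelihood ratios; L[2] == 1
    scores = [Li * Pi for Li, Pi in zip(L, priors)]
    return 1 + scores.index(max(scores))
```

Only the (M − 1)-vector (L1, L2) carries information, matching the dimension statement above.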
Back to the example: define the likelihood ratios as L1(x) = p(x | C1) / p(x | C3) and L2(x) = p(x | C2) / p(x | C3). We have already determined the region to decide C1; dividing both sides of its inequalities by p(x | C3) gives the corresponding equations in the likelihood ratio space for determining C1.
The other regions are determined in the same fashion giving the decision regions in the likelihood ratio space
Calculation of the probability of error for the 2-class Gaussian cases. Special Case 1: we know the optimum Bayes decision rule is a threshold test on a scalar sufficient statistic Z.
The sufficient statistic Z conditioned on C1 has the following mean and variance
The conditional variance follows in the same way; thus under C1 we have Z ~ N(a1, v1), with conditional mean a1 and conditional variance v1.
Similarly, the conditional mean a2 and variance v2 under class C2 are computed; the statistic Z under class C2 is also Gaussian, so under C2 we have Z ~ N(a2, v2).
Determination of the P(error). The total probability theorem states P(error) = P(error | C1) P(C1) + P(error | C2) P(C2), where P(error | Ci) is the probability of an incorrect decision when the pattern comes from class Ci.
Since the scalar Z is Gaussian, the error conditioned on C1 becomes the tail probability of N(a1, v1) beyond the decision threshold.
Similarly, the error conditioned on C2 becomes the corresponding tail probability of N(a2, v2). Finally, combining the two by the total probability theorem gives the total P(error) for Special Case 1.
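A sketch of this computation, assuming the rule decides C2 when Z exceeds a threshold T and that a1 < a2; T and the moments ai, vi here are placeholders for the slide's expressions, not the slide's actual formulas:

```python
import math

def Q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def p_error(a1, v1, a2, v2, P1, P2, T):
    """Total P(error) for a scalar Gaussian statistic Z with
    Z|C1 ~ N(a1, v1) and Z|C2 ~ N(a2, v2), deciding C2 when Z > T
    (assumes a1 < a2)."""
    pe1 = Q((T - a1) / math.sqrt(v1))   # decide C2 although C1 is true
    pe2 = Q((a2 - T) / math.sqrt(v2))   # decide C1 although C2 is true
    return P1 * pe1 + P2 * pe2          # total probability theorem
```

Each conditional error is a single Q-function of the normalized distance from the class mean to the threshold.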
Special Case 2: equal scaled-identity covariance matrices (K1 = K2 = σ²I). Using the previous formula, the P(error) reduces to an expression in d, the Euclidean distance between the means.
Special Case 3: zero-one Bayes costs and equal a priori probabilities. Using the previous formula for P(error) gives a further simplification.
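Under the Case 2 and Case 3 assumptions together, the closed form is a standard result (restated here as a reconstruction, since the slide's equation did not survive): with K1 = K2 = σ²I, zero-one costs, and equal priors,

```latex
P(\text{error}) = Q\!\left(\frac{d}{2\sigma}\right),
\qquad d = \lVert \mathbf{m}_1 - \mathbf{m}_2 \rVert,
\qquad Q(u) = \int_{u}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-t^{2}/2}\, dt .
```

The error then depends only on the separation of the means measured in standard deviations.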
Special Case 4: Then
Example: calculation of the probability of error. Given the class conditional statistics, find P(error) under each of the following sets of assumptions.
(a) Solution:
(b) Solution: Substituting the above into the P(error) gives:
(c) Solution: Substituting the above into the P(error) gives:
(d) Solution: Substituting the above into the P(error) for the case of equal covariance matrices gives:
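Since the slide's numerical values did not carry over, a Monte Carlo check of the equal-covariance closed form can stand in, with illustrative (assumed) parameters; it samples the two classes and applies the midpoint-threshold rule:

```python
import math
import random

def q(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def monte_carlo_p_error(m1, m2, sigma, trials=200_000, seed=1):
    """Empirical P(error) for the midpoint-threshold rule between two
    equally likely scalar Gaussian classes N(m1, sigma^2) and
    N(m2, sigma^2), with m1 < m2 assumed.

    Theory predicts Q((m2 - m1) / (2 * sigma)).
    """
    rng = random.Random(seed)
    mid = 0.5 * (m1 + m2)
    errors = 0
    for _ in range(trials):
        if rng.random() < 0.5:
            errors += rng.gauss(m1, sigma) > mid    # C1 sample decided as C2
        else:
            errors += rng.gauss(m2, sigma) <= mid   # C2 sample decided as C1
    return errors / trials
```

For m1 = 0, m2 = 2, σ = 1 the estimate should land near Q(1) ≈ 0.159, matching the Special Case 2/3 formula.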
Lecture 13 Summary
1. Multiple observation, multiple class example (review): sufficient statistic space and likelihood ratio space
2. Calculation of P(error) for the 2-class case: special cases
3. P(error) calculation examples for the special cases (2-class case)