Bayes Decision Rule • Comp328 Tutorial 3 • Kai Zhang
Outline • Basic notions • Three examples • Minimizing error rate • Decision functions
Prior Probability • w - state of nature, e.g. • w1 the object is a fish, w2 the object is a bird, etc. • w1 the course is good, w2 the course is bad, etc. • A priori probability (or prior) P(wi)
Class-Conditional Probability • Observation x, e.g. • the object has wings • the object’s length is 20 cm • the first lecture is interesting • Class-conditional probability density (mass) function p(x|w)
Bayes Decision Rule • Suppose the priors P(wj) and conditional densities p(x|wj) are known • Bayes formula: P(wj|x) = p(x|wj) P(wj) / p(x), i.e., posterior = likelihood × prior / evidence • The evidence p(x) = Σj p(x|wj) P(wj) normalizes the posteriors so they sum to one
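A minimal sketch of this formula in Python; the helper name `posteriors` and its inputs are our own illustration, not part of the tutorial:

```python
def posteriors(priors, likelihoods):
    """Bayes formula: P(w_j|x) = p(x|w_j) P(w_j) / p(x).

    priors[j]      -- P(w_j)
    likelihoods[j] -- p(x|w_j) for the observed x
    """
    joint = [l * p for l, p in zip(likelihoods, priors)]
    evidence = sum(joint)  # p(x) = sum_j p(x|w_j) P(w_j)
    return [j / evidence for j in joint]
```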
Example • Bayes Decision Rule • If P(apple | color) > P(peach | color) then choose apple • Note that the evidence p(color) is only necessary for normalization purposes; it does not affect the decision rule
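Reusing the `posteriors` helper above with hypothetical numbers for the apple/peach example (both the priors and the color likelihoods are assumed values, not from the tutorial):

```python
# Assumed values: P(apple) = 0.6, P(peach) = 0.4,
# p(color|apple) = 0.2, p(color|peach) = 0.5
post = posteriors(priors=[0.6, 0.4], likelihoods=[0.2, 0.5])
print(post)  # [0.375, 0.625] -> choose peach

# The decision only compares the joint terms p(color|w) P(w);
# dividing both by the same evidence p(color) cannot change the winner.
```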
Misclassification Error • After observing x, an error occurs when the decision differs from the truth; for two classes, P(error|x) = min[P(w1|x), P(w2|x)] • So the average error is P(error) = ∫ P(error|x) p(x) dx • The Bayes decision rule minimizes the average probability of error, since it forces P(error|x) to be minimal at every x
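A short sketch of this error computation for a two-class problem on a discrete observation space (the setup and all numbers are illustrative):

```python
def bayes_average_error(priors, likelihoods_per_x):
    """Average error of the Bayes rule on a discrete domain, two classes.

    likelihoods_per_x[x][j] = p(x|w_j);
    for two classes, P(error|x) p(x) = min_j p(x|w_j) P(w_j).
    """
    total = 0.0
    for liks in likelihoods_per_x:
        joint = [l * p for l, p in zip(liks, priors)]
        total += min(joint)  # the smaller joint term is the mass misclassified at x
    return total
```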
Examples • We know the proportion of unqualified products for each of the 4 workers. • Given an unqualified product, which worker is it most likely from? • A1, A2, A3, A4: (products of) the four workers • B: the event that the product is unqualified • By Bayes formula, P(Ai|B) ∝ P(B|Ai) P(Ai), so we compare these joint terms across workers (see the sketch below)
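A sketch of the workers example with assumed numbers, since the slide's table of rates is not reproduced here (both the production shares P(Ai) and the defect rates P(B|Ai) below are hypothetical):

```python
shares  = [0.15, 0.20, 0.30, 0.35]   # hypothetical P(A_i): each worker's share of output
defects = [0.05, 0.04, 0.03, 0.02]   # hypothetical P(B|A_i): each worker's defect rate

joint = [s * d for s, d in zip(shares, defects)]
evidence = sum(joint)                 # P(B) by the total probability formula
post = [j / evidence for j in joint]  # P(A_i|B) by Bayes formula

best = max(range(len(post)), key=lambda i: post[i])
print(f"Most likely source: worker A{best + 1}")  # A3 for these numbers
```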
Example • The objects can be classified as either GREEN or RED. • Our task is to classify new cases as they arrive, i.e., to decide to which class they belong, based on the currently existing objects.
Prior probabilities • In this case, the percentages of GREEN and RED objects can be used to predict outcomes before they actually happen: P(GREEN) = 40/60, P(RED) = 20/60 • Likelihood / class-conditional probability of the new observation x: p(x|GREEN) = 1/40, p(x|RED) = 3/20
Object classification • Posterior probabilities and decision rule: P(GREEN|x) ∝ p(x|GREEN) P(GREEN) = (1/40)(40/60) = 1/60; P(RED|x) ∝ p(x|RED) P(RED) = (3/20)(20/60) = 3/60 • Since 3/60 > 1/60, the new object is classified as RED
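The same arithmetic with the `posteriors` helper sketched earlier, using the counts given in this example:

```python
# Priors 40/60 and 20/60; likelihoods 1/40 (GREEN) and 3/20 (RED)
post = posteriors(priors=[40/60, 20/60], likelihoods=[1/40, 3/20])
print(post)  # [0.25, 0.75] -> classify the new object as RED
```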
Discriminant Functions • A discriminant function is one of the ways to represent a pattern classifier; the classifier assigns a feature vector x to class wi if gi(x) > gj(x) for all j ≠ i • Bayes classifiers can be represented in this way, e.g.: gi(x) = P(wi|x), or equivalently gi(x) = p(x|wi) P(wi), or gi(x) = ln p(x|wi) + ln P(wi)
Decision Boundaries • Discriminant functions can take different forms (e.g., the posterior, or any monotonically increasing function of it), but the resulting decision rule is the same • Figure: decision boundaries fall where the leading joint probabilities p(x, wi) intersect
Discriminant Functions for Normal Probability Density • Case I: equal covariance, Σi = σ²I (spherical Gaussians) • Dropping class-independent terms, gi(x) = -||x - μi||² / (2σ²) + ln P(wi), which is linear in x, so the decision boundaries are hyperplanes
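A minimal sketch of this linear discriminant, assuming two classes with hand-picked means, a shared spherical variance, and equal priors (all values are illustrative):

```python
import math

def gaussian_discriminant(x, mean, sigma2, prior):
    """g_i(x) = -||x - mu_i||^2 / (2 sigma^2) + ln P(w_i);
    terms shared by all classes are dropped."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, mean))
    return -sq_dist / (2 * sigma2) + math.log(prior)

# Illustrative two-class problem with sigma^2 = 1 and equal priors
means, priors, sigma2 = [(0.0, 0.0), (2.0, 2.0)], [0.5, 0.5], 1.0
x = (1.2, 0.9)
scores = [gaussian_discriminant(x, m, sigma2, p) for m, p in zip(means, priors)]
print("assigned class:", scores.index(max(scores)) + 1)  # class 2 (closer mean)
```

With equal priors this reduces to a nearest-mean (minimum-distance) classifier; unequal priors shift the hyperplane toward the less probable class.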