






Presentation Transcript


  1. LECTURE 09: ERROR BOUNDS / DISCRETE FEATURES • ECE 8443 – Pattern Recognition • Objectives: Chernoff Bound, Bhattacharyya Bound, ROC Curves, Discrete Features • Resources: V.V. – Chernoff Bound; J.G. – Bhattacharyya; T.T. – ROC Curves; NIST – DET Curves; AAAS – Verification • URL: .../publications/courses/ece_8443/lectures/current/lecture_09.ppt

  2. 09: ERROR BOUNDS MOTIVATION • The Bayes decision rule guarantees the lowest average error rate. • A closed-form solution exists for two-class Gaussian distributions. • The full calculation is difficult in high-dimensional spaces. • Bounds provide a way to get insight into a problem and engineer better solutions. • We need the following inequality: min[a, b] ≤ a^β b^(1−β) for a, b ≥ 0 and 0 ≤ β ≤ 1. Assume a ≥ b without loss of generality, so min[a, b] = b. Also, a^β b^(1−β) = (a/b)^β b and (a/b)^β ≥ 1. Therefore, b ≤ (a/b)^β b, which implies min[a, b] ≤ a^β b^(1−β). • Apply this to our standard expression for P(error).
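The inequality min[a, b] ≤ a^β b^(1−β) underlying the Chernoff bound is easy to check numerically. A minimal sketch (the grid of test values is arbitrary, chosen only for illustration):

```python
# Numerical sanity check of min[a, b] <= a**beta * b**(1 - beta)
# for a, b > 0 and 0 <= beta <= 1.

def min_bound(a: float, b: float, beta: float) -> float:
    """Right-hand side of the inequality: a**beta * b**(1 - beta)."""
    return a ** beta * b ** (1.0 - beta)

# Spot-check the inequality on a small grid of values.
for a in (0.1, 1.0, 5.0):
    for b in (0.2, 1.0, 3.0):
        for beta in (0.0, 0.25, 0.5, 0.75, 1.0):
            assert min(a, b) <= min_bound(a, b, beta) + 1e-12
```

Note the bound is tight at β = 0 or β = 1 (it reduces to max[a, b] or a itself), which is why β is later optimized.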

  3. 09: ERROR BOUNDS CHERNOFF BOUND • Recall: P(error) = ∫ min[P(ω1)p(x|ω1), P(ω2)p(x|ω2)] dx ≤ P(ω1)^β P(ω2)^(1−β) ∫ p(x|ω1)^β p(x|ω2)^(1−β) dx, for 0 ≤ β ≤ 1. • Note that this integral is over the entire feature space, not the decision regions (which makes it simpler to evaluate). • If the conditional probabilities are normal, this expression can be simplified.

  4. 09: ERROR BOUNDS CHERNOFF BOUND FOR NORMAL DENSITIES • If the conditional probabilities are normal, the bound can be evaluated analytically: ∫ p(x|ω1)^β p(x|ω2)^(1−β) dx = e^(−k(β)), where: k(β) = [β(1−β)/2] (μ2 − μ1)ᵀ [βΣ1 + (1−β)Σ2]⁻¹ (μ2 − μ1) + (1/2) ln( |βΣ1 + (1−β)Σ2| / (|Σ1|^β |Σ2|^(1−β)) ). • Procedure: find the value of β that minimizes e^(−k(β)), and then compute P(error) using the bound. • Benefit: a one-dimensional optimization over β, regardless of the dimensionality of the feature space.
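The one-dimensional optimization over β can be done with a simple grid search. A sketch for the 1-D Gaussian case (means, variances, and priors below are made-up illustration values, not from the lecture):

```python
import math

def k(beta, mu1, mu2, v1, v2):
    """Chernoff exponent k(beta) for 1-D normal densities (v = variance)."""
    vb = beta * v1 + (1.0 - beta) * v2
    quad = beta * (1.0 - beta) * (mu2 - mu1) ** 2 / (2.0 * vb)
    logdet = 0.5 * math.log(vb / (v1 ** beta * v2 ** (1.0 - beta)))
    return quad + logdet

def chernoff_bound(P1, P2, mu1, mu2, v1, v2, steps=999):
    """Grid-search beta in (0, 1) minimizing P1^b P2^(1-b) exp(-k(b))."""
    best = (None, float("inf"))
    for i in range(1, steps):
        b = i / steps
        val = P1 ** b * P2 ** (1.0 - b) * math.exp(-k(b, mu1, mu2, v1, v2))
        if val < best[1]:
            best = (b, val)
    return best

# Equal priors, unit variances, means two standard deviations apart.
beta_star, bound = chernoff_bound(0.5, 0.5, mu1=0.0, mu2=2.0, v1=1.0, v2=1.0)
```

For symmetric classes like this the minimum lands at β ≈ 1/2, where the Chernoff bound coincides with the Bhattacharyya bound.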

  5. 09: ERROR BOUNDS BHATTACHARYYA BOUND • The Chernoff bound is loose for extreme values of β. • The Bhattacharyya bound is obtained by setting β = 1/2: P(error) ≤ √(P(ω1)P(ω2)) ∫ √(p(x|ω1) p(x|ω2)) dx = √(P(ω1)P(ω2)) e^(−k(1/2)), where: k(1/2) = (1/8)(μ2 − μ1)ᵀ [(Σ1 + Σ2)/2]⁻¹ (μ2 − μ1) + (1/2) ln( |(Σ1 + Σ2)/2| / √(|Σ1||Σ2|) ). • These bounds can still be used if the distributions are not Gaussian (why? hint: maximum entropy). However, they might not be adequately tight.
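Fixing β = 1/2 removes the optimization entirely. A minimal sketch for the 1-D Gaussian case (the example parameters are made up for illustration):

```python
import math

def bhattacharyya_bound(P1, P2, mu1, mu2, v1, v2):
    """P(error) <= sqrt(P1*P2) * exp(-k(1/2)) for 1-D normals (v = variance)."""
    vb = 0.5 * (v1 + v2)                       # averaged variance
    k_half = ((mu2 - mu1) ** 2 / (8.0 * vb)
              + 0.5 * math.log(vb / math.sqrt(v1 * v2)))
    return math.sqrt(P1 * P2) * math.exp(-k_half)

# Same toy problem as before: equal priors, unit variances, means 2 apart.
bound = bhattacharyya_bound(0.5, 0.5, 0.0, 2.0, 1.0, 1.0)
```

The trade-off is clear: no search over β, at the cost of a (possibly) slightly looser bound than the optimized Chernoff value.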

  6. 09: ERROR BOUNDS RECEIVER OPERATING CHARACTERISTIC • How do we compare two decision rules if they require different thresholds for optimum performance? • Consider four probabilities: a hit, P(x > x* | x ∈ ω2); a false alarm, P(x > x* | x ∈ ω1); a miss, P(x < x* | x ∈ ω2); and a correct rejection, P(x < x* | x ∈ ω1).

  7. 09: ERROR BOUNDS GENERAL ROC CURVES • An ROC curve is typically monotonic but not symmetric. • One system can be considered superior to another only if its ROC curve lies above the competing system's curve for the operating region of interest.
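An ROC curve can be traced by sweeping the decision threshold and recording (P(false alarm), P(hit)) at each setting. A sketch for two 1-D Gaussians of equal variance (the "noise" and "signal" means below are made-up illustration values):

```python
import math

def Q(x):
    """Tail probability of a standard normal: P(Z > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def roc_point(threshold, mu_noise=0.0, mu_signal=1.0, sigma=1.0):
    """(P_false_alarm, P_hit) when deciding 'signal' for x > threshold."""
    p_fa = Q((threshold - mu_noise) / sigma)    # false alarm: noise above threshold
    p_hit = Q((threshold - mu_signal) / sigma)  # hit: signal above threshold
    return p_fa, p_hit

# Sweep the threshold from -3 to 4 to trace out the curve.
roc = [roc_point(t / 10.0) for t in range(-30, 41)]
```

Because the signal mean exceeds the noise mean, every point satisfies P(hit) > P(false alarm), i.e. the curve lies above the chance diagonal.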

  8. 09: DISCRETE FEATURES INTEGRALS BECOME SUMS • For problems where the features are discrete, x takes one of a finite set of values, and integrals over densities become sums over probabilities. • Bayes formula involves probabilities (not densities): P(ωj|x) = P(x|ωj)P(ωj) / P(x), where P(x) = Σj P(x|ωj)P(ωj). • Bayes rule remains the same: decide ω1 if P(ω1|x) > P(ω2|x), otherwise decide ω2. • The maximum entropy distribution is a uniform distribution: P(x = xi) = 1/N.
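The discrete form of Bayes rule is just a normalized product of likelihood and prior, with a sum in the denominator. A minimal sketch (the priors and likelihood tables are toy numbers, for illustration only):

```python
# Bayes rule with discrete features: sums replace integrals.

priors = {"w1": 0.6, "w2": 0.4}
# P(x | w_j) for a single feature x taking three discrete values.
likelihood = {"w1": {0: 0.5, 1: 0.3, 2: 0.2},
              "w2": {0: 0.1, 1: 0.3, 2: 0.6}}

def posterior(x):
    """P(w_j | x) = P(x | w_j) P(w_j) / sum_k P(x | w_k) P(w_k)."""
    evidence = sum(likelihood[w][x] * priors[w] for w in priors)
    return {w: likelihood[w][x] * priors[w] / evidence for w in priors}

post = posterior(2)  # observing x = 2 favors w2 despite the smaller prior
```

The decision rule then picks the class with the larger posterior, exactly as in the continuous case.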

  9. 09: DISCRETE FEATURES INTEGRALS BECOME SUMS • Consider independent binary features: x = (x1, ..., xd)ᵀ with xi ∈ {0, 1}, and let pi = P(xi = 1|ω1) and qi = P(xi = 1|ω2). • Assuming conditional independence: P(x|ω1) = Π pi^xi (1 − pi)^(1−xi) and P(x|ω2) = Π qi^xi (1 − qi)^(1−xi). • The likelihood ratio is: P(x|ω1)/P(x|ω2) = Π (pi/qi)^xi ((1 − pi)/(1 − qi))^(1−xi). • The discriminant function is linear: g(x) = Σ wi xi + w0, where wi = ln[pi(1 − qi) / (qi(1 − pi))] and w0 = Σ ln[(1 − pi)/(1 − qi)] + ln[P(ω1)/P(ω2)]; decide ω1 if g(x) > 0.
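The linear discriminant for conditionally independent binary features can be computed directly from the per-feature probabilities. A sketch (the pi, qi, and prior values below are made up for illustration):

```python
import math

# g(x) = sum_i w_i x_i + w0, with p_i = P(x_i=1|w1), q_i = P(x_i=1|w2).
p = [0.8, 0.6, 0.7]   # P(x_i = 1 | w1)
q = [0.3, 0.4, 0.2]   # P(x_i = 1 | w2)
P1, P2 = 0.5, 0.5     # class priors

# Weights: log-odds contribution of each feature being "on".
w = [math.log(pi * (1 - qi) / (qi * (1 - pi))) for pi, qi in zip(p, q)]
# Bias: contribution of all features being "off", plus the prior log-odds.
w0 = (sum(math.log((1 - pi) / (1 - qi)) for pi, qi in zip(p, q))
      + math.log(P1 / P2))

def g(x):
    """Log likelihood-ratio discriminant: decide w1 if g(x) > 0, else w2."""
    return sum(wi * xi for wi, xi in zip(w, x)) + w0

decision = "w1" if g([1, 1, 1]) > 0 else "w2"
```

Since every feature is more likely to fire under ω1 here, the all-ones vector is assigned to ω1 and the all-zeros vector to ω2.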
