
Chernoff Bounds (Theory)


Presentation Transcript


  1. Chernoff Bounds (Theory) ECE 7251: Spring 2004 Lecture 25 3/17/04 Prof. Aaron D. Lanterman School of Electrical & Computer Engineering Georgia Institute of Technology AL: 404-385-2548 <lanterma@ece.gatech.edu>

  2. The Setup • General-purpose likelihood ratio test • Consider the log-likelihood ratio test • Conditional error probabilities:
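(The slide's equations were images that did not survive extraction; the following reconstruction assumes the standard Van Trees notation.) The general-purpose likelihood ratio test compares

\Lambda(y) = \frac{p(y \mid H_1)}{p(y \mid H_0)} \underset{H_0}{\overset{H_1}{\gtrless}} \eta, \qquad \ell(y) = \ln \Lambda(y) \underset{H_0}{\overset{H_1}{\gtrless}} \gamma = \ln \eta,

with conditional error probabilities

P_F = \Pr(\ell \ge \gamma \mid H_0) \ \text{(false alarm)}, \qquad P_M = \Pr(\ell < \gamma \mid H_1) \ \text{(miss)}.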

  3. The Problem • Alas, it is often difficult, if not impossible, to find simple formulas for P_F and P_M • Makes computing probabilities of detection and false alarm difficult • Could use Monte Carlo simulations, but those are cumbersome • Alternative: find easy-to-compute analytic bounds on the error probabilities • Discussion based on Van Trees, pp. 116-125

  4. A Moment Generating Function
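(Reconstruction of the slide's missing equation, following the standard development.) The moment generating function of the log-likelihood ratio under H_0 is

\Phi(s) = E\left[ e^{s\ell} \mid H_0 \right] = \int p^{s}(y \mid H_1)\, p^{1-s}(y \mid H_0)\, dy.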

  5. Tilted Densities • Define a new random variable Xs (for various values of s) with density
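(The density formula was an image; a reconstruction:)

p_{X_s}(x) = \frac{e^{sx}\, p_{\ell \mid H_0}(x)}{\Phi(s)},

which integrates to one by the definition of \Phi(s).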

  6. The Happy Mu Function (EFTR)
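(Reconstruction.) The mu function is the logarithm of the moment generating function,

\mu(s) = \ln \Phi(s) = \ln \int p^{s}(y \mid H_1)\, p^{1-s}(y \mid H_0)\, dy,

with endpoint values \mu(0) = \mu(1) = 0, since each density integrates to one.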

  7. More Properties of the Mu Function
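(Reconstruction of the standard properties.) Differentiating under the integral sign shows that

\dot{\mu}(s) = E[X_s], \qquad \ddot{\mu}(s) = \operatorname{Var}[X_s] \ge 0,

so \mu(s) is convex, nonpositive on [0, 1], and minimized where \dot{\mu}(s) = 0.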

  8. A Weird Way of Writing PFA
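(The slide's "With ... Then ..." equations were images; a reconstruction:) With the tilted density, the H_0 density of \ell can be written p_{\ell \mid H_0}(x) = e^{\mu(s) - sx}\, p_{X_s}(x). Then

P_F = \int_{\gamma}^{\infty} p_{\ell \mid H_0}(x)\, dx = e^{\mu(s)} \int_{\gamma}^{\infty} e^{-sx}\, p_{X_s}(x)\, dx.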

  9. Creating the Bound
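(Reconstructed bounding step.) For s \ge 0 and x \ge \gamma we have e^{-sx} \le e^{-s\gamma}, so

P_F \le e^{\mu(s) - s\gamma} \int_{\gamma}^{\infty} p_{X_s}(x)\, dx \le e^{\mu(s) - s\gamma}, \qquad s \ge 0.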

  10. Find the Tightest Bound • We want the s0 which makes the RHS as small as possible • Assuming everything works (the relevant quantities exist, the equation for the minimizing s is solvable, etc.):
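(Reconstruction.) Minimizing \mu(s) - s\gamma over s \ge 0 and setting the derivative to zero gives the stationarity condition \dot{\mu}(s_0) = \gamma, and hence

P_F \le \exp[\mu(s_0) - s_0 \dot{\mu}(s_0)].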

  11. Similar Analysis Bounds PM • We want the s1 which makes the RHS as small as possible • Assuming everything works (the relevant quantities exist, the equation for the minimizing s is solvable, etc.):
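(Reconstruction.) Since p_{\ell \mid H_1}(x) = e^{x}\, p_{\ell \mid H_0}(x) = e^{\mu(s) + (1-s)x}\, p_{X_s}(x), the same trick with s \le 1 gives P_M \le e^{\mu(s) + (1-s)\gamma}; minimizing over s again yields \dot{\mu}(s_1) = \gamma, so

P_M \le \exp[\mu(s_1) + (1 - s_1)\dot{\mu}(s_1)],

and the optimizing s_1 coincides with the s_0 found for P_F.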

  12. Putting It All Together • Why is this useful? The log-likelihood ratio \ell can often be easily described by its moment generating function
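(Reconstructed summary.) Choosing s \in [0, 1] so that \dot{\mu}(s) = \gamma bounds both errors at once:

P_F \le \exp[\mu(s) - s\dot{\mu}(s)], \qquad P_M \le \exp[\mu(s) + (1 - s)\dot{\mu}(s)].

In particular, for n i.i.d. observations the log-likelihood ratio is a sum, so \mu(s) = n\mu_1(s) in terms of the per-sample \mu_1(s), which is why a moment-generating-function description is so convenient.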

  13. Case of Equal Costs and Equal Priors • Let s = s_m satisfy
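(The condition was an image; reconstruction:)

\dot{\mu}(s_m) = 0,

i.e., s_m minimizes \mu(s). With equal costs and equal priors the threshold is \gamma = \ln 1 = 0, so both bounds reduce to P_F, P_M \le e^{\mu(s_m)}, giving P_e = \tfrac{1}{2}(P_F + P_M) \le e^{\mu(s_m)}.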

  14. Another Look at the Derivation
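(Reconstruction of the derivation behind the slide's stray "where".) Writing the error probability directly,

P_e = \tfrac{1}{2} \int \min[\,p(y \mid H_0),\, p(y \mid H_1)\,]\, dy \le \tfrac{1}{2} \int p^{1-s}(y \mid H_0)\, p^{s}(y \mid H_1)\, dy = \tfrac{1}{2} e^{\mu(s)}, \qquad 0 \le s \le 1,

where we used \min(a, b) \le a^{1-s} b^{s}; minimizing over s again gives s = s_m and the tighter constant P_e \le \tfrac{1}{2} e^{\mu(s_m)}.

A minimal numerical sketch (not from the slides; the Gaussian mean-shift model and all names and values here are assumptions for illustration). For n i.i.d. samples of N(0,1) under H_0 versus N(d,1) under H_1, \mu(s) = n s(s-1) d^2/2 and s_m = 1/2, so the bound can be compared against the exact error Q(\sqrt{n}\, d/2):

import math

def mu(s, n, d):
    # Log-MGF of the log-likelihood ratio under H0 for the
    # N(0,1)-vs-N(d,1) model with n i.i.d. samples (assumed example)
    return n * s * (s - 1) * d**2 / 2

def Q(x):
    # Gaussian tail probability via the complementary error function
    return 0.5 * math.erfc(x / math.sqrt(2))

n, d = 16, 0.5
s_m = 0.5                                  # dot-mu(s_m) = 0 by symmetry
bound = 0.5 * math.exp(mu(s_m, n, d))      # Pe <= (1/2) exp(mu(s_m))
exact = Q(math.sqrt(n) * d / 2)            # closed form exists here
print(f"Chernoff bound: {bound:.4e}  exact Pe: {exact:.4e}")

Running this gives a bound of about 3.03e-01 against an exact error of about 1.59e-01: loose, but of the right exponential order.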

  15. A Revelation About the Constant • The original Chernoff inequality was formed by replacing this with 1 • We can get a tighter constant in some asymptotic cases
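(Reconstruction of the referenced quantity.) Keeping the threshold explicit, the exact expression is

P_F = e^{\mu(s) - s\gamma} \int_{\gamma}^{\infty} e^{-s(x - \gamma)}\, p_{X_s}(x)\, dx,

and the Chernoff bound replaces the remaining integral with 1.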

  16. Asymptotic Gaussian Approximation • In some cases, Z approaches a Gaussian random variable as the number of samples n grows large (e.g., data points i.i.d. with finite means and variances) (EFTR)
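(Reconstruction under the stated Gaussian limit.) If p_{X_s} is approximately Gaussian with mean \dot{\mu}(s) = \gamma and variance \ddot{\mu}(s), evaluating the integral above gives

P_F \approx \exp\left[\mu(s) - s\dot{\mu}(s) + \tfrac{s^2}{2}\ddot{\mu}(s)\right] Q\!\left(s\sqrt{\ddot{\mu}(s)}\right).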

  17. Yet Another Approximation • If we can approximate Q using an upper bound
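(Reconstruction.) Using the bound Q(x) \le e^{-x^2/2} / (x\sqrt{2\pi}) for x > 0, the exponential factors cancel and

P_F \lesssim \frac{1}{s\sqrt{2\pi \ddot{\mu}(s)}} \exp[\mu(s) - s\dot{\mu}(s)].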

  18. Similar Analysis Works for PM • If we can approximate Q using the same upper bound
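(Reconstruction.) The mirrored calculation for the miss probability gives

P_M \approx \exp\left[\mu(s) + (1-s)\dot{\mu}(s) + \tfrac{(1-s)^2}{2}\ddot{\mu}(s)\right] Q\!\left((1-s)\sqrt{\ddot{\mu}(s)}\right) \lesssim \frac{1}{(1-s)\sqrt{2\pi \ddot{\mu}(s)}} \exp[\mu(s) + (1-s)\dot{\mu}(s)].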

  19. Asymptotic Analysis for Pe • For the case of equal priors and equal costs, if the conditions for the approximation of Q on the previous two slides hold, we have (EFTR)
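(Reconstruction.) With \gamma = 0 and \dot{\mu}(s_m) = 0, averaging the two approximations gives

P_e \approx \frac{1}{2\sqrt{2\pi \ddot{\mu}(s_m)}} \left[ \frac{1}{s_m} + \frac{1}{1 - s_m} \right] e^{\mu(s_m)} = \frac{e^{\mu(s_m)}}{2\, s_m (1 - s_m) \sqrt{2\pi \ddot{\mu}(s_m)}},

which sharpens the exponential bound by a prefactor that shrinks as \ddot{\mu}(s_m) grows with n.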
