ELEC 303 – Random Signals Lecture 17 – Hypothesis testing 2 Dr. Farinaz Koushanfar ECE Dept., Rice University Nov 2, 2009
Outline • Reading: 8.2, 9.3 • Bayesian hypothesis testing • Likelihood ratio testing
Four versions of the MAP rule • Θ discrete, X discrete • Θ discrete, X continuous • Θ continuous, X discrete • Θ continuous, X continuous
Example – spam filter • An email may be spam or legitimate • Parameter Θ takes values θ1, θ2, corresponding to spam/legitimate, with given priors pΘ(θ1), pΘ(θ2) • Let w1,…, wn be a collection of special words whose appearance suggests spam • For each i, let Xi be the Bernoulli RV that indicates the appearance of wi in the message • Assume that the conditional probabilities are known • Use the MAP rule to decide whether the message is spam or not.
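As a concrete sketch, the MAP comparison for the spam filter can be coded directly. The priors and word-appearance probabilities below are made-up illustrative numbers, not values from the lecture:

```python
# Hypothetical priors p_Theta(theta) -- illustrative values only
p_theta = {"spam": 0.3, "legit": 0.7}
# Hypothetical P(X_i = 1 | theta): chance word w_i appears in each class
p_word = {"spam": [0.5, 0.4, 0.3], "legit": [0.05, 0.1, 0.02]}

def map_decide(x):
    """x[i] = 1 if special word w_i appears in the message, else 0.
    Returns the hypothesis with the larger posterior (MAP rule)."""
    best, best_score = None, -1.0
    for theta in ("spam", "legit"):
        score = p_theta[theta]                   # prior
        for xi, p in zip(x, p_word[theta]):
            score *= p if xi == 1 else (1 - p)   # Bernoulli likelihood
        if score > best_score:                   # posterior ~ prior * likelihood
            best, best_score = theta, score
    return best

print(map_decide([1, 1, 0]))  # two of the special words present
```

The normalizing constant pX(x) in Bayes' rule is common to both hypotheses, so comparing prior-times-likelihood products suffices.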
Bayesian hypothesis testing • Binary hypothesis: two cases • Once the value x of X is observed, use the Bayes rule to compute the posterior pΘ|X(θ|x) • Select the hypothesis with the larger posterior • If gMAP(x) is the selected hypothesis, the probability of a correct decision is P(Θ = gMAP(x) | X = x) • If Si is the set of all x for which the MAP rule selects θi, the overall probability of a correct decision is P(Θ = gMAP(X)) = Σi P(Θ = θi, X ∈ Si) • The probability of error is Σi P(Θ ≠ θi, X ∈ Si)
Example – biased coin, single toss • Two biased coins, with head probabilities p1 and p2 • Randomly select a coin and infer its identity based on a single toss • Θ = θ1 (Hypothesis 1), Θ = θ2 (Hypothesis 2) • X = 0 (tail), X = 1 (head) • The MAP rule compares pΘ(θ1) pX|Θ(x|θ1) against pΘ(θ2) pX|Θ(x|θ2) • With equal priors, this reduces to comparing pX|Θ(x|θ1) and pX|Θ(x|θ2) (WHY?) • E.g., p1 = 0.46 and p2 = 0.52, and the outcome is a tail
Example – biased coin, multiple tosses • Assume that we toss the selected coin n times • Let X be the number of heads obtained • Given Θ = θi, X is binomial(n, pi), so the MAP rule compares the two binomial likelihoods
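With n tosses and equal priors, the decision compares the two binomial likelihoods of observing k heads; a short sketch (the binomial coefficient is common to both sides and could be dropped):

```python
from math import comb

p1, p2 = 0.46, 0.52   # head probabilities of the two coins

def map_coin_n(k, n):
    """k heads observed in n tosses; equal priors, so compare the
    binomial likelihoods P(X = k | coin i) directly."""
    like1 = comb(n, k) * p1**k * (1 - p1) ** (n - k)
    like2 = comb(n, k) * p2**k * (1 - p2) ** (n - k)
    return 1 if like1 >= like2 else 2
```

Because the likelihood ratio is monotone in k, the rule reduces to a threshold on the number of heads: few heads favors the p1 = 0.46 coin, many heads favors the p2 = 0.52 coin.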
Example – signal detection and matched filter • A transmitter sends one of two messages, Θ = θ1 or Θ = θ2 • The messages are expanded into signals: • If Θ = θ1, S = (a1, a2, …, an); if Θ = θ2, S = (b1, b2, …, bn) • The receiver observes the signal corrupted by noise: Xi = Si + Wi, i = 1, …, n • Assume the Wi are i.i.d. N(0,1)
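Assuming i.i.d. N(0,1) noise and equal priors, the Gaussian likelihoods make the MAP receiver pick whichever candidate signal is closer to the observation in Euclidean distance (equivalently, maximize the matched-filter correlation Σ xi si − ||s||²/2). The signal vectors below are illustrative choices, not from the lecture:

```python
# Illustrative antipodal signals (hypothetical example values)
a = [1.0, 1.0, 1.0]    # signal sent when Theta = theta1
b = [-1.0, -1.0, -1.0] # signal sent when Theta = theta2

def receive(x):
    """Minimum-distance (matched-filter) decision under equal priors
    and i.i.d. N(0,1) noise."""
    d1 = sum((xi - ai) ** 2 for xi, ai in zip(x, a))
    d2 = sum((xi - bi) ** 2 for xi, bi in zip(x, b))
    return 1 if d1 <= d2 else 2

print(receive([0.9, 1.2, 0.7]))   # noisy observation near a -> message 1
```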
Binary hypothesis testing • H0: null hypothesis, H1: alternative hypothesis • Observation vector X = (X1, …, Xn) • The distribution of the elements of X depends on the hypothesis • P(X ∈ A; Hj) denotes the probability that X belongs to a set A when Hj is true
Rejection/acceptance • A decision rule: • A partition of the set of all possible values of the observation vector into two subsets: the "rejection region" and the "acceptance region" • Two possible errors for a rejection region: • Type I error (false rejection): reject H0, even though H0 is true • Type II error (false acceptance): accept H0, even though H0 is false
Probability of regions • False rejection: • Happens with probability α(R) = P(X ∈ R; H0) • False acceptance: • Happens with probability β(R) = P(X ∉ R; H1)
Analogy with the Bayesian approach • Assume two hypotheses, Θ = θ0 and Θ = θ1, with priors pΘ(θ0) and pΘ(θ1) • The overall probability of error is minimized by the MAP rule: • Given the observation x of X, select θ1 if • pΘ(θ0) pX|Θ(x|θ0) < pΘ(θ1) pX|Θ(x|θ1) • Define: ξ = pΘ(θ0) / pΘ(θ1) • L(x) = pX|Θ(x|θ1) / pX|Θ(x|θ0) • θ1 is selected if the observed value x satisfies the inequality L(x) > ξ
More on testing • Motivated by the MAP rule, the rejection region has the form R = {x | L(x) > ξ} • The likelihood ratio test (LRT): • Discrete: L(x) = pX(x; H1) / pX(x; H0) • Continuous: L(x) = fX(x; H1) / fX(x; H0)
Example • A six-sided die • Two hypotheses about its pmf • Find the likelihood ratio test (LRT) and the probability of error
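The slide does not list the two pmfs, so the sketch below uses hypothetical ones: a fair die under H0 and a die biased toward 6 under H1. The LRT then rejects H0 exactly when the likelihood ratio at the observed face exceeds the threshold:

```python
# Hypothetical pmfs (not from the lecture): fair under H0, biased under H1
pmf0 = [1/6] * 6                             # H0: fair die
pmf1 = [1/12, 1/12, 1/12, 1/12, 1/12, 7/12]  # H1: biased toward 6

def lrt(x, xi):
    """x in {1,...,6}; reject H0 when L(x) = pmf1(x)/pmf0(x) > xi."""
    L = pmf1[x - 1] / pmf0[x - 1]
    return L > xi

print(lrt(6, 1.0))   # L(6) = (7/12)/(1/6) = 3.5 > 1  -> reject H0
print(lrt(3, 1.0))   # L(3) = 0.5 <= 1               -> accept H0
```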
Error probabilities for the LRT • The threshold ξ trades off the two error types: as ξ increases, the rejection region becomes smaller • The false rejection probability α(R) decreases • The false acceptance probability β(R) increases
LRT • Start with a target value α for the false rejection probability • Choose a value ξ such that the false rejection probability equals α: P(L(X) > ξ; H0) = α • Once the value x of X is observed, reject H0 if L(x) > ξ • Typical choices for α are 0.1, 0.05, and 0.01
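When no closed form for the distribution of L(X) under H0 is handy, the threshold can be approximated by simulation. A sketch under a hypothetical setup (X ~ N(0,1) under H0, X ~ N(1,1) under H1, so L(x) = exp(x − 1/2) is increasing in x and "L(X) > ξ" is equivalent to "X > t"):

```python
import random

random.seed(0)
alpha = 0.05   # target false rejection probability

# Simulate X under H0 and take the empirical (1 - alpha)-quantile as
# the threshold t on X (equivalently, a threshold xi on L(X)).
samples = sorted(random.gauss(0, 1) for _ in range(100_000))
t = samples[int((1 - alpha) * len(samples))]
print(round(t, 2))   # should be close to the exact 0.95-quantile, ~1.645
```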
Requirements for the LRT • Ability to compute L(x) for the observed value x of X • Compare L(x) with the critical value ξ • Either use a closed form for L(x) (or log L(x)) or use simulations to approximate it
Example • A camera checking a certain area, recording the detection signal • X = W or X = 1 + W, depending on the absence or presence of an intruder (hypotheses H0 and H1) • Assume W ~ N(0, σ²) • Find the LRT and the acceptance/rejection regions
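For this Gaussian example the LRT has a closed form. With noise variance s², the likelihood ratio is L(x) = exp(−(x−1)²/(2s²)) / exp(−x²/(2s²)) = exp((2x − 1)/(2s²)), which is increasing in x, so "L(x) > ξ" is the rejection region x > s² ln ξ + 1/2. A sketch (the default s = 1 in the demo is an assumption):

```python
from math import exp

def lrt_reject(x, xi, s=1.0):
    """Reject H0 (declare intruder) when L(x) > xi, for X = W (H0)
    vs X = 1 + W (H1) with W ~ N(0, s^2).
    L(x) = exp((2x - 1) / (2 s^2)), increasing in x."""
    L = exp((2 * x - 1) / (2 * s * s))
    return L > xi

# With xi = 1 and s = 1 the rejection region is simply x > 1/2:
print(lrt_reject(0.8, 1.0))   # True  -> reject H0 (intruder present)
print(lrt_reject(0.1, 1.0))   # False -> accept H0
```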