
Lecture 14 Decision Rules, Nearest Neighbor Decoding (Section 4.2, 4.3)



Presentation Transcript


  1. Lecture 14: Decision Rules, Nearest Neighbor Decoding (Sections 4.2, 4.3). Theory of Information.

  2. What Is a Decision Rule? (Slide figure: the communications channel model: an encoded message passes through a noisy channel, and the received word x is mapped by a decision rule to a decoded message.) An (n, M)-code means a block code of length n and size M (i.e., every codeword has length n, and there are M codewords altogether). Definition. Let C be an (n, M)-code over a code alphabet A, and assume C does not contain the symbol ?. A decision rule for C is a function f: A^n → C ∪ {?}. Intuition: f(x) = c means assuming that the received word x was meant to be c, i.e., that c was sent; in other words, decoding or interpreting x as c. If no codeword can be identified for x, the symbol ? (which is not in C) is used to declare a decoding error, so in this case f(x) = ?.
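To make the definition concrete, here is a minimal Python sketch of a decision rule as a function from received words to C ∪ {?}; the codebook and the deliberately crude rule are illustrative choices, not from the text.

```python
# A decision rule f: A^n -> C ∪ {?} as a plain Python function.
C = {"0000", "1111"}          # an (n, M)-code with n = 4, M = 2

def f(x: str) -> str:
    """A crude decision rule: accept x only if it is itself a codeword;
    otherwise declare a decoding error with '?'."""
    return x if x in C else "?"

print(f("1111"))  # -> 1111
print(f("0111"))  # -> ?   (x is not a codeword, error declared)
```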

  3. Two Sorts of Decision Rules. Goal: maximize the probability of correct decoding. Code alphabet: {0, 1}. Code: {0000, 1111}. Channel: P(0 received | 0 sent) = 90%, P(1 received | 0 sent) = 10%, P(0 received | 1 sent) = 11%, P(1 received | 1 sent) = 89%. If 0111 is received, how would you interpret it? If 0011 is received, how would you interpret it? But if you knew that 1111 is sent 99% of the time, how would you interpret 0011?

  4. Which Rule Is Better? Ideal observer decision rule vs. maximum likelihood decision rule. Advantages: the ideal observer rule maximizes the overall probability of correct decoding, but it requires knowing the input distribution; the maximum likelihood rule needs only the channel's forward probabilities, which come from the channel itself.

  5. Ideal Observer Decision Rule. (Slide figure: a received word x with backward probabilities 25%, 40%, and 35% to codewords c1, c2, c3.) The ideal observer decision rule decodes a received word x as a codeword c with maximal backward probability P(c sent | x received), comparing P(c1 sent | x received), ..., P(cM sent | x received).

  6. Maximum Likelihood Decision Rule. (Slide figure: a received word x with forward probabilities 50%, 80%, and 70% from codewords c1, c2, c3.) The maximum likelihood decision rule decodes a received word x as a codeword c = f(x) with maximal forward probability P(x received | c sent), comparing P(x received | c1 sent), ..., P(x received | cM sent).

  7. One Rule as a Special Case of the Other. Assume all codewords are equally likely to be sent. What would the backward probabilities be then? By Bayes' theorem, P(c sent | x received) = P(x received | c sent) · P(c sent) / P(x received); with P(c sent) = 1/M for every codeword, the backward probabilities are just the forward probabilities scaled by the constant 1/(M · P(x received)), so both rules select the same codeword. Theorem 4.2.2: For the uniform input distribution, ideal observer decoding coincides with maximum likelihood decoding.
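As a sketch of how the two rules relate, the snippet below implements both for a memoryless channel given by per-symbol forward probabilities; the function and variable names are illustrative, not from the text.

```python
# Both decision rules for a memoryless channel, where
# channel[sent][received] = P(received | sent) for single symbols.

def likelihood(channel, x, c):
    """P(x received | c sent): product of per-symbol probabilities."""
    p = 1.0
    for ci, xi in zip(c, x):
        p *= channel[ci][xi]
    return p

def max_likelihood(channel, code, x):
    return max(code, key=lambda c: likelihood(channel, x, c))

def ideal_observer(channel, code, prior, x):
    # P(c | x) is proportional to P(x | c) * P(c); the common factor
    # 1 / P(x) does not affect which codeword attains the maximum.
    return max(code, key=lambda c: prior[c] * likelihood(channel, x, c))

# With a uniform prior, prior[c] = 1/M is constant, so ideal_observer
# picks the same codeword as max_likelihood (Theorem 4.2.2).
```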

  8. Example 4.2.1. Suppose codewords of C = {000, 111} are sent over a binary symmetric channel with crossover probability p = 0.01. If the string 100 is received, how should it be decoded by the maximum likelihood decision rule? P(100 received | 000 sent) = p(1 - p)² = 0.01 · 0.99² = 0.009801; P(100 received | 111 sent) = p²(1 - p) = 0.01² · 0.99 = 0.000099. So maximum likelihood decodes 100 as 000. Would the same necessarily be the case under ideal observer decoding? Not necessarily: an input distribution sufficiently skewed toward 111 could make 111 the more probable explanation.
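A quick numeric check of this example: on a BSC the likelihood depends only on the Hamming distance d between the received word and the codeword, P(x | c) = p^d (1 - p)^(n - d).

```python
# Example 4.2.1 on a BSC with crossover probability p = 0.01.
p, n = 0.01, 3

def bsc_likelihood(x, c):
    d = sum(a != b for a, b in zip(x, c))   # Hamming distance
    return p**d * (1 - p)**(n - d)

print(bsc_likelihood("100", "000"))  # ≈ 0.009801
print(bsc_likelihood("100", "111"))  # ≈ 0.000099
```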

  9. Sometimes the Two Rules Yield Different Results. Channel: P(0 received | 0 sent) = 95%, P(1 received | 0 sent) = 5%, P(0 received | 1 sent) = 6%, P(1 received | 1 sent) = 94%. Assume: C = {00, 11}; 00 is sent 70% of the time; 01 is received. How would 01 be decoded by the maximum likelihood rule? As 11, since P(01 received | 11 sent) = 0.06 · 0.94 = 0.0564 exceeds P(01 received | 00 sent) = 0.95 · 0.05 = 0.0475. How would 01 be decoded by the ideal observer rule? As 00, since 0.70 · 0.0475 = 0.03325 exceeds 0.30 · 0.0564 = 0.01692.
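The arithmetic can be checked directly. The snippet below assumes the channel probabilities and the 70% prior on 00 as read off the slide above; the dictionary names are illustrative.

```python
# Slide 9's example: ML and ideal observer disagree on received word 01.
channel = {"0": {"0": 0.95, "1": 0.05},   # P(received | 0 sent)
           "1": {"0": 0.06, "1": 0.94}}   # P(received | 1 sent)
prior = {"00": 0.70, "11": 0.30}          # 00 is sent 70% of the time

def lik(x, c):
    return channel[c[0]][x[0]] * channel[c[1]][x[1]]

x = "01"
print(max(prior, key=lambda c: lik(x, c)))             # 11 (max likelihood)
print(max(prior, key=lambda c: prior[c] * lik(x, c)))  # 00 (ideal observer)
```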

  10. Handling Ties. In the case of ties, it is reasonable for a decision rule to declare an error. Suppose codewords of C = {0000, 1111} are sent over a binary symmetric channel with crossover probability p = 1/4. How should the following strings be decoded by the maximum likelihood decision rule? 0000 as 0000; 1011 as 1111; 0011 as ?, since P(0011 received | 0000 sent) = p²(1 - p)² = 9/256 = P(0011 received | 1111 sent), a tie.

  11. Hamming Distance and Nearest Neighbor Decoding. Definition. Let x and y be two strings of the same length over the same alphabet. The Hamming distance between x and y, denoted d(x, y), is defined to be the number of places in which x and y differ. E.g.: d(000, 100) = 1, d(111, 100) = 2. The decision rule assigning to a received word the closest codeword (in Hamming distance) is called the nearest neighbor decision rule. Theorem 4.3.2: For a binary symmetric channel with crossover probability p < 1/2, the maximum likelihood decision rule is equivalent to the nearest neighbor decision rule.
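A sketch of nearest neighbor decoding in Python, using the tie convention from the previous slide (a tie is declared a decoding error); the names are illustrative.

```python
def hamming(x, y):
    """Number of places in which x and y differ."""
    return sum(a != b for a, b in zip(x, y))

def nearest_neighbor(code, x):
    """Decode x as the unique closest codeword, or '?' on a tie."""
    ranked = sorted((hamming(x, c), c) for c in code)
    if len(ranked) > 1 and ranked[0][0] == ranked[1][0]:
        return "?"                      # tie: declare a decoding error
    return ranked[0][1]

C = {"0000", "1111"}
print(nearest_neighbor(C, "1011"))  # 1111 (distance 1 vs. 3)
print(nearest_neighbor(C, "0011"))  # ?    (distance 2 to both)
```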

  12. Exercise 7 of Section 4.3. Construct a binary channel for which maximum likelihood decoding is not the same as nearest neighbor decoding. Take the channel P(0 received | 0 sent) = 10%, P(1 received | 0 sent) = 90%, P(0 received | 1 sent) = 50%, P(1 received | 1 sent) = 50%. Let C = {001, 011} and assume 000 is received. P(000 received | 001 sent) = 0.1 · 0.1 · 0.5 = 0.005, while P(000 received | 011 sent) = 0.1 · 0.5 · 0.5 = 0.025, so maximum likelihood decodes 000 as 011; but d(000, 001) = 1 < d(000, 011) = 2, so nearest neighbor decodes it as 001.
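Checking the construction numerically, assuming the channel assignment read off the slide's diagram:

```python
# Exercise 7: a channel where ML decoding differs from nearest neighbor.
channel = {"0": {"0": 0.1, "1": 0.9},   # P(received | 0 sent)
           "1": {"0": 0.5, "1": 0.5}}   # P(received | 1 sent)
C = {"001", "011"}
x = "000"

def lik(x, c):
    out = 1.0
    for ci, xi in zip(c, x):
        out *= channel[ci][xi]
    return out

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

print(max(C, key=lambda c: lik(x, c)))      # 011 (likelihood 0.025 > 0.005)
print(min(C, key=lambda c: hamming(x, c)))  # 001 (distance 1 < 2)
```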

  13. Homework. Exercises 2, 3, 4, 5 of Section 4.2. Exercises 1, 2, 3 of Section 4.3.
