
Chapter 10. Shannon’s Theorem



  1. Chapter 10 Shannon’s Theorem

  2. Send an n-bit block code through a binary symmetric channel with crossover probability Q < ½: each bit arrives correctly with probability P = 1 − Q and flipped with probability Q, so the channel capacity is C = 1 − H2(Q). Let A = {ai : i = 1, …, M} be M distinct equiprobable n-bit codewords, and let B = {bj : |bj| = n, j = 1, …, 2^n} be the set of all possible received blocks. Since the codewords are equiprobable, I2(ai) = log2 M. Intuitively, each block comes through with n∙C bits of information, so to signal close to capacity we want I2(ai) = n(C − ε) for some small number ε > 0, i.e. M = 2^{n(C − ε)}. By increasing n this can be made arbitrarily large, yet we choose M so that we use only a small fraction of the number of messages that could get through the channel: redundancy. Excess redundancy gives us the room required to bring the error rate down. Random Codes: for a large n, pick M random codewords from {0, 1}^n. 10.4
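To get a feel for the numbers, here is a minimal sketch in Python (the values Q = 0.1 and ε = 0.02 are assumptions for illustration, not from the slides) that tabulates C = 1 − H2(Q) and the exponent of M = 2^{n(C − ε)} as n grows:

    import math

    def H2(q: float) -> float:
        """Binary entropy: H2(q) = -q*log2(q) - (1-q)*log2(1-q)."""
        if q in (0.0, 1.0):
            return 0.0
        return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

    Q = 0.1        # assumed BSC crossover probability, Q < 1/2
    eps = 0.02     # assumed gap below capacity
    C = 1 - H2(Q)  # capacity of the binary symmetric channel

    for n in (100, 1000, 10000):
        # M = 2^{n(C - eps)} codewords, out of ~2^{nC} that could get through
        print(f"n={n:5d}: C={C:.4f}, M = 2^{n*(C-eps):.1f}, "
              f"fraction used = 2^{-n*eps:.1f}")

With these assumed numbers C ≈ 0.531, so at n = 10000 the code already has about 2^5110 codewords while using only a 2^−200 fraction of what the channel could carry.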

  3. Picture the ai in n-dimensional Hamming space. As each ai goes through the channel, we expect nQ errors on average, so by the law of large numbers the received symbol bj lies, with high probability, near Hamming distance nQ from the sent symbol ai. Consider a sphere of radius n(Q + ε′) about each ai: the probability that bj falls outside it (more than nε′ errors beyond the expected nQ, i.e. too much noise) can be made ≤ δ by taking n large enough. Similarly, consider a sphere of radius n(Q + ε′) around each bj: what is the probability that an uncorrectable error occurs, that is, that another codeword a′ is also inside the sphere about bj? With high probability, almost all ai will be a certain distance apart (provided M << 2^n). 10.4
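The law-of-large-numbers step is easy to check empirically. The Monte Carlo sketch below (Python; n, Q, ε′ and the trial count are made-up illustration values) pushes one codeword through a simulated binary symmetric channel many times and counts how often the received word lands outside the sphere of radius n(Q + ε′):

    import random

    def bsc(word, Q, rng):
        """Binary symmetric channel: flip each bit independently w.p. Q."""
        return [b ^ (rng.random() < Q) for b in word]

    rng = random.Random(0)
    n, Q, eps_prime, trials = 2000, 0.1, 0.02, 1000  # assumed values
    radius = n * (Q + eps_prime)                     # sphere radius n(Q + e')

    a = [rng.randint(0, 1) for _ in range(n)]        # one word, sent repeatedly
    outside = 0
    for _ in range(trials):
        b = bsc(a, Q, rng)
        d = sum(x != y for x, y in zip(a, b))        # d(a, b) = # of bit errors
        outside += d > radius
    print(f"P(received word outside the sphere) ~ {outside / trials:.4f}")

Here the expected distance is nQ = 200 with standard deviation √(nQP) ≈ 13.4, while the radius is 240, about three standard deviations out, so almost every trial lands inside; the small fraction outside is the δ of the slide.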

  4. Shannon’s Theorem. Idea: we will show that some codes, indeed most codes, must work; take any code at random and it will probably work! Notation: M = # of code words, C = channel capacity, n = block size (as yet undetermined), Q = prob. of channel error. We decide how close we wish to approach the channel capacity. The number of possible random codes is (2^n)^M = 2^{nM}, each equally likely to be chosen. Let X be a random variable representing errors in the channel: X = 0 with probability P (no error) and X = 1 with probability Q (error); N.B. E{X} = Q. If a is the codeword sent and b the one received, the error vector is a ⊕ b = (X1, …, Xn), so d(a, b) = X1 + … + Xn, and by the law of large numbers d(a, b) ≈ nQ for large n. Since Q < ½, pick ε′ so that Q + ε′ < ½. 10.5
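The whole idea can be acted out at toy scale. A sketch (Python; all parameters are made-up small values, chosen so the rate log2(M)/n sits well below C) that draws M random codewords from {0, 1}^n, sends them through the BSC, and decodes each received word to the nearest codeword:

    import random

    def hamming(x, y):
        """Hamming distance d(x, y)."""
        return sum(a != b for a, b in zip(x, y))

    rng = random.Random(1)
    n, Q, M = 15, 0.05, 16  # assumed toy values; rate 4/15 vs. C ~ 0.71
    code = [tuple(rng.randint(0, 1) for _ in range(n)) for _ in range(M)]

    trials, errors = 2000, 0
    for _ in range(trials):
        a = code[rng.randrange(M)]                        # codeword sent
        b = tuple(x ^ (rng.random() < Q) for x in a)      # received from BSC
        decoded = min(code, key=lambda c: hamming(c, b))  # nearest codeword
        errors += decoded != a
    print(f"block error rate of one random code: {errors / trials:.4f}")

Rerunning with different seeds picks different random codes, and most behave about equally well, which is exactly the point: we never construct a good code, we argue that a randomly chosen one almost certainly is good.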

  5. Since the a′ are randomly (uniformly) distributed throughout the space, the probability that a particular a′ falls inside the sphere of radius n(Q + ε′) about b is (volume of the sphere) / (volume of the whole space) = [C(n, 0) + C(n, 1) + … + C(n, n(Q + ε′))] / 2^n ≤ 2^{n∙H2(Q + ε′)} / 2^n = 2^{−n[1 − H2(Q + ε′)]}, by the binomial bound. 10.5
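A numeric check of this bound (Python; Q and ε′ are the same assumed illustration values, with Q + ε′ < ½):

    import math

    def H2(q):
        """Binary entropy function."""
        if q in (0.0, 1.0):
            return 0.0
        return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

    Q, eps_prime = 0.1, 0.02  # assumed, with Q + eps' < 1/2
    lam = Q + eps_prime       # sphere radius is lam * n

    for n in (100, 500, 1000):
        r = int(lam * n)
        sphere = sum(math.comb(n, k) for k in range(r + 1))  # sphere volume
        # binomial bound: sum_{k <= lam*n} C(n, k) <= 2^{n*H2(lam)} for lam < 1/2
        print(f"n={n:4d}: sphere = 2^{math.log2(sphere):7.2f} "
              f"<= 2^{n * H2(lam):7.2f}, "
              f"P(a' in sphere) = 2^{math.log2(sphere) - n:8.2f}")

The exponent of P(a′ in sphere) is about −n[1 − H2(Q + ε′)]. Summed over the M − 1 ≈ 2^{n(C − ε)} other codewords, the total probability of a decoding collision has exponent about n[H2(Q + ε′) − H2(Q) − ε], which goes to −∞ as n grows provided ε′ is chosen small enough that H2(Q + ε′) − H2(Q) < ε; this is how the proof concludes.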
