In this memorial for Claude Shannon, explore the core topics of entropy, source coding, channel coding, and the Separation Theorem. Understand how the quantity of information relates to uncertainty, encoding techniques, channel capacity, and the integration of source and channel coding. Delve into Shannon's groundbreaking theories and their practical implications.
Claude Shannon – In Memoriam • Jean Walrand, U.C. Berkeley • www.eecs.berkeley.edu/~wlr
Outline • Claude Shannon • Entropy • Source Coding • Channel Coding • Separation Theorem
Claude Shannon • April 30, 1916 – February 24, 2001 • 1937: Boolean algebra applied to the design of logical circuits • 1948: A Mathematical Theory of Communication
Entropy • How much information is required to convey the value of a random variable? • Key insight: the quantity of information is related to the uncertainty of that value. • Example 1: A fair coin flip = 1 bit of information • Example 2: Two fair coin flips = 2 bits • Example 3: N equally likely values = log2(N) bits
Source Coding • How do we encode the values so as to convey them with the minimum number of bits? • Key idea: Look at a sequence of outcomes X(1), X(2), …, X(n) where each X(m) is in {1, 2, …, K} • For n large, there are only about 2^(nH) roughly equally likely sequences, where H is smaller than log2(K) • In fact, H = -Σi pi log2(pi), where pi = P(X(m) = i)
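To make the formula concrete, here is a minimal Python sketch (an illustration added here, not part of the original slides) that evaluates H = -Σi pi log2(pi) for a few distributions and reproduces the coin-flip examples above:

```python
# Minimal sketch: Shannon entropy H = -sum_i p_i * log2(p_i) in bits.
from math import log2

def entropy(probs):
    """Entropy in bits of a discrete distribution given as a list of probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin flip            -> 1.0 bit
print(entropy([0.25] * 4))   # two fair coin flips       -> 2.0 bits
print(entropy([1/8] * 8))    # N = 8 equally likely values -> log2(8) = 3.0 bits
print(entropy([0.9, 0.1]))   # biased coin               -> about 0.469 bits < 1
```

As the last line suggests, a biased source carries less than 1 bit per symbol, which is exactly what source coding exploits.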
Source Coding (c'd) • Example: P(1) = p = 1 – P(0), so H = -p log2(p) – (1 – p) log2(1 – p) • [Figure: plot of H against p, rising from 0 at p = 0 to 1 bit at p = 0.5 and falling back to 0 at p = 1]
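A short sketch (again an added illustration) of this binary entropy function, showing the shape of the curve the slide plots:

```python
# Binary entropy H(p) = -p log2(p) - (1-p) log2(1-p), in bits.
from math import log2

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * log2(p) - (1 - p) * log2(1 - p)

for p in (0.0, 0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"p = {p:0.2f}  H(p) = {binary_entropy(p):0.3f} bits")
# H(p) peaks at 1 bit for p = 0.5 and falls to 0 at p = 0 or p = 1.
```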
Source Coding (c'd) • Thus, for large n: of the 2^n possible n-bit words, only about 2^(nH) roughly equally likely words actually occur, so roughly nH bits suffice to describe a sequence • [Figure: the set of 2^n n-bit words, containing the small subset of 2^(nH) equally likely n-bit words]
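A rough simulation sketch of this "2^(nH) typical sequences" claim (the parameters below are assumed for illustration): for a long i.i.d. sequence, -log2 P(sequence) / n concentrates around H, so the sequences that actually occur all have probability about 2^(-nH).

```python
# Draw long Bernoulli(p) sequences and check that -log2 P(seq)/n stays close to H.
import random
from math import log2

p, n = 0.1, 10_000                        # P(1) = 0.1, sequence length n (assumed values)
H = -p * log2(p) - (1 - p) * log2(1 - p)  # entropy per symbol, about 0.469 bits

random.seed(0)
for _ in range(5):
    seq = [1 if random.random() < p else 0 for _ in range(n)]
    ones = sum(seq)
    log_prob = ones * log2(p) + (n - ones) * log2(1 - p)
    print(f"-log2 P(seq)/n = {-log_prob / n:0.4f}   (H = {H:0.4f})")
# Each draw gives a value close to H, so about 2^(nH) sequences carry almost all
# of the probability -- far fewer than the 2^n possible n-bit words when H < 1.
```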
Channel Capacity • Question: How fast can one transmit bits reliably through a noisy channel? • Naïve answer: No reliable transmission is possible. • Shannon's formulation: What rate is achievable in the long term if one wants the bit error rate to be arbitrarily small? • Shannon's answer: the Channel Capacity
Channel Capacity (c'd) • Example: binary symmetric channel — each bit sent is received correctly with probability 1 – p and flipped with probability p • [Figure: channel diagram (0→0 and 1→1 with probability 1 – p, crossovers with probability p) and plot of the capacity C against p, equal to 1 at p = 0 and dropping to 0 at p = 0.5]
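The plotted curve is C = 1 – H(p) for this channel; a small sketch (added illustration) evaluates it at a few crossover probabilities:

```python
# Capacity of the binary symmetric channel: C = 1 - H(p), p = bit-flip probability.
from math import log2

def bsc_capacity(p):
    if p in (0.0, 1.0):
        return 1.0  # a deterministic (even if inverted) channel is perfect
    return 1 + p * log2(p) + (1 - p) * log2(1 - p)

for p in (0.0, 0.05, 0.11, 0.5):
    print(f"p = {p:0.2f}  C = {bsc_capacity(p):0.3f} bits per channel use")
# C = 1 when p = 0 and drops to 0 at p = 0.5, where the output is independent
# of the input, matching the plotted curve on the slide.
```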
Channel Capacity (c'd) • Justification: [Figure: the 2^n equally likely n-bit words, of which about 2^(nH) equally likely words can be received when any one word is sent] • Choose 2^(nK) n-bit codewords at random (fair coin flips) ⇒ about 2^n / 2^(nH) = 2^(n(1 – H)) = 2^(nC) codewords remain distinguishable at the receiver
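A numerical sketch of this counting argument (the block length and crossover probability below are assumed examples): each received word lies in a "noise ball" of about 2^(nH) words, so roughly 2^n / 2^(nH) = 2^(n(1 – H)) codewords can be told apart.

```python
# Count (on a log2 scale) how many codewords stay distinguishable for a BSC.
from math import log2

p, n = 0.11, 1000                           # crossover probability and block length (assumed)
H = -p * log2(p) - (1 - p) * log2(1 - p)    # about 0.5 bits per received symbol
distinguishable_exponent = n * (1 - H)      # log2 of the number of usable codewords
print(f"~2^{distinguishable_exponent:.0f} distinguishable codewords "
      f"out of 2^{n} words  (rate C = {1 - H:.2f} bits per channel use)")
```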
Separation Theorem • Source with entropy H bits per symbol • Channel with capacity C bps • One can send C/H symbols per second • First code the source (n symbols ⇒ nH bits) • Then code for the channel (send the bits using suitable codewords) • Hence: source and channel coding can be done separately!
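A last sketch of the C/H arithmetic (the numeric values of H and C below are assumed, purely for illustration):

```python
# Separation arithmetic: a source with H bits/symbol over a channel of capacity C bps
# supports C / H source symbols per second, via source coding then channel coding.
H = 0.469      # bits per symbol, e.g. a binary source with P(1) = 0.1 (assumed)
C = 1_000_000  # channel capacity in bits per second (assumed)

symbols_per_second = C / H
print(f"Throughput: about {symbols_per_second:,.0f} source symbols per second")
# Source coding compresses n symbols into about n*H bits; channel coding then maps
# those bits onto codewords that cross the channel reliably at up to C bits per second.
```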