Chapter 8 Channel Capacity
Channel Capacity (8.1)

Define

    C = max {I(A; B) : p(a)}

= the amount of useful information per bit actually sent
= the change in entropy from going through the channel (the drop in uncertainty):

    I(A; B) = H(A) − H(A | B)

where H(A) is the average uncertainty before receiving and H(A | B) is the average uncertainty after receiving.
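As a concrete illustration (not from the slides), here is a minimal Python sketch of this definition, assuming a known channel matrix P(b | a): mutual information is computed directly, and the capacity is found by brute-force search over the input distribution p(a). All names are illustrative.

```python
import numpy as np

def entropy(p):
    """Entropy in bits; terms with p = 0 contribute 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(pa, channel):
    """I(A; B) = H(B) - H(B | A); rows of `channel` are P(b | a)."""
    pb = pa @ channel                                   # output distribution
    h_b_given_a = sum(pa[i] * entropy(channel[i]) for i in range(len(pa)))
    return entropy(pb) - h_b_given_a

# Capacity of a binary-input channel by brute-force search over p(a).
channel = np.array([[0.9, 0.1],
                    [0.1, 0.9]])
C = max(mutual_information(np.array([p, 1 - p]), channel)
        for p in np.linspace(0, 1, 1001))
print(C)   # ~0.531 bits, which is 1 - H2(0.9)
```

The form I(A; B) = H(B) − H(B | A) is used here because it equals H(A) − H(A | B) and matches the H(B) − W decomposition on the slides that follow.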
Uniform Channel (8.2)

In a uniform channel the channel probabilities do not change from symbol to symbol, i.e. the rows of the probability matrix P(b | a) are permutations of each other. So the following is independent of a:

    W = − Σ_b P(b | a) log2 P(b | a)

hence H(B | A) = W and I(A; B) = H(B) − W.

Consider no noise: P(b | a) = 1 for some b, 0 for all others. Then W = 0 and I(A; B) = H(B) = H(A) (this conforms to intuition only if the matrix is a permutation matrix). All noise implies I(A; B) = 0, i.e. H(B) = W.
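A quick numeric check of this independence, reusing the entropy and mutual_information helpers sketched above with an assumed 3×3 uniform channel: H(B | A) = H(B) − I(A; B) comes out equal to the same W for any input distribution.

```python
# Rows are permutations of each other, so this is a uniform channel.
uniform = np.array([[0.7, 0.2, 0.1],
                    [0.1, 0.7, 0.2],
                    [0.2, 0.1, 0.7]])
W = entropy(uniform[0])          # the same value for every row
for pa in (np.array([1/3, 1/3, 1/3]), np.array([0.5, 0.3, 0.2])):
    h_b_given_a = entropy(pa @ uniform) - mutual_information(pa, uniform)
    assert np.isclose(h_b_given_a, W)
```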
Capacity of Binary Symmetric Channel (8.5)

The binary symmetric channel sends 0 → 0 and 1 → 1 with probability P, and flips the bit with probability Q = 1 − P:

    P(b | a) = [ P  Q ]
               [ Q  P ]

With p = p(a = 0):

    C = max {I(A; B) : p(a)} = max {H(B) − W : p(a)} = max {H2(x) − H2(P) : p}

where x = pP + (1 − p)Q = p(b = 0). The maximum occurs when x = ½, and then p = ½ also (unless all noise):

    C = 1 − H2(P)
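A sketch checking this closed form against the brute-force search from the first snippet (mutual_information reused; h2 below is the binary entropy function H2):

```python
def h2(p):
    """Binary entropy H2(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

for P in (0.99, 0.9, 0.75, 0.5):
    bsc = np.array([[P, 1 - P], [1 - P, P]])
    brute = max(mutual_information(np.array([p, 1 - p]), bsc)
                for p in np.linspace(0, 1, 1001))
    assert np.isclose(brute, 1 - h2(P))   # C = 1 - H2(P), attained at p = 1/2
```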
Numerical Examples (8.5)

If P = ½ + ε, then C(P) ≈ 3ε² is a great approximation: expanding C = 1 − H2(½ + ε) to second order gives C ≈ (2 / ln 2) ε² ≈ 2.885 ε².
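Tabulating the exact C = 1 − H2(½ + ε) against 3ε² (a sketch using the h2 helper above) shows how tight the approximation is:

```python
for eps in (0.01, 0.05, 0.1, 0.2):
    exact = 1 - h2(0.5 + eps)
    print(f"eps = {eps}: C = {exact:.5f}, 3*eps^2 = {3 * eps**2:.5f}")
# eps = 0.01: C = 0.00029, 3*eps^2 = 0.00030
# eps = 0.1:  C = 0.02905, 3*eps^2 = 0.03000
```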
Error Detecting Code (8.3)

A uniform channel with equiprobable input: p(a1) = … = p(aq). Apply this to n-bit single error detection, with one parity bit among the ci ∈ {0, 1}:

    |A| = q = 2^(n−1),  a = c1 … cn (even parity),  H2(A) = n − 1
    |B| = 2^n = 2q,     b = c1 … cn (any parity),   H2(B) = ??

Each bit passes through the channel

    [ P  Q ]
    [ Q  P ]

so for blocks of size n the probability of exactly k errors is C(n, k) P^(n−k) Q^k (C(n, k) = n choose k), and every b ∈ B can be obtained from any a ∈ A by k = 0 … n errors. Then

    W = − Σ_{k=0..n} C(n, k) P^(n−k) Q^k log2 (P^(n−k) Q^k)
      = − Σ_k C(n, k) P^(n−k) Q^k [ (n − k) log2 P + k log2 Q ]
      = − nP log2 P − nQ log2 Q        (the k = n term of the first sum and the k = 0 term of the second are 0)
      = n H2(P) = n × (the W for one bit)

with |B| = 2^n.
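A numeric check of this claim (a sketch; h2 and numpy reused from above): summing −P(b | a) log2 P(b | a) over all error patterns, grouped by weight k, reproduces n · H2(P).

```python
from math import comb

def W_block(n, P):
    """W for the n-bit channel: sum over all error patterns, grouped by weight k."""
    Q = 1 - P
    return -sum(comb(n, k) * P**(n - k) * Q**k * np.log2(P**(n - k) * Q**k)
                for k in range(n + 1))

n, P = 8, 0.9
assert np.isclose(W_block(n, P), n * h2(P))   # W = n * H2(P)
```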
Error Correcting Code (8.4)

Triplicate each bit and decode by majority vote:

    uncoded bit → encode (×3) → noisy channel → decode (majority) → coded bit

Think of the whole pipeline as a single channel with matrix

    [ P^3 + 3P^2Q   3PQ^2 + Q^3 ]
    [ 3PQ^2 + Q^3   P^3 + 3P^2Q ]

Let P′ = P^3 + 3P^2Q = P^2 (P + 3Q): the result is again a binary symmetric channel, with P replaced by P′.
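A small sketch confirming the algebra and showing the improvement the triplicate code buys (illustrative values only):

```python
def p_prime(P):
    """Probability that the majority vote over 3 independent uses is correct."""
    Q = 1 - P
    return P**3 + 3 * P**2 * Q        # same as P**2 * (P + 3*Q)

for P in (0.9, 0.99):
    print(P, p_prime(P))   # 0.9 -> 0.972, 0.99 -> 0.999702
```

Three transmitted bits per data bit reduce the per-bit error rate from 1 − P to 1 − P′, e.g. from 0.1 to 0.028.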
Shannon’s Theorem (8.4) will say that, as the block length n (here n = 3) tends to infinity, there are codes that take P′ → 1 while C(P′)/n → C(P).
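To see why a theorem is needed, here is a sketch (a hypothetical extension of the slide's n = 3 example, reusing comb and h2 from above) of repetition codes of odd length n: P′ does go to 1, but the capacity per transmitted bit C(P′)/n goes to 0, so something better than repetition must exist to keep the rate near C(P).

```python
def p_prime_n(n, P):
    """Majority vote over n (odd) uses: probability a strict majority is correct."""
    Q = 1 - P
    return sum(comb(n, k) * P**k * Q**(n - k) for k in range(n // 2 + 1, n + 1))

P = 0.9
for n in (1, 3, 5, 7, 9):
    Pp = p_prime_n(n, P)
    print(n, round(Pp, 6), round((1 - h2(Pp)) / n, 4))   # P' rises toward 1, C(P')/n falls
```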