Chapter 7 The Channel and Mutual Information
Information Through a Channel

A = {a1, ..., aq} is the alphabet of symbols sent; B = {b1, ..., bs} is the alphabet of symbols received. Symbols can't be swallowed: every symbol sent is received as some symbol, possibly randomly changed. For example, in an error-correcting code over a noisy channel, s ≥ q; if two symbols sent are indistinguishable when received, s < q.

Characterize a stationary channel by a q × s matrix of conditional probabilities, with rows indexed by the symbol sent and columns by the symbol received:

    P = [Pi,j],   Pi,j = P(bj | ai)

(7.1, 7.2, 7.3)
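As a concrete illustration (not from the text), such a channel matrix can be held as a row-stochastic NumPy array; the alphabet sizes and probabilities below are made-up values.

    import numpy as np

    # Hypothetical channel: q = 2 symbols sent, s = 3 symbols received.
    # Row i holds P(bj | ai); each row sums to 1 (symbols can't be swallowed).
    P = np.array([
        [0.8, 0.1, 0.1],   # P(bj | a1)
        [0.1, 0.2, 0.7],   # P(bj | a2)
    ])
    assert np.allclose(P.sum(axis=1), 1.0)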
For p(ai) = probability of source symbol ai, the probabilities p(bj) of the received symbols are

    [p(a1) ... p(aq)] P = [p(b1) ... p(bs)]

No noise: P = I, so p(bj) = p(aj). All noise: Pi,j = 1/s, so p(bj) = 1/s.

The probability that ai was sent and bj was received is the joint probability, and by Bayes' theorem

    P(ai, bj) = p(ai) · P(bj | ai) = p(bj) · P(ai | bj)

So if p(bj) ≠ 0, the backward conditional probabilities are

    P(ai | bj) = p(ai) · P(bj | ai) / p(bj)

(7.1, 7.2, 7.3)
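A minimal sketch of these relations, assuming NumPy and the same made-up channel as above; p_a is an invented source distribution.

    import numpy as np

    P = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.2, 0.7]])   # P[i, j] = P(bj | ai)
    p_a = np.array([0.6, 0.4])        # p(ai), made-up values

    p_b = p_a @ P                     # [p(b1) ... p(bs)] = [p(a1) ... p(aq)] P
    joint = p_a[:, None] * P          # P(ai, bj) = p(ai) * P(bj | ai)
    backward = joint / p_b            # P(ai | bj) = p(ai) * P(bj | ai) / p(bj)

    assert np.allclose(backward.sum(axis=0), 1.0)   # each column is a distribution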
Binary Symmetric Channel

Two symbols sent (a = 0, a = 1) and two received (b = 0, b = 1), with p(a = 0) = p and p(a = 1) = 1 − p. The channel matrix entries are P0,0 = P1,1 = P and P0,1 = P1,0 = Q (so P + Q = 1):

    [p(b=0)  p(b=1)] = [p  1−p] [ P  Q ]
                                [ Q  P ]
                     = [p·P + (1−p)·Q    p·Q + (1−p)·P]

(7.4)
Special cases: P = 1, Q = 0 is the noiseless channel; P = Q = ½ is all noise.

If p = 1 − p = ½ (equiprobable inputs), then p(b = 0) = p(b = 1) = ½ and the backward probabilities equal the forward ones:

    P(a = 0 | b = 0) = P(a = 1 | b = 1) = P
    P(a = 1 | b = 0) = P(a = 0 | b = 1) = Q

(7.4)
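To check the equiprobable case numerically, here is a small sketch assuming NumPy and an invented crossover probability Q = 0.1 (so P = 0.9).

    import numpy as np

    P_, Q_ = 0.9, 0.1                          # made-up P and Q, with P + Q = 1
    channel = np.array([[P_, Q_],
                        [Q_, P_]])             # binary symmetric channel matrix
    p_a = np.array([0.5, 0.5])                 # equiprobable inputs: p = 1 - p = 1/2

    p_b = p_a @ channel                        # output distribution, also (1/2, 1/2)
    backward = (p_a[:, None] * channel) / p_b  # P(a | b) by Bayes' theorem

    # With equiprobable inputs the backward matrix equals the forward one.
    assert np.allclose(backward, channel)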
System Entropies

H(A) is the input entropy and H(B) the output entropy. Conditioning on a particular received symbol bj,

    H(A | bj) = Σi P(ai | bj) · log 1/P(ai | bj)

and averaging over all bj gives the information loss in the channel, called the "equivocation" (also called "noise entropy"):

    H(A | B) = Σj p(bj) · H(A | bj) = Σi,j P(ai, bj) · log 1/P(ai | bj)

Similarly:

    H(B | A) = Σi,j P(ai, bj) · log 1/P(bj | ai)

(7.5)
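A sketch of the equivocation computation in bits (base-2 logs), reusing the made-up channel and source distribution from above; all probabilities are nonzero, so the logarithms are safe.

    import numpy as np

    P = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.2, 0.7]])   # P(bj | ai)
    p_a = np.array([0.6, 0.4])
    p_b = p_a @ P
    joint = p_a[:, None] * P          # P(ai, bj)
    backward = joint / p_b            # P(ai | bj)

    # Equivocation H(A|B) = sum_{i,j} P(ai, bj) * log2 1/P(ai | bj)
    H_A_given_B = np.sum(joint * np.log2(1.0 / backward))
    # The other conditional entropy H(B|A) = sum_{i,j} P(ai, bj) * log2 1/P(bj | ai)
    H_B_given_A = np.sum(joint * np.log2(1.0 / P))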
Joint Entropy H(A, B)

Intuition: take snapshots of the pair (A, B) as a single system. Define:

    H(A, B) = Σi,j P(ai, bj) · log 1/P(ai, bj)

Then

    H(A, B) = H(A) + H(B | A) = H(B) + H(A | B)

(7.5)
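The two decompositions can be verified numerically; this sketch assumes NumPy, base-2 logs, and the same made-up distributions as before.

    import numpy as np

    def H(dist):
        """Entropy in bits of an array of probabilities (all assumed > 0)."""
        dist = np.asarray(dist)
        return float(np.sum(dist * np.log2(1.0 / dist)))

    P = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.2, 0.7]])
    p_a = np.array([0.6, 0.4])
    joint = p_a[:, None] * P                            # P(ai, bj)
    p_b = joint.sum(axis=0)

    H_B_given_A = np.sum(joint * np.log2(1.0 / P))      # H(B | A)
    H_A_given_B = np.sum(joint * np.log2(p_b / joint))  # H(A | B)

    assert np.isclose(H(joint), H(p_a) + H_B_given_A)   # H(A,B) = H(A) + H(B|A)
    assert np.isclose(H(joint), H(p_b) + H_A_given_B)   # H(A,B) = H(B) + H(A|B)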
Mutual Information

Before the channel, the a priori probability of ai is p(ai); after receiving bj, the a posteriori probability is P(ai | bj). The information gain upon receiving bj is

    I(ai ; bj) = I(ai) − I(ai | bj) = log P(ai | bj) / p(ai)

By symmetry of the joint probability, I(ai ; bj) = log P(bj | ai) / p(bj) = I(bj ; ai): it is the amount of information ai and bj share.

If ai and bj are independent (all noise), then P(ai, bj) = p(ai) · p(bj), hence P(ai | bj) = p(ai) and I(ai ; bj) = 0: no information is gained in the channel.

(7.6)
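A small numerical sketch of the per-symbol quantity I(ai ; bj) and its symmetry, again with the invented channel and NumPy; individual values can be negative, though their average (below) cannot.

    import numpy as np

    P = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.2, 0.7]])
    p_a = np.array([0.6, 0.4])
    p_b = p_a @ P
    backward = (p_a[:, None] * P) / p_b              # P(ai | bj)

    I_pointwise = np.log2(backward / p_a[:, None])   # I(ai ; bj) = log2 P(ai|bj)/p(ai)

    # Symmetry: log2 P(ai | bj)/p(ai) == log2 P(bj | ai)/p(bj)
    assert np.allclose(I_pointwise, np.log2(P / p_b))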
Averaging over all ai and bj:

    I(A ; B) = Σi,j P(ai, bj) · log P(ai, bj) / (p(ai) · p(bj))
             = H(A) + H(B) − H(A, B)

By the Gibbs inequality, I(A ; B) ≥ 0, with equality only if P(ai, bj) = p(ai) · p(bj) for all i, j [independence]. Hence

    H(A, B) ≤ H(A) + H(B)

Since H(A, B) = H(A) + H(B | A) = H(B) + H(A | B), we also have

    I(A ; B) = H(A) − H(A | B) = H(B) − H(B | A) ≥ 0

so H(A | B) ≤ H(A) and H(B | A) ≤ H(B).

(7.6)
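To see these identities concretely, here is a sketch (NumPy, bits, the same made-up distributions) that computes I(A ; B) directly and checks it against H(A) + H(B) − H(A, B) and the Gibbs bound.

    import numpy as np

    def H(dist):
        """Entropy in bits (all probabilities assumed > 0)."""
        dist = np.asarray(dist)
        return float(np.sum(dist * np.log2(1.0 / dist)))

    P = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.2, 0.7]])
    p_a = np.array([0.6, 0.4])
    joint = p_a[:, None] * P                  # P(ai, bj)
    p_b = joint.sum(axis=0)

    I_AB = np.sum(joint * np.log2(joint / np.outer(p_a, p_b)))

    assert np.isclose(I_AB, H(p_a) + H(p_b) - H(joint))  # I(A;B) = H(A)+H(B)-H(A,B)
    assert I_AB >= -1e-12                                 # Gibbs: I(A;B) >= 0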