Information theory: Multi-user information theory
Part 3: Slepian-Wolf Source Coding
A.J. Han Vinck, Essen, 2002
content • Source coding for dependent sources that are separated
Goal of the lecture: • Explain the idea of independent source coding for dependent sources
Problem explanation:

[Figure: X and Y are encoded separately; a single decoder reconstructs the pair (X,Y)]

Independent encoding of X: we use nH(X) bits
Independent encoding of Y: we use nH(Y) bits
Total: n[H(X) + H(Y)] ≥ nH(X,Y) = n[H(X) + H(Y|X)]
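As a sanity check, a minimal worked example (my own illustration, not from the slides) for the standard binary case Y = X ⊕ N with X uniform and N ~ Bernoulli(p):

```python
from math import log2

# Binary example (own illustration): X ~ uniform, Y = X xor N,
# N ~ Bernoulli(p). Then H(X) = H(Y) = 1 and H(Y|X) = h(p).
def h(p):
    """Binary entropy function h(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p*log2(p) - (1-p)*log2(1-p)

p = 0.1
naive = 1 + 1        # H(X) + H(Y): X and Y encoded independently
joint = 1 + h(p)     # H(X,Y) = H(X) + H(Y|X)
print(f"naive: {naive:.3f} bits/pair, joint: {joint:.3f} bits/pair")
# -> naive: 2.000 bits/pair, joint: 1.469 bits/pair for p = 0.1
```

The gap 1 − h(p) per symbol pair is exactly what Slepian-Wolf coding promises to save without any communication between the two encoders.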
realization

[Figure: X is compressed to nH(X) bits and decompressed at the receiver; Y passes through a syndrome former that outputs YH^T, i.e. n−k ≈ nh(p) bits]

Y = X ⊕ N, so YH^T = XH^T ⊕ NH^T. The decoder, knowing X, computes NH^T = XH^T ⊕ YH^T, decodes the noise sequence N, and recovers Y = X ⊕ N.

Total rate: nH(X) + nH(Y|X) = nH(X) + nh(p)
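A minimal sketch of this syndrome former (my own toy example, assuming the Hamming (7,4) code, so n−k = 3 and at most one flip between X and Y):

```python
import numpy as np

# Parity-check matrix of the systematic Hamming (7,4) code: n = 7,
# n - k = 3, corrects a single error (toy stand-in for nh(p) bits).
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def syndrome(v):
    return H.dot(v) % 2

def decode_single_error(s):
    """Weight-<=1 pattern with syndrome s (s matches a column of H)."""
    e = np.zeros(7, dtype=int)
    if s.any():
        j = next(i for i in range(7) if np.array_equal(H[:, i], s))
        e[j] = 1
    return e

X = np.array([0, 1, 1, 0, 1, 0, 1])
N = np.array([0, 0, 0, 1, 0, 0, 0])   # noise: at most one flip
Y = X ^ N

# Syndrome former: the Y-encoder transmits only the 3 bits YH^T
sY = syndrome(Y)

# Decoder: knows X, so NH^T = XH^T xor YH^T; decode N, then Y = X xor N
sN = (syndrome(X) + sY) % 2
Y_hat = X ^ decode_single_error(sN)
assert (Y_hat == Y).all()
```

So Y is conveyed with 3 bits instead of 7, matching the n−k bits produced by the syndrome former in the figure.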
Can we do with less?

Generate 2^{nH(X)} "typical" X sequences → decoder needs only nH(X) bits to determine X
Generate 2^{nH(Y)} "typical" Y sequences → how does Y operate?
intermezzo

for: N ≈ 2^{n[H(Y|X)+ε]} different colors
we do: M ≈ 2^{nH(Y|X)} random selections

Then: Probability( the correct sequence keeps a unique color among the M drawings )
≈ (1 − 1/N)^{M−1} ≈ 1 − M/N → 1 for M/N → 0; N large
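A quick Monte-Carlo check of the intermezzo (toy sizes of my own choosing stand in for N ≈ 2^{n[H(Y|X)+ε]} and M ≈ 2^{nH(Y|X)}):

```python
import random

# Estimate the chance that a fixed drawing shares its color with any of
# the other M-1 drawings; the union bound gives roughly M/N.
N, M, trials = 2**16, 2**10, 1000
collisions = 0
for _ in range(trials):
    target = random.randrange(N)          # color of the fixed drawing
    if any(random.randrange(N) == target for _ in range(M - 1)):
        collisions += 1
print(f"P(collision) ~ {collisions/trials:.4f}, bound M/N = {M/N:.4f}")
```

Since M/N = 2^{−nε}, the collision probability vanishes exponentially in n.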
Coding for Y

Y generates 2^{nH(Y)} typical sequences: every sequence gets one of 2^{n[H(Y|X)+ε]} colors
The decoder knows "everything" about X, Y, and the coloring
decoding

1) Decode X from the nH(X) received bits
2) Find the ≈ 2^{nH(Y|X)} typical Y sequences for the particular X
3) Use the color to find Y from the n[H(Y|X)+ε] bits
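A toy end-to-end sketch of steps 2) and 3) (my own construction: a Hamming ball around X stands in for the conditionally typical set, and a shared random coloring plays the role of the 2^{n[H(Y|X)+ε]} bins):

```python
import itertools, random

# Toy binning decoder (own construction): n = 10, Y = X xor N, wt(N) <= t.
n, t = 10, 1
random.seed(1)

def ball(x, t):
    """Sequences within Hamming distance t of x: stand-in for the
    ~2^{nH(Y|X)} conditionally typical Y sequences for this X."""
    out = [tuple(x)]
    for pos in itertools.combinations(range(n), t):
        y = list(x)
        for p in pos:
            y[p] ^= 1
        out.append(tuple(y))
    return out

num_colors = 1024                      # ~2^{n[H(Y|X)+eps]} colors
color = {y: random.randrange(num_colors)            # the coloring is known
         for y in itertools.product((0, 1), repeat=n)}  # to both sides

X = tuple(random.randrange(2) for _ in range(n))
Y = list(X); Y[3] ^= 1; Y = tuple(Y)   # Y differs from X in one position

sent = color[Y]                        # the Y-encoder sends only the color
candidates = [y for y in ball(X, t) if color[y] == sent]

assert Y in candidates                 # the true Y always survives
print("unique decoding:", candidates == [Y])   # True with high probability
```

With 11 candidates and 1024 colors, the chance of an ambiguous color is about 1%, mirroring the M/N bound from the intermezzo.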
Result: Sum rate: nH(X) + n[H(Y|X)+ε] ≈ nH(X,Y) for ε small and n large
Homework: formalize the proof
alternative

For linear systematic codes: H = [ H1 , H2 ] = [ H1 , I_{n-k} ]
k ≤ n − n·h(m/n) (Hamming bound), m = number of correctable errors
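A quick numeric look at this bound (my own illustration): the exact Hamming bound k ≤ n − log2 Σ_{i≤m} C(n,i) next to the entropy approximation n − n·h(m/n) used here, which becomes tight for large n.

```python
from math import comb, log2

# Hamming bound: 2^(n-k) >= sum_{i<=m} C(n,i), i.e. k <= n - log2(V(n,m));
# for large n, log2(V(n,m)) ~ n*h(m/n), the entropy form on the slide.
def h(p):
    return 0.0 if p in (0.0, 1.0) else -p*log2(p) - (1-p)*log2(1-p)

for n, m in [(7, 1), (1023, 100)]:
    V = sum(comb(n, i) for i in range(m + 1))
    print(f"n={n:4d}, m={m:3d}: k <= {n - log2(V):7.1f} (exact), "
          f"k <= {n - n*h(m/n):7.1f} (entropy approx.)")
```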
General Transmission scheme: assume A and B differ in m positions

A = ( a1 , a2 , a3 ) and B = ( b1 , b2 , b3 ), with parts of length k1, k2, n−k (k1 + k2 = k)

Transmitter: X = ( a1 , HA^T ) and Y = ( b2 , HB^T ) → 2n − k bits in total

Receiver: S = H( A ⊕ B )^T = HA^T ⊕ HB^T; decoding S gives the difference pattern ( e1 , e2 , e3 ) = A ⊕ B
From ( e1 , e2 ) = ( a1 ⊕ b1 , a2 ⊕ b2 ) → a2 , b1
With H = [ H1 , I_{n-k} ]: a3 = H1( a1 , a2 )^T ⊕ HA^T and b3 = H1( b1 , b2 )^T ⊕ HB^T
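A runnable sketch of the whole scheme (my own toy instance, assuming the systematic Hamming (7,4) code, so n = 7, k = 4, split as k1 = k2 = 2, and m = 1 differing position):

```python
import numpy as np

# Systematic Hamming (7,4): H = [H1, I3], corrects m = 1 error.
H1 = np.array([[1, 1, 0, 1],
               [1, 0, 1, 1],
               [0, 1, 1, 1]])
H = np.hstack([H1, np.eye(3, dtype=int)])

def syndrome(v):
    return H.dot(v) % 2

def decode_error(s):
    """Weight-<=1 error pattern whose syndrome is s."""
    e = np.zeros(7, dtype=int)
    if s.any():
        j = next(i for i in range(7) if np.array_equal(H[:, i], s))
        e[j] = 1
    return e

# Two sequences that differ in m = 1 position
A = np.array([1, 0, 1, 1, 0, 0, 1])
B = A.copy(); B[2] ^= 1

k1, k2 = 2, 2
# Transmitter: X = (a1, HA^T), Y = (b2, HB^T)  ->  2n - k = 10 bits
a1, sA = A[:k1], syndrome(A)
b2, sB = B[k1:k1 + k2], syndrome(B)

# Receiver: S = H(A xor B)^T = HA^T xor HB^T  ->  difference pattern E
S = (sA + sB) % 2
E = decode_error(S)

# Missing information parts: b1 = a1 xor e1, a2 = b2 xor e2
b1 = (a1 + E[:k1]) % 2
a2 = (b2 + E[k1:k1 + k2]) % 2

# Remaining parts via the systematic form: a3 = H1(a1,a2)^T xor HA^T
a3 = (H1.dot(np.concatenate([a1, a2])) + sA) % 2
b3 = (H1.dot(np.concatenate([b1, b2])) + sB) % 2

assert (np.concatenate([a1, a2, a3]) == A).all()
assert (np.concatenate([b1, b2, b3]) == B).all()
print("A and B recovered from 2n - k = 10 bits instead of 2n = 14")
```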
Efficiency:
• Entropy: H(A) + H(B|A) = n + n·h(m/n)
• Transmitted: 2n − k = n + (n − k) ≥ n + n·h(m/n)
Optimal if we have optimal codes!