Zero-error source-channel coding with source side information at the decoder
J. Nayak, E. Tuncel and K. Rose
University of California, Santa Barbara
Outline
• The problem
• Asymptotically vanishing error case
• Zero error
  • Unrestricted input
  • Restricted input
• How large are the gains?
• Conclusions
The Problem
[Figure: two block diagrams over the channel p(y|x) — separate coding (source encoder: U^n → i, channel encoder: i → X^n, channel decoder: Y^n → î, source decoder: (î, V^n) → Û^n) versus a joint source-channel encoder U^n → X^n and decoder (Y^n, V^n) → Û^n; the side information V^n is available only at the decoder.]
• Is separate source and channel coding optimal?
• Does an encoder-decoder pair exist?
Asymptotically Vanishing Probability of Error
• Source coding: R > H(U|V) (Slepian-Wolf code)
• Channel coding: R < C
• Source-channel code (Shamai et al.)
• Communication not possible if H(U|V) > C
• Separate source and channel coding is asymptotically optimal
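As a quick numerical companion to the separation condition above, the sketch below computes H(U|V) from a joint pmf so it can be compared against a channel capacity C. The joint distribution is made up for illustration and is not from the talk.

```python
import numpy as np

# Illustrative joint pmf p(u, v): rows index u, columns index v (made-up numbers).
p_uv = np.array([[0.25, 0.05],
                 [0.05, 0.25],
                 [0.10, 0.30]])

p_v = p_uv.sum(axis=0)                                    # marginal p(v)
H_UV = -np.sum(p_uv[p_uv > 0] * np.log2(p_uv[p_uv > 0]))  # joint entropy H(U,V)
H_V = -np.sum(p_v[p_v > 0] * np.log2(p_v[p_v > 0]))       # marginal entropy H(V)
H_U_given_V = H_UV - H_V                                   # chain rule: H(U|V) = H(U,V) - H(V)

print(f"H(U|V) = {H_U_given_V:.3f} bits")
# With vanishing error, reliable communication requires H(U|V) < C.
```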
Channel
• Channel transition probability p(y|x), y ∈ Y, x ∈ X
• Characteristic graph of the channel: G_X = (X, E_X), where two inputs x1, x2 are connected if they are confusable, i.e., some output y has p(y|x1) > 0 and p(y|x2) > 0
• Examples
  • Noiseless channel: edge-free graph
  • Conventional channel: complete graph
Channel Code
• Code = symbols from an independent set of G_X
• 1-use capacity = log2 α(G_X), where α is the independence number
• n uses of the channel
  • Graph = G_X^n, the n-fold AND product of G_X
• Zero-error capacity: C(G_X) = lim_n (1/n) log2 α(G_X^n); depends only on the characteristic graph G_X
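To make these objects concrete, here is a small sketch (Python with networkx, assumed available) that builds the characteristic graph from a transition matrix p(y|x), forms the AND (strong) product for two channel uses, and uses the independence number to lower-bound the zero-error capacity. The helper names and the typewriter-style channel are illustrative, not from the talk.

```python
import math
from itertools import combinations
import networkx as nx

def characteristic_graph(p_y_given_x):
    """Connect inputs x1, x2 that are confusable: some y has p(y|x1) > 0 and p(y|x2) > 0."""
    n = len(p_y_given_x)
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for x1, x2 in combinations(range(n), 2):
        if any(p_y_given_x[x1][y] > 0 and p_y_given_x[x2][y] > 0
               for y in range(len(p_y_given_x[0]))):
            G.add_edge(x1, x2)
    return G

def independence_number(G):
    """alpha(G) = size of a largest clique in the complement (fine for small graphs)."""
    return max(len(c) for c in nx.find_cliques(nx.complement(G)))

# Typewriter-style channel: input x can produce output x or x+1 (mod 5).
p = [[0.5 if y in (x, (x + 1) % 5) else 0.0 for y in range(5)] for x in range(5)]
G_X = characteristic_graph(p)            # this is the 5-cycle C5
print(independence_number(G_X))          # alpha = 2, so 1-use rate = log2(2) = 1 bit

G_X2 = nx.strong_product(G_X, G_X)       # AND product = 2 channel uses
a2 = independence_number(G_X2)           # alpha = 5
print(0.5 * math.log2(a2))               # (1/2) log2(5) ~ 1.16 bits <= C(G_X)
```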
Source With Side Information
• (U, V) ∈ U × V ~ p(u, v)
• Support set S_UV = {(u, v) ∈ U × V : p(u, v) > 0}
• Confusability graph on U: G_U = (U, E_U), where u1 and u2 are connected if some v has (u1, v) ∈ S_UV and (u2, v) ∈ S_UV
• Examples
  • U = V: edge-free graph
  • U, V independent: complete graph
Source Code
• Rate depends only on G_U
• Connected nodes cannot receive the same codeword, so encoding = coloring of G_U
• Rate = log2 χ(G_U), where χ is the chromatic number
• Two cases
  • Unrestricted inputs
  • Restricted inputs
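A matching sketch for the source side (again Python/networkx, with an illustrative support set): build the confusability graph G_U from the support of p(u, v) and read off a scalar code from a greedy coloring. The number of colors used upper-bounds χ(G_U), so log2 of that count is an achievable scalar rate; the function names are mine.

```python
import math
from itertools import combinations
import networkx as nx

def confusability_graph(support):
    """Connect u1, u2 that share a possible side-information value v."""
    us = sorted({u for u, _ in support})
    G = nx.Graph()
    G.add_nodes_from(us)
    for u1, u2 in combinations(us, 2):
        if any((u1, v) in support and (u2, v) in support for _, v in support):
            G.add_edge(u1, u2)
    return G

# Illustrative support set S_UV (pairs (u, v) with p(u, v) > 0).
support = {(0, 'a'), (1, 'a'), (1, 'b'), (2, 'b'), (2, 'c'), (3, 'c'), (3, 'a')}

G_U = confusability_graph(support)
coloring = nx.coloring.greedy_color(G_U, strategy='largest_first')
num_colors = len(set(coloring.values()))
print(coloring)                # codeword (color) assigned to each u
print(math.log2(num_colors))   # achievable scalar rate (optimal scalar rate is log2 chi(G_U))
```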
Unrestricted Input
• (u, v) not necessarily in S_UV
• Decoder required to be correct only if (u, v) ∈ S_UV
• 1-instance rate: log2 χ(G_U)
• n source instances
  • Graph = G_U^(n), the n-fold OR product of G_U
• Asymptotic rate for UI codes: R_UI(G_U) = lim_n (1/n) log2 χ(G_U^(n))
Restricted Input
• (u, v) always in S_UV
• 1-instance rate: log2 χ(G_U)
• n source instances
  • Graph = G_U^n, the n-fold AND product of G_U
• Asymptotic rate = Witsenhausen rate of the source graph: R_W(G_U) = lim_n (1/n) log2 χ(G_U^n)
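Since the two source-coding problems differ only in which graph power they use, here is a sketch (Python/networkx) of the OR and AND products built directly from their definitions, together with a check of the identity that connects them: the complement of the AND product is the OR product of the complements. The random graphs and function names are purely illustrative.

```python
from itertools import combinations, product
import networkx as nx

def and_product(G, H):
    """AND (strong) product: distinct pairs adjacent iff each coordinate is equal or adjacent."""
    P = nx.Graph()
    P.add_nodes_from(product(G.nodes, H.nodes))
    for (g1, h1), (g2, h2) in combinations(list(P.nodes), 2):
        ok_g = g1 == g2 or G.has_edge(g1, g2)
        ok_h = h1 == h2 or H.has_edge(h1, h2)
        if ok_g and ok_h:
            P.add_edge((g1, h1), (g2, h2))
    return P

def or_product(G, H):
    """OR (co-normal) product: pairs adjacent iff adjacent in at least one coordinate."""
    P = nx.Graph()
    P.add_nodes_from(product(G.nodes, H.nodes))
    for (g1, h1), (g2, h2) in combinations(list(P.nodes), 2):
        if G.has_edge(g1, g2) or H.has_edge(h1, h2):
            P.add_edge((g1, h1), (g2, h2))
    return P

# Sanity check of complement(G AND H) == complement(G) OR complement(H),
# an identity relating the two products through complementation (relevant when a
# source graph is paired with the complement of a channel graph).
G = nx.gnp_random_graph(5, 0.5, seed=1)
H = nx.gnp_random_graph(4, 0.5, seed=2)
lhs = nx.complement(and_product(G, H))
rhs = or_product(nx.complement(G), nx.complement(H))
print(set(map(frozenset, lhs.edges())) == set(map(frozenset, rhs.edges())))  # True
```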
Source-Channel Coding
• 1 source instance - 1 channel use code
• Encoder: sc1 : U → X
• Decoder: (y, v) ↦ û
• If u1 and u2 are not distinguishable given the side information, sc1(u1) and sc1(u2) should not be able to produce the same output y
• Equivalently: u1 and u2 connected in G_U ⇒ sc1(u1) and sc1(u2) are not connected in G_X and sc1(u1) ≠ sc1(u2)
Source-Channel Coding
• If an n-n UI (RI) code exists for some n, (G_U, G_X) is a UI (RI) compatible pair
• (G, Ḡ), a graph paired with its complement, is always both a UI and an RI compatible pair: the identity map is a valid 1-1 code
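The compatibility condition above is easy to check mechanically. Below is a small sketch (Python/networkx) of a validity test for a candidate scalar encoder, used to confirm that the identity map is a valid 1-1 code whenever G_X is the complement of G_U; the pentagon is used as the illustration and the function name is mine.

```python
import networkx as nx

def is_valid_scalar_code(G_U, G_X, enc):
    """enc: dict u -> x. Valid iff every edge of G_U maps to two distinct,
    non-adjacent channel inputs (a non-edge of G_X)."""
    for u1, u2 in G_U.edges():
        x1, x2 = enc[u1], enc[u2]
        if x1 == x2 or G_X.has_edge(x1, x2):
            return False
    return True

G_U = nx.cycle_graph(5)          # pentagon source graph
G_X = nx.complement(G_U)         # channel graph = complement of the source graph
identity = {u: u for u in G_U.nodes}
print(is_valid_scalar_code(G_U, G_X, identity))   # True: (G, complement of G) is compatible
```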
Unrestricted Input
[Figure: source graph G_U = pentagon on {a, b, c, d, e}; channel graph G_X = pentagon on {A, B, C, D, E}, the complement of G_U under the labeling shown.]
• Source = complement of channel: G_U = C5, G_X = complement of C5
• C(G_X) = log2 √5
• R_UI(G_U) = log2 (5/2) > C(G_X)
• So separate coding fails, yet the pair is UI compatible and a joint code exists
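The 5/2 quoted above is the fractional chromatic number of the pentagon, which the asymptotic OR-product rate is known to equal. The sketch below (Python with scipy, assumed available) computes χ_f(C5) by the standard covering LP over independent sets; it is an illustrative check, not the talk's own computation.

```python
from itertools import combinations
import networkx as nx
import numpy as np
from scipy.optimize import linprog

G = nx.cycle_graph(5)            # pentagon source graph G_U
nodes = list(G.nodes)

def independent_sets(G):
    """All nonempty independent sets (fine for a 5-vertex graph)."""
    for r in range(1, len(nodes) + 1):
        for S in combinations(nodes, r):
            if not any(G.has_edge(a, b) for a, b in combinations(S, 2)):
                yield S

sets = list(independent_sets(G))

# LP: minimize sum_S x_S  s.t.  sum_{S containing v} x_S >= 1 for every v, x >= 0.
A = np.array([[-1.0 if v in S else 0.0 for S in sets] for v in nodes])
b = -np.ones(len(nodes))
c = np.ones(len(sets))
res = linprog(c, A_ub=A, b_ub=b, bounds=(0, None))
print(res.fun)                   # 2.5 = chi_f(C5), so R_UI = log2(2.5) > log2(sqrt(5)) = C(G_X)
```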
Restricted Input
• Previous example not useful: for the pentagon, R_W(G_U) = log2 √5 = C(G_X)
• Take the source graph G_U to be the complement of the channel graph G_X
• Approach: find f such that C(G_X) ≤ f(G_X) ≤ R_W(G_U)
• If either inequality is strict, done!
• Lovász theta function: ϑ(G), computable as a semidefinite program
• Lovász: C(G) ≤ log2 ϑ(G)
• Key result: log2 ϑ(G_X) ≤ R_W(G_U) when G_U is the complement of G_X
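ϑ(G) can be computed numerically. A minimal sketch, assuming cvxpy is available, using one standard SDP formulation of the Lovász number; for the pentagon it returns √5 ≈ 2.236, matching C(C5) = log2 √5. The helper name is mine.

```python
import cvxpy as cp
import networkx as nx
import numpy as np

def lovasz_theta(G):
    """One standard SDP for the Lovasz number:
    maximize <J, X>  s.t.  X PSD, trace(X) = 1, X_ij = 0 for every edge ij of G."""
    n = G.number_of_nodes()
    idx = {v: k for k, v in enumerate(G.nodes)}
    X = cp.Variable((n, n), symmetric=True)
    constraints = [X >> 0, cp.trace(X) == 1]
    constraints += [X[idx[u], idx[v]] == 0 for u, v in G.edges()]
    prob = cp.Problem(cp.Maximize(cp.sum(X)), constraints)
    prob.solve()
    return prob.value

theta_c5 = lovasz_theta(nx.cycle_graph(5))
print(theta_c5)            # ~2.236 = sqrt(5)
print(np.log2(theta_c5))   # ~1.16 bits
```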
Restricted Input
• G_U = Schläfli graph (a 27-vertex graph), G_X = complement of G_U
• Haemers: C(G_X) < log2 ϑ(G_X) for this graph, hence C(G_X) < R_W(G_U)
• A joint code exists since G_U is the complement of G_X (compatible pair)
How Large Are the Gains?
• Measure: channel uses per source symbol
• Alon '90: there exist graphs such that C(G) < log k while the matching source rate is much larger
• Given any l, there exist graphs G for which separate coding needs more than l times as many channel uses per source symbol as joint coding
Conclusions
• Under a zero-error constraint, separate source and channel coding is asymptotically suboptimal.
• This is not the case under an asymptotically vanishing probability of error.
• In the zero-error case, the gains from joint coding can be arbitrarily large.
Scalar Code Design Complexity
• Instance: source graph G
• Question: does a scalar source-channel code from G to the channel H exist?
• Equivalent to the graph homomorphism problem from G into H
• NP-complete for all non-bipartite H (Hell & Nešetřil '90)
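For intuition about the design problem, the sketch below does the naive exponential search: it decides whether a graph homomorphism from G into H exists by trying every map, which is only feasible for tiny instances; this brute force is exactly what NP-completeness says cannot be avoided in general. The graphs shown are illustrative.

```python
from itertools import product
import networkx as nx

def has_homomorphism(G, H):
    """Brute force: does a map f exist with f(u)f(v) an edge of H for every edge uv of G?"""
    g_nodes, h_nodes = list(G.nodes), list(H.nodes)
    for values in product(h_nodes, repeat=len(g_nodes)):
        f = dict(zip(g_nodes, values))
        if all(H.has_edge(f[u], f[v]) for u, v in G.edges()):
            return True
    return False

# Example: C5 -> K3 exists (a 5-cycle is 3-colorable), C5 -> K2 does not (odd cycle).
print(has_homomorphism(nx.cycle_graph(5), nx.complete_graph(3)))  # True
print(has_homomorphism(nx.cycle_graph(5), nx.complete_graph(2)))  # False
```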
Future Work
• Do UI compatible pairs (G_U, G_X) exist with R_W(G_U) < C(G_X) < R_UI(G_U)?
• For what classes of graphs is separate coding optimal?