Compression with Side Information using Turbo Codes Anne Aaron and Bernd Girod Information Systems Laboratory, Stanford University Data Compression Conference, April 3, 2002
Overview • Introduction • Turbo Coder and Decoder • Compression of Binary Sequences • Extension to Continuous-Valued Sequences • Joint Source-Channel Coding • Conclusion
Slepian-Wolf Theorem: Distributed Source Coding [Block diagram: two statistically dependent sources are compressed by separate encoders; a single joint decoder reconstructs both.]
Research Problem • Motivation: the Slepian-Wolf theorem states that statistically dependent signals can be compressed in a distributed manner to the same total rate as a system that compresses them jointly • Objective: design practical codes that achieve compression close to the Slepian-Wolf bound
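For reference, the rate region the theorem establishes (the standard statement, not spelled out on the slide) is:

```latex
% Slepian-Wolf rate region: X and Y encoded separately, decoded jointly.
R_X \geq H(X \mid Y), \qquad
R_Y \geq H(Y \mid X), \qquad
R_X + R_Y \geq H(X, Y)
```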
Asymmetric Scenario – Compression with Side Information [Block diagram: Y is conventionally encoded and decoded; X is encoded separately, and the decoder uses the statistically dependent Y as side information.] • Compression techniques that send Y at a rate close to H(Y) are well known • Some form of switching between the roles of X and Y can yield more symmetric rates
Our Approach: Turbo Codes • Turbo codes: developed for channel coding; perform close to the Shannon channel capacity limit (Berrou et al., 1993) • Similar work: Garcia-Frias and Zhao, 2001 (Univ. of Delaware); Bajcsy and Mitran, 2001 (McGill Univ.)
System Set-up [Block diagram: X is turbo encoded; the decoder reconstructs it using the statistically dependent Y as side information.] • X and Y are i.i.d. binary sequences X_1 X_2 … X_L and Y_1 Y_2 … Y_L with equally probable ones and zeros. X_i is independent of Y_j for i ≠ j, but dependent on Y_i. The X-Y dependency is described by the conditional pmf P(x|y). • Y is sent at rate R_Y ≥ H(Y) and is available as side information at the decoder
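A minimal sketch of this source model (function and variable names are mine, for illustration): Y is drawn uniformly, and X differs from Y in each position independently with probability p.

```python
import numpy as np

def correlated_binary_source(L, p, rng=None):
    """Generate (X, Y): Y uniform i.i.d. bits, X = Y flipped with prob. p."""
    rng = rng or np.random.default_rng()
    y = rng.integers(0, 2, size=L)     # side information, P(Y=1) = 0.5
    flips = rng.random(L) < p          # P(X_i != Y_i) = p, independent per bit
    x = y ^ flips.astype(int)          # X is a noisy copy of Y
    return x, y

x, y = correlated_binary_source(L=10_000, p=0.05)
print("empirical P(X != Y):", np.mean(x != y))
```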
Turbo Coder [Block diagram: the L input bits enter one systematic convolutional encoder directly and a second one through a length-L interleaver. The systematic bits of both encoders are discarded; only the parity bits are sent to the decoder.]
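The slides use 16-state, rate-4/5 constituent codes; as a hedged sketch of the parity-only idea, the following uses a simple 4-state, rate-1/2 recursive systematic encoder with illustrative (1, 5/7 octal) generators plus puncturing, which is not the paper's exact configuration:

```python
import numpy as np

def rsc_parity(bits):
    """Parity stream of a rate-1/2 recursive systematic convolutional code
    with illustrative (1, 5/7 octal) generators; the systematic output is
    never emitted, mirroring the 'discarded bits' in the diagram above."""
    s1 = s2 = 0
    out = []
    for u in bits:
        a = u ^ s1 ^ s2          # feedback taps: 1 + D + D^2 (7 octal)
        out.append(a ^ s2)       # feedforward taps: 1 + D^2 (5 octal)
        s1, s2 = a, s1           # shift the register
    return np.array(out)

def turbo_compress(x, interleaver, keep_every=4):
    """Keep 1 of every `keep_every` parity bits from each constituent
    encoder, giving R_X = 2/keep_every bit per source bit (0.5 here)."""
    p1 = rsc_parity(x)[::keep_every]
    p2 = rsc_parity(x[interleaver])[::keep_every]
    return p1, p2

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=16)
p1, p2 = turbo_compress(x, rng.permutation(16))
print(len(p1) + len(p2), "parity bits for", len(x), "source bits")  # rate 0.5
```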
Turbo Decoder [Block diagram: channel probabilities P_channel are computed from the received parity bits and the side information. Two SISO decoders, connected through a length-L interleaver and deinterleaver, exchange extrinsic probabilities P_extrinsic, each using the other's output as its a priori probability P_a priori. After several iterations, a decision on the a posteriori probabilities P_a posteriori yields the L decoded bits.]
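One concrete piece of the decoder: for the binary model with P(X_i ≠ Y_i) = p, the side information can enter the SISO decoders as channel log-likelihood ratios on the systematic bits. This is the standard formulation; the slide itself does not show the formula:

```python
import numpy as np

def side_info_llrs(y, p):
    """Channel LLRs on the systematic bits derived from side info Y:
    LLR_i = log P(x_i=0 | y_i) - log P(x_i=1 | y_i)
          = (1 - 2*y_i) * log((1-p)/p)  for the BSC-like correlation."""
    return (1 - 2 * np.asarray(y)) * np.log((1 - p) / p)
```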
Simulation: Binary Sequences • X-Y relationship: P(X_i = Y_i) = 1 - p and P(X_i ≠ Y_i) = p • System: 16-state, rate-4/5 constituent convolutional codes; R_X = 0.5 bit per input bit with no puncturing • Theoretically, X can be sent without error as long as H(X|Y) ≤ 0.5
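Since H(X|Y) = H_b(p) for this model, the R_X = 0.5 limit corresponds to a crossover probability of about p ≈ 0.11. A quick numerical check (the bisection helper is mine, not from the paper):

```python
import numpy as np

def binary_entropy(p):
    """H_b(p) = -p log2 p - (1-p) log2 (1-p); equals H(X|Y) here."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Find p with H_b(p) = 0.5 by bisection on (0, 0.5], where H_b is increasing.
lo, hi = 1e-9, 0.5
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if binary_entropy(mid) < 0.5 else (lo, mid)
print(f"H_b(p) = 0.5 at p ≈ {lo:.4f}")   # ≈ 0.1100
```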
Results: Compression of Binary Sequences [Plot: residual error rate vs. H(X|Y) at R_X = 0.5 bit per input bit; the annotated gap to the Slepian-Wolf bound is about 0.15 bit.]
Results for Different Rates • Punctured the parity bits to achieve lower rates
Extension to Continuous-Valued Sequences [Block diagram: the L source values are quantized to 2^M levels, the L symbols are converted to M·L bits, and the bits are turbo encoded (directly and through a length-L interleaver) and sent to the decoder.] • X and Y are sequences of i.i.d. continuous-valued random variables X_1 X_2 … X_L and Y_1 Y_2 … Y_L. X_i is independent of Y_j for i ≠ j, but dependent on Y_i. The X-Y dependency is described by the conditional pdf f(x|y). • Y is known as side information at the decoder
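A sketch of the quantize-and-binarize front end (the function name, the MSB-first bit ordering, and the boundary representation are illustrative assumptions):

```python
import numpy as np

def quantize_and_binarize(x, boundaries, M):
    """Map each sample to one of 2^M cells (given sorted decision
    boundaries, length 2^M - 1), then unpack each symbol index to
    M bits for the turbo coder, giving M*L bits in total."""
    symbols = np.searchsorted(boundaries, x)          # indices in [0, 2^M)
    bits = (symbols[:, None] >> np.arange(M - 1, -1, -1)) & 1
    return symbols, bits.ravel()
```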
Simulation: Gaussian Sequences • X-Y relationship: X is a sequence of i.i.d. Gaussian random variables; Y_i = X_i + Z_i, where Z is also a sequence of i.i.d. Gaussian random variables, independent of X, so f(x|y) is a Gaussian probability density function • System: 4-level Lloyd-Max scalar quantizer; 16-state, rate-4/5 constituent convolutional codes; no puncturing, so the rate is 1 bit per source sample
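A sketch of this setup, fitting the 4-level quantizer with Lloyd's algorithm on training samples rather than tabulated Lloyd-Max values (the CSNR value is an arbitrary example, not one from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
L, csnr_db = 100_000, 12.0
x = rng.standard_normal(L)                  # X ~ N(0, 1)
sigma_z = 10 ** (-csnr_db / 20)             # CSNR = var(X)/var(Z) in dB
y = x + sigma_z * rng.standard_normal(L)    # side information Y = X + Z

# Lloyd's algorithm for a 4-level scalar quantizer trained on x.
levels = np.array([-1.5, -0.5, 0.5, 1.5])
for _ in range(100):
    bounds = (levels[:-1] + levels[1:]) / 2          # nearest-neighbor boundaries
    cells = np.searchsorted(bounds, x)
    levels = np.array([x[cells == k].mean() for k in range(4)])  # centroids
print("Lloyd-Max levels:", np.round(levels, 3))      # ≈ ±0.453, ±1.510 for N(0,1)
```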
Results: Compression of Gaussian Sequences [Plot: error rate vs. CSNR at R_X = 1 bit/sample; the annotated gap to the theoretical limit is about 2.8 dB. CSNR is the ratio of the variance of X to the variance of Z.]
Joint Source-Channel Coding • Assume the parity bits pass through a memoryless channel with capacity C • The channel statistics can be included in the decoder's calculation of P_channel • From the Slepian-Wolf theorem and the definition of channel capacity, the required rate satisfies the bound below
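Reconstructing that bound from the stated ingredients (a hedged reconstruction; the slide's own equation is cut off in this copy):

```latex
% Parity bits sent over a memoryless channel with capacity C:
% reliable decoding of X with side information Y requires
R_X \;\geq\; \frac{H(X \mid Y)}{C}
% channel uses per source bit, i.e. R_X C \geq H(X \mid Y).
```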
Results: Joint Source-Channel Coding [Plot: R_X = 0.5 with parity bits sent over a BSC with crossover probability q = 0.03; annotated gaps to the corresponding theoretical limits of about 0.15 bit and 0.12 bit.]
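A sanity check of the bound above at this operating point (my arithmetic, not a number from the slides):

```python
import numpy as np

def Hb(p):
    """Binary entropy in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

q, Rx = 0.03, 0.5
C = 1 - Hb(q)                    # BSC capacity
print(f"C = {C:.3f}; decodable while H(X|Y) <= Rx*C = {Rx * C:.3f} bit")
```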
Conclusion • Turbo codes can be used for compression of binary sequences, performing close to the Slepian-Wolf bound for lossless distributed source coding • The system extends to compression of distributed continuous-valued sequences and performs better than previous techniques • The approach extends easily to joint source-channel coding