Towards Practical Distributed Coding Bernd Girod Information Systems Laboratory Stanford University
Outline
• Distributed lossless compression
  • Simple examples
  • Slepian-Wolf Theorem
  • Slepian-Wolf coding vs. systematic channel coding
  • Turbo codes for compression
• Lossy compression with side information
  • Wyner-Ziv Theorem
  • Optimal quantizer design with Lloyd algorithm
• Selected applications
  • Sensor networks
  • Low complexity video coding
  • Error-resilient video transmission
Simple Example
Source A, Source B
• Source statistics exploited in the encoder.
• Different statistics → different code.
Simple Example - Revisited
Source A, Source B
• Different statistics → same code.
• Source statistics exploited in the decoder.
• “Lossless” compression with residual error rate (see the sketch below).
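The flavor of exploiting the statistics only at the decoder can be made concrete with the classic toy example used to introduce distributed coding (a minimal sketch; the exact example on the original slide may differ): X and Y are 3-bit words assumed to differ in at most one position. The encoder sends only the 2-bit index of the coset of {000, 111} that contains X; the decoder picks the coset member closest to Y. If the one-bit assumption occasionally fails, the decoder makes an error, which is the “residual error rate” mentioned above.

```python
from itertools import product

def syndrome(x):
    # Coset index of x with respect to the repetition code {000, 111}:
    # the two bits (x0 ^ x1, x1 ^ x2) identify one of the 4 cosets.
    return (x[0] ^ x[1], x[1] ^ x[2])

def encode(x):
    # Encoder: 3 bits -> 2 bits, without knowing Y.
    return syndrome(x)

def decode(s, y):
    # Decoder: among the two words in coset s, pick the one closest to Y.
    candidates = [x for x in product((0, 1), repeat=3) if syndrome(x) == s]
    return min(candidates, key=lambda x: sum(a != b for a, b in zip(x, y)))

# Check: perfect recovery whenever X and Y differ in at most one position.
for x in product((0, 1), repeat=3):
    for flip in range(4):            # flip == 3 means "no difference"
        y = list(x)
        if flip < 3:
            y[flip] ^= 1
        assert decode(encode(x), tuple(y)) == x
print("X recovered exactly from 2 bits plus side information Y")
```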
Compression with Side Information
[Block diagram: Source A/B → Encoder → Decoder, with side information available at the decoder]
Distributed Compression of Dependent Sources
[Block diagram: Source X → Encoder X, Source Y → Encoder Y; a Joint Decoder reconstructs X and Y]
Achievable Rates for Distributed Coding
• Separate encoding and decoding of X and Y
• Separate encoding and joint decoding of X and Y
• Example
General Dependent i.i.d. Sequences [Slepian, Wolf, 1973]
• No errors
• Vanishing error probability for long sequences
[Achievable rate region plot]
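The rate-region equations did not survive extraction from the slide; the standard statement of the Slepian-Wolf theorem for separate encoding and joint decoding of dependent i.i.d. sequences is:

```latex
% Slepian-Wolf (1973): achievable rates with separate encoders and a joint decoder
\[
\begin{aligned}
R_X &\ge H(X \mid Y),\\
R_Y &\ge H(Y \mid X),\\
R_X + R_Y &\ge H(X, Y).
\end{aligned}
\]
```

The sum rate can thus reach the joint entropy H(X, Y), i.e., there is no loss compared to joint encoding, at the price of a vanishing (rather than exactly zero) error probability for long sequences.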
Distributed Compression and Channel Coding
Idea:
• Interpret Y as a “noisy” version of X with “channel errors”
• Encoder generates “parity bits” P to protect against errors
• Decoder concatenates Y and P and performs error-correcting decoding
[Block diagram: Source X → Encoder → parity bits P → Decoder, with Y as side information at the decoder]
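One concrete instance of this idea (a sketch with an illustrative code choice, not the code used in the talk): split X into 4-bit blocks, send only the 3 parity bits of a systematic Hamming(7,4) code, and let the decoder error-correct the side information Y, here assumed to differ from X in at most one bit per block.

```python
import numpy as np

# Systematic Hamming(7,4): parity part of the generator and parity-check matrix.
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])                      # parity = x @ P (mod 2)
H = np.hstack([P.T, np.eye(3, dtype=int)])     # check matrix for [x | parity]

def encode(x4):
    # Encoder sends only the 3 parity bits instead of the 4 source bits.
    return x4 @ P % 2

def decode(parity, y4):
    # Decoder treats [Y | parity] as a noisy Hamming codeword and corrects
    # a single error (which, since P is error-free, can only sit in Y).
    word = np.concatenate([y4, parity])
    syn = H @ word % 2
    if syn.any():
        err = next(i for i in range(7) if np.array_equal(H[:, i], syn))
        word[err] ^= 1
    return word[:4]

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=4)
y = x.copy()
y[rng.integers(0, 4)] ^= 1                     # side info: X with one bit flipped
assert np.array_equal(decode(encode(x), y), x)
print("X recovered from 3 parity bits plus side information Y")
```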
Practical Slepian-Wolf Encoding
• Coset codes [Pradhan and Ramchandran, 1999]
• Trellis codes [Wang and Orchard, 2001]
• Turbo codes [Garcia-Frias and Zhao, 2001] [Bajcsy and Mitran, 2001] [Aaron and Girod, 2002]
• LDPC codes [Liveris, Xiong, and Georghiades, 2002]
Compression by Turbo Coding [Aaron, Girod, DCC 2002]
[Block diagram: L input bits feed two systematic convolutional encoders, the second one through an interleaver of length L; the parity outputs form the compressed representation]
Turbo Decoder [Aaron, Girod, DCC 2002]
[Block diagram: two SISO decoders exchange extrinsic probabilities (Pextrinsic → Pa priori) through interleavers/deinterleavers of length L; “channel” probability calculations provide Pchannel to each SISO decoder, and a final decision stage on Pa posteriori outputs the L decoded bits]
Results for Compression of Binary Sequences [Aaron, Girod, DCC 2002]
• X, Y dependent binary sequences with symmetric crossover probabilities
• Rate-4/5 constituent convolutional codes; RX = 0.5 bit per input bit
[Performance plot; the 0.15 bit annotation marks the gap from the Slepian-Wolf bound]
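For this correlation model the reference curve is the Slepian-Wolf bound itself: with equiprobable X related to Y through a binary symmetric “channel” with crossover probability p, the minimum rate for X given Y at the decoder is the binary entropy of p:

```latex
\[
R_X \;\ge\; H(X \mid Y) \;=\; H_b(p) \;=\; -p \log_2 p \;-\; (1-p)\log_2(1-p)
\]
```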
Results with Puncturing [Aaron, Girod, DCC 2002]
Channel Coding vs. Slepian-Wolf Coding
Systematic channel coding:
• Systematic bits and parity bits subject to bit errors
• Mostly memoryless BSC channel
• Low bit-error rate of channel
• Rate savings relative to systematic bits
Slepian-Wolf coding:
• No bit errors in parity bits
• General statistics, incl. memory
• Whole range of “error” probabilities
• Rate savings relative to parity bits
• Might have to compete with conventional compression (e.g., arithmetic coding)
Distributed Lossy Compression of Dependent Sources
[Block diagram: Source X → Encoder X, Source Y → Encoder Y; a Joint Decoder outputs reconstructions X’ and Y’]
• Achievable rate region
Lossy Compression with Side Information [Wyner, Ziv, 1976] [Zamir, 1996]
[Block diagram: Source → Encoder → Decoder, with side information at the decoder]
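The rate-distortion equations on this slide were lost in extraction; the standard statement of the Wyner-Ziv result for coding X with side information Y available only at the decoder is:

```latex
% Wyner-Ziv (1976): rate-distortion function with decoder-only side information
\[
R^{\mathrm{WZ}}_{X|Y}(D) \;=\;
\min_{\substack{p(z \mid x),\ \hat{x}(z,y):\\ Z \leftrightarrow X \leftrightarrow Y,\ \ \mathbb{E}[d(X,\hat{X})] \le D}}
I(X; Z \mid Y)
\]
```

In general this rate is at least R_{X|Y}(D), the rate achievable when the encoder also observes Y; for jointly Gaussian sources with squared-error distortion there is no rate loss, and Zamir (1996) bounds the loss by 0.5 bit per sample for general sources under mean-squared error.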
Practical Wyner-Ziv Encoder and Decoder
[Block diagram: the Wyner-Ziv encoder is a quantizer followed by a Slepian-Wolf encoder; the Wyner-Ziv decoder is a Slepian-Wolf decoder followed by minimum-distortion reconstruction using the side information]
Non-Connected Quantization Regions
• Example: non-connected intervals for scalar quantization
• Decoder: minimum mean-squared error reconstruction with side information (see the formula below)
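Given the received quantizer index q and the side information y, the minimum mean-squared error reconstruction referred to above is the conditional mean:

```latex
\[
\hat{x}(q, y) \;=\; \mathbb{E}\!\left[ X \mid Q = q,\ Y = y \right]
\;=\; \frac{\int_{\{x:\ q(x) = q\}} x\, p(x \mid y)\, dx}
           {\int_{\{x:\ q(x) = q\}} p(x \mid y)\, dx}
\]
```

Because the conditioning on y narrows down where X is likely to lie, one index value can safely label several disjoint intervals, which is why non-connected quantization regions can be useful here.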
Finding Quantization Regions
[Diagram: the x-axis partitioned into quantization intervals labeled with indices q = 1, 2, 3, 4, with indices reused in non-adjacent intervals]
Lloyd Algorithm for Wyner-Ziv Quantizers [Fleming, Zhao, Effros, unpublished] [Rebollo-Monedero, Girod, DCC 2003]
• Choose initial quantizers
• Find best reconstruction functions for current quantizers
• Update rate measure for current quantizers
• Find best quantizers for current reconstruction and rate measure
• Evaluate the Lagrangian cost for current quantizers, reconstruction and rate measure; stop on convergence (Y → End), otherwise iterate (N)
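A minimal sketch of this alternation for discrete (x, y) alphabets, squared-error distortion, and the rate measure H(Q|Y), i.e., an ideal Slepian-Wolf coder for the indices (the Lagrangian weight, random initialization, and fixed iteration count in place of the convergence test are illustrative choices, not the authors' implementation):

```python
import numpy as np

def wz_lloyd(pxy, xvals, n_idx, lam, iters=50, seed=0):
    """Lagrangian Lloyd iteration for a Wyner-Ziv quantizer.

    pxy   : joint pmf p(x, y), shape (nx, ny)
    xvals : source values associated with the rows of pxy
    n_idx : number of quantizer indices
    lam   : Lagrange multiplier trading distortion against the rate H(Q|Y)
    """
    rng = np.random.default_rng(seed)
    nx, ny = pxy.shape
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    p_y_given_x = pxy / px[:, None]
    q_of_x = rng.integers(0, n_idx, size=nx)        # initial quantizer

    for _ in range(iters):                          # fixed count instead of a convergence test
        # p(q, y) and p(q|y) induced by the current quantizer
        pqy = np.zeros((n_idx, ny))
        for i in range(nx):
            pqy[q_of_x[i]] += pxy[i]
        p_q_given_y = pqy / np.maximum(py, 1e-300)

        # best reconstruction: x_hat(q, y) = E[X | Q = q, Y = y]
        xhat = np.zeros((n_idx, ny))
        for q in range(n_idx):
            mask = (q_of_x == q)
            num = (xvals[mask, None] * pxy[mask]).sum(axis=0)
            den = pxy[mask].sum(axis=0)
            xhat[q] = np.where(den > 0, num / np.maximum(den, 1e-300), xvals.mean())

        # rate measure: r(q, y) = -log2 p(q|y), the ideal Slepian-Wolf rate
        rate = -np.log2(np.maximum(p_q_given_y, 1e-300))

        # best quantizer: for each x, minimize E_{y|x}[ distortion + lam * rate ]
        cost = np.zeros((nx, n_idx))
        for q in range(n_idx):
            d = (xvals[:, None] - xhat[q][None, :]) ** 2
            cost[:, q] = (p_y_given_x * (d + lam * rate[q][None, :])).sum(axis=1)
        q_of_x = cost.argmin(axis=1)

    return q_of_x, xhat

# Toy usage: X uniform on 0..15, Y = X plus small clipped noise, 4 indices.
xvals = np.arange(16, dtype=float)
pxy = np.zeros((16, 16))
for x in range(16):
    for n, pn in ((-1, 0.2), (0, 0.6), (1, 0.2)):
        pxy[x, int(np.clip(x + n, 0, 15))] += pn / 16
q_of_x, xhat = wz_lloyd(pxy, xvals, n_idx=4, lam=1.0)
print("index assignment along x:", q_of_x)
```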
Which Rate Measure? [Rebollo-Monedero, Girod, DCC 2003]
• Conditional entropy coder: rate measure H(Q|Y)
[Block diagram as before: quantizer + Slepian-Wolf encoder at the transmitter, Slepian-Wolf decoder + minimum-distortion reconstruction at the receiver]
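With an ideal Slepian-Wolf coder for the quantizer indices, the relevant rate is the conditional entropy of the index given the side information, which can never exceed the plain index entropy H(Q) used by a conventional entropy coder:

```latex
\[
R \;=\; H(Q \mid Y) \;=\; -\sum_{q,\, y} p(q, y) \log_2 p(q \mid y) \;\le\; H(Q)
\]
```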
Example
Data set: video sequence Carphone, 100 luminance frames, QCIF
• X: pixel values in even frames
• Y: motion-compensated interpolation from two adjacent odd frames
Example (cont.)
• Quantizer w/ rate constraint H(Q): PSNR = 37.4 dB, H(Q) = 1.87 bit, H(Q|Y) = 0.54 bit
• Quantizer w/ rate constraint H(Q|Y): PSNR = 39 dB, H(Q) = 3.05 bit, H(Q|Y) = 0.54 bit
Example (cont.)
[PSNR [dB] vs. rate [bit] curves for the quantizer with rate constraint H(Q) and the quantizer with rate constraint H(Q|Y)]
Wyner-Ziv Quantizers: Lessons Learnt
• Typically no quantizer index reuse for rate constraint H(Q|Y) and high rates: the Slepian-Wolf code provides a more efficient many-to-one mapping in a very high-dimensional space.
• Uniform quantizers are close to minimum MSE when combined with an efficient Slepian-Wolf code.
• Quantizer index reuse is required for rate constraint H(Q) and for fixed-length coding.
• Important to decouple the dimension of the quantizer (i.e., scalar) from that of the Slepian-Wolf code (very large).
Sensor Networks
[Diagram: several remote sensors transmit to a central unit; a local sensor provides side information]
[Pradhan, Ramchandran, DCC 2000] [Kusuma, Doherty, Ramchandran, ICIP 2001] [Pradhan, Kusuma, Ramchandran, SP Mag., 2002] [Chou, Petrovic, Ramchandran, Asilomar 2002]
Video Compression with Simple Encoder [Aaron, Zhang, Girod, Asilomar 2002] [Aaron, Rane, Zhang, Girod, DCC 2003]
[Block diagram: intraframe encoder = scalar quantizer + turbo encoder + buffer for the video frames X; interframe decoder = turbo decoder + reconstruction X’, with side information Y obtained by interpolation of the previous and next key frames; the decoder requests additional bits from the buffer]
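A heavily simplified sketch of the decoder-side principle (illustrative only: side information here is a plain average of the two key frames, and the Slepian-Wolf/turbo stage and the bit requests are omitted; the actual system uses motion-compensated interpolation and turbo codes):

```python
import numpy as np

def encode_wz_frame(frame, n_levels=16):
    # Encoder: coarse scalar quantization of the Wyner-Ziv frame (no motion search).
    step = 256 / n_levels
    return (frame // step).astype(int), step

def decode_wz_frame(indices, step, prev_key, next_key):
    # Decoder: form side information from the key frames, then reconstruct each
    # pixel as the value inside its quantization interval closest to the side
    # information (minimum-distortion reconstruction = clipping into the interval).
    side_info = (prev_key.astype(float) + next_key.astype(float)) / 2
    lo = indices * step
    hi = lo + step
    return np.clip(side_info, lo, hi - 1)

# Toy usage with synthetic 8x8 "frames": a slowly shifting gradient.
prev_key = np.tile(np.arange(0, 256, 32, dtype=float), (8, 1))
next_key = np.clip(prev_key + 10, 0, 255)
wz_frame = np.clip(prev_key + 2, 0, 255)       # frame lying between the key frames

idx, step = encode_wz_frame(wz_frame)
rec = decode_wz_frame(idx, step, prev_key, next_key)
print("MSE with side information:   ", np.mean((rec - wz_frame) ** 2))
print("MSE of plain dequantization: ", np.mean(((idx + 0.5) * step - wz_frame) ** 2))
```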
Video Compression with Simple Encoder
[Frames: decoder side information vs. result after Wyner-Ziv decoding, 16-level quantization (~1 bpp)]
Performance of Simple Wyner-Ziv Video Coder
[Rate vs. PSNR comparison plot; 7 dB annotation]
Digitally Enhanced Analog Transmission
• Forward error protection of the signal waveform
• Information-theoretic bounds [Shamai, Verdu, Zamir, 1998]
• “Systematic lossy source-channel coding”
[Block diagram: the signal is transmitted over an analog channel and, in parallel, through a Wyner-Ziv encoder and a digital channel; the Wyner-Ziv decoder uses the analog reception as side information]
Forward Error Protection for MPEG Video Broadcasting
• Graceful degradation without a layered signal representation
[Block diagram: the MPEG encoder output S passes over an error-prone channel to an MPEG decoder with error concealment, yielding S’; Wyner-Ziv encoder/decoder pairs A and B add successive protection, yielding enhanced reconstructions S* and S**]
Error-resilient Video Transmission with Embedded Wyner-Ziv Codec [Aaron, Rane, Rebollo-Monedero, Girod, ICIP 2003]
Carphone CIF, 50 frames @ 30 fps, 1 Mbps, 1% random macroblock loss
Towards Practical Distributed Coding: Why Should We Care?
• Chance to reinvent compression from scratch
  • Entropy coding
  • Quantization
  • Signal transforms
  • Adaptive coding
  • Rate control
  • . . .
• Enables new compression applications
  • Sensor networks
  • Very low complexity encoders
  • Error-resilient transmission of signal waveforms
  • Digitally enhanced analog transmission
  • Unequal error protection without layered coding
  • . . .