Linear Codes for Distributed Source Coding: Reconstruction of a Function of the Sources
D. Krithivasan and S. Sandeep Pradhan, University of Michigan, Ann Arbor
Presentation Overview • Problem Formulation • Motivation • Nested Linear Codes • Main Result • Applications and Examples • Conclusions
Problem Formulation • Distributed Source Coding • Typical application: Sensor networks. • Example: lossless reconstruction of all the sources requires a sum rate equal to the joint entropy (Slepian-Wolf).
Problem Formulation • We ask: What if the decoder is interested only in a function of the sources? • In general: a fidelity criterion defined on a function F(X, Y) of the sources. • Ex: the average of the sensor measurements. • Obvious strategy: Reconstruct the sources and then compute the function. • Are rate gains possible if we directly encode the function in a distributed setting?
Motivation: A Binary Example • Korner and Marton – reconstruction of the modulo-two sum Z = X ⊕ Y of two correlated binary sources. • Centralized encoder: • Compute Z = X ⊕ Y. • Compress Z using a good source encoder. • Suppose the encoder map g satisfies g(x ⊕ y) = g(x) ⊕ g(y). • Then the centralized scheme becomes a distributed scheme: each encoder applies g to its own source. • Are there good source codes with this property? • Linear codes.
The Korner-Marton Coding Scheme • Choose a k × n binary matrix A such that: • the decoder can recover Z^n from AZ^n with high probability. • Entropy achieving: k/n ≈ H(Z). • Encoders transmit AX^n and AY^n. • Decoder: computes AX^n ⊕ AY^n = AZ^n and recovers Z^n with high probability. • Rate pair (H(Z), H(Z)) is achievable. • This can be lower than the Slepian-Wolf sum rate H(X, Y). • The scheme works for addition in any finite field.
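The crucial step above is that a matrix (linear) encoder commutes with the modulo-two sum. The minimal Python sketch below, with an arbitrary random matrix of my own choosing, checks exactly that identity; it does not implement the recovery of Z^n from AZ^n, which additionally requires A to be the parity-check matrix of a good channel code for additive noise Z.

```python
# Minimal numerical sketch of the key identity behind the Korner-Marton scheme:
# a linear encoder commutes with the mod-2 sum, A(x XOR y) = Ax XOR Ay, so two
# separate encoders that each apply A reproduce exactly what a centralized
# encoder of Z = X XOR Y would have sent.
import numpy as np

rng = np.random.default_rng(0)

n, k = 12, 5                               # block length and syndrome length
A = rng.integers(0, 2, size=(k, n))        # random binary k x n matrix

x = rng.integers(0, 2, size=n)             # source sequence at encoder 1
noise = (rng.random(n) < 0.1).astype(int)  # sparse "difference" Z = X XOR Y
y = x ^ noise                              # correlated source at encoder 2

syn_x = (A @ x) % 2                        # transmitted by encoder 1
syn_y = (A @ y) % 2                        # transmitted by encoder 2

# The decoder XORs the two received syndromes and obtains the syndrome of Z.
assert np.array_equal(syn_x ^ syn_y, (A @ (x ^ y)) % 2)
print("A(x XOR y) == Ax XOR Ay holds, as linearity guarantees.")
```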
Properties of the Linear Code • The matrix A puts different typical Z^n sequences in different bins. • Consider the coset code C = {z : Az = 0}. • C is a good channel code for a channel with additive noise Z. • Both encoders use identical codebooks. • Binning is therefore completely “correlated”. • Independent binning is more prevalent in information theory.
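As an illustration of the binning property (my own toy example, not from the talk): the syndrome Az assigns every binary sequence to exactly one coset of C = {z : Az = 0}, so the cosets partition, i.e. bin, the entire space.

```python
# Enumerate the cosets induced by a small binary matrix A and confirm that
# they partition {0,1}^n: every length-n sequence lands in exactly one bin.
import itertools
import numpy as np

A = np.array([[1, 0, 1, 1],
              [0, 1, 1, 0]])               # a full-rank 2 x 4 binary matrix
n = A.shape[1]

bins = {}
for z in itertools.product([0, 1], repeat=n):
    syndrome = tuple((A @ np.array(z)) % 2)
    bins.setdefault(syndrome, []).append(z)

print(f"{len(bins)} cosets, sizes: {[len(v) for v in bins.values()]}")
# -> 4 cosets of size 4 each: a partition of all 16 length-4 binary sequences.
```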
Slepian-Wolf Coding • Function to be reconstructed: both sources, F(X, Y) = (X, Y). • Treat the binary sources as sources over a larger alphabet. • The function is then equivalent to addition in that alphabet: one digit of the sum carries X, the other carries Y. • Encode the vector function one digit at a time.
Slepian-Wolf Coding contd. • Use Korner-Marton coding scheme on each digit plane. • Sequential strategy achieves Slepian-Wolf bound. • General lossless strategy: • “Embed” the function in a digit plane field (DPF). • DPF – direct sum of Galois fields of prime order. • Encode the digits sequentially using Korner-Marton strategy.
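One concrete way to read the digit-plane idea for this Slepian-Wolf example, under my own assumed embedding X ↦ (X, 0), Y ↦ (0, Y) into a direct sum of two binary fields (an assumption, not necessarily the authors' exact construction): the "sum" equals the pair (X, Y), one digit plane carries X, the other carries Y, and encoding the second plane with the first as side information gives the Slepian-Wolf sum rate. The sketch below checks the embedding and evaluates the resulting rates for a toy joint distribution.

```python
# Check that the assumed embedding recovers (X, Y) digit by digit, and that
# the per-digit rates add up to the Slepian-Wolf sum rate H(X, Y).
import numpy as np

# Toy joint pmf of two correlated binary sources (rows: x, cols: y).
p_xy = np.array([[0.40, 0.10],
                 [0.05, 0.45]])

def H(p):
    """Entropy in bits of a pmf given as an array."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Embedding: x -> (x, 0), y -> (0, y); componentwise mod-2 addition gives
# (x, y), so reconstructing this "sum" reconstructs both sources.
for x in (0, 1):
    for y in (0, 1):
        digit1 = x ^ 0        # first digit plane of the sum: carries X
        digit2 = 0 ^ y        # second digit plane of the sum: carries Y
        assert (digit1, digit2) == (x, y)

# Digit plane 1: Korner-Marton-style encoding of X alone -> rate H(X).
p_x = p_xy.sum(axis=1)
R1 = H(p_x)

# Digit plane 2: encode Y with digit plane 1 (i.e., X) available at the
# decoder as side information -> rate H(Y|X) = H(X, Y) - H(X).
R2 = H(p_xy.flatten()) - H(p_x)

print(f"R1 = H(X)   = {R1:.3f} bits")
print(f"R2 = H(Y|X) = {R2:.3f} bits")
print(f"R1 + R2     = {R1 + R2:.3f} = H(X,Y) = {H(p_xy.flatten()):.3f} bits")
```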
Lossy Coding • Quantize X to U and Y to V. • G(U, V) – the best estimate of F(X, Y) with respect to the distortion measure, given (U, V). • Use lossless coding to encode G(U, V). • What we need: nested linear codes.
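The "best estimate" step can be written out explicitly: G(u, v) minimizes the expected distortion to F(X, Y) given the quantized pair. The sketch below uses a toy joint distribution, the binary-sum function and Hamming distortion, all of which are my own illustrative choices.

```python
# Compute G(u, v) = argmin_z E[d(F(X, Y), z) | U = u, V = v] by brute force.
import itertools

X_ALPH = Y_ALPH = U_ALPH = V_ALPH = Z_ALPH = (0, 1)

def F(x, y):                  # function of interest: binary sum
    return x ^ y

def d(z, zhat):               # Hamming distortion
    return int(z != zhat)

def p_xyuv(x, y, u, v, q=0.1):
    """Toy joint pmf: U, V are noisy (quantized) versions of X, Y."""
    p_xy = 0.45 if x == y else 0.05           # correlated binary sources
    p_u_given_x = 1 - q if u == x else q      # test channel X -> U
    p_v_given_y = 1 - q if v == y else q      # test channel Y -> V
    return p_xy * p_u_given_x * p_v_given_y

def best_estimate(u, v):
    """Joint weighting: dividing by p(u, v) would not change the argmin."""
    def expected_dist(z):
        return sum(p_xyuv(x, y, u, v) * d(F(x, y), z)
                   for x, y in itertools.product(X_ALPH, Y_ALPH))
    return min(Z_ALPH, key=expected_dist)

for u, v in itertools.product(U_ALPH, V_ALPH):
    print(f"G({u}, {v}) = {best_estimate(u, v)}")
```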
Nested Linear Codes • Codes used in KM, SW – good channel codes • Cosets bin the entire space. • Suitable for lossless coding. • Lossy coding: Need to quantize first. • Decrease coset density – Nested linear codes. • Fine code: quantizes the source. • Coarse code: bins only the fine code.
Nested Linear Codes • A pair of linear codes (C1, C2) is nested if C2 ⊆ C1. • We need: • C1: a “good” source code • can find a codeword jointly typical with the source sequence. • C2: a “good” channel code • can find the unique typical codeword in a given coset.
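A minimal way to construct such a nested pair (my own illustration): let the coarse code's generator rows be a subset of the fine code's rows, so every coarse codeword is automatically a fine codeword and the coarse cosets bin the fine code.

```python
# Build nested binary linear codes C2 (coarse) inside C1 (fine) and verify
# the inclusion by exhaustive enumeration.
import itertools
import numpy as np

G1 = np.array([[1, 0, 0, 1, 1, 0],
               [0, 1, 0, 1, 0, 1],
               [0, 0, 1, 0, 1, 1]])   # generator of the fine code C1
G2 = G1[:1, :]                        # generator of the coarse code C2 (subset of rows)

def span(G):
    """All codewords generated by the rows of G over GF(2)."""
    k = G.shape[0]
    return {tuple((np.array(m) @ G) % 2)
            for m in itertools.product([0, 1], repeat=k)}

C1, C2 = span(G1), span(G2)
assert C2 <= C1                       # nesting: every coarse codeword is a fine codeword
print(f"|C1| = {len(C1)}, |C2| = {len(C2)}, nested: {C2 <= C1}")
# The fine code splits into |C1|/|C2| cosets of C2; the encoder sends only
# the coset index, which is the "binning of the fine code" from the slide.
```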
Good Linear Source Codes • A good linear code for a given (source, reconstruction, distortion) triple. • Assume the alphabet is a field of prime order. • Such a code exists for large block lengths if the rate is high enough. • Not a good source code in the Shannon sense. • Contains a subset that is a good Shannon source code. • Linearity incurs a rate loss (in bits/sample).
Good Linear Channel Codes • A good linear code for a given channel triple. • Assume the alphabet is a field of prime order. • Such a code exists for large block lengths if the rate is low enough. • Not a good channel code in the Shannon sense. • Every coset contains a subset which is a good channel code. • Linearity incurs a rate loss (in bits/sample).
Main Result • Fix a test channel (U, V) such that the Markov and distortion conditions below are satisfied. • Embed the best estimate G(U, V) in a digit plane field; its digits need to be encoded. • Fix an order of encoding of the digit planes. • Idea: encode one digit at a time. • At the b-th stage: use the previously reconstructed digits as side information.
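The two omitted conditions on the test channel are presumably the standard ones for this kind of distributed setup; this is my assumption about what the missing formulas say, not a quote from the talk:

$$
U \leftrightarrow X \leftrightarrow Y \leftrightarrow V \quad \text{(long Markov chain)}, \qquad
\mathbb{E}\big[d\big(F(X,Y),\, G(U,V)\big)\big] \le D .
$$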
Coding Strategy for the b-th Digit • Good source codes for quantization, a good channel code for binning.
Cardinalities of the Linear Codes • Cardinalities of the nested codes. • Rate of each encoder. • Comparison with conventional coding.
Coding Theorem • An achievable rate region. • A corollary used in the examples that follow.
Nested Linear Codes Achieve the Rate-Distortion Bound • Choose one of the sources to be a constant. • It follows that the classical rate-distortion bound is achievable for any distortion level. • The framework can also recover • the Berger-Tung inner bound. • the Wyner-Ziv rate region. • Wyner's source coding with side information. • the Slepian-Wolf and Korner-Marton rate regions.
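For reference, the point-to-point bound recovered by setting one source to a constant is the classical rate-distortion function (a standard formula, stated here for completeness):

$$
R(D) \;=\; \min_{\substack{p(u|x),\ g(\cdot)\,:\\ \mathbb{E}[d(X,\, g(U))] \le D}} I(X; U).
$$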
Lossy Coding of the Binary Sum Z = X ⊕ Y • Fix additive binary test channels. • The test-channel noises are independent binary random variables. • Reconstruct the sum from the quantized versions. • Using the corollary to the rate region, an explicit rate pair can be achieved. • More rate points can be achieved by • choosing more general test channels. • embedding the function in a larger alphabet.
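Assuming the omitted test channels are the natural additive ones, U = X ⊕ Q1 and V = Y ⊕ Q2 with Q1 ~ Bernoulli(q1) and Q2 ~ Bernoulli(q2) independent of everything else (my assumption, consistent with the bullet on independent binary random variables), the reconstruction is Ẑ = U ⊕ V = Z ⊕ Q1 ⊕ Q2, so the resulting Hamming distortion is

$$
\mathbb{E}\big[d(Z, \hat Z)\big] \;=\; \Pr[Q_1 \oplus Q_2 = 1] \;=\; q_1(1-q_2) + (1-q_1)\,q_2 .
$$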
Conclusions • Presented a unified approach to distributed source coding. • It involves the use of nested linear codes. • Coding: quantization followed by “correlated” binning. • Recovers the known rate regions for many problems. • Presents new rate regions for other problems.