On the interdependence of routing and data compression in multi-hop sensor networks

Presentation Transcript


  1. On the interdependence of routing and data compression in multi-hop sensor networks Anna Scaglione, Sergio D. Servetto

  2. Background • Broadcast communication • Multi-hop sensor network • Objective: each node obtains an estimate of the entire field • Constraint: distortion below a prescribed constant

  3. Main Idea • Jointly compress data generated by different nodes as the information travels over multiple hops, in order to eliminate correlations in the representation of the sampled field.

  4. Problem setup • N nodes placed on the closed set [0,1]×[0,1]. • Each node i observes a sample Si. • How can we describe Si? • Rate-distortion function R_S(D) • Correlation between samples.

  5. Rate-Distortion Function R_S(D) • Distortion measure: d(s, s') • Hamming distortion: 1{s ≠ s'} • Squared-error distortion: (s − s')^2 • Average distortion: E[d(S, S')] ≤ D • Physical meaning: given the distribution of S and a constant D, what is the minimum number of bits needed to represent S so that the mean distortion is at most D?
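
For reference, this is the standard rate-distortion definition the slide refers to, written in standard information-theoretic notation (the minimization over test channels is not spelled out on the slide itself):

```latex
R_S(D) \;=\; \min_{\,p(\hat{s}\mid s)\;:\;\mathbb{E}\left[d(S,\hat{S})\right]\le D\,} I\!\left(S;\hat{S}\right)
```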

  6. Correlation • Correlation between samples increases as the distance between them in the grid decreases.

  7. Problem setup (reminder) • N nodes placed on the closed set [0,1]×[0,1]. • Each node i observes a sample Si. • How can we describe Si? • Rate-distortion function R_S(D) • Correlation between samples.

  8. Why Do Independent Encoders Fail? • Assumption: each node encodes its own data independently. • Consider the general case where each Si is uniform in [0,1]. • Mean-squared-error distortion.

  9. Why Do Independent Encoders Fail? • Each node uses a scalar quantizer with B bits of resolution (i.e., step size 2^(-B)). • Previous result: per-node average distortion is (1/12)·2^(-2B) • Total average distortion: D = (N/12)·2^(-2B) • B = (1/2)·log2(N/(12D)) • Total information: N·B = O(N log N) bits
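
A quick numeric sanity check of this bit count (a minimal Python sketch; the function name and the example values of N and D are mine, not from the slides):

```python
import math

def independent_encoding_bits(n_nodes: int, total_distortion: float):
    """Bits per node and total bits when every node independently scalar-quantizes
    its uniform [0,1] sample with step size 2^(-B), i.e. per-node distortion
    (1/12)*2^(-2B) and total distortion D = (N/12)*2^(-2B)."""
    bits_per_node = 0.5 * math.log2(n_nodes / (12 * total_distortion))
    return bits_per_node, n_nodes * bits_per_node

for n in (100, 1_000, 10_000):
    b, total = independent_encoding_bits(n, total_distortion=0.01)
    print(f"N={n:6d}  B ~ {b:5.2f} bits/node  total ~ {total:9.0f} bits")  # grows as N log N
```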

  10. Why Do Independent Encoders Fail? • Regardless of the routing strategy, the total information is O(N log N) bits. • The bandwidth available across a cut of the network is only O(L·N^(1/2)), where L is the bandwidth of each node. • Since N log N grows faster than N^(1/2), independently encoded traffic eventually overwhelms the cut.

  11. Routing and Data Compression • What have we learnt? • Data compression is needed to remove correlation between samples. • What are the choices? • Distributed source coding (not the focus here) • Routing and source coding (that's the one!)

  12. Routing and Source Coding • Idea: re-encode data at intermediate nodes to remove correlation. • Easier to implement than distributed source coding. • Use a scalar quantizer locally, then forward the data in files compressed with a universal source-coding algorithm (e.g., Lempel-Ziv).
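
A minimal sketch of this re-encode-and-forward step, using zlib's LZ77-based DEFLATE as a stand-in for a generic Lempel-Ziv coder; the function names, the 8-bit quantizer, and the byte layout are illustrative choices, not the paper's protocol:

```python
import zlib

def quantize(sample: float, bits: int = 8) -> int:
    """Scalar quantizer: map a sample in [0, 1] to a B-bit integer index."""
    return min(int(sample * (1 << bits)), (1 << bits) - 1)

def reencode_and_forward(own_sample: float, received_payloads: list[bytes]) -> bytes:
    """At an intermediate node: decompress the incoming payloads, append the
    local quantized sample, and recompress the whole batch so the universal
    coder can exploit correlation across neighboring nodes' samples."""
    samples = bytearray()
    for payload in received_payloads:
        samples += zlib.decompress(payload)
    samples.append(quantize(own_sample))
    return zlib.compress(bytes(samples), level=9)  # DEFLATE = LZ77 + Huffman
```

In this sketch a leaf node would call reencode_and_forward(sample, []) and each relay passes along the payloads it received from its children.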

  13. Transmission Time vs. Compression Ratio • Example 1: amount of traffic 3·H(X1,X2,X3,X4) bits, transmission time 8 rounds • Example 2: amount of traffic 2·H(X1,X2,X3,X4) + H(X1,X2) + H(X3,X4) bits, transmission time 4 rounds
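
To make the trade-off concrete, here is a hypothetical calculation (the entropy values are invented purely for illustration and are not from the slides):

```python
# Hypothetical joint entropies (in bits) for four correlated sources X1..X4.
H_all = 4.0   # H(X1, X2, X3, X4)
H_12 = 2.5    # H(X1, X2)
H_34 = 2.5    # H(X3, X4)

traffic_example_1 = 3 * H_all                  # 12.0 bits over 8 rounds
traffic_example_2 = 2 * H_all + H_12 + H_34    # 13.0 bits over 4 rounds
print(traffic_example_1, traffic_example_2)    # example 2 halves the delay at the cost of extra traffic
```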

  14. Questions posed • Under what conditions on the statistics of the source can a network transport all the data generated by the sources? • What are the tradeoffs between bandwidth requirement and transmission delay?

  15. Transport Capacity • We know an upper bound: O(L·N^(1/2)). • Construct a particular flow and calculate the amount of bandwidth it needs. • Suppose q(Si) is the quantized version of Si such that E[d(Si, q(Si))] ≤ D. • We need H(q(Si)) bits per source.

  16. Transport Capacity • Partition the nodes into four groups. • Each group must receive the data generated by the other three, so the total traffic through the cut is 3·H(G1, G2, G3, G4) (plus O(N^(1/2)) transmissions to spread data within each group). • Trivial modification for a random grid.

  17. General Constraints • Facts: • O(H(q(S1,…,SN))) bits must go across the 4-way cut. • Capacity of the 4-way cut is O(L·N^(1/2)). • From rate-distortion theory, H(q(S1,…,SN)) ≥ R_{S1,…,SN}(D), since q is a quantizer with mean distortion at most D. • Necessary condition: R_{S1,…,SN}(D) ≤ O(L·N^(1/2))
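
Chaining these facts gives the necessary condition in one line (notation reconstructed from the slide; the braces are added for readability):

```latex
\underbrace{O\!\left(L\,N^{1/2}\right)}_{\text{cut capacity}}
\;\ge\; \underbrace{H\!\left(q(S_1),\dots,q(S_N)\right)}_{\text{bits across the cut}}
\;\ge\; R_{S_1,\dots,S_N}(D)
```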

  18. The rest of the story • Existence of a routing algorithm and source codes that require no more than O(R_{S1,…,SN}(D)) bits in O(N^(1/2)) transmissions. • Proof that R_{S1,…,SN}(D) ≤ O(log(N/D)). • When D/N is kept constant, the field generates a bounded amount of information. • If the total distortion is kept constant, R grows only logarithmically in N, well below O(L·N^(1/2)).
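
A quick numeric illustration of that gap, with the total distortion held constant (the values of N and D are chosen for illustration only):

```python
import math

D = 1.0  # total distortion held constant
for n in (10**2, 10**4, 10**6):
    print(f"N={n:>9,}  log2(N/D) = {math.log2(n / D):6.1f}   sqrt(N) = {math.sqrt(n):10,.0f}")
# The information that must cross the cut grows like log N,
# while the cut capacity grows like sqrt(N), so transport remains feasible.
```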
