On the interdependence of routing and data compression in multi-hop sensor networks Anna Scaglione, Sergio D. Servetto
Background • Broadcast communication • Multi-hop sensor network • Objective: each node obtains an estimate of the entire field • Constraint: distortion below a prescribed constant
Main Idea • Jointly compress data generated by different nodes as this information travels over multiple hops, in order to eliminate correlations in the representation of the sampled field.
Problem setup • N nodes placed on the closed set [0,1]x[0,1]. • Each node i observes a sample Si. • How can we describe Si? • Rate distortion function Rs(D) • Correlation between samples.
Rate/Distortion Function RS(D) • Distortion function: d(s,s') • Hamming distortion: 1{s ≠ s'} • Squared-error distortion: (s − s')^2 • Average distortion: E[d(S,S')] ≤ D • Physical meaning • Given the distribution of S and a constant D, what is the minimum number of bits needed to represent S so that the mean distortion is at most D?
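As an illustration (not from the slides), the rate/distortion function has a closed form for a Bernoulli(p) source under Hamming distortion: R(D) = H(p) − H(D) for D below min(p, 1−p), and 0 beyond that. A minimal sketch:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_bernoulli(p, D):
    """R(D) for a Bernoulli(p) source under Hamming distortion:
    R(D) = H(p) - H(D) for 0 <= D < min(p, 1-p), else 0."""
    if D >= min(p, 1 - p):
        return 0.0
    return h2(p) - h2(D)

# Allowing distortion D lowers the rate below the lossless H(p) bits.
print(rate_distortion_bernoulli(0.5, 0.0))   # 1.0 bit: lossless
print(rate_distortion_bernoulli(0.5, 0.11))  # roughly 0.5 bits
```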
Correlation • Correlation between samples increases as the distance between them in the grid decreases.
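The claim can be illustrated with a one-dimensional stand-in (an illustrative assumption, not the field model in the paper): an AR(1) sequence whose correlation at lag d is rho^d, so samples farther apart are less correlated:

```python
import random

def correlated_field(n, rho, seed=0):
    """Sample a stationary AR(1) sequence: x[k] = rho*x[k-1] + noise.
    The correlation between samples at lag d is rho**d."""
    rng = random.Random(seed)
    x = [rng.gauss(0, 1)]
    for _ in range(n - 1):
        x.append(rho * x[-1] + (1 - rho ** 2) ** 0.5 * rng.gauss(0, 1))
    return x

def empirical_corr(x, lag):
    """Empirical correlation between the sequence and its lagged copy."""
    a, b = x[:-lag], x[lag:]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = sum((u - ma) ** 2 for u in a)
    vb = sum((v - mb) ** 2 for v in b)
    return cov / (va * vb) ** 0.5

x = correlated_field(50_000, rho=0.9)
for d in (1, 2, 5):
    print(d, round(empirical_corr(x, d), 2))  # decays roughly like 0.9**d
```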
Problem setup (reminder) • N nodes placed on the closed set [0,1]x[0,1]. • Each node i observes a sample Si. • How can we describe Si? • Rate distortion function Rs(D) • Correlation between samples.
Why Do Independent Encoders Fail? • Assumption: each node encodes its own data independently. • Consider the general case in which each Si is uniform on [0,1]. • Mean-square distortion function.
Why Do Independent Encoders Fail? • Each node uses a scalar quantizer with B bits of resolution (i.e., step size 2^(-B)). • Previous result: average distortion per node is (1/12)·2^(-2B) • Total average distortion D = (N/12)·2^(-2B) • B = (1/2)·log2(N/(12D)) • Total information: O(N log N) bits
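The (1/12)·2^(-2B) figure is the classic high-resolution result for a uniform quantizer: the error is uniform over a cell of width 2^(-B), whose variance is (2^(-B))^2 / 12. A quick numerical check, as a sketch:

```python
import random

def quantize(s, B):
    """Uniform scalar quantizer on [0,1) with B bits (step 2**-B),
    mapping s to the midpoint of its cell."""
    step = 2.0 ** -B
    return (int(s / step) + 0.5) * step

# Empirical mean-square distortion matches step**2 / 12 = (1/12) * 2**(-2B).
rng = random.Random(1)
B = 4
samples = [rng.random() for _ in range(200_000)]
mse = sum((s - quantize(s, B)) ** 2 for s in samples) / len(samples)
print(mse, (1 / 12) * 2 ** (-2 * B))  # the two values agree closely
```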
Why Do Independent Encoders Fail? • Regardless of the routing strategy, the total information is O(N log N) bits. • But the bandwidth available across a cut of the network is only O(L·N^(1/2)), where L is the bandwidth of each node.
Routing and Data Compression • What have we learnt? • Data compression is needed to remove correlation between samples. • What are the choices? • Distributed source coding (not the focus here) • Routing and source coding (that's it!)
Routing and Source Coding • Idea: re-encode data at intermediate nodes to remove correlation. • Easier to implement than distributed source coding. • Quantize locally with a scalar quantizer, then forward the data in files compressed with a universal source-coding algorithm (e.g., Lempel-Ziv).
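A minimal sketch of the local scheme, using Python's zlib (whose DEFLATE format is LZ77-based) as a stand-in for a universal Lempel-Ziv coder; the correlated-source model below is an illustrative assumption, not the paper's field model:

```python
import random
import zlib

def encode_block(samples, B):
    """Scalar-quantize each sample in [0,1) to B bits (B <= 8), pack the
    codes into bytes, then compress with zlib, a universal LZ-based coder."""
    codes = bytes(int(s * (2 ** B)) for s in samples)
    return zlib.compress(codes, 9)

rng = random.Random(0)
n = 10_000

# Correlated samples: a slow random walk, so neighbours are close in
# value, loosely mimicking nearby sensors observing a smooth field.
s, correlated = 0.5, []
for _ in range(n):
    s = min(max(s + rng.gauss(0, 0.01), 0.0), 0.999)
    correlated.append(s)

independent = [rng.random() for _ in range(n)]

# Correlation makes the quantized stream far more compressible.
print(len(encode_block(correlated, 8)), len(encode_block(independent, 8)))
```

The independent stream is essentially incompressible, while the correlated one shrinks substantially, which is exactly the redundancy the intermediate nodes can squeeze out.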
Transmission Time vs. Compression Ratio • Example 1: • Amount of traffic: 3H(X1,X2,X3,X4) bits • Transmission time: 8 rounds • Example 2: • Amount of traffic: 2H(X1,X2,X3,X4)+H(X1,X2)+H(X3,X4) bits • Transmission time: 4 rounds
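To make the trade-off concrete under a hypothetical source model (my assumption, not the slides'): let X1..X4 be a Markov chain of fair bits where each bit equals the previous one except with flip probability p. Then H(X1,X2,X3,X4) = 1 + 3·H(p) and H(X1,X2) = H(X3,X4) = 1 + H(p), so Example 2 pays 1 − H(p) extra bits of traffic in exchange for half the rounds:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.1                               # hypothetical flip probability
H_joint = 1 + 3 * h2(p)               # H(X1,X2,X3,X4) for the Markov chain
H_pair = 1 + h2(p)                    # H(X1,X2) = H(X3,X4)

traffic_1 = 3 * H_joint               # Example 1: 8 rounds
traffic_2 = 2 * H_joint + 2 * H_pair  # Example 2: 4 rounds
print(round(traffic_1, 2), round(traffic_2, 2))  # Example 2 sends more bits
```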
Questions posed • Under what conditions on the statistics of the source can a network transport all the data generated by the sources? • What are the tradeoffs between bandwidth requirement and transmission delay?
Transport Capacity • We know an upper bound: O(L·N^(1/2)). • Construct a particular flow and calculate the amount of bandwidth it needs. • Suppose q(Si) is the quantized version of Si such that E[d(Si, q(Si))] ≤ D. • We need H(q(Si)) bits per source.
Transport Capacity • Partition the nodes into four groups. • Total traffic through the cut: 3H(G1,G2,G3,G4) bits (plus O(N^(1/2)) transmissions to spread data within each group) • A trivial modification handles the random grid.
General Constraints • Facts: • O(H(q(S1,…,SN))) bits must go across the 4-way cut. • Capacity of the 4-way cut is O(L·N^(1/2)). • From rate/distortion theory, H(q(S1,…,SN)) ≥ R S1,…,SN(D), since q is a quantizer with mean distortion at most D. • Condition: O(R S1,…,SN(D)) ≤ O(L·N^(1/2))
The rest of the story • There exist a routing algorithm and source codes that require no more than O(R S1,…,SN(D)) bits in O(N^(1/2)) transmissions. • Proof that R S1,…,SN(D) = O(log(N/D)). • When D/N is kept constant, the field generates a bounded amount of information. • If the total distortion is kept constant, R grows only logarithmically in N, well below O(L·N^(1/2)).
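A back-of-the-envelope comparison of the two growth rates (order-of-magnitude stand-ins with constants ignored, for illustration only):

```python
import math

def required_rate(N, D):
    """Order-of-growth stand-in for R S1,...,SN(D) ~ log(N/D),
    with the total distortion D held constant."""
    return math.log2(N / D)

def cut_capacity(N, L=1.0):
    """Bandwidth available across the cut, O(L * N**(1/2))."""
    return L * math.sqrt(N)

D = 1.0
for N in (10 ** 2, 10 ** 4, 10 ** 6):
    print(N, round(required_rate(N, D), 1), round(cut_capacity(N), 1))
# The required rate grows like log N while the cut capacity grows like
# sqrt(N), so for large N the network can transport all the data.
```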