
Estimation from Quantized Signals



Presentation Transcript


  1. Estimation from Quantized Signals Cheng Chang

  2. Outline of the talk • Decentralized Estimation • Model of Random Quantization • Non-isotropic Decentralized Quantization • Isotropic Decentralized Quantization • Conclusions

  3. Decentralized Estimation from Quantized Signals

  4. Model of Random Quantization What is a quantizer? A nonlinear system whose purpose is to transform the input sample  into one of a finite set of prescribed values. [Oppenheim and Schafer] • is a random variable in RL , in this talk,  always has a FINITE support set.
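The textbook definition above can be made concrete with a small sketch. The uniform cells on [0, 1] and the name `quantize` are illustrative assumptions, not from the talk:

```python
# A minimal deterministic M-level quantizer for a scalar input on [0, 1].
# The uniform cells and midpoint reconstruction are made up for illustration.

def quantize(x, M=3):
    """Map x in [0, 1] to one of M prescribed values (here: cell midpoints)."""
    idx = min(int(x * M), M - 1)      # which of the M uniform cells x falls in
    return idx, (idx + 0.5) / M       # (quantization index, reconstruction value)

idx, level = quantize(0.8, M=3)       # x = 0.8 falls in the last of 3 cells
```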

  5. Model of Random Quantization [figure omitted in the transcript]

  6. Model of Random Quantization Definition of random quantization: a map from a subspace (the support set of the input) of R^L to the M-dimensional probability simplex, where M is the size of the output set. Estimation is done at the fusion center. Deterministic quantization and non-subtractive dithering are special cases of random quantization.
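A random quantizer in this sense maps each input to a point on the probability simplex and then samples the output level from that distribution. A minimal sketch for L = 1, M = 3; the particular map `F` (weights decaying with distance to three level centers) is made up for illustration, not the talk's construction:

```python
import random

# Random quantizer sketch: F maps the input to the 3-dimensional
# probability simplex; the output level is then drawn from F(x).

def F(x, centers=(0.0, 0.5, 1.0)):
    """Map x to a point on the probability simplex: P(output = m | input = x)."""
    w = [1.0 / (1e-6 + abs(x - c)) for c in centers]  # closer center -> higher weight
    s = sum(w)
    return [wi / s for wi in w]

def random_quantize(x, rng=random.Random(0)):
    """Sample the quantizer output according to the distribution F(x)."""
    p = F(x)
    return rng.choices(range(len(p)), weights=p)[0]

# A deterministic quantizer is the special case where F(x) is always a
# vertex of the simplex (one probability equals 1).
```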

  7. Model of Random Quantization Example with L=1, M=3 [figure omitted in the transcript]

  8. Model of Random Quantization (N,M) quantizer-network: N independent (not necessarily identical) quantizers, each with M quantization levels. • Lemma 1: The optimal (1,M) quantizer-network exists and is deterministic. • How to find it is another story, outside this talk's scope. • Lemma 2: For any (N,M) quantizer-network, there is an equivalent (same input, same output) (1, M^N) quantizer-network. • An (N,M) network cannot do better than the optimal (1, M^N) quantizer.
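Lemma 2's equivalence can be sketched directly: the N M-ary sensor outputs are exactly the digits of a single index in {0, …, M^N − 1}, so the network and a (1, M^N) quantizer carry the same information (function names below are illustrative):

```python
# Lemma 2 sketch: N sensors each emitting one of M symbols jointly define
# one quantizer with M**N levels -- encode the tuple as a base-M integer.

def merge_outputs(symbols, M):
    """Encode N base-M symbols as one index in {0, ..., M**N - 1}."""
    idx = 0
    for s in symbols:
        idx = idx * M + s
    return idx

def split_index(idx, N, M):
    """Inverse map: recover the N per-sensor symbols from the joint index."""
    out = []
    for _ in range(N):
        out.append(idx % M)
        idx //= M
    return out[::-1]

assert split_index(merge_outputs([1, 0, 2], M=3), N=3, M=3) == [1, 0, 2]
```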

  9. Non-Isotropic Quantization • Def: Sensors may differ, and each sensor sends its ID to the fusion center along with its output. • Theorem 1: There exists an (N,M) non-isotropic quantizer-network that does as well as the optimal (deterministic) (1, M^N) quantizer. • Proof: There is a bijection between the set of deterministic non-isotropic (N,M) quantizer-networks and the set of deterministic (1, M^N) quantizers. • The i-th sensor sends the i-th M-ary digit of the output of the (1, M^N) quantizer.

  10. Non-Isotropic Quantization • Example: N=3, M=2, L=1
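Theorem 1 for this example (N=3, M=2, L=1): a single 8-level quantizer is realized by three binary sensors, sensor i transmitting bit i of the 8-level output together with its ID. The uniform 8-level quantizer `q8` below is an assumed stand-in for the optimal one:

```python
# Theorem 1 illustration: three ID-tagged binary sensors emulate one
# (1, 8) quantizer. q8 (uniform cells on [0, 1]) is made up for the sketch.

def q8(x):
    """A (1, 8) deterministic quantizer: uniform cells on [0, 1]."""
    return min(int(x * 8), 7)

def sensor(i, x):
    """Non-isotropic sensor i sends its ID together with bit i of q8(x)."""
    return (i, (q8(x) >> i) & 1)

def fusion(messages):
    """Fusion center reassembles the 3-bit index from (ID, bit) pairs."""
    return sum(bit << i for i, bit in messages)

x = 0.6
assert fusion([sensor(i, x) for i in range(3)]) == q8(x)
```

Because the IDs travel with the bits, the fusion center recovers the full 3-bit index even if the messages arrive out of order.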

  11. Isotropic Quantizer Network (IQN) • Def: Every single sensor does exactly the same thing; no IDs are needed. • Every sensor uses the same map F_M from the parameter space to the probability simplex. • Notation: (N, M, F_M) IQN.

  12. Isotropic Quantizer Network (IQN) • Example: N=3, M=2. Without IDs, the fusion center cannot distinguish (1 1 0) = (1 0 1) = (0 1 1), nor (1 0 0) = (0 1 0) = (0 0 1); together with (0 0 0) and (1 1 1), there are 4 possible outputs instead of 8 (non-isotropic). • Let K(N,M) be the number of possible outputs of an (N,M) IQN.

  13. Isotropic Quantizer Network (IQN) • Lemma 3: K(N,M) = C(N+M-1, M-1). • Proof: K(N,M) is the number of non-negative integer solutions of a1 + a2 + … + aM = N. • An (N,M) IQN cannot work better than the optimal (1, K(N,M)) quantizer (Lemma 2).
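Lemma 3 can be checked by brute force: since all sensors are identical, the fusion center only sees the multiset of the N symbols, and counting distinct multisets matches the binomial coefficient:

```python
from itertools import product
from math import comb

# Lemma 3 check: K(N, M) = #{non-negative integer solutions of
# a1 + ... + aM = N} = C(N + M - 1, M - 1), verified by enumerating
# all M**N ordered output tuples and collapsing order.

def K_by_enumeration(N, M):
    """Count distinct multisets among all M**N ordered output tuples."""
    return len({tuple(sorted(t)) for t in product(range(M), repeat=N)})

N, M = 3, 2
assert K_by_enumeration(N, M) == comb(N + M - 1, M - 1) == 4  # the N=3, M=2 example
```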

  14. Isotropic Quantizer Network (IQN) • A map F_M is asymptotically better than a map G_M iff there exists V s.t. (N, M, F_M) is better than (N, M, G_M) for all N > V. • Criteria for "better": MSE, …

  15. Isotropic Quantizer Network (IQN) • Lemma 4 (Sanov's theorem): Let X1, X2, …, XN be i.i.d. ~ Q(x), and let E be a set of probability distributions. Then Q^N(E) <= (N+1)^|X| * 2^(-N*D(P*||Q)), where P* = argmin_{P in E} D(P||Q). • The crucial KL distance scales as 1/N.
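A numerical illustration of Sanov's theorem for fair-coin flips: the probability that the empirical distribution lands in a set E decays like 2^(-N·D*), where D* is the minimum KL divergence from E to the true distribution. The set E = {frequency of heads >= 0.7} is an arbitrary example chosen for the sketch:

```python
from math import ceil, comb, log2

# Sanov sketch: exact tail probability of a fair coin vs. the KL exponent.

def D(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q), in bits."""
    return p * log2(p / q) + (1 - p) * log2((1 - p) / (1 - q))

def tail(N, a):
    """Exact P(at least ceil(a*N) heads among N fair coin flips)."""
    return sum(comb(N, k) for k in range(ceil(a * N), N + 1)) / 2 ** N

N, a = 200, 0.7
empirical_exponent = -log2(tail(N, a)) / N  # approaches D(a, 0.5) as N grows
```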

  16. Isotropic Quantizer Network (IQN) • Let H(M) = {measurable functions from R to the M-dimensional probability simplex with only finitely many discontinuity points}. • Theorem 2: For L=1 and M>2, for any F_M in H(M) there exists G_M in H(M) that is asymptotically better than F_M. • Proof: Lemma 4, plus the fact that the Euclidean metric and the KL (Kullback-Leibler) distance induce the same "topologies".

  17. Isotropic Quantizer Network (IQN) • Reason: H(M) is not complete. • A stronger statement may exist. • Can be generalized to higher-dimensional cases (L>1): "if L < M-1, and the map is not weird…". • Need help from Evans Hall.

  18. Isotropic Quantizer Network (IQN) • Theorem 3: Fix M; an (N,M) IQN can do at least as well as the optimal (1, B(M)*N^((M-1)/2)) quantizer asymptotically in N. • Proof: Construction: pack (M-1)-dimensional balls of volume N^(-(M-1)/2) into the probability simplex. • The simplex has volume A(M), and the "radius" of the balls is R(M)*N^(-1/2).

  19. Isotropic Quantizer Network (IQN) [figure omitted in the transcript]

  20. Isotropic Quantizer Network (IQN) • Crucial KL radius: N^(-1); equivalent Euclidean radius: N^(-1/2). • Follows from the Taylor expansion of the KL distance.
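The radius correspondence above follows because, to second order, the KL distance behaves like a squared Euclidean distance, D(p||q) ≈ (1/2)·Σ_i (p_i − q_i)²/q_i, so a KL radius of order 1/N matches a Euclidean radius of order N^(-1/2). A quick numerical check (the reference point q is an arbitrary example):

```python
from math import log

# Second-order Taylor expansion of the KL distance, checked numerically:
# D(p || q) ~= (1/2) * sum_i (p_i - q_i)**2 / q_i for p near q.

def kl(p, q):
    """KL distance D(p || q) in nats."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q))

def half_chi2(p, q):
    """Second-order Taylor approximation of kl(p, q) around p = q."""
    return 0.5 * sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

q = [0.2, 0.3, 0.5]
eps = 1e-3
p = [q[0] + eps, q[1] - eps, q[2]]   # small perturbation staying on the simplex
ratio = kl(p, q) / half_chi2(p, q)   # -> 1 as eps -> 0
```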

  21. Isotropic Quantizer Network (IQN) • Conjecture: Fix M; an (N,M) IQN cannot do better than the optimal (1, D(M)*N^((M-1)/2)) quantizer asymptotically in N.

  22. Conclusions • Quantization: a map from a space to the probability simplex. (Is this new?) • A non-isotropic (N,M) quantizer-network = a quantizer with M^N quantization levels. (Is it trivial?) • An isotropic (N,M) quantizer-network can work as well as a quantizer with N^((M-1)/2) quantization levels asymptotically. (Converse?)

  23. In the report • The noisy case, where each observation is truncated by an i.i.d. r.v. • The reason why (N,M) is preferable to (1, M^N). • If N*lg(M) is held constant, what is the best choice of N?

  24. In the report • A linear universal (unknown-noise) isotropic decentralized estimation scheme (based on dithering).

  25. The End. Thank you!

  26. Q/A • (quantization, "probability simplex"): 16 entries from Google. • Definition of triviality. • I hope so… more in the report.
