Consensus-Based Distributed Least-Mean Square Algorithm Using Wireless Ad Hoc Networks



  1. Consensus-Based Distributed Least-Mean Square Algorithm Using Wireless Ad Hoc Networks Gonzalo Mateos, Ioannis Schizas and Georgios B. Giannakis ECE Department, University of Minnesota Acknowledgment: ARL/CTA grant no. DAAD19-01-2-0011

  2. Motivation
  • Estimation using ad hoc WSNs raises exciting challenges
    • Communication constraints
    • Limited power budget
    • Lack of hierarchy / decentralized processing → consensus via single-hop communications
  • Unique features
    • Environment is constantly changing (e.g., WSN topology)
    • Lack of statistical information at the sensor level
  • Bottom line: algorithms are required to be
    • Resource efficient
    • Simple and flexible
    • Adaptive and robust to changes

  3. Prior Works
  • Single-shot distributed estimation algorithms
    • Consensus averaging [Xiao-Boyd '05], [Tsitsiklis-Bertsekas '86, '97]
    • Incremental strategies [Rabbat-Nowak et al. '05]
    • Deterministic and random parameter estimation [Schizas et al. '06]
  • Consensus-based Kalman tracking using ad hoc WSNs
    • MSE-optimal filtering and smoothing [Schizas et al. '07]
    • Suboptimal approaches [Olfati-Saber '05], [Spanos et al. '05]
  • Distributed adaptive estimation and filtering
    • LMS and RLS learning rules [Lopes-Sayed '06, '07]

  4. Problem Statement
  • Ad hoc WSN with J sensors
    • Single-hop communications only; sensor j's neighborhood is N_j
    • Connectivity information captured in the neighborhoods {N_j}, j = 1, ..., J
    • Zero-mean additive (e.g., Rx, quantization) noise
  • Each sensor j, at time instant t
    • Acquires a regressor h_j(t) and a scalar observation x_j(t)
    • Both zero-mean w.l.o.g. and spatially uncorrelated
  • Least-mean-squares (LMS) estimation problem of interest: estimate the common weight vector s_0 (a rendering follows below)
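A plausible rendering of the estimation problem the last bullet refers to. The symbols h_j(t), x_j(t), and s_0 follow the generic notation introduced above; they are assumptions of this sketch, since the slide's original equation did not survive transcription:

```latex
\mathbf{s}_0 = \arg\min_{\mathbf{s}} \; \sum_{j=1}^{J}
\mathbb{E}\!\left[\left(x_j(t) - \mathbf{h}_j^{T}(t)\,\mathbf{s}\right)^{2}\right]
```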

  5. Centralized Approaches
  • If x_j(t) and h_j(t) are jointly stationary → Wiener solution
  • If the global (cross-)covariance matrices are available → steepest descent converges while avoiding matrix inversion
  • If the (cross-)covariance information is not available or is time-varying → low complexity suggests (C-)LMS adaptation (a sketch follows below)
  Goal: develop a distributed (D-)LMS algorithm for ad hoc WSNs
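A minimal sketch of the (C-)LMS adaptation the slide points to, assuming the linear-regression notation above; function and parameter names are illustrative:

```python
import numpy as np

def clms_step(s, h, x, mu):
    """One centralized LMS iteration: s <- s + mu * e * h,
    with scalar a priori error e = x - h^T s."""
    e = x - h @ s          # a priori estimation error
    return s + mu * e * h  # stochastic-gradient correction

# Usage: learn a fixed weight vector from noisy linear observations.
rng = np.random.default_rng(1)
s_true, s = rng.standard_normal(4), np.zeros(4)
for _ in range(5000):
    h = rng.standard_normal(4)
    x = h @ s_true + 0.1 * rng.standard_normal()
    s = clms_step(s, h, x, mu=0.01)
```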

  6. A Useful Reformulation
  • Introduce the bridge sensor subset B
    1) For all sensors j, there exists a bridge sensor b ∈ B such that b ∈ N_j
    2) For every pair of bridge sensors, there must exist a path of single-hop links joining them
  • Consider the convex, constrained optimization (a rendering follows below)
  Proposition [Schizas et al. '06]: If B satisfies 1)-2) and the WSN is connected, then the constrained problem is equivalent to the centralized estimation problem.
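A plausible rendering of the constrained reformulation under the assumed notation: local copies s_j are forced to agree through the bridge variables s̄_b:

```latex
\min_{\{\mathbf{s}_j\}} \; \sum_{j=1}^{J}
\mathbb{E}\!\left[\left(x_j(t) - \mathbf{h}_j^{T}(t)\,\mathbf{s}_j\right)^{2}\right]
\quad \text{s.t.} \quad
\mathbf{s}_j = \bar{\mathbf{s}}_b \;\; \forall\, j, \;\; \forall\, b \in \mathcal{B}\cap\mathcal{N}_j
```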

  7. Algorithm Construction
  • Problem of interest: the constrained reformulation above
  • Two key steps in deriving D-LMS
    • Resort to the alternating-direction method of multipliers → gain the desired degree of parallelization
    • Apply stochastic approximation ideas → cope with the unavailability of statistical information

  8. Derivation of Recursions
  • Associated augmented Lagrangian (a rendering follows below)
  • Alternating-direction method of multipliers → three-step iterative update process
    • Step 1: Multipliers — dual iteration
    • Step 2: Local estimates — minimize w.r.t. s_j
    • Step 3: Bridge variables — minimize w.r.t. s̄_b
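A sketch of the associated augmented Lagrangian in its standard form, with multipliers v_j^b and penalty coefficient c (both names are assumptions of this rendering):

```latex
\mathcal{L}_a = \sum_{j=1}^{J}
\mathbb{E}\!\left[\left(x_j(t)-\mathbf{h}_j^{T}(t)\,\mathbf{s}_j\right)^{2}\right]
+ \sum_{j=1}^{J}\sum_{b \in \mathcal{B}\cap\mathcal{N}_j}
\left[(\mathbf{v}_j^{b})^{T}(\mathbf{s}_j-\bar{\mathbf{s}}_b)
+ \frac{c}{2}\left\|\mathbf{s}_j-\bar{\mathbf{s}}_b\right\|^{2}\right]
```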

  9. Multiplier Updates
  • Recall the constraints s_j = s̄_b, for each bridge sensor b ∈ B ∩ N_j
  • Use a standard method-of-multipliers type of update (a sketch follows below)
  • Requires the bridge variables s̄_b from the bridge neighborhood
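A sketch of the dual iteration in the standard method-of-multipliers form on the constraint s_j = s̄_b; the penalty coefficient c and all variable names are assumptions, since the slide's equation was lost:

```python
def multiplier_update(v_jb, s_j, s_bar_b, c):
    """Dual ascent on the consensus constraint s_j = s_bar_b:
    v_j^b(t) = v_j^b(t-1) + c * (s_j(t) - s_bar_b(t))."""
    return v_jb + c * (s_j - s_bar_b)
```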

  10. Local Estimate Updates
  • Given by the local optimization of the augmented Lagrangian w.r.t. s_j
  • First-order optimality condition
  • Proposed recursion inspired by the Robbins-Monro algorithm (a sketch follows below)
    • e_j(t) = x_j(t) − h_j^T(t) s_j(t−1) is the local a priori error
    • μ is a constant step-size
  • Requires
    • Already-acquired bridge variables
    • Updated local multipliers
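A hedged sketch of the Robbins-Monro-type recursion: an LMS correction driven by the local a priori error, plus gradient terms from the multipliers and the quadratic penalty (constant factors are absorbed into μ here; the exact expression is in the paper):

```python
def local_estimate_update(s_j, h_j, x_j, mu, c, bridge_vars, multipliers):
    """s_j <- s_j + mu * [ e_j * h_j - sum_b (v_j^b + c * (s_j - s_bar_b)) ]."""
    e_j = x_j - h_j @ s_j  # local a priori error
    penalty_grad = sum(v + c * (s_j - sb)
                       for v, sb in zip(multipliers, bridge_vars))
    return s_j + mu * (e_j * h_j - penalty_grad)
```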

  11. Bridge Variable Updates
  • Similarly, minimizing w.r.t. s̄_b yields a closed-form update (a sketch follows below)
  • Requires
    • The updated local estimates s_j from the neighborhood N_b
    • The multipliers v_j^b from the neighborhood N_b in a startup phase
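A sketch of the closed-form bridge update under the same assumed notation: minimizing the quadratic augmented Lagrangian w.r.t. s̄_b averages the shifted local estimates over the bridge neighborhood N_b:

```python
def bridge_update(local_estimates, multipliers, c):
    """s_bar_b <- average over j in N_b of (s_j + v_j^b / c)."""
    return sum(s + v / c for s, v in zip(local_estimates, multipliers)) / len(local_estimates)
```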

  12. D-LMS Recap and Operation
  [Diagram: per-step Tx/Rx schedule — in Steps 1-2 each sensor transmits its local estimate to, and receives the bridge variable from, its bridge neighbors; in Step 3 each bridge sensor receives from its neighborhood and transmits the updated bridge variable]
  • In the presence of communication noise, the recursions run on the received (noisy) versions of the exchanged variables
  • Simple, fully distributed, only single-hop exchanges needed

  13. Further Insights
  • Manipulating the recursions for s_j and s̄_b yields an equivalent form
  • Introduce the instantaneous consensus error at sensor j
  • The update of s_j becomes a superposition of two learning mechanisms (a sketch follows below)
    • Purely local LMS-type adaptation
    • A PI consensus loop that tracks the consensus set-point
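A sketch of the equivalent two-mechanism form: a purely local LMS step superposed with a PI regulator acting on the instantaneous consensus error. The gains kp and ki stand in for the combinations of μ and c implied by the recursions and are assumptions of this sketch:

```python
def dlms_pi_step(s_j, h_j, x_j, mu, consensus_err, integral, kp, ki):
    """Local LMS adaptation plus a proportional-integral consensus correction."""
    e_j = x_j - h_j @ s_j                 # local a priori error (LMS mechanism)
    integral = integral + consensus_err   # I-term accumulates past consensus errors
    s_next = s_j + mu * e_j * h_j - (kp * consensus_err + ki * integral)
    return s_next, integral
```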

  14. D-LMS Processor
  [Block diagram: sensor j runs a local LMS algorithm in parallel with a consensus loop whose error feeds a PI regulator]
  • Network-wide information enters through the set-point
  • Expect increased performance thanks to the added flexibility in tuning the loop

  15. Mean Analysis
  • Independence setting signal assumptions, for all j:
    • (As1) The regressor h_j(t) is a zero-mean white random vector with covariance matrix R_{h,j} of bounded spectral radius
    • (As2) Observations obey a linear model x_j(t) = h_j^T(t) s_0 + ε_j(t), where ε_j(t) is zero-mean white noise
    • (As3) h_j(t) and ε_j(t) are statistically independent
  • Define the local error y_j(t) = s_0 − s_j(t) and its network-wide stacking y(t)
  Goal: derive sufficient conditions under which E[y(t)] → 0

  16. Dynamics of the Mean
  Lemma: Under (As1)-(As3), consider the D-LMS algorithm initialized with zero local estimates and multipliers. Then E[y(t)] is given by the second-order recursion E[y(t+1)] = Φ₁E[y(t)] + Φ₂E[y(t−1)], where Φ₁ and Φ₂ depend on the step-sizes, the penalty coefficient c, and the local regressor covariances.
  • Equivalent first-order system by state concatenation (a rendering follows below)
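A generic rendering of the state-concatenation step; Φ₁ and Φ₂ stand for the matrices of the Lemma's second-order recursion, whose exact entries were lost in transcription:

```latex
\mathbf{z}(t) \triangleq
\begin{bmatrix} \mathbb{E}[\mathbf{y}(t)] \\ \mathbb{E}[\mathbf{y}(t-1)] \end{bmatrix},
\qquad
\mathbf{z}(t+1) =
\begin{bmatrix} \Phi_1 & \Phi_2 \\ \mathbf{I} & \mathbf{0} \end{bmatrix}\mathbf{z}(t)
\triangleq \Phi\,\mathbf{z}(t)
```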

  17. First-Order Stability Result
  Proposition: Under (As1)-(As3), the D-LMS algorithm whose positive step-sizes and relevant parameters are chosen such that the transition matrix of the equivalent first-order system has spectral radius less than one achieves consensus in the mean sense, i.e., lim_{t→∞} E[s_j(t)] = s_0 for all j.
  • Step-size selection based on local information only (a check is sketched below)
    • Local regressor statistics
    • Bridge neighborhood size
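A small numeric check of the stability condition, assuming the concatenated transition matrix from the rendering above:

```python
import numpy as np

def consensus_in_mean(phi1, phi2):
    """True when the first-order transition matrix Phi has spectral
    radius < 1, i.e., E[y(t)] -> 0 and consensus in the mean holds."""
    d = phi1.shape[0]
    phi = np.block([[phi1, phi2],
                    [np.eye(d), np.zeros((d, d))]])
    return np.max(np.abs(np.linalg.eigvals(phi))) < 1
```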

  18. Simulations
  Setup (numeric parameter values appear on the original slide):
  • True time-varying weight
  • Multi-node WSN
  • Regressors: i.i.d.
  • Observations: per the linear model
  • D-LMS step-size and penalty coefficient
  A self-contained toy version of this experiment is sketched below.
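All parameter values in this sketch are illustrative, not those of the slide; for simplicity every sensor acts as its own bridge, so the bridge variable reduces to a neighborhood average, and a static weight replaces the slide's time-varying one:

```python
import numpy as np

rng = np.random.default_rng(0)
J, p, T = 10, 4, 2000          # sensors, weight dimension, iterations
mu, c = 0.02, 1.0              # step-size and penalty coefficient (illustrative)

# Ring topology: each sensor exchanges with its two single-hop neighbors.
neighbors = [[(j - 1) % J, (j + 1) % J] for j in range(J)]

s0 = rng.standard_normal(p)    # true weight (static in this toy run)
s = np.zeros((J, p))           # local estimates s_j
v = np.zeros((J, p))           # aggregate multipliers per sensor
msd = np.empty(T)

for t in range(T):
    # Bridge variables: neighborhood averages of the current local estimates.
    s_bar = np.array([s[[j] + neighbors[j]].mean(axis=0) for j in range(J)])
    v += c * (s - s_bar)                         # Step 1: multiplier update
    H = rng.standard_normal((J, p))              # white regressors (As1)
    x = H @ s0 + 0.1 * rng.standard_normal(J)    # linear observations (As2)
    e = x - np.sum(H * s, axis=1)                # local a priori errors
    s += mu * (H * e[:, None] - (v + c * (s - s_bar)))  # Step 2 local update
    msd[t] = np.mean(np.sum((s - s0) ** 2, axis=1))

print(f"final network MSD: {msd[-1]:.2e}")
```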

  19. Loop Tuning
  • Adequately selecting the loop parameters actually does make a difference
  • Compared figures of merit (sketched below):
    • MSE (learning curve)
    • MSD (normalized estimation error)
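Hedged implementations of the two figures of merit, using standard definitions since the slide's formulas were lost (the normalization choice is an assumption):

```python
import numpy as np

def mse_learning_curve(errors):
    """MSE learning curve: squared a priori errors averaged across the
    J sensors at each time t; `errors` has shape (T, J)."""
    return np.mean(np.asarray(errors) ** 2, axis=1)

def msd_curve(estimates, s_true):
    """Normalized MSD: ||s0 - s_j(t)||^2 averaged across sensors and
    normalized by ||s0||^2; `estimates` has shape (T, J, p)."""
    err = np.asarray(estimates) - s_true
    return np.mean(np.sum(err ** 2, axis=2), axis=1) / np.sum(s_true ** 2)
```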

  20. Concluding Summary
  • Developed a distributed LMS algorithm for general ad hoc WSNs
  • Intuitive sensor-level processing
    • Local LMS adaptation
    • Tunable PI loop driving the local estimate to consensus
  • Mean analysis under independence assumptions → step-size selection rules based on local information
  • Simulations validate mean-square sense (m.s.s.) convergence and tracking capabilities
  • Ongoing research
    • Stability and performance analysis under general settings
    • Optimality: selection of the bridge sensor subset B
    • D-RLS: estimation/learning performance vs. complexity tradeoff
