Distributed Adaptive Estimation and Tracking using Ad Hoc WSNs
Gonzalo Mateos
ECE Department, University of Minnesota
Acknowledgment: ARL/CTA grant no. DAAD19-01-2-0011; USDoD ARO grant no. W911NF-05-1-0283
Minneapolis, MN, July 29, 2009
Wireless Sensor Networks (WSNs)
• Large number of wireless sensors
  • Randomly deployed
  • Inexpensive
  • Resource constrained
• Unique feature: cooperative effort of sensors
• Promising technology for crucial applications
  • Environmental monitoring
  • Fault diagnosis in the process industry
  • Protection of critical infrastructure
  • Surveillance systems
• Renewed interest in distributed computing
Two Prevailing Topologies
[Figure: FC-based WSN vs. ad hoc WSN; ad hoc operation increases the life expectancy of the WSN]
• Why ad hoc WSNs?
  • Less power consumption as the WSN scales (geographically)
  • Improved robustness to sensor failures
Motivation
• Estimation using ad hoc WSNs raises exciting challenges
  • Communication constraints: single-hop communications, limited power budget
  • Lack of hierarchy → in-network processing via consensus
• Unique features
  • Environment is constantly changing (e.g., WSN topology)
  • Lack of, or variations in, statistical information at the sensor level
• Bottom line: estimation algorithms must be
  • Resource efficient
  • Simple and flexible
  • Adaptive and robust to changes
Subject of the Thesis
• Distributed estimation/tracking algorithms using ad hoc WSNs
  • In-network processing of sensor observations
  • Stability/convergence analysis
  • Quantifiable MSE (tracking) performance
• Distributed (D-) least mean-square (LMS) and recursive least-squares (RLS)
  • Affordable complexity
  • Do not require a data model to be applicable
  • Online data enriches the estimation process
  • Can track slowly time-varying processes
• Explore the complexity vs. performance tradeoff
This Work in Context
• Single-shot distributed estimation algorithms
  • Consensus averaging [Xiao-Boyd '05], [Tsitsiklis-Bertsekas '86, '97]
  • Incremental strategies [Rabbat-Nowak et al. '05]
  • Deterministic and random parameter estimation [Schizas et al. '06]
• Consensus-based Kalman tracking using ad hoc WSNs
  • MSE-optimal filtering and smoothing [Schizas et al. '07]
  • Suboptimal approaches [Olfati-Saber '05], [Spanos et al. '05]
• Distributed adaptive estimation and filtering
  • LMS and RLS learning rules [Lopes-Cattivelli-Sayed '06-08]
• Optimization tools in distributed estimation
  • Incremental strategies
  • Primal-dual approaches
  • Alternating-direction method of multipliers (AD-MoM)
Outline
• Part I: The D-LMS algorithm
  • Algorithm construction and operation
  • Stability results
  • Tracking performance analysis
• Part II: The D-RLS algorithm
  • Reduced-complexity variants
  • Stability and steady-state MSE performance analysis
• Concluding remarks and future research directions
Problem Statement
• Ad hoc WSN with a large number of sensors
• Single-hop communications only; each sensor exchanges data only within its neighborhood
• Connectivity information captured in the communication graph
• Zero-mean additive (e.g., receiver) noise on the links
• Goal: estimate a common signal vector
• Each sensor, at every time instant:
  • Acquires a regressor and a scalar observation
  • Both zero-mean and spatially uncorrelated
• Least mean-squares (LMS) estimation problem of interest (sketched below)
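To fix ideas, a sketch of the network-wide LMS cost follows; the notation (J sensors, unknown vector s, scalar observation x_j(t), regressor h_j(t)) is an assumption made here for illustration, consistent with the problem statement above.

```latex
% Network-wide LMS estimation problem (assumed notation)
\hat{s} = \arg\min_{s}\; \sum_{j=1}^{J}
  \mathbb{E}\!\left[\big(x_j(t) - h_j^{\top}(t)\, s\big)^{2}\right]
```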
Power Spectrum Estimation
• Find the spectral peaks of a narrowband (e.g., seismic) source
• Source modeled as an AR process
• Source-sensor multi-path channels modeled as FIR filters
  • Unknown orders and tap coefficients
• Each sensor observes the source filtered through its local channel, in noise
• Challenges
  • Data model not completely known
  • Channels fade at the frequencies occupied by the source
Centralized Approaches
• If observations and regressors are jointly stationary → Wiener solution
• If the (cross-)covariance matrices are available → steepest descent converges while avoiding matrix inversion
• If (cross-)covariance information is unavailable or time-varying → low complexity suggests (centralized) C-LMS adaptation (see the sketch below)
• Goal: develop a distributed (D-)LMS algorithm for ad hoc WSNs
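For reference, a minimal centralized (C-)LMS sketch under a linear data model; the step size mu and the synthetic data generator are illustrative assumptions, not the thesis's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
p, T, mu = 4, 2000, 0.05
s_true = rng.standard_normal(p)          # unknown parameter vector
s_hat = np.zeros(p)                      # C-LMS estimate

for t in range(T):
    h = rng.standard_normal(p)           # regressor at time t
    x = h @ s_true + 0.1 * rng.standard_normal()  # noisy scalar observation
    e = x - h @ s_hat                    # a priori estimation error
    s_hat += mu * e * h                  # stochastic-gradient (LMS) update

print(np.linalg.norm(s_hat - s_true))    # error norm should be small
```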
Algorithmic Construction
• Consider a convex, constrained optimization: local estimates per sensor, with consensus constraints across neighbors (sketched below)
• Equivalent to the centralized problem for a connected WSN
• Two key steps in deriving D-LMS
  • Resort to the AD-MoM [Glowinski '75] → gain the desired degree of parallelization
  • Apply stochastic approximation ideas → cope with the unavailability of statistical information
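A sketch of the consensus reformulation for a connected WSN, under the same assumed notation (s_j is sensor j's local copy and N_j its single-hop neighborhood):

```latex
% Separable reformulation with local copies and consensus constraints
\min_{\{s_j\}}\; \sum_{j=1}^{J}
  \mathbb{E}\!\left[\big(x_j(t) - h_j^{\top}(t)\, s_j\big)^{2}\right]
\quad \text{s.t.} \quad s_j = s_{j'}, \;\; j' \in \mathcal{N}_j, \;\; j = 1,\dots,J
```

For a connected WSN the constraints force all local copies to agree, so the reformulated problem is equivalent to the centralized one.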
D-LMS Recursions and Operation
• Per-sensor, two-step recursion, valid in the presence of communication noise
  • Step 1: form the local multipliers; transmit to / receive from neighbors
  • Step 2: form the local estimate via an LMS-type adaptation plus a consensus correction
• Reduced communications possible with `bridge' sensors
(A simplified simulation sketch follows.)
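A toy simulation of a consensus-penalized LMS recursion in the spirit of D-LMS. This is a sketch only: the ring topology, step size mu, and penalty c are illustrative assumptions, and the exact multiplier bookkeeping of the thesis recursions is simplified here.

```python
import numpy as np

rng = np.random.default_rng(1)
J, p, T = 10, 4, 3000
mu, c = 0.02, 0.1
s_true = rng.standard_normal(p)

# Ring topology: single-hop neighbors of sensor j
nbrs = [[(j - 1) % J, (j + 1) % J] for j in range(J)]

s = np.zeros((J, p))                     # local estimates s_j(t)
v = np.zeros((J, p))                     # summed local multipliers

for t in range(T):
    s_prev = s.copy()
    # Step 1: multiplier update driven by local consensus errors
    for j in range(J):
        v[j] += c * sum(s_prev[j] - s_prev[k] for k in nbrs[j])
    # Step 2: LMS-type adaptation plus consensus correction
    for j in range(J):
        h = rng.standard_normal(p)
        x = h @ s_true + 0.1 * rng.standard_normal()
        grad = (x - h @ s[j]) * h        # instantaneous LMS gradient
        penalty = v[j] + c * sum(s[j] - s_prev[k] for k in nbrs[j])
        s[j] += mu * (grad - penalty)

print(np.linalg.norm(s - s_true, axis=1).mean())  # average per-sensor error
```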
Consensus Controller Interpretation
[Figure: block diagram at sensor j; the local LMS algorithm feeds a PI regulator inside the consensus loop]
• Consensus error at each sensor drives the update
• Superposition of two learning mechanisms
  • Purely local LMS-type adaptation
  • PI consensus loop: tracks the consensus reference
D-LMS in Action
[Figure: D-LMS tracking a true time-varying weight; WSN simulation with i.i.d. regressors and noisy linear observations]
Error-form D-LMS
• Study the dynamics of the estimation errors across the WSN
• State variables: the local estimation errors and the local sums of multipliers
• (a1) Sensor observations obey a linear data model, where the zero-mean white noise has finite variance
• Lemma: under (a1), the stacked error quantities obey a driven, linear (time-varying) recursion whose transition matrix has a block structure determined by the WSN connectivity
Performance Metrics
• Local (per-sensor) and global (network-wide) metrics of interest
• (a2) Regressors are white Gaussian with known covariance matrix
• (a3) Regressors and noise are independent
• Customary figures of merit: mean-square deviation (MSD) and excess mean-square error (EMSE), both local and global (definitions sketched below)
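Standard definitions consistent with the adaptive-filtering literature; the local error \tilde{s}_j(t) = s_0 - s_j(t) and the remaining symbols are notational assumptions.

```latex
\mathrm{MSD}_j(t) = \mathbb{E}\!\left[\|\tilde{s}_j(t)\|^{2}\right], \qquad
\mathrm{EMSE}_j(t) = \mathbb{E}\!\left[\big(h_j^{\top}(t)\,\tilde{s}_j(t)\big)^{2}\right]
```

The global counterparts aggregate the local metrics across all J sensors.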
Tracking Performance
• (a4) Random-walk model (sketched below): the true parameter evolves as a random walk driven by zero-mean white noise with known covariance, independent of the regressors and the observation noise
• Convenient change of variables for the error state
Proposition: under (a2)-(a4), the covariance matrix of the error state obeys a linear matrix recursion; equivalently, after vectorization, a linear vector recursion.
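A sketch of the random-walk model in (a4), with assumed symbols (s_0(t) the true parameter, zeta(t) the driving noise with covariance Sigma_zeta):

```latex
s_0(t) = s_0(t-1) + \zeta(t), \qquad
\mathbb{E}[\zeta(t)] = 0, \quad
\mathbb{E}\!\left[\zeta(t)\,\zeta^{\top}(\tau)\right] = \Sigma_{\zeta}\,\delta_{t\tau}
```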
Stability and Steady-State Performance
Proposition: under (a1)-(a4), the D-LMS algorithm achieves consensus in the mean, provided the step size is chosen sufficiently small.
Proposition: under (a1)-(a4), the D-LMS algorithm is MSE stable for sufficiently small step sizes.
• Intractable to obtain explicit step-size bounds
• From stability, the steady-state error covariance has bounded entries
• The fixed point of the covariance recursion enables evaluation of all figures of merit in s.s.
Step-size Optimization
• In the tracking setting, there exists an optimum step size minimizing the steady-state EMSE
• Not surprising:
  • Excessive adaptation → MSE inflation
  • Vanishing step size → tracking ability lost
• Hard to obtain the optimal step size in closed form, but easy numerically (a 1-D search; see the sketch below)
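A minimal numerical illustration of the 1-D search. The EMSE model below is a stand-in with the qualitative misadjustment-plus-lag shape from the LMS tracking literature, not the thesis's closed-form expression:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def steady_state_emse(mu, noise_var=0.01, drift_var=1e-4):
    # Qualitative stand-in: misadjustment grows with mu,
    # while lag error from parameter drift grows as mu shrinks.
    return noise_var * mu + drift_var / mu

res = minimize_scalar(steady_state_emse, bounds=(1e-4, 1.0), method="bounded")
print(f"optimal step size ~ {res.x:.4f}")
```

The two terms capture the tradeoff on the slide: the first grows with the step size (excess adaptation), the second blows up as it vanishes (lost tracking ability).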
Available Extensions
• Results hold when communication noise is present
• Tracking an AR(1) signal vector
• Time-correlated, stationary ergodic regressors
  • Estimation errors are weakly stochastically bounded [Solo '97]
  • Almost sure exponential stability in the absence of noise
  • MSE performance analysis via stochastic averaging
Simulated Tests
[Figure: D-LMS learning curves on a WSN with receiver AWGN links; i.i.d. regressors; observations from a linear data model in WGN; both a time-invariant parameter and a random-walk model are tested]
Distributed RLS Estimation
• Motivation: fast convergence when increased complexity is affordable
• Second-order approach: exponentially-weighted least-squares (EWLS) estimator (sketched below)
• The `forgetting' factor discounts past data; choosing it below one enables tracking
• A (small) regularization matrix ensures well-posedness in the initial iterations
• Equivalent separable reformulation for a connected ad hoc WSN
• Solve via AD-MoM
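The EWLS cost, sketched under assumed notation (lambda the forgetting factor, Phi_0 the regularization matrix, other symbols as before):

```latex
\hat{s}(t) = \arg\min_{s}\; \sum_{\tau=0}^{t} \lambda^{\,t-\tau}
  \sum_{j=1}^{J} \big(x_j(\tau) - h_j^{\top}(\tau)\, s\big)^{2}
  \;+\; \lambda^{\,t}\, s^{\top} \Phi_{0}\, s
```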
D-RLS Algorithm
• Two-step recursion per sensor, valid in the presence of communication noise
  • Step 1: update the local multipliers
  • Step 2: update the local estimate, recursively computing the local (inverse) covariance matrix
• The inverse can be propagated recursively at reduced per-iteration cost (single-sensor sketch below)
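For intuition, a single-sensor exponentially-weighted RLS sketch showing the rank-one (matrix-inversion-lemma) update that keeps the per-step cost quadratic in the parameter dimension. The values lam and delta are illustrative, and the distributed multiplier/consensus terms of D-RLS are omitted:

```python
import numpy as np

rng = np.random.default_rng(2)
p, T, lam, delta = 4, 500, 0.98, 1e-2
s_true = rng.standard_normal(p)

P = np.eye(p) / delta        # inverse of the regularized sample covariance
s_hat = np.zeros(p)

for t in range(T):
    h = rng.standard_normal(p)
    x = h @ s_true + 0.1 * rng.standard_normal()
    # Rank-one (matrix inversion lemma) update: O(p^2) per step
    k = P @ h / (lam + h @ P @ h)        # gain vector
    P = (P - np.outer(k, h @ P)) / lam
    s_hat += k * (x - h @ s_hat)         # a priori error correction

print(np.linalg.norm(s_hat - s_true))
```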
Remarks
• Communication exchanges and cost identical to D-LMS: only vectors are exchanged, never matrices
• Raw data are not exchanged → resilience to communication noise
• D-RLS provides its own regularization → the regularization matrix can be made arbitrarily small
• Multiplier updates identical to D-LMS
• Increased cost in updating the local estimates: D-LMS pays a low per-iteration cost, while D-RLS pays the higher cost of the matrix updates (reduced when the inverse can be propagated recursively)
• D-LMS/D-RLS do not require a Hamiltonian cycle
D-RLS in Action
[Figure: global MSD(t) and MSE(t) evolution, comparing D-RLS with diffusion RLS using Metropolis weights; WSN simulation with i.i.d. regressors]
Spectrum Estimation Task
[Figure: WSN spectrum estimation; the source is AR(4); sensors 3, 7, 15 and 27 have a channel zero at a frequency occupied by the source; shown are the D-LMS estimates at sensor 15 and the global MSE(t) evolution]
D-RLS with Ideal Links
• In the absence of communication noise, the recursions simplify
• After a suitable change of variables, the local estimate updates take a simpler two-step form
• Savings: multipliers need not be exchanged
Alternating Minimization Algorithm
• Consider a convex separable problem with linear coupling constraints
• Form the (ordinary) Lagrangian and the augmented Lagrangian
• AMA [Tseng '91] cycles through [S1]-[S3]: minimize over the first primal block, minimize over the second, then ascend in the multipliers
• AD-MoM [Glowinski '75] differs in one primal step: both primal minimizations use the augmented Lagrangian (scheme sketched below)
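A sketch of the two schemes for the canonical problem min f(x) + g(z) s.t. Ax + Bz = c; the canonical form and the penalty coefficient rho are assumptions here, and AMA requires f strictly convex:

```latex
\text{[S1]}\;\; x^{k+1} = \arg\min_{x}\; f(x) + \lambda^{k\top} A x
  \quad \text{(AMA: plain Lagrangian; AD-MoM also adds } \tfrac{\rho}{2}\|Ax + Bz^{k} - c\|^{2}\text{)} \\
\text{[S2]}\;\; z^{k+1} = \arg\min_{z}\; g(z) + \lambda^{k\top} B z
  + \tfrac{\rho}{2}\,\|A x^{k+1} + B z - c\|^{2} \\
\text{[S3]}\;\; \lambda^{k+1} = \lambda^{k} + \rho\,\big(A x^{k+1} + B z^{k+1} - c\big)
```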
AMA-based D-RLS
• Goal: reduce the complexity of updating the local estimates
• Setting the consensus penalty to zero, D-RLS reduces to (non-cooperative) local RLS (L-RLS)
• Apply AMA instead of AD-MoM (the EWLSE cost is strictly convex, so AMA applies)
• Savings: the quadratic penalty drops out of the estimate update at every sensor, lowering its per-iteration complexity
MSE Analysis Preliminaries
• Analysis challenging: finding the distribution of the sample (inverse) covariance matrices is typically intractable
• Resort to simplifying assumptions:
  • (a1) Sensor observations obey a linear data model, where the zero-mean white noise has finite variance
  • (a2) Regressors are white with known covariance matrix
  • (a3) Regressors, observation noise, and communication noise are mutually independent
  • Plus suitable approximations for the sample covariance matrices and their inverses
• Approach: form an `averaged' error-form D-RLS system
Overview of Results
Proposition: under (a1)-(a3), the D-RLS algorithm achieves consensus in the mean, provided the algorithm parameters are suitably chosen.
• As for D-LMS, a closed-form recursion for the error covariance matrix
• The approximation is only valid for large t
• Vectorized recursion → sufficient condition for MSE stability
• Solve for the steady-state covariance from a fixed-point equation
• Enables evaluation of all figures of merit in s.s.
• Results account for communication noise
Simulated Tests
[Figure: D-LMS vs. D-RLS learning curves on a WSN with receiver AWGN links; i.i.d. regressors; observations from a linear data model in WGN]
Concluding Summary
• Developed D-LMS/D-RLS algorithms for general ad hoc WSNs
  • Estimators expressed as separable minimization problems
• Detailed stability and MSE performance analysis for D-LMS
  • Stationary setup, time-invariant parameter vector
  • Tracking a random-walk/stable AR(1) process
• D-RLS: complexity vs. performance tradeoff
  • Reduced-complexity variants
  • Local and network-wide figures of merit in s.s.
• Ongoing research
  • Tracking s.s. performance analysis for D-RLS
  • Distributed lasso for estimation of sparse signals
Related Publications
• Journal publications:
  • I. D. Schizas, G. Mateos, and G. B. Giannakis, "Distributed LMS for Consensus-Based In-Network Adaptive Processing," IEEE Transactions on Signal Processing, vol. 57, no. 6, pp. 2365-2381, June 2009.
  • G. Mateos, I. D. Schizas, and G. B. Giannakis, "Distributed Recursive Least-Squares for Consensus-Based In-Network Adaptive Estimation," IEEE Transactions on Signal Processing, 2009 (to appear).
  • G. Mateos, I. D. Schizas, and G. B. Giannakis, "Performance Analysis of the Consensus-Based Distributed LMS Algorithm," EURASIP Journal on Advances in Signal Processing, submitted May 2009.
• Conference papers:
  • G. Mateos, I. D. Schizas, and G. B. Giannakis, "Distributed Least-Mean Square Algorithm Using Wireless Ad Hoc Networks," Proc. of 45th Allerton Conf., Univ. of Illinois at U-C, Monticello, IL, Sept. 26-28, 2007.
  • I. D. Schizas, G. Mateos, and G. B. Giannakis, "Distributed Recursive Least-Squares Using Wireless Ad Hoc Sensor Networks," Proc. of 41st Asilomar Conf. on Signals, Systems, and Computers, Pacific Grove, CA, Nov. 4-7, 2007.
  • I. D. Schizas, G. Mateos, and G. B. Giannakis, "Stability Analysis of the Consensus-Based Distributed LMS Algorithm," Proc. of Intl. Conf. on Acoustics, Speech and Signal Processing, Las Vegas, NV, March 30-April 4, 2008.
  • G. Mateos, I. D. Schizas, and G. B. Giannakis, "Closed-Form MSE Performance of the Distributed LMS Algorithm," Proc. of DSP Workshop, Marco Island, FL, January 4-7, 2009.
Deriving D-LMS
• Write the consensus constraints using auxiliary `bridge' variables
• Form the augmented Lagrangian
• Apply the AD-MoM, cycling through [S1]-[S3]: minimize over the local estimates, minimize over the auxiliary variables, then ascend in the multipliers
Deriving D-LMS (cont.)
• [S1]-[S3] boil down to simpler recursions (one set of multipliers is redundant)
• The estimate update follows from the first-order optimality condition
• Obtain the stochastic recursion via a Robbins-Monro iteration (sketched below)
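A sketch of the Robbins-Monro step under the assumed notation: the statistical gradient appearing in the first-order optimality condition is replaced by its instantaneous approximation,

```latex
\mathbb{E}\!\left[h_j(t)\big(x_j(t) - h_j^{\top}(t)\, s_j\big)\right]
\;\longrightarrow\;
h_j(t)\big(x_j(t) - h_j^{\top}(t)\, s_j(t)\big),
```

which, combined with the multiplier and consensus-penalty terms, yields the stochastic Step-2 update of D-LMS.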