Switching Kalman Filters for Prediction and Tracking in an Adaptive Meteorological Sensing Network Victoria Manfredi, Sridhar Mahadevan, Jim Kurose SECON’05 September 28, 2005
Introduction • CASA • Collaborative Adaptive Sensing of the Atmosphere • Distributed, collaborative, adaptive radar network • Where/what, when, and how to sense? • Configure radars based on predicted locations of meteorological phenomena • Our focus? Storm cells
Problem • Track storm cells over time • Use predicted storm locations to identify future radar configurations • Constraints/Assumptions • Existing meteorological algorithms that identify storms from raw radar data • Tracking only a single storm cell • Less than 30 seconds for prediction
Outline • Meteorological vs. Statistical Approaches • Kalman Filter Approaches • Experiments • Conclusions • Future work
Storm Tracking • Extrapolation (simpler) • SCIT: linear least-squares over last five points [JMWMSET98] • Titan: extrapolation plus cross-correlation [DW93] • K-means to identify storm clusters, smooth storm movements with a Kalman filter [LRD03] • Knowledge-intensive (computationally expensive) • Gandolf: model the meteorological evolution of each storm [PHCH00] • Growth and Decay Storm Tracker: track the encompassing storm instead of the storm cell [WFHM98] • Ensemble Kalman Filter: project a set of points forward in time using a meteorological model [E03]
Meteorological vs. Statistical • Meteorological Approaches • Extrapolation • Knowledge-intensive • SCIT: linear least-squares regression [JMWMSET98]; linear, Gaussian, no state (developed at NSSL, Kurt Hondl) • Other Statistical Approaches • Kalman filter: linear, Gaussian, state • Switching Kalman filter: non-linear, Gaussian, state • Goal • Good predictions • Satisfy real-time constraints
Kalman Filter (KF) • Model (linear) dynamics of an object • States, observations: linear function plus Gaussian noise • State transitions: x_{t+1} = A x_t + N[0, Q] • Observations: y_{t+1} = B x_{t+1} + N[0, R] • State X = [lat, long, v_lat, v_long]; Observation Y = [lat, long] (Figure: graphical model with hidden state and observation nodes at t = 1, 2, 3)
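A minimal numpy sketch of one predict/update cycle for a constant-velocity model of this form; the time step dt and the Q, R values are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

dt = 0.5                                             # assumed time step between radar scans

# State x = [lat, long, v_lat, v_long]; constant-velocity transition (illustrative values).
A = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
B = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)            # observe position only
Q = 1e-4 * np.eye(4)                                 # process noise covariance (assumed)
R = 1e-3 * np.eye(2)                                 # observation noise covariance (assumed)

def kf_step(x, P, y):
    """One Kalman filter predict + update given a new observation y = [lat, long]."""
    # Predict: x_{t+1} = A x_t + N[0, Q]
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update with observation y_{t+1} = B x_{t+1} + N[0, R]
    S = B @ P_pred @ B.T + R                         # innovation covariance
    K = P_pred @ B.T @ np.linalg.inv(S)              # Kalman gain
    x_new = x_pred + K @ (y - B @ x_pred)
    P_new = (np.eye(4) - K @ B) @ P_pred
    return x_new, P_new, B @ x_pred                  # B @ x_pred is the predicted location

x, P = np.array([35.0, -97.0, 0.0, 0.0]), np.eye(4)
for y in [np.array([35.01, -97.02]), np.array([35.03, -97.05])]:
    x, P, predicted_location = kf_step(x, P, y)
```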
Switching Kalman Filter (SKF) • Model object dynamics with a set of Kalman filters • Piecewise linear approximation of a nonlinear path • Switch S = which Kalman filter • State transitions: x_{t+1} = A_i x_t + N[0, Q_i] • Observations: y_{t+1} = B_i x_{t+1} + N[0, R_i] • One parameter set per filter: A_i, Q_i, B_i, R_i, … for i = 1, 2, 3 • State X = [lat, long, v_lat, v_long]; Observation Y = [lat, long] (Figure: graphical model with switch, state, and observation nodes at t = 1, 2, 3)
Inference + Prediction: Kalman Filter • Observe, infer, predict • State X = [lat, long, v_lat, v_long]; Observation Y = [lat, long] • Least-squares baseline: use the five most recent observations only (Figure: observe/infer/predict steps over t = 1 to 4)
Inference + Prediction: Switching Kalman Filter • Observe, infer, collapse, take most likely, predict • Switch S = which Kalman filter; State X = [lat, long, v_lat, v_long]; Observation Y = [lat, long] • Switch values unknown, so inference in the SKF is hard • t=1: K possible states with K Kalman filters • t=2: K^2 possible states • … • t=n: K^n possible states • Solution? Approximate inference: generalized pseudo-Bayesian • Order 2: collapse over state and switches two time steps ago • Prediction • Compute the most likely sequence of switches • Use the corresponding KFs to infer the hidden state and predict the next state (Figure: observe/infer/collapse/predict steps over t = 1 to 4)
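A highly simplified sketch of the prediction idea: run a small bank of candidate filters, score each regime by its cumulative observation likelihood, and predict the next location from the best-scoring one. This illustrates the mechanism only; it keeps a single regime for the whole track rather than the per-step switch sequence that GPB2 maintains, and the two regimes (differing only in process noise) are assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

dt = 0.5                                              # assumed time step between scans
A = np.block([[np.eye(2), dt * np.eye(2)],
              [np.zeros((2, 2)), np.eye(2)]])         # constant-velocity transition
B = np.hstack([np.eye(2), np.zeros((2, 2))])          # observe [lat, long] only
R = 1e-3 * np.eye(2)                                  # observation noise (assumed)
Qs = [1e-5 * np.eye(4), 1e-3 * np.eye(4)]             # two regimes: steady vs. maneuvering

def step(x, P, y, Q):
    """One predict + update for one filter; also return the observation log-likelihood."""
    x_pred, P_pred = A @ x, A @ P @ A.T + Q
    S = B @ P_pred @ B.T + R
    loglik = multivariate_normal.logpdf(y, mean=B @ x_pred, cov=S)
    K = P_pred @ B.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (y - B @ x_pred)
    P_new = (np.eye(4) - K @ B) @ P_pred
    return x_new, P_new, loglik

def skf_predict(observations, x0, P0):
    """Run a bank of filters, score each regime by likelihood, predict from the best one."""
    states = [(x0.copy(), P0.copy()) for _ in Qs]
    logliks = np.zeros(len(Qs))
    for y in observations:
        for i, Q in enumerate(Qs):
            x_i, P_i, ll = step(states[i][0], states[i][1], y, Q)
            states[i] = (x_i, P_i)
            logliks[i] += ll
    best = int(np.argmax(logliks))                    # most likely regime overall
    return B @ (A @ states[best][0])                  # predicted next [lat, long]

obs = [np.array([35.00, -97.00]), np.array([35.01, -97.02]), np.array([35.03, -97.05])]
print(skf_predict(obs, x0=np.array([35.0, -97.0, 0.0, 0.0]), P0=np.eye(4)))
```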
Experiments • Compare the Kalman filter, switching Kalman filter, and linear least-squares regression (SCIT [JMWMSET98]) on tracking and predicting storm locations • Data • 35 storm tracks courtesy of Kurt Hondl at NSSL • Each track is a sequence of latitude and longitude coordinates • Tracks range in length from 10 to 30 data points • Identified using SCIT [JMWMSET98]
Parameter Learning • Kalman filter, switching Kalman filter parameters • What are the dynamics of storm cells? • How to obtain a model of the dynamics? • Compare hand-coded parameters (KF, SKF) with parameters learned via EM (KF-EM, SKF-EM) • Expectation-maximization to learn parameters • E-step: assume parameters are known, compute expected values of the hidden variables (state, switch) • M-step: assume values of the hidden variables are known, compute maximum-likelihood parameters
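One way to sketch EM learning for the linear-Gaussian case is with the pykalman library, whose em() call alternates the E-step (Kalman smoothing under the current parameters) and the M-step (maximum-likelihood re-estimation). This is an illustration of the idea, not the paper's implementation; the choice of which parameters to learn, the number of iterations, and the toy track are assumptions.

```python
import numpy as np
from pykalman import KalmanFilter

dt = 0.5                                              # assumed time step between scans
A = np.block([[np.eye(2), dt * np.eye(2)],
              [np.zeros((2, 2)), np.eye(2)]])         # constant-velocity transition
B = np.hstack([np.eye(2), np.zeros((2, 2))])          # observe [lat, long] only

# One storm track: an (n_scans x 2) array of [lat, long] observations (toy data here).
track = np.array([[35.00, -97.00], [35.01, -97.02], [35.03, -97.05], [35.06, -97.09]])

kf = KalmanFilter(
    transition_matrices=A,
    observation_matrices=B,
    initial_state_mean=[track[0, 0], track[0, 1], 0.0, 0.0],
)

# EM: E-step computes smoothed state estimates given the current parameters;
# M-step re-estimates the noise covariances from those expectations.
kf = kf.em(track, n_iter=10,
           em_vars=["transition_covariance", "observation_covariance"])

# Filter the track with the learned parameters and predict the next position.
means, covs = kf.filter(track)
predicted_location = B @ (A @ means[-1])
```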
Results • (Not surprisingly) On the nonlinear track, the switching Kalman filter performs better
Results On linear tracks, both methods perform similarly
Results (prediction error in degrees; for scale, 0.1° lat = 6.9 miles and 0.1° long ≈ 6.9 miles)
Timing • Within timing constraints
Conclusions and Future Work • Although the tracks were identified with the least-squares method (SCIT), KF-EM and SKF have lower prediction error • Can learn storm dynamics to improve the prediction model • Future work • Obtain more data to improve the learned model • Especially for the SKF • Incorporate meteorological information • Track multiple targets and other meteorological phenomena • Combine decision-making with prediction • Add higher layers to the SKF
Thank You. Questions?
References
[JMWMSET98] J. Johnson, P. MacKeen, A. Witt, E. Mitchell, G. Stumpf, M. Eilts, and K. Thomas. The storm cell identification and tracking algorithm: An enhanced WSR-88D algorithm. Weather and Forecasting, 13:263-276, 1998.
[DW93] M. Dixon and G. Wiener. TITAN: Thunderstorm identification, tracking, analysis, and nowcasting: A radar-based methodology. J. Atmos. Ocean. Tech., 10:785-797, 1993.
[LRD03] V. Lakshmanan, R. Rabin, and V. DeBrunner. Multiscale storm identification and forecast. Atmospheric Research, 367-380, 2003.
[PHCH00] C. Pierce, P. Hardaker, C. Collier, and C. Haggett. GANDOLF: A system for generating automated nowcasts of convective precipitation. Meteorol. Appl., 7:341-360, 2000.
[WFHM98] M. Wolfson, B. Forman, R. Hallowell, and M. Moore. The growth and decay storm tracker. American Meteorological Society 79th Annual Conference, 1999.
[E03] G. Evensen. The ensemble Kalman filter: Theoretical formulation and practical implementation. Ocean Dynamics, 53:343-367, 2003.
Generalized Pseudo-Bayesian • Values of the switch variables are unknown, so inference in the SKF is hard • Time step 1: K possible states with K Kalman filters • Time step 2: K^2 possible states • … • Time step n: K^n possible states • Solution? Approximate inference • Generalized pseudo-Bayesian • Variational • Sampling • Viterbi
Generalized Pseudo-Bayesian • Order-two generalized pseudo-Bayesian algorithm • Collapse over everything two time steps ago • x = mean, V = covariance, W = switch probability • (x_j, V_j) = Collapse(x_ij, V_ij, W_i) • x_j = Σ_i W_i x_ij • V_j = Σ_i W_i V_ij + Σ_i W_i (x_ij - x_j)(x_ij - x_j)^T • Covariance depends on the observations through x
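The collapse is moment matching of a Gaussian mixture; a minimal numpy sketch, assuming x_ij are the per-branch means, V_ij the per-branch covariances, and W_i the branch weights:

```python
import numpy as np

def collapse(x_ij, V_ij, W_i):
    """Moment-match a mixture of Gaussians into a single Gaussian (GPB2 collapse step).

    x_ij : (K, d)    per-branch means
    V_ij : (K, d, d) per-branch covariances
    W_i  : (K,)      branch weights, summing to 1
    """
    x_j = np.einsum("i,id->d", W_i, x_ij)                  # x_j = sum_i W_i x_ij
    diff = x_ij - x_j                                      # (K, d)
    V_j = (np.einsum("i,ide->de", W_i, V_ij)               # sum_i W_i V_ij
           + np.einsum("i,id,ie->de", W_i, diff, diff))    # + spread of the branch means
    return x_j, V_j

# Toy example with K = 2 branches in d = 2 dimensions.
x_ij = np.array([[35.01, -97.02], [35.03, -97.05]])
V_ij = np.array([1e-3 * np.eye(2), 2e-3 * np.eye(2)])
W_i = np.array([0.7, 0.3])
mean, cov = collapse(x_ij, V_ij, W_i)
```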
Linear Least-Squares Regression • Given a set of points, find the best-fit line • Assumes constant covariance • Solve Ax = b for the coefficient vector x • If there are too many equations, the problem is over-constrained • Error: difference between the model's response value and the actual value, Ax - b • Minimize the squared vertical distance to the best-fit line, ||Ax - b||^2 • So instead solve the normal equations A^T A x = A^T b for the coefficient vector x
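A minimal sketch of the least-squares baseline: fit a line (position versus time) to the five most recent [lat, long] points and evaluate it one step ahead. The uniform time step and toy track are assumptions; SCIT's actual implementation is not reproduced here.

```python
import numpy as np

def least_squares_predict(track, window=5):
    """Fit lat(t) and long(t) lines to the last `window` points and extrapolate one step."""
    recent = np.asarray(track[-window:], dtype=float)     # (window, 2) array of [lat, long]
    t = np.arange(len(recent), dtype=float)               # assume uniform time steps
    # Design matrix for a line a*t + c; lstsq solves the normal equations A^T A x = A^T b.
    A = np.column_stack([t, np.ones_like(t)])
    coeffs, *_ = np.linalg.lstsq(A, recent, rcond=None)   # (2, 2): slopes and intercepts
    t_next = float(len(recent))
    return np.array([t_next, 1.0]) @ coeffs               # predicted [lat, long]

track = [[35.00, -97.00], [35.01, -97.02], [35.02, -97.05],
         [35.04, -97.07], [35.05, -97.10], [35.07, -97.12]]
print(least_squares_predict(track))
```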
Kalman Filter (KF) • State transitions: x_{t+1} = A x_t + N[0, Q] • Observations: y_{t+1} = B x_{t+1} + N[0, R] • Assume A = identity and Q = zero matrix • Then for all t, x_{t+1} = x_t • This can be used to derive the recursive least-squares update equations • Implies least-squares assumes constant covariance while the KF does not
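A small numerical check of that reduction, assuming a scalar state observed directly (B = 1): with A = 1 and Q = 0, the Kalman update reproduces the recursive least-squares estimate (the running mean of the observations).

```python
import numpy as np

ys = np.array([2.0, 2.4, 1.9, 2.2, 2.1])    # toy scalar observations of a constant state

x, P, R = 0.0, 1e6, 1.0                      # diffuse prior; R is the observation noise
for y in ys:
    x_pred, P_pred = x, P                    # A = 1, Q = 0: prediction changes nothing
    K = P_pred / (P_pred + R)                # Kalman gain
    x = x_pred + K * (y - x_pred)
    P = (1 - K) * P_pred

print(x, ys.mean())   # with a diffuse prior, the KF estimate approaches the batch mean
```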
Inference + Prediction: Kalman Filter • Observe, infer, predict • State X = [lat, long, v_lat, v_long]; Observation Y = [lat, long] (Figure: observe/infer/predict steps over t = 1 to 5)
Inference + Prediction: Switching Kalman Filter • Observe, infer, collapse, predict • Switch; State X = [lat, long, v_lat, v_long]; Observation Y = [lat, long] • Least-squares baseline: use the five most recent observations only (Figure: observe/infer/collapse/predict steps over t = 1 to 5)