Outline • Introduction to particle filters • Recursive Bayesian estimation • Bayesian importance sampling • Sequential importance sampling (SIS) • Sampling importance resampling (SIR) • Improvements to SIR • On-line Markov chain Monte Carlo • Basic particle filter algorithm • Example for robot localization • Conclusions
Key Idea of Particle Filters • Idea: concentrate samples where we expect the solution to be
Motion Model Reminder • The density of samples represents the expected probability of the robot's location
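As a toy illustration (all numbers here are invented), sampling a simple 1D motion model many times produces exactly such a density: the particles are thickest where the robot is most likely to end up.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical 1D motion model: a commanded move of 1.0 m from pose 5.0 m,
# corrupted by Gaussian odometry noise with std 0.1 m.
start = 5.0
samples = start + 1.0 + rng.normal(0.0, 0.1, size=50_000)

# The sample density peaks at the commanded destination, about 6.0 m.
mean_location = samples.mean()
spread = samples.std()
```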
Global Localization of Robot with Sonar (animation: http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/animations/global-floor.gif) • This is the "lost robot" problem
Particles are used for probability density function approximation
Function Approximation • Particle sets can be used to approximate functions • The more particles fall into an interval, the higher the probability of that interval • How to draw samples from a function/distribution?
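A minimal sketch of this idea (the distribution and interval are arbitrary choices): draw particles from a known distribution and check that the fraction falling inside an interval approximates that interval's probability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw particles from a standard normal; the particle set approximates the pdf.
particles = rng.normal(0.0, 1.0, size=100_000)

# The fraction of particles falling in an interval estimates that interval's
# probability; for a standard normal, P(-1 < x < 1) is about 0.683.
in_interval = np.mean((particles > -1.0) & (particles < 1.0))
```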
Importance Sampling Principle • weight w = f / g • f is often called the target • g is often called the proposal • Pre-condition: g(x) > 0 wherever f(x) > 0
Importance sampling: another example of calculating sample weights • How do we calculate the f/g value formally?
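Formally, each sample x drawn from the proposal g receives weight w = f(x)/g(x), and the weighted average of any quantity then estimates its expectation under the target f. A minimal numeric sketch, with target and proposal chosen arbitrarily as two normal densities:

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # Target density: standard normal N(0, 1).
    return np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

def g(x):
    # Proposal density: wider normal N(0, 2^2); g > 0 wherever f > 0.
    return np.exp(-0.5 * (x / 2.0)**2) / (2.0 * np.sqrt(2.0 * np.pi))

# Sample from the proposal and weight each sample by w = f/g.
x = rng.normal(0.0, 2.0, size=200_000)
w = f(x) / g(x)

# The self-normalized weighted average estimates E_f[x^2], which equals 1
# (the variance of the target).
is_estimate = np.sum(w * x**2) / np.sum(w)
```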
History of Monte Carlo Idea and especially Particle Filters • First attempts – simulations of growing polymers • M. N. Rosenbluth and A.W. Rosenbluth, “Monte Carlo calculation of the average extension of molecular chains,” Journal of Chemical Physics, vol. 23, no. 2, pp. 356–359, 1956. • First application in signal processing - 1993 • N. J. Gordon, D. J. Salmond, and A. F. M. Smith, “Novel approach to nonlinear/non-Gaussian Bayesian state estimation,” IEE Proceedings-F, vol. 140, no. 2, pp. 107–113, 1993. • Books • A. Doucet, N. de Freitas, and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, 2001. • B. Ristic, S. Arulampalam, N. Gordon, Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004. • Tutorials • M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A tutorial on particle filters for online nonlinear/non-gaussian Bayesian tracking,” IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174–188, 2002.
What is the problem we want to solve? • Tracking the state of a system as it evolves over time • Observations arrive sequentially and are noisy or ambiguous • We want the best possible estimate of the hidden variables
Solution: Sequential Update • Storing and processing all incoming measurements is inconvenient and may be impossible • Recursive filtering: • Predict the next state pdf from the current estimate • Update the prediction using sequentially arriving new measurements • Optimal Bayesian solution: recursively calculating the exact posterior density • Approximations to this recursion lead to the various particle filters
Particle Filters • Sequential Monte Carlo methods for on-line learning within a Bayesian framework. • Known as • Particle filters • Sequential sampling-importance resampling (SIR) • Bootstrap filters • Condensation trackers • Interacting particle approximations • Survival of the fittest
Approaches to Particle Filters: Metaphors
Particle filters • Sequential and Monte Carlo properties • Represent the belief by sets of samples, or particles • The weights w_i are nonnegative and are called importance factors • The updating procedure is sequential importance sampling with re-sampling
Tracking in 1D: the blue trajectory is the target. The best of 10 particles is in red.
A short, more formal introduction to Particle Filters and Monte Carlo Localization
Particle filtering ideas • Recursive Bayesian filter by Monte Carlo sampling • The idea: represent the posterior density by a set of random particles with associated weights • Compute estimates based on these samples and weights • (Figure: weighted particles approximating the posterior density over the sample space)
Particle filtering ideas • Particle filters are based on recursive generation of random measures that approximate the distributions of the unknowns • Random measures: particles and importance weights • As new observations become available, the particles and the weights are propagated by exploiting Bayes' theorem • (Figure: weighted particles approximating the posterior density over the sample space)
Mathematical tools needed for Particle Filters • Recall the law of total probability and Bayes' rule
Recursive Bayesian estimation (I) • Recursive filter: estimate the state sequentially as each measurement arrives • System model: x_t = f(x_{t-1}, v_{t-1}), equivalently p(x_t | x_{t-1}) • Measurement model: z_t = h(x_t, n_t), equivalently p(z_t | x_t) • Information available: the measurements z_1:t = {z_1, …, z_t}
Recursive Bayesian estimation (II) • Seek: the posterior p(x_{t+i} | z_1:t) • i = 0: filtering • i > 0: prediction • i < 0: smoothing • Prediction: p(x_t | z_1:t-1) = ∫ p(x_t | x_{t-1}) p(x_{t-1} | z_1:t-1) dx_{t-1} • since the state is Markov: p(x_t | x_{t-1}, z_1:t-1) = p(x_t | x_{t-1})
Recursive Bayesian estimation (III) • Update: p(x_t | z_1:t) = p(z_t | x_t) p(x_t | z_1:t-1) / p(z_t | z_1:t-1) • where the normalizing constant is: p(z_t | z_1:t-1) = ∫ p(z_t | x_t) p(x_t | z_1:t-1) dx_t • since measurements depend only on the current state: p(z_t | x_t, z_1:t-1) = p(z_t | x_t)
Bayes Filters (second pass) • Estimating system state from noisy observations • System state dynamics: p(x_t | x_{t-1}) • Observation dynamics: p(z_t | x_t) • We are interested in the belief, or posterior density: Bel(x_t) = p(x_t | z_1:t)
From the above, the two steps of a Bayes filter: • Predict: p(x_t | z_1:t-1) = ∫ p(x_t | x_{t-1}) p(x_{t-1} | z_1:t-1) dx_{t-1} • Update: p(x_t | z_1:t) ∝ p(z_t | x_t) p(x_t | z_1:t-1)
Assumptions: Markov Process • The state transition depends only on the previous state: p(x_t | x_0:t-1, z_1:t-1) = p(x_t | x_{t-1}) • Each observation depends only on the current state: p(z_t | x_0:t, z_1:t-1) = p(z_t | x_t) • Under these assumptions: • Predict: p(x_t | z_1:t-1) = ∫ p(x_t | x_{t-1}) p(x_{t-1} | z_1:t-1) dx_{t-1} • Update: p(x_t | z_1:t) ∝ p(z_t | x_t) p(x_t | z_1:t-1)
Bayes Filter • How do we use it? What else do we need to know? • Motion model: p(x_t | x_{t-1}) • Perceptual model: p(z_t | x_t) • Start from: an initial belief p(x_0)
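A minimal discrete-grid sketch of this recursion, using an invented map (a circular 1D corridor with doors at cells 0, 1, and 5) and invented sensor probabilities (0.9 hit / 0.1 miss):

```python
import numpy as np

# Discrete Bayes filter on a circular 1D corridor of 10 cells.
n = 10
doors = np.zeros(n)
doors[[0, 1, 5]] = 1.0                # invented map: doors at cells 0, 1, 5

belief = np.full(n, 1.0 / n)          # start from a uniform initial belief

def predict(belief, move=1):
    # Motion model: shift the belief by `move` cells (motion noise omitted).
    return np.roll(belief, move)

def update(belief, z):
    # Perceptual model: p(z | x) = 0.9 where the map agrees with the
    # measurement, 0.1 where it disagrees (invented sensor probabilities).
    likelihood = np.where(doors == z, 0.9, 0.1)
    posterior = likelihood * belief
    return posterior / posterior.sum()

belief = update(belief, z=1.0)        # robot senses a door
belief = predict(belief, move=1)      # robot moves one cell to the right
belief = update(belief, z=1.0)        # senses a door again
# Only cell 1 is consistent with "door, move 1, door", so the belief peaks there.
```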
Example 1: theoretical PDF
Example 1: theoretical PDF • Step 0: initialization • Step 1: updating
Example 2: Particle Filter • Step 0: initialization. Each particle has the same weight • Step 1: updating weights. Weights are proportional to p(z|x)
Example 1 (continued) • Step 2: predicting • Step 3: updating • Step 4: predicting
Example 2: Particle Filter (continued) • Step 2: predicting. Predict the new locations of particles. • Step 3: updating weights. Weights are proportional to p(z|x). • Step 4: predicting. Predict the new locations of particles. • Particles are more concentrated in the region where the person is more likely to be
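Putting steps 0-4 together, a toy 1D particle filter tracking a target that moves +1 per time step; all motion and measurement noise values here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

n_particles = 5000
true_x = 0.0

# Step 0: initialization - spread particles uniformly; all weights equal.
particles = rng.uniform(-10.0, 10.0, size=n_particles)

for _ in range(30):
    # Simulate the target and a noisy measurement z of its position.
    true_x += 1.0
    z = true_x + rng.normal(0.0, 0.5)

    # Predicting: move each particle through the motion model (+1 with noise).
    particles += 1.0 + rng.normal(0.0, 0.5, size=n_particles)

    # Updating: weights proportional to p(z | x).
    w = np.exp(-0.5 * ((z - particles) / 0.5) ** 2)
    w /= w.sum()

    # Resampling: draw particles in proportion to their weights (SIR), which
    # concentrates particles where the target is more likely to be.
    particles = particles[rng.choice(n_particles, size=n_particles, p=w)]

state_estimate = particles.mean()     # close to the true position, 30.0
```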
Comparing the Particle Filter (Example 2) with the Bayes Filter with Known Distribution (Example 1) • Updating: Example 1 vs. Example 2 • Predicting: Example 1 vs. Example 2
Classical approximations • Analytical methods: • Extended Kalman filter, • Gaussian sums… (Alspach et al. 1971) • Perform poorly in numerous cases of interest • Numerical methods: • point-mass approximations, • splines (Bucy 1971; de Figueiro 1974…) • Very complex to implement, not flexible.
Mobile Robot Localization (Monte Carlo Localization) • Each particle is a potential pose of the robot • Proposal distribution is the motion model of the robot (prediction step) • The observation model is used to compute the importance weight (correction step)
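A Monte Carlo Localization sketch along these lines, with an invented 1D corridor map, sonar noise, and motion commands; each particle is a candidate robot pose, the motion model serves as the proposal, and the sonar observation model supplies the importance weights:

```python
import numpy as np

rng = np.random.default_rng(3)

corridor_len = 20.0
true_pose = 3.0

# Lost-robot problem: start from a uniform prior over the corridor.
particles = rng.uniform(0.0, corridor_len, size=2000)

def sonar(pose):
    # Observation model: sonar range to the far wall of the corridor.
    return corridor_len - pose

for _ in range(10):
    true_pose += 1.0
    # Prediction step: sample each particle from the motion model (proposal).
    particles = np.clip(
        particles + 1.0 + rng.normal(0.0, 0.2, size=particles.size),
        0.0, corridor_len,
    )

    # Correction step: importance weights from the observation model p(z | x).
    z = sonar(true_pose) + rng.normal(0.0, 0.3)
    w = np.exp(-0.5 * ((z - sonar(particles)) / 0.3) ** 2)
    w /= w.sum()

    # Resample so the surviving particles concentrate around likely poses.
    particles = particles[rng.choice(particles.size, size=particles.size, p=w)]

pose_estimate = particles.mean()      # close to the true pose, 13.0
```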