
Particle Filters






Presentation Transcript


  1. Particle Filters

  2. Importance Sampling

  3. Importance Sampling • Unfortunately it is often not possible to sample directly from the posterior distribution, but we can use importance sampling. • Let p(x) be a pdf from which it is difficult to draw samples. • Let x^i ~ q(x), i = 1, …, N, be samples that are easily generated from a proposal pdf q, which is called an importance density. • Then an approximation to the density p is given by p(x) ≈ Σ_{i=1}^{N} w^i δ(x − x^i), • where w^i ∝ p(x^i) / q(x^i) are the importance weights, normalized so that Σ_i w^i = 1.
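
The weighted-sum approximation above can be sketched in a few lines of Python. The target and proposal densities here are illustrative assumptions (not from the slides): p is a narrow Gaussian known only up to a constant, q is a wide Gaussian that is easy to sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target p: proportional to N(2, 0.5^2); pretend we only know it up to a constant.
def p_unnorm(x):
    return np.exp(-0.5 * ((x - 2.0) / 0.5) ** 2)

# Proposal (importance density) q: N(0, 2^2), easy to sample from.
def q_pdf(x):
    return np.exp(-0.5 * (x / 2.0) ** 2) / (2.0 * np.sqrt(2 * np.pi))

N = 100_000
x = rng.normal(0.0, 2.0, N)        # x^i ~ q
w = p_unnorm(x) / q_pdf(x)         # unnormalized weights w^i = p(x^i) / q(x^i)
w /= w.sum()                       # normalize so the weights sum to 1

mean_est = np.sum(w * x)           # E_p[x] ~= sum_i w^i x^i
print(mean_est)                    # close to the true mean 2.0
```

Self-normalizing the weights is what lets p be known only up to a constant.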

  4. Bayesian Importance Sampling • By drawing samples from a known, easy-to-sample proposal distribution q(x) we obtain p(x | z) ≈ Σ_{i=1}^{N} w^i δ(x − x^i), • where w^i ∝ p(z | x^i) p(x^i) / q(x^i) • are the normalized importance weights.

  5. Sensor Information: Importance Sampling

  6. Sequential Importance Sampling (I) • Factorizing the proposal distribution: q(x_{0:k} | z_{1:k}) = q(x_k | x_{0:k-1}, z_{1:k}) q(x_{0:k-1} | z_{1:k-1}) • and remembering that the state evolution is modeled as a Markov process, p(x_k | x_{0:k-1}) = p(x_k | x_{k-1}), • we obtain a recursive estimate of the importance weights: w_k^i ∝ w_{k-1}^i p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k). • The factorization of the posterior is obtained by recursively applying Bayes' rule: p(x_{0:k} | z_{1:k}) ∝ p(z_k | x_k) p(x_k | x_{k-1}) p(x_{0:k-1} | z_{1:k-1}).

  7. Sequential Importance Sampling (SIS) Particle Filter • SIS Particle Filter Algorithm • for i=1:N • Draw a particle x_k^i ~ q(x_k | x_{k-1}^i, z_k) • Assign a weight w_k^i = w_{k-1}^i p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k) • end • (k is the index over time and i is the particle index)
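
One SIS step can be sketched as follows. To keep the weight update simple, the proposal here is assumed to be the transition prior p(x_k | x_{k-1}) (the common choice discussed on slide 34), so the update collapses to w_k^i = w_{k-1}^i p(z_k | x_k^i). The random-walk motion and Gaussian measurement model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed model: x_k = x_{k-1} + N(0, 1), z_k = x_k + N(0, 1).
def sis_step(particles, weights, z, rng):
    particles = particles + rng.normal(0.0, 1.0, particles.shape)  # x_k^i ~ p(x_k | x_{k-1}^i)
    like = np.exp(-0.5 * (z - particles) ** 2)                     # p(z_k | x_k^i), up to a constant
    weights = weights * like                                       # w_k^i = w_{k-1}^i * likelihood
    weights /= weights.sum()                                       # normalize
    return particles, weights

N = 5000
particles = rng.normal(0.0, 1.0, N)      # samples from an assumed N(0,1) prior
weights = np.full(N, 1.0 / N)
particles, weights = sis_step(particles, weights, z=3.0, rng=rng)

est = np.sum(weights * particles)        # weighted posterior-mean estimate
print(est)                               # pulled from the prior mean 0 toward z = 3
```

For this linear-Gaussian model the exact posterior mean is 2.0, which the weighted estimate approaches as N grows.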

  8. Rejection Sampling

  9. Rejection Sampling • Let us assume that f(x) < 1 for all x • Sample x from a uniform distribution • Sample c from [0, 1] • If f(x) > c, keep the sample; otherwise reject it • (Figure: an accepted point (x, c) lies under the curve f; a rejected point (x′, c′) lies above f(x′))
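
The accept/reject rule on this slide is directly executable. The target f(x) = x on [0, 1] is an illustrative assumption; it satisfies f(x) ≤ 1, and the accepted samples follow the density proportional to f.

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x):
    return x  # unnormalized target on [0, 1], with f(x) <= 1 everywhere

accepted = []
while len(accepted) < 10_000:
    x = rng.uniform(0.0, 1.0)   # sample x from a uniform distribution
    c = rng.uniform(0.0, 1.0)   # sample c from [0, 1]
    if f(x) > c:                # keep the sample iff it falls under the curve
        accepted.append(x)

samples = np.array(accepted)
print(samples.mean())           # density proportional to x has mean 2/3
```

About half the draws are rejected here, which is rejection sampling's main cost when f is far from the uniform envelope.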

  10. Importance Sampling with Resampling:Landmark Detection Example

  11. Distributions

  12. Distributions • Wanted: samples distributed according to p(x| z1, z2, z3)

  13. This is Easy! • We can draw samples from p(x | z_l) by adding noise to the detection parameters.

  14. Importance Sampling with Resampling • After Resampling

  15. Particle Filter Algorithm

  16. weight = target distribution / proposal distribution

  17. Particle Filter Algorithm • Draw x_{t-1}^i from Bel(x_{t-1}) • Draw x_t^i from p(x_t | x_{t-1}^i, u_{t-1}) • Importance factor for x_t^i: w_t^i = target distribution / proposal distribution ∝ p(z_t | x_t^i)

  18. Particle Filter Algorithm • Algorithm particle_filter(S_{t-1}, u_{t-1}, z_t): • S_t = ∅, η = 0 • For i = 1 … n: Generate new samples • Sample index j(i) from the discrete distribution given by w_{t-1} • Sample x_t^i from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{j(i)} and u_{t-1} • Compute importance weight w_t^i = p(z_t | x_t^i) • Update normalization factor η = η + w_t^i • Insert ⟨x_t^i, w_t^i⟩ into S_t • For i = 1 … n: • Normalize weights w_t^i = w_t^i / η • Return S_t
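
The step on slide 18 can be sketched in Python. The 1-D motion and measurement models here are illustrative assumptions: x_t = x_{t-1} + u_{t-1} + N(0, 0.5²) and z_t = x_t + N(0, 1).

```python
import numpy as np

rng = np.random.default_rng(3)

def particle_filter(S_prev, w_prev, u, z, rng):
    n = len(S_prev)
    S, w = np.empty(n), np.empty(n)
    eta = 0.0                                       # normalization factor
    for i in range(n):
        j = rng.choice(n, p=w_prev)                 # sample index j(i) from {w_{t-1}}
        S[i] = S_prev[j] + u + rng.normal(0, 0.5)   # x_t^i ~ p(x_t | x_{t-1}^{j(i)}, u_{t-1})
        w[i] = np.exp(-0.5 * (z - S[i]) ** 2)       # importance weight w_t^i = p(z_t | x_t^i)
        eta += w[i]                                 # update normalization factor
    return S, w / eta                               # normalize weights

n = 2000
S = rng.normal(0.0, 1.0, n)                         # assumed N(0,1) initial belief
w = np.full(n, 1.0 / n)
S, w = particle_filter(S, w, u=1.0, z=1.2, rng=rng)

est = np.sum(w * S)
print(est)                                          # posterior mean estimate near z
```

Sampling the index j(i) first implements the resampling that distinguishes this algorithm from plain SIS.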

  19. Particle Filter Algorithm

  20. Particle Filter for Localization

  21. Particle Filter in Matlab

  22. Matlab code: truex is a vector of 100 positions to be tracked.
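
The Matlab listing itself did not survive in this transcript; the following Python sketch is a hypothetical stand-in for the same demo: truex holds 100 true positions of a random-walk target, and a bootstrap particle filter tracks it from noisy measurements. All model parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

T, N = 100, 1000
truex = np.cumsum(rng.normal(0.0, 1.0, T))      # vector of 100 true positions to track
z = truex + rng.normal(0.0, 1.0, T)             # noisy measurements

particles = np.zeros(N)
estimates = np.empty(T)
for t in range(T):
    particles += rng.normal(0.0, 1.0, N)        # predict: random-walk motion model
    w = np.exp(-0.5 * (z[t] - particles) ** 2)  # weight by likelihood p(z_t | x_t)
    w /= w.sum()
    estimates[t] = np.sum(w * particles)        # weighted posterior-mean estimate
    idx = rng.choice(N, size=N, p=w)            # resample with replacement
    particles = particles[idx]

rmse = np.sqrt(np.mean((estimates - truex) ** 2))
print(rmse)   # typically below the raw measurement noise of 1.0
```

The filter's RMSE beating the raw measurement noise shows the benefit of fusing the motion model with the measurements.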

  23. Application: Particle Filter for Localization (Known Map)

  24. Resampling

  25. Resampling

  26. Resampling

  27. Resampling Algorithm • Algorithm systematic_resampling(S, n): • S′ = ∅, c_1 = w^1 • For i = 2 … n: c_i = c_{i-1} + w^i (Generate cdf) • u_1 ~ U(0, 1/n], i = 1 (Initialize threshold) • For j = 1 … n: (Draw samples …) • While (u_j > c_i): i = i + 1 (Skip until next threshold reached) • Insert ⟨x^i, 1/n⟩ into S′ • Increment threshold: u_{j+1} = u_j + 1/n • Return S′ • Also called stochastic universal sampling
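
Slide 27's algorithm translates almost line for line into Python: one uniform draw, then n equally spaced thresholds swept over the weight cdf. The example particle set is an illustrative assumption.

```python
import numpy as np

def systematic_resampling(x, w, rng):
    n = len(w)
    c = np.cumsum(w)                      # generate cdf
    u = rng.uniform(0.0, 1.0 / n)         # initialize threshold u_1 ~ U(0, 1/n]
    S, i = [], 0
    for j in range(n):                    # draw samples
        while u > c[i]:                   # skip until next threshold reached
            i += 1
        S.append(x[i])                    # insert x^i (each kept with weight 1/n)
        u += 1.0 / n                      # increment threshold
    return np.array(S)

rng = np.random.default_rng(5)
x = np.array([0.0, 1.0, 2.0, 3.0])
w = np.array([0.1, 0.2, 0.6, 0.1])
print(systematic_resampling(x, w, rng))   # the heavy particle 2.0 is duplicated
```

Because the thresholds are equally spaced, each particle's child count can differ from n·w^i by at most one, which is why this scheme has lower variance than multinomial resampling.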

  28. Low Variance Resampling

  29. SIS weights

  30. Derivation of SIS weights (I) • The main idea is factorizing: q(x_{0:k} | z_{1:k}) = q(x_k | x_{0:k-1}, z_{1:k}) q(x_{0:k-1} | z_{1:k-1}) • and p(x_{0:k} | z_{1:k}) ∝ p(z_k | x_k) p(x_k | x_{k-1}) p(x_{0:k-1} | z_{1:k-1}) • Our goal is to expand p and q recursively in time.

  31. Derivation of SIS weights (II)

  32. Derivation of SIS weights (II) • Substituting the factorizations and, under Markov assumptions, using p(x_k | x_{0:k-1}, z_{1:k-1}) = p(x_k | x_{k-1}) and p(z_k | x_{0:k}, z_{1:k-1}) = p(z_k | x_k), • we obtain the recursion w_k^i ∝ w_{k-1}^i p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k)

  33. SIS Particle Filter Foundation • At each time step k • Random samples x_k^i are drawn from the proposal distribution q(x_k | x_{0:k-1}^i, z_{1:k}) for i = 1, …, N • They represent the posterior distribution using a set of samples or particles: p(x_{0:k} | z_{1:k}) ≈ Σ_{i=1}^{N} w_k^i δ(x_{0:k} − x_{0:k}^i) • Since the weights are given by w_k^i ∝ p(x_{0:k}^i | z_{1:k}) / q(x_{0:k}^i | z_{1:k}) • and q factorizes as q(x_k | x_{0:k-1}, z_{1:k}) q(x_{0:k-1} | z_{1:k-1})

  34. Sequential Importance Sampling (II) • Choice of the proposal distribution: • Choose the proposal function to minimize the variance of the weights w_k (Doucet et al. 1999); the optimal choice is q(x_k | x_{k-1}^i, z_k) = p(x_k | x_{k-1}^i, z_k). • A common choice, though, is the prior distribution: q(x_k | x_{k-1}^i, z_k) = p(x_k | x_{k-1}^i). • We then obtain w_k^i ∝ w_{k-1}^i p(z_k | x_k^i).

  35. Sequential Importance Sampling (III) • Illustration of SIS: • Degeneracy problem: • the variance of the importance ratios increases stochastically over time (Kong et al. 1994; Doucet et al. 1999). • In most cases, after a few iterations all but one particle will have negligible weight.

  36. Sequential Importance Sampling (IV) • Illustration of degeneracy:

  37. SIS: why the variance increase matters • Suppose we want to sample from the posterior, • and choose a proposal density very close to the posterior density. • Then the importance ratio w(x) = p(x) / q(x) ≈ 1, • so E[w] = 1 and Var[w] ≈ 0. • We therefore need the variance of the ratios to stay close to 0 to obtain reasonable estimates; • thus a variance increase has a harmful effect on accuracy.
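
The variance growth can be watched directly through the effective sample size N_eff = 1 / Σ (w^i)², a standard degeneracy diagnostic. This sketch runs plain SIS (no resampling) on an assumed 1-D drift model and records how N_eff collapses.

```python
import numpy as np

rng = np.random.default_rng(6)

N, T = 1000, 50
particles = rng.normal(0.0, 1.0, N)
w = np.full(N, 1.0 / N)
x_true, neff = 0.0, []
for t in range(T):
    x_true += 1.0                                  # true state drifts by +1 each step
    z = x_true + rng.normal(0.0, 1.0)              # noisy measurement
    particles += rng.normal(1.0, 1.0, N)           # propagate with the prior proposal
    w *= np.exp(-0.5 * (z - particles) ** 2)       # multiply in the likelihood (no resampling!)
    w /= w.sum()
    neff.append(1.0 / np.sum(w ** 2))              # effective sample size

print(neff[0], neff[-1])   # N_eff starts near N and collapses over time
```

Once N_eff falls to a handful of particles, the estimate rests on almost a single trajectory, which is exactly the degeneracy that motivates the resampling step of the next slides.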

  38. Sampling-Importance Resampling

  39. Sampling-Importance Resampling • SIS suffers from degeneracy problems, so we don't want to use it as-is! • Introduce a selection (resampling) step to eliminate samples with low importance ratios and multiply samples with high importance ratios. • Resampling maps the weighted random measure {x_k^i, w_k^i} onto the equally weighted random measure {x_k^j, 1/N} • by sampling uniformly with replacement from {x_k^i, i = 1, …, N} with probabilities {w_k^i}. • The scheme generates N_i children for particle i such that Σ_i N_i = N and satisfies E[N_i] = N w_k^i.

  40. Basic SIR Particle Filter - Schematic • (Diagram: initialisation → importance sampling step → extract estimate → resampling step, repeated as each measurement arrives)

  41. Basic SIR Particle Filter algorithm (I) • Initialisation • For i = 1, …, N, sample x_0^i ~ p(x_0) • and set k = 1 • Importance Sampling step • For i = 1, …, N, sample x_k^i ~ p(x_k | x_{k-1}^i) • For i = 1, …, N, compute the importance weights w_k^i = p(z_k | x_k^i) • Normalise the importance weights: w̃_k^i = w_k^i / Σ_j w_k^j

  42. Basic SIR Particle Filter algorithm (II) • Resampling step • Resample with replacement N particles x_k^{i*}, i = 1, …, N • from the set {x_k^i, i = 1, …, N} • according to the normalised importance weights w̃_k^i • Set k = k + 1 and • proceed to the Importance Sampling step as the next measurement arrives.

  43. Resampling

  44. Generic SIR Particle Filter algorithm • M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A tutorial on particle filters …,” IEEE Trans. on Signal Processing, 50( 2), 2002.

  45. Improvements to SIR (I) • Variety of resampling schemes with varying performance in terms of the variance of the particles: • Residual sampling (Liu & Chen, 1998). • Systematic sampling (Carpenter et al., 1999). • Mixture of SIS and SIR, only resample when necessary (Liu & Chen, 1995; Doucet et al., 1999). • Degeneracy may still be a problem: • During resampling, a sample with a high importance weight may be duplicated many times. • Samples may eventually collapse to a single point.

  46. Improvements to SIR (II) • To alleviate numerical degeneracy problems, sample smoothing methods may be adopted. • Roughening (Gordon et al., 1993). • Adds an independent jitter to the resampled particles • Prior boosting (Gordon et al., 1993). • Increase the number of samples from the proposal distribution to M>N, • but in the resampling stage only draw N particles.
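
Roughening is a one-liner in practice. This sketch adds independent Gaussian jitter to resampled particles; the bandwidth K·E·N^(−1/d) (with E the sample range, d the state dimension, K a tuning constant) follows the rule attributed to Gordon et al. (1993), and the particle values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

def roughen(particles, K=0.2):
    # Jitter std K * E * N^(-1/d): E is the sample range, d = 1 here,
    # K is a tuning constant (assumed value for illustration).
    N = len(particles)
    E = particles.max() - particles.min()
    return particles + rng.normal(0.0, K * E * N ** -1.0, N)

x = np.array([1.0, 1.0, 1.0, 2.0])   # duplicates produced by resampling
print(len(np.unique(roughen(x))))    # jitter makes the particles distinct again
```

The jitter trades a little extra diffusion for diversity, which counters the sample-impoverishment failure mode described on slide 45.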

  47. Improvements to SIR (III) • Local Monte Carlo methods for alleviating degeneracy: • Local linearisation - using an EKF (Doucet, 1999; Pitt & Shephard, 1999) or UKF (Doucet et al, 2000) to estimate the importance distribution. • Rejection methods (Müller, 1991; Doucet, 1999; Pitt & Shephard, 1999). • Auxiliary particle filters (Pitt & Shephard, 1999) • Kernel smoothing (Gordon, 1994; Hürzeler & Künsch, 1998; Liu & West, 2000; Musso et al., 2000). • MCMC methods (Müller, 1992; Gordon & Whitby, 1995; Berzuini et al., 1997; Gilks & Berzuini, 1998; Andrieu et al., 1999).

  48. Improvements to SIR (IV) • Illustration of SIR with sample smoothing:

  49. Ingredients for SMC • Importance sampling function: • Gordon et al. → prior p(x_k | x_{k-1}) • Optimal → p(x_k | x_{k-1}, z_k) • UKF → pdf from UKF at time k • Redistribution scheme: • Gordon et al. → SIR • Liu & Chen → Residual • Carpenter et al. → Systematic • Liu & Chen, Doucet et al. → Resample when necessary • Careful initialisation procedure (for efficiency)
