Particle Filters. Dalian University of Technology, Naigao Jin. 2009-01-03
Introduction • Sequential Monte Carlo Methods in Practice, Springer-Verlag, 2001 • IEEE Transactions on Signal Processing, special issue on Monte Carlo Methods for Statistical Signal Processing, 2002, 50(2) • Proceedings of the IEEE, special issue on Sequential State Estimation, 2004, 92(3) • Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004
Monte Carlo Method • Ulam • von Neumann • Metropolis • Fermi
Buffon's needle experiment. N: number of needle drops; M: number of crossings with the parallel lines; D: line spacing; L: needle length
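With these quantities, a needle of length L (L ≤ D) crosses a line with probability 2L/(πD), so π ≈ 2LN/(DM). A minimal simulation of the experiment (parameter values are illustrative):

```python
import math
import random

def buffon_pi(n, needle_len=1.0, spacing=2.0):
    """Estimate pi by Buffon's needle: a needle of length L is dropped onto a
    floor ruled with parallel lines a distance D apart (L <= D).
    P(crossing) = 2L / (pi * D), so pi ~= 2 * L * N / (D * M)."""
    crossings = 0
    for _ in range(n):
        # Distance from the needle's centre to the nearest line, and its angle.
        y = random.uniform(0.0, spacing / 2.0)
        theta = random.uniform(0.0, math.pi / 2.0)
        if y <= (needle_len / 2.0) * math.sin(theta):
            crossings += 1
    return 2.0 * needle_len * n / (spacing * crossings)

random.seed(0)
print(buffon_pi(200_000))  # close to math.pi
```

The estimate converges slowly (error shrinks like 1/√N), which is typical of plain Monte Carlo and motivates the variance-reduction techniques discussed later.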
Monte Carlo Methods • Importance sampling • Rejection sampling • Metropolis-Hastings • Gibbs sampling
Sequential Monte Carlo • Bootstrap filtering (Gordon 1993) • Condensation algorithm (Isard and Blake 1996) • Particle filtering (Doucet 2001)
Sequential Monte Carlo • In the 1950s, Hammersley applied Monte Carlo methods based on sequential importance sampling (SIS) to problems in statistics. • In the late 1960s, Handschin and Mayne used sequential Monte Carlo methods on problems in automatic control. • In the 1970s: Handschin, Akashi, Zaritskii
Sequential Monte Carlo • Tanizaki, Geweke, and others used importance-sampling-based Monte Carlo methods to solve a series of high-dimensional integration problems. • The sampling-resampling idea of Smith and Gelfand provided an easily implemented computational strategy for Bayesian inference. • In the early 1990s, Smith, working with Gordon and others, introduced a resampling step into particle filtering, yielding the bootstrap filter. • Nodestar, part of the US Navy's integrated undersea surveillance system, is one application of particle filtering
Applications of Particle Filters • Navigation, Positioning, Tracking • Channel equalization
Fundamental Concepts • Bayesian inference • Monte Carlo Simulation • Sequential Importance Sampling • Resampling
Bayesian Inference • X is unknown: a random variable or a set (vector) of random variables • Y is observed: also a set of random variables • We wish to infer X by observing Y. • The probability distribution p(x) models our prior knowledge of X. • The conditional probability distribution p(Y|X) models the relationship between Y and X.
Bayesian Filtering • General problem statement
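The general problem is to estimate the state x_k of a dynamic system from the observations y_{1:k} received so far. The standard recursive Bayesian filtering solution alternates two steps:

```latex
\begin{align*}
\text{Predict:}\quad & p(x_k \mid y_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid y_{1:k-1})\, dx_{k-1} \\
\text{Update:}\quad  & p(x_k \mid y_{1:k}) \propto p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})
\end{align*}
```

For linear-Gaussian models these recursions reduce to the Kalman filter; in general the integrals have no closed form, which is where Monte Carlo methods enter.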
Bayes Theorem The conditional distribution p(x|y) represents posterior information about x given y.
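In symbols:

```latex
p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)}, \qquad
p(y) = \int p(y \mid x)\, p(x)\, dx
```

The normalising constant p(y) is the integral that is typically intractable in high dimensions.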
Monte Carlo Sampling • State-space model; problem: estimate the posterior • The integrals are not tractable → solution: Monte Carlo sampling • It is difficult to draw samples from the posterior → solution: importance sampling
Monte Carlo Simulation • The posterior distribution p(x|y) may be difficult or impossible to compute in closed form. • An alternative is to represent p(x|y) using Monte Carlo samples (particles). • Each particle has a value and a weight.
Importance Sampling • Ideally, the particles would represent samples drawn from the distribution p(x|y). • In practice, we usually cannot get p(x|y) in closed form; in any case, it would usually be difficult to draw samples from p(x|y). • Importance sampling introduces a known, easy-to-sample proposal distribution; the weights describe the discrepancy between the proposal and the actual posterior distribution. Importance sampling is a variance-reduction strategy in Monte Carlo integration; in Bayesian filtering, the weighted particle set can be viewed as an approximation of the posterior probability density.
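A minimal sketch of self-normalised importance sampling. The target, proposal, and test function here are illustrative choices, not from the slides: the target p is N(0,1), the proposal q is the easier-to-cover N(0,2), and we estimate E_p[X²] = 1.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def importance_estimate(f, n=100_000):
    """Estimate E_p[f(X)] for target p = N(0,1) by drawing from the proposal
    q = N(0,2) and reweighting each draw by w = p(x)/q(x)
    (self-normalised importance sampling)."""
    total_fw = 0.0
    total_w = 0.0
    for _ in range(n):
        x = random.gauss(0.0, 2.0)                              # sample from q
        w = normal_pdf(x, 0.0, 1.0) / normal_pdf(x, 0.0, 2.0)   # importance weight
        total_fw += w * f(x)
        total_w += w
    return total_fw / total_w   # normalising by sum(w) cancels unknown constants

random.seed(1)
print(importance_estimate(lambda x: x * x))  # E[X^2] under N(0,1) is 1
```

Self-normalising the weights is what lets the method work with an unnormalised target, exactly the situation in Bayesian filtering where p(y) is unknown.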
Sequential Importance Sampling • The particle weights admit a recursive form.
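The standard SIS weight recursion, for importance density q (as given, for instance, in the Arulampalam et al. tutorial cited at the end), is:

```latex
w_k^{(i)} \;\propto\; w_{k-1}^{(i)}\,
  \frac{p\!\left(y_k \mid x_k^{(i)}\right)\, p\!\left(x_k^{(i)} \mid x_{k-1}^{(i)}\right)}
       {q\!\left(x_k^{(i)} \mid x_{k-1}^{(i)},\, y_k\right)}
```

When q is chosen as the transition prior p(x_k | x_{k-1}), the recursion simplifies to w_k ∝ w_{k-1} p(y_k | x_k): the bootstrap filter.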
Resampling • Ideally, after a number of iterations the variance of the weights would tend to zero, yielding a correct estimate. In the SIS algorithm, however, the variance of the weights grows over time, producing weight degeneracy. • In 1993, Gordon introduced the idea of resampling to overcome this problem, which broadened the range of application of particle filtering. • The basic idea of resampling is to discard particles with small weights, which are certainly of no interest, and replace them with copies of particles with large weights.
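Putting SIS and resampling together gives the bootstrap filter. The sketch below uses an illustrative toy model (random-walk state, direct noisy observation of the state) rather than anything from the slides; it resamples at every step, as the basic bootstrap filter does.

```python
import math
import random

def bootstrap_pf(ys, n_particles=500, proc_std=1.0, obs_std=1.0):
    """Bootstrap particle filter (Gordon et al. 1993) for a toy model:
    state    x_k = x_{k-1} + N(0, proc_std^2)
    observed y_k = x_k      + N(0, obs_std^2)
    Propagate particles through the transition prior, weight by the
    likelihood, then resample (multinomial) at every step."""
    particles = [random.gauss(0.0, 5.0) for _ in range(n_particles)]
    estimates = []
    for y in ys:
        # Predict: sample each particle from the transition prior.
        particles = [x + random.gauss(0.0, proc_std) for x in particles]
        # Weight: likelihood of the observation under each particle.
        weights = [math.exp(-0.5 * ((y - x) / obs_std) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Estimate: weighted posterior mean.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # Resample: draw particles in proportion to their weights.
        particles = random.choices(particles, weights=weights, k=n_particles)
    return estimates

# Simulate a trajectory and filter it.
random.seed(2)
truth, ys, x = [], [], 0.0
for _ in range(50):
    x += random.gauss(0.0, 1.0)
    truth.append(x)
    ys.append(x + random.gauss(0.0, 1.0))
est = bootstrap_pf(ys)
print(sum(abs(e - t) for e, t in zip(est, truth)) / len(truth))  # small tracking error
```

For this linear-Gaussian toy model a Kalman filter would be exact; the point of the sketch is only that nothing in the loop relies on linearity or Gaussianity.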
Resampling • In inference problems, most weights tend to zero, except a few (from particles that closely match the observations), which become large. • We resample to concentrate particles in regions where p(x|y) is larger.
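One common low-variance scheme, shown here as a sketch, is systematic resampling: a single uniform draw generates n evenly spaced pointers over the cumulative weights, so each particle is copied either ⌊n·w⌋ or ⌈n·w⌉ times.

```python
import random

def systematic_resample(particles, weights):
    """Systematic resampling: one uniform draw, then n evenly spaced pointers
    across the cumulative weight distribution. Lower variance than
    independent multinomial draws."""
    n = len(particles)
    positions = [(random.random() + i) / n for i in range(n)]
    cumulative, running = [], 0.0
    for w in weights:
        running += w
        cumulative.append(running)
    resampled, j = [], 0
    for pos in positions:
        while j < n - 1 and cumulative[j] < pos:
            j += 1
        resampled.append(particles[j])
    return resampled

random.seed(3)
# Particle 0 carries 70% of the weight, so it is copied 2 or 3 times (n*w = 2.8).
print(systematic_resample([0, 1, 2, 3], [0.7, 0.1, 0.1, 0.1]))
```

Multinomial resampling (as in `random.choices`) is simpler but adds more variance, which is why the Variations slide below recommends variance-minimising alternatives.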
Resampling • Drawbacks: it destroys the parallelism of the algorithm; it causes loss of particle diversity. Remedies: • increase the number of particles • add random noise after resampling • Markov chain Monte Carlo move steps • kernel smoothing: replace the Dirac delta with a kernel function
Variations • Use a different importance distribution • Use a different resampling technique: • Resampling adds variance to the estimate; several resampling techniques are available that minimize this added variance. • Our simple resampling leaves several particles with the same value; methods for spreading them are available.
Variations • Reduce the resampling frequency: • Our implementation resamples after every observation, which may add unneeded variance to the estimate. • Alternatively, one can resample only when the particle weights warrant it. This can be determined by the effective sample size.
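The effective sample size mentioned above is N_eff = 1 / Σᵢ (wᵢ)² for normalised weights; it ranges from 1 (total degeneracy) to n (uniform weights). A minimal sketch of the resample-only-when-needed rule (the n/2 threshold is a common illustrative choice):

```python
def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i^2) for normalised weights.
    Equals n when weights are uniform, approaches 1 under degeneracy."""
    return 1.0 / sum(w * w for w in weights)

def should_resample(weights, threshold_frac=0.5):
    """Trigger resampling only when N_eff drops below a fraction of n."""
    return effective_sample_size(weights) < threshold_frac * len(weights)

print(effective_sample_size([0.25, 0.25, 0.25, 0.25]))  # 4.0: uniform, healthy
print(effective_sample_size([0.97, 0.01, 0.01, 0.01]))  # close to 1: degenerate
print(should_resample([0.97, 0.01, 0.01, 0.01]))        # True
```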
Rao-Blackwellization • Some components of the model may have linear dynamics and can be well estimated using a conventional Kalman filter. • The Kalman filter / extended Kalman filter / unscented Kalman filter / Gauss-Hermite filter is combined with a particle filter to reduce the number of particles needed to obtain a given level of performance.
Advantages of Particle Filters • Under general conditions, the particle filter estimate becomes asymptotically optimal as the number of particles goes to infinity. • Non-linear, non-Gaussian state update and observation equations can be used. • Multi-modal distributions are not a problem.
Disadvantages of Particle Filters • Naïve formulations of problems usually result in significant computation times. • The number of particles required may be large. • The best importance distribution and/or resampling method may be very problem specific.
Conclusions • The particle filter makes tractable many estimation problems that were previously difficult or impossible.
Survey article: M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking," IEEE Transactions on Signal Processing, 2002, 50(2): 174-188
Related websites: search Google for "Sequential Monte Carlo"; http://www-sigproc.eng.cam.ac.uk/smc/papers.html