The Unscented Particle Filter 2000/09/29 이 시은
Introduction • Filtering • estimate the states (parameters or hidden variables) as a set of observations becomes available on-line • To solve it • model the evolution of the system and the noise • Resulting models • typically exhibit non-linearity and non-Gaussian distributions
Extended Kalman Filter • linearizes the measurement and evolution models using a Taylor series expansion • Unscented Kalman Filter • still does not apply to general non-Gaussian distributions • Sequential Monte Carlo methods: particle filters • represent the posterior distribution of the states by samples • any statistical estimate can be computed from them • handle non-linearities and non-Gaussian distributions
Particle Filter • relies on importance sampling • key issue: the design of the proposal distribution • Proposals for particle filters • EKF Gaussian approximation • UKF proposal • can control the rate at which the tails go to zero • yields heavier-tailed distributions
Dynamic State Space Model • a transition equation and a measurement equation • Goal • approximate the posterior • and one of its marginals, the filtering density, recursively
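A minimal sketch of the model in standard notation (the noise symbols v_t and n_t are the usual convention and assumed here, since the slide's own symbols were lost):

```latex
\begin{align*}
x_t &= f(x_{t-1}, v_{t-1}) && \text{(transition equation)} \\
y_t &= g(x_t, n_t)         && \text{(measurement equation)}
\end{align*}
```

The goal is then to approximate the posterior $p(x_{0:t} \mid y_{1:t})$ and, in particular, its marginal the filtering density $p(x_t \mid y_{1:t})$ recursively.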
Extended Kalman Filter • MMSE estimator based on a Taylor series expansion of the nonlinear f and g around the current estimate of the state
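The linearization can be written as a first-order Taylor expansion (a standard sketch; the Jacobian name $F_t$ is assumed here):

```latex
f(x_{t-1}) \approx f(\hat{x}_{t-1}) + F_t\,(x_{t-1} - \hat{x}_{t-1}),
\qquad
F_t = \left.\frac{\partial f}{\partial x}\right|_{x = \hat{x}_{t-1}}
```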
Unscented Kalman Filter • does not approximate the non-linear process and observation models • uses the true nonlinear models and approximates the distribution of the state random variable • via the unscented transformation
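A minimal sketch of the unscented transformation alone (not the full UKF); the symmetric sigma-point set and the κ-parameterized weights follow the common formulation, and the function name and defaults are illustrative:

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    """Propagate (mean, cov) through a nonlinearity f using 2n+1 sigma
    points, instead of linearizing f as the EKF does."""
    n = len(mean)
    S = np.linalg.cholesky((n + kappa) * cov)           # matrix square root
    sigma = np.vstack([mean, mean + S.T, mean - S.T])   # 2n+1 sigma points
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))     # sigma-point weights
    w[0] = kappa / (n + kappa)
    y = np.array([f(s) for s in sigma])                 # push points through f
    y_mean = w @ y                                      # transformed mean
    y_cov = (w * (y - y_mean).T) @ (y - y_mean)         # transformed covariance
    return y_mean, y_cov
```

For a linear f the transform is exact, which is a quick sanity check on the weights.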
Particle Filtering • does not require a Gaussian approximation • many variations, all based on sequential importance sampling • which degenerates with time • hence a resampling stage is included
Perfect Monte Carlo Simulation • a set of weighted particles (samples) drawn from the posterior • expectations computed as sample averages
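In symbols, the particle approximation and the resulting expectation estimate are (standard notation, reconstructed since the slide's formulas were lost):

```latex
\hat{p}(x_{0:t} \mid y_{1:t}) = \frac{1}{N}\sum_{i=1}^{N} \delta\!\left(x_{0:t} - x_{0:t}^{(i)}\right),
\qquad
\mathbb{E}[f_t] \approx \frac{1}{N}\sum_{i=1}^{N} f_t\!\left(x_{0:t}^{(i)}\right)
```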
Bayesian Importance Sampling • impossible to sample directly from the posterior • instead, sample from an easy-to-sample proposal distribution and reweight
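A small numerical illustration of the idea (the standard-normal target, the wide-Gaussian proposal, and all names here are chosen for the example, not taken from the slides): we estimate E_p[x²] = 1 without ever sampling from p, by sampling from q and weighting by p/q.

```python
import numpy as np

def importance_estimate(f, target_logpdf, n=100_000, seed=0):
    """Self-normalized importance sampling estimate of E_p[f(x)],
    drawing from an easy wide-Gaussian proposal q = N(0, 3^2)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 3.0, size=n)                    # samples from proposal q
    log_q = -0.5 * (x / 3.0) ** 2 - np.log(3.0 * np.sqrt(2 * np.pi))
    log_w = target_logpdf(x) - log_q                    # importance ratios p/q
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                                        # normalized weights
    return np.sum(w * f(x))

# Target p = N(0, 1), so E_p[x^2] should be close to 1.
target = lambda x: -0.5 * x**2 - 0.5 * np.log(2 * np.pi)
est = importance_estimate(lambda x: x**2, target)
```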
Asymptotic convergence and a central limit theorem hold under the following assumptions: • the samples are drawn i.i.d. from the proposal, the support of the proposal includes the support of the posterior, and the importance weights are finite • the relevant expectations (of the test function and of its weighted version) exist and are finite
Sequential Importance Sampling • the proposal distribution is factorized sequentially • assumptions • the states form a Markov process • the observations are independent given the states
Under these assumptions we can sample from the proposal and evaluate the likelihood and transition probabilities: generate a prior set of samples and iteratively compute the importance weights.
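The iterative weight computation takes the standard recursive form (reconstructed in the usual notation, since the slide's formula was lost):

```latex
w_t^{(i)} \propto w_{t-1}^{(i)}\,
\frac{p\!\left(y_t \mid x_t^{(i)}\right)\,p\!\left(x_t^{(i)} \mid x_{t-1}^{(i)}\right)}
     {q\!\left(x_t^{(i)} \mid x_{0:t-1}^{(i)},\, y_{1:t}\right)}
```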
Choice of proposal distribution • minimize the variance of the importance weights • a popular (simpler) choice is the transition prior • a good proposal moves particles towards the regions of high likelihood
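In the usual notation, the variance-minimizing proposal conditions on the new observation, while the popular transition-prior choice reduces the weight update to the likelihood alone:

```latex
q_{\text{opt}} = p\!\left(x_t \mid x_{t-1}^{(i)}, y_t\right),
\qquad
q = p\!\left(x_t \mid x_{t-1}^{(i)}\right)
\;\Rightarrow\;
w_t^{(i)} \propto w_{t-1}^{(i)}\, p\!\left(y_t \mid x_t^{(i)}\right)
```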
Degeneracy of the SIS algorithm • the variance of the importance ratios increases stochastically over time
Selection (Resampling) • eliminate samples with low importance ratios and multiply samples with high importance ratios • associate with each particle a number of children
SIR and Multinomial Sampling • maps the Dirac random measure onto an equally weighted random measure • the numbers of children follow a multinomial distribution
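As a sketch, SIR with multinomial selection takes a few lines (the function name and interface are illustrative):

```python
import numpy as np

def multinomial_resample(particles, weights, seed=None):
    """Map a weighted particle set onto an equally weighted one by
    drawing N child indices from a multinomial distribution."""
    rng = np.random.default_rng(seed)
    n = len(particles)
    idx = rng.choice(n, size=n, p=weights)   # multinomial draw of children
    return particles[idx], np.full(n, 1.0 / n)
```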
Residual Resampling • set the deterministic part of each particle's child count to the integer part of N times its weight • perform an SIR procedure to select the remaining samples using the new (residual) weights • add the results to the current counts
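A sketch of residual resampling returning child indices (the floor-based deterministic step plus an SIR step on the residual weights; the interface is illustrative):

```python
import numpy as np

def residual_resample(weights, seed=None):
    """Residual resampling: keep floor(N * w_i) copies of particle i
    deterministically, then draw the remaining children by SIR on the
    residual (leftover) weights."""
    rng = np.random.default_rng(seed)
    weights = np.asarray(weights, dtype=float)
    n = len(weights)
    counts = np.floor(n * weights).astype(int)      # deterministic part
    n_rest = n - counts.sum()
    if n_rest > 0:
        residual = n * weights - counts             # new (residual) weights
        residual /= residual.sum()
        extra = rng.choice(n, size=n_rest, p=residual)
        counts += np.bincount(extra, minlength=n)   # add to current counts
    return np.repeat(np.arange(n), counts)          # child index per particle
```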
Generic Particle Filter
1. Initialization, t = 0
2. For t = 1, 2, …
(a) Importance sampling step: for i = 1, …, N, sample from the proposal, evaluate the importance weights, and normalize them
(b) Selection (resampling) step
(c) Output
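The steps above can be sketched on a toy scalar model (the model, noise scales, and the transition-prior proposal are assumptions made for illustration; the unscented particle filter would instead generate the proposal with a UKF):

```python
import numpy as np

def particle_filter(ys, n_particles=1000, seed=0):
    """Generic particle filter on the toy model
    x_t = 0.9 x_{t-1} + v_t,  y_t = x_t + n_t  (v, n standard normal),
    using the transition prior as proposal and multinomial resampling."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, size=n_particles)      # 1. initialization
    means = []
    for y in ys:                                    # 2. for t = 1, 2, ...
        # (a) importance sampling: propose from the transition prior,
        #     so the weight is just the likelihood p(y_t | x_t)
        x = 0.9 * x + rng.normal(size=n_particles)
        logw = -0.5 * (y - x) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()                                # normalize the weights
        # (c) output: filtered mean estimate from the weighted particles
        means.append(np.sum(w * x))
        # (b) selection: multinomial resampling
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
    return np.array(means)
```

On a constant observation sequence the filtered mean should settle near the steady-state Kalman estimate for this linear-Gaussian model.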
Improving Particle Filters • Monte Carlo (MC) assumption • the Dirac point-mass approximation provides an adequate representation of the posterior • Importance sampling (IS) assumption • samples from the posterior can be obtained by sampling from a suitable proposal and applying importance sampling corrections
MCMC Move Step • introduce MCMC steps whose invariant distribution is the posterior • if the particles are already distributed according to the posterior, applying such a Markov chain transition kernel leaves that distribution unchanged
Designing Better Importance Proposals • move samples to regions of high likelihood • prior editing • an ad-hoc acceptance test for proposed particles • local linearization • Taylor series expansion of the likelihood and the transition prior • e.g. an improved simulated-annealed sampling algorithm
Rejection Methods • if the likelihood is bounded, one can sample from the optimal importance distribution by rejection sampling
Auxiliary Particle Filters • obtain approximate samples from the optimal importance distribution by introducing an auxiliary variable k • draw samples from the joint distribution of the particle index and the state
Unscented Particle Filter • use the UKF to generate the proposal distribution within a particle filter framework
Theoretical Convergence • Theorem 1: if the importance weights are upper bounded and one of the selection (resampling) schemes is used, then for all t ≥ 0 there exists a constant C_t, independent of N, such that the mean-squared error of the particle estimate of any bounded test function is bounded by C_t / N