Mobile Robot Localization (Ch. 7, 8) • We are now back to the topic of localization after reviewing some necessary background. • Mobile robot localization is the problem of determining the pose of a robot relative to a given map of the environment. • Remember: in the localization problem, the map is given (known and available). • Is it hard? Not really, because…
Mobile Robot Localization • Most localization algorithms are variants of the Bayes filter algorithm. • However, different representations of the map, sensor model, motion model, etc. lead to different variants.
Mobile Robot Localization • So is the problem already solved by the Bayes filter algorithm? How? • The straightforward application of Bayes filters to the localization problem is called Markov localization. • Here is the (still abstract) algorithm:
Mobile Robot Localization
• Algorithm Bayes_filter(bel(x_{t-1}), u_t, z_t):
•   for all x_t do
•     bel'(x_t) = ∫ p(x_t | u_t, x_{t-1}) bel(x_{t-1}) dx_{t-1}
•     bel(x_t) = η p(z_t | x_t) bel'(x_t)
•   endfor
•   return bel(x_t)
(bel' denotes the predicted belief; η is a normalizing constant.)
Mobile Robot Localization
• Algorithm Markov_localization(bel(x_{t-1}), u_t, z_t, m):
•   for all x_t do
•     bel'(x_t) = ∫ p(x_t | u_t, x_{t-1}, m) bel(x_{t-1}) dx_{t-1}
•     bel(x_t) = η p(z_t | x_t, m) bel'(x_t)
•   endfor
•   return bel(x_t)
The Markov Localization algorithm addresses the global localization problem, the position tracking problem, and the kidnapped robot problem in static environments.
Mobile Robot Localization • Revisit Figure 7.5 to see how the Markov localization algorithm works. • The Markov Localization algorithm is still very abstract. To put it to work (e.g., in your project), we need a lot more background knowledge to realize the motion model, sensor model, etc. • Refer to pages 197-200 for more details.
Mobile Robot Localization • We discuss three different implementations of the Markov Localization algorithm, based on: • Kalman filter • Discrete, grid representation • Particle filter
Bayes Filter Implementations (1) • (Extended) Kalman Filter (Gaussian filters) (Ch. 3 and 7) • Pages 201-220 in Ch. 7 and pages 40-64 in Ch. 3; read and compare them.
Bayes Filter Reminder
• Prediction: bel'(x_t) = ∫ p(x_t | u_t, x_{t-1}) bel(x_{t-1}) dx_{t-1}
• Correction: bel(x_t) = η p(z_t | x_t) bel'(x_t)
Gaussians
• Univariate: x ~ N(μ, σ²):  p(x) = 1 / (√(2π) σ) · exp(−(x − μ)² / (2σ²))
• Multivariate: x ~ N(μ, Σ):  p(x) = 1 / ((2π)^{d/2} |Σ|^{1/2}) · exp(−½ (x − μ)ᵀ Σ⁻¹ (x − μ))
Multivariate Gaussians • We stay in the “Gaussian world” as long as we start with Gaussians and perform only linear transformations. • Review your probability textbook http://en.wikipedia.org/wiki/Multivariate_normal_distribution
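To make the "Gaussian world" claim concrete, here is the standard closure property under affine maps, stated for reference (this worked statement is not on the original slide): if x ~ N(μ, Σ) and y = A x + b for a matrix A and vector b, then

  y ~ N(A μ + b, A Σ Aᵀ)

so a linear (affine) transformation of a Gaussian is again a Gaussian with the transformed mean and covariance. This is exactly what the Kalman filter prediction step exploits.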
Kalman Filter
• Estimates the state x of a discrete-time controlled process that is governed by the linear stochastic difference equation
  x_t = A_t x_{t-1} + B_t u_t + ε_t
with a measurement
  z_t = C_t x_t + δ_t
Components of a Kalman Filter
• A_t: matrix (n×n) that describes how the state evolves from t-1 to t without controls or noise.
• B_t: matrix (n×l) that describes how the control u_t changes the state from t-1 to t.
• C_t: matrix (k×n) that describes how to map the state x_t to an observation z_t.
• ε_t, δ_t: random variables representing the process and measurement noise, assumed to be independent and normally distributed with covariances R_t and Q_t respectively.
Kalman Filter Algorithm
• Algorithm Kalman_filter(μ_{t-1}, Σ_{t-1}, u_t, z_t):
• Prediction:
•   μ'_t = A_t μ_{t-1} + B_t u_t
•   Σ'_t = A_t Σ_{t-1} A_tᵀ + R_t
• Correction:
•   K_t = Σ'_t C_tᵀ (C_t Σ'_t C_tᵀ + Q_t)⁻¹
•   μ_t = μ'_t + K_t (z_t − C_t μ'_t)
•   Σ_t = (I − K_t C_t) Σ'_t
• Return μ_t, Σ_t
(μ'_t, Σ'_t are the predicted mean and covariance.)
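For concreteness, here is a minimal NumPy sketch of one Kalman filter step matching the algorithm above. The 1-D example values at the bottom (A, B, C, R, Q, the control, and the measurement) are illustrative assumptions, not from the slides or the book.

```python
import numpy as np

def kalman_filter(mu, Sigma, u, z, A, B, C, R, Q):
    """One prediction-correction step of the Kalman filter from the slide above."""
    # Prediction: push the mean and covariance through the linear dynamics
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + R
    # Correction: compute the Kalman gain, then update with the measurement residual
    K = Sigma_bar @ C.T @ np.linalg.inv(C @ Sigma_bar @ C.T + Q)
    mu_new = mu_bar + K @ (z - C @ mu_bar)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new

# Illustrative 1-D example: a robot on a line, commanded displacement as control.
A = B = C = np.array([[1.0]])
R = np.array([[0.1]])   # process noise covariance (assumed value)
Q = np.array([[0.5]])   # measurement noise covariance (assumed value)
mu, Sigma = np.array([0.0]), np.array([[1.0]])
mu, Sigma = kalman_filter(mu, Sigma, np.array([1.0]), np.array([1.2]), A, B, C, R, Q)
print(mu, Sigma)
```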
Linear Gaussian Systems: Initialization
• Initial belief is normally distributed:
  bel(x_0) = N(x_0; μ_0, Σ_0)
Linear Gaussian Systems: Dynamics
• Dynamics are a linear function of state and control plus additive noise:
  x_t = A_t x_{t-1} + B_t u_t + ε_t
  p(x_t | u_t, x_{t-1}) = N(x_t; A_t x_{t-1} + B_t u_t, R_t)
Linear Gaussian Systems: Observations
• Observations are a linear function of state plus additive noise:
  z_t = C_t x_t + δ_t
  p(z_t | x_t) = N(z_t; C_t x_t, Q_t)
Linear Gaussian Systems: Observations • See pages 45-54 for the mathematical derivation.
The Prediction-Correction Cycle • (Figures: the prediction step, the correction step, and the full prediction-correction loop.)
Kalman Filter Summary • Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: O(k^2.376 + n^2) • Optimal for linear Gaussian systems! • Unfortunately, however, most robotic systems are nonlinear.
Nonlinear Dynamic Systems
• Most realistic robotic problems involve nonlinear functions:
  x_t = g(u_t, x_{t-1})
  z_t = h(x_t)
EKF Linearization: First Order Taylor Series Expansion
• Prediction:
  g(u_t, x_{t-1}) ≈ g(u_t, μ_{t-1}) + G_t (x_{t-1} − μ_{t-1}),  where G_t = ∂g(u_t, μ_{t-1}) / ∂x_{t-1}
• Correction:
  h(x_t) ≈ h(μ'_t) + H_t (x_t − μ'_t),  where H_t = ∂h(μ'_t) / ∂x_t
EKF Algorithm
• Extended_Kalman_filter(μ_{t-1}, Σ_{t-1}, u_t, z_t):
• Prediction:
•   μ'_t = g(u_t, μ_{t-1})
•   Σ'_t = G_t Σ_{t-1} G_tᵀ + R_t
• Correction:
•   K_t = Σ'_t H_tᵀ (H_t Σ'_t H_tᵀ + Q_t)⁻¹
•   μ_t = μ'_t + K_t (z_t − h(μ'_t))
•   Σ_t = (I − K_t H_t) Σ'_t
• Return μ_t, Σ_t
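Correspondingly, a minimal NumPy sketch of one EKF step, assuming the caller supplies the nonlinear models g and h together with functions that return their Jacobians G_t and H_t. The function names and the tiny scalar example are illustrative, not from the slides.

```python
import numpy as np

def extended_kalman_filter(mu, Sigma, u, z, g, h, G_fn, H_fn, R, Q):
    """One EKF step: g/h are the nonlinear motion/measurement models,
    G_fn/H_fn return their Jacobians evaluated at the current estimate."""
    # Prediction: propagate the mean through g, linearize for the covariance
    mu_bar = g(u, mu)
    G = G_fn(u, mu)
    Sigma_bar = G @ Sigma @ G.T + R
    # Correction: linearize h around the predicted mean
    H = H_fn(mu_bar)
    K = Sigma_bar @ H.T @ np.linalg.inv(H @ Sigma_bar @ H.T + Q)
    mu_new = mu_bar + K @ (z - h(mu_bar))
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma_bar
    return mu_new, Sigma_new

# Tiny scalar example (illustrative): mildly nonlinear motion, identity measurement.
g = lambda u, x: x + u * np.cos(x)                       # nonlinear motion model
G_fn = lambda u, x: np.array([[1.0 - u[0] * np.sin(x[0])]])
h = lambda x: x                                          # identity measurement model
H_fn = lambda x: np.array([[1.0]])
mu, Sigma = np.array([0.0]), np.array([[1.0]])
mu, Sigma = extended_kalman_filter(mu, Sigma, np.array([0.5]), np.array([0.4]),
                                   g, h, G_fn, H_fn, np.array([[0.1]]), np.array([[0.5]]))
```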
Probabilistic Robotics • Bayes Filter Implementations (2): Grid Localization / Discrete Filters (Ch. 8.1-8.2 / Ch. 4.1)
Piecewise Constant • See Figure 8.11 on page 251 and Ch. 8.3.1.
Discrete Bayes Filter Algorithm
• Algorithm Discrete_Bayes_filter(Bel(x), d):
•   η = 0
•   If d is a perceptual data item z then
•     For all x do
•       Bel'(x) = P(z | x) Bel(x)
•       η = η + Bel'(x)
•     For all x do
•       Bel'(x) = η⁻¹ Bel'(x)
•   Else if d is an action data item u then
•     For all x do
•       Bel'(x) = Σ_{x'} P(x | u, x') Bel(x')
•   Return Bel'(x)
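As a concrete illustration, here is a small Python sketch of the discrete (histogram) Bayes filter over a finite grid. The 5-cell corridor, door map, and the sensor/motion models in the demo are assumptions made up for the example.

```python
import numpy as np

def discrete_bayes_filter(bel, d, kind, p_z_given_x=None, p_x_given_u_xprime=None):
    """One update of the discrete Bayes filter over a finite grid of states.
    bel: 1-D array of probabilities, one entry per grid cell.
    kind: 'measurement' or 'action' (which model to apply to data item d)."""
    if kind == 'measurement':
        # Correction: weight each cell by the measurement likelihood, then normalize
        bel_new = p_z_given_x(d) * bel            # elementwise P(z | x) * Bel(x)
        return bel_new / bel_new.sum()
    else:
        # Prediction: apply the motion model, T[x, x'] = P(x | u, x')
        T = p_x_given_u_xprime(d)
        return T @ bel

# Illustrative 1-D corridor with 5 cells (all models here are assumptions for the demo).
bel = np.full(5, 0.2)
doors = np.array([1, 0, 0, 1, 0])                        # cells that contain a door
sense = lambda z: np.where(doors == z, 0.8, 0.2)         # P(z | x): 80% correct detection
move = lambda u: np.roll(np.eye(5), u, axis=0)           # shift-by-u motion, noise-free
bel = discrete_bayes_filter(bel, 1, 'measurement', p_z_given_x=sense)
bel = discrete_bayes_filter(bel, 1, 'action', p_x_given_u_xprime=move)
print(bel)
```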
Piecewise Constant Representation • See Figure 8.2 on page 241.
Tree-based Representation • Idea: represent the density using a variant of octrees.
Tree-based Representations • Efficient in space and time • Multi-resolution
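A very rough sketch of the multi-resolution idea, assuming a quadtree in 2-D (the slides only name the octree idea; the class below and its refinement rule are illustrative, not the book's data structure):

```python
# Illustrative quadtree for a piecewise-constant belief: cells that carry a lot of
# probability mass are subdivided, giving higher resolution where it matters.
class QuadNode:
    def __init__(self, prob, depth=0, max_depth=4):
        self.prob = prob            # probability mass in this cell
        self.children = None        # None = leaf (constant density over the cell)
        self.depth = depth
        self.max_depth = max_depth

    def refine(self, threshold=0.1):
        """Subdivide cells whose mass exceeds the threshold -> multi-resolution grid."""
        if self.prob > threshold and self.depth < self.max_depth:
            self.children = [QuadNode(self.prob / 4, self.depth + 1, self.max_depth)
                             for _ in range(4)]
            for child in self.children:
                child.refine(threshold)

root = QuadNode(prob=1.0)
root.refine(threshold=0.3)
```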
Xavier: Localization in a Topological Map [Courtesy of Reid Simmons]
Bayes Filter Implementations (3) • Monte Carlo Localization / Particle Filters (Ch. 8.3 / Ch. 4.3)
Particle Filters • Represent belief by random samples • Estimation of non-Gaussian, nonlinear processes • Monte Carlo filter, Survival of the fittest, Condensation, Bootstrap filter, Particle filter • Filtering: [Rubin, 88], [Gordon et al., 93], [Kitagawa 96] • Computer vision: [Isard and Blake 96, 98] • Dynamic Bayesian Networks: [Kanazawa et al., 95]
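A minimal sketch of one particle filter (sampling-importance-resampling) step, in the spirit of Monte Carlo Localization. The 1-D motion and measurement models and all parameter values at the bottom are illustrative assumptions, not from the slides.

```python
import numpy as np

def particle_filter(particles, u, z, motion_model, measurement_likelihood):
    """One sampling-importance-resampling step; the belief is a set of samples."""
    N = len(particles)
    # Prediction: propagate every particle through the stochastic motion model
    predicted = np.array([motion_model(x, u) for x in particles])
    # Weighting: importance weights from the measurement likelihood p(z | x)
    w = np.array([measurement_likelihood(z, x) for x in predicted])
    w = w / w.sum()
    # Resampling: draw N particles with probability proportional to their weights
    idx = np.random.choice(N, size=N, p=w)
    return predicted[idx]

# Illustrative 1-D example (models and numbers are assumptions):
motion = lambda x, u: x + u + np.random.normal(0.0, 0.1)        # noisy forward motion
likelihood = lambda z, x: np.exp(-0.5 * ((z - x) / 0.5) ** 2)   # Gaussian range sensor
particles = np.random.uniform(0.0, 10.0, size=1000)             # global localization init
particles = particle_filter(particles, u=1.0, z=4.0,
                            motion_model=motion, measurement_likelihood=likelihood)
```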