ROBOTICS 01PEEQW – Basilio Bona, DAUIN – Politecnico di Torino
Probabilistic Fundamentals in Robotics: Gaussian Filters
Course Outline
- Basic mathematical framework
- Probabilistic models of mobile robots
- Mobile robot localization problem
- Robotic mapping
- Probabilistic planning and control

Reference textbook: Thrun, Burgard, Fox, "Probabilistic Robotics", MIT Press, 2006 (http://www.probabilistic-robotics.org/)
Basic mathematical framework
- Recursive state estimation
  - Basic concepts in probability
  - Robot environment
  - Bayes filters
- Gaussian filters (parametric filters)
  - Kalman filter
  - Extended Kalman filter
  - Unscented Kalman filter
  - Information filter
- Nonparametric filters
  - Histogram filter
  - Particle filter
Introduction
- Gaussian filters are different implementations of the Bayes filter for continuous state spaces, under specific assumptions on the probability distributions
- Beliefs are represented by multivariate normal distributions
Multivariate Gaussian distribution

The belief is a normal density with mean vector $\mu$ and covariance matrix $\Sigma$:

$p(x) = \det(2\pi\Sigma)^{-1/2} \exp\!\left(-\tfrac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)$
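The density above can be evaluated numerically. A minimal NumPy sketch (the function name `mvn_pdf` and the example values are mine, not from the slides):

```python
import numpy as np

def mvn_pdf(x, mu, Sigma):
    """Evaluate the multivariate normal density N(x; mu, Sigma)."""
    d = mu.shape[0]
    diff = x - mu
    # Quadratic form (x - mu)^T Sigma^{-1} (x - mu), via a linear solve
    quad = diff @ np.linalg.solve(Sigma, diff)
    norm_const = np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))
    return np.exp(-0.5 * quad) / norm_const

mu = np.array([0.0, 0.0])
Sigma = np.array([[1.0, 0.5],
                  [0.5, 2.0]])
print(mvn_pdf(mu, mu, Sigma))  # density at the mean: 1 / (2*pi*sqrt(det(Sigma)))
```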
Examples (figures): a bi-dimensional Gaussian with its conditional probabilities; a mixture of Gaussians.
Covariance matrix

$\Sigma = E\!\left[(x-\mu)(x-\mu)^T\right]$
Kalman filter (1)
- The Kalman filter (KF) [Swerling 1958, Kalman 1960] applies to linear Gaussian systems
- The KF computes the belief for continuous states governed by linear dynamic state equations
- Beliefs are expressed by normal distributions
- The KF is not applicable to discrete or hybrid state space systems
Kalman filter (2): the linear Gaussian model

State and measurement equations are linear with additive Gaussian noise:

$x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t, \qquad \varepsilon_t \sim N(0, R_t)$
$z_t = C_t x_t + \delta_t, \qquad \delta_t \sim N(0, Q_t)$

Under these assumptions, if the initial belief is Gaussian, all subsequent beliefs remain Gaussian.
Kalman filter algorithm (1)

Prediction:
$\bar\mu_t = A_t \mu_{t-1} + B_t u_t$
$\bar\Sigma_t = A_t \Sigma_{t-1} A_t^T + R_t$

Innovation (residual) and its covariance:
$r_t = z_t - C_t \bar\mu_t, \qquad S_t = C_t \bar\Sigma_t C_t^T + Q_t$

Kalman gain:
$K_t = \bar\Sigma_t C_t^T S_t^{-1}$

Update:
$\mu_t = \bar\mu_t + K_t r_t$
$\Sigma_t = (I - K_t C_t) \bar\Sigma_t$
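One prediction-update cycle of the standard KF equations can be sketched in NumPy as follows (the function and variable names are mine; R and Q denote the process and measurement noise covariances, as in the textbook notation):

```python
import numpy as np

def kf_step(mu, Sigma, u, z, A, B, C, R, Q):
    """One Kalman filter iteration: prediction followed by measurement update."""
    # Prediction
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + R
    # Innovation (residual) and its covariance
    r = z - C @ mu_bar
    S = C @ Sigma_bar @ C.T + Q
    # Kalman gain
    K = Sigma_bar @ C.T @ np.linalg.inv(S)
    # Update
    mu_new = mu_bar + K @ r
    Sigma_new = (np.eye(mu.shape[0]) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new
```

For a scalar system with A = B = C = 1 and R = Q = 0.1, starting from N(0, 1) and measuring z = 1, one step moves the mean to 11/12 and shrinks the variance to 1.1/12.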
Block diagram (figure): the control $u_t$ and the previous mean $\mu_{t-1}$ feed the prediction through the A and B blocks; the predicted output is subtracted from the measurement $z_t$ to form the residual, which is weighted by the gain $K_t$ and added to the predicted mean to give $\mu_t$.
Kalman filter algorithm (2) (figure): the dynamic Bayes network underlying the filter, with visible variables (controls and measurements) and hidden variables (states).
Kalman filter example (figure): a one-dimensional belief evolves through the initial state, a measurement, the measurement update, the prediction, a new measurement, and the corresponding update.
From Kalman filter to extended Kalman filter
- The Kalman filter is based on linearity assumptions
- Gaussian random variables are described by the means and covariance matrices of normal distributions
- Linear transformations map Gaussian distributions into Gaussian distributions
- The Kalman filter is optimal
- The Kalman filter is efficient
Linear transformation of Gaussians

If $x \sim N(\mu, \Sigma)$ and $y = Ax + b$, then $y \sim N(A\mu + b,\; A\Sigma A^T)$: linear maps preserve Gaussianity.
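The closure of Gaussians under linear maps can be checked empirically by sampling; a small NumPy sketch with arbitrary example values of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)

# x ~ N(mu, Sigma); the linear map y = A x + b is again Gaussian,
# with mean A mu + b and covariance A Sigma A^T.
mu = np.array([1.0, -1.0])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])
A = np.array([[2.0, 0.0],
              [1.0, 1.0]])
b = np.array([0.5, 0.0])

samples = rng.multivariate_normal(mu, Sigma, size=200_000)
y = samples @ A.T + b

print(y.mean(axis=0))           # close to A @ mu + b
print(np.cov(y, rowvar=False))  # close to A @ Sigma @ A.T
```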
Extended Kalman Filter (EKF)
- When the linearity assumptions do not hold (as in robot motion or observation models), no closed form solution for the predicted belief exists: the state and measurement equations are nonlinear
- The Extended Kalman Filter (EKF) approximates the nonlinear transformations with linear ones
- The linearization is performed around the most likely value, i.e. the mean
EKF Example (figure): a Monte Carlo generated distribution is pushed through the nonlinearity; the transformed mean and the approximating Gaussian are compared. The approximating Gaussian uses the mean and covariance of the Monte Carlo generated distribution.
EKF Example (figure): the EKF Gaussian is compared with the approximating Gaussian, the normal distribution built using the mean and covariance of the true nonlinear distribution.
EKF linearization: Taylor expansion

The nonlinear function $g$ is expanded to first order around the most likely state, i.e. the mean:

$g(u_t, x_{t-1}) \approx g(u_t, \mu_{t-1}) + G_t\,(x_{t-1} - \mu_{t-1}), \qquad G_t = \left.\dfrac{\partial g(u_t, x)}{\partial x}\right|_{x=\mu_{t-1}}$

The linearization depends only on the mean of the prior belief.
EKF algorithm

Prediction:
$\bar\mu_t = g(u_t, \mu_{t-1})$
$\bar\Sigma_t = G_t \Sigma_{t-1} G_t^T + R_t$

Update:
$K_t = \bar\Sigma_t H_t^T (H_t \bar\Sigma_t H_t^T + Q_t)^{-1}$
$\mu_t = \bar\mu_t + K_t (z_t - h(\bar\mu_t))$
$\Sigma_t = (I - K_t H_t) \bar\Sigma_t$

where $G_t$ and $H_t$ are the Jacobians of $g$ and $h$ evaluated at the mean.
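A minimal NumPy sketch of one EKF cycle; the caller supplies the nonlinear functions g, h and routines G, H returning their Jacobians (all names are mine, mirroring the textbook notation):

```python
import numpy as np

def ekf_step(mu, Sigma, u, z, g, h, G, H, R, Q):
    """One EKF iteration. g/h are the nonlinear motion and measurement
    functions; G/H return their Jacobians evaluated at the given mean."""
    # Prediction: propagate the mean through g, the covariance through G
    mu_bar = g(mu, u)
    G_t = G(mu, u)
    Sigma_bar = G_t @ Sigma @ G_t.T + R
    # Update: linearize h around the predicted mean
    H_t = H(mu_bar)
    S = H_t @ Sigma_bar @ H_t.T + Q
    K = Sigma_bar @ H_t.T @ np.linalg.inv(S)
    mu_new = mu_bar + K @ (z - h(mu_bar))
    Sigma_new = (np.eye(mu.shape[0]) - K @ H_t) @ Sigma_bar
    return mu_new, Sigma_new
```

When g and h are linear, the Jacobians are constant and the routine reduces exactly to the Kalman filter.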
KF vs EKF: the structure of the two algorithms is identical; the EKF replaces the linear predictions $A_t \mu_{t-1} + B_t u_t$ and $C_t \bar\mu_t$ with the nonlinear functions $g(u_t, \mu_{t-1})$ and $h(\bar\mu_t)$, and the matrices $A_t$, $C_t$ with the Jacobians $G_t$, $H_t$.
Features
- EKF is a very popular tool for state estimation in robotics
- It has the same time complexity as the KF
- It is robust and simple
- Limitation: state and measurement functions are rarely linear
- The goodness of the linear approximation depends on:
  - the degree of uncertainty
  - the degree of nonlinearity
- When using the EKF, the uncertainty must be kept as small as possible
Uncertainty (figures): the more uncertain the prior, the worse the Gaussian approximation of the transformed distribution; with a less uncertain prior the approximation improves.
Nonlinearity (figures): the more nonlinear the function is in the region covered by the prior, the worse the Gaussian approximation; in a more linear region the approximation improves.
Example: EKF localization within a sensor infrastructure

A mobile robot acquires odometric measurements and distance information from fixed sensors deployed at known positions inside the environment. The figure shows the true position of the mobile robot and the KF estimate at time t = 0.
Each filter step repeats the same cycle (figures show the evolution of the estimate):

STEP 1: acquire odometry; filter prediction; acquire measurements; filter update.
STEP 2: acquire odometry; filter prediction; acquire measurements; filter update.
...

(Slides by Luca Carlone – Politecnico di Torino)
Unscented Kalman Filter (UKF)
- The UKF performs a stochastic linearization based on a weighted statistical linear regression
- A deterministic sampling technique (the unscented transform) picks a minimal set of sample points (sigma points) around the mean of the normal pdf
- The sigma points are propagated through the nonlinear functions and then used to compute the mean and covariance of the transformed distribution
- This approach:
  - removes the need to explicitly compute Jacobians, which can be difficult for complex functions
  - produces a more accurate estimate of the posterior distribution
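A sketch of the unscented transform in NumPy, using the common scaled sigma-point parameters alpha, beta, kappa (the function name and default values are mine; for a linear map the transform recovers the exact mean and covariance):

```python
import numpy as np

def unscented_transform(mu, Sigma, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate N(mu, Sigma) through a nonlinear function f via sigma points."""
    n = mu.shape[0]
    lam = alpha**2 * (n + kappa) - n
    # 2n+1 sigma points: the mean plus symmetric offsets along the
    # columns of the matrix square root of (n + lam) * Sigma
    L = np.linalg.cholesky((n + lam) * Sigma)
    X = np.vstack([mu, mu + L.T, mu - L.T])  # rows of L.T are columns of L
    # Weights for mean and covariance recombination
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha**2 + beta)
    # Propagate each sigma point and recombine
    Y = np.array([f(x) for x in X])
    mu_y = wm @ Y
    diff = Y - mu_y
    Sigma_y = (wc[:, None] * diff).T @ diff
    return mu_y, Sigma_y
```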
UKF (figures: sigma points propagated through the nonlinear function)
UKF Algorithm – part a)

Sigma points and weights ($\gamma = \sqrt{n+\lambda}$):
$\chi_{t-1} = \left(\mu_{t-1}, \;\; \mu_{t-1} + \gamma\sqrt{\Sigma_{t-1}}, \;\; \mu_{t-1} - \gamma\sqrt{\Sigma_{t-1}}\right)$
$w_m^{(0)} = \lambda/(n+\lambda), \quad w_c^{(0)} = w_m^{(0)} + (1 - \alpha^2 + \beta), \quad w_m^{(i)} = w_c^{(i)} = 1/(2(n+\lambda))$

Prediction:
$\bar\chi_t^* = g(u_t, \chi_{t-1}), \qquad \bar\mu_t = \sum_i w_m^{(i)} \bar\chi_t^{*(i)}$
$\bar\Sigma_t = \sum_i w_c^{(i)} (\bar\chi_t^{*(i)} - \bar\mu_t)(\bar\chi_t^{*(i)} - \bar\mu_t)^T + R_t$
UKF Algorithm – part b)

Measurement prediction from the sigma points $\bar\chi_t$:
$\bar Z_t = h(\bar\chi_t), \qquad \hat z_t = \sum_i w_m^{(i)} \bar Z_t^{(i)}$
$S_t = \sum_i w_c^{(i)} (\bar Z_t^{(i)} - \hat z_t)(\bar Z_t^{(i)} - \hat z_t)^T + Q_t$

Cross covariance and update:
$\bar\Sigma_t^{x,z} = \sum_i w_c^{(i)} (\bar\chi_t^{(i)} - \bar\mu_t)(\bar Z_t^{(i)} - \hat z_t)^T$
$K_t = \bar\Sigma_t^{x,z} S_t^{-1}, \qquad \mu_t = \bar\mu_t + K_t(z_t - \hat z_t), \qquad \Sigma_t = \bar\Sigma_t - K_t S_t K_t^T$
EKF vs UKF (figures comparing the two approximations of the transformed distribution)
KF – EKF – UKF (figure comparing the three filters)
Information filters

The belief is represented by Gaussians, in two dual parameterizations:

- Moments parameterization (KF – EKF – UKF): mean $\mu$, covariance $\Sigma$
- Canonical parameterization (IF – EIF): information vector $\xi = \Sigma^{-1}\mu$, information matrix $\Omega = \Sigma^{-1}$
Multivariate normal distribution (canonical form)

$p(x) \propto \exp\!\left(-\tfrac{1}{2} x^T \Omega x + x^T \xi\right)$

with information matrix $\Omega = \Sigma^{-1}$ and information vector $\xi = \Sigma^{-1}\mu$.
Mahalanobis distance

$D_M(x) = \sqrt{(x-\mu)^T \Sigma^{-1} (x-\mu)}$

Points at the same Euclidean distance from the mean can have different Mahalanobis distances, and vice versa (figures: points at the same Euclidean distance vs points at the same Mahalanobis distance).
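A small NumPy sketch (function name and example values are mine) showing how two points at the same Euclidean distance from the mean can have different Mahalanobis distances:

```python
import numpy as np

def mahalanobis(x, mu, Sigma):
    """Mahalanobis distance of x from the distribution N(mu, Sigma)."""
    diff = x - mu
    return np.sqrt(diff @ np.linalg.solve(Sigma, diff))

mu = np.array([0.0, 0.0])
Sigma = np.array([[4.0, 0.0],
                  [0.0, 1.0]])
# Both points lie at Euclidean distance 2 from the mean:
print(mahalanobis(np.array([2.0, 0.0]), mu, Sigma))  # 1.0 (along the long axis)
print(mahalanobis(np.array([0.0, 2.0]), mu, Sigma))  # 2.0 (along the short axis)
```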
IF algorithm

Prediction (requires recovering the moments):
$\bar\Omega_t = \left(A_t \Omega_{t-1}^{-1} A_t^T + R_t\right)^{-1}$
$\bar\xi_t = \bar\Omega_t \left(A_t \Omega_{t-1}^{-1} \xi_{t-1} + B_t u_t\right)$

Measurement update (additive):
$\Omega_t = \bar\Omega_t + C_t^T Q_t^{-1} C_t$
$\xi_t = \bar\xi_t + C_t^T Q_t^{-1} z_t$
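A minimal NumPy sketch of one IF cycle in canonical form (names are mine); on linear Gaussian problems it produces the same posterior as the KF:

```python
import numpy as np

def if_step(xi, Omega, u, z, A, B, C, R, Q):
    """One information filter iteration in canonical form
    (Omega = Sigma^{-1}, xi = Sigma^{-1} mu)."""
    # Prediction: requires recovering the moments (matrix inversions)
    Sigma = np.linalg.inv(Omega)
    mu = Sigma @ xi
    Omega_bar = np.linalg.inv(A @ Sigma @ A.T + R)
    xi_bar = Omega_bar @ (A @ mu + B @ u)
    # Measurement update: purely additive in information form
    Omega_new = Omega_bar + C.T @ np.linalg.inv(Q) @ C
    xi_new = xi_bar + C.T @ np.linalg.inv(Q) @ z
    return xi_new, Omega_new
```

The additive update is what makes the IF attractive for decentralized multi-sensor integration: contributions from different measurements can simply be summed.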
IF vs KF (duality)

- IF: the prediction step requires two matrix inversions; the measurement update is additive
- KF: the prediction step is additive; the measurement update requires a matrix inversion
Extended information filter – EIF
- Similar to the EKF: it applies when the state and measurement equations are nonlinear
- The state estimate $\mu_t = \Omega_t^{-1}\xi_t$ must be recovered, since the linearization is performed around the mean
- The Jacobians G and H replace the A, B and C matrices
Practical considerations
- IF advantages over KF:
  - Simpler representation of global uncertainty: set Ω = 0
  - Numerically more stable in many (but not all) robotics applications
  - Integrates information in a simpler, additive way
  - Naturally suited to multi-robot problems: decentralized data integration via Bayes rule becomes, in logarithmic form, an addition of terms that can be performed in arbitrary order
- IF limitations:
  - A state estimate is required for linearization (a matrix inversion)
  - Further matrix inversions are necessary that the EKF avoids
  - Computationally inferior to the EKF for high-dimensional state spaces