ROBOTICS 01PEEQW Basilio Bona DAUIN – Politecnico di Torino
Probabilistic Fundamentals in Robotics: Non-parametric Filters
Course Outline
• Basic mathematical framework
• Probabilistic models of mobile robots
• Mobile robot localization problem
• Robotic mapping
• Probabilistic planning and control
• Reference textbook
  – Thrun, Burgard, Fox, “Probabilistic Robotics”, MIT Press, 2006
  – http://www.probabilistic-robotics.org/
Probabilistic models of mobile robots
• Recursive state estimation
  – Basic concepts in probability
  – Robot environment
  – Bayes filters
• Gaussian filters
  – Kalman filter
  – Extended Kalman filter
  – Unscented Kalman filter
  – Information filter
• Nonparametric filters
  – Histogram filter
  – Particle filter
Introduction
• Nonparametric filters do not rely on a fixed functional form of the posterior probability
• They approximate posteriors over continuous state spaces (CSS) with finitely many values
• Decomposition of the CSS into finitely many regions, with the posterior represented by a histogram: histogram filters
• Representation of the CSS by finitely many samples: particle filters
• Nonparametric filters represent multimodal distributions well, i.e., distinct hypotheses (as in mobile robotics)
• Computational complexity
Histogram filter
• The state space can be discrete or continuous
• The random variable Xt can take finitely many values
• Examples of discrete spaces are:
  – Grid element in a grid map: occupied/free
  – Door: open/closed
  – Terrain slope: none/mid/high
  – Terrain characteristics: sand/rock/grass/…
• Discrete Bayes filters can be used for this type of problem
Discrete Bayes filter
[Slide shows the discrete Bayes filter algorithm over a discrete probability distribution; the equations are not reproduced here.]
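Since the slide's equations are not reproduced, here is a minimal Python sketch of the standard discrete Bayes filter from the reference textbook. The dictionary-based belief representation and the callables `transition_model` and `measurement_model` are illustrative assumptions, not the slide's own notation.

```python
def discrete_bayes_filter(prior, u, z, transition_model, measurement_model):
    """One prediction/correction step of the discrete Bayes filter.

    prior:             dict mapping each discrete state x_k to its belief p_{k,t-1}
    transition_model:  callable returning p(x_k | u, x_i)   (hypothetical signature)
    measurement_model: callable returning p(z | x_k)        (hypothetical signature)
    """
    # Prediction: propagate the prior through the motion model
    predicted = {
        xk: sum(transition_model(xk, u, xi) * prior[xi] for xi in prior)
        for xk in prior
    }

    # Correction: weight each predicted value by the measurement likelihood
    posterior = {xk: measurement_model(z, xk) * predicted[xk] for xk in prior}

    # Normalize so the belief sums to one
    eta = sum(posterior.values())
    return {xk: p / eta for xk, p in posterior.items()}
```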
Continuous state
• Discrete Bayes filters can also be used to approximate continuous state spaces
• They are then called histogram filters
• The space is divided into mutually non-overlapping intervals (bins or grid elements)
Approximation
• If the state is discrete, the probabilities used by the filter are well defined
• If the state is continuous, an approximation is necessary, e.g., representing each bin by its (normalized) mean state value (see the formulas below)
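A hedged reconstruction of the approximation formulas that appeared on this slide, following the notation of the reference textbook (regions x_{k,t} with probability mass p_{k,t}):

```latex
% Within each region x_{k,t} the density is taken as constant:
\[
p\bigl(x_t \mid x_t \in \mathbf{x}_{k,t}\bigr) = \frac{p_{k,t}}{|\mathbf{x}_{k,t}|}
\]
% The (normalized) mean state value represents region k:
\[
\hat{x}_{k,t} = \frac{1}{|\mathbf{x}_{k,t}|} \int_{\mathbf{x}_{k,t}} x_t \, dx_t
\]
% Measurement and motion models are evaluated at these representative states:
\[
p\bigl(z_t \mid \mathbf{x}_{k,t}\bigr) \approx p\bigl(z_t \mid \hat{x}_{k,t}\bigr),
\qquad
p\bigl(\mathbf{x}_{k,t} \mid u_t, \mathbf{x}_{i,t-1}\bigr) \approx
\eta\, |\mathbf{x}_{k,t}|\; p\bigl(\hat{x}_{k,t} \mid u_t, \hat{x}_{i,t-1}\bigr)
\]
```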
Histogram transformation
• The histogram of a transformed random variable is computed by passing multiple points from each histogram bin through the nonlinear function (a sketch follows below)
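A minimal Python sketch of this idea for a 1-D state, assuming NumPy and a vectorized nonlinear function `g`; the bin grids and the number of points per bin are illustrative choices, not values from the slide.

```python
import numpy as np

def transform_histogram(bin_edges, bin_probs, g, new_edges, points_per_bin=10):
    """Approximate the histogram of y = g(x) from the histogram of x.

    Several representative points are taken from each input bin, passed
    through the nonlinear function g, and re-binned on the output grid.
    """
    new_probs = np.zeros(len(new_edges) - 1)
    for k, p in enumerate(bin_probs):
        # Evenly spaced points inside bin k (random points would also work)
        xs = np.linspace(bin_edges[k], bin_edges[k + 1], points_per_bin)
        ys = g(xs)
        # Each point carries an equal share of the bin's probability mass
        counts, _ = np.histogram(
            ys, bins=new_edges,
            weights=np.full(points_per_bin, p / points_per_bin),
        )
        new_probs += counts
    return new_probs
```

For example, pushing the bins of a Gaussian-shaped histogram through `g = np.sin` produces an approximation of the histogram of the transformed variable.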
Decomposition of the state space
• Bin definition: the histogram approach requires a decomposition of the state space
• Static decomposition partitions the state space into a fixed, predefined number of mutually non-overlapping subsets: easier to implement, but requires high computational resources
• Dynamic decomposition adapts the decomposition to the shape of the posterior distribution: reduced computational resources, but added algorithmic complexity
• Density trees are an example of dynamic decomposition
Density trees
• The space decomposition is recursive
• The resolution adapts to the posterior probability: the less likely a region is, the coarser its decomposition
• Compact representation: higher approximation quality with the same number of bins
Topological and grid maps
[Figures: examples of topological and grid map decompositions.]
Static state and binary Bayes filter
• Binary Bayes filters are used when the state is both static and binary
Binary Bayes filter
[Slide shows the binary Bayes filter algorithm; the equations are not reproduced here.]
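The slide's equations are not reproduced above. As an assumption, the algorithm is presumably the standard log-odds formulation of the binary Bayes filter with static state from the reference textbook; the sketch below illustrates it, with `p_x_given_z` standing for a hypothetical inverse measurement model p(x | z).

```python
import math

def log_odds(p):
    """Log-odds representation of a probability p."""
    return math.log(p / (1.0 - p))

def binary_bayes_update(l_prev, p_x_given_z, prior=0.5):
    """Incorporate one measurement into the log-odds belief of a static binary state.

    l_prev:      log odds of the belief after the previous measurement
    p_x_given_z: inverse measurement model p(x | z_t) for the new measurement
    prior:       prior probability p(x), subtracted so it is not counted twice
    """
    return l_prev + log_odds(p_x_given_z) - log_odds(prior)

def belief_from_log_odds(l):
    """Recover the probability of the state from its log-odds value."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))
```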
Particle filters: key concepts
• Another nonparametric implementation of the Bayes filter
• The posterior is approximated by a finite number of parameters (as in histogram filters)
• Key idea: the posterior belief is represented by a set of state samples drawn from the distribution
• The state samples are called particles
• A particle is a hypothesis as to what the true world state may be at time t
• The likelihood for a state hypothesis xt to be included in the particle set shall be proportional to its Bayes filter posterior bel(xt)
Particle filters: examples
• The more densely a region is populated by samples, the more likely it is that the true state belongs to that region
[Figures: particle sets drawn from a normal distribution and from a multimodal distribution.]
Particle filters: mathematical description
[Slide shows the definition of the particle set and its relation to the posterior belief; the equations are not reproduced here.]
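A hedged reconstruction of the notation that presumably appeared on this slide, following the reference textbook:

```latex
% Particle set at time t: M state hypotheses (particles)
\[
\mathcal{X}_t := \bigl\{ x_t^{[1]},\, x_t^{[2]},\, \dots,\, x_t^{[M]} \bigr\}
\]
% Ideally each particle is drawn from the Bayes filter posterior, so that the
% local density of particles approximates the belief
\[
x_t^{[m]} \sim p\bigl(x_t \mid z_{1:t},\, u_{1:t}\bigr) = bel(x_t)
\]
```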
Particle filter algorithm
• For each particle, a hypothetical state is generated
• The importance factor is used to incorporate the measurement into the particle set
• Resampling (also known as importance sampling): M particles are drawn with replacement from the temporary set; the probability of drawing each particle is given by its importance weight
• At the end of the process the resampled set represents the posterior belief (a sketch of these steps follows below)
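A minimal Python sketch of the three steps listed above. The callables `sample_motion_model` (draws xt from p(xt | ut, xt-1)) and `measurement_likelihood` (evaluates p(zt | xt)) are hypothetical names introduced for illustration.

```python
import random

def particle_filter(particles, u, z, sample_motion_model, measurement_likelihood):
    """One step of the basic particle filter (sampling importance resampling).

    particles:              list of state hypotheses x_{t-1}^[m]
    sample_motion_model:    draws x_t ~ p(x_t | u, x_{t-1})   (hypothetical callable)
    measurement_likelihood: returns p(z | x_t)                (hypothetical callable)
    """
    M = len(particles)

    # 1. Prediction: generate a hypothetical state for every particle
    proposed = [sample_motion_model(u, x) for x in particles]

    # 2. Importance factors: incorporate the measurement
    weights = [measurement_likelihood(z, x) for x in proposed]

    # 3. Resampling: draw M particles with replacement, with probability
    #    proportional to the importance weights
    return random.choices(proposed, weights=weights, k=M)
```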
Resampling
• Resampling is an important step for correctly approximating the posterior belief
• It can be seen as a probabilistic implementation of the “survival of the fittest” principle
• It focuses the particles on regions of the state space with high posterior probability
• We will discuss resampling in more detail
Importance sampling
• Starting from samples drawn from a distribution g, we want to compute an expectation with respect to a probability function f
Importance sampling
[Figures across three slides: the target distribution f (“this is what we want”), the proposal distribution g (“this is what we have”), and the result of importance sampling (“this is what we obtain”).]
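The derivation that these figures illustrate can be summarized by the standard importance-sampling identity, written here in LaTeX (A denotes an arbitrary region of the state space):

```latex
% Expectation under the target f computed from samples drawn from the proposal g:
% each sample is reweighted by the importance weight w(x) = f(x)/g(x).
\[
\mathbb{E}_f\bigl[ I(x \in A) \bigr]
  = \int f(x)\, I(x \in A)\, dx
  = \int \frac{f(x)}{g(x)}\, g(x)\, I(x \in A)\, dx
  = \mathbb{E}_g\bigl[ w(x)\, I(x \in A) \bigr],
\qquad w(x) := \frac{f(x)}{g(x)}
\]
% The proposal must cover the target: f(x) > 0 must imply g(x) > 0.
```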
Practical considerations and properties
• Density estimation (or density extraction): we may want a continuous description of the belief, not the discrete approximation given by the particles
• Sampling variance: statistics computed from the particles differ from the statistics of the original densities
• Resampling
• Sampling bias and particle deprivation: not treated here
Density estimation
• Methods:
  – Gaussian approximation: simple, but captures the distribution only approximately and can represent unimodal beliefs only
  – Histogram approximation: can represent multimodal distributions; computationally highly efficient, since the complexity of computing the density at any state point is independent of the number of particles
  – Kernel density approximation: can represent multimodal distributions, with smoothness and algorithmic simplicity; the complexity of computing the density at any state point is linear in the number of particles (see the sketch below)
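A minimal 1-D Python sketch of the kernel density approximation mentioned above, using Gaussian kernels; the bandwidth value is an illustrative assumption.

```python
import numpy as np

def kernel_density(x, particles, bandwidth=0.1):
    """Gaussian-kernel density estimate of the belief at query point x.

    Each particle contributes one Gaussian kernel, so the cost of a single
    query grows linearly with the number of particles.
    """
    particles = np.asarray(particles, dtype=float)
    norm = bandwidth * np.sqrt(2.0 * np.pi) * len(particles)
    return np.sum(np.exp(-0.5 * ((x - particles) / bandwidth) ** 2)) / norm
```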
Gaussian approximation
[Figure: example of a Gaussian approximation extracted from a particle set.]
Histogram approximation
[Figure: example of a histogram approximation extracted from a particle set.]
Kernel approximation
[Figure: example of a kernel density approximation extracted from a particle set.]
Sampling variance
[Figures: particle approximations obtained with 250 particles and with 25 particles.]
Resampling
• Sampling variance is amplified by repetitive resampling
• Look at step 3 of the algorithm: it may happen that no command signal ut is applied
• In that case no new states are introduced at successive steps
• Particles are erased during resampling and new ones are not created
• Eventually, M identical copies of a single particle will survive
• The variance of the particle set decreases, but the variance of the particle set as an estimator of the true belief increases (a standard remedy is sketched below)
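One standard remedy discussed in the reference textbook (not shown on this slide, so treat the choice as an assumption about the intended strategy) is the low-variance resampler sketched below; reducing the frequency of resampling is another common measure.

```python
import random

def low_variance_resample(particles, weights):
    """Low-variance (systematic) resampler.

    A single random offset positions M equally spaced pointers along the
    cumulative weight distribution, so identical copies of one particle are
    produced far less often than with M independent draws.
    """
    M = len(particles)
    step = sum(weights) / M
    r = random.uniform(0.0, step)   # single random offset
    c = weights[0]                  # running cumulative weight
    i = 0
    resampled = []
    for m in range(M):
        u = r + m * step
        while u > c:
            i += 1
            c += weights[i]
        resampled.append(particles[i])
    return resampled
```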
Summary and conclusions
• The histogram filter decomposes the state space into finitely many convex regions
• It represents the cumulative posterior probability of each region by a single numerical value
• Many state space decomposition techniques exist. The granularity of the decomposition may or may not depend on the structure of the environment; when it does, the decomposition is called topological
• An alternative nonparametric technique is the particle filter. Particle filters are easy to implement and, with due care, are the most versatile of all Bayes filter algorithms
• Specific strategies exist to reduce the error in particle filters:
  – Reduction of the variance of the estimate that arises from the randomness of the algorithm
  – Adaptation of the number of particles to the complexity of the posterior