Two Popular Bayesian Estimators: Particle and Kalman Filters (PowerPoint presentation transcript)


  1. Two Popular Bayesian Estimators: Particle and Kalman Filters, McGill COMP 765, Sept 14th, 2017

  2. Recall: Bayes Filters (x = state, z = observation, u = action)

     Bel(x_t) = P(x_t | u_1, z_1, ..., u_t, z_t)
       (Bayes)       = η P(z_t | x_t, u_1, z_1, ..., u_t) P(x_t | u_1, z_1, ..., u_t)
       (Markov)      = η P(z_t | x_t) P(x_t | u_1, z_1, ..., u_t)
       (Total prob.) = η P(z_t | x_t) ∫ P(x_t | u_1, z_1, ..., u_t, x_{t-1}) P(x_{t-1} | u_1, z_1, ..., u_t) dx_{t-1}
       (Markov)      = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) P(x_{t-1} | u_1, z_1, ..., u_t) dx_{t-1}
       (Markov)      = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) P(x_{t-1} | u_1, z_1, ..., z_{t-1}) dx_{t-1}
                     = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) Bel(x_{t-1}) dx_{t-1}

  3. Discrete Bayes Filter Algorithm

     Algorithm Discrete_Bayes_filter(Bel(x), d):
     1.  η = 0
     2.  If d is a perceptual data item z then
     3.    For all x do
     4.      Bel'(x) = P(z | x) Bel(x)
     5.      η = η + Bel'(x)
     6.    For all x do
     7.      Bel'(x) = η⁻¹ Bel'(x)
     8.  Else if d is an action data item u then
     9.    For all x do
     10.     Bel'(x) = Σ_{x'} P(x | u, x') Bel(x')
     11. Return Bel'(x)
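The two branches of the discrete algorithm above can be sketched directly in Python. This is a minimal illustration, not course code: beliefs are dictionaries mapping discrete states to probability mass, and the sensor and motion models are passed in as hypothetical callables.

```python
def measurement_update(bel, z, p_z_given_x):
    """Perceptual step: Bel'(x) = η⁻¹ P(z|x) Bel(x), normalized over all states."""
    new_bel = {x: p_z_given_x(z, x) * p for x, p in bel.items()}
    eta = sum(new_bel.values())
    return {x: p / eta for x, p in new_bel.items()}

def action_update(bel, u, p_x_given_u_xprev):
    """Action step: Bel'(x) = Σ_{x'} P(x|u,x') Bel(x'). No normalization needed."""
    states = list(bel)
    return {x: sum(p_x_given_u_xprev(x, u, xp) * bel[xp] for xp in states)
            for x in states}

# Hypothetical two-state example: a sensor that reports the true state
# with probability 0.8, starting from a uniform prior.
bel = {0: 0.5, 1: 0.5}
bel = measurement_update(bel, z=1, p_z_given_x=lambda z, x: 0.8 if z == x else 0.2)
# bel[1] is now 0.8
```

After the perceptual update, the mass simply shifts toward states that explain the observation; an action update would then smear that mass through the transition model.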

  4. Piecewise Constant Bel(x)

  5. Problem Statement
     • Which representations for Bel(x), with matching update rules, work well in practice?

       Bel(x_t) = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) Bel(x_{t-1}) dx_{t-1}

     • Desirable properties:
       • Accuracy and correctness
       • Time and space usage that scales well with the size of the state and the number of dimensions
       • Ability to represent a realistic range of motion and measurement models

  6. Part 1: Particle Filters • Intuition: track Bel(x) with adaptively located discrete samples • Potential benefits: • Better accuracy/computation trade-off • Particles can take the shape of arbitrary distributions • Uses: • Indoor robotics • Self-driving cars • Computer vision • General tool in learning

  7. Intuitive Example: Localizing During Robocup

  8. Distributions Consider the distributions p(x | z_i) from each detection alone. How are these related to our answer?

  9. Distributions Wanted: samples distributed according to p(x | z_1, z_2, z_3)

  10. This is Easy! We can draw samples from p(x | z_l) by adding noise to the detection parameters.

  11. Importance Sampling • As seen, it is often easy to draw samples from one portion of our Bayes filter • Main trick: importance sampling, i.e. how to estimate properties/statistics of one distribution (f) given samples from another distribution (g). For example, suppose we want to estimate the expected value of f given only samples from g.

  12. Importance Sampling • As seen, it is often easy to draw samples from one portion of our Bayes filter • Main trick: importance sampling, i.e. how to estimate properties/statistics of one distribution (f) given samples from another distribution (g)

  13. Importance Sampling • As seen, it is often easy to draw samples from one portion of our Bayes filter • Main trick: importance sampling, i.e. how to estimate properties/statistics of one distribution (f) given samples from another distribution (g). Weights describe the mismatch between the two distributions, i.e. how to reweight samples to obtain statistics of f from samples of g.
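The expected-value example on the previous slides can be sketched concretely. Below is a minimal self-normalized importance sampler; the particular target f = N(2, 1) and proposal g = N(0, 3) are illustrative choices, not from the slides.

```python
import math, random

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def importance_estimate(samples, f_pdf, g_pdf, h=lambda x: x):
    """Self-normalized importance sampling:
    E_f[h(x)] ≈ Σ w_i h(x_i) / Σ w_i, with w_i = f(x_i)/g(x_i) for x_i ~ g."""
    w = [f_pdf(x) / g_pdf(x) for x in samples]
    return sum(wi * h(xi) for wi, xi in zip(w, samples)) / sum(w)

random.seed(0)
g_samples = [random.gauss(0.0, 3.0) for _ in range(200_000)]   # draw from g only
est = importance_estimate(g_samples,
                          f_pdf=lambda x: normal_pdf(x, 2.0, 1.0),  # target f = N(2, 1)
                          g_pdf=lambda x: normal_pdf(x, 0.0, 3.0))  # proposal g = N(0, 3)
# est approximates the mean of f, i.e. it should be close to 2.0
```

The weights do exactly what the slide says: they correct for the mismatch between g (where the samples came from) and f (whose statistics we want).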

  14. Importance Sampling for RoboCup

      Target distribution f:    p(x | z_1, z_2, ..., z_n) = (∏_k p(z_k | x)) p(x) / p(z_1, z_2, ..., z_n)

      Sampling distribution g:  p(x | z_l) = p(z_l | x) p(x) / p(z_l)

      Importance weights:       w = f / g = p(x | z_1, ..., z_n) / p(x | z_l)
                                  = p(z_l) ∏_{k ≠ l} p(z_k | x) / p(z_1, z_2, ..., z_n)

  15. Importance Sampling Here are all of our p(x | z_i) samples, now with weights w attached (not shown). If we re-draw from these samples, weighted by w, we get… (figures: weighted samples; after resampling)

  16. Importance Sampling for Bayes Filter • What are the proposal distribution and the weighting computations? Sample from the propagation step, before the update. We want the posterior belief after the update. Recall: weighting removes the sampling bias.

  17. Importance Sampling for Bayes Filter • What are the proposal distribution and the weighting computations? Sample from the propagation step, before the update. We want the posterior belief after the update. This algorithm is known as a particle filter.

  18. Particle Filter Algorithm Actual observation and control received

  19. Particle Filter Algorithm Particle propagation/prediction: noise needs to be added so that particles remain distinct from one another. If propagation is deterministic, the particles will collapse to a single particle after a few resampling steps.

  20. Particle Filter Algorithm Weight computation as measurement likelihood. For each particle we compute the probability of the actual observation given that the state is at that particle.

  21. Particle Filter Algorithm Resampling step Note: particle deprivation heuristics are not shown here

  22. Particle Filter Algorithm Resampling: The particle locations now have a chance to adapt according to the weights. More likely particles persist, while unlikely choices are removed.
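The propagate/weight/resample cycle described on slides 18–22 can be sketched end to end. This is an illustrative 1-D localization toy, not the course's code: motion is x' = x + u + noise and the observation is a noisy reading of x, both with Gaussian noise parameters chosen arbitrarily here.

```python
import math, random

def particle_filter_step(particles, u, z, motion_noise=0.1, meas_noise=0.5):
    """One predict / weight / resample cycle of a particle filter.
    Toy 1-D model: x' = x + u + w, w ~ N(0, motion_noise^2); z = x + v."""
    n = len(particles)
    # 1. Propagation: sample from the motion model (added noise keeps particles distinct)
    predicted = [x + u + random.gauss(0.0, motion_noise) for x in particles]
    # 2. Weighting: measurement likelihood p(z | x) evaluated at each particle
    weights = [math.exp(-0.5 * ((z - x) / meas_noise) ** 2) for x in predicted]
    total = sum(weights)
    weights = [w / total for w in weights]
    # 3. Resampling: draw n particles with replacement according to the weights
    return random.choices(predicted, weights=weights, k=n)

random.seed(1)
true_x = 0.0
particles = [random.uniform(-10.0, 10.0) for _ in range(2000)]  # uniform initial belief
for _ in range(30):
    true_x += 1.0                                   # robot moves one unit per step
    z = true_x + random.gauss(0.0, 0.5)             # simulated noisy observation
    particles = particle_filter_step(particles, u=1.0, z=z)
mean_x = sum(particles) / len(particles)            # point estimate from the cloud
```

After a few steps the cloud collapses around the true trajectory; `mean_x` is the "weighted summation over particles" decision rule from the summary slide.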

  23. Examples: 1D Localization

  24. Examples: 1D Localization

  25. Resampling • Given: a set S of weighted samples. • Wanted: a random sample, where the probability of drawing x_i is given by w_i. • Typically done n times with replacement to generate the new sample set S'.

  26. Resampling Carefully (figure: two resampling wheels over weights w_1, w_2, w_3, ..., w_{n-1}, w_n)
      • Roulette wheel sampling: binary search per draw, O(n log n)
      • Stochastic universal (systematic) resampling: linear time complexity, easy to implement, low variance

  27. Resampling Algorithm

     Algorithm systematic_resampling(S, n):
     1.  S' = ∅, c_1 = w_1
     2.  For i = 2 ... n                       // generate CDF
     3.    c_i = c_{i-1} + w_i
     4.  u_1 ~ U(0, 1/n], i = 1                // initialize threshold
     5.  For j = 1 ... n                       // draw samples
     6.    While u_j > c_i                     // skip until next threshold reached
     7.      i = i + 1
     8.    S' = S' ∪ {⟨x_i, 1/n⟩}              // insert
     9.    u_{j+1} = u_j + 1/n                 // increment threshold
     10. Return S'

     Also called stochastic universal sampling.
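The algorithm above translates almost line for line into Python. One uniform draw fixes all n thresholds, which is what gives the method its linear time and low variance; the extra `i < n - 1` guard is a defensive addition against floating-point rounding in the cumulative sum, not part of the pseudocode.

```python
import random

def systematic_resample(samples, weights):
    """Systematic (stochastic universal) resampling: one draw u ~ U(0, 1/n],
    then thresholds u, u + 1/n, ... swept across the cumulative weights in a
    single linear pass. `weights` must sum to 1."""
    n = len(samples)
    u = random.uniform(0.0, 1.0 / n)     # single random draw fixes all thresholds
    c, i, out = weights[0], 0, []
    for _ in range(n):
        while u > c and i < n - 1:       # skip until the next threshold is reached
            i += 1
            c += weights[i]              # build the CDF lazily as we sweep
        out.append(samples[i])           # each new particle carries weight 1/n
        u += 1.0 / n                     # increment threshold
    return out
```

Because the thresholds are evenly spaced, a sample with normalized weight w_i appears either ⌊n·w_i⌋ or ⌈n·w_i⌉ times, which is the low-variance property the previous slide contrasts with roulette-wheel sampling.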

  28. Particle Motion Model Start

  29. Proximity Sensor Model Reminder Sonar sensor Laser sensor

  30.–45. (image-only slides: no extractable text)

  46. Particle Filter Summary • Very flexible tool as we get to make our choice of proposal distributions (as long as we can properly compute importance weight) • Performance is guaranteed given infinite samples! • The particle cloud and its weights represent our distribution, but making decisions can still be complex: • Act based on the most likely particle • Act using a weighted summation over particles • Act conservatively, accounting for the worst particle • In practice, the number of particles required to perform well scales with the problem complexity and this can be hard to measure

  47. Part 2: Kalman Filters • Intuition: track Bel(x) with a Gaussian distribution, simplifying assumptions to ensure updates are all possible • Payoffs: • Continuous representation • Efficient computation • Uses: • Rocketry • Mobile devices • Drones • GPS • (the list is very long…)

  49. Example: Landing on Mars

  50. Kalman Filter: Approach The Kalman Filter is an instance of the Bayes' Filter with: • Linear dynamics with Gaussian noise • Linear observations with Gaussian noise • A Gaussian initial belief

  51. Kalman Filter: Assumptions • Two assumptions inherited from the Bayes' Filter • Linear dynamics and observation models • Initial belief is Gaussian • Noise variables and initial state are jointly Gaussian and independent • Noise variables are independent and identically distributed
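Under these assumptions the belief stays Gaussian forever, so the whole filter reduces to updating a mean and a variance. Below is a minimal 1-D sketch of the predict/update cycle; the model x' = a·x + b·u + w, z = c·x + v and all numeric parameters are illustrative assumptions, not values from the slides.

```python
def kalman_step(mu, var, u, z, a=1.0, b=1.0, c=1.0, q=0.01, r=0.25):
    """One predict/update cycle of a 1-D Kalman filter.
    Dynamics: x' = a*x + b*u + w, w ~ N(0, q).  Observation: z = c*x + v, v ~ N(0, r)."""
    # Predict: push the Gaussian belief through the linear dynamics
    mu_pred = a * mu + b * u
    var_pred = a * a * var + q
    # Update: the Kalman gain blends the prediction and the measurement
    k = var_pred * c / (c * c * var_pred + r)
    mu_new = mu_pred + k * (z - c * mu_pred)
    var_new = (1.0 - k * c) * var_pred
    return mu_new, var_new

mu, var = 0.0, 100.0                         # vague Gaussian prior
for _ in range(20):
    mu, var = kalman_step(mu, var, u=0.0, z=5.0)
# mu converges toward the repeated measurement, 5.0, and var shrinks
```

Each step is a handful of scalar operations, which is the "efficient computation" payoff claimed on slide 47; the multivariate version replaces the scalars with matrices but has the same two-phase structure.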
