CSE-571 Probabilistic Robotics
Bayes Filter Implementations: Particle Filters
Motivation
§ So far, we discussed the
  § Kalman filter: Gaussian, linearization problems
  § Discrete filter: high memory complexity
§ Particle filters are a way to efficiently represent non-Gaussian distributions
§ Basic principle
  § Set of state hypotheses ("particles")
  § Survival of the fittest
Sample-based Localization (sonar)
Probabilistic Robotics, 1/21/12
Function Approximation
§ Particle sets can be used to approximate functions
§ The more particles fall into an interval, the higher the probability of that interval
§ How to draw samples from a function/distribution?
Rejection Sampling
§ Let us assume that f(x) < 1 for all x
§ Sample x from a uniform distribution
§ Sample c from [0,1]
§ If f(x) > c, keep the sample; otherwise reject the sample
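The steps above can be sketched in a few lines. This is a minimal illustration, not part of the original slides; the names `f`, `lo`, `hi`, and `n` are chosen for the sketch, and `f` is assumed bounded by 1 on [lo, hi] as the slide states.

```python
import random

def rejection_sample(f, lo, hi, n):
    """Draw n samples from a density proportional to f on [lo, hi].

    Assumes f(x) <= 1 for all x in [lo, hi].
    """
    samples = []
    while len(samples) < n:
        x = random.uniform(lo, hi)    # sample x uniformly
        c = random.uniform(0.0, 1.0)  # sample threshold c from [0, 1]
        if f(x) > c:                  # accept x with probability f(x)
            samples.append(x)
    return samples
```

Accepted points are distributed proportionally to f, at the cost of discarding rejected draws, which becomes wasteful when f is small over most of its domain.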
Importance Sampling Principle
§ We can even use a different distribution g to generate samples from f
§ By introducing an importance weight w, we can account for the "differences between g and f":
  w = f / g
§ f is often called the target
§ g is often called the proposal
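The weight w = f / g can be evaluated pointwise for each sample. Below is a small sketch (not from the slides): samples come from a proposal g, and the self-normalized weighted average recovers an expectation under the target f. The helper `normal_pdf` and the choice of Gaussians are illustrative assumptions.

```python
import math
import random

def importance_weights(samples, f, g):
    """Importance weights w_i = f(x_i) / g(x_i) for samples drawn from g."""
    return [f(x) / g(x) for x in samples]

def normal_pdf(x, mu, sigma):
    """Density of a Gaussian N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
```

For example, drawing from g = N(0, 1) and weighting by w = f / g with target f = N(1, 1) lets us estimate the target mean from proposal samples alone.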
Importance Sampling with Resampling: Landmark Detection Example
Distributions
Wanted: samples distributed according to p(x | z_1, z_2, z_3)
This is Easy!
We can draw samples from p(x | z_l) by adding noise to the detection parameters.
Importance Sampling with Resampling

Target distribution f:
  p(x | z_1, z_2, ..., z_n) = p(x) ∏_k p(z_k | x) / p(z_1, z_2, ..., z_n)

Sampling distribution g:
  p(x | z_l) = p(z_l | x) p(x) / p(z_l)

Importance weights w = f / g:
  p(x | z_1, z_2, ..., z_n) / p(x | z_l) = p(z_l) ∏_{k ≠ l} p(z_k | x) / p(z_1, z_2, ..., z_n)
Importance Sampling with Resampling
(Figures: weighted samples, and the particle set after resampling)
Particle Filter Projection
Density Extraction
Sampling Variance
Particle Filters
Sensor Information: Importance Sampling

  Bel(x) ← α p(z | x) Bel^-(x)

  w ← α p(z | x) Bel^-(x) / Bel^-(x) = α p(z | x)
Robot Motion

  Bel^-(x) ← ∫ p(x | u, x') Bel(x') dx'
Particle Filter Algorithm

1. Algorithm particle_filter(S_{t-1}, u_{t-1}, z_t):
2.   S_t = ∅, η = 0
3.   For i = 1 … n                           // Generate new samples
4.     Sample index j(i) from the discrete distribution given by w_{t-1}
5.     Sample x_t^i from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{j(i)} and u_{t-1}
6.     w_t^i = p(z_t | x_t^i)                // Compute importance weight
7.     η = η + w_t^i                         // Update normalization factor
8.     S_t = S_t ∪ {<x_t^i, w_t^i>}          // Insert
9.   For i = 1 … n
10.    w_t^i = w_t^i / η                     // Normalize weights
Particle Filter Algorithm

  Bel(x_t) = η p(z_t | x_t) ∫ p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) dx_{t-1}

§ Draw x_{t-1}^i from Bel(x_{t-1})
§ Draw x_t^i from p(x_t | x_{t-1}^i, u_{t-1})

Importance factor for x_t^i:

  w_t^i = target distribution / proposal distribution
        = η p(z_t | x_t) p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) / [ p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) ]
        ∝ p(z_t | x_t)
Resampling
• Given: set S of weighted samples.
• Wanted: random sample, where the probability of drawing x_i is given by w_i.
• Typically done n times with replacement to generate the new sample set S'.
Resampling
• Roulette wheel
  • Binary search, O(n log n)
• Stochastic universal sampling (systematic resampling)
  • Linear time complexity
  • Easy to implement, low variance
Resampling Algorithm

1. Algorithm systematic_resampling(S, n):
2.   S' = ∅, c_1 = w^1
3.   For i = 2 … n                           // Generate cdf
4.     c_i = c_{i-1} + w^i
5.   u_1 ~ U[0, n^{-1}], i = 1               // Initialize threshold
6.   For j = 1 … n                           // Draw samples
7.     While (u_j > c_i)                     // Skip until next threshold reached
8.       i = i + 1
9.     S' = S' ∪ {<x^i, n^{-1}>}             // Insert
10.    u_{j+1} = u_j + n^{-1}                // Increment threshold
11.  Return S'

Also called stochastic universal sampling.
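A direct translation of this algorithm, offered as a sketch (the function and argument names are chosen for illustration; weights are assumed already normalized):

```python
import random

def systematic_resampling(samples, weights, n):
    """Systematic (stochastic universal) resampling.

    Draws n equally weighted samples using one random offset and
    n evenly spaced thresholds, giving linear time and low variance.
    """
    # Generate the cdf over the input samples.
    c, cdf = 0.0, []
    for w in weights:
        c += w
        cdf.append(c)
    # Initialize the threshold: one uniform draw in [0, 1/n].
    u = random.uniform(0.0, 1.0 / n)
    out, i = [], 0
    for _ in range(n):
        # Skip until the next threshold is reached (guard against float roundoff).
        while i < len(cdf) - 1 and u > cdf[i]:
            i += 1
        out.append(samples[i])  # insert with implicit weight 1/n
        u += 1.0 / n            # increment the threshold
    return out
```

Because the thresholds are evenly spaced, a sample with weight w appears either floor(n·w) or ceil(n·w) times, which is the source of the low variance compared with drawing n independent roulette-wheel spins.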
Motion Model Reminder
(Figure: sampled poses propagated from the start pose)
Proximity Sensor Model Reminder
(Figures: sonar sensor, laser sensor)
Using Ceiling Maps for Localization [Dellaert et al. 99]
Vision-based Localization
(Figure: measurement z, predicted measurement h(x), likelihood P(z|x))
Under a Light: measurement z and P(z|x) (figures)
Next to a Light: measurement z and P(z|x) (figures)
Elsewhere: measurement z and P(z|x) (figures)
Global Localization Using Vision
Recovery from Failure
Localization for AIBO robots
Adaptive Sampling
KLD-sampling
• Idea:
  • Assume we know the true belief.
  • Represent this belief as a multinomial distribution.
  • Determine the number of samples such that we can guarantee that, with probability (1 − δ), the KL-distance between the true posterior and the sample-based approximation is less than ε.
• Observation:
  • For fixed δ and ε, the number of samples only depends on the number k of bins with support:

    n = (1 / (2ε)) χ²_{k−1, 1−δ} ≅ ((k − 1) / (2ε)) { 1 − 2 / (9(k−1)) + sqrt(2 / (9(k−1))) z_{1−δ} }³
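The sample-count bound above can be computed directly. The sketch below (an illustration, not code from the course) uses the Wilson-Hilferty approximation of the chi-square quantile exactly as in the formula, with the standard-normal quantile z_{1−δ} taken from the Python standard library; it assumes k ≥ 2 bins with support.

```python
import math
import statistics

def kld_sample_count(k, epsilon, delta):
    """Number of particles n so that, with probability (1 - delta),
    the KL-distance between the true posterior and the sample-based
    approximation stays below epsilon, for k bins with support.
    """
    # z-score for the (1 - delta) quantile of the standard normal.
    z = statistics.NormalDist().inv_cdf(1.0 - delta)
    a = 2.0 / (9.0 * (k - 1))
    # Wilson-Hilferty approximation of chi^2_{k-1, 1-delta} / (2 * epsilon).
    return (k - 1) / (2.0 * epsilon) * (1.0 - a + math.sqrt(a) * z) ** 3
```

The bound grows roughly linearly in k, so the adaptive filter uses few particles when the belief is concentrated in a handful of bins and many when it is spread out.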
Adaptive Particle Filter Algorithm

1. Algorithm adaptive_particle_filter(S_{t-1}, u_{t-1}, z_t, Δ, ε, δ):
2.   S_t = ∅, η = 0, n = 0, k = 0, b = ∅
3.   Do                                       // Generate new samples
4.     Sample index j(n) from the discrete distribution given by w_{t-1}
5.     Sample x_t^n from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{j(n)} and u_{t-1}
6.     w_t^n = p(z_t | x_t^n)                 // Compute importance weight
7.     η = η + w_t^n                          // Update normalization factor
8.     S_t = S_t ∪ {<x_t^n, w_t^n>}           // Insert
9.     If (x_t^n falls into an empty bin b)   // Update bins with support
10.      k = k + 1, mark b non-empty
11.    n = n + 1
12.  While (n < (1 / (2ε)) χ²_{k−1, 1−δ})
13.  For i = 1 … n
14.    w_t^i = w_t^i / η                      // Normalize weights
Example Run Sonar
Example Run Laser
Evaluation
Localization Algorithms - Comparison

                      Kalman    Multi-       Topological  Grid-based        Particle
                      filter    hypothesis   maps         (fixed/variable)  filter
                                tracking
Sensors               Gaussian  Gaussian     Features     Non-Gaussian      Non-Gaussian
Posterior             Gaussian  Multi-modal  Piecewise    Piecewise         Samples
                                             constant     constant
Efficiency (memory)   ++        ++           ++           -/o               +/++
Efficiency (time)     ++        ++           ++           o/+               +/++
Implementation        +         o            +            +/o               ++
Accuracy              ++        ++           -            +/++              ++
Robustness            -         +            +            ++                +/++
Global localization   No        Yes          Yes          Yes               Yes