Localization and Mapping (Chapter 25.3)
Sensors

Range finders: sonar (land, underwater), laser range finder, radar (aircraft), tactile sensors, GPS

Imaging sensors: cameras (visual, infrared)

Proprioceptive sensors: shaft decoders (joints, wheels), inertial sensors, force sensors, torque sensors
Localization: Where Am I?

Compute current location and orientation (pose) given observations.

[Figure: dynamic Bayes net with actions A_{t-2}, A_{t-1}, A_t, poses X_{t-1}, X_t, X_{t+1}, and observations Z_{t-1}, Z_t, Z_{t+1}]
Localization contd.

[Figure: robot at pose x_t with heading θ_t drives with velocity v_t and angular velocity ω_t over interval ∆t to pose x_{t+1}; sensor model h(x_t) gives ranges Z_1, Z_2, Z_3, Z_4; landmark at (x_i, y_i)]

Motion model: x_{t+1} = x_t + (v_t ∆t cos θ_t, v_t ∆t sin θ_t, ω_t ∆t)

Assume Gaussian noise in the motion prediction and in the sensor range measurements.
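The noisy motion prediction above can be sketched as follows. This is a minimal illustration, not the book's code; the function name `predict_pose` and the noise standard deviations are assumptions made for the example.

```python
import numpy as np

def predict_pose(pose, v, omega, dt, rng, noise_std=(0.05, 0.05, 0.01)):
    """Sample a noisy next pose (x, y, theta) from the velocity motion model.

    The noise standard deviations are illustrative, not from the slides.
    """
    x, y, theta = pose
    # Deterministic prediction: move v*dt along the heading, turn omega*dt.
    predicted = np.array([x + v * dt * np.cos(theta),
                          y + v * dt * np.sin(theta),
                          theta + omega * dt])
    # Additive Gaussian noise on each pose component, as assumed on the slide.
    return predicted + rng.normal(0.0, noise_std)

rng = np.random.default_rng(0)
next_pose = predict_pose((0.0, 0.0, 0.0), v=1.0, omega=0.1, dt=0.5, rng=rng)
```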
Localization contd.

Can use particle filtering to produce an approximate position estimate.

[Figure: three successive particle clouds, each panel labeled "Robot position", showing the particles concentrating over time]
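One step of particle-filter (Monte Carlo) localization can be sketched as below: predict each particle with the noisy motion model, weight by the likelihood of a range measurement, then resample. The single-landmark sensor model, the function name `mcl_step`, and the noise parameters are assumptions made for this sketch.

```python
import numpy as np

def mcl_step(particles, v, omega, dt, z, landmark, rng,
             motion_std=0.05, range_std=0.2):
    """One Monte Carlo localization step: predict, weight, resample.

    particles: (N, 3) array of (x, y, theta) pose hypotheses.
    z: observed range to a single landmark at known position `landmark`.
    """
    n = len(particles)
    theta = particles[:, 2]
    # Predict: apply the velocity motion model plus Gaussian noise to all particles.
    particles = particles + np.column_stack([
        v * dt * np.cos(theta),
        v * dt * np.sin(theta),
        np.full(n, omega * dt)]) + rng.normal(0.0, motion_std, (n, 3))
    # Weight: Gaussian likelihood of the range measurement under each particle.
    pred_range = np.hypot(particles[:, 0] - landmark[0],
                          particles[:, 1] - landmark[1])
    w = np.exp(-0.5 * ((z - pred_range) / range_std) ** 2)
    w /= w.sum()
    # Resample: draw N particles with probability proportional to weight.
    idx = rng.choice(n, size=n, p=w)
    return particles[idx]
```

Repeating this step as new controls and measurements arrive makes the particle cloud concentrate around the true pose, as in the figure.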
Localization contd.

Can also use an extended Kalman filter for simple cases.

[Figure: robot trajectory and landmark, with pose-uncertainty ellipses]

Assumes that landmarks are identifiable; otherwise, the posterior is multimodal.
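The EKF handles the nonlinear range sensor h(x) by linearizing it around the current mean. A minimal sketch of one EKF measurement update for a range observation to a known, identified landmark (the function name and the measurement variance are assumptions made for the example):

```python
import numpy as np

def ekf_range_update(mu, Sigma, z, landmark, range_var=0.04):
    """EKF measurement update for one range observation to a known landmark.

    mu: mean pose estimate (x, y, theta); Sigma: 3x3 pose covariance.
    Linearizes h(x) = distance to the landmark around the current mean mu.
    """
    mu = np.asarray(mu, float)
    dx, dy = mu[0] - landmark[0], mu[1] - landmark[1]
    r = np.hypot(dx, dy)                   # predicted range h(mu)
    H = np.array([[dx / r, dy / r, 0.0]])  # Jacobian of h at mu (1x3)
    S = H @ Sigma @ H.T + range_var        # innovation covariance
    K = Sigma @ H.T / S                    # Kalman gain (3x1)
    mu = mu + (K * (z - r)).ravel()        # correct the mean
    Sigma = (np.eye(3) - K @ H) @ Sigma    # shrink the covariance
    return mu, Sigma
```

Because the update keeps a single Gaussian, it breaks down exactly in the case the slide warns about: unidentifiable landmarks make the true posterior multimodal, which one Gaussian cannot represent.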
Mapping

Localization: given map and observed landmarks, update pose distribution.

Mapping: given pose and observed landmarks, update map distribution.

SLAM (simultaneous localization and mapping): given observed landmarks, update pose and map distributions.

Probabilistic formulation of SLAM: add landmark locations L_1, ..., L_k to the state vector and proceed as for localization.
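Growing the state vector when a new landmark is first observed can be sketched as below. This is an illustrative simplification: the landmark is initialized from a range-bearing observation, and a large prior variance stands in for the exact Jacobian-based covariance initialization; the function name and parameters are assumptions.

```python
import numpy as np

def augment_state(mu, Sigma, z_range, z_bearing, prior_var=1e3):
    """Add a newly observed landmark to the SLAM state vector.

    mu holds [x, y, theta, l1x, l1y, ..., lkx, lky]; the new landmark
    position is initialized from a range-bearing observation taken at
    the current pose estimate.
    """
    x, y, theta = mu[:3]
    # Project the observation into world coordinates from the current pose.
    lx = x + z_range * np.cos(theta + z_bearing)
    ly = y + z_range * np.sin(theta + z_bearing)
    mu = np.append(mu, [lx, ly])
    # Grow the covariance; a large diagonal prior is an illustrative
    # stand-in for the exact initialization.
    n = len(Sigma)
    Sigma_new = np.zeros((n + 2, n + 2))
    Sigma_new[:n, :n] = Sigma
    Sigma_new[n, n] = Sigma_new[n + 1, n + 1] = prior_var
    return mu, Sigma_new
```

After augmentation, filtering proceeds exactly as for localization, except that measurement updates also correct the landmark entries of the state.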
Mapping contd.

[Figure]