SLAM (PowerPoint Presentation) by Dr.-Ing. Ahmad Kamal Nasir
  1. SLAM (Dr.-Ing. Ahmad Kamal Nasir)

  2. The SLAM Problem
     • Given: robot controls, nearby measurements
     • Estimate: robot state (position, orientation), map of world features

  3. Motivation and Challenges
     • Advantages: faster objective completion time, tasks can be re-assigned if a robot fails, and tasks can be accomplished that are beyond the capability of a single robot
     • Challenges:
       – Map merging in large, dynamic, sparse outdoor environments
       – Controlling and managing a multi-robot system is difficult because the system must handle multiple robots with heterogeneous capabilities
       – A standard software architecture is needed to avoid re-implementation of basic communication and non-interoperability
     • Application: multi-robot map building in the absence of an a priori map, e.g. sea ports, destroyed nuclear plants

  4. Types of Sensors
     • Odometry
     • Light Detection and Ranging (LIDAR)
     • Acoustic (sonar, ultrasonic)
     • Radar
     • Vision (monocular, stereo, etc.)
     • GPS
     • Gyroscopes, accelerometers (inertial navigation)

  5. Sensor Characteristics
     • Noise
     • Dimensionality of output
       – LIDAR: 3D points
       – Vision: bearing only (2D ray in space)
     • Range
     • Frame of reference
       – Most are in the robot frame (vision, LIDAR, etc.)
       – GPS: Earth-centered coordinate frame
       – Accelerometers/gyros: inertial coordinate frame

  6. Problem Statement (Probabilistic Approach)
     Formalize $p(x_t, m \mid z_t, u_t, d_t)$ for a team of mobile robots, where
     • $x_t$ is the state of the robots at time step $t$
     • $m$ is the map
     • $z_t$ are the robots' measurements
     • $u_t$ are the control inputs
     • $d_t$ is the data association function

  7. Full vs. Online SLAM
     • Full SLAM estimates the robot state over all time up to time $t$:
       $p(x_{1:t}, m \mid z_{1:t}, u_{1:t})$
     • Online SLAM estimates only the robot state at the current time $t$ by marginalizing out the past poses:
       $p(x_t, m \mid z_{1:t}, u_{1:t}) = \int \cdots \int p(x_{1:t}, m \mid z_{1:t}, u_{1:t}) \, dx_1 \cdots dx_{t-1}$

  8. Full vs. Online SLAM
     • Full SLAM: $p(x_{1:t}, m \mid z_{1:t}, u_{1:t})$
     • Online SLAM: $p(x_t, m \mid z_{1:t}, u_{1:t}) = \int \cdots \int p(x_{1:t}, m \mid z_{1:t}, u_{1:t}) \, dx_1 \cdots dx_{t-1}$
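A small illustration of the relationship between the two (not from the slides, and assuming Gaussian posteriors as in EKF SLAM): marginalizing out the past poses of a jointly Gaussian full-SLAM estimate amounts to keeping the sub-vector of the mean and the sub-block of the covariance that correspond to the current pose and the map. The dimensions and numbers below are hypothetical.

```python
# Minimal sketch: Gaussian marginalization = selecting mean entries and covariance blocks.
import numpy as np

# Hypothetical joint estimate over [x1, x2, m] (each 1-D here for brevity)
mean = np.array([0.5, 1.2, 3.0])           # [x1, x2, m]
cov = np.array([[0.10, 0.05, 0.02],
                [0.05, 0.20, 0.04],
                [0.02, 0.04, 0.30]])

keep = [1, 2]                              # marginalize out x1, keep x_t and m
mean_online = mean[keep]
cov_online = cov[np.ix_(keep, keep)]
print(mean_online, cov_online)
```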

  9. Two Example SLAM Algorithms
     • Extended Kalman Filter (EKF) SLAM
       – Solves the online SLAM problem
       – Uses a linearized Gaussian probability distribution model
     • FastSLAM
       – Solves the full SLAM problem
       – Uses a sampled particle filter distribution model

  10. Extended Kalman Filter SLAM
     • Solves the online SLAM problem using a linearized Kalman filter
     • One of the first probabilistic SLAM algorithms
     • Not used frequently today, but shown here mainly for its explanatory value

  11. Kalman Filter Components
     • Linear discrete-time dynamic system (motion model):
       $x_{t+1} = F_t x_t + B_t u_t + G_t w_t$
       where $F_t$ is the state transition function, $B_t$ the control input function, and $G_t$ the noise input function; the process noise $w_t$ has covariance $Q$
     • Measurement equation (sensor model):
       $z_{t+1} = H_{t+1} x_{t+1} + n_{t+1}$
       where $H_{t+1}$ is the sensor function and the sensor noise $n_{t+1}$ has covariance $R$
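A minimal sketch of these two models, assuming a hypothetical 1-D constant-velocity system; the matrices and noise values below are illustrative, not from the slides.

```python
# Minimal sketch: one step of x_{t+1} = F x_t + B u_t + G w_t and z_{t+1} = H x_{t+1} + n_{t+1}.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (position, velocity)
B = np.array([[0.5 * dt**2], [dt]])     # control input (acceleration command)
G = np.eye(2)                           # noise input
Q = np.diag([1e-4, 1e-3])               # process noise covariance
H = np.array([[1.0, 0.0]])              # sensor observes position only
R = np.array([[0.05]])                  # measurement noise covariance

rng = np.random.default_rng(0)
x = np.array([0.0, 0.0])                # initial state
u = np.array([0.2])                     # control input

w = rng.multivariate_normal(np.zeros(2), Q)      # process noise sample
x = F @ x + B @ u + G @ w                        # motion model
z = H @ x + rng.multivariate_normal(np.zeros(1), R)   # sensor model
print(x, z)
```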

  12. EKF Equations
     • Propagation (motion model):
       $\hat{x}_{t+1|t} = F_t \hat{x}_{t|t} + B_t u_t$
       $P_{t+1|t} = F_t P_{t|t} F_t^T + G_t Q_t G_t^T$
     • Update (sensor model):
       $\hat{z}_{t+1} = H_{t+1} \hat{x}_{t+1|t}$
       $r_{t+1} = z_{t+1} - \hat{z}_{t+1}$
       $S_{t+1} = H_{t+1} P_{t+1|t} H_{t+1}^T + R_{t+1}$
       $K_{t+1} = P_{t+1|t} H_{t+1}^T S_{t+1}^{-1}$
       $\hat{x}_{t+1|t+1} = \hat{x}_{t+1|t} + K_{t+1} r_{t+1}$
       $P_{t+1|t+1} = P_{t+1|t} - P_{t+1|t} H_{t+1}^T S_{t+1}^{-1} H_{t+1} P_{t+1|t}$
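These equations translate almost line for line into code. Below is a minimal numpy sketch, assuming the linear F, B, G, H model from the previous slide; function and variable names are illustrative, not from the lecture material.

```python
# Minimal sketch of the propagation and update equations listed above.
import numpy as np

def ekf_predict(x, P, u, F, B, G, Q):
    # x_{t+1|t} = F x + B u ;  P_{t+1|t} = F P F^T + G Q G^T
    x_pred = F @ x + B @ u
    P_pred = F @ P @ F.T + G @ Q @ G.T
    return x_pred, P_pred

def ekf_update(x_pred, P_pred, z, H, R):
    r = z - H @ x_pred                   # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_upd = x_pred + K @ r
    P_upd = P_pred - K @ S @ K.T         # equals P - P H^T S^{-1} H P
    return x_upd, P_upd
```

With the F, B, G, Q, H, R matrices defined in the previous sketch, `ekf_predict` and `ekf_update` can be called in alternation for each control input and measurement.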

  13. EKF Example (t = 0)
     • Initial state and uncertainty
     • Using range measurements

  14. EKF Example (t = 1)
     • Predict robot pose and uncertainty at time 1

  15. EKF Example (t = 1)
     • Correct pose and pose uncertainty
     • Estimate new feature uncertainties

  16. EKF Example (t = 2)
     • Predict pose and uncertainty of pose at time 2
     • Predict feature measurements and their uncertainties

  17. EKF Example (t = 2)
     • Correct pose and mapped features
     • Update uncertainties for mapped features
     • Estimate uncertainty of new features

  18. Implementation

  19. SLAM
     • Effect of odometric errors on robot uncertainty
     • Feature-based SLAM to reduce robot uncertainty

  20. Feature-Based SLAM
     • 2D line-feature-based SLAM using a laser scanner
     • 3D plane map using Kinect

  21. Occupancy Grid Based SLAM
     • Grid-based SLAM experiment on H-F1
     • Grid-based SLAM experiment on H-F0

  22. Mapping Results
     • Original map, line feature map, grid map, planned trajectory
     • Map using Hough transform, map using RANSAC

  23. SLAM Formulation
     • Robot state: $x_r = (x, y, \theta)^T$
     • Line features: $m_l = (r, \alpha)^T$
     • Plane features: $m_p = (r, \theta, \varphi)^T$
     • Map (robot states + features): $y = (x_{r1}^T, x_{r2}^T, m_{l1}^T, \ldots, m_{p1}^T, \ldots)^T$
     • Map covariance: the block matrix $P$ containing the robot-pose blocks $P_{x_{r1}x_{r1}}, P_{x_{r2}x_{r2}}$, the feature blocks $P_{m_{l1}m_{l1}}, P_{m_{p1}m_{p1}}, \ldots$, and all robot-feature and feature-feature cross-covariance blocks
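A minimal sketch of how this stacked map state and its block covariance might be built, assuming hypothetical dimensions and initial uncertainties (two robot poses, one line feature, one plane feature); the numbers are illustrative only.

```python
# Minimal sketch: stack robot poses and features into one map state with a joint block covariance.
import numpy as np

x_r1 = np.array([0.0, 0.0, 0.0])      # robot 1 pose (x, y, theta)
x_r2 = np.array([1.0, 0.5, 0.1])      # robot 2 pose
m_l1 = np.array([2.0, 0.3])           # line feature (r, alpha)
m_p1 = np.array([3.0, 0.2, 1.1])      # plane feature (r, theta, phi)

y = np.concatenate([x_r1, x_r2, m_l1, m_p1])   # full map state
P = np.zeros((y.size, y.size))                  # joint covariance

# Initial diagonal blocks; the off-diagonal blocks hold robot-feature and
# feature-feature cross-covariances and are filled in as the filter runs.
P[0:3, 0:3]   = np.diag([0.01, 0.01, 0.005])    # robot 1
P[3:6, 3:6]   = np.diag([0.01, 0.01, 0.005])    # robot 2
P[6:8, 6:8]   = np.diag([0.05, 0.02])           # line feature
P[8:11, 8:11] = np.diag([0.05, 0.02, 0.02])     # plane feature
```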

  24. Methodology: Feature Map

  25. Core CSLAM Modules
     • Prediction
     • Clustering/segmentation
     • Feature extraction
     • Correspondence/data association
     • Map update
     • New feature augmentation
     • Map management

  26. Prediction
     • $g(x_r, u_t, w_t)$: robot kinematic motion model
     • $m_l$ represents all existing line features; $m_p$ represents all existing plane features
     • $G_x = \partial g / \partial x_r$: Jacobian with respect to the robot pose
     • $G_w = \partial g / \partial w$: Jacobian with respect to the noise
     • $Q$: covariance of the noise input
     • Prediction propagates only the robot poses through $g$; the features remain unchanged. Each robot's covariance block becomes $G_x P_{rr} G_x^T + G_w Q G_w^T$, the robot-feature cross-covariance blocks become $G_x P_{rm}$, and the feature-feature blocks are left untouched.
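A minimal sketch of this prediction step for a single robot, assuming a hypothetical unicycle motion model (the slides cover two robots and do not specify the model, so this is illustrative only): only the robot block and the robot-feature cross-covariances change; the feature blocks stay fixed.

```python
# Minimal sketch: EKF-SLAM prediction that touches only the robot sub-blocks.
import numpy as np

def predict(mu, P, u, dt, Q, n_r=3):
    x, y, th = mu[:n_r]                    # robot pose
    v, w = u                               # linear and angular velocity commands
    mu_pred = mu.copy()
    mu_pred[0] = x + v * dt * np.cos(th)   # unicycle motion model g(x_r, u)
    mu_pred[1] = y + v * dt * np.sin(th)
    mu_pred[2] = th + w * dt

    # Jacobians with respect to the robot pose (G_x) and the noise input (G_w)
    G_x = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                    [0.0, 1.0,  v * dt * np.cos(th)],
                    [0.0, 0.0, 1.0]])
    G_w = np.array([[dt * np.cos(th), 0.0],
                    [dt * np.sin(th), 0.0],
                    [0.0, dt]])

    P_pred = P.copy()
    P_pred[:n_r, :n_r] = G_x @ P[:n_r, :n_r] @ G_x.T + G_w @ Q @ G_w.T
    P_pred[:n_r, n_r:] = G_x @ P[:n_r, n_r:]          # robot-feature cross terms
    P_pred[n_r:, :n_r] = P_pred[:n_r, n_r:].T
    return mu_pred, P_pred
```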

  27. Clustering / Segmentation
     • Distance between consecutive scan points:
       $D(r_i, r_{i+1}) = \sqrt{r_i^2 + r_{i+1}^2 - 2 r_i r_{i+1} \cos(\Delta\theta)}$
     • Dynamic clustering threshold:
       $D_{th} = C_0 + C_1 \min(r_i, r_{i+1})$, with $C_1 = \sqrt{2(1 - \cos(\Delta\theta))}$
     • A new cluster is started whenever $D(r_i, r_{i+1}) > D_{th}$ (Lee segmentation with dynamic clustering threshold); IEPF segmentation further splits the clusters
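A minimal sketch of this adaptive-threshold segmentation, assuming a constant angular increment between beams and an illustrative constant C0; the exact constants and the IEPF refinement are not reproduced here.

```python
# Minimal sketch: split a laser scan wherever the gap between consecutive
# returns exceeds the adaptive threshold D_th = C0 + C1 * min(r_i, r_{i+1}).
import numpy as np

def segment_scan(ranges, delta_theta, C0=0.05):
    C1 = np.sqrt(2.0 * (1.0 - np.cos(delta_theta)))
    segments, current = [], [0]
    for i in range(len(ranges) - 1):
        r0, r1 = ranges[i], ranges[i + 1]
        # Euclidean distance between consecutive scan points (law of cosines)
        d = np.sqrt(r0**2 + r1**2 - 2.0 * r0 * r1 * np.cos(delta_theta))
        if d > C0 + C1 * min(r0, r1):
            segments.append(current)
            current = []
        current.append(i + 1)
    segments.append(current)
    return segments

# Example: a synthetic scan with one range jump splits into two clusters.
scan = np.array([1.0, 1.01, 1.02, 2.5, 2.51, 2.52])
print(segment_scan(scan, np.deg2rad(1.0)))
```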

  28. Line Feature Extraction
     • Line parameters $(r, \alpha)$ fitted to the $n$ points $(x_i, y_i)$ of a segment:
       $\alpha = \frac{1}{2}\,\mathrm{atan2}\!\left(-2\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y}),\ \sum_{i=1}^{n}\left[(y_i-\bar{y})^2-(x_i-\bar{x})^2\right]\right)$
       $r = \bar{x}\cos\alpha + \bar{y}\sin\alpha$
     • Feature covariance $Q_{\alpha r} = \begin{pmatrix}\sigma_\alpha^2 & \sigma_{\alpha r}\\ \sigma_{r\alpha} & \sigma_r^2\end{pmatrix}$ by first-order propagation of the range variances $\sigma_{\rho_i}^2$:
       $\sigma_\alpha^2 = \sum_{i}\left(\frac{\partial\alpha}{\partial\rho_i}\right)^2\sigma_{\rho_i}^2$, $\quad\sigma_r^2 = \sum_{i}\left(\frac{\partial r}{\partial\rho_i}\right)^2\sigma_{\rho_i}^2$, $\quad\sigma_{\alpha r} = \sigma_{r\alpha} = \sum_{i}\frac{\partial\alpha}{\partial\rho_i}\,\frac{\partial r}{\partial\rho_i}\,\sigma_{\rho_i}^2$
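A minimal sketch of the $(r, \alpha)$ line fit, assuming an unweighted least-squares fit over the Cartesian points of one segment; the covariance propagation is omitted for brevity.

```python
# Minimal sketch: fit a line in (r, alpha) normal form to one segment of points.
import numpy as np

def fit_line(xs, ys):
    xm, ym = xs.mean(), ys.mean()
    dx, dy = xs - xm, ys - ym
    # Orientation of the line normal
    alpha = 0.5 * np.arctan2(-2.0 * np.sum(dx * dy),
                             np.sum(dy**2 - dx**2))
    # Perpendicular distance of the line from the origin
    r = xm * np.cos(alpha) + ym * np.sin(alpha)
    if r < 0:                      # keep r non-negative by flipping the normal
        r, alpha = -r, alpha + np.pi
    return r, alpha

# Example: points on the vertical line x = 2 give approximately (r, alpha) = (2.0, 0.0).
ys = np.linspace(-1.0, 1.0, 20)
xs = np.full_like(ys, 2.0)
print(fit_line(xs, ys))
```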

  29. Correspondence / Data Association
     • $\nu_j$ is the innovation, $S_j$ is the covariance of the innovation, $g$ is the threshold
     • Mahalanobis distance criterion: $\nu_j^T S_j^{-1} \nu_j < g^2$
     • $S_j = S_e + S_m$, where $S_e$ is the covariance of the expected feature and $S_m = \begin{pmatrix}\sigma_r^2 & \sigma_{r\alpha}\\ \sigma_{\alpha r} & \sigma_\alpha^2\end{pmatrix}$ is the covariance of the measured feature
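A minimal sketch of the Mahalanobis gate, assuming 2-D $(r, \alpha)$ features and a hypothetical chi-square gate value of 5.99 (95% for two degrees of freedom); the covariance values are illustrative.

```python
# Minimal sketch: Mahalanobis-distance test for associating a measured feature
# with a feature already in the map.
import numpy as np

def gate(z_meas, z_expected, S_expected, S_measured, gate_threshold=5.99):
    nu = z_meas - z_expected                  # innovation
    S = S_expected + S_measured               # innovation covariance
    d2 = nu @ np.linalg.inv(S) @ nu           # squared Mahalanobis distance
    return d2 < gate_threshold, d2

z_meas = np.array([2.05, 0.31])               # measured (r, alpha)
z_exp = np.array([2.00, 0.30])                # expected feature from the map
S_exp = np.diag([0.02, 0.01])                 # covariance of the expected feature
S_m = np.diag([0.01, 0.005])                  # covariance of the measured feature
print(gate(z_meas, z_exp, S_exp, S_m))
```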
