  1. 3D Photography: Bundle Adjustment and SLAM, 31 March 2014

  2. Structure-From-Motion
     • Two-view initialization:
       – 5-point algorithm (minimal solver)
       – 8-point linear algorithm
       – 7-point algorithm
     • $E \rightarrow (R, t)$

  3. Structure-From-Motion
     • Triangulation: with $E \rightarrow (R, t)$ known, triangulate the 3D points.

  4. Structure-From-Motion
     • Subsequent views: perspective pose estimation $(R, t)$ for each new camera.

  5. Bundle Adjustment
     • Final step in Structure-from-Motion.
     • Refine a visual reconstruction to produce jointly optimal 3D structures P and camera poses C.
     • Minimize the total reprojection error.
     • Cost function:
       $\hat{X} = \arg\min_X \sum_i \sum_j \| z_{ij} - x_{ij}(P_j, C_i) \|^2_{W_{ij}}$
       where $z_{ij}$ is the measured image point, $x_{ij}(P_j, C_i)$ the predicted projection, $W_{ij} = \Sigma_{z_{ij}}^{-1}$ the inverse measurement error covariance, and $X = [P, C]$.

  6. Bundle Adjustment
     • Final step in Structure-from-Motion.
     • Refine a visual reconstruction to produce jointly optimal 3D structures P and camera poses C.
     • Minimize the total reprojection error.
     • Cost function (equivalent form):
       $\hat{X} = \arg\min_X \sum_i \sum_j \Delta z_{ij}^T W_{ij} \Delta z_{ij}$, with $\Delta z_{ij} = z_{ij} - f(X)$ the reprojection error,
       $W_{ij} = \Sigma_{z_{ij}}^{-1}$ the inverse measurement error covariance, and $X = [P, C]$.
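
A rough numerical illustration of this cost in Python (a minimal sketch, not from the slides: the function and variable names are hypothetical, and a calibrated pinhole camera with poses stored as rotation/translation is assumed):

```python
import numpy as np

def reprojection_cost(points, poses, observations, weights):
    """Sum of weighted squared reprojection errors (illustrative sketch).

    points:       dict i -> 3D point P_i
    poses:        dict j -> (R_j, t_j) camera pose C_j
    observations: dict (i, j) -> measured 2D image point z_ij
    weights:      dict (i, j) -> 2x2 information matrix W_ij (inverse covariance)
    """
    cost = 0.0
    for (i, j), z_ij in observations.items():
        R, t = poses[j]
        p_cam = R @ points[i] + t          # point expressed in the camera frame
        x_ij = p_cam[:2] / p_cam[2]        # predicted (normalized) image point
        dz = z_ij - x_ij                   # reprojection error dz_ij
        cost += dz @ weights[(i, j)] @ dz  # dz_ij^T W_ij dz_ij
    return cost
```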

  7. Bundle Adjustment
     • Minimize the cost function: $\hat{X} = \arg\min_X f(X)$
       1. Gradient descent
       2. Newton method
       3. Gauss-Newton
       4. Levenberg-Marquardt

  8. Bundle Adjustment
     1. Gradient Descent
     • Initialization: $X_0$
     • Compute the gradient: $g = \left.\frac{\partial f(X)}{\partial X}\right|_{X = X_k} = J^T W \Delta Z$
     • Update: $X_{k+1} = X_k - \lambda g$
       $\lambda$: step size, $J = \frac{\partial \Delta Z}{\partial X}$: Jacobian
     • Iterate until convergence.
     • Slow convergence near the minimum point!
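
A minimal sketch of one such update (the `residual_fn`/`jacobian_fn` interface is an assumption made for illustration):

```python
import numpy as np

def gradient_descent_step(x, residual_fn, jacobian_fn, W, step_size=1e-3):
    """One gradient-descent update for f(X) = dz^T W dz (illustrative sketch)."""
    dz = residual_fn(x)         # stacked residual vector Delta Z
    J = jacobian_fn(x)          # Jacobian d(Delta Z)/dX
    g = J.T @ W @ dz            # gradient of the quadratic cost (up to a factor of 2)
    return x - step_size * g    # X_{k+1} = X_k - lambda * g
```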

  9. Bundle Adjustment
     2. Newton Method
     • Second-order approximation (quadratic Taylor expansion):
       $f(X) \approx f(X_k) + g^T \Delta X + \frac{1}{2} \Delta X^T H \Delta X$, with $\Delta X = X - X_k$
     • Hessian matrix: $H = \left.\frac{\partial^2 f(X)}{\partial X^2}\right|_{X = X_k}$
     • Find the $\Delta X$ that minimizes $f(X)$!

  10. Bundle Adjustment
     2. Newton Method
     • Differentiating and setting to 0 gives: $\Delta X = -H^{-1} g$
     • Update: $X_{k+1} = X_k - H^{-1} g$
     • Computing H is not trivial, and the method might get stuck at a saddle point!

  11. Bundle Adjustment
     3. Gauss-Newton
     • Approximate the Hessian by dropping the second-order term:
       $H = J^T W J + \sum_{ij} \Delta z_{ij}^T W_{ij} \frac{\partial^2 \Delta z_{ij}}{\partial X^2} \approx J^T W J$
     • Normal equation: $(J^T W J)\, \Delta X = -J^T W \Delta Z$
     • Update: $X_{k+1} = X_k + \Delta X$
     • Might get stuck at saddle points, and convergence there can be slow!
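
A corresponding Gauss-Newton step that solves the normal equation once (same illustrative interface as in the gradient-descent sketch above):

```python
import numpy as np

def gauss_newton_step(x, residual_fn, jacobian_fn, W):
    """One Gauss-Newton update: solve (J^T W J) dX = -J^T W dZ (illustrative sketch)."""
    dz = residual_fn(x)
    J = jacobian_fn(x)
    H = J.T @ W @ J                 # Gauss-Newton approximation of the Hessian
    g = J.T @ W @ dz                # gradient
    dx = np.linalg.solve(H, -g)     # normal equation
    return x + dx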

  12. Bundle Adjustment
     4. Levenberg-Marquardt
     • Regularized Gauss-Newton with damping factor $\lambda$:
       $H_{LM} = J^T W J + \lambda I$, giving $(J^T W J + \lambda I)\, \Delta X = -J^T W \Delta Z$
     • $\lambda \to 0$: Gauss-Newton (when convergence is rapid)
     • $\lambda \to \infty$: gradient descent (when convergence is slow)
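
A sketch of one damped step under the same assumed interface; a full Levenberg-Marquardt solver would additionally check whether the cost decreased and adapt $\lambda$ accordingly:

```python
import numpy as np

def levenberg_marquardt_step(x, residual_fn, jacobian_fn, W, lam):
    """One damped update: solve (J^T W J + lam*I) dX = -J^T W dZ (illustrative sketch).

    lam is shrunk when steps are accepted (toward Gauss-Newton) and grown when
    they are rejected (toward gradient descent); that control loop is omitted here.
    """
    dz = residual_fn(x)
    J = jacobian_fn(x)
    H = J.T @ W @ J
    g = J.T @ W @ dz
    dx = np.linalg.solve(H + lam * np.eye(H.shape[0]), -g)
    return x + dx
```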

  13. Structure of the Jacobian and Hessian Matrices
     • Sparse matrices, since 3D structures are only locally observed.

  14. Solving the Normal Equation
     • Schur complement, applied to $H_{LM} \Delta X = -J^T W \Delta Z$.
     • $H_{LM}$ has a block structure: one block for the 3D structures and one for the camera parameters.

  15. Solving the Normal Equation
     • Schur complement: $H_{LM} \Delta X = -J^T W \Delta Z$, with
       $H_{LM} = \begin{bmatrix} H_S & H_{SC} \\ H_{SC}^T & H_C \end{bmatrix}$
       ($H_S$: 3D-structure block, $H_C$: camera-parameter block, $H_{SC}$: coupling block).

  16. Solving the Normal Equation
     • Schur complement: write the normal equation in block form (3D structures $\Delta S$, camera parameters $\Delta C$):
       $\begin{bmatrix} H_S & H_{SC} \\ H_{SC}^T & H_C \end{bmatrix} \begin{bmatrix} \Delta S \\ \Delta C \end{bmatrix} = \begin{bmatrix} \epsilon_S \\ \epsilon_C \end{bmatrix}$
     • Multiply both sides by $\begin{bmatrix} I & 0 \\ -H_{SC}^T H_S^{-1} & I \end{bmatrix}$:
       $\begin{bmatrix} H_S & H_{SC} \\ 0 & H_C - H_{SC}^T H_S^{-1} H_{SC} \end{bmatrix} \begin{bmatrix} \Delta S \\ \Delta C \end{bmatrix} = \begin{bmatrix} \epsilon_S \\ \epsilon_C - H_{SC}^T H_S^{-1} \epsilon_S \end{bmatrix}$

  17. Solving the Normal Equation
     • Schur complement:
       $\begin{bmatrix} H_S & H_{SC} \\ 0 & H_C - H_{SC}^T H_S^{-1} H_{SC} \end{bmatrix} \begin{bmatrix} \Delta S \\ \Delta C \end{bmatrix} = \begin{bmatrix} \epsilon_S \\ \epsilon_C - H_{SC}^T H_S^{-1} \epsilon_S \end{bmatrix}$
     • First solve for $\Delta C$ from:
       $(H_C - H_{SC}^T H_S^{-1} H_{SC})\, \Delta C = \epsilon_C - H_{SC}^T H_S^{-1} \epsilon_S$
       $H_S$ is block diagonal and therefore easy to invert; the Schur complement $H_C - H_{SC}^T H_S^{-1} H_{SC}$ is a sparse, symmetric positive definite matrix.
     • Then solve for $\Delta S$ by back-substitution.
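
A dense toy sketch of this elimination (real implementations keep everything sparse; the 3x3 point-block layout and the variable names below are assumptions for illustration):

```python
import numpy as np

def solve_via_schur(H_S_blocks, H_SC, H_C, eps_S, eps_C):
    """Solve the block normal equations by eliminating the structure block (sketch).

    H_S_blocks:   list of 3x3 blocks of the block-diagonal structure Hessian H_S
    H_SC:         structure/camera coupling block
    H_C:          camera block
    eps_S, eps_C: right-hand-side vectors
    """
    # Invert H_S block by block (cheap, because it is block diagonal).
    n = 3 * len(H_S_blocks)
    H_S_inv = np.zeros((n, n))
    for k, B in enumerate(H_S_blocks):
        H_S_inv[3*k:3*k+3, 3*k:3*k+3] = np.linalg.inv(B)

    # Reduced camera system:
    # (H_C - H_SC^T H_S^{-1} H_SC) dC = eps_C - H_SC^T H_S^{-1} eps_S
    A = H_C - H_SC.T @ H_S_inv @ H_SC
    b = eps_C - H_SC.T @ H_S_inv @ eps_S
    dC = np.linalg.solve(A, b)

    # Back-substitution for the structure update: H_S dS = eps_S - H_SC dC
    dS = H_S_inv @ (eps_S - H_SC @ dC)
    return dS, dC
```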

  18. Solving the Normal Equation
     • $A x = b$, with $A = H_C - H_{SC}^T H_S^{-1} H_{SC}$ and $b = \epsilon_C - H_{SC}^T H_S^{-1} \epsilon_S$.
     • Can be solved without inverting A, since A is a sparse matrix!
     • Sparse matrix factorization:
       1. LU factorization: $A = LU$, then solve for x by forward and backward substitution.
       2. QR factorization: $A = QR$
       3. Cholesky factorization: $A = LL^T$
     • Iterative methods:
       1. Conjugate gradient
       2. Gauss-Seidel
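
A small SciPy sketch of the two families of solvers on a toy symmetric positive definite system; the random matrix is only a stand-in for the reduced camera system:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Toy sparse SPD system A x = b.
n = 200
rng = np.random.default_rng(0)
M = sp.random(n, n, density=0.02, random_state=0, format="csc")
A = (M @ M.T + n * sp.eye(n)).tocsc()
b = rng.standard_normal(n)

# Direct method: sparse LU factorization, then forward/backward substitution.
# (A sparse Cholesky, e.g. from scikit-sparse, would exploit symmetry further.)
x_lu = spla.splu(A).solve(b)

# Iterative method: conjugate gradient, appropriate because A is SPD.
x_cg, info = spla.cg(A, b, atol=1e-10)

print("max difference between the two solutions:", np.abs(x_lu - x_cg).max())
```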

  19. Problem of Fill-In

  20. Problem of Fill-In
     • Reorder the sparse matrix to minimize fill-in:
       $(P^T A P)(P^T x) = P^T b$, where P is a permutation matrix that reorders A.
     • Finding the optimal ordering is NP-complete.
     • Approximate solutions:
       1. Minimum degree
       2. Column approximate minimum degree permutation
       3. Reverse Cuthill-McKee
       4. ...
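
For instance, SciPy exposes reverse Cuthill-McKee; a toy sketch comparing the matrix bandwidth before and after reordering (the random matrix is only a stand-in for the Hessian):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee

# Toy sparse symmetric matrix.
M = sp.random(300, 300, density=0.01, random_state=1)
A = (M + M.T + sp.eye(300)).tocsr()

# Reverse Cuthill-McKee returns a permutation that clusters nonzeros near the
# diagonal, which tends to reduce fill-in during factorization.
perm = reverse_cuthill_mckee(A, symmetric_mode=True)
A_perm = A[perm][:, perm]          # P^T A P

def bandwidth(S):
    rows, cols = S.nonzero()
    return int(np.abs(rows - cols).max())

print("bandwidth before:", bandwidth(A), "after:", bandwidth(A_perm))
```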

  21. Problem of Fill-In

  22. Robust Cost Function
     • Non-linear least squares: $\hat{X} = \arg\min_X \sum_{ij} \Delta z_{ij}^T W_{ij} \Delta z_{ij}$
     • Maximum log-likelihood solution: $\hat{X} = \arg\min_X \left[ -\ln p(Z \mid X) \right]$
     • Assume that:
       1. The measurements Z (given X) follow a Gaussian distribution.
       2. All observations are independent.
     • Then:
       $\arg\min_X \left[ -\ln p(Z \mid X) \right] = \arg\min_X \left[ -\ln \prod_{ij} c\, \exp\!\left(-\Delta z_{ij}^T W_{ij} \Delta z_{ij}\right) \right] = \arg\min_X \sum_{ij} \Delta z_{ij}^T W_{ij} \Delta z_{ij}$

  23. Robust Cost Function
     • The Gaussian distribution assumption does not hold in the presence of outliers!
     • This causes convergence to wrong solutions.

  24. Robust Cost Function
     • $\hat{X} = \arg\min_X \sum_{ij} \Delta z_{ij}^T W_{ij} \Delta z_{ij} \;\rightarrow\; \hat{X} = \arg\min_X \sum_{ij} \Delta z_{ij}^T S_{ij} \Delta z_{ij}$
       Robust cost function: $S_{ij}$ is $W_{ij}$ scaled with the attenuating factor $\omega_{ij}$.
     • Similar to iteratively re-weighted least squares.
     • The weight is iteratively rescaled with the attenuating factor $\omega_{ij}$.
     • The attenuating factor is computed based on the current error.
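
A sketch of one iteratively re-weighted Gauss-Newton step with a Cauchy-style attenuating factor (the kernel constant and the per-measurement interface are assumptions for illustration):

```python
import numpy as np

def cauchy_weight(sq_err, c=2.3849):
    """Attenuating factor omega for the Cauchy kernel (c is an illustrative tuning constant)."""
    return 1.0 / (1.0 + sq_err / c**2)

def irls_step(x, residual_fn, jacobian_fn, W_blocks):
    """One iteratively re-weighted Gauss-Newton step (illustrative sketch).

    residual_fn(x) and jacobian_fn(x) yield one residual vector and Jacobian per
    measurement; each information matrix W_ij is rescaled by a weight computed
    from the current whitened error, so large errors (outliers) contribute less.
    """
    H, g = 0.0, 0.0
    for dz, J, W in zip(residual_fn(x), jacobian_fn(x), W_blocks):
        w = cauchy_weight(dz @ W @ dz)   # attenuating factor omega_ij
        S = w * W                        # rescaled weight S_ij
        H = H + J.T @ S @ J
        g = g + J.T @ S @ dz
    dx = np.linalg.solve(H, -g)
    return x + dx
```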

  25. Robust Cost Function
     • Gaussian distribution: full influence from high errors.
     • Cauchy distribution: reduced influence from high errors.

  26. Robust Cost Function
     • Outliers are properly accounted for with the Cauchy cost!

  27. State-of-the-Art Solvers
     • Google Ceres: https://code.google.com/p/ceres-solver/
     • g2o: https://openslam.org/g2o.html
     • GTSAM: https://collab.cc.gatech.edu/borg/gtsam/

  28. Simultaneous Localization and Mapping (SLAM)
     • The robot must estimate its own pose and acquire a map model of its environment at the same time.
     • Chicken-and-egg problem:
       – The map is needed for localization.
       – The pose is needed for mapping.

  29. Full SLAM: Problem Definition
     • Control actions $u_i$, robot poses $x_i$, observations $z_k$, map landmarks $l_j$.
     • $\arg\max_{X,L} p(X, L \mid Z, U) = \arg\max_{X,L}\, p(x_0) \prod_{i=1}^{M} p(x_i \mid x_{i-1}, u_i) \prod_{k=1}^{K} p(z_k \mid x_{i_k}, l_{j_k})$

  30. Simultaneous Localization and Mapping (SLAM)
     • $\arg\max_{X,L} p(X, L \mid Z, U) = \arg\max_{X,L}\, p(x_0) \prod_{i=1}^{M} p(x_i \mid x_{i-1}, u_i) \prod_{k=1}^{K} p(z_k \mid x_{i_k}, l_{j_k})$
     • Negative log-likelihood:
       $\arg\min_{X,L} \left[ -\sum_{i=1}^{M} \ln p(x_i \mid x_{i-1}, u_i) - \sum_{k=1}^{K} \ln p(z_k \mid x_{i_k}, l_{j_k}) \right]$
     • Likelihoods:
       – Process model: $p(x_i \mid x_{i-1}, u_i) \propto \exp\!\left(-\| f(x_{i-1}, u_i) - x_i \|^2_{\Lambda_i}\right)$
       – Measurement model: $p(z_k \mid x_{i_k}, l_{j_k}) \propto \exp\!\left(-\| h(x_{i_k}, l_{j_k}) - z_k \|^2_{\Sigma_k}\right)$

  31. Simultaneous Localization and Mapping (SLAM)
     • $\arg\max_{X,L} p(X, L \mid Z, U) = \arg\min_{X,L} \left[ -\sum_{i=1}^{M} \ln p(x_i \mid x_{i-1}, u_i) - \sum_{k=1}^{K} \ln p(z_k \mid x_{i_k}, l_{j_k}) \right]$
     • Substituting the likelihoods into the equation:
       $\arg\max_{X,L} p(X, L \mid Z, U) = \arg\min_{X,L} \left[ \sum_{i=1}^{M} \| f(x_{i-1}, u_i) - x_i \|^2_{\Lambda_i} + \sum_{k=1}^{K} \| h(x_{i_k}, l_{j_k}) - z_k \|^2_{\Sigma_k} \right]$
     • The minimization can be done with Levenberg-Marquardt (similar to bundle adjustment)!
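
A tiny pose-and-landmark example of this least-squares formulation, solved with SciPy's Levenberg-Marquardt backend (a sketch only: all data and models below are made up for illustration, and identity covariances are assumed):

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical 2D problem: poses and one landmark are 2D positions.
# Odometry predicts x_i = x_{i-1} + u_i; a measurement predicts z_k = l_j - x_i.
odometry = [np.array([1.0, 0.0]), np.array([1.0, 0.5])]   # u_1, u_2
measurements = [(0, 0, np.array([0.5, 1.0])),             # (pose idx, landmark idx, z)
                (2, 0, np.array([-1.4, 0.6]))]
n_poses, n_landmarks = 3, 1

def unpack(theta):
    X = theta[:2 * n_poses].reshape(n_poses, 2)
    L = theta[2 * n_poses:].reshape(n_landmarks, 2)
    return X, L

def residuals(theta):
    X, L = unpack(theta)
    r = [X[0]]                                 # prior anchoring x_0 at the origin
    for i, u in enumerate(odometry, start=1):
        r.append(X[i - 1] + u - X[i])          # process residual f(x_{i-1}, u_i) - x_i
    for i, j, z in measurements:
        r.append(L[j] - X[i] - z)              # measurement residual h(x_i, l_j) - z_k
    return np.concatenate(r)

theta0 = np.zeros(2 * (n_poses + n_landmarks))
result = least_squares(residuals, theta0, method="lm")    # Levenberg-Marquardt
X_hat, L_hat = unpack(result.x)
print(X_hat, L_hat)
```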

  32. Simultaneous Localization and Mapping (SLAM)
     • Normal equations: $(J^T W J + \lambda I)\, \Delta X = -J^T W \Delta Z$
       – The weight W is made up of the covariances $\Lambda_i$, $\Sigma_k$.
       – The Jacobian J is made up of $\frac{\partial f}{\partial u}, \frac{\partial f}{\partial x}, \frac{\partial h}{\partial l}, \frac{\partial h}{\partial x}$.
     • Can be solved with sparse matrix factorization or iterative methods.

  33. Online SLAM: Problem Definition
     • Estimate the current pose $x_t$ and the full map L:
       $p(x_t, L \mid Z, U) = \int \int \cdots \int p(X, L \mid Z, U)\, dx_1\, dx_2 \cdots dx_{t-1}$
       Previous poses are marginalized out.
     • Inference with:
       1. Kalman filtering (EKF SLAM)
       2. Particle filtering (FastSLAM)
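
One common way to carry out this marginalization for Gaussian estimates is in information (inverse covariance) form, reusing the Schur-complement machinery from bundle adjustment; a minimal sketch (the index bookkeeping and function name are hypothetical):

```python
import numpy as np

def marginalize(Lambda, eta, keep, marg):
    """Marginalize variables `marg` out of a Gaussian in information form (sketch).

    Lambda: information matrix, eta: information vector;
    keep/marg: integer index arrays for the variables to retain / integrate out.
    Returns the information form of p(x_keep).
    """
    L_kk = Lambda[np.ix_(keep, keep)]
    L_km = Lambda[np.ix_(keep, marg)]
    L_mm_inv = np.linalg.inv(Lambda[np.ix_(marg, marg)])
    Lambda_new = L_kk - L_km @ L_mm_inv @ L_km.T     # Schur complement
    eta_new = eta[keep] - L_km @ L_mm_inv @ eta[marg]
    return Lambda_new, eta_new
```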
