
CAD&CG Bundle Adjustment



  1. Bundle Adjustment. Guofeng Zhang, State Key Laboratory of CAD&CG, Zhejiang University

  2. Bundle Adjustment. Jointly optimize all cameras and points:
$$\arg\min_{C_1,\dots,C_{N_c},\,X_1,\dots,X_{N_p}} \sum_{i,j} \big\| \pi(X_i, C_j) - x_{ij} \big\|^2$$
Triggs, B., McLauchlan, P., Hartley, R., and Fitzgibbon, A. 1999. Bundle adjustment — a modern synthesis. In Proceedings of the International Workshop on Vision Algorithms: Theory and Practice, 298–372.
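The projection function π is left abstract on the slide. As a concrete illustration, here is a minimal sketch of one reprojection residual ε_ij = π(X_i, C_j) − x_ij, assuming a calibrated pinhole camera with known intrinsics K; this parameterization is an assumption for illustration, not necessarily the one used in the talk:

```python
import numpy as np

def reprojection_residual(R, t, K, X, x_obs):
    """One reprojection residual pi(X, C) - x for a camera C = (R, t) and point X.
    R (3x3), t (3,): world-to-camera pose; K (3x3): intrinsics; X (3,): world point;
    x_obs (2,): observed pixel. Assumed pinhole model, for illustration only."""
    Xc = R @ X + t              # point in the camera frame
    u = K @ Xc                  # homogeneous pixel coordinates
    return u[:2] / u[2] - x_obs # 2D image-plane residual
```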

  3. Nonlinear Least Squares
    - Gauss-Newton:
$$x^* = \arg\min_x \|\varepsilon(x)\|^2, \qquad \varepsilon(\hat{x}+\delta x) \approx \varepsilon(\hat{x}) + J\,\delta x \ \text{(first-order approximation)}, \qquad J = \frac{\partial \varepsilon}{\partial x}\Big|_{\hat{x}} \ \text{(Jacobian matrix)}$$
$$J^T J\,\delta x = -J^T \varepsilon(\hat{x}), \qquad J^T J \ \text{is a first-order approximation to the Hessian.}$$
    - Levenberg-Marquardt:
$$(J^T J + \lambda I)\,\delta x = -J^T \varepsilon(\hat{x})$$
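A minimal dense sketch of the Levenberg-Marquardt iteration above. The function names, the damping schedule (×0.1 on success, ×10 on failure), and the stopping test are illustrative assumptions; a real bundle-adjustment solver would exploit the sparse block structure shown on the following slides:

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, lam=1e-3, iters=50, tol=1e-10):
    """Minimal dense LM loop for min_x ||eps(x)||^2 (illustrative only)."""
    x = x0.copy()
    for _ in range(iters):
        eps = residual(x)
        J = jacobian(x)
        H = J.T @ J                        # Gauss-Newton (first-order) Hessian approximation
        g = J.T @ eps
        dx = np.linalg.solve(H + lam * np.eye(len(x)), -g)   # (J^T J + lambda I) dx = -J^T eps
        if np.linalg.norm(dx) < tol:
            break
        if np.sum(residual(x + dx) ** 2) < np.sum(eps ** 2):
            x, lam = x + dx, lam * 0.1     # accept the step, trust the GN model more
        else:
            lam *= 10.0                    # reject the step, move toward gradient descent
    return x
```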

  4. Sparse Bundle Adjustment. Sparsity pattern of the Hessian for
$$\arg\min_{C_1,\dots,C_{N_c},\,X_1,\dots,X_{N_p}} \sum_{i,j} \big\| \pi(X_i, C_j) - x_{ij} \big\|^2$$
(one block per camera and one block per point). Manolis I. A. Lourakis, Antonis A. Argyros: SBA: A software package for generic sparse bundle adjustment. ACM Trans. Math. Softw. 36(1) (2009).

  5. Sparse Bundle Adjustment. A simple example: 4 points, 3 cameras, and all points are visible in all cameras.

  6. Sparse Bundle Adjustment. For this example the Jacobian has the block structure
$$J = \begin{bmatrix}
A_{11} & 0 & 0 & B_{11} & 0 & 0 & 0 \\
0 & A_{12} & 0 & B_{12} & 0 & 0 & 0 \\
0 & 0 & A_{13} & B_{13} & 0 & 0 & 0 \\
A_{21} & 0 & 0 & 0 & B_{21} & 0 & 0 \\
0 & A_{22} & 0 & 0 & B_{22} & 0 & 0 \\
0 & 0 & A_{23} & 0 & B_{23} & 0 & 0 \\
A_{31} & 0 & 0 & 0 & 0 & B_{31} & 0 \\
0 & A_{32} & 0 & 0 & 0 & B_{32} & 0 \\
0 & 0 & A_{33} & 0 & 0 & B_{33} & 0 \\
A_{41} & 0 & 0 & 0 & 0 & 0 & B_{41} \\
0 & A_{42} & 0 & 0 & 0 & 0 & B_{42} \\
0 & 0 & A_{43} & 0 & 0 & 0 & B_{43}
\end{bmatrix},$$
where $A_{ij}$ and $B_{ij}$ are the Jacobians of the reprojection error $\varepsilon_{ij}$ with respect to camera $C_j$ and point $X_i$, respectively (the first three block columns correspond to the cameras, the last four to the points).

  7. Sparse Bundle Adjustment
$$J^T J = \begin{bmatrix}
U_1 & 0 & 0 & W_{11} & W_{21} & W_{31} & W_{41} \\
0 & U_2 & 0 & W_{12} & W_{22} & W_{32} & W_{42} \\
0 & 0 & U_3 & W_{13} & W_{23} & W_{33} & W_{43} \\
W_{11}^T & W_{12}^T & W_{13}^T & V_1 & 0 & 0 & 0 \\
W_{21}^T & W_{22}^T & W_{23}^T & 0 & V_2 & 0 & 0 \\
W_{31}^T & W_{32}^T & W_{33}^T & 0 & 0 & V_3 & 0 \\
W_{41}^T & W_{42}^T & W_{43}^T & 0 & 0 & 0 & V_4
\end{bmatrix}
= \begin{bmatrix} U & W \\ W^T & V \end{bmatrix},$$
with
$$U_j = \sum_{i=1}^{4} A_{ij}^T A_{ij}, \qquad V_i = \sum_{j=1}^{3} B_{ij}^T B_{ij}, \qquad W_{ij} = A_{ij}^T B_{ij}.$$
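A sketch of how the $U_j$, $V_i$, $W_{ij}$ blocks can be accumulated from the per-observation Jacobians $A_{ij}$, $B_{ij}$. The block sizes (6 per camera, 3 per point) are assumptions for illustration, and the visibility mask anticipates slide 11:

```python
import numpy as np

def assemble_normal_blocks(A, B, visible):
    """Assemble the block structure of J^T J from per-observation Jacobians.
    A[i][j]: 2x6 d(eps_ij)/dC_j, B[i][j]: 2x3 d(eps_ij)/dX_i,
    visible[i, j]: True if point i is observed in camera j."""
    n_pts, n_cams = visible.shape
    U = [np.zeros((6, 6)) for _ in range(n_cams)]
    V = [np.zeros((3, 3)) for _ in range(n_pts)]
    W = {}
    for i in range(n_pts):
        for j in range(n_cams):
            if not visible[i, j]:
                continue                      # A_ij = B_ij = 0 for unmatched pairs (slide 11)
            U[j] += A[i][j].T @ A[i][j]       # U_j = sum_i A_ij^T A_ij
            V[i] += B[i][j].T @ B[i][j]       # V_i = sum_j B_ij^T B_ij
            W[(i, j)] = A[i][j].T @ B[i][j]   # W_ij = A_ij^T B_ij
    return U, V, W
```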

  8. Sparse Bundle Adjustment. The normal equations are
$$J^T J\,\delta x = -J^T \varepsilon, \qquad \delta x = \begin{bmatrix} \delta C \\ \delta X \end{bmatrix} = \begin{bmatrix} \delta C_1^T & \delta C_2^T & \delta C_3^T & \delta X_1^T & \delta X_2^T & \delta X_3^T & \delta X_4^T \end{bmatrix}^T.$$

  9. Sparse Bundle Adjustment. The right-hand side splits the same way:
$$-J^T \varepsilon = \begin{bmatrix} \varepsilon_C \\ \varepsilon_X \end{bmatrix}, \qquad \varepsilon_{C_j} = -\sum_{i=1}^{4} A_{ij}^T \varepsilon_{ij}, \qquad \varepsilon_{X_i} = -\sum_{j=1}^{3} B_{ij}^T \varepsilon_{ij}.$$

  10. Sparse Bundle Adjustment. Solve via the Schur complement:
$$\begin{bmatrix} U & W \\ W^T & V \end{bmatrix}\begin{bmatrix} \delta C \\ \delta X \end{bmatrix} = \begin{bmatrix} \varepsilon_C \\ \varepsilon_X \end{bmatrix}
\;\Longrightarrow\;
\begin{bmatrix} U - W V^{-1} W^T & 0 \\ W^T & V \end{bmatrix}\begin{bmatrix} \delta C \\ \delta X \end{bmatrix} = \begin{bmatrix} \varepsilon_C - W V^{-1}\varepsilon_X \\ \varepsilon_X \end{bmatrix}.$$
$S = U - W V^{-1} W^T$ is the Schur complement. Compute the cameras first by solving $S\,\delta C = \varepsilon_C - W V^{-1}\varepsilon_X$ (# cameras << # points), then back-substitute for the points: $V\,\delta X = \varepsilon_X - W^T\,\delta C$.
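A dense sketch of this Schur-complement solve. Dense matrices and a full inverse of $V$ are used only to keep the example short; in practice $V$ is inverted block by block (3x3 blocks) and $S$ is built exploiting sparsity:

```python
import numpy as np

def schur_solve(U, V, W, eps_C, eps_X):
    """Solve [[U, W], [W^T, V]] [dC; dX] = [eps_C; eps_X] via the Schur complement.
    U: cameras block, V: block-diagonal points block, W: camera-point coupling."""
    V_inv = np.linalg.inv(V)                              # cheap in practice: block-diagonal
    S = U - W @ V_inv @ W.T                               # reduced camera system
    dC = np.linalg.solve(S, eps_C - W @ V_inv @ eps_X)    # cameras first (few of them)
    dX = V_inv @ (eps_X - W.T @ dC)                       # back-substitution for the points
    return dC, dX
```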

  11. Sparse Bundle Adjustment. In general, NOT all points are visible in all cameras. The same definitions still apply,
$$U_j = \sum_{i} A_{ij}^T A_{ij}, \qquad V_i = \sum_{j} B_{ij}^T B_{ij}, \qquad W_{ij} = A_{ij}^T B_{ij},$$
with $A_{ij} = B_{ij} = 0$ if the $i$-th point is invisible (or not matched) in the $j$-th camera. The sparser the structure, the greater the speed-up.

  12. Related Works
    - Hierarchical BA: Steedly et al. 2003, Snavely et al. 2008, Frahm et al. 2010
    - Segment-based BA: Zhu et al. 2014, Zhang et al. 2016 (ENFT)
    - Incremental BA: Kaess et al. 2008 (iSAM), Kaess et al. 2011 (iSAM2), Indelman et al. 2012 (iLBA), Ila et al. 2017 (SLAM++), Liu et al. 2017 (EIBA)
    - Parallel BA: Ni et al. 2007, Wu et al. 2011 (PBA)

  13. Segment-based Bundle Adjustment. Zhang G., Liu H., Dong Z., et al. Efficient non-consecutive feature tracking for robust structure-from-motion. IEEE Transactions on Image Processing, 2016, 25(12): 5957-5970.

  14. The Difficulties of Large-Scale SfM
    - Global bundle adjustment: huge number of variables; memory limit; time-consuming.
    - Iterative local bundle adjustment: large errors are difficult to propagate to the whole sequence; easily stuck in a local optimum.
    - Pose graph optimization: may not sufficiently minimize the error.

  15. Segment-based Progressive SfM
    - Split a long sequence into multiple short sequences.
    - Perform SfM for each sequence and align the results together.
    - Detect the "split point" and further split the sequence if the reprojection error is large.
    - Repeat the above procedure until the error is less than a threshold (see the schematic sketch below).
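A schematic sketch of the split-and-refine loop described above. The helpers run_sfm, align, reprojection_error, and split_at_worst_point are hypothetical placeholders for illustration, not functions from the actual ENFT-SFM implementation:

```python
def progressive_sfm(sequence, error_threshold):
    """Outline of segment-based progressive SfM with hypothetical helper functions."""
    segments = [sequence]
    while True:
        models = [run_sfm(seg) for seg in segments]   # SfM on each short sequence
        recon = align(models)                         # align the segments into one model
        err, worst = reprojection_error(recon)        # largest per-segment reprojection error
        if err < error_threshold:
            return recon
        # split the worst segment at the detected split point and iterate
        segments[worst:worst + 1] = split_at_worst_point(segments[worst])
```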

  16. Segment-based Progressive SfM: Split Point Detection
    - Best minimize the reprojection error w.r.t. a, i.e. the steepest descent direction.
    - The inconsistency between two consecutive frames.

  17. Split Point Detection

  18. SfM on the Garden Dataset. Six long video sequences with nearly 100,000 frames in total: feature matching took 74 minutes and SfM solving took 16 minutes (single-threaded), averaging 17.7 fps. VisualSFM: SfM solving took 57 minutes (GPU-accelerated).

  19. Comparison on the Garden Dataset: VisualSFM vs. ORB-SLAM vs. ENFT-SFM.

  20. Comparison with ORB-SLAM on the Garden 01 Sequence
    - ENFT-SLAM: non-consecutive track matching; segment-based BA.
    - ORB-SLAM: bag-of-words place recognition; pose graph optimization + traditional BA.

  21. Incremental BA in iSAM2 Based on Bayes Tree Kaess, M., Johannsson, H., Roberts, R., Ila, V., Leonard, J. J., & Dellaert, F. (2012). iSAM2: Incremental smoothing and mapping using the Bayes tree. The International Journal of Robotics Research, 31(2), 216-235.

  22. Incremental Bundle Adjustment. In order to benefit from the increased accuracy offered by relinearization in batch optimization:
    - Fixed-lag / sliding-window approaches
    - Keyframe-based approaches
    - Incremental approaches (iSAM, iSAM2, our EIBA)

  23. Gaussian Factor Graph. Factor types: kinematics measurement, loop constraint, a-priori constraint, projection measurement; variable nodes are states and landmarks. Kaess, M., Johannsson, H., Roberts, R., Ila, V., Leonard, J. J., & Dellaert, F. (2012). iSAM2: Incremental smoothing and mapping using the Bayes tree. The International Journal of Robotics Research, 31(2), 216-235.

  24. Main Ideas of iSAM2
    - Reduce fill-in: use the heuristic CCOLAMD algorithm to provide a suboptimal ordering for factorization (finding the optimal ordering is NP-hard).
    - Encode with the Bayes tree: introduce the Bayes tree (a.k.a. directed clique tree) to encode the square-root information matrix.
    - Fluid relinearization: relinearize fluidly when adding new factors or updating the linearization points, to avoid batch optimization.
    - Partial state updates: when solving the Bayes tree, update a state variable only when necessary.

  25. One Step: Linearization
    - Factor graph over m1, m2, y1, y2, y3.
    - Eliminate the factor graph using the CCOLAMD ordering (e.g. m1, m2, y1, y2, y3), giving a chordal Bayes net.
    - Create the Bayes tree in reverse elimination order (e.g. y3, y2, y1, m2, m1): root clique (y2, y3) with child cliques (m1, y1 | y2) and (m2 | y3).
    - Add new factors/states (e.g. a new factor g(y1, y3)) and apply fluid relinearization.

  26. One Step: Partial Update. Starting from the root clique (y2, y3), update all variables that change by more than a threshold; child cliques: (m1, y1 | y2) and (m2 | y3).

  27. Reduce Fill-in: reordering with CCOLAMD / CHOLMOD. Kaess, M., Ranganathan, A., & Dellaert, F. (2008). iSAM: Incremental smoothing and mapping. IEEE Transactions on Robotics, 24(6), 1365-1378.
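A small illustration of why a fill-reducing ordering matters. COLAMD via SciPy's SuperLU interface is used here as a stand-in for the constrained CCOLAMD ordering used by iSAM2, and the arrow-shaped toy matrix is an assumption for illustration, not from the slides:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu

def factor_nnz(A, ordering):
    """Count nonzeros in the LU factors of A under a given column ordering."""
    lu = splu(sp.csc_matrix(A), permc_spec=ordering)
    return lu.L.nnz + lu.U.nnz

# Toy "arrow" information matrix: variable 0 is coupled to every other variable.
n = 200
A = np.eye(n) * 4.0
A[0, :] = 1.0
A[:, 0] = 1.0
A[0, 0] = n
A = sp.csc_matrix(A)

print("natural ordering:", factor_nnz(A, "NATURAL"))  # dense variable eliminated first -> heavy fill-in
print("COLAMD ordering: ", factor_nnz(A, "COLAMD"))   # reordering pushes it last -> little fill-in
```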

  28. In Gaussian factor graphs, elimination is equivalent to sparse QR factorization of the measurement Jacobian. [Figure: sparsity pattern of the measurement Jacobian K, one column per variable (m1, m2, y1, y2, y3), one block row per factor.]

  29. In Gaussian factor graphs, elimination is equivalent to sparse QR factorization of the measurement Jacobian. [Figure: sparsity pattern of the information matrix I = K^T K over the variables m1, m2, y1, y2, y3.]
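A toy illustration of the equivalence stated on slides 28-29, assuming a small factor graph like the one on slide 25 (a prior on y1, chain factors y1-y2 and y2-y3, and projection factors on (m1, y1) and (m2, y3)); the nonzero values are arbitrary, only the sparsity pattern matters:

```python
import numpy as np

# Toy measurement Jacobian K, columns ordered as (m1, m2, y1, y2, y3).
K = np.array([
    [1, 0, 1, 0, 0],   # factor on (m1, y1)
    [0, 1, 0, 0, 1],   # factor on (m2, y3)
    [0, 0, 1, 1, 0],   # factor on (y1, y2)
    [0, 0, 0, 1, 1],   # factor on (y2, y3)
    [0, 0, 1, 0, 0],   # prior on y1
], dtype=float)

_, R = np.linalg.qr(K)              # R: square-root information matrix
I_mat = K.T @ K                     # information matrix (slide 29)
print(np.allclose(R.T @ R, I_mat))  # True: QR / elimination encodes the same information
```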
