  1. Ph.D. Thesis Exact and Partial Energy Minimization in Computer Vision Alexander Shekhovtsov Supervisor: Václav Hlaváč Supervisor-specialist: Tomáš Werner Czech Technical University in Prague Faculty of Electrical Engineering, Department of Cybernetics Ph.D. programme: Electrical Engineering and Information Technology Branch of study: Mathematical Engineering, 3901V021

  2. Overview 2/21  Discrete Optimization in Computer Vision (Energy Minimization)  Cases Reducible to Minimum Cut. Contribution: Distributed mincut/maxflow algorithm  General NP-hard case. Contribution: Methods to Find a Part of an Optimal Solution

  3. MINCUT Minimum Cut Problem 3/21

  4. MINCUT Introduction 4/21  Distributed Model – Divide Computation AND Memory  Distributed Sequential: solve a large problem on a single computer, split the data into parts  Distributed Parallel: solve on more computers  [diagram: quick CPU and memory vs. slow disk]

  5. MINCUT Introduction 5/21  Sequential Algorithms  Parallel Algorithms  Our goal is to improve on this

  6. MINCUT Main Result 6/21  Main Result: [figure: s-t network partitioned into regions]

  7. MINCUT Main Idea 7/21  New distance function: d*_B(u) = 2, d*_B(v) = 0  length of a path = number of boundary edges it uses  distance d*_B = length of a shortest path to the sink t (boundary edges correspond to the costly operations)  Algorithm: push-relabel between regions, augmenting paths inside regions
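For intuition, here is a minimal sketch (not the thesis implementation) of how such a boundary-aware distance could be computed: a 0-1 BFS from the sink over reversed residual arcs, where arcs inside a region cost 0 and boundary arcs cost 1. The names `graph`, `region`, and `sink` are illustrative assumptions.

```python
# Minimal sketch (assumed interfaces, not thesis code): d*_B(u) = minimum
# number of inter-region ("boundary") arcs on any residual path from u to the
# sink.  Arcs inside a region weigh 0, boundary arcs weigh 1, so a 0-1 BFS
# from the sink over reversed residual arcs computes the labeling.
from collections import deque

def boundary_distance(graph, region, sink):
    """graph: dict u -> iterable of v with positive residual capacity u -> v.
    region: dict node -> region id.  Returns dict node -> d*_B(node);
    nodes with no residual path to the sink are simply absent."""
    reverse = {}
    for u, succs in graph.items():
        for v in succs:
            reverse.setdefault(v, []).append(u)   # search backwards from the sink
    dist = {sink: 0}
    dq = deque([sink])
    while dq:
        v = dq.popleft()
        for u in reverse.get(v, []):
            w = 0 if region[u] == region[v] else 1     # only boundary arcs count
            if dist[v] + w < dist.get(u, float("inf")):
                dist[u] = dist[v] + w
                # 0-1 BFS: zero-weight relaxations go to the front of the deque
                (dq.appendleft if w == 0 else dq.append)(u)
    return dist
```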

  8. MINCUT Experimental Confirmation 8/21  Synthetic instances: grid graph with random capacities, partitioned into 4 regions  sweep = synchronously send messages on all boundary arcs  [plots: push-relabel (with heuristics) vs. proposed method]
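For concreteness, the sketch below builds a synthetic instance of the kind described (a grid graph with random arc capacities, split into 4 quadrant regions) and solves it with a plain Edmonds-Karp baseline. This is only a reference solver with assumed names; it is neither the distributed algorithm nor the push-relabel implementation compared in the plots.

```python
# Synthetic grid instance + reference max-flow (illustrative, not thesis code).
import random
from collections import deque

def grid_instance(n=8, seed=0):
    """n x n grid with random capacities on terminal and neighbour arcs,
    partitioned into 4 quadrant regions (the partition is what a distributed
    method would operate on; the baseline solver below ignores it)."""
    rng = random.Random(seed)
    cap = {}                                     # cap[(u, v)] = capacity of arc u -> v
    def add(u, v, c):
        cap[(u, v)] = cap.get((u, v), 0) + c
        cap.setdefault((v, u), 0)                # make sure the reverse arc exists
    s, t = "s", "t"
    for i in range(n):
        for j in range(n):
            u = (i, j)
            add(s, u, rng.randint(0, 5))         # source and sink (terminal) arcs
            add(u, t, rng.randint(0, 5))
            if i + 1 < n: add(u, (i + 1, j), rng.randint(1, 10))
            if j + 1 < n: add(u, (i, j + 1), rng.randint(1, 10))
    region = {(i, j): 2 * (i >= n // 2) + (j >= n // 2)
              for i in range(n) for j in range(n)}
    return cap, region, s, t

def max_flow(cap, s, t):
    """Edmonds-Karp on the residual capacities; returns the min-cut value."""
    adj = {}
    for (u, v) in cap:
        adj.setdefault(u, []).append(v)
    flow = 0
    while True:
        parent, dq = {s: None}, deque([s])
        while dq and t not in parent:            # BFS for a shortest augmenting path
            u = dq.popleft()
            for v in adj.get(u, []):
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    dq.append(v)
        if t not in parent:
            return flow
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v)); v = parent[v]
        aug = min(cap[e] for e in path)          # bottleneck capacity
        for (u, v) in path:
            cap[(u, v)] -= aug
            cap[(v, u)] += aug
        flow += aug

cap, region, s, t = grid_instance()
print("min-cut value:", max_flow(dict(cap), s, t))
```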

  9. MINCUT Instances in Computer Vision 9/21  Dataset published by the Vision Group at the University of Western Ontario

  10. MINCUT Sequential Variant for Limited Memory Model 10/21  Uses BK (Boykov and Kolmogorov) inside regions  Speedup: Ours/BK  Sometimes faster than BK (CPU time excluding I/O)  Robust over partition size

  11. MINCUT Sequential Variant for Limited Memory Model 11/21  Messages (sweeps)  Speedup over push-relabel (distributed version of Delong and Boykov 2008)

  12. MINCUT Parallel Variant 12/21  Competitive with shared memory model methods  Speedup bounded by memory bandwidth

  13. MINCUT Conclusion 13/21  New distributed algorithm  Terminates in at most B² + 1 sweeps (few in practice)  Sequential Algorithm: 1) competitive with sequential solvers 2) uses few sweeps (= loads/unloads of regions) 3) suitable to run in the limited memory model  Parallel Algorithm: 1) competitive with shared memory algorithms 2) uses few sweeps (= rounds of message exchange) 3) suitable for execution on a computer cluster  Implementation can be specialized for regular grids (less memory/faster)  (?) no good worst-case complexity bound in terms of elementary operations

  14. MINCUT Part. Optimality Discrete Energy Minimization 14/35  Problem: minimize E(x) = ∑_s f_s(x_s) + ∑_{st} f_st(x_s, x_t) over discrete labelings x  [figure: two nodes s, t with unary costs f_s(·), f_t(·) and pairwise costs f_st(0,0), f_st(0,1), f_st(1,0), f_st(1,1)]
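The slide depicts the pairwise model: a labeling x assigns a label x_s to every node s, and the energy E(x) = ∑_s f_s(x_s) + ∑_{st} f_st(x_s, x_t) is minimized over all labelings. The toy sketch below only makes this objective concrete by brute force; the containers `f` and `g` and all function names are illustrative assumptions, not thesis code.

```python
# Pairwise energy and exhaustive minimization on a toy instance
# (illustrative data layout, exponential in the number of nodes).
from itertools import product

def energy(x, f, g):
    """x: dict node -> label, f: dict node -> {label: cost},
    g: dict (node, node) -> {(label, label): cost}."""
    return (sum(f[s][x[s]] for s in f) +
            sum(g[(s, t)][(x[s], x[t])] for (s, t) in g))

def brute_force_min(f, g):
    nodes = sorted(f)
    best = None
    for labels in product(*(sorted(f[s]) for s in nodes)):
        x = dict(zip(nodes, labels))
        e = energy(x, f, g)
        if best is None or e < best[0]:
            best = (e, x)
    return best

# toy two-node, two-label instance matching the f_s / f_st notation
f = {"s": {0: 1.0, 1: 0.0}, "t": {0: 0.0, 1: 2.0}}
g = {("s", "t"): {(0, 0): 0.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 0.0}}
print(brute_force_min(f, g))      # -> (1.0, {'s': 0, 't': 0})
```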

  15. MINCUT Part. Optimality Partial Optimality 15/35  Energy model for stereo; minimization is NP-hard  Partial optimality: find the optimal solution in some pixels (in those pixels the solution is globally optimal, in the remaining pixels it stays unknown)

  16. MINCUT Part. Optimality Partial Optimality 16/35

  17. MINCUT Part. Optimality Partial Optimality (Multilabel) 17/35

  18. MINCUT Part. Optimality Overview 18/35

  19. MINCUT Part. Optimality Dead End Elimination 19/35  Desmet et al. (1992), Goldstein (1994), Lasters et al. (1995), Pierce et al. (2000), Georgiev et al. (2006)
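Goldstein's test, one of the DEE conditions cited above, prunes label α at node s if for some alternative label β the switch α → β can only decrease the energy, whatever labels the neighbours take. A hedged sketch using the same illustrative f/g/labels layout as the energy example above (not thesis code):

```python
# Goldstein's dead-end-elimination test for pairwise minimization:
# eliminate alpha at node s if some beta satisfies
#   f[s][alpha] - f[s][beta]
#     + sum_t min_{x_t} ( f_st(alpha, x_t) - f_st(beta, x_t) )  > 0.
# Data layout (f, g, labels) is illustrative, as in the energy sketch above.

def neighbours(s, g):
    """Yield (neighbour t, accessor(label_of_s, label_of_t) -> pairwise cost)."""
    for (a, b), table in g.items():
        if a == s:
            yield b, lambda xs, xt, tbl=table: tbl[(xs, xt)]
        elif b == s:
            yield a, lambda xs, xt, tbl=table: tbl[(xt, xs)]

def dee_eliminates(s, alpha, f, g, labels):
    """True if Goldstein's condition proves alpha at s is never optimal."""
    for beta in labels[s]:
        if beta == alpha:
            continue
        margin = f[s][alpha] - f[s][beta]
        for t, pair in neighbours(s, g):
            margin += min(pair(alpha, xt) - pair(beta, xt) for xt in labels[t])
        if margin > 0:
            return True
    return False

# toy usage with the same containers as the energy sketch
f = {"s": {0: 1.0, 1: 0.0}, "t": {0: 0.0, 1: 2.0}}
g = {("s", "t"): {(0, 0): 0.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 0.0}}
labels = {"s": [0, 1], "t": [0, 1]}
print(dee_eliminates("t", 1, f, g, labels))   # True: label 1 at "t" is never optimal
```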

  20. MINCUT Part. Optimality Improving Mapping 20/35
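An improving mapping is a node-wise map p with E(p(x)) ≤ E(x) for every labeling x; labels outside its image can be discarded without losing every optimal solution, which is what yields the partial assignment shown on the slide. The brute-force verifier below is only a toy illustration of the definition (exponential in the number of nodes), with illustrative names throughout.

```python
# Brute-force check that a node-wise map p is improving: E(p(x)) <= E(x) for
# all labelings x.  Toy illustration only; names (energy, f, g, p, labels)
# are assumptions, not thesis code.
from itertools import product

def energy(x, f, g):
    return (sum(f[s][x[s]] for s in f) +
            sum(g[(s, t)][(x[s], x[t])] for (s, t) in g))

def is_improving(p, f, g, labels):
    nodes = sorted(f)
    for assignment in product(*(labels[s] for s in nodes)):
        x = dict(zip(nodes, assignment))
        px = {s: p[s][x[s]] for s in nodes}      # apply the map node-wise
        if energy(px, f, g) > energy(x, f, g):
            return False
    return True

# same toy two-node instance as above; collapse label 1 -> 0 at node "t"
f = {"s": {0: 1.0, 1: 0.0}, "t": {0: 0.0, 1: 2.0}}
g = {("s", "t"): {(0, 0): 0.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 0.0}}
labels = {"s": [0, 1], "t": [0, 1]}
p = {"s": {0: 0, 1: 1}, "t": {0: 0, 1: 0}}
print(is_improving(p, f, g, labels))             # True: x_t = 0 is partially optimal
```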

  21. MINCUT Part. Optimality Linear Embedding 21/35

  22. MINCUT Part. Optimality Linear Embedding 22/35

  23. MINCUT Part. Optimality Linear Embedding of Maps 23/35

  24. MINCUT Part. Optimality Linear Embedding of Maps 24/35

  25. MINCUT Part. Optimality Linear Embedding of Maps 25/35

  26. MINCUT Part. Optimality Linear Embedding of Maps 26/35

  27. MINCUT Part. Optimality More General Projections/Maps 27/35  Example of a fractional map [figure: labels mixed with weight 0.5]

  28. MINCUT Part. Optimality Λ-Improving Characterization 28/35

  29. MINCUT Part. Optimality Special Cases 29/35 Methods that can be explained by the proposed condition:  DEE conditions by Desmet (1992) and Goldstein (1994)  (Weak/strong) persistency in Quadratic Pseudo-Boolean Optimization (QPBO) by Nemhauser & Trotter (1975), Hammer et al. (1984), Boros et al. (2002)  Multilabel QPBO by Kohli et al. (2008), Shekhovtsov et al. (2008)  Submodular auxiliary problems by Kovtun (2003, 2010)  Iterative pruning by Swoboda et al. (2013)  Common properties; only (M)QPBO was previously related to the LP relaxation

  30. MINCUT Part. Optimality Maximum Λ-Improving Projections 30/35  Problem: find the mapping that maximizes domain reduction

  31. MINCUT Part. Optimality Maximum Λ-Improving Projections 31/35  Thesis  Follow-up work, submitted to CVPR  [1] Nemhauser & Trotter (1975), Hammer et al. (1984), Boros et al. (2002)  [2] Picard & Queyranne (1977) (Vertex Packing)

  32. MINCUT Part. Optimality Conclusions 32/35

  33. MINCUT Part. Optimality Higher Order 33/35  - Adams, W. P., Lassiter, J. B., and Sherali, H. D. (1998). Persistency in 0-1 polynomial programming. - Kolmogorov, V. (2012). Generalized roof duality and bisubmodular functions. - Kahl, F. and Strandmark, P. (2012). Generalized roof duality. - Lu, S. H. and Williams, A. C. (1987). Roof duality for polynomial 0-1 optimization. - Ishikawa, H. (2011). Transformation of general binary MRF minimization to the first-order case. - Fix, A. et al. (2011). A graph cut algorithm for higher-order Markov random fields.

  34. MINCUT Part. Optimality New Follow-up Work 34/35  Algorithm proposed  Experiments: solution completeness on random problems  [plots: Generalized Potts (5 labels), Fully Random (4 labels)]

  35. MINCUT Part. Optimality New Follow-up Work 35/35  Experiments: solving large-scale problems by parts  Restrict the method to a local window, find a globally optimal reduction  [plots: # remaining labels, partial labeling]
