
Joint Affinity Propagation for Multiple View Segmentation - PowerPoint PPT Presentation

  1. ICCV 2007 Eleventh IEEE International Conference on Computer Vision Rio de Janeiro, Brazil, October 14-20, 2007 Joint Affinity Propagation for Multiple View Segmentation Jianxiong XIAO, Jingdong WANG, Ping TAN, Long QUAN Department of Computer Science & Engineering The Hong Kong University of Science & Technology

  2. Outline Part 1: Introduction Part 2: Our Approach – Formulation – Optimization: • Hierarchical Sparse Affinity Propagation • Semi-supervised Contraction Part 3: Experiment Results Part 4: Conclusion 2

  3. Outline Part 1: Introduction Part 2: Our Approach – Formulation – Optimization: • Hierarchical Sparse Affinity Propagation • Semi-supervised Contraction Part 3: Experiment Results Part 4: Conclusion 3

  4. Image-based modeling A two-step method: • Get 3D points and camera positions from 2D images (geometry computation) • Get 3D objects from unstructured 3D points (object reconstruction) [Figure: input images → recovered 3D points → recovered object models] 4

  5. Structure from motion 5

  6. Data segmentation • Pure 2D segmentation or pure 3D clustering alone is hard! – J. Shi and J. Malik, Normalized Cuts and Image Segmentation – etc. • Multiple view joint segmentation – Simultaneously segment 3D points and 2D images – Jointly utilize both 2D and 3D information [Figure: 2D? 3D?] 6

  7. Our work • Explore multiple view joint segmentation by simultaneously utilizing 2D and 3D data. • The availability of both 2D and 3D data brings complementary information to segmentation. • Propose two practical algorithms for joint segmentation: – Hierarchical Sparse Affinity Propagation – Semi-supervised Contraction 7

  8. Outline Part 1: Introduction Part 2: Our Approach – Formulation – Optimization: • Hierarchical Sparse Affinity Propagation • Semi-supervised Contraction Part 3: Experiment Results Part 4: Conclusion 8

  9. Outline Part 1: Introduction Part 2: Our Approach – Formulation – Optimization: • Hierarchical Sparse Affinity Propagation • Semi-supervised Contraction Part 3: Experiment Results Part 4: Conclusion 9

  10. Problem formulation The set of images: I = { I_i }. The set of regions: { P_k^u }, the regions of image I_u. A joint point x = ( x, y, z, n, P_1^{u_1}, …, P_n^{u_n} ) groups a 3D position, its normal n, and its 2D regions in the views where it is visible. A set of labels: L = { l_k }. A set of visibilities: V = { v_j }. A set of joint points: X = { x_j }. We now want to infer L, given X, V and I. 10

  11. Graph based segmentation Graph G = { V, E }: V: 3D points recovered from SFM. E: each point is connected to its K nearest neighbors, and the two end points of each edge are both visible in at least one view. [Figure: graph model] 11
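
A small sketch of how such a graph might be built, assuming the recovered 3D points come as an (n, 3) array and each point carries a Python set of the view indices in which it is visible; the choice of K = 8, the use of scipy's cKDTree, and the reading of "both visible at least in one view" as requiring a shared view are illustrative assumptions, not details fixed by the slides.

    import numpy as np
    from scipy.spatial import cKDTree

    def build_knn_graph(points, visibility, k=8):
        # Connect each 3D point to its k nearest neighbours, keeping an edge
        # only if both endpoints are visible in at least one common view.
        tree = cKDTree(points)
        # query k + 1 neighbours because the nearest neighbour of a point is itself
        _, nbrs = tree.query(points, k=k + 1)
        edges = set()
        for i, row in enumerate(nbrs):
            for j in row[1:]:
                j = int(j)
                if visibility[i] & visibility[j]:      # a shared view exists
                    edges.add((min(i, j), max(i, j)))  # undirected edge, stored once
        return edges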

  12. Joint similarity s( i, j ) = s_3( i, j ) + s_c( i, j ), with the 2D term itself combining image and texture cues: s_c( i, j ) = s_ic( i, j ) + s_t( i, j ). The cues used are: • 3D coordinates • 3D normal • Color • Contour • Patch 12

  13. 3D similarity s_3d( i, j ) = − || p_i − p_j ||^2 / ( 2 σ_3d^2 ), s_3n( i, j ) = − || n_i − n_j ||^2 / ( 2 σ_3n^2 ), and the combined 3D term is s_3( i, j ) = s_3d( i, j ) + s_3n( i, j ). 13
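
Under the reconstruction above (negative squared distances normalised by 2σ² and summed), the 3D term could be computed as in the sketch below; the bandwidths sigma_3d and sigma_3n are placeholders, not values from the paper.

    import numpy as np

    def s3(p_i, p_j, n_i, n_j, sigma_3d=0.1, sigma_3n=0.5):
        # 3D similarity between joint points i and j: position term plus normal term.
        # Larger is more similar; the maximum value is 0 for identical points.
        s_3d = -np.sum((p_i - p_j) ** 2) / (2.0 * sigma_3d ** 2)
        s_3n = -np.sum((n_i - n_j) ** 2) / (2.0 * sigma_3n ** 2)
        return s_3d + s_3n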

  14. 2D color similarity s_c( i, j ) = − || E_i^c − E_j^c ||^2 / ( 2 σ_c^2 ), using a gradient threshold med( max g ). The per-view terms are aggregated over the views in which both points are visible: s_ic( i, j ) = − t_bar( i, j )^2 / ( 2 σ_ic^2 ), with t_bar( i, j ) = Σ_v v_i^v v_j^v t_v( i, j ) / Σ_v v_i^v v_j^v, where g_i is the gradient of the i-th image and t_v( i, j ) is accumulated along the 2D segment d_2d( p, q ) between the two projections p and q. 14

  15. Utilizing the texture information • Hyper Graph? • Higher Order Prior Smoothness? • … 15

  16. Competitive region growing • Associate patches with each 3D point. 16

  17. Patch filtering • A small error around the object boundary may result in a large color difference. 17

  18. Patch histogram similarity For each joint point: • Collect all its patches { P^n } • Build an average color histogram h_0 • Down-sample the patches t−1 times • This gives a vector of histograms ( h_0, …, h_{t−1} ) The patch similarity is s_t( i, j ) = − (1 / t) Σ_{k=0}^{t−1} d( h_k^i, h_k^j ), where d(·, ·) is a dissimilarity measure for histograms. 18
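
A sketch of this patch-histogram term, assuming each joint point already stores its t average colour histograms h_0, …, h_{t−1} (one per down-sampling level); the chi-squared dissimilarity is an illustrative stand-in for the d(·, ·) that the slide leaves unspecified.

    import numpy as np

    def chi2(h, g, eps=1e-9):
        # Chi-squared dissimilarity between two normalised histograms.
        return 0.5 * np.sum((h - g) ** 2 / (h + g + eps))

    def s_t(hists_i, hists_j):
        # Patch-texture similarity: negative mean dissimilarity over the t scales,
        # so identical multi-scale histograms give the maximum value 0.
        t = len(hists_i)
        return -sum(chi2(hists_i[k], hists_j[k]) for k in range(t)) / t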

  19. Learning • The notion of a correct segmentation is inherently subjective. • Hence, a small amount of user assistance can greatly improve the segmentation. 19

  20. Handle the ambiguity • The goal is to improve robustness and handle the ambiguity of projections near object boundaries. 20

  21. Outline Part 1: Introduction Part 2: Our Approach – Formulation – Optimization: • Hierarchical Sparse Affinity Propagation • Semi-supervised Contraction Part 3: Experiment Results Part 4: Conclusion 21

  22. Affinity propagation [Frey & Dueck 2007] • Find several exemplars such that the sum of the similarities between the data points and their corresponding exemplars is maximized. • i.e. search over valid configurations of the labels c = ( c_1, …, c_N ) so as to minimize the energy E( c ) = − Σ_{i=1}^{N} s( i, c_i ) • i.e. maximize the net similarity S( c ) = − E( c ) + Σ_{k=1}^{N} δ_k( c ), where δ_k( c ) penalizes invalid configurations in which a point chosen as an exemplar by others is not its own exemplar. 22

  23. Responsibility • The responsibility r( i, k ), sent from data point i to candidate exemplar point k, reflects the accumulated evidence for how well-suited point k is to serve as the exemplar for point i, taking into account other potential exemplars for point i. [Figure: responsibility message sent from i to k] 23

  24. Availability • The availability a( i, k ), sent from candidate exemplar point k to point i, reflects the accumulated evidence for how appropriate it would be for point i to choose point k as its exemplar, taking into account the support from other points that point k should be an exemplar. [Figure: availability message sent from k to i] 24

  25. Responsibility & Availability r( i, k ) = s( i, k ) − max_{k' ≠ k} { a( i, k' ) + s( i, k' ) } and a( i, k ) = min{ 0, r( k, k ) + Σ_{i' ∉ { i, k }} max( 0, r( i', k ) ) } [Figure: responsibility and availability messages between i and k] 25
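
To make the two update rules concrete, here is a minimal NumPy sketch of plain (dense) affinity propagation built around exactly these responsibility and availability messages; the damping factor, the iteration count, and setting the diagonal preference s(k, k) to the median similarity are standard practical choices made for this sketch, not values taken from the slides.

    import numpy as np

    def affinity_propagation(S, preference=None, damping=0.7, iters=200):
        # Dense affinity propagation with the Frey & Dueck message updates.
        S = np.array(S, dtype=float)
        n = S.shape[0]
        if preference is None:
            preference = np.median(S)      # assumed default for the self-similarity
        np.fill_diagonal(S, preference)    # s(k, k) controls how many exemplars emerge
        R = np.zeros((n, n))               # responsibilities r(i, k)
        A = np.zeros((n, n))               # availabilities  a(i, k)
        rows = np.arange(n)
        for _ in range(iters):
            # r(i, k) = s(i, k) - max_{k' != k} [ a(i, k') + s(i, k') ]
            AS = A + S
            idx = np.argmax(AS, axis=1)
            first = AS[rows, idx]
            AS[rows, idx] = -np.inf
            second = AS.max(axis=1)
            R_new = S - first[:, None]
            R_new[rows, idx] = S[rows, idx] - second
            R = damping * R + (1 - damping) * R_new
            # a(i, k) = min(0, r(k, k) + sum_{i' not in {i, k}} max(0, r(i', k)))
            Rp = np.maximum(R, 0)
            np.fill_diagonal(Rp, R.diagonal())      # keep r(k, k) itself
            A_new = Rp.sum(axis=0)[None, :] - Rp
            diag = A_new.diagonal().copy()          # a(k, k) = sum of positive r(i', k)
            A_new = np.minimum(A_new, 0)
            np.fill_diagonal(A_new, diag)
            A = damping * A + (1 - damping) * A_new
        # Each point's exemplar: k* = argmax_k [ a(i, k) + r(i, k) ] (cf. slide 35).
        return np.argmax(A + R, axis=1)

Running this on the joint similarity matrix s(i, j) from the earlier slides would return one exemplar index per joint point, i.e. a labeling of the joint points.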

  26. Outline Part 1: Introduction Part 2: Our Approach – Formulation – Optimization: • Hierarchical Sparse Affinity Propagation • Semi-supervised Contraction Part 3: Experiment Results Part 4: Conclusion 26

  27. Sparse affinity propagation • Affinity propagation on a sparse graph, called sparse affinity propagation, is more efficient, as pointed out in [Frey & Dueck 2007]. • Sparse affinity propagation runs in O( T | E | ) time, where T is the number of iterations and | E | the number of edges. • Here, the time complexity is O( Tn ) since | E | = O( n ). 27

  28. Original sparse AP • The number of data points that share the same exemplar i is at most degree( i ), where degree( i ) is the number of nodes connected to i. This results in unexpectedly many fragments. 28

  29. Hierarchical sparse AP
  G' = G( V, E );
  while (true) {
      [ Exemplars, Label ] = SparseAffinityPropagation( G' );
      E' = { ( c_i, c_j ) | ∃ ( p, q ) ∈ E', Exemplar( p ) = c_i, Exemplar( q ) = c_j };
      G' = ( V' = Exemplars, E' );
      if ( SatisfyStoppingCondition ) break;
  } 29
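
A minimal Python sketch of this hierarchical loop; the sparse_affinity_propagation routine (returning a node-to-exemplar mapping for the current graph) is assumed to exist, and the stopping condition used here (few enough clusters, or no further merging) is an illustrative assumption rather than the authors' exact criterion.

    def hierarchical_sparse_ap(nodes, edges, similarity, sparse_affinity_propagation,
                               target_clusters=10):
        # Repeatedly run sparse AP, then contract the graph onto the exemplars.
        # `edges` is a set of (p, q) node pairs; `similarity(p, q)` returns s(p, q).
        label = {p: p for p in nodes}          # current exemplar of every original node
        current_nodes, current_edges = list(nodes), set(edges)
        while True:
            exemplar_of = sparse_affinity_propagation(current_nodes, current_edges, similarity)
            # Propagate the new assignment down to the original nodes.
            for p in label:
                label[p] = exemplar_of[label[p]]
            exemplars = sorted(set(exemplar_of.values()))
            # Assumed stopping condition: few enough clusters, or nothing was merged.
            if len(exemplars) <= target_clusters or len(exemplars) == len(current_nodes):
                return label
            # E' = { (c_i, c_j) | exists (p, q) in E with Exemplar(p) = c_i, Exemplar(q) = c_j }
            current_edges = {(exemplar_of[p], exemplar_of[q])
                             for (p, q) in current_edges
                             if exemplar_of[p] != exemplar_of[q]}
            current_nodes = exemplars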

  30. Hierarchical sparse AP [Figure: segmentation results at levels L = 1, 2, 5, 8, 11, 14, 17] 30

  31. Outline Part 1: Introduction Part 2: Our Approach – Formulation – Optimization: • Hierarchical Sparse Affinity Propagation • Semi-supervised Contraction Part 3: Experiment Results Part 4: Conclusion 31

  32. Semi-supervised contraction Points marked by the user as belonging to the same object are contracted: their pairwise similarities are raised to the maximum value, s( p, p' ) = s( q, q' ) = 0. 32
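
One possible reading of this step as code, under two assumptions that the slide does not spell out: similarities are non-positive (so 0 is the strongest possible value, matching the contraction formula above), and each user stroke yields a group of point indices that should end up in the same segment; the cannot-link handling and its value are likewise assumptions of this sketch.

    import numpy as np

    def apply_user_strokes(S, strokes, cannot_link_value=-1e6):
        # Modify a (non-positive) similarity matrix S using user strokes.
        # strokes: list of index lists; each list holds points the user marked
        # as belonging to the same object (one stroke per object).
        S = S.copy()
        # Contraction: points under the same stroke get the maximum similarity (0),
        # which strongly encourages affinity propagation to group them together.
        for group in strokes:
            for a in group:
                for b in group:
                    if a != b:
                        S[a, b] = 0.0
        # Points from different strokes should not share an exemplar: push their
        # similarity far below every other value (assumed handling).
        for gi, group_i in enumerate(strokes):
            for gj, group_j in enumerate(strokes):
                if gi == gj:
                    continue
                for a in group_i:
                    for b in group_j:
                        S[a, b] = cannot_link_value
        return S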

  33. Semi-supervised contraction 33

  34. Semi-supervised contraction 34

  35. Semi-supervised contraction • Finally, when the algorithm has converged, availabilities and responsibilities are combined to identify exemplars. • For point i, its label is obtained as k* = argmax_{k ∈ { p, q }} { a( i, k ) + r( i, k ) }. 35

  36. Semi-supervised contraction 36

  37. Outline Part 1: Introduction Part 2: Our Approach – Formulation – Optimization: • Hierarchical Sparse Affinity Propagation • Semi-supervised Contraction Part 3: Experiment Results Part 4: Conclusion 37

  38. Results 38

  39. Results 39

  40. Results 40

  41. Outline Part 1: Introduction Part 2: Our Approach – Formulation – Optimization: • Hierarchical Sparse Affinity Propagation • Semi-supervised Contraction Part 3: Experiment Results Part 4: Conclusion 41

  42. Conclusion 42

  43. ICCV 2007 Eleventh IEEE International Conference on Computer Vision Rio de Janeiro, Brazil, October 14-20, 2007 Joint Affinity Propagation for Multiple View Segmentation Thank you! Questions? Contact: Jianxiong XIAO csxjx@cse.ust.hk 43

  44. 2D color similarity • Contour based similarity 44
