
Image Alignment
CMPSCI 370: Intro. to Computer Vision, University of Massachusetts, Amherst
March 22/24, 2016. Instructor: Subhransu Maji


1. Administrivia
• And now, the winners of homework #3 (hybrid images), as decided by the graders
Winner (#3): John Williams
Winner (#2): Joshua Espinosa

2. Winner (#1): Alan Rusell
Honorable mention: Matthew Lydigsen (lighthouse + dalek)
Honorable mention: Makenzie Schwartz (hillary + trump)
Honorable mention: Nathan Greenberg (?? + simon pegg)

3. Honorable mention: David Carlson (snoop dogg + dog)
Image alignment
• But first, are there any questions?
A framework for alignment
• Matching local features
  • Only local information is used, so the matches can contain outliers
  • But hopefully enough of these matches are good
• Consensus building
  • Aggregate the good matches and find a transformation that explains these matches

4. Generating putative correspondences
[Figure: matching local regions between two images by comparing their feature descriptors]
• Need to find regions and compare their feature descriptors (a brute-force matching sketch follows after this slide)
Feature detection with scale selection
• We want to extract features with a characteristic scale that matches the image transformation, such as scaling and translation (a.k.a. covariance)
• Matching regions across scales
Scaling
• Corner detection is sensitive to the image scale: a structure that is a corner at one scale may have all of its points classified as edges at a finer scale
Source: L. Lazebnik
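As a rough illustration of the putative-correspondence step (not course code), the sketch below matches every descriptor in one image to its nearest neighbor in the other by Euclidean distance; the array names and the helper function are made up for this example:

```python
import numpy as np

def putative_matches(desc1, desc2):
    """Nearest-neighbor matching of feature descriptors (brute force).

    desc1: (N1, D) descriptors from image 1; desc2: (N2, D) from image 2.
    Returns (i, j, dist) for each feature i in image 1 and its nearest j in image 2.
    """
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)  # distance to every descriptor in image 2
        j = int(np.argmin(dists))
        matches.append((i, j, float(dists[j])))
    return matches
```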

5. Blob detection: basic idea
• Convolve the image with a "blob filter" at multiple scales
• Look for extrema (maxima or minima) of the filter response in the resulting scale space
• This gives us a scale- and space-covariant detector
Source: N. Snavely, L. Lazebnik
Blob detection in 2D
• Laplacian of Gaussian: a circularly symmetric operator for blob detection in 2D
• Scale-normalized Laplacian: \( \nabla^2_{\text{norm}} g = \sigma^2 \left( \frac{\partial^2 g}{\partial x^2} + \frac{\partial^2 g}{\partial y^2} \right) \)
Scale selection
• At what scale does the Laplacian achieve a maximum response to a binary circle of radius r?
• To get the maximum response, the zeros of the Laplacian have to be aligned with the circle
• The Laplacian is given by (up to scale): \( (x^2 + y^2 - 2\sigma^2)\, e^{-(x^2 + y^2)/(2\sigma^2)} \)
• Therefore, the maximum response occurs at \( \sigma = r/\sqrt{2} \) (a numeric check follows after this slide)
Source: L. Lazebnik
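The \( \sigma = r/\sqrt{2} \) result can be checked numerically. The sketch below (an illustration, not part of the slides) builds a binary disk of radius r, evaluates the scale-normalized LoG response at its center over a range of scales using scipy's gaussian_laplace, and reports the scale with the largest response magnitude:

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

# Binary disk of radius r centered in a small image
r, size = 10.0, 101
yy, xx = np.mgrid[:size, :size] - size // 2
disk = (xx**2 + yy**2 <= r**2).astype(float)

# Magnitude of the scale-normalized LoG response at the disk center, per scale
sigmas = np.linspace(1.0, 20.0, 200)
center = size // 2
responses = [abs((s**2) * gaussian_laplace(disk, s)[center, center]) for s in sigmas]

best = sigmas[int(np.argmax(responses))]
print(best, r / np.sqrt(2))  # the two numbers should be close (about 7.07)
```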

6. Characteristic scale
• We define the characteristic scale of a blob as the scale that produces the peak of the Laplacian response in the blob center
T. Lindeberg (1998). "Feature detection with automatic scale selection." International Journal of Computer Vision 30 (2): pp. 77-116.
Scale-space blob detector
1. Convolve the image with a scale-normalized Laplacian at several scales
Scale-space blob detector: Example
Source: L. Lazebnik

7. Scale-space blob detector
1. Convolve the image with a scale-normalized Laplacian at several scales
2. Find maxima of the squared Laplacian response in scale space (a numpy sketch of this detector follows after this slide)
Scale-space blob detector: Example
From feature detection to description
• Scaled and rotated versions of the same neighborhood will give rise to blobs that are related by the same transformation
• What to do if we want to compare the appearance of these image regions?
• Normalization: transform these regions into same-size circles
• Problem: rotational ambiguity
Generating putative correspondences
• Need to find regions and compare their feature descriptors
Source: L. Lazebnik

8. Eliminating rotation ambiguity
• To assign a unique orientation to circular image windows:
  • Create a histogram of local gradient directions in the patch (over 0 to 2π)
  • Assign the canonical orientation at the peak of the smoothed histogram
SIFT features
• Detected features with characteristic scales and orientations
David G. Lowe. "Distinctive image features from scale-invariant keypoints." IJCV 60 (2), pp. 91-110, 2004.
From feature detection to description
• How should we represent the patches?
Feature descriptors
• Simplest descriptor: a vector of raw intensity values
• How to compare two such vectors?
• Sum of squared differences (SSD), a distance measure:
  \( \mathrm{SSD}(u, v) = \sum_i (u_i - v_i)^2 \)
  Not invariant to intensity change
• Normalized correlation, a similarity measure (see the sketch after this slide):
  \( \rho(u, v) = \frac{\sum_i (u_i - \bar{u})(v_i - \bar{v})}{\|u - \bar{u}\|\, \|v - \bar{v}\|} \)
  Invariant to affine (translation + scaling) intensity change
Source: L. Lazebnik
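A small numpy sketch of the two patch-comparison measures above, with a toy check that normalized correlation is unchanged by an affine intensity change while SSD is not (the patch size and values are arbitrary):

```python
import numpy as np

def ssd(u, v):
    """Sum of squared differences between two patch vectors (a distance)."""
    return np.sum((u - v) ** 2)

def normalized_correlation(u, v):
    """Normalized correlation between two patch vectors (a similarity in [-1, 1])."""
    u0 = u - u.mean()
    v0 = v - v.mean()
    return np.dot(u0, v0) / (np.linalg.norm(u0) * np.linalg.norm(v0))

u = np.random.rand(81)       # a flattened 9x9 patch
v = 2.5 * u + 0.3            # brighter, higher-contrast copy of the same patch
print(ssd(u, v))                      # large, even though the patches depict the same thing
print(normalized_correlation(u, v))   # 1.0 (up to floating-point error)
```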

9. Problem with intensity vectors as descriptors
• Small deformations can affect the matching score a lot
Feature descriptors: SIFT
• Descriptor computation (a simplified sketch follows after this slide):
  • Divide the patch into 4x4 sub-patches
  • Compute a histogram of gradient orientations (8 reference angles) inside each sub-patch
  • Resulting descriptor: 4 x 4 x 8 = 128 dimensions
• Advantages over raw vectors of pixel values:
  • Gradients are less sensitive to illumination change
  • Pooling of gradients over the sub-patches achieves robustness to small shifts, but still preserves some spatial information
David G. Lowe. "Distinctive image features from scale-invariant keypoints." IJCV 60 (2), pp. 91-110, 2004.
Problem: Ambiguous putative matches
Source: Y. Furukawa
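A heavily simplified, SIFT-like descriptor sketch in numpy, matching the 4x4 cells x 8 orientation bins = 128 dimensions recipe above; the real SIFT descriptor also uses Gaussian weighting, soft binning, and rotation to the canonical orientation, none of which are included here:

```python
import numpy as np

def sift_like_descriptor(patch, n_cells=4, n_bins=8):
    """Simplified SIFT-style descriptor for a square patch (e.g. 16x16).

    Returns a normalized vector of length n_cells * n_cells * n_bins = 128.
    """
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), 2 * np.pi)   # gradient orientation in [0, 2*pi)

    cell = patch.shape[0] // n_cells
    desc = []
    for i in range(n_cells):
        for j in range(n_cells):
            sl = (slice(i * cell, (i + 1) * cell), slice(j * cell, (j + 1) * cell))
            # Histogram of gradient orientations, weighted by gradient magnitude
            hist, _ = np.histogram(ang[sl], bins=n_bins, range=(0, 2 * np.pi),
                                   weights=mag[sl])
            desc.append(hist)
    desc = np.concatenate(desc)
    return desc / (np.linalg.norm(desc) + 1e-12)  # normalize for illumination robustness

# Example use: sift_like_descriptor(np.random.rand(16, 16)).shape == (128,)
```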

10. Rejection of unreliable matches
• How can we tell which putative matches are more reliable?
• Heuristic: compare the distance to the nearest neighbor with the distance to the second-nearest neighbor
• The ratio of the closest distance to the second-closest distance will be high for features that are not distinctive
• A threshold of 0.8 provides good separation (a ratio-test sketch follows after this slide)
David G. Lowe. "Distinctive image features from scale-invariant keypoints." IJCV 60 (2), pp. 91-110, 2004.
RANSAC (Random Sample Consensus)
• Choose a small subset of points uniformly at random
• Fit a model to that subset
• Find all remaining points that are "close" to the model and reject the rest as outliers
• Do this many times and choose the best model
• For a rigid transformation we can estimate the parameters of the transformation, e.g., rotation angle, scaling, translation, etc., from the putative correspondence matches
• Let's see how RANSAC works for a simple example
M. A. Fischler, R. C. Bolles. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Comm. of the ACM, Vol 24, pp. 381-395, 1981.
RANSAC for line fitting example
• Least-squares fit: \( \min_{a,b} \sum_i (a x_i + b - y_i)^2 \)
Source: R. Raguram
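A sketch of the ratio test, assuming descriptors are stored as rows of numpy arrays; the 0.8 threshold follows the slide, everything else is illustrative:

```python
import numpy as np

def ratio_test_matches(desc1, desc2, ratio=0.8):
    """Keep only distinctive matches via the nearest/second-nearest ratio test."""
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        nn1, nn2 = np.argsort(dists)[:2]          # nearest and second-nearest neighbors
        if dists[nn1] < ratio * dists[nn2]:       # distinctive match: keep it
            matches.append((i, int(nn1)))
    return matches
```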

11. Least-squares line fitting
• Data: \( (x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n) \)
• Line equation: \( y_i = a x_i + b \)
• Error: \( E = \sum_{i=1}^n (a x_i + b - y_i)^2 \)
• Setting the derivatives to zero:
  \( \frac{dE}{da} = 2 \sum_{i=1}^n (a x_i + b - y_i)\, x_i = 0 \) and \( \frac{dE}{db} = 2 \sum_{i=1}^n (a x_i + b - y_i) = 0 \)
• Solving for the parameters (a worked code sketch follows after this slide):
  \( a = \frac{n \sum_i x_i y_i - (\sum_i x_i)(\sum_i y_i)}{n \sum_i x_i^2 - (\sum_i x_i)^2} \) and \( b = \frac{(\sum_i x_i^2)(\sum_i y_i) - (\sum_i x_i)(\sum_i x_i y_i)}{n \sum_i x_i^2 - (\sum_i x_i)^2} \)
RANSAC for line fitting example
1. Randomly select a minimal subset of points
2. Hypothesize a model
3. Compute the error function
Source: R. Raguram
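A direct transcription of the closed-form solution above into numpy (illustrative, not course code):

```python
import numpy as np

def fit_line_least_squares(x, y):
    """Closed-form least-squares fit of y = a*x + b using the formulas on the slide."""
    n = len(x)
    sx, sy = np.sum(x), np.sum(y)
    sxx, sxy = np.sum(x * x), np.sum(x * y)
    denom = n * sxx - sx ** 2
    a = (n * sxy - sx * sy) / denom
    b = (sxx * sy - sx * sxy) / denom
    return a, b

# Example use: x = np.array([0., 1., 2., 3.]); y = 2.0 * x + 1.0
# fit_line_least_squares(x, y)  ->  (2.0, 1.0)
```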

12. RANSAC for line fitting example
1. Randomly select a minimal subset of points
2. Hypothesize a model
3. Compute the error function
4. Select points consistent with the model
5. Repeat the hypothesize-and-verify loop (a sketch of the loop follows after this slide)
• An uncontaminated sample (all inliers) yields a model that most of the points agree with
Source: R. Raguram
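A compact sketch of the hypothesize-and-verify loop above for line fitting; the iteration count and inlier threshold are arbitrary choices for illustration:

```python
import numpy as np

def ransac_line(x, y, n_iters=1000, inlier_thresh=1.0, seed=None):
    """RANSAC for line fitting: sample, hypothesize, verify, keep the best model."""
    rng = np.random.default_rng(seed)
    best_model, best_inliers = None, np.array([], dtype=int)
    for _ in range(n_iters):
        i, j = rng.choice(len(x), size=2, replace=False)   # minimal subset: 2 points
        if x[i] == x[j]:
            continue                                       # degenerate (vertical) sample
        a = (y[j] - y[i]) / (x[j] - x[i])                  # hypothesize a model
        b = y[i] - a * x[i]
        residuals = np.abs(a * x + b - y)                  # error function
        inliers = np.nonzero(residuals < inlier_thresh)[0] # points consistent with model
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    # Optionally refit the line by least squares on best_inliers before returning
    return best_model, best_inliers
```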
