

  1. Announcements
     • Midterm review: next Wed Oct 4, 12-1 pm, ENS 31NQ

     Lecture 9: Fitting, Contours (Thursday, Sept 27)

     Last time
     • Fitting shape patterns with the Hough transform and generalized Hough transform

     Today
     • Fitting lines (brief)
       – Least squares
       – Incremental fitting, k-means allocation
     • RANSAC, robust fitting
     • Deformable contours

     Line fitting: what is the line?
     • Assuming all the points that belong to a particular line are known, solve for the line parameters that yield minimal error. (Forsyth & Ponce 15.2.1)

     Line fitting: which point is on which line?
     • Two possible strategies:
       – Incremental line fitting
       – K-means
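For the "known points, minimal error" case, the line a·x + b·y = d (with a² + b² = 1) minimizing the summed squared perpendicular distances is given by the direction of smallest scatter of the centered points. A minimal sketch (the function name is mine, not from the lecture):

```python
import numpy as np

def fit_line_tls(points):
    """Total least squares line fit: minimizes the summed squared
    perpendicular distances. Returns (a, b, d) for the line
    a*x + b*y = d with a^2 + b^2 = 1."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # (a, b) is the right singular vector of the centered points
    # with the smallest singular value (the normal to the line)
    _, _, vt = np.linalg.svd(pts - centroid)
    a, b = vt[-1]
    d = a * centroid[0] + b * centroid[1]
    return a, b, d

# Points exactly on y = 2x + 1 should give (near-)zero residuals
pts = [(x, 2 * x + 1) for x in range(5)]
a, b, d = fit_line_tls(pts)
residuals = [a * x + b * y - d for x, y in pts]
```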

  2. Incremental line fitting
     • Take connected curves of edge points and fit lines to runs of points (use gradient directions).
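One way to sketch this: walk along an ordered edge curve, extend the current run while a line still fits it well, and start a new run when the residual jumps. The residual threshold is an assumed parameter, not something specified on the slides:

```python
import numpy as np

def incremental_line_fit(curve, max_residual=0.5):
    """Greedy incremental line fitting over an ordered list of (x, y)
    edge points: extend the current run while it stays well fit by a
    line, start a new run when the residual exceeds max_residual.
    Returns a list of point runs, one per fitted segment."""
    def residual(run):
        pts = np.asarray(run, dtype=float)
        centered = pts - pts.mean(axis=0)
        # the smallest singular value measures perpendicular scatter
        # of the run about its best-fit line
        s = np.linalg.svd(centered, compute_uv=False)
        return s[-1] / np.sqrt(len(run))

    runs, current = [], list(curve[:2])
    for p in curve[2:]:
        current.append(p)
        if residual(current) > max_residual:
            current.pop()          # p breaks the fit: close this run
            runs.append(current)
            current = [p]          # and start a new one at p
    runs.append(current)
    return runs
```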

  3. Allocating points with k-means
     • Believe there are k lines, each of which generates some subset of the data points.
     • The best solution would minimize the sum of the squared distances from the points to their assigned lines.
     • Use the k-means algorithm: alternately allocate each point to its closest line, then refit each line to its allocated points.
     • Test convergence using the size of the change in the lines, or whether any labels have been flipped.
     • If we have occluded edges, this will often result in more than one fitted line.
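A sketch of the allocate/refit loop; the initialization from random point pairs and the optional `init_lines` argument are my choices for illustration:

```python
import numpy as np

def fit_line(pts):
    """Total least squares fit; returns (a, b, d) with a*x + b*y = d."""
    c = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - c)
    a, b = vt[-1]
    return a, b, a * c[0] + b * c[1]

def kmeans_lines(points, k, init_lines=None, iters=20, seed=0):
    """k-means allocation of points to k lines: alternately assign
    each point to its nearest line (by perpendicular distance) and
    refit each line to its assigned points; stop when no labels flip."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    if init_lines is None:
        # initialize each line from a random pair of points
        init_lines = [fit_line(pts[rng.choice(len(pts), 2, replace=False)])
                      for _ in range(k)]
    lines, labels = list(init_lines), None
    for _ in range(iters):
        # allocation step: nearest line by perpendicular distance
        dists = np.stack([np.abs(a * pts[:, 0] + b * pts[:, 1] - d)
                          for a, b, d in lines])
        new_labels = dists.argmin(axis=0)
        if labels is not None and np.array_equal(new_labels, labels):
            break                  # no labels flipped -> converged
        labels = new_labels
        # refit step: update each line from its allocated points
        for j in range(k):
            if np.count_nonzero(labels == j) >= 2:
                lines[j] = fit_line(pts[labels == j])
    return lines, labels
```

As the slide on sensitivity to the starting point notes, the result depends on the initial lines; in practice the loop is restarted from several initializations.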

  4. Sensitivity to starting point
     • The k-means result depends on the initial allocation.

     Outliers
     • Outliers can result from
       – Data collection error
       – An overlooked case for the chosen model
     • Squared error terms mean a big penalty for large errors, so outliers can lead to significant bias in the least squares fit. (Forsyth & Ponce, Fig 15.7)

  5. Outliers affect the least squares fit

     Least squares and error
     • The best model minimizes the residual error

         Σ_i r(x_i; θ)

       where x_i are the data points and θ the model parameters.
     • If we assume Gaussian additive noise corrupts the data points:
       – The probability of a noisy point being within distance d of the corresponding true point decreases rapidly with d.
       – So points that are way off are not really consistent with the Gaussian noise hypothesis, yet the model still wants to fit to them: outliers have large influence on the fit.

     Robustness
     • A couple of possibilities for handling outliers:
       – Give the noise heavier tails
       – Search for "inliers"

     M-estimators
     • Estimate parameters by minimizing the modified residual expression

         Σ_i ρ(r(x_i; θ); σ)

       where σ is a parameter determining where the function flattens out.
     • This reflects a noise distribution that does not vanish as quickly as a Gaussian, i.e., it considers outliers more likely to occur.
     • It de-emphasizes the contribution of distant points.
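For concreteness, one common flattening ρ (the form used in Forsyth & Ponce; treat the exact choice here as illustrative) is ρ(u; σ) = u² / (σ² + u²):

```python
def rho(u, sigma):
    """Robust cost: behaves like u^2 (scaled) for small residuals and
    flattens toward the constant 1 for large ones, so distant points
    get bounded influence."""
    return u * u / (sigma * sigma + u * u)

small = rho(0.1, 1.0)    # ~0.0099: like squared distance
large = rho(100.0, 1.0)  # ~0.9999: essentially constant
```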

  6. Applying the M-estimator

     Example M-estimator
     • ρ looks like squared distance for small residual values, and like a constant for large values.
     • The resulting optimization is non-linear and must be solved iteratively.

     Impact of σ on fitting quality
     • σ too small: the error is similar for all points.
     • σ too large: the error behaves about the same as least squares.

     Scale selection
     • Popular choice: re-estimate σ from the residuals at each iteration n of the minimization.

     RANSAC
     • RANdom Sample Consensus
     • Approach: we don't like the impact of outliers, so let's look for "inliers", and use only those.
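The iterative solution is typically an iteratively reweighted least squares (IRLS) loop: fit, compute residuals, re-estimate the scale, downweight large residuals, refit. A sketch for line fitting; the Geman-McClure weight and the MAD-based scale (1.4826 × median |r_i|) are common choices I am assuming here, not prescriptions from the slides:

```python
import numpy as np

def weighted_line_fit(pts, w):
    """Weighted total least squares; returns (a, b, d), a*x + b*y = d."""
    c = np.average(pts, axis=0, weights=w)
    _, _, vt = np.linalg.svd((pts - c) * np.sqrt(w)[:, None])
    a, b = vt[-1]
    return a, b, a * c[0] + b * c[1]

def m_estimate_line(points, iters=30):
    """Robust line fit via IRLS for sum_i rho(r_i; sigma) with the
    Geman-McClure rho = r^2 / (sigma^2 + r^2)."""
    pts = np.asarray(points, dtype=float)
    w = np.ones(len(pts))
    for _ in range(iters):
        a, b, d = weighted_line_fit(pts, w)
        r = a * pts[:, 0] + b * pts[:, 1] - d
        # scale from the median absolute residual (robust to outliers)
        sigma = 1.4826 * np.median(np.abs(r)) + 1e-12
        # IRLS weight w = rho'(r) / (2 r): downweights large residuals
        w = sigma**2 / (sigma**2 + r**2)**2
    return a, b, d
```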

  7. RANSAC
     • Choose a small subset of the points uniformly at random
     • Fit the model to that subset
     • Anything that is close to the result is signal; all other points are noise
     • Refit using the signal points
     • Do this many times and choose the best result (best = lowest fitting error)

     Reference: M. A. Fischler and R. C. Bolles. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Comm. of the ACM, Vol. 24, pp. 381-395, 1981.

     RANSAC line fitting example (slide credit: Jinxiang Chai, CMU)
     • Sample two points
     • Fit a line to them
     • Count the total number of points within a threshold of the line
     • Repeat until we get a good result
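The line fitting example above, as a sketch (the trial count and inlier threshold here are illustrative parameters):

```python
import numpy as np

def ransac_line(points, n_trials=100, thresh=0.5, seed=0):
    """RANSAC for 2D line fitting: repeatedly sample two points, fit
    the line through them, count points within thresh of the line, and
    keep the largest consensus set; finally refit to those inliers."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    best_inliers = np.zeros(len(pts), dtype=bool)
    for _ in range(n_trials):
        i, j = rng.choice(len(pts), 2, replace=False)
        p, q = pts[i], pts[j]
        t = q - p
        n = np.array([-t[1], t[0]])     # normal to the line through p, q
        norm = np.linalg.norm(n)
        if norm == 0:
            continue                    # degenerate sample
        n /= norm
        dists = np.abs((pts - p) @ n)   # perpendicular distances
        inliers = dists < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refit to the winning consensus set with total least squares
    good = pts[best_inliers]
    c = good.mean(axis=0)
    _, _, vt = np.linalg.svd(good - c)
    a, b = vt[-1]
    return (a, b, a * c[0] + b * c[1]), best_inliers
```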

  8. RANSAC application: robust computation of image correspondences
     • Interest points (Harris corners) in left and right images: about 500 points per image, 640x480 resolution
     • Putative correspondences (268), found by best match with SSD < 20
     • Outliers (117) and inliers (151), with t = 1.25 pixels after 43 iterations
     • Final inliers (262)
     (Hartley & Zisserman, p. 126)

     RANSAC parameters
     • Number of points per sample ( n )
       – The absolute minimum depends on the model being fit (lines -> 2, circles -> 3, etc.)
     • Number of trials ( k )
       – Requires a guess at the probability of a random point being "good"
       – Choose k so that there is a high probability of getting at least one sample free from outliers
     • Threshold for good fits ( t )
       – Often trial and error: look at some data fits and estimate average deviations
     • Number of points that must agree ( d )
       – Again, use a guess of the probability of a point being an outlier; choose d so that a group is unlikely to contain one

     Grouping and fitting
     • Grouping, segmentation: make a compact representation that merges similar features
       – Relevant algorithms: k-means, hierarchical clustering, Mean Shift, graph cuts
     • Fitting: fit a model to your observed features
       – Relevant algorithms: Hough transform for lines and circles (parameterized curves); generalized Hough transform for arbitrary boundaries; least squares; assigning points to lines incrementally or with k-means; robust fitting
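The usual calculation behind the number of trials k: if w is the guessed probability that a random point is good and each sample uses n points, then k trials yield at least one outlier-free sample with probability p when k ≥ log(1 − p) / log(1 − wⁿ). A sketch:

```python
import math

def ransac_trials(inlier_ratio, sample_size, success_prob=0.99):
    """Number of RANSAC trials k so that, with probability
    success_prob, at least one sample of sample_size points is drawn
    entirely from inliers, given the guessed inlier ratio."""
    w_n = inlier_ratio ** sample_size   # P(one sample is all inliers)
    return math.ceil(math.log(1 - success_prob) / math.log(1 - w_n))

# Line fitting (n = 2) with half the points good: k = 17 trials
k = ransac_trials(inlier_ratio=0.5, sample_size=2)
```

Note how quickly k grows with the sample size: at the same 50% inlier ratio, an 8-point sample (as in fundamental-matrix estimation) already needs well over a thousand trials.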

  9. Today
     • Fitting lines (brief)
       – Least squares
       – Incremental fitting, k-means allocation
     • RANSAC, robust fitting
     • Deformable contours

     Towards object-level grouping
     • Low-level segmentation cannot go this far; how do we get these kinds of boundaries?
     • One direction: semi-automatic methods
       – Give a good but rough initial boundary
       – Interactively guide boundary placement
       – Still use image analysis techniques in concert
     • Example: tracking heart ventricles (multiple frames)

     Deformable contours
     a.k.a. active contours, snakes
     • Given: an initial contour (model) near the desired object (single frame)
     • Goal: evolve the contour to fit the exact object boundary
     [Kass, Witkin, Terzopoulos 1987]

  10. Deformable contours
      a.k.a. active contours, snakes
      • An elastic band of arbitrary shape, initially located near the image contour of interest
      • Attracted towards the target contour depending on the intensity gradient
      • Iteratively refined: initial -> intermediate -> final

      Comparison: shape-related methods
      • Chamfer matching: given two shapes defined by points, measure the average distance from one to the other
      • (Generalized) Hough transform: given a pattern/model shape, use oriented edge points to vote for the likely position of that pattern in a new image
      • Deformable contours: given an initial starting boundary and priors on preferred shape types, iteratively adjust the boundary to also fit the observed image

      Snake energy
      • The total energy of the current snake is defined as

          E_total = E_in + E_ex

      • Internal energy encourages smoothness or any particular shape; it incorporates prior knowledge about the object boundary, which allows a boundary to be extracted even if some image data is missing.
      • External energy encourages the curve onto image structures (e.g. image edges).
      • We will want to iteratively minimize this energy for a good fit between the deformable contour and the target shape in the image.
      (Many of the snakes slides are adapted from Yuri Boykov.)

      Parametric curve representation (continuous case)
      • Coordinates are given as functions of a parameter s that varies along the curve:

          ν(s) = (x(s), y(s)),  0 ≤ s ≤ 1

      • Curves may be open or closed, and may be parameterized by arc length (the length along the curve), by angle, etc.
      • For example, a circle of radius r centered at (0,0) has the parametric form

          x = r sin(s),  y = r cos(s),  0 ≤ s < 2π

        with the angle s as the parameter.
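The circle example in discrete form: sampling ν(s) at evenly spaced parameter values (the function name is mine):

```python
import math

def circle_contour(r, n):
    """Sample nu(s) = (r sin s, r cos s) at n evenly spaced values of
    s in [0, 2*pi), giving a closed discrete contour."""
    return [(r * math.sin(2 * math.pi * i / n),
             r * math.cos(2 * math.pi * i / n)) for i in range(n)]

pts = circle_contour(r=2.0, n=100)  # every point lies at distance r
```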

  11. Internal energy
      • Bending energy of a continuous curve:

          E_in(ν(s)) = α(s) |dν/ds|² + β(s) |d²ν/ds²|²

        The first term (elasticity/tension) penalizes stretching; the second (stiffness/curvature) penalizes bending. The more the curve bends, the larger this energy value is.
      • Internal energy for the whole curve:

          E_in = ∫₀¹ E_in(ν(s)) ds

      External energy
      • Measures how well the curve matches the image data, locally.
      • Attracts the curve toward different image features: edges, lines, etc.

      External energy: edge strength
      • Given the image I(x, y) and its gradient images G_x(x, y) and G_y(x, y), the external energy at a point is

          E_ex(ν(s)) = −( |G_x(ν(s))|² + |G_y(ν(s))|² )

        (negative, so that minimizing it forces the curve toward strong edges)
      • External energy for the curve (total edge strength under the curve):

          E_ex = ∫₀¹ E_ex(ν(s)) ds

      Snake energy (continuous form)
      • E_total = E_in + E_ex

      Parametric curve representation (discrete case)
      • Represent the curve with a set of n points:

          ν_i = (x_i, y_i),  i = 0, …, n − 1

      Discrete approach
      • A discrete snake plus a discrete image representation allows discrete optimization (dynamic programming).
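In the discrete setting, the derivatives become finite differences and the integrals become sums over the n points. A sketch of evaluating the total energy for a closed contour; constant α, β and nearest-pixel sampling of the gradient images are simplifying assumptions:

```python
import numpy as np

def snake_energy(contour, grad_x, grad_y, alpha=1.0, beta=1.0):
    """Discrete snake energy for a closed contour of points
    nu_i = (x_i, y_i): internal term from finite-difference first and
    second derivatives, external term from the (negative) squared
    gradient magnitude sampled at the contour points."""
    pts = np.asarray(contour, dtype=float)
    # first derivative: nu_{i+1} - nu_i   (closed curve -> wrap around)
    d1 = np.roll(pts, -1, axis=0) - pts
    # second derivative: nu_{i+1} - 2 nu_i + nu_{i-1}
    d2 = np.roll(pts, -1, axis=0) - 2 * pts + np.roll(pts, 1, axis=0)
    e_in = np.sum(alpha * np.sum(d1**2, axis=1) +
                  beta * np.sum(d2**2, axis=1))
    # sample gradient images at the nearest pixel (images are [row=y, col=x])
    ix = np.clip(np.round(pts[:, 0]).astype(int), 0, grad_x.shape[1] - 1)
    iy = np.clip(np.round(pts[:, 1]).astype(int), 0, grad_x.shape[0] - 1)
    e_ex = -np.sum(grad_x[iy, ix]**2 + grad_y[iy, ix]**2)
    return e_in + e_ex
```

A contour lying on strong gradients lowers E_ex, while a smooth, short contour lowers E_in; the snake iteration moves the points to trade these off.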
