Fast and Robust Normal Estimation for Point Clouds with Sharp Features
Alexandre Boulch & Renaud Marlet
Université Paris-Est, LIGM (UMR CNRS), École des Ponts ParisTech
Symposium on Geometry Processing 2012
1/37
Outline: Normal estimation for point clouds, Our method, Experiments, Conclusion 2/37
Section: Normal estimation for point clouds 3/37
Data
Point clouds from photogrammetry or laser acquisition:
◮ may be noisy
◮ may have outliers
◮ most often have sharp features
◮ may be anisotropic
◮ may be huge (more than 20 million points)
[Figures contrast undesired vs. desired behavior: sensitivity vs. robustness to noise, a regression plane at P, smoothed vs. preserved sharp features, sensitivity vs. robustness to anisotropy.]
4/37
Section: Our method 5/37
Basics of the method (2D case here for readability)
Let P be a point and N_P its neighborhood. We consider two cases:
◮ P lies on a planar surface
◮ P lies next to a sharp feature
In the second case, the neighborhood splits into two parts N_1 and N_2, one on each side of the feature. If Area(N_1) > Area(N_2), picking a pair of points in N_1 × N_1 is more probable than in N_2 × N_2, while a pair in N_1 × N_2 leads to a "random" normal.
6/37
Basics of the method (2D case here for readability)
Main idea: draw as many primitives as necessary to estimate the normal distribution, and then the most probable normal.
◮ Discretize the problem
◮ Fill a Hough accumulator
◮ Select the best normal (see the sketch below)
N.B. We compute the normal direction, not its orientation.
7/37
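A minimal Python sketch of this pipeline under simplifying assumptions: a naive longitude/colatitude binning instead of the equal-area accumulator used in the talk, no rotated accumulators, and illustrative parameter values (n_draws, n_bins). It is not the authors' implementation.

```python
import numpy as np

def triplet_normal(p1, p2, p3):
    """Unit normal direction of the plane through three points."""
    n = np.cross(p2 - p1, p3 - p1)
    norm = np.linalg.norm(n)
    if norm < 1e-12:                       # degenerate (collinear) triplet
        return None
    n = n / norm
    return n if n[2] >= 0 else -n          # direction only: fold onto a half-sphere

def estimate_normal(neighbors, n_draws=700, n_bins=(36, 18), rng=None):
    """Vote triplet normals into a (longitude, colatitude) accumulator and
    return the mean of the most voted bin."""
    rng = rng or np.random.default_rng()
    acc = {}
    for _ in range(n_draws):
        i, j, k = rng.choice(len(neighbors), size=3, replace=False)
        n = triplet_normal(neighbors[i], neighbors[j], neighbors[k])
        if n is None:
            continue
        phi = np.arctan2(n[1], n[0])                 # longitude in (-pi, pi]
        theta = np.arccos(np.clip(n[2], -1.0, 1.0))  # colatitude in [0, pi/2]
        b = (int((phi + np.pi) / (2 * np.pi) * n_bins[0]) % n_bins[0],
             min(int(theta / np.pi * n_bins[1]), n_bins[1] - 1))
        acc.setdefault(b, []).append(n)
    votes = max(acc.values(), key=len)               # most voted bin
    m = np.mean(votes, axis=0)
    return m / np.linalg.norm(m)
```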
Robust Randomized Hough Transform
Notation:
◮ T, number of primitives picked after T iterations
◮ T_min, number of primitives to pick
◮ M, number of bins of the accumulator
◮ \hat{p}_m, empirical mean of bin m
◮ p_m, theoretical mean of bin m
8/37
Robust Randomized Hough Transform
Global upper bound: pick T_min such that
P( \max_{m \in \{1,\dots,M\}} |\hat{p}_m - p_m| \le \delta ) \ge \alpha
From Hoeffding's inequality, for a given bin:
P( |\hat{p}_m - p_m| \ge \delta ) \le 2 \exp(-2 \delta^2 T_{min})
Considering the whole accumulator (union bound over the M bins):
T_{min} \ge \frac{1}{2\delta^2} \ln\left( \frac{2M}{1-\alpha} \right)
9/37
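As a quick numerical illustration of the bound (the values of delta, alpha and M below are arbitrary, not taken from the talk):

```python
from math import ceil, log

def t_min(delta, alpha, n_bins):
    """Hoeffding / union-bound lower bound on the number of primitives to draw."""
    return ceil(log(2 * n_bins / (1 - alpha)) / (2 * delta ** 2))

# Illustrative values only:
print(t_min(delta=0.05, alpha=0.95, n_bins=100))  # -> 1659
```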
Robust Randomized Hough Transform
Confidence interval: if the same bin keeps winning, we want to stop drawing primitives early. From the Central Limit Theorem, we can stop as soon as
\hat{p}_{m_1} - \hat{p}_{m_2} \ge 2 \sqrt{\tfrac{1}{T}}
where m_1 and m_2 are the two most voted bins, i.e. as soon as their confidence intervals no longer intersect (confidence level 95%).
10/37
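A minimal sketch of this early-stopping test, assuming counts is a sequence of per-bin vote counts and total_votes is the number of primitives drawn so far (names are illustrative):

```python
import math

def can_stop(counts, total_votes):
    """Stop when the 95% confidence intervals of the two most voted bins
    no longer intersect, i.e. p1_hat - p2_hat >= 2 * sqrt(1 / T)."""
    if total_votes < 2 or len(counts) < 2:
        return False
    top = sorted(counts, reverse=True)[:2]
    p1_hat, p2_hat = top[0] / total_votes, top[1] / total_votes
    return (p1_hat - p2_hat) >= 2.0 * math.sqrt(1.0 / total_votes)
```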
Accumulator
Our primitives are plane directions (defined by two angles). We use the accumulator of Borrmann et al. (3D Research, 2011):
◮ fast computation
◮ bins of similar area
11/37
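For intuition only, here is one simple way to get bins of roughly similar area on the half-sphere of directions: colatitude bands, each with a number of longitude cells proportional to its spherical area. This is a simplification for illustration, not the actual accumulator design of Borrmann et al.

```python
import numpy as np

def build_bands(n_theta=16):
    """Colatitude bands on the half-sphere; each band gets a number of longitude
    cells proportional to its spherical area, so bins are roughly equal-area."""
    edges = np.linspace(0.0, np.pi / 2, n_theta + 1)          # colatitude edges
    areas = np.cos(edges[:-1]) - np.cos(edges[1:])            # band areas (up to 2*pi)
    cells = np.maximum(1, np.round(areas / areas.max() * 4 * n_theta)).astype(int)
    return edges, cells

def bin_index(n, edges, cells):
    """Map a unit direction with n[2] >= 0 to a (band, cell) bin."""
    theta = np.arccos(np.clip(n[2], 0.0, 1.0))
    band = min(np.searchsorted(edges, theta, side="right") - 1, len(cells) - 1)
    phi = np.arctan2(n[1], n[0]) + np.pi                      # in [0, 2*pi]
    cell = min(int(phi / (2 * np.pi) * cells[band]), cells[band] - 1)
    return band, cell
```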
Discretization issues
The use of a discrete accumulator may be a source of error: votes for the true normal can be split over adjacent bins.
Solution: iterate the algorithm using randomly rotated accumulators.
12/37
Normal selection
From the normal directions obtained with the rotated accumulators, three selections are considered:
◮ best confidence
◮ mean over the best cluster
◮ mean over all the normals
13/37
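A minimal sketch of the "mean over all the normals" option, assuming normals is the list of directions returned by the rotated runs; since only the direction (not the orientation) is defined, each vector is first flipped to agree with the first one. Illustrative code, not the talk's implementation.

```python
import numpy as np

def mean_direction(normals):
    """Average unit directions (orientation-free): flip each vector into the
    hemisphere of the first one, then average and renormalize."""
    normals = np.asarray(normals, dtype=float)
    ref = normals[0]
    aligned = np.where((normals @ ref)[:, None] < 0, -normals, normals)
    m = aligned.mean(axis=0)
    return m / np.linalg.norm(m)
```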
Dealing with anisotropy
Robustness to anisotropy depends on how we select the planes (triplets of points).
[Figures contrast sensitivity vs. robustness to anisotropy.]
14/37
Dealing with anisotropy: random point selection among nearest neighbors
The triplets are randomly selected among the K nearest neighbors. Fast, but cannot deal with anisotropy (sketch below).
15/37
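A minimal sketch of this selection strategy, assuming a SciPy k-d tree for the K-nearest-neighbor query; k, n_triplets and the surrounding names are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_triplets(points, p, k=50, n_triplets=100, rng=None):
    """Draw triplets uniformly among the K nearest neighbors of p.
    Fast, but the draws inherit the anisotropy of the sampling."""
    rng = rng or np.random.default_rng()
    tree = cKDTree(points)
    _, idx = tree.query(p, k=k)
    return [points[rng.choice(idx, size=3, replace=False)]
            for _ in range(n_triplets)]
```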
Dealing with anisotropy: uniform point selection on the neighborhood ball
◮ Pick a location Q uniformly in the neighborhood ball of P
◮ Consider a small ball around Q
◮ Pick a cloud point randomly in the small ball
◮ Iterate to get a triplet (sketch below)
Deals with anisotropy, but at a high computation cost.
16/37
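A minimal sketch of this sampling scheme, assuming a SciPy k-d tree for the range queries; radius, small_radius and the retry limit are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def uniform_ball_triplet(points, p, radius, small_radius, rng=None):
    """Draw three cloud points by sampling uniform locations in the
    neighborhood ball of p and taking a cloud point near each location.
    Robust to anisotropy, but each draw needs an extra range query."""
    rng = rng or np.random.default_rng()
    tree = cKDTree(points)
    triplet, attempts = [], 0
    while len(triplet) < 3 and attempts < 1000:
        attempts += 1
        v = rng.normal(size=3)                                         # uniform direction
        v *= radius * rng.random() ** (1.0 / 3.0) / np.linalg.norm(v)  # uniform radius in the ball
        candidates = tree.query_ball_point(p + v, small_radius)
        if candidates:                               # some cloud point lies in the small ball
            triplet.append(points[rng.choice(candidates)])
    return np.array(triplet) if len(triplet) == 3 else None
```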
Dealing with anisotropy: cube discretization of the neighborhood ball
◮ Discretize the neighborhood ball of P into small cubes
◮ Pick a non-empty cube at random
◮ Pick a point randomly in this cube
◮ Iterate to get a triplet (sketch below)
Good compromise between speed and robustness to anisotropy.
17/37
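A minimal sketch of this strategy; the grid resolution n_cells and the surrounding names are illustrative, and duplicate picks are not filtered out.

```python
import numpy as np
from collections import defaultdict

def cube_triplet(neighbors, p, radius, n_cells=8, rng=None):
    """Discretize the neighborhood ball into a small voxel grid, pick a
    non-empty voxel uniformly, then a point uniformly inside it.
    Decouples the draw from the local point density at low cost."""
    rng = rng or np.random.default_rng()
    cell = 2.0 * radius / n_cells
    grid = defaultdict(list)
    for q in neighbors:
        key = tuple(np.floor((q - p + radius) / cell).astype(int))
        grid[key].append(q)
    cells = list(grid.values())
    triplet = []
    for _ in range(3):
        bucket = cells[rng.integers(len(cells))]     # uniform over occupied voxels
        triplet.append(bucket[rng.integers(len(bucket))])
    return np.array(triplet)
```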
Section: Experiments 18/37
Methods used for comparison
◮ Regression
  ◮ Hoppe et al. (SIGGRAPH 1992): plane fitting
  ◮ Cazals & Pouget (SGP 2003): jet fitting
◮ Voronoi diagram
  ◮ Dey & Goswami (SCG 2004): NormFet
◮ Sample consensus models
  ◮ Li et al. (Computers & Graphics 2010)

                 Plane fitting   Jet fitting   NormFet   Sample Consensus
Noise                  ✓              ✓                         ✓
Outliers                                                        ✓
Sharp features                                    ✓             ✓
Anisotropy                                        ✓
Fast                   ✓              ✓           ✓
19/37
Precision
Two error measures:
◮ Root Mean Square (RMS):
RMS = \sqrt{ \frac{1}{|C|} \sum_{P \in C} \angle(n_{P,ref}, n_{P,est})^2 }
◮ Root Mean Square with threshold (RMS_\tau), more suited for sharp features:
RMS_\tau = \sqrt{ \frac{1}{|C|} \sum_{P \in C} v_P^2 }
where v_P = \angle(n_{P,ref}, n_{P,est}) if \angle(n_{P,ref}, n_{P,est}) < \tau, and \pi/2 otherwise.
20/37
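A minimal sketch of both measures, assuming ref and est are (n, 3) arrays of unit normals and that only the unoriented angle matters (hence the absolute dot product):

```python
import numpy as np

def rms_errors(ref, est, tau):
    """RMS and thresholded RMS_tau of the angles between reference and
    estimated normals (directions only, so angles lie in [0, pi/2])."""
    ref, est = np.asarray(ref, float), np.asarray(est, float)
    cos = np.clip(np.abs(np.sum(ref * est, axis=1)), 0.0, 1.0)
    ang = np.arccos(cos)
    rms = np.sqrt(np.mean(ang ** 2))
    capped = np.where(ang < tau, ang, np.pi / 2)   # saturate large errors at pi/2
    rms_tau = np.sqrt(np.mean(capped ** 2))
    return rms, rms_tau
```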
Visualization of the error measures: same RMS, different RMS_\tau. 21/37
Precision (with noise): precision for a uniformly sampled cube, as a function of noise. 22/37
Precision (with noise and anisotropy): precision for a corner with anisotropy, as a function of noise. 23/37
Computation time: computation time for a sphere, as a function of the number of points. 24/37
Robustness to outliers Noisy model (0.2%) + 100% of outliers. 25/37
Robustness to outliers Noisy model (0.2%) + 200% of outliers. 26/37
Robustness to anisotropy 27/37
Preservation of sharp features 28/37
Robustness to “natural” noise, outliers and anisotropy Point cloud created by photogrammetry. 29/37
Section: Conclusion 30/37