Scattered Data Interpolation for Computer Graphics


Scattered Data Interpolation for Computer Graphics. J.P. Lewis, Weta Digital; Ken Anjyo, OLM Digital; Fred Pighin (*), Google Inc. SIGGRAPH Asia 2010 Course: Scattered Data Interpolation for Computer Graphics.


  1. Comparison: Shepard’s p = 2 (comparison plot)

  2. Comparison: Shepard’s p = 5 (comparison plot)

  3. Kernel smoothing (Nadaraya-Watson):

  \hat{d}(p) = \frac{\sum_k R(p, p_k)\, d_k}{\sum_k R(p, p_k)}

  Same as Shepard’s if R(p, p_k) \equiv \|p - p_k\|^{-p}.
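A minimal sketch of the Nadaraya-Watson estimator above, with the inverse-distance kernel that makes it coincide with Shepard’s method (the 1D data points here are illustrative):

```python
import numpy as np

def shepard(x, xk, dk, p=2, eps=1e-12):
    """Nadaraya-Watson / Shepard's interpolation: a weighted average of the
    data values dk with weights R(x, xk) = ||x - xk||^(-p)."""
    r = np.abs(x - xk)
    if np.any(r < eps):            # at a data point, return its value exactly
        return dk[np.argmin(r)]
    w = r ** (-p)
    return np.sum(w * dk) / np.sum(w)

xk = np.array([0.0, 1.0, 2.0])
dk = np.array([0.0, 1.0, 0.0])
print(shepard(1.0, xk, dk))        # reproduces the data value at x = 1
print(shepard(0.5, xk, dk))        # a weighted average between samples
```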

  4. Foley and Nielsen:
  - Use Shepard’s to interpolate onto a regular grid
  - Interpolate the grid with a regular spline
  - Interpolate the residual with a second Shepard’s
  - Iterate...
  T.A. Foley and G.M. Nielson, Multivariate interpolation to scattered data using delta iteration. In E.W. Cheney, ed., Approximation Theory II, pp. 419-424, Academic Press, NY, 1980.

  5. Moving Least Squares:
  - Fit a polynomial (or other basis) independently at each point
  - Use weighted least squares; de-weight data that are far away
  - For interpolation, weights must go to infinity at the data points

  6. Moving Least Squares. Synthesis:

  \hat{d}(x) = \sum_{i=0}^{m} a(x)_i\, x^i

  Solve:

  \min_a \sum_{k=0}^{n} w(x)_k \left( \sum_{i=0}^{m} a_i x_k^i - d_k \right)^2

  m = degree of the polynomial, and

  w(x)_k = \frac{1}{\|x - x_k\|^p}

  7. Moving Least Squares:

  \min_a \sum_{k=0}^{n} w(x)_k \left( d_k - \sum_{i=0}^{m} a_i x_k^i \right)^2

  Call x_k^i \equiv b_k \in R^{m+1}, the polynomial basis evaluated at the k-th point:

  \min_a \sum_{k=0}^{n} w(x)_k \left( d_k - b_k^T a \right)^2

  Matrix version:

  \min_a \| W (B a - d) \|^2

  W is a diagonal matrix with the square roots of the w(x)_k.
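The matrix form above can be sketched directly: at each query point, solve the weighted least-squares problem min_a ||W(Ba - d)||^2 and evaluate the fitted polynomial there (the sample data and the clamp `eps` on the weights are illustrative):

```python
import numpy as np

def mls(x, xk, dk, m=1, p=2, eps=1e-9):
    """Moving least squares: fit a degree-m polynomial at query x by
    weighted least squares with w_k = 1/||x - xk||^p, then evaluate at x."""
    r = np.maximum(np.abs(x - xk), eps)        # avoid infinite weight at a data point
    w = r ** (-p)
    B = np.vander(xk, m + 1, increasing=True)  # rows b_k = (1, x_k, ..., x_k^m)
    W = np.sqrt(w)
    a, *_ = np.linalg.lstsq(W[:, None] * B, W * dk, rcond=None)
    return np.polyval(a[::-1], x)              # evaluate sum_i a_i x^i

xk = np.array([0.0, 1.0, 2.0, 3.0])
dk = np.array([0.0, 1.0, 4.0, 9.0])            # samples of x^2
print(mls(1.5, xk, dk, m=2))                   # quadratic fit recovers 2.25
```

Since the sample data lie exactly on a quadratic, the weighted fit reproduces it regardless of the weights; on general data the fit varies with the query point, which is the "moving" part.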

  8. MLS = Shepard’s when m = 0:

  \min_a \sum_{k=0}^{n} w(x)_k \left( a \cdot 1 - d_k \right)^2

  \frac{d}{da} \sum_{k} w_k \left( a^2 - 2 a d_k + d_k^2 \right) = \sum_{k} \left( 2 w_k a - 2 w_k d_k \right) = 0

  a = \frac{\sum_k w_k d_k}{\sum_k w_k}

  \hat{d}(x) = a \cdot 1

  9. Moving Least Squares: m = 1, i.e. local linear regression (plot)

  10. Moving Least Squares: m = 0, 1 comparison (plot)

  11. Moving Least Squares: m = 2, i.e. local quadratic regression (plot)

  12. Natural Neighbor Interpolation (image: Wikipedia)

  13. Natural Neighbor Interpolation. Image: N. Sukumar, Natural Neighbor Interpolation and the Natural Element Method (NEM).

  14. Notation:
  - R(x, y): symmetric, positive definite
  - R(x, y) = \phi(\|x - y\|)
  - R: the matrix version
  - R_{xy}: an element of the matrix
  - R is the kernel or covariance

  15. Gaussian Process regression (image from Generalized Stochastic Subdivision, ACM TOG, July 1987)

  16. Gaussian Process regression:

  linear estimator:  \hat{d}_t = \sum_k w_k d_{t+k}

  orthogonality:  E[(d_t - \hat{d}_t)\, d_m] = 0

  \Rightarrow E[d_t d_m] = \sum_k w_k E[d_{t+k} d_m]

  autocovariance:  E[d_t d_m] = R(t - m)

  linear system:  R(t - m) = \sum_k w_k R(t + k - m)

  Note there is no requirement on the actual spacing of the data. Related to the “Kriging” method in geology.
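For scattered samples the same orthogonality argument gives a linear system in the data autocovariance: solve K w = r(x), where K_ij = R(x_i - x_j) and r_i(x) = R(x - x_i). A sketch, assuming a Gaussian autocovariance and illustrative sample locations (note the irregular spacing):

```python
import numpy as np

def gp_interp(x, xk, dk, cov=lambda r: np.exp(-r**2)):
    """Gaussian process / Wiener interpolation: solve K w = r(x), with
    K_ij = R(x_i - x_j) and r_i(x) = R(x - x_i), then estimate w . d."""
    K = cov(xk[:, None] - xk[None, :])   # data autocovariance matrix
    r = cov(x - xk)                      # covariance of query vs. data
    w = np.linalg.solve(K, r)
    return w @ dk

xk = np.array([0.0, 0.7, 2.0, 3.5])      # irregularly spaced samples
dk = np.array([1.0, 2.0, 0.5, -1.0])
print(gp_interp(0.7, xk, dk))            # at a sample, returns its value
```

At a data point the vector r(x) equals a column of K, so the weights reduce to a unit vector and the estimator interpolates.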

  17. Comparison: Gaussian Process (comparison plot)

  18. Laplace/Poisson Interpolation, i.e. “Laplacian Splines”. Objective: minimize a roughness measure, the integrated derivative (or gradient) squared:

  \min_f \int \left( \frac{d f(x)}{dx} \right)^2 dx

  or

  \min_f \int \|\nabla f\|^2 \, ds

  (subject to some constraints, to avoid a trivial solution)

  19. Function, operator:
  - function: f(x) → y
  - operator: Mf → g, e.g. matrix-vector multiplication

  20. “Null space of the operator”:

  \min_f \int \left( \frac{d f(x)}{dx} \right)^2 dx

  gives zero for f(x) = any constant.

  21. Laplace/Poisson: solution approaches
  - direct matrix inverse
  - Jacobi (because the matrix is quite sparse)
  - Jacobi variants (SOR)
  - multigrid

  22. Laplace/Poisson: discrete, local viewpoint:

  roughness R = \int |\nabla u|^2 \, du \approx \sum_k (u_{k+1} - u_k)^2

  For a particular k:

  \frac{dR}{du_k} = \frac{d}{du_k} \left[ (u_k - u_{k-1})^2 + (u_{k+1} - u_k)^2 \right] = 2(u_k - u_{k-1}) - 2(u_{k+1} - u_k) = 0

  \Rightarrow u_{k+1} - 2 u_k + u_{k-1} = 0, i.e. \nabla^2 u = 0

  Note the 1, -2, 1 pattern.

  23. Laplace/Poisson Interpolation, discrete/matrix viewpoint: encode the derivative operator in a matrix D:

  D f = \begin{pmatrix} -1 & 1 & & \\ & -1 & 1 & \\ & & \ddots & \ddots \end{pmatrix} \begin{pmatrix} f_1 \\ f_2 \\ \vdots \end{pmatrix}

  \min_f \int \left( \frac{df}{dx} \right)^2 \approx \min_f \|D f\|^2 = \min_f f^T D^T D f

  24. Laplace/Poisson Interpolation:

  \min_f f^T D^T D f

  \frac{d}{df} \left( f^T D^T D f \right) = 2 D^T D f = 0

  i.e. \frac{d^2 f}{dx^2} = 0 or \nabla^2 f = 0

  f = 0 is a solution; the last eigenvalue is zero and corresponds to a constant solution.

  25. Discrete Laplacian. Notice:

  D^T D = \begin{pmatrix} \ddots & & & \\ 1 & -2 & 1 & \\ & 1 & -2 & 1 \\ & & & \ddots \end{pmatrix}

  Two-dimensional stencil:

  \begin{pmatrix} & 1 & \\ 1 & -4 & 1 \\ & 1 & \end{pmatrix}
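The 1, -2, 1 pattern can be checked numerically by building the forward-difference matrix D and forming D^T D. Note that the interior rows of D^T D come out as (-1, 2, -1), the same stencil up to sign, since D^T D is a discretization of the negated second derivative (the grid size here is illustrative):

```python
import numpy as np

n = 6
# Forward-difference operator: (Df)_k = f_{k+1} - f_k
D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)
L = D.T @ D
print(L[2])   # interior row: the 1, -2, 1 Laplacian stencil, negated
```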

  26. Jacobi iteration, local viewpoint. Jacobi iteration sets each f_k to the solution of its row of the matrix equation, independent of all other rows:

  \sum_c A_{kc} f_c = b_k

  \Rightarrow A_{kk} f_k = b_k - \sum_{j \ne k} A_{kj} f_j

  \Rightarrow f_k \leftarrow \frac{b_k}{A_{kk}} - \sum_{j \ne k} \frac{A_{kj}}{A_{kk}} f_j

  27. Jacobi iteration applied to the Laplace equation:

  f_{t-1} - 2 f_t + f_{t+1} = 0

  2 f_t = f_{t-1} + f_{t+1}

  f_k \leftarrow 0.5 \, (f[k-1] + f[k+1])

  In 2D:

  f[y][x] = 0.25 * ( f[y+1][x] + f[y-1][x] + f[y][x-1] + f[y][x+1] )
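The 1D update above, run to convergence on a small example with fixed endpoint values (grid size, iteration count, and boundary values are illustrative):

```python
import numpy as np

# Jacobi iteration for the 1D Laplace equation f'' = 0 with fixed endpoints.
f = np.zeros(11)
f[0], f[-1] = 0.0, 1.0                # boundary conditions
for _ in range(2000):
    g = f.copy()
    g[1:-1] = 0.5 * (f[:-2] + f[2:])  # f_k <- (f_{k-1} + f_{k+1}) / 2
    f = g
print(f)                              # converges to the linear ramp 0.0, 0.1, ..., 1.0
```

The harmonic solution in 1D is a straight line between the boundary values, which is what the iteration converges to.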

  28. But now let’s interpolate. 1D case; say f_3 is known. Three equations involve f_3. Subtract (a multiple of) f_3 from both sides of these equations:

  f_1 - 2 f_2 + f_3 = 0 \;\rightarrow\; f_1 - 2 f_2 = -f_3

  f_2 - 2 f_3 + f_4 = 0 \;\rightarrow\; f_2 + f_4 = 2 f_3

  f_3 - 2 f_4 + f_5 = 0 \;\rightarrow\; -2 f_4 + f_5 = -f_3

  One column of the matrix L is zeroed.
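A numerical sketch of Laplace interpolation with constraints. Instead of zeroing columns, this equivalent formulation keeps each known sample as an identity row u_k = value, while unknown samples get the 1, -2, 1 stencil row; the constraint locations and values here are illustrative:

```python
import numpy as np

# Laplace interpolation in 1D: u'' = 0 at unknown samples, u fixed at known ones.
n = 7
known = {0: 0.0, 3: 2.0, 6: 0.0}            # illustrative constraints
L = np.zeros((n, n))
b = np.zeros(n)
for k in range(n):
    if k in known:
        L[k, k] = 1.0                        # constrained row: u_k = value
        b[k] = known[k]
    else:
        L[k, k - 1:k + 2] = [1.0, -2.0, 1.0] # interior row: 1, -2, 1 stencil
u = np.linalg.solve(L, b)
print(u)   # piecewise linear between the constraints
```

In 1D the membrane solution is piecewise linear between constraints, which makes the result easy to check by hand.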

  29. Multigrid inpainting. Program demonstration: remove the dog’s spots. Combine Wiener filtering, to separate fur from luminance, with Laplace interpolation to adjust the luminance.

  30. Applications: Spot Removal (images). From: Lifting Detail from Darkness, SIGGRAPH 2001.

  31. Recovered fur: detail (image)

  32. Comparison: Laplace (comparison plot)

  33. Thin plate spline. Minimize the integrated second derivative squared (approximate curvature):

  \min_f \int \left( \frac{d^2 f}{dx^2} \right)^2 dx

  Null space: f = ax + c

  34. Membrane vs. Thin Plate. Left: membrane interpolation; right: thin plate (plots).

  35. Comparison: Cubic (comparison plot)

  36. Radial Basis Functions:

  \hat{d}(p) = \sum_k^N w_k R(\|p - p_k\|)

  Data at arbitrary (irregularly spaced) locations can be interpolated with a weighted sum of radial functions situated at each data point.

  37. Radial Basis Functions: History
  - Broomhead & Lowe, 1988
  - Werntges, ICNN 1993
  - in graphics: 1999-2001

  38. Radial Basis Functions: Theory
  - Micchelli: for a large class of functions, the RBF matrix is non-singular

  39. Radial Basis Functions (RBFs)
  - any monotonic function can be used?!
  - common choices:
    - Gaussian: R(r) = \exp(-r^2/\sigma^2)
    - Thin plate spline: R(r) = r^2 \log r
    - Hardy multiquadric: R(r) = \sqrt{r^2 + c^2}, \; c > 0
  Notice: the last two increase as a function of radius.
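The three kernels listed above as plain functions; σ and c are free shape parameters, and the thin plate spline’s removable singularity at r = 0 is handled explicitly:

```python
import numpy as np

def gaussian(r, sigma=1.0):
    return np.exp(-r**2 / sigma**2)

def thin_plate(r):
    # r^2 log r, defined as 0 at r = 0 (the limit)
    return np.where(r > 0, r**2 * np.log(np.maximum(r, 1e-300)), 0.0)

def multiquadric(r, c=1.0):
    return np.sqrt(r**2 + c**2)

r = np.array([0.0, 1.0, 2.0])
print(gaussian(r), thin_plate(r), multiquadric(r))
```

Evaluating at increasing r makes the remark above concrete: the Gaussian decays, while the thin plate spline and multiquadric grow with radius.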

  40. Comparison: RBF-Gauss (comparison plot)

  41. Comparison: RBF-Gauss (comparison plot)

  42. Comparison: RBF-Gauss (comparison plot)

  43. Radial Basis Functions:

  \hat{d}(p) = \sum_k^N w_k R(\|p - p_k\|)

  e = \|d - Rw\|^2 = (d - Rw)^T (d - Rw)

  \frac{de}{dw} = -2 R^T (d - Rw) = 0

  w = R^{-1} d
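The derivation above in code: build the kernel matrix, solve R w = d, and evaluate the interpolant. The Gaussian kernel and the sample data are illustrative choices:

```python
import numpy as np

def rbf_fit(xk, dk, kern):
    """Solve R w = d, where R_ij = kern(||x_i - x_j||)."""
    R = kern(np.abs(xk[:, None] - xk[None, :]))
    return np.linalg.solve(R, dk)

def rbf_eval(x, xk, w, kern):
    """d_hat(x) = sum_k w_k kern(||x - x_k||)."""
    return kern(np.abs(x - xk)) @ w

kern = lambda r: np.exp(-r**2)          # Gaussian RBF, sigma = 1
xk = np.array([0.0, 1.0, 2.5, 4.0])
dk = np.array([0.0, 1.0, -1.0, 0.5])
w = rbf_fit(xk, dk, kern)
print([rbf_eval(x, xk, w, kern) for x in xk])   # reproduces dk at the data
```

Micchelli’s result (slide 38) guarantees that for kernels like the Gaussian this matrix is non-singular for distinct data points, so the solve is well defined.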

  44. Radial Basis Functions:

  \begin{pmatrix} R_{1,1} & R_{1,2} & R_{1,3} & \cdots \\ R_{2,1} & R_{2,2} & & \cdots \\ R_{3,1} & & & \cdots \\ \vdots & & & \ddots \end{pmatrix} \begin{pmatrix} w_1 \\ w_2 \\ w_3 \\ \vdots \end{pmatrix} = \begin{pmatrix} d_1 \\ d_2 \\ d_3 \\ \vdots \end{pmatrix}

  45. RBF: multidimensional interpolation

  w_x = R^{-1} d_x, \quad w_y = R^{-1} d_y, \quad w_z = R^{-1} d_z

  The matrix R is common to all dimensions.

  46. Normalized Radial Basis Function:

  R_i() \Leftarrow \frac{R(\|x - x_i\|)}{\sum_j R(\|x - x_j\|)}

  - removes the “dips” that result from a too-narrow \sigma
  - i.e. somewhat less sensitive to the choice of \sigma
  - (for a decaying kernel) far from the data, the closest point dominates
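A sketch of the normalized basis above: each kernel is divided by the sum of all kernels at the evaluation point, so the basis sums to one everywhere. With a deliberately narrow Gaussian (the σ and sample points are illustrative), evaluating away from the data shows the nearest point dominating:

```python
import numpy as np

def nrbf_basis(x, xk, kern):
    """Normalized RBF basis: R_i(x) = kern(|x - x_i|) / sum_j kern(|x - x_j|)."""
    R = kern(np.abs(x - xk))
    return R / np.sum(R)

kern = lambda r: np.exp(-r**2 / 0.1)   # deliberately narrow sigma
xk = np.array([0.0, 1.0, 2.0])
print(nrbf_basis(3.0, xk, kern))        # away from the data: nearest point dominates
```

By construction the basis is a partition of unity, which is what removes the dips between too-narrow kernels.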

  47. Normalized Radial Basis Function (plot)

  48. Insensitive to sigma, up to a point... (plots)

  49. Comparison: Shepard’s p = 1 (comparison plot)

  50. Comparison: Shepard’s p = 2 (comparison plot)

  51. Comparison: Moving Least Squares, linear polynomial (comparison plot)

  52. Comparison: Moving Least Squares, quadratic polynomial (comparison plot)

  53. Comparison: Gaussian Process (comparison plot)

  54. Comparison: Laplace (comparison plot)

  55. Comparison: RBF-Gauss Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process 59 / 84 regression Gaussian Process regression

  56. Comparison: RBF-Gauss Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process 60 / 84 regression Gaussian Process regression

  57. Comparison: RBF-Gauss Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process 61 / 84 regression Gaussian Process regression

  58. Comparison: Normalized RBF-Gauss Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process 62 / 84 regression Gaussian Process regression

  59. Comparison: Cubic (i.e. RBF-Thin plate) Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process 63 / 84 regression Gaussian Process regression

  60. Comparison: Cubic + regularization Euclidean invariant Shepard Interpolation Comparison: Shepard’s p = 1 Comparison: Shepard’s p = 2 Comparison: Shepard’s p = 5 Kernel smoothing Foley and Nielsen Moving Least Squares Moving Least Squares Moving Least Squares MLS = Shepard’s when m = 0 Moving Least Squares Moving Least Squares Moving Least Squares Natural Neighbor Interpolation Natural Neighbor Interpolation Notation Gaussian Process 64 / 84 regression Gaussian Process regression
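The RBF-Gauss and Normalized RBF-Gauss variants compared above differ only in whether each Gaussian basis function is divided by the sum of all of them before solving for the weights. A minimal 1D sketch under that reading (function and variable names are illustrative, not from the course):

```python
import numpy as np

def fit_rbf_gauss(x_data, d_data, sigma, normalized=False):
    """Gaussian RBF interpolation; optionally normalize the basis
    functions so they sum to one at every point (Normalized RBF)."""
    def basis(xq):
        P = np.exp(-((np.atleast_1d(xq)[:, None] - x_data[None, :]) ** 2)
                   / (2.0 * sigma ** 2))
        if normalized:
            P = P / P.sum(axis=1, keepdims=True)
        return P

    # Solve the interpolation system: basis at the data points times
    # the weights must reproduce the data values exactly.
    w = np.linalg.solve(basis(x_data), d_data)
    return lambda xq: basis(xq) @ w

x = np.array([0.0, 1.0, 2.0, 3.0])
d = np.array([0.0, 1.0, 0.5, 2.0])
f = fit_rbf_gauss(x, d, sigma=0.7)                   # plain RBF-Gauss
g = fit_rbf_gauss(x, d, sigma=0.7, normalized=True)  # normalized variant
```

Both variants interpolate the data exactly; they differ in behavior between and beyond the data points, which is what the comparison figures illustrate.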

  61. Approximation rather than interpolation

Find w to minimize (Rw − b)^T (Rw − b). If the training points are very close together, the corresponding columns of R are nearly parallel. This is difficult to control if the points are chosen by a user. Add a term that keeps the weights small: w^T w.

    minimize  (Rw − b)^T (Rw − b) + λ w^T w

Setting the gradient to zero:

    2 R^T (Rw − b) + 2λ w = 0
    R^T R w + λ w = R^T b
    (R^T R + λ I) w = R^T b
    w = (R^T R + λ I)^{−1} R^T b
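The closed form above is a one-liner in NumPy. The sketch below also shows why regularization helps: two nearly coincident training points make the kernel matrix almost singular, producing huge oscillating weights unless λ > 0. (The data values are made up for illustration; writing the penalty as λI vs. 2λI only rescales the regularization parameter.)

```python
import numpy as np

def regularized_weights(R, b, lam=0.0):
    """Solve (R^T R + lam I) w = R^T b.

    lam = 0 recovers the unregularized least-squares / interpolation
    weights; lam > 0 trades exact interpolation for small weights."""
    n = R.shape[1]
    return np.linalg.solve(R.T @ R + lam * np.eye(n), R.T @ b)

# Two nearly coincident points -> nearly parallel columns of R.
x = np.array([0.0, 1.0, 1.0001, 2.0])
d = np.array([0.0, 1.0, 1.2, 0.5])          # illustrative values
R = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2.0 * 0.5 ** 2))

w_exact = regularized_weights(R, d, lam=0.0)   # huge, oscillating
w_reg = regularized_weights(R, d, lam=0.01)    # small, well-behaved
```

The regularized weights are orders of magnitude smaller, at the cost of no longer passing exactly through the data.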

  62. Comparison: Cubic + regularization (figure)

  63. Regularization

(Figure) Ill-conditioning and regularization. The regularization parameter is 0, 0.01, and 0.1 respectively; note that the vertical scale changes between plots.

  64. Relation between Laplace, Thin-Plate, RBF

2D thin-plate interpolation:

    d̂(p) = Σ_k w_k R(‖p − p_k‖),  with R(r) = r² log(r).
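The thin-plate formula above maps directly onto a small RBF solve. This sketch uses the pure-RBF form shown on the slide, which omits the low-degree polynomial term usually added to the thin-plate system for well-posedness; for generic point sets the plain kernel matrix is still invertible (function names are illustrative):

```python
import numpy as np

def thin_plate_kernel(r):
    """R(r) = r^2 log(r), with R(0) = 0 (the limit as r -> 0)."""
    out = np.zeros_like(r)
    nz = r > 0
    out[nz] = r[nz] ** 2 * np.log(r[nz])
    return out

def fit_thin_plate(points, values):
    """Solve for weights w_k so d(p) = sum_k w_k R(|p - p_k|)
    reproduces the data values at the data points."""
    r = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    return np.linalg.solve(thin_plate_kernel(r), values)

def eval_thin_plate(p, points, w):
    r = np.linalg.norm(np.atleast_2d(p)[:, None, :] - points[None, :, :],
                       axis=-1)
    return thin_plate_kernel(r) @ w

points = np.array([[0.0, 0.0], [1.3, 0.2], [0.4, 1.1], [2.0, 1.7]])
values = np.array([0.0, 1.0, 0.5, 2.0])
w = fit_thin_plate(points, values)
```

The direct solve here is O(n³), which motivates the multigrid and fast-multipole alternatives discussed next.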

  65. Solving thin-plate interpolation

- If there are few known points: use RBF.
- If there are many points: use multigrid instead.
- But Carr, Beatson et al. (SIGGRAPH 2001) use the fast multipole method (FMM) for RBF with large numbers of points.

  66. Break
