  1. Wavelets for Surface Reconstruction. Josiah Manson, Guergana Petrova, Scott Schaefer

  2. Convert Points to an Indicator Function

  3. Data Acquisition

  4. Properties of Wavelets: comparison of Fourier series vs. wavelets in terms of representing all functions, locality, and smoothness (for wavelets, smoothness depends on the chosen wavelet)

  5. Wavelet Bases: Haar and D4, showing the scaling function $\phi(x)$ and the wavelet $\psi(x)$ for each
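
The Haar pair referenced on this slide is simple enough to write down directly. A minimal sketch (standard definitions, not taken from the slides): $\phi$ is the box on $[0,1)$ and $\psi$ is the up-down step on the same interval; the D4 (Daubechies-4) basis has no closed form and is usually tabulated or generated by cascade iteration.

```python
import numpy as np

def haar_phi(x):
    """Haar scaling function: 1 on [0, 1), 0 elsewhere."""
    x = np.asarray(x, dtype=float)
    return np.where((x >= 0.0) & (x < 1.0), 1.0, 0.0)

def haar_psi(x):
    """Haar wavelet: +1 on [0, 0.5), -1 on [0.5, 1), 0 elsewhere."""
    x = np.asarray(x, dtype=float)
    return np.where((x >= 0.0) & (x < 0.5), 1.0,
                    np.where((x >= 0.5) & (x < 1.0), -1.0, 0.0))
```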

  6. Example of a Function Using Wavelets: $f(x) = \sum_k c_k\,\phi(x - k) + \sum_{j,k} c_{j,k}\,\psi(2^j x - k)$

  7. Example of a Function Using Wavelets: $f(x) = \sum_k c_k\,\phi(x - k)$ (scaling-function terms only)

  8. Example of a Function Using Wavelets: $f(x) = \sum_k c_k\,\phi(x - k) + \sum_{j = 0,\,k} c_{j,k}\,\psi(2^j x - k)$

  9. Example of a Function Using Wavelets: $f(x) = \sum_k c_k\,\phi(x - k) + \sum_{j \in \{0,1\},\,k} c_{j,k}\,\psi(2^j x - k)$

  10. Example of a Function Using Wavelets: $f(x) = \sum_k c_k\,\phi(x - k) + \sum_{j \in \{0,1,2\},\,k} c_{j,k}\,\psi(2^j x - k)$
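
To make the build-up on slides 6 through 10 concrete, here is a small, self-contained sketch that evaluates the expansion with the Haar pair (redefined inline). The coefficient values are invented purely for illustration; each added level $j$ contributes detail at twice the previous resolution.

```python
import numpy as np

# Haar pair, inlined for completeness.
phi = lambda x: ((x >= 0.0) & (x < 1.0)).astype(float)
psi = lambda x: ((x >= 0.0) & (x < 0.5)).astype(float) - ((x >= 0.5) & (x < 1.0)).astype(float)

def evaluate(x, scaling_coeffs, wavelet_coeffs):
    """scaling_coeffs: {k: c_k}; wavelet_coeffs: {(j, k): c_{j,k}}."""
    f = np.zeros_like(x, dtype=float)
    for k, c in scaling_coeffs.items():
        f += c * phi(x - k)
    for (j, k), c in wavelet_coeffs.items():
        f += c * psi(2.0**j * x - k)
    return f

x = np.linspace(0.0, 4.0, 1000)
coarse = {0: 1.0, 1: 0.5, 2: -0.3, 3: 0.8}           # c_k (made up)
details = {(0, 0): 0.4, (1, 3): -0.2, (2, 9): 0.1}   # c_{j,k} (made up)
f_coarse = evaluate(x, coarse, {})        # slide 7: scaling terms only
f_fine = evaluate(x, coarse, details)     # slides 8-10: detail added level by level
```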

  11. Strategy • Estimate the wavelet coefficients of the indicator function • Use only a local combination of samples to find each coefficient

  12. Computing the Indicator Function [Kazhdan 2005]: $\chi(x) = \sum_k c_k\,\phi(x - k) + \sum_{j,k} c_{j,k}\,\psi(2^j x - k)$

  13. Computing the Indicator Function [Kazhdan 2005]: the wavelet coefficients are $c_{j,k} = \int_{\mathbb{R}^3} \chi(x)\,\psi(2^j x - k)\,dx$

  14. Computing the Indicator Function [Kazhdan 2005]: $c_{j,k} = \int_{\mathbb{R}^3} \chi(x)\,\psi(2^j x - k)\,dx = \int_{M} \psi(2^j x - k)\,dx$, where $M$ is the solid bounded by the surface ($\chi = 1$ inside $M$, $0$ outside)

  15. Computing the Indicator Function [Kazhdan 2005]: Divergence Theorem: $\int_{M} \nabla \cdot \vec F(x)\,dx = \int_{\partial M} \vec F(p) \cdot \vec n(p)\,dp$

  16. Computing the Indicator Function [Kazhdan 2005]: choose $\vec F_{j,k}$ with $\nabla \cdot \vec F_{j,k}(x) = \psi(2^j x - k)$, so the volume integral for $c_{j,k}$ becomes a surface integral over $\partial M$

  17. Computing the Indicator Function [Kazhdan 2005]: $c_{j,k} = \int_{\partial M} \vec F_{j,k}(p) \cdot \vec n(p)\,dp$

  18. Computing the Indicator Function [Kazhdan 2005]: $c_{j,k} = \int_{\partial M} \vec F_{j,k}(p) \cdot \vec n(p)\,dp \approx \sum_i \vec F_{j,k}(p_i) \cdot \vec n_i\,d_i$, summing over the oriented samples $p_i$ with normals $\vec n_i$ and area weights $d_i$
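
A sketch of the final approximation on slide 18, under the assumption that the input is a set of oriented samples $p_i$ with unit normals $\vec n_i$ and per-sample area weights $d_i$, and that `F_jk` is a placeholder callable evaluating the vector field $\vec F_{j,k}$ constructed on the following slides:

```python
import numpy as np

def coefficient(points, normals, areas, F_jk):
    """points: (N, 3); normals: (N, 3) unit normals; areas: (N,) weights d_i;
    F_jk: callable returning the (N, 3) field F_{j,k} evaluated at the points."""
    dots = np.sum(F_jk(points) * normals, axis=1)   # F_{j,k}(p_i) . n_i
    return float(np.sum(dots * areas))              # sum_i (F . n_i) * d_i
```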

  19. Finding $F(x)$ whose derivative is the wavelet $\psi(2^j x - k)$

  20. Finding $F(x)$: require $\frac{d}{dx} F(x) = \psi(2^j x - k)$

  21. Finding $F(x)$: $\frac{d}{dx} F(x) = \psi(2^j x - k)$, so $F(x) = \int_{-\infty}^{x} \psi(2^j s - k)\,ds$
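
A one-dimensional numerical sketch of slide 21 (my own numerics, not the paper's code): accumulate the integral of $\psi(2^j s - k)$ from the left so that $F'(x) = \psi(2^j x - k)$. Because the wavelet integrates to zero over its support, $F$ returns to zero outside that support, which is what keeps the coefficient computation local.

```python
import numpy as np

def F_of_x(psi, j, k, xs):
    """Numerically integrate psi(2^j s - k) up to each x in the sorted grid xs."""
    vals = psi(2.0**j * xs - k)
    dx = np.diff(xs)
    increments = 0.5 * (vals[1:] + vals[:-1]) * dx      # trapezoid rule
    return np.concatenate([[0.0], np.cumsum(increments)])

haar_psi = lambda x: ((x >= 0) & (x < 0.5)).astype(float) - ((x >= 0.5) & (x < 1)).astype(float)
xs = np.linspace(-1.0, 3.0, 2001)
F = F_of_x(haar_psi, j=1, k=0, xs=xs)   # rises, then returns to 0 outside the support
```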

  22. Extracting the Surface: coefficients → indicator function $\chi(x) = \sum_k c_k\,\phi(x - k) + \sum_{j,k} c_{j,k}\,\psi(2^j x - k)$ → dual marching cubes → surface
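
The slides contour the indicator function with dual marching cubes; as a rough stand-in, the sketch below samples a reconstructed indicator $\chi$ (assumed here to be a callable on grid coordinates) onto a uniform grid and extracts an isosurface with scikit-image's standard marching cubes.

```python
import numpy as np
from skimage import measure

def extract_surface(chi, res=128, lo=-1.0, hi=1.0, iso=0.5):
    """chi: callable chi(X, Y, Z) -> indicator values on a grid (assumption)."""
    axis = np.linspace(lo, hi, res)
    X, Y, Z = np.meshgrid(axis, axis, axis, indexing="ij")
    volume = chi(X, Y, Z)                                  # sampled indicator
    verts, faces, _, _ = measure.marching_cubes(volume, level=iso)
    verts = lo + verts * (hi - lo) / (res - 1)             # voxel -> world coords
    return verts, faces
```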

  23. Smoothing the Indicator Function: Haar, unsmoothed

  24. Smoothing the Indicator Function: Haar unsmoothed vs. Haar smoothed
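
The slides do not spell out the smoothing filter. One simple way to remove the blocky artifacts of the piecewise-constant Haar reconstruction (the unsmoothed vs. smoothed comparison above) is to average the sampled indicator over a small neighborhood before contouring, as in this sketch; the choice of box filter here is mine, not necessarily the paper's.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def smooth_indicator(volume, size=3):
    """volume: 3D array of indicator samples; size: box-filter width in voxels."""
    return uniform_filter(volume.astype(float), size=size)
```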

  25. Comparison of Wavelet Bases: Haar vs. D4

  26. Advantages of Wavelets • Coefficients are calculated only near the surface – fast – low memory • Multi-resolution representation • Out-of-core computation is possible

  27. Streaming Pipeline [pipeline diagram: input → output]
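
A heavily simplified sketch of the out-of-core idea (my own simplification, not the paper's actual pipeline): read the samples in chunks and accumulate each chunk's contribution to the coefficient sums, so the full point set never has to be resident in memory at once.

```python
import numpy as np

def stream_coefficients(chunks, fields):
    """chunks: iterable of (points, normals, areas) arrays read from disk in pieces;
    fields: {(j, k): F_jk callable}. Accumulates each c_{j,k} one chunk at a time."""
    coeffs = {key: 0.0 for key in fields}
    for points, normals, areas in chunks:
        for key, F_jk in fields.items():
            dots = np.sum(F_jk(points) * normals, axis=1)
            coeffs[key] += float(np.sum(dots * areas))
    return coeffs
```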

  28. Results: Michelangelo's Barbuto, 329 million points (7.4 GB of data), 329 MB memory, 112 minutes

  29. Results: Michelangelo's Awakening, 381 million points (8.5 GB), 573 MB memory, 81 minutes; produced 590 million polygons

  30. Results: Michelangelo's Atlas, 410 million points (9.15 GB), 1188 MB memory, 98 minutes; produced 642 million polygons

  31. Results: Michelangelo's Atlas, 410 million points (9.15 GB), 1188 MB memory, 98 minutes; produced 642 million polygons

  32. Robustness to Noise in Normals: 0°, 30°, 60°, 90°

  33. Comparison of Methods
      Method    Time      Memory
      Poisson   289 sec    57 MB
      Haar       17 sec    13 MB
      MPU       551 sec   750 MB
      D4         82 sec    43 MB

  34. Relative Hausdorff Errors [bar chart: scaled error (0 to 1) for MPU, Poisson, Haar, Haar Smooth, D4, and D4 Smooth on the models armadilloman, happy buddha, dragon, elephant2, hand, malaysia, tall, teeth, and venus2]

  35. Conclusions • Wavelets provide a trade-off between speed and quality • Works with all orthogonal wavelets • Guarantees a closed, manifold surface • Runs out of core
