  1. PASEO
     The SpaceFusion* project: applications to remote sensing and 3D topographic reconstruction
     André Jalobeanu, PASEO Research Group, MIV team @ LSIIT, Illkirch, France
     *Projet ANR “Jeunes Chercheurs” 2006-2008

  2. Outline
     • Introduction
     • Objectives
       • Remote sensing: reflectance
       • Small bodies: geometry
       • Earth, planets: reflectance and topography
     • Proposed approach
       • Bayesian inference from multiple observations
       • Accurate forward modeling
     • Preliminary results: 1D and 2D in astronomy
     • Extensions
       • Deformation fields
       • Non-optical signal fusion

  3. Why multisource data fusion?
     Problem: lots of data, same object! Images are recorded with various:
     • positions and orientations (pose)
     • sensors (resolution, noise, bad pixels)
     • observing conditions (transparency, haze)
     • instruments (blur, distortions)
     Multisource data fusion:
     • Optimally combine all observations into a single object
     • Preserve all the information from the original data set
       • Increase resolution if needed
       • Compute the uncertainties
       • Reconstruct the 3D geometry if required
     • Enhance the object quality (optional): denoise or deblur depending on the degradation

  4. The SpaceFusion project
     Projet ANR “Jeunes Chercheurs 2005” (French Research Agency)
     3-year grant, Jan 2006 - Dec 2008
     Name                           Position, lab              Time
     André Jalobeanu                CR, LSIIT/MIV Illkirch     90%
     Christophe Collet              PU, LSIIT/MIV Illkirch     40%
     Mireille Louys                 MCF, LSIIT/MIV Illkirch    40%
     Fabien Salzenstein             MCF, InESS Strasbourg      40%
     Françoise Nerry                CR, LSIIT/TRIO Illkirch    20%
     Albert Bijaoui / Eric Slezak   A/AA, OCA Nice             10%
     Bernd Vollmer                  A, Obs. Strasbourg         10%
     Jorge A. Gutiérrez + ?         PhD+, Illkirch             20 mo total

  5. Objectives
     • Reconstruct a reflectance function in remote sensing
     • Recover the geometry of small bodies and planetary surfaces
     • Reconstruct both reflectance and topography in Earth/Space Sciences

  6. Remote sensing: 2D reflectance reconstruction (3D space)
     ReflectanceFusion: multisource data fusion for flat terrain, BRDF recovery (Remote Sensing, Planetary Imaging)
     • Input:
       • Multiple images (single band, multispectral or hyperspectral)
       • Optical / calibrated or not / missing or corrupted data
     • Output:
       • 2D reflectance map (image-like), well-sampled
       • Uncertainties (simplified inverse covariance)
       • If applicable, spatial and spectral super-resolution

  7. Build a 2D reflectance map
     2D rigid data fusion is not sufficient.
     Observations: View 1, View 2, View 3
     Single model: reflectance map (possibly multispectral, bidirectional)
     Assumption: flat terrain

  8. Small bodies: 3D surface recovery (geometry only)
     3DShapeInference: 3D shape recovery via Bayesian inference (Planetary Imaging: small bodies and planets)
     SurfaceModelRender: accurate rendering and modeling of natural 3D surfaces
     • Input:
       • Multiple images (single band)
       • Optical, IR / calibrated or not / missing or corrupted data
     • Output:
       • 3D geometry (height field, planar/spherical topology)
       • Uncertainties (simplified inverse covariance)

  9. Asteroid data fusion
     2D data fusion is not sufficient.
     Observations: optical, radar, lidar
     Single model: 3D geometry (spherical height field)

  10. Comparing synthetic and real images... (or why use forward modeling)
      A 2D image is a corrupted measurement of a 2D rendering of a 3D scene:
      • accurately render 3D surfaces
      • take into account the imaging model
      • simplest, intuitive approach: compare real and synthetic images until the best fit is found...
      • we use a probabilistic approach providing a formal framework for image comparison and uncertainties
      (Diagram: 3D surface, synthetic image vs. observed image)
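      A minimal sketch of the "render and compare" idea, assuming a hypothetical render(params) function that produces a synthetic image as a NumPy array; the brute-force search is only an illustration of the intuitive approach, not the method used in the project.

      ```python
      import numpy as np

      def misfit(observed, synthetic):
          """Sum of squared residuals between the observed and rendered images."""
          return np.sum((observed - synthetic) ** 2)

      def best_fit(observed, render, candidate_params):
          """Brute-force search: keep the parameter set whose rendering fits best."""
          scores = [misfit(observed, render(p)) for p in candidate_params]
          return candidate_params[int(np.argmin(scores))]
      ```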

  11. 3D surface rendering
      A possible rendering algorithm:
      • Projection: project vertices onto the image plane
      • Visible area determination for each triangle w.r.t. each pixel
      • Shadow computation
      • Irradiance computation (use albedo and reflectance model)
      • Intensity formation (use visibility polygons and irradiance)
      • Blur (convolution with the PSF)
      (Figure: simulated observed image of 433 Eros; uniform albedo, Lambert reflectance, Gaussian blur)
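      A much-simplified sketch of these steps, assuming an orthographic view of a height field (no visibility polygons, shadows, occlusions or perspective): Lambertian shading followed by a Gaussian PSF blur. Parameter names are illustrative only.

      ```python
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def render_heightfield(z, albedo, light_dir, psf_sigma=1.0, dx=1.0):
          """Render a Lambertian height field z (2D array) seen from above."""
          # Surface normals from the height gradients
          gy, gx = np.gradient(z, dx)
          normals = np.dstack((-gx, -gy, np.ones_like(z)))
          normals /= np.linalg.norm(normals, axis=2, keepdims=True)
          # Irradiance: Lambert's cosine law, clamped at the terminator
          light = np.asarray(light_dir, dtype=float)
          light /= np.linalg.norm(light)
          irradiance = np.clip(normals @ light, 0.0, None)
          # Intensity formation, then blur by the (Gaussian) PSF
          intensity = albedo * irradiance
          return gaussian_filter(intensity, psf_sigma)
      ```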

  12. Earth & Planetary Sciences: reflectance and topography recovery 3DSpaceFusion Multisource data fusion, 3D surface recovery and super-resolution Planetary Imaging 3DEarthFusion Multisource data fusion, 3D surface recovery, BRDF inference and super-resolution Remote Sensing � Input: � Multiple images (single band, multispectral or hyperspectral) � Optical, IR / calibrated or not / missing or corrupted data � Output: the works! � 3D geometry + reflectance map � Uncertainties (simplified inverse covariance) � If applicable, spatial and spectral super-resolution (reflectance)

  13. Topography and albedo recovery
      (Diagram: N images are fused to recover the geometry (DEM, height field) and the shading-free reflectance field, given the cameras (registration) and the light sources. Image credits: NASA)

  14. The proposed approach
      • Bayesian inference from multiple observations Y
      • Uncertainty estimates, recursive data processing
      • In 2D: recover a well-sampled image, possibly super-resolved
      • In 3D: recover the geometry
      • Check the validity of this approach
      (Diagram: image formation chain linking the unknown object to each observed image through the camera pose, geometric mapping f, rendering coefficients, internal and global camera parameters, PSF h, sensor noise, sampling grid and resolution)

  15. Bayesian Vision & inverse problems � Computer vision: model reconstruction from multiple observations, inverse problem of rendering � Bayesian inference applied to this inverse problem: everything is described by random variables � Data fusion into a single model becomes a parameter estimation problem � It can be solved by existing efficient optimization techniques

  16. Bayesian inference
      OBJECTIVE: the posterior probability density function (pdf) of the parameters of interest θ (the unknown solution):
      p(θ | observations) = p(observations | θ) × p(θ) / p(observations)
      where p(observations | θ) is the likelihood (image formation model), p(θ) is the prior model (a priori knowledge about the observed object), and p(observations) is the evidence (useful for model comparison).
      ! All parameters are random variables
      ! Bayesian inference → functional optimization / approximations
      ! Deterministic optimization techniques for speed
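      A toy 1D illustration of the rule above, assuming a scalar parameter θ with a Gaussian prior and a Gaussian likelihood evaluated on a grid; these are placeholder models, not the ones used in the project.

      ```python
      import numpy as np

      theta = np.linspace(-5, 5, 1001)                    # grid over the unknown parameter
      prior = np.exp(-0.5 * (theta / 2.0) ** 2)           # a priori knowledge, N(0, 2^2)
      observations = np.array([1.2, 0.8, 1.5])            # noisy measurements of theta
      sigma_noise = 0.5
      likelihood = np.prod(
          np.exp(-0.5 * ((observations[:, None] - theta) / sigma_noise) ** 2), axis=0)
      evidence = np.trapz(prior * likelihood, theta)      # normalization constant
      posterior = prior * likelihood / evidence           # p(theta | observations)
      ```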

  17. Probabilistic fusion vs. averaging
      (Figure: Result #1 and Result #2 combined by probabilistic fusion vs. a plain average)
      • Take into account uncertainties (variance, correlations): a formal framework for the combination of multiple observations
      • Propagate uncertainties: from the observation noise to the end result!
      Downside: algorithms ought to account for input uncertainties
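      A minimal sketch of probabilistic fusion of two independent Gaussian estimates, contrasted with a plain average; the numbers are made up for illustration.

      ```python
      import numpy as np

      x1, var1 = 10.0, 0.25    # result #1 and its variance
      x2, var2 = 12.0, 4.0     # result #2 (much noisier) and its variance

      # Plain average ignores the uncertainties
      average = 0.5 * (x1 + x2)

      # Probabilistic fusion: inverse-variance weighting, with a propagated variance
      w1, w2 = 1.0 / var1, 1.0 / var2
      fused = (w1 * x1 + w2 * x2) / (w1 + w2)
      fused_var = 1.0 / (w1 + w2)

      print(f"average = {average:.2f}, fused = {fused:.2f} +/- {np.sqrt(fused_var):.2f}")
      ```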

  18. Bayesian inference from multiple observations
      (Diagram: unknown object X, image model parameters and camera parameters generating the observations Y)
      • Forward modeling:
        • Object modeling (image, 3D geometry, reflectance...)
        • Image formation = rendering + degradations
      • Bayesian inference:
        • Estimate the optimal object given all observations: find the max/mean of the posterior distribution
        • Integrate w.r.t. all nuisance variables: “marginalization”
        • Evaluate the uncertainties: covariance matrix (Gaussian approx. of the posterior)
        • Model selection and assessment
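      A sketch of MAP estimation with a Gaussian (Laplace) approximation of the posterior, assuming a hypothetical neg_log_posterior(x, observations) function that sums the data terms over all observations plus the prior (see the next slide for how such a function can be assembled).

      ```python
      import numpy as np
      from scipy.optimize import minimize

      def map_estimate(neg_log_posterior, x0, observations):
          """Return the MAP solution and an approximate posterior covariance."""
          res = minimize(neg_log_posterior, x0, args=(observations,), method="BFGS")
          x_map = res.x
          cov = res.hess_inv   # BFGS inverse-Hessian approx. = Gaussian posterior covariance
          return x_map, cov
      ```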

  19. Building blocks of the joint posterior probability density function
      Posterior density over the surface {v_k} and light sources {L_j}, given the images {X_i} and camera parameters {θ_i}:
      P(surface, cameras | images) ∝ P(surface) ∏_i P(camera_i) P(image_i | surface, camera_i)
      • Prior density P(surface): surface prior model; Gaussian pdf (smoothness prior) on the geometry, reflectance...
      • Camera model P(camera_i): camera pose and camera physics; Dirac pdf (calibrated camera)
      • Likelihood P(image_i | surface, camera_i): image formation model; rendering I(v, L, θ) plus additive Gaussian noise, i.e. N(I(v, L, θ), σ²)
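      A sketch of the negative log posterior assembled from these blocks, assuming calibrated cameras (Dirac pdf), a quadratic smoothness prior, and a hypothetical render(surface, camera, light) forward model in the spirit of slide 11; constants and variable noise levels are omitted for brevity.

      ```python
      import numpy as np

      def neg_log_posterior(surface, images, cameras, light, render,
                            sigma_noise=0.01, smoothness=1.0):
          """-log P(surface | images), up to an additive constant."""
          # Likelihood terms: one Gaussian data term per observed image
          data_term = 0.0
          for image, camera in zip(images, cameras):
              residual = image - render(surface, camera, light)
              data_term += 0.5 * np.sum(residual ** 2) / sigma_noise ** 2
          # Smoothness prior on the surface: penalize local height differences
          gy, gx = np.gradient(surface)
          prior_term = 0.5 * smoothness * np.sum(gx ** 2 + gy ** 2)
          return data_term + prior_term
      ```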

  20. Accurate forward modeling
      Deterministic image formation: rendering
      • 2D object, 3D space: resampling; account for perspective, atmosphere, deformations, blur (convolution with the PSF h)
      • 3D object, 3D space: rendering in the object space; account for occlusions, shadows, perspective, atmosphere, deformations, blur
      Probabilistic image formation: sensor noise modeling
      • Independent Gaussian noise, spatially variable variance
      (Diagram labels: object L(x), PSF h)
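      A simplified sketch of the 2D case: blur the well-sampled object with a (Gaussian) PSF, resample it onto the sensor grid, and add independent Gaussian noise with spatially variable variance. Perspective, atmosphere and deformations are omitted; function and parameter names are illustrative.

      ```python
      import numpy as np
      from scipy.ndimage import gaussian_filter, map_coordinates

      def simulate_observation(obj, rows, cols, psf_sigma, noise_sigma, rng=None):
          """obj: well-sampled 2D reflectance map; rows, cols: sensor-pixel coordinates
          expressed in the object grid; noise_sigma: per-pixel noise std (array or scalar)."""
          rng = np.random.default_rng() if rng is None else rng
          blurred = gaussian_filter(obj, psf_sigma)            # blur by the PSF
          sampled = map_coordinates(blurred, [rows, cols], order=3, mode="nearest")
          return sampled + noise_sigma * rng.standard_normal(sampled.shape)
      ```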

  21. 2D reflectance map model
      Model of the unknown object (2D image)
      • Choose an appropriate parametrization and topology
        • Sampling grid size, rectangular or hexagonal lattice (output pixel size < input blur size / 2)
      • Understand the sampling theorem!
        • Don’t try to go beyond the Nyquist rate (optical frequency cut-off)
        • Near-optimal sampling, band-limited: B-Spline-3 kernel as the target PSF
      • Constrain and stabilize this inverse problem
        • Use smoothness priors to avoid noise amplification (oversampled areas will undergo a deconvolution even if we just want data fusion...)
        • Use efficiently designed prior models (e.g. multiscale, wavelets) to help preserve useful information while filtering the noise
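      For reference, the cubic B-spline kernel mentioned above, written as a standalone function; it is shown here only as an illustration of a compact, near-band-limited interpolation kernel.

      ```python
      import numpy as np

      def bspline3(x):
          """Cubic B-spline kernel B3(x), supported on [-2, 2], integrates to 1."""
          ax = np.abs(np.asarray(x, dtype=float))
          out = np.zeros_like(ax)
          inner = ax < 1.0
          outer = (ax >= 1.0) & (ax < 2.0)
          out[inner] = 2.0 / 3.0 - ax[inner] ** 2 + 0.5 * ax[inner] ** 3
          out[outer] = (2.0 - ax[outer]) ** 3 / 6.0
          return out
      ```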

  22. 3D object model example: fractal surface geometry model
      • Adaptive scale-invariant Gaussian model for wavelet coefficients: geometric wavelet details = adaptive Gaussian random variables (parameters: a priori surface roughness, local scales, geometric parameters)
      • Statistical self-similarity: ~ 3D analog of the fractional Brownian motion in 2D
      ... also works for the Mars DEM (MOLA)
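      A small sketch of a statistically self-similar random surface generated by spectral synthesis with a power-law spectrum, as a simple stand-in for the adaptive wavelet-domain Gaussian model described above (the roughness is controlled here by a Hurst-like exponent rather than the project's parameters).

      ```python
      import numpy as np

      def fractal_surface(n=256, hurst=0.7, rng=None):
          """Sample an n x n self-similar Gaussian surface (fractional-Brownian-like)."""
          rng = np.random.default_rng() if rng is None else rng
          fy = np.fft.fftfreq(n)[:, None]
          fx = np.fft.fftfreq(n)[None, :]
          radius = np.sqrt(fx ** 2 + fy ** 2)
          radius[0, 0] = 1.0                          # avoid division by zero at DC
          amplitude = radius ** -(hurst + 1.0)        # power-law spectral envelope
          amplitude[0, 0] = 0.0                       # zero-mean surface
          phase = np.exp(2j * np.pi * rng.random((n, n)))
          surface = np.fft.ifft2(amplitude * phase).real
          return surface / surface.std()
      ```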

  23. Samples from a fractal surface model
      Take samples from the joint density:
      P(image, surface | camera) = P(image | surface, camera) × P(surface)
      where P(image | surface, camera) is the image formation model and P(surface) is the surface model.
      (Figure: three samples with q = 1.1 and σ = 0.5, 1.5, 5.0)
      Assumptions: uniform albedo and roughness
