Bayesian fusion of multi-band images


1. Bayesian fusion of multi-band images: Beyond pansharpening
   Nicolas Dobigeon, joint work with Qi Wei, Jean-Yves Tourneret and Jose M. Bioucas-Dias
   University of Toulouse, IRIT/INP-ENSEEIHT & TeSA
   http://www.enseeiht.fr/~dobigeon
   Winter School "Search for Latent Variables: ICA, Tensors, and NMF", Villard de Lans, February 2-4, 2015


2. Context: Multi-band imaging
   Multi/hyper-spectral images:
   ◮ same scene observed at different wavelengths.
   [Figure: hyperspectral data cube]


3. Context: Multi-band imaging
   Multi/hyper-spectral images:
   ◮ same scene observed at different wavelengths,
   ◮ each pixel represented by a vector of tens/hundreds of measurements.
   [Figure: hyperspectral data cube]

4. Context: Multi-band image enhancement
   Overcome the spatial vs. spectral resolution trade-off.

   Panchromatic images (PAN)
   ◮ no spectral resolution (only 1 band),
   ◮ very high spatial resolution (~10 cm).

   Multispectral images (MS)
   ◮ low spectral resolution (~10 bands),
   ◮ high spatial resolution (~1 m).

   Hyperspectral images (HS)
   ◮ high spectral resolution (~100 bands),
   ◮ low spatial resolution (~10 m).

   Objective of the fusion process: get the best of both resolutions.


5. Context: Multi-band image enhancement

   Pansharpening: PAN + MS fusion
   ◮ incorporate the spatial details of the PAN image into the MS image,
   ◮ huge literature,
   ◮ main approaches rely on band substitution.

   Hyperspectral pansharpening: PAN + HS fusion
   ◮ incorporate the spatial details of the PAN image into the HS image,
   ◮ more difficult due to the size of the HS image,
   ◮ specific methods should be developed.

   Multi-band image fusion: MS + HS fusion
   ◮ incorporate the spatial details of the MS image into the HS image,
   ◮ more difficult since the spatial details are contained in a multi-band (MS) image rather than in a single PAN band,
   ◮ specific methods should be developed.



6. Context: Multi-band image enhancement

   Hyperspectral pansharpening: PAN + HS fusion
   ◮ incorporate the spatial details of the PAN image into the HS image,
   ◮ more difficult due to the size of the HS image,
   ◮ specific methods should be developed.

   Multi-band image fusion: MS + HS fusion
   ◮ incorporate the spatial details of the MS image into the HS image,
   ◮ more difficult since the spatial details are contained in a multi-band (MS) image rather than in a single PAN band,
   ◮ specific methods should be developed.

7. Context: Problem statement

   Figure: (a) hyperspectral image (size: 99 × 46 × 224, resolution: 20 m × 20 m);
   (b) multispectral image (size: 396 × 184 × 4, resolution: 5 m × 5 m);
   (c) target image (size: 396 × 184 × 224, resolution: 5 m × 5 m).

   Table: characteristics of some existing remote sensors

   Sensor             Res. (m)   # bands
   AVIRIS (HS)        20         224
   SPOT-5 (MS)        10         4
   Pleiades (MS)      2          4
   WorldView-3 (MS)   1.24       8
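For concreteness, a small Python check (using only the scene sizes quoted in the figure caption above; the variable names are mine) verifies that the HS and MS grids are linked by an integer spatial downsampling factor, which is the ratio encoded in the downsampling matrix S introduced on the next slide:

```python
# Scene sizes from the figure caption above (illustrative check only).
hs_rows, hs_cols, m_lambda = 99, 46, 224      # HS cube, 20 m pixels
ms_rows, ms_cols, n_lambda = 396, 184, 4      # MS cube, 5 m pixels

d = ms_rows // hs_rows                        # spatial downsampling factor
assert ms_rows == d * hs_rows and ms_cols == d * hs_cols   # d = 4 (20 m / 5 m)

m = hs_rows * hs_cols                         # number of HS pixels
n = ms_rows * ms_cols                         # number of MS / target pixels
print(d, m, n)                                # 4 4554 72864
```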

8. Context: Forward model

   Y_H = X B S + N_H,    Y_M = R X + N_M

   ◮ X ∈ R^{m_λ × n}: unknown full-resolution image,
   ◮ Y_H ∈ R^{m_λ × m} and Y_M ∈ R^{n_λ × n}: observed HS and MS images,
   ◮ B ∈ R^{n × n}: cyclic convolution operator acting on the bands (spatial blurring),
   ◮ S ∈ R^{n × m}: downsampling matrix,
   ◮ R ∈ R^{n_λ × m_λ}: spectral response of the MS sensor,
   ◮ N_H ∈ R^{m_λ × m} and N_M ∈ R^{n_λ × n}: HS and MS noise terms.

   [Figure: (a) spatial blurring B; (b) spectral blurring R, i.e. MS filter responses plotted against the band index]
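As a rough illustration of this model (not the authors' code), the sketch below applies the same degradations to a toy scene: the spatial blur B is a cyclic convolution implemented per band in the Fourier domain, S keeps every d-th pixel in each direction, and R is a crude block-averaging spectral response. All dimensions, kernels and noise levels are assumptions chosen only to make the shapes visible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (all hypothetical): a 16 x 16 scene with 6 "HS" bands,
# downsampled by d = 4 for the HS observation, and 2 "MS" bands.
m_lambda, n_lambda = 6, 2
rows, cols, d = 16, 16, 4

X = rng.random((m_lambda, rows, cols))       # unknown full-resolution scene

# XB: cyclic (circular) spatial blurring of each band, done in the Fourier
# domain so the boundary conditions match a cyclic convolution.
kernel = np.zeros((rows, cols))
kernel[:3, :3] = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]]) / 16.0
kernel = np.roll(kernel, (-1, -1), axis=(0, 1))   # center the kernel at (0, 0)
Xf = np.fft.fft2(X, axes=(1, 2))
XB = np.real(np.fft.ifft2(Xf * np.fft.fft2(kernel), axes=(1, 2)))

# XBS: downsampling keeps one pixel out of d in each spatial direction.
XBS = XB[:, ::d, ::d]

# R: each MS band averages a group of HS bands (a crude spectral response).
R = np.zeros((n_lambda, m_lambda))
R[0, :3] = 1.0 / 3.0
R[1, 3:] = 1.0 / 3.0

# Additive Gaussian noise (scalar variances here; band-dependent in the model).
s_H, s_M = 0.02, 0.01
Y_H = XBS + s_H * rng.standard_normal(XBS.shape)
Y_M = np.einsum('lk,kij->lij', R, X) + s_M * rng.standard_normal((n_lambda, rows, cols))

print(Y_H.shape, Y_M.shape)                  # (6, 4, 4) (2, 16, 16)
```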

9. Context: Noise statistics

   Gaussian assumption:

   N_H | s_H^2 ∼ MN_{m_λ, m}(0_{m_λ, m}, diag(s_H^2), I_m)
   N_M | s_M^2 ∼ MN_{n_λ, n}(0_{n_λ, n}, diag(s_M^2), I_n)

   where
   ◮ s_H^2 = [s_{H,1}^2, ..., s_{H,m_λ}^2]^T (hyperspectral noise variances),
   ◮ s_M^2 = [s_{M,1}^2, ..., s_{M,n_λ}^2]^T (multispectral noise variances),

   and the pdf of a matrix normal distribution MN_{n,p}(M, Σ_r, Σ_c) is

   p(X | M, Σ_r, Σ_c) = exp( -(1/2) tr[ Σ_c^{-1} (X - M)^T Σ_r^{-1} (X - M) ] ) / ( (2π)^{np/2} |Σ_c|^{n/2} |Σ_r|^{p/2} )

   → band-dependent noise (row covariance diag(s^2)),
   → pixel-independent noise (column covariance I).
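With Σ_r = diag(s^2) and Σ_c = I, the matrix normal density factorizes over bands and pixels, so its log-pdf reduces to a per-band weighted sum of squares. A minimal Python sketch, assuming exactly this diagonal/identity structure (the function name and test values are mine):

```python
import numpy as np

def mn_logpdf_diag_rows(X, M, row_var):
    """Log-pdf of MN(M, Sigma_r, Sigma_c) at X for the special case above:
    Sigma_r = diag(row_var) (band-dependent variances), Sigma_c = I
    (pixel-independent noise). X, M: (n_bands, n_pixels); row_var: (n_bands,)."""
    n_rows, n_cols = X.shape
    E = X - M
    # With Sigma_c = I, tr[Sigma_c^{-1} E^T Sigma_r^{-1} E] is a per-band
    # weighted sum of squared entries.
    quad = np.sum(E ** 2 / row_var[:, None])
    log_det_r = np.sum(np.log(row_var))              # log |Sigma_r|
    return -0.5 * (quad + n_cols * log_det_r
                   + n_rows * n_cols * np.log(2.0 * np.pi))

# Quick check on simulated noise with hypothetical per-band variances.
rng = np.random.default_rng(1)
s2_H = np.array([1e-4, 4e-4, 9e-4])
N_H = np.sqrt(s2_H)[:, None] * rng.standard_normal((3, 1000))
print(mn_logpdf_diag_rows(N_H, np.zeros_like(N_H), s2_H))
```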

10. Context: Likelihood of the observations

   Given the forward model (characterized by both left- and right-multiplication operators)

   Y_H = X B S + N_H
   Y_M = R X + N_M

   the two likelihood functions are

   Y_H | X, s_H^2 ∼ MN_{m_λ, m}(X B S, diag(s_H^2), I_m)
   Y_M | X, s_M^2 ∼ MN_{n_λ, n}(R X, diag(s_M^2), I_n)

   Joint likelihood: the HS and MS images are acquired by distinct sensors
   → independent HS and MS noises
   → observed images independent conditionally on X:

   f(Y_H, Y_M | X, s^2) = f(Y_H | X, s_H^2) f(Y_M | X, s_M^2),   with s^2 = {s_H^2, s_M^2}.
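Because the two noises are independent given X, the joint negative log-likelihood is, up to additive constants, the sum of two band-weighted quadratic data-fit terms. A sketch under that assumption; apply_BS (a placeholder of mine, not the authors' code) is any routine mapping X to XBS:

```python
import numpy as np

def neg_log_likelihood(X, Y_H, Y_M, apply_BS, R, s2_H, s2_M):
    """Joint negative log-likelihood -log f(Y_H, Y_M | X, s^2), up to additive
    constants, under the Gaussian model above. `apply_BS` maps X (m_lambda x n)
    to XBS (m_lambda x m), i.e. spatial blurring followed by downsampling;
    R is the (n_lambda x m_lambda) spectral response."""
    E_H = Y_H - apply_BS(X)                  # HS residual
    E_M = Y_M - R @ X                        # MS residual
    # Each row (band) is weighted by the inverse of its noise variance;
    # pixels are independent, so the terms simply add up.
    return 0.5 * (np.sum(E_H ** 2 / s2_H[:, None])
                  + np.sum(E_M ** 2 / s2_M[:, None]))
```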


11. Context: Multi-band image fusion as an estimation problem

   Maximum likelihood estimation / weighted least-squares regression:

   X̂ ∈ argmin_X  D²_{s_M^2}(Y_M | R X) + D²_{s_H^2}(Y_H | X B S)

   where D_{s_M^2}(·|·) and D_{s_H^2}(·|·) are Mahalanobis distances associated with the noise variances.

   Main issues:
   ◮ (generally) ill-posed problem,
   ◮ (generally) large-scale problem.
   ⇒ Regularization required...
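Without regularization, this criterion is a plain weighted least-squares problem. The sketch below (explicit small matrices BS and R, a fixed step size, all names mine) runs naive gradient descent on it; on realistic image sizes it would be both slow and ill-posed, which is exactly why the slide concludes that regularization is required.

```python
import numpy as np

def wls_gradient_descent(Y_H, Y_M, BS, R, s2_H, s2_M, n_iter=500, step=1e-3):
    """Plain gradient descent on the (unregularized) weighted least-squares
    objective of the slide, with explicit matrices BS (n x m) and R
    (n_lambda x m_lambda). A toy illustration only, not a practical
    large-scale solver."""
    m_lambda, n = R.shape[1], BS.shape[0]
    X = np.zeros((m_lambda, n))
    W_H = 1.0 / s2_H[:, None]                # per-band HS weights
    W_M = 1.0 / s2_M[:, None]                # per-band MS weights
    for _ in range(n_iter):
        # Gradient of the two weighted quadratic data-fit terms w.r.t. X.
        grad = -(W_H * (Y_H - X @ BS)) @ BS.T - R.T @ (W_M * (Y_M - R @ X))
        X -= step * grad
    return X
```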
