New Bayesian Fusion scheme for Hyperspectral Astronomical Data

Ch. Collet, M. Petremand, A. Jalobeanu, F. Salzenstein, V. Mazet, M. Louys
University of Strasbourg - FRANCE
LSIIT UMR CNRS
http://lsiit-miv.u-strasbg.fr/paseo/

Goals: a generic fusion scheme

• Reconstruction of a single hyperspectral image Y_f from n noisy observations Y_i.
• Gather in an ideal image X the whole information included within each observation.
• Each Y_i may possibly be acquired with different sensors characterized by different acquisition conditions: spatial sampling on heterogeneous lattices, geometric deformations, different Field Spread Functions (FSF, or spatial PSF) or Line Spread Functions (LSF).
• Decrease data corruption due to cosmic rays.
• Give the fusion result together with its associated uncertainty.

MUSE needs

• adding all 1-hour exposure datacubes obtained in different observing conditions to detect faint sources (deep field) in a 3D volume;
• maximising the signal-to-noise ratio and the spatial/spectral resolution of the final cube;
• recovery of a compound model "image + uncertainties" that best relates to the observations;
• removing cosmic rays at best.
Data Fusion in the framework of the new generation IFU: the MUSE instrument

Plan of the talk

Introduction
I- Direct Model
II- Inverse Model
III- Preliminary Results
IV- Conclusion and perspectives

[Diagram: Model space (T; F: continuous, 3D-B-spline filtered; X: discrete, spatial and spectral sampling) → Sensor space (Y) → Reconstruction (X̂)]

MUSE IFU (Integral Field Unit)

• Galaxy field observation at large redshift.
• Requires a set of 80 observations Y^i, one hour each, to avoid non-reversible cosmic-ray corruption.
• Each Y^i comes from the same IFU, whose final size after astronomical reduction equals 300 × 300 × 3500 pixels.

Fusion challenge: to gather these 80 observations into a single one in an optimal (Bayesian) way.

I- Sensor modeling (IFU) and image acquisition: direct model
MUSE IFU (Integral Field Unit)

This fusion process needs to eliminate cosmic rays and outlier pixels, and to take into account the seeing and the variations in acquisition conditions (sky background, registration on the same lattice, etc.).

The fusion process has to:

• eliminate cosmic rays: easier if done on the CCD matrix without any pre-processing, because a cosmic ray corrupts a neighborhood around a central location, according to its impact angle;
• take into account dead pixels (more generally all outliers, including cosmic rays), the FSF-LSF and the sensor noise;
• fuse the information and give an uncertainty at each location of the reconstructed hyperspectral data cube.

Direct Model

Image formation modeling: the direct problem.

• Model space: the 3D space where objects are observed (part of the celestial sphere), with a topology and an arbitrary geometry: a spatial square sampling grid, and a spectral sampling grid starting from λ_0 with a step λ_p. F stands for the continuous ideal image whereas X represents the sampled ideal image.
• Image space (2D space): the focal plane where the image is formed within the MUSE IFU, between the fore-optics and the field splitter. After successive cuts by the splitter and the slicers, each sub-image is spread by the spectroscope and printed on the CCD surface (Sensor space).
• Each column in the sensor space Y corresponds to a spectrum; each line stands for a spatial direction.

Image reconstruction modeling: inverse problem in a Bayesian framework.

Model space

[Diagram: Model space → Sensor space]

• Let T be the truth defined within the model space (u, z).
• Let F = T ⋆ φ be the ideal image, with finite spatial and spectral resolution, corresponding to the truth T observed with a perfect telescope modeled by a B-spline φ of degree 3.
• Let X = L ⋆ φ, with X_p = F(p): the image F sampled according to the Shannon condition on a lattice of variable resolution; in this sense, X is the ideal sampled image.
• F, T and X are all hyperspectral cubes (also called images hereafter).

Model space

F can be interpolated at each location (u, z) using the following expression:

F(u, z) = Σ_j Σ_k L_jk φ(u − j) φ(z − k)   (1)

where j ∈ Z² stands for the spatial samples and k ∈ Z for the spectral samples. The coefficients L_jk are called interpolation coefficients or B-spline coefficients.

X_p = F(p) is the digital version of F, linked to the interpolation coefficients by:

X = L ⋆ φ   (2)
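The interpolation formula above can be sketched in 1-D. The snippet below is illustrative (names `bspline3` and `interp_1d` are not from the MUSE pipeline): it implements the cubic B-spline basis φ and checks that sampling F at integer locations reproduces X = L ⋆ φ as a discrete convolution.

```python
import numpy as np

def bspline3(x):
    """Cubic B-spline basis phi (support [-2, 2]); phi(0)=2/3, phi(+/-1)=1/6."""
    ax = np.abs(np.asarray(x, dtype=float))
    y = np.zeros_like(ax)
    m1 = ax < 1
    m2 = (ax >= 1) & (ax < 2)
    y[m1] = 2.0 / 3.0 - ax[m1] ** 2 + ax[m1] ** 3 / 2.0
    y[m2] = (2.0 - ax[m2]) ** 3 / 6.0
    return y

def interp_1d(L, u):
    """Evaluate F(u) = sum_j L_j * phi(u - j), the 1-D analogue of Eq. (1)."""
    j = np.arange(len(L))
    return np.array([np.sum(L * bspline3(ui - j)) for ui in np.atleast_1d(u)])

# Sampling F on the integer lattice gives X = L * phi (Eq. (2)), here checked
# against an explicit discrete convolution with the sampled basis [1/6, 2/3, 1/6].
L = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
X = interp_1d(L, np.arange(5))
X_conv = np.convolve(L, bspline3(np.array([-1.0, 0.0, 1.0])), mode="same")
```

Note that X differs from L: B-spline coefficients are not pixel values, which is why the text distinguishes L from X = L ⋆ φ.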
Model space: MUSE observation

Spatial side

1. T is convolved by the FSF h^u_{uz}, which depends on the spatial location u and the spectral location z:
   T(u, z) ⋆ h^u_{uz}(u)   (1)
2. Spatial sampling of T(u, z) ⋆ h^u_{uz}(u) on the regular lattice u_s. This sampling grid takes into account the spatial shifts δ^i_{xλ}, δ^i_{yλ}, which vary with the wavelength (atmospheric refraction). Irregularities of the sampling due to the field cut by the splitter and the slicers remain unimportant:
   J_s(z) = T(u_s, z) ⋆ h^u_{u_s z}(u_s)   (2)
   where u_s is the spatial coordinate of the Model space linked to location s within the Sensor space.
3. J_s(z) is the continuous spectrum at spatial position u_s.

Model space: MUSE observation

Spectral side

1. Convolution of J_s(z) by the LSF h^z, assumed independent of the spatial position (h^z does not depend on u):
   J_s(z) ⋆ h^z(z)   (1)
2. Spectral sampling of J_s(z) on the regular spectral lattice z_st:
   I_st = (J_s ⋆ h^z)(z_st)   (2)
   where z_st stands for the coordinate z of the Model space corresponding to the point (s, t) in the Sensor space. The shifts between IFUs on the spectral sampling grids are integrated within the spectral sampling process modeling, which depends on the spatial location s within the Sensor space.
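The spectral side can be sketched in 1-D: a continuous spectrum is blurred by an LSF and then sampled on the sensor's spectral lattice. This is a minimal sketch under assumed conditions (a Gaussian LSF of illustrative width, uniform fine grid); none of these names come from the MUSE pipeline.

```python
import numpy as np

def gaussian_lsf(z, sigma=1.5):
    """Hypothetical Gaussian LSF h^z, independent of the spatial position u."""
    return np.exp(-0.5 * (z / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def observe_spectrum(J, z_fine, z_samples, sigma=1.5):
    """I_st = (J_s * h^z)(z_st): blur on a fine grid, then sample the lattice."""
    dz = z_fine[1] - z_fine[0]
    kernel = gaussian_lsf(np.arange(-5.0, 5.0 + dz, dz), sigma)
    blurred = np.convolve(J, kernel, mode="same") * dz  # continuous convolution
    return np.interp(z_samples, z_fine, blurred)        # spectral sampling

# A narrow emission line is broadened by the LSF while its flux is preserved.
z = np.linspace(-10.0, 10.0, 2001)
line = gaussian_lsf(z, 0.3)          # narrow line, unit flux
I = observe_spectrum(line, z, z)
```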
Model space: MUSE observation

To summarize, a pixel observed on Y can be written in the following manner:

Y_st = I_st + N(0, σ_st)   (1)

with

I_st = (J_s ⋆ h^z)(u_s, z_st) = (T ⋆ h^u_{u_s z_st} ⋆ h^z)(u_s, z_st)   (2)

where σ_st stands for the noise standard deviation observed at (s, t).

Rendering coefficients: exposure time and spectral sampling process

The observed values on the sensor at location (s, t) are expressed as a combination of the spline coefficients L_jk within the model space (X) and rendering coefficients linking both spaces together (Sensor space indexed by (s, t), Model space indexed by (j, k)).

• After the spectral sampling process, one obtains the following expression:
  I_st = Σ_j Σ_k L_jk α_stjk   with   α_stjk = h^u_{u_s z_st}(u_s − j) h^z(z_st − k)   (1)
• A gain factor W modeling the exposure time and the sensor sensitivity at each pixel (s, t) is finally integrated within the rendering coefficients:
  α_stjk = W_st h^u_{u_s z_st}(u_s − j) h^z(z_st − k)   (2)
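The rendering step can be sketched in 1-D: each sensor pixel s sees a gain-weighted combination of the model coefficients L_j through α[s, j] = W_s h(u_s − j). The kernel h and gain map W below are placeholders, not the actual MUSE FSF/LSF.

```python
import numpy as np

def rendering_matrix(u_sensor, n_model, W, h):
    """alpha[s, j] = W_s * h(u_s - j): gain times kernel footprint (1-D)."""
    j = np.arange(n_model)
    return W[:, None] * h(u_sensor[:, None] - j[None, :])

def render(L, alpha, sigma, rng):
    """Y_s = sum_j alpha[s, j] * L_j + N(0, sigma_s): noisy sensor values."""
    return alpha @ L + rng.normal(0.0, sigma)

# Toy setup: 9 sensor pixels observing 5 model coefficients through a
# triangular kernel; doubling the gain W doubles the noiseless observation.
u = np.linspace(0.0, 4.0, 9)
W = np.ones(9)
h = lambda x: np.maximum(0.0, 1.0 - np.abs(x))
alpha = rendering_matrix(u, 5, W, h)
L = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
rng = np.random.default_rng(0)
Y = render(L, alpha, np.zeros(9), rng)
```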
Direct problem: MUSE case

• T: model space.
• F: continuous (3D-B-spline filtered): F(u, z) = Σ_j Σ_k L_jk φ(u − j) φ(z − k).
• X: discrete (spatial and spectral sampling): X = L ⋆ φ.
• Y: sensor space: Y^i_st = Σ_j Σ_k L_jk α^i_stjk + N(0, σ^i_st)   with   α^i_stjk = W^i_st h^{i,u}_{u_s z_st}(u_s − j) h^z(z_st − k).
• X̂: reconstruction.

II- Inverse Model
Strasbourg Scheme, May 2010

Inverse Model

Image reconstruction modeling: inverse problem in a Bayesian framework.

• Huge data set: cubes, covariance matrices, FSF, LSF, rendering coefficients...
• No sequential algorithm: the fusion process has to be computed with the whole data set at a time.
• New approach: a sequential algorithm with uncertainty propagation.
• Possible improvement when new data become available.
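The per-observation direct model above (one shift, gain and noise level per exposure i) can be simulated end to end in a 1-D toy version. All names and the triangular kernel are illustrative assumptions, not the MUSE instrument model.

```python
import numpy as np

def make_observation(L, shift, gain, sigma, n_sensor, rng):
    """One simulated exposure Y^i = alpha^i L + B^i: shifted, gain-weighted
    resampling of the model coefficients L plus white Gaussian noise."""
    u = np.linspace(0.0, len(L) - 1.0, n_sensor) + shift   # shifted lattice
    j = np.arange(len(L))
    h = np.maximum(0.0, 1.0 - np.abs(u[:, None] - j[None, :]))  # toy kernel
    alpha = gain * h                                       # rendering coeffs
    return alpha @ L + rng.normal(0.0, sigma, n_sensor), alpha

# Two exposures of the same truth with different gains (noise turned off so
# the direct model itself can be inspected).
rng = np.random.default_rng(1)
L = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
Y1, a1 = make_observation(L, 0.0, 1.0, 0.0, 9, rng)
Y2, a2 = make_observation(L, 0.0, 2.0, 0.0, 9, rng)
```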
Available raw observations on the sensor

Each observation Y^i, i ∈ {1..n}, at location p = (s, t) can be written as:

Y^i_p = Σ_l α^i_pl L_l + B^i_p   where   B^i_p ∼ N(0, σ^i_p²)   (1)

where l = (j, k) is the 3D coordinate in the model space and B^i_p a white Gaussian noise with standard deviation σ^i_p.

This can be rewritten in the following matrix form:

Y^i = α^i L + B^i   (2)

where Y^i = (Y^i_0, ..., Y^i_η)ᵀ stacks all the pixels of observation i.

Inversion model

Geometric transforms

• Cube fusion has to take place within the same Model space, so the successive shifts must be estimated.
• Such estimation is possible with reconstructed cubes coming from the DRS (Data Reduction System associated with the MUSE sensor).
• The estimated parameters are then integrated through the rendering coefficients; thus they are implicitly taken into account during the fusion process.
• Spectral shifts are known and constant through all the observations, but δ^i_{xλ} and δ^i_{yλ} need to be estimated. This can be done on several bandwidths and interpolated for all wavelengths.
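Under the observation model Y^i = α^i L + B^i with independent Gaussian noise, the Bayesian fusion step (here with a flat prior, so MAP coincides with weighted least squares) reduces to accumulating per-observation information. This is a minimal sketch of that estimator, not the talk's actual sequential algorithm: L̂ = C Σ_i α^iᵀ S_i⁻¹ Y^i with posterior covariance C = (Σ_i α^iᵀ S_i⁻¹ α^i)⁻¹, which directly provides the "result + uncertainty" output.

```python
import numpy as np

def fuse(observations):
    """observations: list of (Y_i, A_i, sigma_i) triples with
    Y_i = A_i @ L + N(0, diag(sigma_i**2)). Returns (L_hat, covariance)."""
    d = observations[0][1].shape[1]
    info = np.zeros((d, d))            # accumulated information matrix
    rhs = np.zeros(d)
    for Y, A, sigma in observations:
        w = 1.0 / sigma ** 2           # inverse-variance weights
        info += A.T @ (w[:, None] * A)
        rhs += A.T @ (w * Y)
    cov = np.linalg.inv(info)          # posterior covariance (uncertainties)
    return cov @ rhs, cov

# Two unit-variance direct observations of a 2-pixel model: the fused estimate
# is their inverse-variance weighted mean, and the variance is halved.
A = np.eye(2)
L_hat, cov = fuse([(np.array([2.0, 3.0]), A, np.ones(2)),
                   (np.array([1.0, 2.0]), A, np.ones(2))])
```

In practice the information matrix is far too large to invert directly (one row per model coefficient of the cube), which is why the slides argue for a sequential scheme with uncertainty propagation instead.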