Reduction Methods for Multiband Image Analysis


  1. ACI’s GRID – IDHA & MDA: Reduction Methods for Multiband Image Analysis
  F. Flitti (1), C. Collet (1) and F. Bonnarel (2)
  (1) LSIIT, Strasbourg Univ., http://picabia.u-strasbg.fr/lsiit/
  (2) Strasbourg Astronomical Observatory, http://cdsweb.u-strasbg.fr/
  iAstro Workshop, Nice Observatory, 16/17 October 2003

  2. Plan
  Pattern recognition
  * classification task
  * curse of dimensionality
  * data reduction
  Multi-/super-/hyperspectral analysis
  * goals
  * limits
  Data reduction
  * superspectral images in the radio-astronomy context
  * 1st method: reduction using local projections
  * 2nd method: reduction using Gaussian modeling of the spectrum
  Conclusion and perspectives

  3. Pattern recognition
  Guessing/predicting the unknown nature of an observation
  * a discrete quantity
  * definition of a pattern
  Methods
  * template matching
  * statistical classification
  * neural networks
  Recognition
  * supervised classification
  * unsupervised classification
  >> both rely on a set of features

  4. Classification
  The space spanned by the feature vectors
  * is subdivided using decision boundaries
  * which are established by statistical decision theory
  * Bayes decision theory: the average risk is minimized (a minimal sketch follows)
  Performance of a classifier depends on
  * sample size
  * number of features
  * classifier complexity (criterion function)
  Classification of high-dimensional vectors
  * curse of dimensionality
  * the main factor affecting the classification task
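As a concrete illustration of the Bayes rule mentioned above, here is a minimal sketch for two Gaussian classes under 0-1 loss, where choosing the class with the largest posterior minimizes the average risk. All class parameters below are hypothetical, not taken from the talk.

```python
# Minimal Bayes decision rule for two Gaussian classes (hypothetical
# priors, means and covariances): pick the class with maximal posterior,
# which minimizes the average risk under 0-1 loss.
import numpy as np
from scipy.stats import multivariate_normal

priors = np.array([0.3, 0.7])
means = [np.array([0.0, 0.0]), np.array([2.0, 1.0])]
covs = [np.eye(2), 1.5 * np.eye(2)]

def bayes_classify(x):
    posts = [p * multivariate_normal.pdf(x, mean=m, cov=c)
             for p, m, c in zip(priors, means, covs)]
    return int(np.argmax(posts))             # class with maximum posterior

print(bayes_classify([1.8, 0.9]))            # -> 1 (near class-1 mean)
```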

  5. Hughes phenomenon
  Inherent sparsity of high-dimensional spaces
  * in the absence of simplifying assumptions, the amount of data needed to obtain reasonably low-variance estimators is very high
  * N-band observations give N times more data, but in R^N space
  Dimensionality reduction
  * choose an appropriate dimensionality for the reduced feature space
  * the important structure in the data often lies in a much lower-dimensional space, so one tries to reduce the dimensionality before attempting classification; this approach succeeds if the dimensionality-reduction/feature-extraction method loses as little relevant information as possible in the transformation from the high-dimensional space to the low-dimensional one (illustrated below)
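The sketch below is an illustrative toy experiment (hypothetical data, not the talk's): with a fixed, small training set, test accuracy eventually degrades as uninformative bands are appended, which is exactly the Hughes phenomenon.

```python
# Toy demonstration of the Hughes phenomenon: only band 0 carries class
# information; adding noisy bands at a fixed training-set size hurts a
# k-NN classifier's test accuracy.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def two_classes(n, n_bands):
    y = rng.integers(0, 2, size=n)
    X = rng.normal(size=(n, n_bands))
    X[:, 0] += 2.0 * y                       # only band 0 is informative
    return X, y

n_train, n_test = 40, 2000                   # deliberately few training samples
for n_bands in (1, 2, 5, 10, 20, 40):
    Xtr, ytr = two_classes(n_train, n_bands)
    Xte, yte = two_classes(n_test, n_bands)
    acc = KNeighborsClassifier(n_neighbors=5).fit(Xtr, ytr).score(Xte, yte)
    print(f"{n_bands:3d} bands: test accuracy = {acc:.3f}")
```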

  6. Dimensionality reduction
  PCA (Karhunen-Loève expansion) and related transforms
  * rotates the original feature space before projecting the feature vectors onto a limited number of axes
  * energy-based criterion (variance)
  * PCA seeks to minimize the mean squared reconstruction error
  * equivalently, it maximizes the variance of the projection (see the sketch below)
  * probabilistic PCA (PPCA, 1999): Gaussian prior
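A minimal PCA sketch (not the authors' implementation): each pixel's band vector is projected onto the leading eigenvectors of the band covariance matrix, which maximizes projected variance and minimizes mean squared reconstruction error. The cube shape (rows, cols, bands) and the helper name are assumptions.

```python
# PCA reduction of a multiband cube via eigendecomposition of the
# bands x bands covariance matrix; keeps the k leading axes.
import numpy as np

def pca_reduce(cube, k):
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    X -= X.mean(axis=0)                      # center each band
    cov = np.cov(X, rowvar=False)            # bands x bands covariance
    eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues
    top = eigvecs[:, ::-1][:, :k]            # k leading eigenvectors
    return (X @ top).reshape(rows, cols, k)

cube = np.random.rand(64, 64, 48)            # stand-in for real data
print(pca_reduce(cube, 3).shape)             # (64, 64, 3)
```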

  7. Dimensionality reduction
  ICA principles
  * source-mixture model (the “cocktail party problem”)
  * a linear transform making the data components independent
  * mutual information measured by the Kullback-Leibler divergence
  * weak mutual information between sources: negentropy (non-Gaussianity criterion)
  * pre-processing: centered data, spherical (whitened) noise
  * ambiguities: loss of source order, loss of source power
  ICA methods (see the sketch below)
  * cumulant-based approach (Comon)
  * JADE: 4th-order cumulants + joint diagonalization (Cardoso, Souloumiac)
  * Infomax: neural network (Bell, Sejnowski)
  * FastICA (Oja & Hyvärinen)
  * SOBI: cross-correlation + joint diagonalization (Belouchrani)…
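As a hedged illustration of one of the listed algorithms, the sketch below applies scikit-learn's FastICA (deflation variant) to a band cube; the cube shape, helper name and component count are assumptions, not the authors' code.

```python
# ICA-based reduction of a multiband cube with FastICA; the estimated
# independent sources replace the original bands.
import numpy as np
from sklearn.decomposition import FastICA

def ica_reduce(cube, k):
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands)
    # FastICA centers (and by default whitens) the data internally,
    # matching the pre-processing step listed on the slide.
    ica = FastICA(n_components=k, algorithm="deflation", random_state=0)
    S = ica.fit_transform(X)                 # estimated independent sources
    return S.reshape(rows, cols, k)

cube = np.random.rand(64, 64, 48)            # stand-in for real data
print(ica_reduce(cube, 3).shape)             # (64, 64, 3)
```

Note the two ambiguities listed above: the recovered sources come back in arbitrary order and with arbitrary scale.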

  8. Dimensionality reduction
  Multidimensional scaling
  * a family of multivariate data-analysis techniques: any method searching for a low-dimensional representation of objects given their high-dimensional representation
  Projection pursuit
  * Bhattacharyya distance between two distributions (computed in the sketch below)
  * subspaces maximizing this distance are sought
  Kohonen’s self-organizing map
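For two Gaussian class distributions the Bhattacharyya distance has the standard closed form D_B = (1/8)(mu1 - mu2)^T S^-1 (mu1 - mu2) + (1/2) ln( det S / sqrt(det S1 det S2) ), with S = (S1 + S2)/2; a projection-pursuit search would maximize this over candidate subspaces. A minimal sketch of the formula follows (the test values are made up).

```python
# Bhattacharyya distance between two multivariate Gaussians, the
# projection-pursuit criterion named on the slide.
import numpy as np

def bhattacharyya(mu1, cov1, mu2, cov2):
    cov = 0.5 * (cov1 + cov2)
    diff = mu1 - mu2
    term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
    _, logdet = np.linalg.slogdet(cov)
    _, logdet1 = np.linalg.slogdet(cov1)
    _, logdet2 = np.linalg.slogdet(cov2)
    term2 = 0.5 * (logdet - 0.5 * (logdet1 + logdet2))
    return term1 + term2

mu1, mu2 = np.zeros(3), np.ones(3)
print(bhattacharyya(mu1, np.eye(3), mu2, 2.0 * np.eye(3)))
```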

  9. Dimensionality reduction
  Limits: a reduction in the number of features may lead to a loss of discrimination power and thereby lower the accuracy of the resulting recognition system.
  Dimensionality reduction
  * feature selection: selects the best subset of the input feature set
  * feature extraction: creates new features based on transformations or combinations of the original features
  The main issue in dimensionality reduction is the choice of a criterion function. A commonly used criterion is the classification error of a feature subset (see the sketch below).
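A hedged sketch of feature selection with classification error as the criterion function: greedy forward selection scored by cross-validated accuracy. The classifier choice, helper name and synthetic data are illustrative assumptions.

```python
# Greedy forward feature selection: at each step, add the feature that
# maximizes cross-validated accuracy (i.e. minimizes classification error).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def forward_select(X, y, n_keep):
    selected, remaining = [], list(range(X.shape[1]))
    clf = KNeighborsClassifier(n_neighbors=5)
    while len(selected) < n_keep:
        scores = [(cross_val_score(clf, X[:, selected + [j]], y, cv=5).mean(), j)
                  for j in remaining]
        best_score, best_j = max(scores)
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

X = np.random.default_rng(1).normal(size=(100, 12))
y = (X[:, 3] + X[:, 7] > 0).astype(int)      # only features 3 and 7 matter
print(forward_select(X, y, n_keep=2))        # likely [3, 7] or [7, 3]
```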

  10. Vector-valued images
  * multispectral data: fewer than 10 bands
  * superspectral data: between 10 and 50 bands
  * hyperspectral data: more than 50 bands
  MARSIAA software: Gaussian model on a Markovian quadtree or a Markov chain, for classification tasks. Superspectral data undergo reduction (clustering, PCA, PPCA, ICA, projection pursuit…) before segmentation.
  Other imagery modalities: polarimetric imagery (Stokes imagery, Mueller imagery), magnetic resonance imagery, and multimodal imagery obtained by combining different imaging modalities.
  Reference: J.-N. Provost, Ch. Collet, P. Rostaing, P. Pérez and P. Bouthemy, “Hierarchical Markovian Segmentation of Multispectral Images for the Reconstruction of Water Depth Maps”, Computer Vision and Image Understanding, to appear, December 2003.

  11. Mueller imaging
  The Mueller matrix describes the interaction between the light source and raw materials.
  Segmentation: the veins of the leaf are well detected.
  Reference: J. Zallat, Ch. Collet and Y. Takakura, “Polarization Images Clustering”, Applied Optics, to appear, January 2004.

  12. MRI
  3D MARSIAA software (Markovian quadtree or Markov chain) for classification tasks: Markov chain and 3D Markovian quadtree applied to magnetic resonance imagery; multimodal imagery obtained by combining different imaging modalities.

  13. MRI [image slide]

  14. Vector-valued images [image slide]

  15. Vector-valued images [image slide]

  16. The images to be reduced
  48 bands around the CO line of the GG Tauri system, from the IRAM interferometer.

  17. Reduction using local projections (1st technique)
  Multispectral/superspectral cube
  * bottom-up clustering of the bands, using a multiscale similarity measure based on normalized histograms and barycenters (grouping)
  * local projections on each cluster obtained by the grouping: first axis of the principal component analysis (PCA), or first axis of fastICA with deflationary orthogonalization (PCA/ICA)
  * segmentation: Markov modelling on the quadtree

  18. Reduction using local projections (1st technique): grouping
  Similarity measure between clusters of the multispectral/superspectral image: the difference between normalized histograms and barycenters at each scale (S0, S1, S2), summed over all scales and normalized.
  Illustration of the bottom-up clustering algorithm: the two closest clusters are grouped at each iteration (sketched below).
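The sketch below is a loose reading of this grouping step, not the authors' code: bands are compared through a similarity mixing normalized-histogram difference and barycenter (mean) difference, accumulated over scales, and the two closest clusters are merged at each iteration. The exact multiscale decomposition and weighting used in the talk are not specified here; simple dyadic sub-sampling and equal weights are assumptions, and bands are assumed rescaled to [0, 1].

```python
# Bottom-up grouping of bands with a crude multiscale histogram +
# barycenter similarity (dyadic sub-sampling stands in for a pyramid).
import numpy as np

def band_distance(a, b, n_scales=3, n_bins=32):
    d = 0.0
    for s in range(n_scales):
        step = 2 ** s                        # dyadic sub-sampling per scale
        xa, xb = a[::step, ::step], b[::step, ::step]
        ha, _ = np.histogram(xa, bins=n_bins, range=(0, 1), density=True)
        hb, _ = np.histogram(xb, bins=n_bins, range=(0, 1), density=True)
        d += np.abs(ha - hb).sum() + abs(xa.mean() - xb.mean())
    return d / n_scales                      # normalize over scales

def group_bands(cube, n_clusters):
    """Merge the two closest band clusters until n_clusters remain."""
    clusters = [[i] for i in range(cube.shape[2])]
    mean_im = lambda c: cube[:, :, c].mean(axis=2)   # cluster's mean image
    while len(clusters) > n_clusters:
        pairs = [(band_distance(mean_im(ci), mean_im(cj)), i, j)
                 for i, ci in enumerate(clusters)
                 for j, cj in enumerate(clusters) if i < j]
        _, i, j = min(pairs)
        clusters[i] += clusters.pop(j)       # safe: j > i
    return clusters

cube = np.random.rand(32, 32, 48)            # stand-in, rescaled to [0, 1]
print(group_bands(cube, 4))                  # 4 clusters of band indices
```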

  19. Reduction using local projections (1st technique): local projections
  On each cluster established by the grouping step, we perform one of two projections:
  * PCA: seeks to maximize the data variance; the projection matrix is given by the eigenvectors of the covariance matrix of the data.
  * ICA: we use the fastICA algorithm with deflationary orthogonalization, which seeks to maximize non-Gaussianity.
  Finally, one keeps only the first image, corresponding to the largest eigenvalue (PCA) or to the highest non-Gaussianity criterion (ICA), as in the sketch below.
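A minimal sketch of the PCA variant of this step (the ICA variant would substitute a fastICA first source, as in the earlier FastICA sketch): each band cluster is projected independently and only its first principal-component image is kept, giving one reduced image per cluster. `clusters` is the output of the grouping sketch above; all names and shapes are assumptions.

```python
# Local projections: one first-principal-component image per band cluster.
import numpy as np

def first_pc_image(sub):
    """First principal-component image of a (rows, cols, bands) sub-cube."""
    rows, cols, bands = sub.shape
    X = sub.reshape(-1, bands).astype(float)
    X -= X.mean(axis=0)
    cov = np.atleast_2d(np.cov(X, rowvar=False))     # handles 1-band clusters
    _, eigvecs = np.linalg.eigh(cov)                 # ascending eigenvalues
    return (X @ eigvecs[:, -1]).reshape(rows, cols)  # largest-eigenvalue axis

def local_projections(cube, clusters):
    return np.stack([first_pc_image(cube[:, :, b]) for b in clusters], axis=2)
```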

  20. The images reduced by the 1st technique with PCA [image slide]

  21. The images reduced by the 1st technique with PCA: grouping, then hierarchical Markovian segmentation (MARSIAA) [image slide]

  22. Segmentation results of the images reduced by the 1st technique with PCA: a map on each reduced image [image slide]

  23. Segmentation results of the images reduced by the 1st technique with PCA: combined maps [image slide]

  24. The images reduced by the 1st technique with ICA [image slide]

  25. The images reduced by the 1st technique with ICA: grouping, then hierarchical Markovian segmentation (MARSIAA) [image slide]

  26. Segmentation results of the images reduced by the 1st technique with ICA: a map on each reduced image [image slide]
