

  1. (4D) Variational Models Preserving Sharp Edges. Martin Burger, Institute for Computational and Applied Mathematics.

  2. Mathematical Imaging Workgroup @WWU. [Figure: Raman spectra, intensity (cnt) vs. Raman shift (600-1800 cm^-1), with traces labeled DNA, acrosome, flagellum, and glass.]

  3. Some Philosophy. "No matter what question, L1 is the answer" (Stanley O.). Regularization in data assimilation is today at the state biomedical imaging was at ten years ago; the understanding and methods gained in medical imaging can hopefully be useful in geosciences and data assimilation.

  4. Biomedical Imaging: 2000 vs. 2010

     Modality     | State of the art 2000        | State of the art 2010
     Full CT      | Filtered backprojection      | Exact reconstruction
     PET/SPECT    | Filtered backprojection / EM | EM-TV / Dynamic Sparse
     PET-CT       | -                            | EM-AnatomicalTV
     Acousto-Opt. | -                            | Wavelet Sparse / TV
     EEG/MEG      | LORETA                       | Sparsity / Bayesian
     ECG-BSPM     | Least Norm                   | L1 of normal derivative
     Microscopy   | None, linear filter          | Poisson-TV / Shearlet-L1

  5. Based on joint work with Martin Benning, Michael Möller, Felix Lucka, Jahn Müller (Münster); Stanley Osher (UCLA); Christoph Brune (Münster / UCLA / Vancouver); Fabian Lenz (Münster); Silvia Comelli (Milano / Münster); Eldad Haber (Vancouver); Mohammad Dawood, Klaus Schäfers (NucMed / EIMI Münster). SFB 656.

  6. Regularization of Inverse Problems. We want to solve an operator equation Ku = f, with a forward operator K between Banach spaces and a finite-dimensional approximation of the data (sampling, averaging).

  7. Dynamic Biomedical Imaging: Maximum Likelihood / Bayes. Reconstruct the maximum-likelihood estimate from a model of the posterior probability (Bayes); this yields a regularized variational problem for finite m.

  8. Minimization of the Penalized Log-Likelihood. The general variational approach combines a nonlocal part (including the operator K) with a local regularization functional. For Gaussian noise, note that the covariance is hidden in the output norm.

  9. Example: Gauss. Additive noise, i.i.d. on each pixel, with mean zero and variance σ². Minimizing the negative posterior log-likelihood yields the asymptotic variational model reconstructed below.
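The functionals on this slide were images; the following is a standard reconstruction of the Gaussian MAP derivation, consistent with the penalized log-likelihood setup of the previous slide (the precise scaling of the regularization parameter is an assumption):

```latex
p(u \mid f) \;\propto\; \exp\!\Big(-\frac{1}{2\sigma^2}\,\|Ku - f\|^2\Big)\,\exp\!\big(-\alpha\, J(u)\big)
\quad\Longrightarrow\quad
\hat u \in \arg\min_u \; \frac{1}{2}\,\|Ku - f\|^2 + \alpha\sigma^2\, J(u).
```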

  10. Optimality. Existence and uniqueness follow by variational methods. In the general case the optimality condition is a nonlinear integro-differential equation / inclusion (integral operator K, differential operator in J). The Gaussian case is given below.
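The Gaussian optimality condition appeared as an image; from the variational model above it is the standard inclusion

```latex
K^*(K\hat u - f) + \alpha\, p = 0, \qquad p \in \partial J(\hat u),
```

which for smooth J reduces to the nonlinear integro-differential equation mentioned in the general case.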

  11. Robustness. Because the data are noisy, robustness of the solution with respect to errors in f is important. The problem is robust for large α, but the data are reproduced only for small α. Solutions converge, as f converges or as α tends to zero, in the weak* topology.

  12. Structure of Solutions. Analysis by convex optimization techniques and duality; the structure of the subgradients is important. Possible solutions satisfy a source condition, which allows one to gain information about regularity (e.g., of edges).
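The source condition itself was an image; the form standard in the papers cited later (mb-Osher 04, mb-Resmerita-He 07) requires the subgradient to lie in the range of the adjoint operator:

```latex
\exists\, w: \quad p = K^* w, \qquad p \in \partial J(\hat u).
```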

  13. Structure of Solutions. In the optimality condition, the structure of u is determined completely by the properties of u_B and K*. For smoothing operators K, a singularity not present in u_B cannot be detected. Model error goes into K resp. K* and directly modifies u.

  14. 4D VAR. Given time dynamics starting from an unknown initial value, the variational problem estimates the initial state for further prediction.

  15. 4D VAR = 3D Variational Problem. Eliminating the later states via the dynamics yields an effective variational problem for the initial value in 3D, reconstructed below.
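The eliminated problem was shown as an image; the following is a generic reconstruction, with model propagators M_k mapping the initial state u_0 to the state at time t_k, observation operators H_k, and data f_k (all notation is assumed, not from the slide):

```latex
u_k = M_k(u_0), \qquad
\min_{u_0}\; \sum_{k} \frac{1}{2}\,\big\| H_k\, M_k(u_0) - f_k \big\|^2 \;+\; \alpha\, J(u_0).
```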

  16. Example: Linear Advection. Minimize the quadratic fidelity plus the total variation of the initial value, subject to the advection dynamics, using an upwind discretization (sketched below).
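The equation and scheme appeared only as images; here is a minimal sketch of the first-order upwind discretization for the linear advection equation u_t + a u_x = 0 with a > 0 and periodic boundaries (grid sizes, CFL number, and the step-profile initial value are assumptions for illustration):

```python
import numpy as np

def advect_upwind(u0, a=1.0, dx=0.01, dt=0.005, n_steps=100):
    """Propagate u0 under u_t + a u_x = 0 (a > 0) with first-order upwind
    differences and periodic boundaries. Stable for CFL number a*dt/dx <= 1."""
    c = a * dt / dx
    assert 0.0 < c <= 1.0, "CFL condition violated"
    u = u0.copy()
    for _ in range(n_steps):
        # backward difference in x, since information travels to the right
        u = u - c * (u - np.roll(u, 1))
    return u

# initial value with sharp edges (a step), as in the TV experiments
x = np.linspace(0.0, 1.0, 100, endpoint=False)
u0 = (x > 0.25).astype(float) * (x < 0.5)
uT = advect_upwind(u0)
```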

  17. 4D VAR for Linear Advection. The reconstruction shows the usual Gibbs phenomenon.

  18. 4D VAR for Linear Advection. [Figure: full observations (black), noisy observations (blue), 40 noisy samples (red).]

  19. 4D VAR for Linear Advection. [Figure: reconstructions for different noise variances.]

  20. Analysis of Model Error. Optimality condition as before. The exact solution operator for linear advection is almost unitary, and the error estimate follows.

  21. Beyond Gaussian Priors. Again, consider the optimality condition for the MAP estimate. If J is strictly convex and smooth, the subdifferential is a singleton containing only the gradient of J, which can be inverted to obtain a similar relation; again the operator determines the structure. The only chance to obtain full robustness is a multivalued subdifferential: singular regularization.

  22. Singular Regularization. Construct J such that the subdifferential is large at the points where you want robustness. Example: l1 sparsity, where zeros are robust (see the sketch below).
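A small numeric illustration of why zeros are robust under l1 regularization (a sketch, not from the slides): the soft-thresholding (proximal) map sends every component with magnitude at most the threshold to exactly zero, so small data perturbations do not move those components at all.

```python
import numpy as np

def soft_threshold(f, alpha):
    """Proximal map of alpha*||u||_1: componentwise shrinkage toward zero."""
    return np.sign(f) * np.maximum(np.abs(f) - alpha, 0.0)

f = np.array([0.0, 0.05, 0.8, -1.2])
noise = 0.03 * np.array([1.0, -1.0, 1.0, -1.0])
alpha = 0.1

u_clean = soft_threshold(f, alpha)          # [0, 0, 0.7, -1.1]
u_noisy = soft_threshold(f + noise, alpha)  # the zeros stay exactly zero
```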

  23. TV-Methods: Structural Prior (Cartooning). Penalization of the total variation. ROF model for denoising data g: minimize the total variation subject to a fidelity constraint (Rudin-Osher-Fatemi 89, 92); see the reconstruction below.
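The ROF formulation appeared as an image; its standard constrained form, for noisy data g with noise level σ, is

```latex
\min_u \int_\Omega |\nabla u| \qquad \text{subject to} \qquad \int_\Omega (u - g)^2\,dx \le \sigma^2 |\Omega|,
```

or, in unconstrained (Lagrangian) form, minimize the total variation plus a quadratic fidelity with a suitable multiplier.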

  24. Why TV-Methods? Cartooning. [Figure: linear filter vs. TV method.]

  25. ROF Model. [Figure: clean, noisy, and ROF-denoised images.]

  26. H2(15O) PET, Left Ventricular Time Frame. [Figure: EM, EM-Gauss, and EM-TV reconstructions.]

  27. Dynamic Biomedical Imaging: H2(15O) PET, Right Ventricular Time Frame. [Figure: EM, EM-Gauss, and EM-TV reconstructions.]

  28. 4D VAR for Linear Advection. The reconstruction shows the usual Gibbs phenomenon.

  29. 4D VAR for Linear Advection. [Figure: full observations (black), noisy observations (blue), 40 noisy samples (red).]

  30. 4D VAR TV for Linear Advection. [Figure: comparison for full observations.]

  31. 4D VAR TV for Linear Advection. [Figure: comparison for observed samples.]

  32. 4D VAR TV for Linear Advection. [Figure: comparison for observed samples with noise.]

  33. Analysis of Model Error. The variational problem is as before, with the model-error term added; the optimality condition follows as before.

  34. Analysis of Model Error. Structures are robust: apply T in the region where the subgradient condition holds. If we find s solving a Poisson equation with the corresponding right-hand side, then the robustness estimate follows.

  35. Numerical Solution: Splitting or ALM. Operator splitting into a standard problem (dependent on the existing code) and a simple denoising-type problem. Example: Peaceman-Rachford splitting, sketched below.
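A minimal sketch of the splitting idea in Python: Peaceman-Rachford alternates reflected proximal steps between the data-fidelity part (the "standard problem", here a quadratic solved by a linear system) and a denoising-type regularization step (here a stand-in l1 shrinkage). The concrete prox operators, step size, and iteration count are assumptions, not from the slides:

```python
import numpy as np

def prox_data(z, K, f, tau):
    """Prox of tau/2 * ||K u - f||^2: solve (I + tau K^T K) u = z + tau K^T f."""
    n = K.shape[1]
    return np.linalg.solve(np.eye(n) + tau * K.T @ K, z + tau * K.T @ f)

def prox_reg(z, tau, alpha):
    """Stand-in denoising prox: soft thresholding for alpha*||u||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - tau * alpha, 0.0)

def peaceman_rachford(K, f, alpha, tau=1.0, n_iter=200):
    """Fixed-point iteration z <- (2 prox_g - I)(2 prox_f - I) z."""
    z = np.zeros(K.shape[1])
    for _ in range(n_iter):
        u = prox_data(z, K, f, tau)        # standard (data) problem
        z_half = 2 * u - z                 # first reflection
        v = prox_reg(z_half, tau, alpha)   # denoising-type problem
        z = 2 * v - z_half                 # second reflection
    return v
```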

  36. Bayes and Uncertainty. Natural prior probabilities for singular regularizations can be constructed even in a Gaussian framework: interpret J(u) as a random variable with variance s². The prior probability density and the resulting MAP functional are reconstructed below.
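The density and the MAP functional were images; the following reconstruction is consistent with treating J(u) as a mean-zero Gaussian variable of variance s² (the precise scaling is an assumption):

```latex
\pi(u) \;\propto\; \exp\!\Big(-\frac{J(u)^2}{2 s^2}\Big), \qquad
\hat u \in \arg\min_u\; \frac{1}{2\sigma^2}\,\|Ku - f\|^2 + \frac{J(u)^2}{2 s^2}.
```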

  37. Bayes and Uncertainty. Equivalence to the original form via constrained regularization: for an appropriate choice of α and γ, minimizing either functional is equivalent to the constrained problem below.
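The displayed problems were images; a plausible reconstruction of the claimed equivalence, tying the quadratic prior of the previous slide to the usual penalty (the exact functionals are assumptions; the parameter pairing is standard Lagrange duality):

```latex
\min_u\; \frac{1}{2\sigma^2}\|Ku-f\|^2 + \frac{J(u)^2}{2 s^2}
\qquad\text{and}\qquad
\min_u\; \frac{1}{2\sigma^2}\|Ku-f\|^2 + \alpha\, J(u)
```

have the same minimizer û when α = J(û)/s², and both are equivalent to minimizing the fidelity subject to the constraint J(u) ≤ γ with γ = J(û).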

  38. Uncertainty Quantification. Sampling with standard MCMC schemes is difficult. A novel Gibbs sampler by F. Lucka is based on analytical integration of the posterior distribution function in 1D. Theoretical insight: MSc thesis of Silvia Comelli. [Figure: CM estimate for the TV prior.]

  39. Uncertainty Quantification II. Error estimates in dependence on the noise, using source conditions. Such estimates need an appropriate distance measure, the generalized Bregman distance (mb-Osher 04, Resmerita 05, mb-Resmerita-He 07, Benning-mb 09). Estimates for Bayesian distributions in Bregman transport distances (with H. Pikkarainen), which reduce to the 2-Wasserstein distance in the Gaussian case.

  40. Uncertainty Quantification III. Idea: construct linear functionals from nonlinear eigenvectors. For TV denoising (also for the linear advection example), this yields an estimate of the maximal error for mean values on balls; for l1 sparsity, an estimate of the error in single components (Benning PhD 11, Benning-mb 11).

  41. Loss of Contrast. ROF minimization loses contrast: the total variation of the reconstruction is smaller than the total variation of the clean image, and image features are left in the residual f - u. [Figure: clean image g, noisy image f, ROF result u, residual f - u.] mb-Gilboa-Osher-Xu 06.

  42. Loss of Contrast = Systematic Bias of TV. The bias becomes more severe in ill-posed problems with an operator K; it is not just a simple visual effect to be corrected, but a loss of information. Simple idea for least squares: add back the noise to amplify = augmented Lagrangian (Osher-mb-Goldfarb-Xu-Yin 2005).

  43. Bregman Iteration. The add-back procedure can be shown to be equivalent to a Bregman iteration, with an immediate generalization to convex fidelities and regularizers. Generalization to Gauss-Newton-type methods for nonlinear K: use the linearization of K around the last iterate u_l (Bachmayr-mb 2009). A sketch follows below.
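A minimal sketch of the "add back the residual" form of the Bregman iteration for a least-squares fidelity, with a generic variational solver as the inner step (the inner solver here is a placeholder; in the talk it would be an ROF/TV solve):

```python
import numpy as np

def bregman_iteration(f, solve_variational, n_iter=10):
    """Repeatedly solve the regularized problem with the accumulated residual
    added back to the data, which restores the contrast that TV removes."""
    v = np.zeros_like(f)  # accumulated residual ("noise added back")
    u = np.zeros_like(f)
    for _ in range(n_iter):
        u = solve_variational(f + v)  # e.g. an ROF/TV denoising solve
        v = v + (f - u)               # add back what the step removed
    return u

# usage with any denoiser, e.g. the soft thresholding from slide 22:
# u = bregman_iteration(f, lambda g: soft_threshold(g, alpha=0.1))
```

Terminating the loop by a discrepancy principle provides the regularizing effect described on the next slide.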

  44. Bregman Iteration. Its properties are those of an iterative regularization method: the regularizing effect comes from appropriate termination of the iteration. Performance is better when the single steps oversmooth, i.e. when the regularization parameter α is very large; the limit is the inverse scale space method (mb-Gilboa-Osher-Xu 2006).

  45. Why Does Inverse Scale Space Work? Consider the singular value decomposition in the fully quadratic case; inserting the eigenfunctions yields the explicit solution reconstructed below. Convergence is faster in the low frequencies (large eigenvalues).
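The computation on the slide was an image; here is a standard reconstruction for the fully quadratic case J(u) = ½||u||², where the inverse scale space flow is ∂_t u = K*(f - Ku) with u(0) = 0 (the notation for the singular system is an assumption). Expanding in the singular system gives

```latex
K\varphi_j = \sigma_j \psi_j, \qquad
u(t) = \sum_j \big(1 - e^{-\sigma_j^2 t}\big)\,\frac{\langle f, \psi_j\rangle}{\sigma_j}\,\varphi_j ,
```

so components with large singular values (low frequencies) are recovered first, and fine scales enter only as t grows.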
