Adaptive Patch-based Image Denoising by EM-Adaptation
Purdue University
Joint work with Enming Luo and Truong Nguyen (UCSD)
Image Denoising … AGAIN!?
Image Denoising
Consider an additive iid Gaussian noise model:
$$y = x + \varepsilon, \quad \text{where} \quad \varepsilon \sim \mathcal{N}(0, \sigma^2 I).$$
Our goal is to estimate $x$ from $y$.
Our Approach: Maximum-a-Posteriori (MAP)
MAP Framework
Since the noise is iid Gaussian, the conditional distribution is
$$p(y \mid x) = \frac{1}{(2\pi\sigma^2)^{n/2}} \exp\!\left(-\frac{\|y - x\|^2}{2\sigma^2}\right).$$
Therefore, the MAP estimate is
$$\hat{x} = \arg\max_x \; p(x \mid y) = \arg\min_x \; \frac{1}{2\sigma^2}\|y - x\|^2 - \log p(x).$$
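To make the MAP objective concrete, here is a minimal sketch assuming a single zero-mean Gaussian prior $\mathcal{N}(0, \Sigma)$ (a simplification for illustration, not the patch-based prior used in the rest of the talk). In this special case the minimizer has the closed-form Wiener solution $\hat{x} = \Sigma(\Sigma + \sigma^2 I)^{-1} y$. The function name is illustrative.

```python
# Minimal sketch: MAP denoising under a single zero-mean Gaussian prior N(0, Sigma).
# This is a simplified illustration, not the patch-based method of the talk.
import numpy as np

def map_gaussian_denoise(y, Sigma, sigma):
    # argmin_x  ||y - x||^2 / (2 sigma^2) + (1/2) x^T Sigma^{-1} x
    # has the closed form  x = Sigma (Sigma + sigma^2 I)^{-1} y  (a Wiener filter).
    d = y.size
    return Sigma @ np.linalg.solve(Sigma + sigma**2 * np.eye(d), y)
```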
Image Priors
• Markov Random Field (80s)
• Gradients (80s)
• Total Variation (90s)
• X-lets (wavelet, contourlet, curvelet, …, 90s)
• Lp norm (00s)
• Dictionary (KSVD, 00s)
• Example (00s)
• Non-local (BM3D, non-local means, 2005, 2007)
• Shotgun! (2011)
• Graph Laplacian (2012)
Patch-based Priors
What is a patch? A patch is a small block of pixels in an image.
Why patches? What is a patch-based prior?
Training a Patch-based Prior
Typically, we train a patch-based prior from a large collection of images using the EM Algorithm, e.g., a Gaussian mixture:
$$p(x) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x;\, \mu_k, \Sigma_k).$$
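As a concrete (and hedged) illustration of this training step, the sketch below fits a Gaussian mixture to 8x8 patches using scikit-learn's EM implementation. The patch size, number of components, DC removal, and function names are illustrative choices, not the exact training pipeline behind the slides.

```python
# Sketch: train a GMM prior on 8x8 image patches (illustrative settings).
import numpy as np
from sklearn.feature_extraction.image import extract_patches_2d
from sklearn.mixture import GaussianMixture

def train_gmm_prior(images, patch_size=(8, 8), n_components=32, patches_per_image=2000):
    patches = []
    for img in images:                                   # img: 2-D grayscale array
        p = extract_patches_2d(img, patch_size, max_patches=patches_per_image)
        p = p.reshape(len(p), -1).astype(float)
        p -= p.mean(axis=1, keepdims=True)               # remove each patch's DC component
        patches.append(p)
    X = np.vstack(patches)
    gmm = GaussianMixture(n_components=n_components, covariance_type='full')
    return gmm.fit(X)                                    # EM runs inside .fit()
```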
Good Training Set
How good? Example: Text Image
[Figure: clean image; noisy image; BM3D (single-image method); [Luo-Chan-Nguyen, 15] (uses targeted training)]
Challenge:
(1) Finding good examples is HARD.
(2) Finding a lot of good examples is EVEN HARDER.
This Talk: Can priors be learned adaptively?
[Figure: target image → update → Gaussian mixture model ← generic database of 2 million 8x8 image patches [Zoran-Weiss '11]]
Our Proposed Idea
Toy Example
Imagine that:
(a) Original generic database (A LOT of samples)
(b) Ideal targeted database (A LOT of samples)
(c) In reality, samples from the targeted database are FEW!!!
EM Adaptation
Classical EM: re-estimate the mixture weights, means, and covariances from the training samples alone.
EM Adaptation: start from the generic mixture and update each component by blending the statistics of the (few) target samples with the corresponding generic parameters, so the model swings between the generic database and the target data depending on how many target samples support each component.
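The adaptation equations from the slides are sketched here only in the standard Reynolds-style MAP-adaptation form (the formulation cited on the next slide): each parameter is a convex combination of the target statistic and the generic value, weighted by alpha_k = n_k / (n_k + rho). The relevance factor rho and the function names are illustrative assumptions, not necessarily the exact update rules used in the talk.

```python
# Hedged sketch: one pass of Reynolds-style EM adaptation of a generic GMM to a
# small set of target patches Y. The relevance factor rho is an assumed constant.
import numpy as np
from scipy.stats import multivariate_normal

def em_adapt(pis, mus, Sigmas, Y, rho=16.0):
    K, (n, d) = len(pis), Y.shape
    # E-step: responsibilities of the target samples under the generic mixture.
    resp = np.stack([pis[k] * multivariate_normal.pdf(Y, mus[k], Sigmas[k])
                     for k in range(K)], axis=1)               # shape (n, K)
    resp /= resp.sum(axis=1, keepdims=True)
    nk = resp.sum(axis=0)                                      # soft counts per component
    alpha = nk / (nk + rho)                                    # data-vs-prior balance
    new_pis, new_mus, new_Sigmas = np.empty(K), np.empty((K, d)), np.empty((K, d, d))
    for k in range(K):
        Ey  = resp[:, k] @ Y / max(nk[k], 1e-12)               # target mean statistic
        Eyy = (resp[:, k, None] * Y).T @ Y / max(nk[k], 1e-12) # target second moment
        new_pis[k] = alpha[k] * nk[k] / n + (1 - alpha[k]) * pis[k]
        new_mus[k] = alpha[k] * Ey + (1 - alpha[k]) * mus[k]
        new_Sigmas[k] = (alpha[k] * Eyy
                         + (1 - alpha[k]) * (Sigmas[k] + np.outer(mus[k], mus[k]))
                         - np.outer(new_mus[k], new_mus[k]))
    new_pis /= new_pis.sum()                                   # renormalize weights
    return new_pis, new_mus, new_Sigmas
```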
EM Adaptation in the Literature
Theory of EM Adaptation
• J. Gauvain and C. Lee, "Maximum a posteriori estimation for multivariate Gaussian mixture observations of Markov chains," IEEE Transactions on Speech and Audio Processing, vol. 2, no. 2, pp. 291–298, Apr. 1994.
• D. A. Reynolds, T. F. Quatieri, and R. B. Dunn, "Speaker verification using adapted Gaussian mixture models," Digital Signal Processing, vol. 10, no. 1, pp. 19–41, 2000.
• P. C. Woodland, "Speaker adaptation for continuous density HMMs: A review," in ITRW on Adaptation Methods for Speech Recognition, pp. 11–19, Aug. 2001.
• M. Dixit, N. Rasiwasia, and N. Vasconcelos, "Adapted Gaussian models for image classification," in IEEE Conference on Computer Vision and Pattern Recognition (CVPR'11), pp. 937–943, Jun. 2011.
EM Adaptation for Noisy Images
First pre-filter the image, i.e., denoise the image with a method you like.
Assume the pre-filtered image satisfies an additive Gaussian model with a (smaller) residual noise level. In this case, the M-step must be modified to account for that residual noise.
Stein's Unbiased Risk Estimator (SURE)
What is the difference?
Clean: the adaptation statistics are computed from noise-free patches.
Pre-filtered: the statistics are computed from patches that still carry residual error; SURE provides an unbiased estimate of that error.
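For reference, here is the standard form of SURE (a generally known identity, not transcribed from the slides): if $\hat{x} = f(y)$ is a weakly differentiable denoiser and $y = x + \varepsilon$ with $\varepsilon \sim \mathcal{N}(0, \sigma^2 I)$, then
$$\mathrm{SURE}(f, y) = \frac{1}{n}\|f(y) - y\|^2 - \sigma^2 + \frac{2\sigma^2}{n}\,\mathrm{div}_y f(y)$$
is an unbiased estimate of the mean squared error $\frac{1}{n}\,\mathbb{E}\|f(y) - x\|^2$. This is what allows the quality of the pre-filtered image to be assessed without access to the clean image.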
Results
[Figure: denoising comparison. EPLL: 31.48 dB; proposed: 31.80 dB]
Conclusion
EM Adaptation is
- a method to combine a generic database and the noisy image.
EM Adaptation automatically swings between
- the generic database
  - when the noise is extremely high
  - when the patches are relatively smooth
  - when there are insufficient training samples
- the noisy image
  - when there are sharp edges in a patch
  - when there are enough training samples
Questions?
We want to address two questions for MAP:
Question 1: How to SOLVE this optimization problem? (If we cannot solve this problem, then there is no point in continuing.)
Question 2: How to ADAPTIVELY learn a prior? Generic prior (from an arbitrary database) vs. specific prior (matched to the image of interest)
Half Quadratic Splitting
General Principle [Geman-Yang, T-IP, 1995]: introduce an auxiliary variable $z$ and replace
$$\min_x \; \frac{1}{2\sigma^2}\|y - x\|^2 - \log p(x)$$
by
$$\min_{x,z} \; \frac{1}{2\sigma^2}\|y - x\|^2 + \frac{\beta}{2}\|x - z\|^2 - \log p(z).$$
The Algorithm: alternate between
(1) $z \leftarrow \arg\min_z \; \frac{\beta}{2}\|x - z\|^2 - \log p(z)$
(2) $x \leftarrow \arg\min_x \; \frac{1}{2\sigma^2}\|y - x\|^2 + \frac{\beta}{2}\|x - z\|^2$
while gradually increasing $\beta$.
Solution to Problem (1): Example: Gaussian Mixture Model [Zoran-Weiss '11]
If $p(z) = \sum_{k} \pi_k \, \mathcal{N}(z;\, \mu_k, \Sigma_k)$ and we keep only the most likely mixture component, then the solution to (1) is the Wiener-type estimate
$$z = \left(\Sigma_{k^*} + \tfrac{1}{\beta} I\right)^{-1}\!\left(\Sigma_{k^*} x + \tfrac{1}{\beta}\,\mu_{k^*}\right),$$
where $k^*$ is the component with the largest posterior weight given $x$.
Solution to Problem (2):
Problem (2) is quadratic, so the solution to (2) is
$$x = \frac{\frac{1}{\sigma^2}\, y + \beta z}{\frac{1}{\sigma^2} + \beta}.$$
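Putting the two closed-form steps together, here is a hedged sketch of the half quadratic splitting loop for a single vectorized patch under a GMM prior. The beta schedule, the single-component approximation, and the function name are illustrative assumptions rather than the authors' implementation.

```python
# Hedged sketch: half quadratic splitting for one vectorized patch y under a GMM
# prior, alternating the two closed-form steps above. The beta schedule is assumed.
import numpy as np
from scipy.stats import multivariate_normal

def hqs_denoise_patch(y, sigma, pis, mus, Sigmas, betas=(1, 4, 16, 64, 256)):
    d = y.size
    x = y.copy()
    for beta in betas:
        # Problem (1): keep the most likely component and apply the Wiener-type update.
        k_star = np.argmax([np.log(pis[k]) +
                            multivariate_normal.logpdf(x, mus[k], Sigmas[k] + np.eye(d) / beta)
                            for k in range(len(pis))])
        S = Sigmas[k_star]
        z = np.linalg.solve(S + np.eye(d) / beta, S @ x + mus[k_star] / beta)
        # Problem (2): closed-form quadratic data-fidelity step.
        x = (y / sigma**2 + beta * z) / (1 / sigma**2 + beta)
    return x
```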
Question 1: How to SOLVE this optimization problem?
For a Gaussian mixture: use half quadratic splitting with the two closed-form solutions above.