Decomposition by operator-splitting methods and applications in image deblurring
Daniel O'Connor (Department of Mathematics, UCLA)
Lieven Vandenberghe (Electrical Engineering, UCLA)
Workshop on Optimization for Modern Computation, BICMR, Beijing, September 2–4, 2014
Primal-dual decomposition

minimize f(x) + g(Ax)

• f, g are 'simple' convex functions (indicators of simple sets, norms, ...)
• A is a structured matrix
• widely used format in the literature on multiplier and splitting algorithms

This talk: decomposition by splitting the primal-dual optimality conditions

0 ∈ \begin{bmatrix} \partial f(x) \\ \partial g^*(z) \end{bmatrix} + \begin{bmatrix} 0 & A^T \\ -A & 0 \end{bmatrix} \begin{bmatrix} x \\ z \end{bmatrix}

and applications in image deblurring
Models of image blur

b = Kx + w

• x is the exact image, Kx is the image blurred by the blurring operator K
• b is the observed image, equal to the blurred image plus noise w

Space-invariant blur: the structure of K depends on the boundary conditions
• periodic: W K W^H is diagonal, where W is the 2-D DFT matrix
• zero/replicate: K = K_c + K_s with K_c diagonalizable by the DFT, K_s sparse

Space-varying blur (Nagy-O'Leary model): K = \sum_{i=1}^m U_i K_i
• K_i: space-invariant blurring operators
• U_i: positive diagonal matrices that sum to the identity
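As a rough illustration of the periodic case (not part of the slides), the NumPy sketch below evaluates Kx as a circular convolution via the 2-D FFT, using the fact that W K W^H is diagonal; the Gaussian PSF, image size, and noise level are invented for the example.

```python
import numpy as np

def blur_periodic(x, psf):
    """Apply a space-invariant blur with periodic boundary conditions.

    With periodic BCs the blurring operator K is block-circulant, so it is
    diagonalized by the 2-D DFT: Kx is a circular convolution, evaluated
    here with FFTs.  The eigenvalues of K are the FFT of the centered PSF.
    """
    kernel = np.zeros_like(x, dtype=float)
    kh, kw = psf.shape
    kernel[:kh, :kw] = psf
    # shift so the PSF center sits at the (0, 0) corner
    kernel = np.roll(kernel, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    K_hat = np.fft.fft2(kernel)                      # eigenvalues of K
    return np.real(np.fft.ifft2(K_hat * np.fft.fft2(x)))

# toy usage:  b = Kx + w  with a small Gaussian PSF and additive noise
rng = np.random.default_rng(0)
x = rng.random((64, 64))
g = np.exp(-0.5 * (np.arange(-4, 5) / 1.5) ** 2)
psf = np.outer(g, g)
psf /= psf.sum()
b = blur_periodic(x, psf) + 0.01 * rng.standard_normal(x.shape)
```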
Deblurring via convex optimization

minimize φ_f(Kx − b) + φ_s(Dx) + φ_r(x)

Data fidelity term φ_f
• convex penalty, e.g., squared 2-norm, 1-norm, Huber penalty, ...
• indicator of a convex set, e.g., a 2-norm ball

Smoothing term φ_s
• D is a discretized first derivative, or a wavelet/shearlet transform matrix
• φ_s is a norm, e.g., for total variation reconstruction, a sum of 2-norms (see the sketch below)

φ_s(u, v) = γ ‖(u, v)‖_iso = γ \sum_{i=1}^n \sqrt{u_i^2 + v_i^2}
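A minimal sketch of the smoothing term, assuming D is the forward-difference gradient with periodic wrap-around (one common choice; the slide does not fix it):

```python
import numpy as np

def grad2d(x):
    """Discrete first derivatives (forward differences, periodic wrap);
    stacking the two outputs gives one common choice of the matrix D."""
    u = np.roll(x, -1, axis=1) - x     # horizontal differences
    v = np.roll(x, -1, axis=0) - x     # vertical differences
    return u, v

def phi_s(x, gamma):
    """Isotropic TV smoothing term: gamma * sum_i sqrt(u_i^2 + v_i^2)."""
    u, v = grad2d(x)
    return gamma * np.sqrt(u ** 2 + v ** 2).sum()
```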
Deblurring via convex optimization

minimize φ_f(Kx − b) + φ_s(Dx) + φ_r(x)

Regularization term φ_r
• penalty on x
• indicator of a convex set, for example {x | 0 ≤ x ≤ 1}

In composite form: minimize f(x) + g(Ax) with

f(x) = φ_r(x),   A = \begin{bmatrix} K \\ D \end{bmatrix},   g(u, v) = φ_f(u − b) + φ_s(v)
Outline
• Introduction
• Douglas-Rachford splitting method
• Primal-dual splitting
• Space-varying blur
Monotone operator

A set-valued mapping F is monotone if

(u − v)^T (x − y) ≥ 0   for all x, y ∈ dom F, u ∈ F(x), v ∈ F(y)

• the subdifferential ∂f of a closed convex function f
• a skew-symmetric linear operator, for example

F(x, z) = \begin{bmatrix} 0 & A^T \\ -A & 0 \end{bmatrix} \begin{bmatrix} x \\ z \end{bmatrix}

• sums of monotone operators, for example

F(x, z) = \begin{bmatrix} \partial f(x) \\ \partial g^*(z) \end{bmatrix} + \begin{bmatrix} 0 & A^T \\ -A & 0 \end{bmatrix} \begin{bmatrix} x \\ z \end{bmatrix}
Resolvent

The resolvent of a monotone operator F is the operator (I + tF)^{-1} (with t > 0)

Properties (for maximal monotone F)
• y = (I + tF)^{-1}(x) exists and is unique for all x
• y is the (unique) solution of the inclusion problem x ∈ y + tF(y)

Examples
• the resolvent of a subdifferential ∂f is called the proximal operator of f
• for linear monotone F, the resolvent (I + tF)^{-1} is a matrix inverse
Proximal operator

The proximal operator of a closed convex function f is the mapping

prox_{tf}(x) = \operatorname*{argmin}_y \left( f(y) + \frac{1}{2t} \|y - x\|_2^2 \right)

Examples
• f(x) = δ_C(x) (indicator of a closed convex set C): Euclidean projection

prox_{tf}(x) = P_C(x) = \operatorname*{argmin}_{y \in C} \|y - x\|_2

• f(x) = ‖x‖: shrinkage operation

prox_{tf}(x) = x − P_{tC}(x),   where C is the unit ball for the dual norm
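The examples above map directly to code. A small sketch of the prox-operators used later in the deblurring problems (box indicator, 1-norm written as x − P_{tC}(x), and the isotropic norm); the function names are ours:

```python
import numpy as np

def prox_box(x, lo=0.0, hi=1.0):
    """prox of the indicator of {lo <= x <= hi}: Euclidean projection."""
    return np.clip(x, lo, hi)

def prox_l1(x, t):
    """prox of t*||.||_1, written as x - P_{tC}(x) with C the unit ball of
    the dual norm (the infinity-norm ball); this is soft-thresholding."""
    return x - np.clip(x, -t, t)

def prox_iso(u, v, t):
    """prox of t*||(u, v)||_iso: shrink each pair (u_i, v_i) toward zero."""
    r = np.maximum(np.sqrt(u ** 2 + v ** 2), 1e-12)
    scale = np.maximum(1.0 - t / r, 0.0)
    return scale * u, scale * v
```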
Calculus rules for proximal operators

Separable function: if f(x_1, x_2) = f_1(x_1) + f_2(x_2), then

prox_f(x_1, x_2) = ( prox_{f_1}(x_1), prox_{f_2}(x_2) )

Moreau decomposition: relates the prox-operators of f and its conjugate f^*

prox_{tf^*}(x) + t \, prox_{t^{-1}f}(x/t) = x

Composition with an affine mapping: if f(x) = g(Ax + b) with AA^T = aI, then

prox_f(x) = (I − \tfrac{1}{a} A^T A) x + \tfrac{1}{a} A^T \left( prox_{ag}(Ax + b) − b \right)
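A quick numerical check of the Moreau decomposition for f = ‖·‖_1, so that f^* is the indicator of the unit ∞-norm ball; the test values are arbitrary:

```python
import numpy as np

# Numerical check of the Moreau decomposition for f = ||.||_1.
# f* is the indicator of the unit infinity-norm ball, so
# prox_{t f*} is the projection onto that ball and
# prox_{t^{-1} f} is soft-thresholding with threshold 1/t.
rng = np.random.default_rng(1)
x = rng.standard_normal(10)
t = 0.7

prox_tfstar = np.clip(x, -1.0, 1.0)                              # projection
y = x / t
prox_finvt = np.sign(y) * np.maximum(np.abs(y) - 1.0 / t, 0.0)   # prox_{(1/t) f}(x/t)

assert np.allclose(prox_tfstar + t * prox_finvt, x)              # Moreau identity
```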
Douglas-Rachford splitting

Problem: given maximal monotone operators A, B, solve 0 ∈ A(x) + B(x)

Algorithm (Lions & Mercier, 1979)

x^+ = (I + tA)^{-1}(z)
y^+ = (I + tB)^{-1}(2x^+ − z)
z^+ = z + ρ(y^+ − x^+)

• x converges under weak conditions (for any t > 0 and ρ ∈ (0, 2))
• useful when the resolvents of A, B are inexpensive, but not the resolvent of the sum
• includes other well-known algorithms as special cases (e.g., ADMM)
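A generic sketch of this iteration, with the two resolvents passed in as functions; the toy problem at the end is ours and unrelated to deblurring:

```python
import numpy as np

def douglas_rachford(res_A, res_B, z0, rho=1.8, n_iter=200):
    """Generic Douglas-Rachford iteration for 0 in A(x) + B(x), given the
    resolvents res_A = (I + tA)^{-1} and res_B = (I + tB)^{-1}
    (the step size t is baked into the resolvents)."""
    z = z0.copy()
    for _ in range(n_iter):
        x = res_A(z)                 # x^+ = (I + tA)^{-1}(z)
        y = res_B(2 * x - z)         # y^+ = (I + tB)^{-1}(2x^+ - z)
        z = z + rho * (y - x)        # z^+ = z + rho (y^+ - x^+)
    return x

# toy usage: minimize ||x - a||_1 + (1/2)||x||_2^2
# with A = subdifferential of ||. - a||_1 and B = identity
a = np.array([3.0, -0.2, 1.5])
t = 1.0
res_A = lambda z: a + np.sign(z - a) * np.maximum(np.abs(z - a) - t, 0.0)
res_B = lambda z: z / (1.0 + t)
print(douglas_rachford(res_A, res_B, np.zeros(3)))   # approx. [1, -0.2, 1]
```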
Outline
• Introduction
• Douglas-Rachford splitting method
• Primal-dual splitting
• Space-varying blur
Primal-dual splitting

Composite problem and its dual

minimize f(x) + g(Ax)        maximize −f^*(−A^T z) − g^*(z)

Primal-dual optimality conditions

0 ∈ \underbrace{\begin{bmatrix} \partial f(x) \\ \partial g^*(z) \end{bmatrix}}_{A(x,z)} + \underbrace{\begin{bmatrix} 0 & A^T \\ -A & 0 \end{bmatrix} \begin{bmatrix} x \\ z \end{bmatrix}}_{B(x,z)}

Resolvent computations
• A: prox-operators of f and g
• B: solution of a linear equation with coefficient I + t^2 A^T A (see the sketch below)
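A dense-matrix sketch of the B-resolvent, showing where the coefficient I + t^2 A^T A comes from; in the imaging applications this system is solved via the DFT rather than formed explicitly:

```python
import numpy as np

def resolvent_skew(a, c, A, t):
    """Resolvent (I + tB)^{-1} of the skew operator B(x, z) = (A^T z, -A x):
    solve   x + t A^T z = a,   z - t A x = c.
    Eliminating z gives  (I + t^2 A^T A) x = a - t A^T c."""
    n = A.shape[1]
    x = np.linalg.solve(np.eye(n) + t ** 2 * A.T @ A, a - t * A.T @ c)
    z = c + t * (A @ x)
    return x, z
```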
Example: constrained L1-TV deblurring

minimize ‖Kx − b‖_1 + γ‖Dx‖_iso
subject to 0 ≤ x ≤ 1

• Gaussian blur with salt-and-pepper noise; periodic boundary conditions
• I + K^T K + D^T D is diagonalizable by the DFT (see the sketch below)
• 1024 × 1024 image

[Figure: original, blurred, and restored images]
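A sketch of that DFT solve, assuming the eigenvalue arrays (FFTs of the blur kernel and of the horizontal/vertical difference stencils) have been precomputed; the function name and the step-size scaling t are ours:

```python
import numpy as np

def solve_ikd(rhs, K_hat, Dh_hat, Dv_hat, t=1.0):
    """Solve (I + t^2 (K^T K + D^T D)) x = rhs under periodic boundary
    conditions.  K_hat, Dh_hat, Dv_hat hold the 2-D DFT eigenvalues of the
    blur and of the difference operators, so the whole coefficient is
    diagonalized by the DFT: the solve is two FFTs and a pointwise division."""
    denom = 1.0 + t ** 2 * (np.abs(K_hat) ** 2
                            + np.abs(Dh_hat) ** 2 + np.abs(Dv_hat) ** 2)
    return np.real(np.fft.ifft2(np.fft.fft2(rhs) / denom))
```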
Primal Douglas-Rachford splitting

Equivalent problem (δ is the indicator function of {0})

minimize f(x) + g(Ax)   →   minimize \underbrace{f(x) + g(y)}_{F(x,y)} + \underbrace{\delta(Ax − y)}_{G(x,y)}

Algorithm: Douglas-Rachford splitting applied to the optimality conditions

0 ∈ ∂F(x, y) + ∂G(x, y)

Resolvent computations
• ∂F requires the prox-operators of f, g
• ∂G requires a linear equation with coefficient I + A^T A

hence, similar complexity per iteration as primal-dual splitting
Alternating direction method of multipliers (ADMM)

Douglas-Rachford applied to the dual, after introducing a splitting variable u

minimize f(x) + g(Ax)   →   minimize f(u) + g(y)
                              subject to \begin{bmatrix} I \\ A \end{bmatrix} x − \begin{bmatrix} u \\ y \end{bmatrix} = 0

ADMM: alternating minimization of the augmented Lagrangian

f(u) + g(y) + w^T(x − u) + z^T(Ax − y) + \frac{t}{2} \left( \|x − u\|_2^2 + \|Ax − y\|_2^2 \right)

• minimization over x: a linear equation with coefficient I + A^T A
• minimization over (u, y): prox-operators of f, g

hence, similar complexity per iteration as primal-dual splitting (see the sketch below)
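A compact sketch of these updates (dense placeholder matrix A with a direct solve each iteration; prox_f(v, s) is assumed to return prox_{s f}(v)):

```python
import numpy as np

def admm(prox_f, prox_g, A, x0, t=1.0, n_iter=300):
    """ADMM for  minimize f(u) + g(y)  subject to  x = u,  A x = y,
    i.e. the splitting of  minimize f(x) + g(Ax)  described above."""
    n = x0.size
    x = x0.copy()
    u, y = x.copy(), A @ x
    w, z = np.zeros(n), np.zeros(A.shape[0])      # multipliers
    M = np.eye(n) + A.T @ A                        # x-update coefficient
    for _ in range(n_iter):
        # minimize the augmented Lagrangian over x: a linear equation
        x = np.linalg.solve(M, (u - w / t) + A.T @ (y - z / t))
        # minimize over (u, y): separable prox-operators of f and g
        u = prox_f(x + w / t, 1.0 / t)
        y = prox_g(A @ x + z / t, 1.0 / t)
        # multiplier update
        w = w + t * (x - u)
        z = z + t * (A @ x - y)
    return x
```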
Chambolle-Pock method

0 ∈ \begin{bmatrix} \partial f(x) \\ \partial g^*(z) \end{bmatrix} + \begin{bmatrix} 0 & A^T \\ -A & 0 \end{bmatrix} \begin{bmatrix} x \\ z \end{bmatrix}

Algorithm

z^+ = prox_{tg^*}(z + tA\bar{x})
x^+ = prox_{sf}(x − sA^T z^+)
\bar{x}^+ = 2x^+ − x

• convergence requires st < 1/‖A‖^2
• no linear equations with A; only multiplications with A and A^T
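A sketch of the iteration as written above; prox_{tg^*} can be obtained from prox_g through the Moreau decomposition:

```python
import numpy as np

def chambolle_pock(prox_sf, prox_tgstar, A, x0, z0, s, t, n_iter=500):
    """Chambolle-Pock iteration for  minimize f(x) + g(Ax).

    prox_sf(v)     returns prox_{s f}(v)
    prox_tgstar(v) returns prox_{t g*}(v)
    Step sizes should satisfy  s * t * ||A||^2 < 1.
    Only multiplications with A and A^T are used -- no linear solves."""
    x, z = x0.copy(), z0.copy()
    x_bar = x.copy()
    for _ in range(n_iter):
        z = prox_tgstar(z + t * (A @ x_bar))      # dual update
        x_new = prox_sf(x - s * (A.T @ z))        # primal update
        x_bar = 2.0 * x_new - x                   # extrapolation
        x = x_new
    return x, z
```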
Convergence

[Figure: relative optimality gap (f(x^k) − f^⋆)/f^⋆ versus iteration number k (0 to 1000), on a log scale from 10^0 down to about 10^{-7}, for CP, ADMM, primal DR, and primal-dual DR]

∼1.4 seconds per iteration for each method
Additive structure in A

minimize f(x) + g(Ax)        maximize −f^*(−A^T z) − g^*(z)

• f, g have inexpensive prox-operators
• A = B + C with structured B and C: equations with coefficients I + B^T B and I + C^T C are easy to solve, but not I + A^T A

Extended primal-dual optimality conditions

0 ∈ \begin{bmatrix} 0 \\ \partial g(y) \\ 0 \\ \partial f^*(w) \end{bmatrix} + \begin{bmatrix} 0 & 0 & A^T & I \\ 0 & 0 & -I & 0 \\ -A & I & 0 & 0 \\ -I & 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \\ w \end{bmatrix}
Primal-dual splitting

0 ∈ \underbrace{\begin{bmatrix} 0 \\ \partial g(y) \\ 0 \\ \partial f^*(w) \end{bmatrix} + \begin{bmatrix} 0 & 0 & B^T & 0 \\ 0 & 0 & 0 & 0 \\ -B & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \\ w \end{bmatrix}}_{A(x,y,z,w)} + \underbrace{\begin{bmatrix} 0 & 0 & C^T & I \\ 0 & 0 & -I & 0 \\ -C & I & 0 & 0 \\ -I & 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \\ w \end{bmatrix}}_{B(x,y,z,w)}

Resolvent computations
• A: prox-operators of f, g and a linear equation with coefficient I + t^2 B^T B
• B: a linear equation with coefficient I + \frac{t^2}{(1+t^2)^2} C^T C (see the sketch below)
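A dense-matrix sketch of the B-resolvent as reconstructed above, showing the elimination that produces the C^T C coefficient; in the deblurring application C is sparse and its factorization is precomputed:

```python
import numpy as np

def resolvent_B(a, b, c, d, C, t):
    """Resolvent of the linear operator
        B(x, y, z, w) = (C^T z + w, -z, -C x + y, -x):
    solve (I + t B)(x, y, z, w) = (a, b, c, d).
    Eliminating y and w, then z, leaves one equation in x whose coefficient
    is (1 + t^2)^2 I + t^2 C^T C, i.e. I + t^2/(1 + t^2)^2 C^T C up to scaling."""
    n = C.shape[1]
    M = (1.0 + t ** 2) ** 2 * np.eye(n) + t ** 2 * C.T @ C
    x = np.linalg.solve(M, (1.0 + t ** 2) * (a - t * d) - t * C.T @ (c - t * b))
    z = (c - t * b + t * (C @ x)) / (1.0 + t ** 2)
    y = b + t * z
    w = d + t * x
    return x, y, z, w
```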
TV-L1 deblurring with replicate boundary conditions

minimize ‖(K_c + K_s)x − b‖_1 + γ‖(D_c + D_s)x‖_iso
subject to 0 ≤ x ≤ 1

• K_c, D_c: operators for periodic boundary conditions
• K_s, D_s: sparse corrections for replicate boundary conditions

[Figure: blurry, noisy image; deblurred using periodic b.c.; deblurred using replicate b.c.]
Handling replicate boundary conditions

K = K_c + K_s,   D = D_c + D_s

• K_c, D_c: operators assuming periodic boundary conditions
• I + K_c^T K_c + D_c^T D_c is diagonalized by the DFT
• E = I + K_s^T K_s + D_s^T D_s is sparse (factored once, as in the sketch below)

[Figure: sparsity pattern of E (nz = 463680) and of its Cholesky factor (nz = 592892)]
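A sketch of precomputing a factorization of E with SciPy; splu gives a sparse LU factorization (SciPy itself ships no sparse Cholesky), while a CHOLMOD-based Cholesky, as in the figure, would exploit the symmetry of E:

```python
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def make_E_solver(Ks, Ds):
    """Factor E = I + Ks^T Ks + Ds^T Ds once and return a solver.

    Ks, Ds are the sparse boundary corrections (scipy.sparse matrices).
    The returned callable maps a right-hand side to E^{-1} rhs and can be
    reused at every iteration of the splitting method."""
    n = Ks.shape[1]
    E = (sp.identity(n) + Ks.T @ Ks + Ds.T @ Ds).tocsc()
    lu = spla.splu(E)
    return lu.solve
```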
Primal Douglas-Rachford splitting

Equivalent problem: introduce splitting variables x̃, ỹ

minimize \underbrace{f(x) + g(y + \tilde{y})}_{F(x,\tilde{x},y,\tilde{y})} + \underbrace{\delta(x − \tilde{x}) + \delta(Bx − y) + \delta(C\tilde{x} − \tilde{y})}_{G(x,\tilde{x},y,\tilde{y})}

and apply the Douglas-Rachford method to find a zero:

0 ∈ ∂F(x, x̃, y, ỹ) + ∂G(x, x̃, y, ỹ)

Resolvent computations
• ∂F: requires the prox-operators of f, g
• ∂G: linear equations with coefficients I + B^T B and I + C^T C

more variables, but similar complexity per iteration as primal-dual splitting