Numerical behavior of saddle point solvers

  1. Numerical behavior of saddle point solvers
     Miro Rozložník (joint results with Pavel Jiránek and Valeria Simoncini)
     Institute of Computer Science, Czech Academy of Sciences, Prague, Czech Republic, miro@cs.cas.cz
     Seminar at the Wydział Matematyki, Informatyki i Mechaniki Uniwersytetu Warszawskiego, Warszawa, November 12, 2009

  2. Saddle point problems
     We consider a saddle point problem with the symmetric 2 × 2 block form
     \[ \begin{pmatrix} A & B \\ B^T & 0 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} f \\ 0 \end{pmatrix}. \]
     ◮ A is a square n × n nonsingular (symmetric positive definite) matrix,
     ◮ B is a rectangular n × m matrix of (full column) rank m.
     Applications: mixed finite element approximations, weighted least squares, constrained optimization, computational fluid dynamics, electromagnetism, etc. [Benzi, Golub and Liesen, 2005]. For the updated list of applications leading to saddle point problems contact [Benzi, 2009] or just ask P. Krzyzanowski.
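As a point of reference for the solution schemes discussed in the following slides, here is a minimal sketch (not part of the talk; NumPy assumed, and the small random A, B, f are placeholders) that assembles the 2 × 2 block matrix and solves the coupled system directly:

```python
# Minimal sketch: assemble K = [A B; B^T 0] and solve the coupled system
# directly; the small random A, B, f are placeholders, not the talk's data.
import numpy as np

n, m = 5, 2
rng = np.random.default_rng(1)
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite
B = rng.standard_normal((n, m))      # full column rank (generically)
f = rng.standard_normal(n)

K = np.block([[A, B], [B.T, np.zeros((m, m))]])
z = np.linalg.solve(K, np.concatenate([f, np.zeros(m)]))
x, y = z[:n], z[n:]
assert np.allclose(B.T @ x, 0.0, atol=1e-10)   # second block row: B^T x = 0
```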

  3. 1. Segregated or coupled solution approach: Schur complement or null-space projection method; outer iteration for solving the reduced system; inexact solution of inner systems; inner iteration with appropriate stopping criterion.
     2. Preconditioning: iteration scheme for solving the preconditioned system; exact and inexact preconditioners; approximate or incomplete factorization; structure- or value-based schemes with appropriate dropping rules.
     3. Iterative solver: finite termination property, theoretical rate of convergence; rounding errors in floating point arithmetic.
     Numerous solution schemes, preconditioners and iterative solvers ...

  4. Delay of convergence and limit on the final accuracy

  5. Schur complement reduction method
     ◮ Compute y as a solution of the Schur complement system
       \[ B^T A^{-1} B \, y = B^T A^{-1} f, \]
     ◮ compute x as a solution of
       \[ A x = f - B y. \]
     ◮ Segregated vs. coupled approach: x_k and y_k approximate solutions to x and y, respectively.
     ◮ Inexact solution of systems with A: every computed solution \hat{u} of A u = b is interpreted as an exact solution of a perturbed system
       \[ (A + \Delta A)\hat{u} = b + \Delta b, \qquad \|\Delta A\| \le \tau \|A\|, \quad \|\Delta b\| \le \tau \|b\|, \quad \tau \kappa(A) \ll 1. \]
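A minimal dense-algebra sketch of this two-step reduction (an illustration only; NumPy/SciPy assumed, and a Cholesky factorization of A stands in for the inner solves):

```python
# Sketch of the Schur complement reduction: y from B^T A^{-1} B y = B^T A^{-1} f,
# then x from A x = f - B y. Dense factorizations stand in for inner iterations.
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def schur_reduction(A, B, f):
    c = cho_factor(A)                      # A symmetric positive definite
    Ainv_B = cho_solve(c, B)               # A^{-1} B
    Ainv_f = cho_solve(c, f)               # A^{-1} f
    S = B.T @ Ainv_B                       # Schur complement B^T A^{-1} B
    y = np.linalg.solve(S, B.T @ Ainv_f)   # Schur complement system
    x = cho_solve(c, f - B @ y)            # back-substitution A x = f - B y
    return x, y
```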

  6. Iterative solution of the Schur complement system
     choose y_0, solve A x_0 = f - B y_0
     outer iteration (k = 0, 1, ...):
        compute \alpha_k and p_k^{(y)}
        y_{k+1} = y_k + \alpha_k p_k^{(y)}
        solve A p_k^{(x)} = -B p_k^{(y)}                                  (inner iteration)
        back-substitution, one of:
           A: x_{k+1} = x_k + \alpha_k p_k^{(x)},
           B: solve A x_{k+1} = f - B y_{k+1},                            (inner iteration)
           C: solve A u_k = f - A x_k - B y_{k+1},  x_{k+1} = x_k + u_k   (inner iteration)
        r_{k+1}^{(y)} = r_k^{(y)} - \alpha_k B^T p_k^{(x)}
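The sketch below is one possible reconstruction of this inner-outer scheme, not the authors' code: a steepest-descent-type outer iteration on the Schur complement system (the slide leaves the choice of outer method open), with the inner solves with A perturbed by a random relative error of order τ to mimic an inexact inner iteration, and with the back-substitution variant selected by `scheme` in {"A", "B", "C"}:

```python
# Illustrative reconstruction of the inner-outer iteration on the Schur
# complement system; inner solves with A are emulated by a direct solve
# plus a random perturbation of relative size tau.
import numpy as np

def inexact_solve(A, b, tau, rng):
    """Approximate solution of A u = b with relative error of order tau."""
    u = np.linalg.solve(A, b)
    e = rng.standard_normal(u.shape)
    return u + tau * np.linalg.norm(u) * e / np.linalg.norm(e)

def schur_outer_iteration(A, B, f, tau, scheme="A", iters=300, seed=0):
    rng = np.random.default_rng(seed)
    n, m = B.shape
    y = np.zeros(m)
    x = inexact_solve(A, f - B @ y, tau, rng)        # solve A x0 = f - B y0
    r = -B.T @ x                                     # updated residual r^(y)
    for _ in range(iters):
        p_y = -r                                     # steepest-descent direction
        p_x = inexact_solve(A, -B @ p_y, tau, rng)   # solve A p^(x) = -B p^(y)
        Sp = -(B.T @ p_x)                            # approximates S p^(y), S = B^T A^{-1} B
        alpha = (r @ r) / (p_y @ Sp)                 # exact line search on S
        y = y + alpha * p_y
        if scheme == "A":                            # generic update
            x = x + alpha * p_x
        elif scheme == "B":                          # direct substitution
            x = inexact_solve(A, f - B @ y, tau, rng)
        else:                                        # corrected direct substitution
            x = x + inexact_solve(A, f - A @ x - B @ y, tau, rng)
        r = r - alpha * (B.T @ p_x)                  # recurrence for r^(y)
    return x, y
```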

  7. Numerical experiments: a small model example
     A = tridiag(1, 4, 1) ∈ R^{100×100},  B = rand(100, 20),  f = rand(100, 1),
     κ(A) = ‖A‖ · ‖A^{-1}‖ = 7.1695 · 0.4603 ≈ 3.3001,
     κ(B) = ‖B‖ · ‖B^†‖ = 5.9990 · 0.4998 ≈ 2.9983.
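The model problem can be set up in a few lines of NumPy (a sketch; the quoted norms depend on the norm used and on the random draw of B, so the computed condition numbers will only roughly match the values above):

```python
# Construction of the small model example: A = tridiag(1,4,1) of order 100,
# B and f random; condition numbers of A and B are both of order 3.
import numpy as np

rng = np.random.default_rng(0)
A = 4.0 * np.eye(100) + np.eye(100, k=1) + np.eye(100, k=-1)
B = rng.random((100, 20))
f = rng.random(100)

kappa_A = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)
kappa_B = np.linalg.norm(B, 2) * np.linalg.norm(np.linalg.pinv(B), 2)
print(kappa_A, kappa_B)   # small, well-conditioned test problem
```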

  8. Accuracy in the outer iteration
     Gap between the true and updated residual:
     \[ \| -B^T A^{-1} f + B^T A^{-1} B y_k - r_k^{(y)} \| \le \frac{O(\tau)\,\kappa(A)}{1 - \tau\kappa(A)} \,\|A^{-1}\| \|B\| \left( \|f\| + \|B\| Y_k \right), \qquad Y_k \equiv \max\{ \|y_i\| \mid i = 0, 1, \dots, k \}. \]
     [Figure: relative residual norms ‖B^T A^{-1} f - B^T A^{-1} B y_k‖/‖B^T A^{-1} f - B^T A^{-1} B y_0‖ and ‖r_k^{(y)}‖/‖r_0^{(y)}‖ versus iteration number k, for τ = 10^{-2}, 10^{-6}, 10^{-10} and τ = O(u).]
     Limit on the final accuracy: the computed \hat{y} solves a nearby Schur complement system and satisfies
     \[ B^T (A + \Delta A)^{-1} B \hat{y} = B^T (A + \Delta A)^{-1} f, \qquad \| B^T A^{-1} f - B^T A^{-1} B \hat{y} \| \le \frac{\tau\kappa(A)}{1 - \tau\kappa(A)} \,\|A^{-1}\| \|B\|^2 \|\hat{y}\|. \]

  9. Does the final accuracy depend on the outer iteration method?
     ◮ The gap between the true and updated residual for any two-term recurrence method depends on the maximum norm of the approximate solutions over the whole iteration process [Greenbaum 1994, 1997]:
     \[ \| -B^T A^{-1} f + B^T A^{-1} B y_k - r_k^{(y)} \| \le \frac{O(\tau)\,\kappa(A)}{1 - \tau\kappa(A)} \,\|A^{-1}\| \|B\| \left( \|f\| + \|B\| Y_k \right), \qquad Y_k \equiv \max\{ \|y_i\| \mid i = 0, 1, \dots, k \}. \]
     ◮ The Schur complement system is negative definite, so some norm of the error or residual converges monotonically for almost all iterative methods. The quantity Y_k then does not play an important role and can be replaced by ‖y_0‖ or a multiple of ‖y‖.

  10. Accuracy in the saddle point system
      \[ \| f - A x_k - B y_k \| \le \frac{O(\alpha_1)\,\kappa(A)}{1 - \tau\kappa(A)} \left( \|f\| + \|B\| Y_k \right), \]
      \[ \| -B^T x_k - r_k^{(y)} \| \le \frac{O(\alpha_2)\,\kappa(A)}{1 - \tau\kappa(A)} \,\|A^{-1}\| \|B\| \left( \|f\| + \|B\| Y_k \right), \qquad Y_k \equiv \max\{ \|y_i\| \mid i = 0, 1, \dots, k \}. \]
      Back-substitution scheme                                                      α_1    α_2
      A: Generic update               x_{k+1} = x_k + α_k p_k^{(x)}                  τ      u
      B: Direct substitution          x_{k+1} = A^{-1}(f - B y_{k+1})                τ      τ    (additional system with A)
      C: Corrected direct subst.      x_{k+1} = x_k + A^{-1}(f - A x_k - B y_{k+1})  u      τ    (additional system with A)
      \[ -B^T A^{-1} f + B^T A^{-1} B y_k = -B^T x_k - B^T A^{-1} (f - A x_k - B y_k). \]
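The identity in the last line, which relates the (uncomputable) Schur complement residual to the two computable residuals bounded above, can be verified numerically; a small sketch with random placeholder data:

```python
# Check of the identity
#   -B^T A^{-1} f + B^T A^{-1} B y_k = -B^T x_k - B^T A^{-1} (f - A x_k - B y_k)
# for arbitrary x_k, y_k (random placeholder data).
import numpy as np

rng = np.random.default_rng(2)
n, m = 6, 3
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
B = rng.standard_normal((n, m))
f = rng.standard_normal(n)
xk, yk = rng.standard_normal(n), rng.standard_normal(m)

Ainv = np.linalg.inv(A)
lhs = -B.T @ (Ainv @ f) + B.T @ (Ainv @ (B @ yk))
rhs = -B.T @ xk - B.T @ (Ainv @ (f - A @ xk - B @ yk))
assert np.allclose(lhs, rhs)
```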

  11. Generic update: x_{k+1} = x_k + α_k p_k^{(x)}
      [Figure: residual norm ‖f - A x_k - B y_k‖ and relative residual norms ‖-B^T x_k‖/‖-B^T x_0‖, ‖r_k^{(y)}‖/‖r_0^{(y)}‖ versus iteration number k, for τ = 10^{-2}, 10^{-6}, 10^{-10} and τ = O(u).]

  12. Direct substitution: x_{k+1} = A^{-1}(f - B y_{k+1})
      [Figure: residual norm ‖f - A x_k - B y_k‖ and relative residual norms ‖-B^T x_k‖/‖-B^T x_0‖, ‖r_k^{(y)}‖/‖r_0^{(y)}‖ versus iteration number k, for τ = 10^{-2}, 10^{-6}, 10^{-10} and τ = O(u).]

  13. Corrected direct substitution: x_{k+1} = x_k + A^{-1}(f - A x_k - B y_{k+1})
      [Figure: residual norm ‖f - A x_k - B y_k‖ and relative residual norms ‖-B^T x_k‖/‖-B^T x_0‖, ‖r_k^{(y)}‖/‖r_0^{(y)}‖ versus iteration number k, for τ = 10^{-2}, 10^{-6}, 10^{-10} and τ = O(u).]

  14. Forward error of the computed approximate solution
      \[ \| x - x_k \| \le \gamma_1 \| f - A x_k - B y_k \| + \gamma_2 \| -B^T x_k \|, \qquad \| y - y_k \| \le \gamma_2 \| f - A x_k - B y_k \| + \gamma_3 \| -B^T x_k \|, \]
      \[ \gamma_1 = \sigma_{\min}^{-1}(A), \quad \gamma_2 = \sigma_{\min}^{-1}(B), \quad \gamma_3 = \sigma_{\min}^{-1}(B^T A^{-1} B). \]
      [Figure: relative error norms ‖x - x_k‖_A/‖x - x_0‖_A and ‖y - y_k‖_{B^T A^{-1} B}/‖y - y_0‖_{B^T A^{-1} B} versus iteration number k, for τ = 10^{-2}, 10^{-6}, 10^{-10} and τ = O(u).]
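A short sketch of how the constants γ₁, γ₂, γ₃ and the two computable residual norms in these bounds can be evaluated (the function name and interface are placeholders, not from the talk):

```python
# Evaluate gamma_1, gamma_2, gamma_3 and the right-hand sides of the forward
# error bounds for a given approximate pair (xk, yk).
import numpy as np

def forward_error_bounds(A, B, f, xk, yk):
    def sigma_min(M):
        return np.linalg.svd(M, compute_uv=False)[-1]
    g1 = 1.0 / sigma_min(A)                             # sigma_min(A)^{-1}
    g2 = 1.0 / sigma_min(B)                             # sigma_min(B)^{-1}
    g3 = 1.0 / sigma_min(B.T @ np.linalg.solve(A, B))   # sigma_min(B^T A^{-1} B)^{-1}
    r1 = np.linalg.norm(f - A @ xk - B @ yk)            # saddle point residual
    r2 = np.linalg.norm(B.T @ xk)                       # ||-B^T x_k||
    return g1 * r1 + g2 * r2, g2 * r1 + g3 * r2         # bounds on ||x-x_k||, ||y-y_k||
```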

  15. Null-space projection method
      ◮ Compute x ∈ N(B^T) as a solution of the projected system
        \[ (I - \Pi) A (I - \Pi) x = (I - \Pi) f, \]
      ◮ compute y as a solution of the least squares problem
        \[ B y \approx f - A x, \]
      where \Pi = B (B^T B)^{-1} B^T is the orthogonal projector onto R(B).
      ◮ Schemes with the inexact solution of least squares with B: every computed approximate solution \bar{v} of a least squares problem B v ≈ c is interpreted as an exact solution of a perturbed least squares problem
        \[ (B + \Delta B)\bar{v} \approx c + \Delta c, \qquad \|\Delta B\| \le \tau \|B\|, \quad \|\Delta c\| \le \tau \|c\|, \quad \tau\kappa(B) \ll 1. \]
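A small dense sketch of these two steps, with the projector Π formed explicitly (adequate for a small model problem; function and variable names are placeholders):

```python
# Sketch of the null-space projection method: solve the projected system on
# N(B^T), then recover y from the least squares problem B y ~ f - A x.
import numpy as np

def nullspace_projection(A, B, f):
    n = A.shape[0]
    Pi = B @ np.linalg.solve(B.T @ B, B.T)   # orthogonal projector onto R(B)
    P = np.eye(n) - Pi                       # projector onto N(B^T)
    # (I - Pi) A (I - Pi) is singular on R(B); the minimum-norm least squares
    # solution lies in N(B^T) and solves the projected system there.
    x, *_ = np.linalg.lstsq(P @ A @ P, P @ f, rcond=None)
    y, *_ = np.linalg.lstsq(B, f - A @ x, rcond=None)
    return x, y
```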

  16. Null-space projection method
      choose x_0, solve B y_0 ≈ f - A x_0
      outer iteration (k = 0, 1, ...):
         compute \alpha_k and p_k^{(x)} ∈ N(B^T)
         x_{k+1} = x_k + \alpha_k p_k^{(x)}
         solve B p_k^{(y)} ≈ r_k^{(x)} - \alpha_k A p_k^{(x)}                     (inner iteration)
         back-substitution, one of:
            A: y_{k+1} = y_k + p_k^{(y)},
            B: solve B y_{k+1} ≈ f - A x_{k+1},                                   (inner iteration)
            C: solve B v_k ≈ f - A x_{k+1} - B y_k,  y_{k+1} = y_k + v_k          (inner iteration)
         r_{k+1}^{(x)} = r_k^{(x)} - \alpha_k A p_k^{(x)} - B p_k^{(y)}
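As for the Schur complement variant, the following is one possible reconstruction of this scheme, not the authors' code: a steepest-descent-type choice of p_k^{(x)} on the projected system, inner least squares solves with B perturbed by a relative error of order τ, and the back-substitution variant chosen by `scheme`:

```python
# Illustrative sketch of the null-space outer iteration with inexact inner
# least squares solves with B.
import numpy as np

def inexact_lstsq(B, c, tau, rng):
    """Approximate least squares solution of B v ~ c with relative error of order tau."""
    v, *_ = np.linalg.lstsq(B, c, rcond=None)
    e = rng.standard_normal(v.shape)
    return v + tau * np.linalg.norm(v) * e / np.linalg.norm(e)

def nullspace_outer_iteration(A, B, f, tau, scheme="A", iters=300, seed=0):
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    P = np.eye(n) - B @ np.linalg.solve(B.T @ B, B.T)   # projector onto N(B^T)
    x = np.zeros(n)                                     # x0 in N(B^T)
    y = inexact_lstsq(B, f - A @ x, tau, rng)           # B y0 ~ f - A x0
    r = f - A @ x - B @ y                               # updated residual r^(x)
    for _ in range(iters):
        p_x = P @ r                                     # direction in N(B^T)
        alpha = (p_x @ r) / (p_x @ (A @ p_x))           # line search on (I-Pi)A(I-Pi)
        x = x + alpha * p_x
        p_y = inexact_lstsq(B, r - alpha * (A @ p_x), tau, rng)
        if scheme == "A":                               # generic update
            y = y + p_y
        elif scheme == "B":                             # direct substitution
            y = inexact_lstsq(B, f - A @ x, tau, rng)
        else:                                           # corrected direct substitution
            y = y + inexact_lstsq(B, f - A @ x - B @ y, tau, rng)
        r = r - alpha * (A @ p_x) - B @ p_y             # recurrence for r^(x)
    return x, y
```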
