


  1. Hybrid Steepest Descent Method for Variational Inequality Problem
     over Fixed Point Sets of Certain Quasi-Nonexpansive Mappings
     Isao Yamada, Tokyo Institute of Technology
     VIC2004 @ Wellington, Feb. 13, 2004

  2. This talk is based on a joint work with N. Ogura (Tokyo Institute
     of Technology).

  3. We are trying to solve, in a real Hilbert space H, the
     Variational Inequality Problem over Fix(T):
       For given T : H → H and Θ : H → ℝ (convex function), find
       u* ∈ Fix(T) := { x ∈ H | T(x) = x } (closed convex) such that
         ⟨u − u*, Θ′(u*)⟩ ≥ 0,  ∀u ∈ Fix(T).
     For T a convex projection, this is handled by the Gradient
     Projection Method (Goldstein '64 / Levitin & Polyak '66).
     We propose the Hybrid Steepest Descent Method for:
     - T : H → H nonexpansive mapping (Yamada et al. '96–, Deutsch &
       Yamada '98, Yamada '01). Application: convexly constrained
       inverse problems.
     - T : H → H quasi-nonexpansive (Yamada & Ogura '03). Application:
       optimization over the fixed point set of a subgradient
       projector.

  4. Part 1: Background / Preliminaries
     The original idea of the Gradient Projection Method

  5. Convex Projection P_K: Basic Properties
     - ‖P_K(x) − P_K(y)‖ ≤ ‖x − y‖,  ∀x, y ∈ H
     - Fix(P_K) := { x ∈ H | P_K(x) = x } = K
     - K must be simple for P_K to be computable.

  6. Gradient Projection Method (1964–)
       u_{n+1} := P_K( u_n − λ_{n+1} Θ′(u_n) ),  n = 0, 1, 2, ...
     Under certain conditions, it converges (strongly / weakly) to a
     solution of the smooth convex optimization problem (P1):
       Minimize Θ : H → ℝ (G-differentiable convex function)
       subject to x ∈ K (⊂ H), a closed convex set,
       where H is a real Hilbert space.
     NOTE: u* ∈ K is a solution of (P1) ⇔ u* ∈ K satisfies
       ⟨u − u*, Θ′(u*)⟩ ≥ 0,  ∀u ∈ K.
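As a concrete illustration (not from the slides), here is a minimal Python sketch of the gradient projection iteration in ℝ², with the assumed choices K = [0, 1]² (so P_K is a coordinate-wise clip), Θ(x) = (1/2)‖x − a‖², and a constant step size:

```python
# Gradient projection sketch in R^2: minimize Theta(x) = 0.5*||x - a||^2
# over the box K = [0, 1]^2, whose projection P_K is a coordinate-wise clip.

def clip01(x):
    """Convex projection P_K onto the box [0, 1]^2."""
    return [min(max(xi, 0.0), 1.0) for xi in x]

def grad_theta(x, a):
    """Gradient of Theta(x) = 0.5*||x - a||^2, i.e. x - a."""
    return [xi - ai for xi, ai in zip(x, a)]

def gradient_projection(a, x0, step=0.5, iters=200):
    """u_{n+1} := P_K(u_n - lambda * Theta'(u_n))."""
    u = list(x0)
    for _ in range(iters):
        g = grad_theta(u, a)
        u = clip01([ui - step * gi for ui, gi in zip(u, g)])
    return u

# The minimizer of Theta over K is P_K(a); e.g. a = (2, 0.3) gives (1, 0.3).
u = gradient_projection(a=[2.0, 0.3], x0=[0.0, 0.0])
```

With this quadratic Θ the unconstrained minimizer is a itself, so the limit of the iteration is just P_K(a), which makes the example easy to check by hand.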

  7. Part 2: Hybrid Steepest Descent Method
     From projection to nonexpansive mapping / quasi-nonexpansive
     mapping

  8. T : H → H is called κ-Lipschitzian if ∃κ > 0 s.t.
     ‖T(x) − T(y)‖ ≤ κ‖x − y‖ for all x, y ∈ H. If κ = 1:
     - T : H → H is a nonexpansive mapping.
     - Fix(T) := { x ∈ H | T(x) = x } is closed convex.
     The generalization from κ < 1 (contractions) to κ ≤ 1 broadens
     fixed point theory significantly, and there are many choices of T
     s.t. Fix(T) = K, e.g.,
       Fix( Σ_{i=1}^m w_i T_i ) = ∩_{i=1}^m Fix(T_i)
       if ∩_{i=1}^m Fix(T_i) ≠ ∅.

  9. Is it possible to extend from the Gradient Projection Method
       v_{n+1} := P_K( v_n − λ_{n+1} Θ′(v_n) )
     to
       v_{n+1} := T( v_n − λ_{n+1} Θ′(v_n) ),
     where T : H → H is a nonexpansive mapping, for minimizing Θ over
     Fix(T)?

  10. To answer this question, we introduce the Hybrid Steepest
      Descent Method (Yamada et al., 1996–):
        u_{n+1} := T(u_n) − λ_{n+1} Θ′(T(u_n)),
      where T : H → H is a nonexpansive mapping. This is because:
      - v_n := T(u_n) is generated by
          v_{n+1} := T( v_n − λ_{n+1} Θ′(v_n) ),  and
      - if s-lim_{n→∞} v_n = u* ∈ Fix(T), then
          s-lim_{n→∞} u_n = u* ∈ Fix(T).

  11. In short, the Hybrid Steepest Descent Method (Yamada 2001),
        u_{n+1} := T(u_n) − λ_{n+1} Θ′(T(u_n)),
      can minimize Θ over Fix(T), where T : H → H is nonexpansive and
      (λ_n)_{n=1}^∞ ⊂ ℝ_+ is slowly decreasing.

  12. [Figure: sequence generation by the Hybrid Steepest Descent
      Method. From T(u_n) ∈ T(H), the step −λ_{n+1} Θ′(T(u_n)) yields
      u_{n+1}, approaching the minimizer of Θ over Fix(T).]

  13. Hybrid Steepest Descent Method (Yamada 2001). Suppose that
      (a) T : H → H is a nonexpansive mapping,
      (b) Θ : H → ℝ is a convex function,
      (c) Θ′ is Lipschitzian and strongly monotone over T(H),
      (d) (λ_n)_{n≥1} ⊂ [0, ∞) satisfies
          (i) lim_{n→∞} λ_n = 0, (ii) Σ_{n≥1} λ_n = ∞,
          (iii) Σ_{n≥1} |λ_n − λ_{n+1}| < ∞.
      Then u_{n+1} := T(u_n) − λ_{n+1} Θ′(T(u_n)) satisfies
        s-lim_{n→∞} u_n = u* ∈ arg inf_{x ∈ Fix(T)} Θ(x)  (unique).
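The theorem can be exercised numerically. The following sketch uses assumed illustrative choices, not taken from the slides: T is the average of the projections onto two half-planes, which is nonexpansive with Fix(T) equal to their intersection (the nonnegative quadrant), and Θ is a strongly convex quadratic, so Θ′ is Lipschitzian and strongly monotone:

```python
# Hybrid steepest descent sketch in R^2 (illustrative instance):
# T = average of the projections onto {x1 >= 0} and {x2 >= 0},
# so Fix(T) = nonnegative quadrant; Theta(x) = 0.5*x^T Q x - b^T x
# with Q = diag(2, 1), b = (-2, 3), whose unique minimizer over
# Fix(T) is (0, 3).

def T(x):
    p1 = [max(x[0], 0.0), x[1]]        # projection onto {x1 >= 0}
    p2 = [x[0], max(x[1], 0.0)]        # projection onto {x2 >= 0}
    return [(p1[i] + p2[i]) / 2.0 for i in range(2)]

def grad_theta(x):
    q, b = [2.0, 1.0], [-2.0, 3.0]     # Theta'(x) = Qx - b
    return [q[i] * x[i] - b[i] for i in range(2)]

def hybrid_steepest_descent(x0, iters=20000):
    u = list(x0)
    for n in range(iters):
        lam = 1.0 / (n + 2)            # lim lam_n = 0, sum lam_n = infinity
        t = T(u)
        g = grad_theta(t)
        # u_{n+1} := T(u_n) - lam_{n+1} * Theta'(T(u_n))
        u = [t[i] - lam * g[i] for i in range(2)]
    return u

u = hybrid_steepest_descent([5.0, -5.0])
```

Note that the iterate u_n itself need not lie in Fix(T) along the way; only the limit does, which is exactly the point of the hybrid scheme.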

  14. If we specially choose Θ(x) := (1/2)‖x − a‖² in the Hybrid
      Steepest Descent Method, it reduces to the iteration of Halpern
      ('67), P. L. Lions ('77), and Wittmann ('92):
        u_{n+1} := λ_{n+1} a + (1 − λ_{n+1}) T(u_n),
      which converges strongly to P_{Fix(T)}(a), where T : H → H is
      nonexpansive and (λ_n)_{n=1}^∞ ⊂ ℝ_+ is slowly decreasing.
      More general cyclic versions were given by P. L. Lions (1977)
      and H. H. Bauschke (1996).
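A minimal sketch of this special case, with the assumed choice (not from the slides) of T as a plane rotation, which is nonexpansive with Fix(T) = {0}, so the Halpern iterates should converge strongly to P_{Fix(T)}(a) = 0:

```python
import math

# Halpern iteration sketch: T = rotation of the plane by 40 degrees,
# a nonexpansive map whose only fixed point is the origin, so the
# strong limit P_{Fix(T)}(a) is 0 regardless of the anchor a.

def T(x, theta=math.radians(40.0)):
    c, s = math.cos(theta), math.sin(theta)
    return [c * x[0] - s * x[1], s * x[0] + c * x[1]]

def halpern(a, x0, iters=20000):
    u = list(x0)
    for n in range(iters):
        lam = 1.0 / (n + 2)            # slowly decreasing anchor weights
        t = T(u)
        # u_{n+1} := lam_{n+1} * a + (1 - lam_{n+1}) * T(u_n)
        u = [lam * a[i] + (1.0 - lam) * t[i] for i in range(2)]
    return u

u = halpern(a=[3.0, 1.0], x0=[3.0, 1.0])
```

The rotation has no contraction factor (κ = 1 exactly), so plain Picard iteration T(T(...)) would circle forever; it is the vanishing anchor weight λ_n that forces strong convergence here.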

  15. Generalization of Θ: Θ′ Lipschitzian and paramonotone
      (Ogura & Yamada 2002).
      Robust Hybrid Steepest Descent Method (Yamada, Ogura, Shirakawa
      2002):
        u_{n+1} := T^{(n)}(u_n) − λ_{n+1} Θ′( T^{(n)}(u_n) ),
        where T^{(n)} := (1 − t_{n+1}) I + t_{n+1} T,
      enjoys notable numerical robustness. For details, see
      Contemporary Mathematics 313 (2002).

  16. Convexly Constrained Generalized Inverse Problem. Let K ⊂ H be a
      closed convex set and Ψ : H → ℝ the first convex function,
      satisfying K_Ψ := arg inf_{x∈K} Ψ(x) ≠ ∅. Then the problem is:
        find a point x* ∈ arg inf_{x ∈ K_Ψ} Θ(x) =: Γ (≠ ∅),
      where Θ : H → ℝ is the second convex function.

  17. Suppose that Ψ′ : H → H (the G-derivative) is γ-Lipschitzian.
      Then applying the Hybrid Steepest Descent Method
        u_{n+1} := T(u_n) − λ_{n+1} Θ′(T(u_n)),
        with T := P_K( I − νΨ′ ),  ∀ν ∈ (0, 2/γ],
      solves the problem, i.e., lim_{n→∞} d(u_n, Γ) = 0.
      NOTE: The Projected Landweber Iteration (Eicke 1992),
        v_{n+1} := P_K( λ_{n+1} A* b + β_n ( I − λ_{n+1} A* A ) v_n ),
      is the simplest realization, for Θ(x) := (1/2)‖x‖² and
      Ψ(x) := (1/2)‖A(x) − b‖²_{H_o} (A : H → H_o bounded linear).
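A small numerical sketch of this scheme, with assumed illustrative data not from the slides: A = [1 1], b = 2, K the nonnegative quadrant, and Θ(x) = (1/2)‖x‖², so K_Ψ is the segment {x ≥ 0 : x1 + x2 = 2} and the method should pick out its minimum-norm point (1, 1):

```python
# Minimum-norm constrained least squares via T := P_K(I - nu * Psi'):
# Psi(x) = 0.5*(x1 + x2 - 2)^2 (A = [1 1], b = 2), K = {x >= 0},
# Theta(x) = 0.5*||x||^2. gamma = ||A^T A|| = 2, so nu = 0.5 is in (0, 2/gamma].

def T(x, nu=0.5):
    """T := P_K(I - nu * Psi'), with Psi'(x) = A^T (A x - b)."""
    r = x[0] + x[1] - 2.0                          # residual A x - b
    y = [x[0] - nu * r, x[1] - nu * r]             # gradient step on Psi
    return [max(yi, 0.0) for yi in y]              # P_K = clip to x >= 0

def hybrid_steepest_descent(x0, iters=20000):
    u = list(x0)
    for n in range(iters):
        lam = 1.0 / (n + 2)
        t = T(u)
        # Theta'(x) = x, so u_{n+1} := T(u_n) - lam_{n+1} * T(u_n)
        u = [t[i] - lam * t[i] for i in range(2)]
    return u

u = hybrid_steepest_descent([2.0, 1.0])
```

This is the Θ(x) = (1/2)‖x − a‖² case with a = 0, i.e. a Halpern-type anchoring toward the origin, so the limit is the projection of 0 onto Fix(T) = K_Ψ.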

  18. Part 3: Hybrid Steepest Descent Method
      From nonexpansive mapping to quasi-nonexpansive mapping

  19. Quasi-Nonexpansive Mapping. T : H → H is called
      quasi-nonexpansive if
        ‖T(x) − T(f)‖ ≤ ‖x − f‖,  ∀(x, f) ∈ H × Fix(T).
      In this case, Fix(T) := { x ∈ H | T(x) = x } is a closed convex
      set. A quasi-nonexpansive mapping T is not necessarily
      continuous.

  20. Hierarchy of mappings, from most special to most general:
        Convex Projection ⊂ Firmly Nonexpansive ⊂ Attracting
        Nonexpansive ⊂ Nonexpansive ⊂ Quasi-Nonexpansive.
      The next example shows that the level set of a continuous convex
      function can be expressed as the fixed point set of a simple
      quasi-nonexpansive mapping.

  21. Example (Subgradient Projection T_{sp(Φ)}). The subgradient
      projection for a continuous convex function Φ:
        T_{sp(Φ)} : x ↦ x − (Φ(x)/‖g(x)‖²) g(x)   if Φ(x) > 0,
                    x ↦ x                           if Φ(x) ≤ 0,
      where g(x) ∈ ∂Φ(x) is a subgradient of Φ at x ∈ H.
      See, for example, Bauschke & Combettes ('01):
      - T_{sp(Φ)} is (1/2-averaged) quasi-nonexpansive,
      - Fix(T_{sp(Φ)}) = { x ∈ H | Φ(x) ≤ 0 } =: lev_{≤0} Φ.
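A minimal sketch of the subgradient projector for one assumed choice of Φ (not from the slides), Φ(x) = max(x1, x2) − 1, whose level set lev_{≤0} Φ is {x : x1 ≤ 1 and x2 ≤ 1} and whose subgradient at x is the unit vector of a maximizing coordinate:

```python
# Subgradient projector T_sp for Phi(x) = max(x1, x2) - 1 in R^2.

def phi(x):
    return max(x) - 1.0

def subgrad(x):
    """A subgradient of Phi at x: the unit vector of a maximizing coordinate."""
    i = 0 if x[0] >= x[1] else 1
    g = [0.0, 0.0]
    g[i] = 1.0
    return g

def T_sp(x):
    """x - (Phi(x)/||g(x)||^2) g(x) if Phi(x) > 0, else x."""
    v = phi(x)
    if v <= 0.0:
        return list(x)                    # points of lev_{<=0} Phi are fixed
    g = subgrad(x)
    nrm2 = sum(gi * gi for gi in g)       # here ||g||^2 = 1
    return [x[i] - (v / nrm2) * g[i] for i in range(2)]

x = [3.0, 2.0]
x = T_sp(x)      # pulls the larger coordinate down to 1: [1.0, 2.0]
x = T_sp(x)      # then the other: [1.0, 1.0], which lies in lev_{<=0} Phi
```

Each application needs only one value Φ(x) and one subgradient g(x), never the exact projection onto the level set, which is what makes T_{sp(Φ)} computationally attractive.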

  22. [Figure: the subgradient projection as an approximation of the
      convex projection: T_{sp(Φ)}(x) moves x toward
      lev_{≤0} Φ = Fix(T_{sp(Φ)}) without computing P_{lev_{≤0} Φ}.]

  23. Is it possible to extend from
        u_{n+1} := T(u_n) − λ_{n+1} Θ′(T(u_n)),
        where T : H → H is nonexpansive,
      to
        u_{n+1} := T(u_n) − λ_{n+1} Θ′(T(u_n)),
        where T : H → H is quasi-nonexpansive,
      for minimizing Θ over Fix(T)?

  24. Quasi-shrinking (Yamada & Ogura '03). Let T : H → H be
      quasi-nonexpansive with Fix(T) ∩ C ≠ ∅ for some closed convex
      set C ⊂ H. Then T is called quasi-shrinking on C if
        D : r ∈ [0, ∞) ↦
          inf{ d(u, Fix(T)) − d(T(u), Fix(T)) |
               u ∈ ⊲(Fix(T), r) ∩ C }   if ⊲(Fix(T), r) ∩ C ≠ ∅,
          ∞                              otherwise,
      satisfies D(r) = 0 ⇔ r = 0, where
        ⊲(Fix(T), r) := { x ∈ H | d(x, Fix(T)) ≥ r }.

  25. Hybrid Steepest Descent Method (Quasi-Nonexpansive). Suppose
      that
      (a) T : H → H is quasi-nonexpansive,
      (b) Θ′ is κ-Lipschitzian and η-strongly monotone over T(H),
      (c) ∃f ∈ Fix(T) s.t. T is quasi-shrinking on
            C_f(u_0) := { x ∈ H | ‖x − f‖ ≤
              max{ ‖u_0 − f‖,
                   µ‖Θ′(f)‖ / (1 − √(1 − µ(2η − µκ²))) } },
          where µ ∈ (0, 2η/κ²).
      Then, with (λ_n)_{n≥1} ⊂ [0, 1] s.t. (i) lim_{n→∞} λ_n = 0 and
      (ii) Σ_{n≥1} λ_n = ∞,
        u_{n+1} := T(u_n) − λ_{n+1} µ Θ′(T(u_n))
      satisfies
        s-lim_{n→∞} u_n = u* ∈ arg inf_{x ∈ Fix(T)} Θ(x)  (unique).
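The quasi-nonexpansive case can be sketched with the subgradient projector of an assumed Φ (not from the slides): Φ(x) = ‖x‖₁ − 1, so Fix(T_{sp(Φ)}) is the ℓ1 ball, together with Θ(x) = (1/2)‖x − a‖², a = (2, 0), and µ = 1. The unique minimizer of Θ over the ℓ1 ball is (1, 0):

```python
# Hybrid steepest descent with a quasi-nonexpansive T:
# T = subgradient projector of Phi(x) = |x1| + |x2| - 1 (Fix(T) = l1 ball),
# Theta(x) = 0.5*||x - a||^2 with a = (2, 0), mu = 1.

def sign(t):
    return (t > 0) - (t < 0)

def T_sp(x):
    """Subgradient projector of Phi(x) = |x1| + |x2| - 1."""
    v = abs(x[0]) + abs(x[1]) - 1.0
    if v <= 0.0:
        return list(x)
    g = [float(sign(x[0])), float(sign(x[1]))]   # g in the subdifferential of Phi
    nrm2 = sum(gi * gi for gi in g)
    return [x[i] - (v / nrm2) * g[i] for i in range(2)]

def hybrid_steepest_descent(a, x0, mu=1.0, iters=5000):
    u = list(x0)
    for n in range(iters):
        lam = 1.0 / (n + 2)
        t = T_sp(u)
        # u_{n+1} := T(u_n) - lam_{n+1} * mu * Theta'(T(u_n)), Theta'(x) = x - a
        u = [t[i] - lam * mu * (t[i] - a[i]) for i in range(2)]
    return u

u = hybrid_steepest_descent(a=[2.0, 0.0], x0=[3.0, 2.0])
```

T_{sp(Φ)} is discontinuous across the boundary of the ℓ1 ball, so the nonexpansive theory of Part 2 does not apply; by the proposition below (finite dimension, bounded ∂Φ), the averaged projector is quasi-shrinking on bounded sets, which is what the quasi-nonexpansive theorem needs.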

  26. Proposition. Suppose Φ : H → ℝ (continuous convex function)
      satisfies
      - lev_{≤0} Φ ≠ ∅, and
      - ∂Φ is bounded.
      Define T_α := (1 − α) I + α T_{sp(Φ)}  (α ∈ (0, 2)). Then:
      (a) If dim(H) < ∞, T_α is quasi-shrinking on any bounded closed
          convex C satisfying C ∩ lev_{≤0} Φ ≠ ∅.
      (b) If Φ′ ∈ ∂Φ is uniformly monotone over H, T_α is
          quasi-shrinking on any bounded closed convex C satisfying
          C ∩ lev_{≤0} Φ ≠ ∅.
