  1. Optimization (ND Methods)

  2. What is the optimal solution? (ND) We seek $f(x^*) = \min_x f(x)$. (First-order) Necessary condition, in 1D: $f'(x^*) = 0$; this gives a stationary solution $x^*$. (Second-order) Sufficient condition, in 1D: $f''(x^*) > 0$; in ND, the Hessian $H(x^*)$ must be positive definite, and then $x^*$ is a minimizer.

  3. " → R f : IR Taking derivatives… I :÷÷÷l - , X n ) f- ( Xn ) = f- ( Xi , Xz , - - :# ⇒ ÷ ¥ . - - - gradient of . Cn xD f ¥ - Tian Hi . . If z . g¥ ) Ein Eas - ¥ - - - - o¥oxn a , :# . :* :* ¥÷ ⇒ - - - µ , ± , ox . . K . . ¥¥ YE ' o¥a - - - nth .

  4. From linear algebra: A symmetric $n \times n$ matrix $A$ is positive definite if $x^T A x > 0$ for any $x \neq 0$. A symmetric $n \times n$ matrix $A$ is positive semi-definite if $x^T A x \geq 0$ for any $x \neq 0$. A symmetric $n \times n$ matrix $A$ is negative definite if $x^T A x < 0$ for any $x \neq 0$. A symmetric $n \times n$ matrix $A$ is negative semi-definite if $x^T A x \leq 0$ for any $x \neq 0$. A symmetric $n \times n$ matrix $A$ that is not negative semi-definite and not positive semi-definite is called indefinite.
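
As an illustration, a small NumPy helper (the function name and tolerance are my own) that classifies a symmetric matrix by the signs of its eigenvalues, which is equivalent to the quadratic-form definitions above:

```python
# Sketch: testing definiteness of a symmetric matrix through its eigenvalues
# (all > 0: positive definite; all >= 0: positive semi-definite;
#  all < 0: negative definite; all <= 0: negative semi-definite; else indefinite).
import numpy as np

def definiteness(A, tol=1e-12):
    """Classify a symmetric matrix A by the signs of its eigenvalues."""
    lam = np.linalg.eigvalsh(A)          # eigvalsh: for symmetric matrices
    if np.all(lam > tol):
        return "positive definite"
    if np.all(lam < -tol):
        return "negative definite"
    if np.all(lam >= -tol):
        return "positive semi-definite"
    if np.all(lam <= tol):
        return "negative semi-definite"
    return "indefinite"

print(definiteness(np.array([[2.0, 0.0], [0.0, 3.0]])))   # positive definite
print(definiteness(np.array([[2.0, 0.0], [0.0, -3.0]])))  # indefinite
```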

  5. $f(x^*) = \min_x f(x)$. First-order necessary condition: $\nabla f(x^*) = 0$. Second-order sufficient condition: $H(x^*)$ is positive definite. How can we find out if the Hessian is positive definite? Look at its eigenpairs $(\lambda_i, y_i)$: since $y_i^T H y_i = \lambda_i \|y_i\|^2$, if all $\lambda_i > 0$ then $H$ is positive definite and $x^*$ is a minimizer; if all $\lambda_i < 0$ then $H$ is negative definite and $x^*$ is a maximizer; if the $\lambda_i$ have mixed signs then $H$ is indefinite and $x^*$ is a saddle point.
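
A quick numerical check of the eigenpair argument, using a hypothetical $2 \times 2$ Hessian: for each eigenpair $(\lambda_i, y_i)$ of $H$, the quadratic form $y_i^T H y_i$ equals $\lambda_i \|y_i\|^2$, so the eigenvalue signs decide the classification:

```python
# Sketch: verify y^T H y = lambda * ||y||^2 for eigenpairs, then classify x*.
import numpy as np

H = np.array([[4.0, 1.0], [1.0, 3.0]])      # hypothetical Hessian at x*
lam, Y = np.linalg.eigh(H)                   # eigenpairs of symmetric H

for lam_i, y_i in zip(lam, Y.T):
    print(y_i @ H @ y_i, lam_i * np.dot(y_i, y_i))   # the two sides agree

if np.all(lam > 0):
    print("H positive definite -> x* is a minimizer")
elif np.all(lam < 0):
    print("H negative definite -> x* is a maximizer")
else:
    print("H indefinite -> x* is a saddle point")
```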

  6. Types of optimization problems: $x^* = \arg\min_x f(x)$ with $f$ nonlinear, continuous, and smooth. Gradient-free methods evaluate $f(x)$. Gradient (first-derivative) methods evaluate $f(x)$ and $\nabla f(x)$. Second-derivative methods evaluate $f(x)$, $\nabla f(x)$, and $\nabla^2 f(x)$.

  7. Example (ND): Consider the function $f(x_1, x_2) = 2x_1^3 + 4x_2^2 + 2x_2 - 24x_1$. Find the stationary points and check the sufficient condition. Setting $\nabla f = [6x_1^2 - 24,\; 8x_2 + 2]^T = 0$ gives $x_1 = \pm 2$ and $x_2 = -1/4$, so the stationary points are $(2, -1/4)$ and $(-2, -1/4)$. The Hessian is $H = \begin{bmatrix} 12x_1 & 0 \\ 0 & 8 \end{bmatrix}$: at $(2, -1/4)$ it is positive definite (eigenvalues 24 and 8), so this point is a minimizer; at $(-2, -1/4)$ it is indefinite (eigenvalues $-24$ and 8), so this point is a saddle point.
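
A sketch verifying this worked example with SymPy (the formula for $f$ is my reconstruction of the garbled slide):

```python
# Sketch: find stationary points of the example and classify them by the
# eigenvalues of the Hessian.
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = 2*x1**3 + 4*x2**2 + 2*x2 - 24*x1

grad = [sp.diff(f, v) for v in (x1, x2)]
points = sp.solve(grad, [x1, x2], dict=True)   # [{x1: -2, x2: -1/4}, {x1: 2, x2: -1/4}]
H = sp.hessian(f, (x1, x2))

for p in points:
    eigs = list(H.subs(p).eigenvals())         # eigenvalue signs classify the point
    print(p, eigs)
# x1 = 2,  x2 = -1/4: eigenvalues {24, 8}  -> minimizer
# x1 = -2, x2 = -1/4: eigenvalues {-24, 8} -> saddle point
```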

  8. Optimization in ND: $\min_x f(x)$. Steepest Descent Method, illustrated with $f(x_1, x_2) = (x_1 - 1)^2 + (x_2 - 1)^2$. Given a function $f: \mathbb{R}^n \to \mathbb{R}$ at a point $x_k$, the function will decrease its value fastest in the direction of steepest descent: $-\nabla f(x_k)$. What is the steepest descent direction? [Figure: contour plot of $f$ over $(x_1, x_2)$ with the descent direction drawn from $x_0$.]

  9. Steepest Descent Method, with $f(x_1, x_2) = (x_1 - 1)^2 + (x_2 - 1)^2$. Start with the initial guess $x_0 = [3, 3]^T$. Check the update: $\nabla f(x_0) = [2(3-1),\; 2(3-1)]^T = [4, 4]^T$, so $x_1 = x_0 - \alpha\, \nabla f(x_0) = [3, 3]^T - \alpha\, [4, 4]^T$.

  10. Steepest Descent Method. Update the variable with $x_{k+1} = x_k - \alpha_k \nabla f(x_k)$, here with $f(x_1, x_2) = (x_1 - 1)^2 + (x_2 - 1)^2$ and $\nabla f(x_0) = [4, 4]^T$. How far along the gradient should we go? What is the “best step size” for $\alpha_k$, and how can we get it?
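
A short symbolic sketch of the step-size question for this quadratic (setup assumed from slides 9 and 10): minimizing $f(x_0 - \alpha \nabla f(x_0))$ over $\alpha$ shows that one exact step with $\alpha = 1/2$ lands on the minimizer $[1, 1]^T$.

```python
# Sketch: find the "best step size" along the steepest descent direction
# by minimizing f(x0 - alpha * grad f(x0)) over alpha.
import sympy as sp

a = sp.symbols('alpha', positive=True)
x1, x2 = sp.symbols('x1 x2')
f = (x1 - 1)**2 + (x2 - 1)**2

x0 = sp.Matrix([3, 3])
g0 = sp.Matrix([f.diff(x1), f.diff(x2)]).subs({x1: 3, x2: 3})   # [4, 4]
xa = x0 - a * g0                                # point along -grad f(x0)

phi = f.subs({x1: xa[0], x2: xa[1]})            # f as a 1D function of alpha
best = sp.solve(sp.diff(phi, a), a)             # [1/2]
print(best, xa.subs(a, best[0]))                # alpha = 1/2 lands on [1, 1]
```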

  11. [Figure: plot illustrating the line search along the descent direction; no further text is recoverable.]

  12. Steepest Descent Method. Algorithm: Initial guess: $x_0$. Evaluate: $s_k = -\nabla f(x_k)$. Perform a line search to obtain $\alpha_k$ (for example, Golden Section Search): $\alpha_k = \arg\min_\alpha f(x_k + \alpha s_k)$. Update: $x_{k+1} = x_k + \alpha_k s_k$.
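
A minimal sketch of this algorithm in Python; SciPy’s `golden` routine stands in for a hand-rolled Golden Section Search, and the bracket, tolerance, and iteration cap are assumptions:

```python
# Sketch: steepest descent with a golden-section line search.
import numpy as np
from scipy.optimize import golden

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        s = -grad(x)                         # steepest descent direction
        if np.linalg.norm(s) < tol:          # gradient ~ 0: stationary point
            break
        alpha = golden(lambda a: f(x + a * s), brack=(0.0, 1.0))
        x = x + alpha * s                    # x_{k+1} = x_k + alpha_k * s_k
    return x

f = lambda x: (x[0] - 1)**2 + (x[1] - 1)**2
grad = lambda x: np.array([2*(x[0] - 1), 2*(x[1] - 1)])
print(steepest_descent(f, grad, [3.0, 3.0]))   # ~ [1., 1.]
```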

  13. Line Search: with the update $x_{k+1} = x_k - \alpha_k \nabla f(x_k)$, we want to find the $\alpha_k$ that minimizes $f(x_{k+1})$. The first-order condition $\frac{d f(x_{k+1})}{d\alpha} = 0$ gives, by the chain rule, $\nabla f(x_{k+1}) \cdot \nabla f(x_k) = 0$: each new gradient is orthogonal to the previous search direction, which is why exact steepest descent zig-zags toward the minimum.
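
A quick numerical check of this orthogonality on an assumed quadratic, $f = x_1^2 + 5x_2^2$, where the zig-zag is easy to see:

```python
# Sketch: with an exact line search, successive steepest descent gradients
# satisfy grad f(x_{k+1}) . grad f(x_k) = 0.
import numpy as np
from scipy.optimize import minimize_scalar

f = lambda x: x[0]**2 + 5*x[1]**2
grad = lambda x: np.array([2*x[0], 10*x[1]])

x = np.array([2.0, 1.0])
for _ in range(4):
    g = grad(x)
    alpha = minimize_scalar(lambda a: f(x - a * g)).x   # exact-ish line search
    x_next = x - alpha * g
    print(np.dot(grad(x_next), g))   # ~ 0 at every step
    x = x_next
```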

  14. Example: $\min_{x_1, x_2} f(x_1, x_2)$. Consider minimizing the function $f(x_1, x_2) = 10x_1^3 - x_2^2 + x_1 - 1$. Given the initial guess $x_1 = 2$, $x_2 = 2$, what is the direction of the first step of gradient descent? The gradient is $\nabla f = [30x_1^2 + 1,\; -2x_2]^T = [121, -4]^T$ at the initial guess, so the first step points along $-\nabla f = [-121, 4]^T$.
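
A sketch checking this gradient with SymPy (the cubic form of $f$ is my reconstruction of the slide):

```python
# Sketch: evaluate the first gradient descent direction at the initial guess.
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = 10*x1**3 - x2**2 + x1 - 1
grad = sp.Matrix([f.diff(x1), f.diff(x2)]).subs({x1: 2, x2: 2})
print(-grad)   # direction of the first step: [-121, 4]
```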

  15. Newton’s Method. Using a Taylor expansion, we build a quadratic approximation of the nonlinear function: $f(x + s) \approx f(x) + \nabla f(x)^T s + \frac{1}{2} s^T H(x)\, s$. Setting the derivative of this model with respect to $s$ to zero (the first-order condition, using the fact that $H$ is symmetric) yields the linear system $H(x_0)\, s = -\nabla f(x_0)$, which we solve to find the Newton step $s$.
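
A tiny sketch of this step computation with hypothetical gradient and Hessian values, checking that the Newton step makes the quadratic model stationary:

```python
# Sketch: one Newton step from the first-order condition H s = -grad.
import numpy as np

grad = np.array([4.0, -2.0])                 # hypothetical grad f(x0)
H = np.array([[6.0, 1.0], [1.0, 4.0]])       # hypothetical Hessian at x0 (SPD)
s = np.linalg.solve(H, -grad)                # Newton step
print(s)
print(grad + H @ s)   # ~ [0, 0]: stationary point of the quadratic model
```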

  16. Newton’s Method. Algorithm: Initial guess: $x_0$. Solve: $H(x_k)\, s_k = -\nabla f(x_k)$. Update: $x_{k+1} = x_k + s_k$. Note that the Hessian is related to the curvature and therefore contains the information about how large the step should be.
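
A minimal sketch of the algorithm as stated on the slide (the test problem $f = e^{x_1} - x_1 + x_2^2$ is my own choice, not from the slides):

```python
# Sketch: Newton's method in ND — solve H(x_k) s_k = -grad f(x_k), then
# update x_{k+1} = x_k + s_k.
import numpy as np

def newton(grad, hess, x0, tol=1e-10, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        s = np.linalg.solve(hess(x), -g)     # O(n^3) dense solve per iteration
        x = x + s
    return x

# Hypothetical test problem with minimizer at [0, 0]:
grad = lambda x: np.array([np.exp(x[0]) - 1, 2*x[1]])
hess = lambda x: np.array([[np.exp(x[0]), 0.0], [0.0, 2.0]])
print(newton(grad, hess, [1.0, 2.0]))        # ~ [0., 0.]
```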

  17. Try this out! $f(x_1, x_2) = 0.5\,x_1^2 + 2.5\,x_2^2$. When using Newton’s Method to find the minimizer of this function, estimate the number of iterations it would take for convergence. A) 1 B) 2-5 C) 5-10 D) More than 10 E) Depends on the initial guess
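
A quick check of this quiz, assuming the function reads $f = 0.5x_1^2 + 2.5x_2^2$: because $f$ is quadratic, Newton’s quadratic model is exact, so the method converges in a single iteration from any starting guess.

```python
# Sketch: on a quadratic, one Newton step lands exactly on the minimizer.
import numpy as np

H = np.array([[1.0, 0.0], [0.0, 5.0]])       # constant Hessian of f
grad = lambda x: H @ x                        # grad f = [x1, 5*x2]

x = np.array([7.0, -3.0])                     # arbitrary initial guess
s = np.linalg.solve(H, -grad(x))              # one Newton step
print(x + s)                                  # [0., 0.]: the exact minimizer
```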

  18. Newton’s Method Summary. Algorithm: Initial guess: $x_0$. Solve: $H(x_k)\, s_k = -\nabla f(x_k)$. Update: $x_{k+1} = x_k + s_k$. About the method…
  • Typical quadratic convergence :)
  • Need second derivatives :(
  • Local convergence (the initial guess must be close to the solution)
  • Works poorly when the Hessian is nearly indefinite
  • Cost per iteration: $O(n^3)$
