Optimization (Introduction)



  1. Optimization (Introduction)

  2. Optimization. 1D optimization: f : ℛ → ℛ, f(x).
  Goal: find the minimizer x* that minimizes the objective (cost) function f : ℛⁿ → ℛ.
  Unconstrained optimization: x* = arg min f(x).
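A minimal sketch of running such an unconstrained minimization numerically with SciPy (the quadratic objective below is an illustrative placeholder, not one from the slides):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative objective f: R^2 -> R (placeholder, not from the slides)
def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

# Unconstrained optimization: x* = arg min f(x)
result = minimize(f, x0=np.zeros(2))
print(result.x)  # ~ [1, -2]
```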

  3. Optimization. Goal: find the minimizer x* that minimizes the objective (cost) function f : ℛⁿ → ℛ.
  Constrained optimization: f(x*) = min f(x) subject to hᵢ(x) = 0 (equality constraints) and gᵢ(x) ≤ 0 (inequality constraints).
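A minimal sketch of the constrained form with SciPy (the functions f, h, g below are made-up placeholders; note that SciPy expects inequality constraints in the form g(x) ≥ 0, hence the sign flip):

```python
from scipy.optimize import minimize

f = lambda x: x[0] ** 2 + x[1] ** 2   # objective (placeholder)
h = lambda x: x[0] + x[1] - 1.0       # equality constraint h(x) = 0
g = lambda x: x[0] - 0.8              # inequality constraint g(x) <= 0

constraints = [{"type": "eq", "fun": h},
               {"type": "ineq", "fun": lambda x: -g(x)}]  # -g(x) >= 0  <=>  g(x) <= 0
result = minimize(f, x0=[0.0, 0.0], constraints=constraints)
print(result.x)  # ~ [0.5, 0.5]
```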

  4. Unconstrained optimization. What if we are looking for a maximizer x*? f(x*) = max f(x) = −min(−f(x)).

  5. Calculus problem: maximize the rectangle area subject to a perimeter constraint.
  Maximize A = d₁·d₂ over d₁, d₂ ∈ ℛ subject to the perimeter constraint 2(d₁ + d₂) = 40.
  The annotated solution: d₁ = d₂ = 10, giving A = 100.

  6. Setup: area A(d₁, d₂) = d₁·d₂, perimeter P(d₁, d₂) = 2(d₁ + d₂); increasing the area without bound violates the perimeter constraint.
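The annotated solution (d₁ = d₂ = 10, A = 100) is consistent with a perimeter of 40; assuming that value, a quick symbolic check with SymPy, substituting the constraint d₂ = 20 − d₁:

```python
import sympy as sp

d1 = sp.symbols("d1", positive=True)
area = d1 * (20 - d1)   # d2 = 20 - d1 from the perimeter constraint 2(d1 + d2) = 40

crit = sp.solve(sp.diff(area, d1), d1)  # first-order condition dA/dd1 = 0
print(crit, area.subs(d1, crit[0]))     # [10] 100 -> d1 = d2 = 10, A = 100
```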

  7. " sizes A :÷÷÷ILi € ¥ ⇐ ¥ ¥ .dz ) pig :i÷÷÷ :* * i ; . dies d ' ( d , .dz )

  8. What is the optimal solution? (1D) f(x*) = min f(x) (or max).
  (First-order) necessary condition: f′(x*) = 0 (stationary points).
  (Second-order) sufficient condition: f″(x*) > 0 → x* is a minimum; f″(x*) < 0 → x* is a maximum.
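These two conditions can be checked mechanically; a small SymPy sketch on a made-up cubic (not from the slides):

```python
import sympy as sp

x = sp.symbols("x")
f = x**3 - 3*x  # illustrative function (not from the slides)

for xs in sp.solve(sp.diff(f, x), x):      # stationary points: f'(x) = 0
    curv = sp.diff(f, x, 2).subs(x, xs)    # sign of f''(x*) classifies the point
    print(xs, "min" if curv > 0 else "max" if curv < 0 else "inconclusive")
# x = -1: max (f'' = -6), x = 1: min (f'' = 6)
```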

  9. Does the solution exist? Is it a local or a global solution? [Figure: local vs. global minima of a 1D function.]

  10. Example (1D): min f(x). Consider the function f(x) = x⁴/4 − x³/3 − 11x² + 40x. Find the stationary points and check the sufficient condition.
  First-order necessary condition: f′(x) = x³ − x² − 22x + 40 = 0, with solutions x = −5, 2, 4.
  Second-order condition: f″(x) = 3x² − 2x − 22; f″(−5) = 75 + 10 − 22 = 63 > 0 (minimum), f″(2) = −14 < 0 (maximum), f″(4) = 18 > 0 (minimum).
  [Figure: plot of f(x) on −6 ≤ x ≤ 6 showing the maximum near x = 2 and the minima at x = −5 and x = 4.]
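The same computation, verified with SymPy:

```python
import sympy as sp

x = sp.symbols("x")
f = x**4/4 - x**3/3 - 11*x**2 + 40*x

fp = sp.diff(f, x)        # x**3 - x**2 - 22*x + 40
fpp = sp.diff(f, x, 2)    # 3*x**2 - 2*x - 22
for xs in sp.solve(fp, x):
    print(xs, fpp.subs(x, xs))  # -5: 63 (min), 2: -14 (max), 4: 18 (min)
```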

  11. Types of optimization problems: f(x*) = min f(x), with f nonlinear, continuous, and smooth.
  • Gradient-free methods: evaluate f(x).
  • Gradient (first-derivative) methods: evaluate f(x), f′(x).
  • Second-derivative methods: evaluate f(x), f′(x), f″(x).

  12. Optimization in 1D: Golden Section Search.
  • Similar idea to the bisection method for root finding.
  • Needs to bracket the minimum inside an interval.
  • Requires the function to be unimodal.
  A function f : ℛ → ℛ is unimodal on an interval [a, b] if:
  • there is a unique x* ∈ [a, b] such that f(x*) is the minimum on [a, b];
  • for any x₁, x₂ ∈ [a, b] with x₁ < x₂: x₂ < x* ⟹ f(x₁) > f(x₂), and x₁ > x* ⟹ f(x₁) < f(x₂).

  13. [Figure: one iteration of golden section search on [a, b] with interior points x₁ < x₂. If f(x₁) < f(x₂), the minimum lies in [a, x₂]; if f(x₁) > f(x₂), it lies in [x₁, b].]

  14. Propose the points x₁ = a + (1 − τ)hₖ and x₂ = a + τhₖ, with hₖ = b − a at the start.
  If f₁ < f₂, the new interval is [a, x₂]; if f₁ > f₂, it is [x₁, b].
  At every iteration hₖ₊₁ = τhₖ, so each interval is a factor τ of the previous one. Requiring that one interior point be reused gives τ² = 1 − τ, i.e. τ ≈ 0.618.

  15. With τ = 0.618 and the initial interval [a, b], h₀ = b − a: x₁ = a + (1 − τ)h₀, x₂ = a + τh₀.
  If f₁ < f₂ → x* ∈ [a, x₂]: set b = x₂, x₂ = x₁, f₂ = f₁, then x₁ = a + (1 − τ)hₖ₊₁ and f₁ = f(x₁).
  If f₁ > f₂ → x* ∈ [x₁, b]: set a = x₁, x₁ = x₂, f₁ = f₂, then x₂ = a + τhₖ₊₁ and f₂ = f(x₂).

  16. Golden Section Search
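The slide title suggests the algorithm itself; here is a minimal Python sketch following the update rules of slides 14 and 15 (the tolerance and test function are illustrative choices):

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b]. Derivative-free; one new
    function evaluation per iteration after the first."""
    tau = (math.sqrt(5) - 1) / 2          # ~0.618, satisfies tau**2 = 1 - tau
    h = b - a
    x1, x2 = a + (1 - tau) * h, a + tau * h
    f1, f2 = f(x1), f(x2)
    while h > tol:
        h *= tau                          # interval shrinks by tau each iteration
        if f1 < f2:                       # x* in [a, x2]: reuse x1 as the new x2
            b, x2, f2 = x2, x1, f1
            x1 = a + (1 - tau) * h
            f1 = f(x1)
        else:                             # x* in [x1, b]: reuse x2 as the new x1
            a, x1, f1 = x1, x2, f2
            x2 = a + tau * h
            f2 = f(x2)
    return (a + b) / 2

print(golden_section_search(lambda x: (x - 3.0) ** 2, 0.0, 10.0))  # ~3.0
```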

  17. Golden Section Search. What happens to the length of the interval after one iteration? h₁ = τh₀, or in general hₖ₊₁ = τhₖ. Hence the interval is reduced by the factor τ (for the bisection method for solving nonlinear equations, the factor is 0.5).
  For the recursion that reuses a point: τh₁ = (1 − τ)h₀ and h₁ = τh₀, so τ² = 1 − τ, giving τ = (√5 − 1)/2 ≈ 0.618.
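A one-line SymPy check of that recursion (the positive root is the reciprocal of the golden ratio):

```python
import sympy as sp

tau = sp.symbols("tau", positive=True)
print(sp.solve(sp.Eq(tau**2, 1 - tau), tau))  # [-1/2 + sqrt(5)/2] ~ 0.618
```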

  18. Golden Section Search.
  • Derivative-free method: it only evaluates f(x).
  • Slow convergence: lim |eₖ₊₁|/|eₖ| = 0.618 as k → ∞, with r = 1 (linear convergence).
  • Only one new function evaluation per iteration (cheap).

  19. Example: h₀ = b − a = 20. What is h₁? h₁ = τh₀ = 0.618 × 20 = 12.36.

  20. Newton’s Method. Using a Taylor expansion, we approximate the function f with a quadratic function about x₀:
  f(x) ≈ f(x₀) + f′(x₀)(x − x₀) + ½ f″(x₀)(x − x₀)²
  We then find the minimum of this quadratic using the first-order necessary condition (stationary point):
  f′(x₀) + f″(x₀)(x − x₀) = 0 ⟹ x = x₀ − f′(x₀)/f″(x₀),
  i.e. xₖ₊₁ = xₖ + h with h = −f′(xₖ)/f″(xₖ).

  21. Newton’s Method.
  • Algorithm: x₀ = starting guess; xₖ₊₁ = xₖ − f′(xₖ)/f″(xₖ).
  • Convergence:
    • typical quadratic convergence;
    • local convergence (the starting guess must be close to the solution);
    • may fail to converge, or converge to a maximum or a point of inflection.
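A minimal Python sketch of the iteration (the stopping rule and starting point are illustrative choices):

```python
def newton_minimize(fp, fpp, x0, tol=1e-10, max_iter=50):
    """1D Newton's method for optimization. Quadratic local convergence,
    but it may diverge or land on a maximum/inflection point."""
    x = x0
    for _ in range(max_iter):
        step = -fp(x) / fpp(x)     # x_{k+1} = x_k - f'(x_k)/f''(x_k)
        x += step
        if abs(step) < tol:
            break
    return x

# f(x) = x^4/4 - x^3/3 - 11x^2 + 40x from slide 10
fp = lambda x: x**3 - x**2 - 22*x + 40
fpp = lambda x: 3*x**2 - 2*x - 22
print(newton_minimize(fp, fpp, x0=5.0))  # converges to the local minimum x* = 4
```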

  22. Newton’s Method (Graphical Representation). [Figure: the sequence of approximations x₀, x₁, x₂, x₃ produced by successive quadratic approximations of f.]

  23. Example. Consider the function f(x) = 4x³ + 2x² + 5x + 40. If we use the initial guess x₀ = 2, what is the value of x after one iteration of Newton’s method?
  f′(x) = 12x² + 4x + 5, f″(x) = 24x + 4.
  h = −f′(x₀)/f″(x₀) = −61/52 ≈ −1.1731, so x₁ = x₀ + h ≈ 0.8269.
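Checking the arithmetic in Python:

```python
fp = lambda x: 12*x**2 + 4*x + 5   # f'(x)
fpp = lambda x: 24*x + 4           # f''(x)

x0 = 2.0
h = -fp(x0) / fpp(x0)              # -61/52 ~ -1.1731
print(x0 + h)                      # x1 ~ 0.8269
```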
