
Computational Optimization: Constrained Optimization (PowerPoint presentation)



  1. Computational Optimization: Constrained Optimization

  2. Easiest Problem: linear equality constraints. min f(x) s.t. Ax = b, where A ∈ R^(m×n), b ∈ R^m, x ∈ R^n.

  3. Null Space Representation: Let x* be a feasible point, Ax* = b. Any other feasible point can be written as x = x* + p, where Ap = 0. The feasible region is {x = x* + p : p ∈ N(A)}, where N(A) is the null space of A.

  4. Example: min ½(x₁² + 2x₂² + 3x₃²) s.t. x₁ + 3x₂ + 4x₃ = 4. Solve by substitution: the constraint gives x₁ = 4 − 3x₂ − 4x₃, so the problem becomes the unconstrained problem min ½((4 − 3x₂ − 4x₃)² + 2x₂² + 3x₃²).

  5. Null Space Method: for the same problem, min ½(x₁² + 2x₂² + 3x₃²) s.t. x₁ + 3x₂ + 4x₃ = 4, take the feasible point x* = [4 0 0]' and the null-space basis Z = [−3 −4; 1 0; 0 1]. Then x = x* + Zv = [4 − 3v₁ − 4v₂; v₁; v₂], and the problem becomes min ½((4 − 3v₁ − 4v₂)² + 2v₁² + 3v₂²).
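As a quick numerical check of this slide's example (a sketch in Python/NumPy rather than the deck's MATLAB): the reduced problem is an unconstrained quadratic, so its minimizer solves (Z'HZ)v = −Z'∇f(x*).

```python
import numpy as np

A = np.array([[1.0, 3.0, 4.0]])
b = np.array([4.0])
x_star = np.array([4.0, 0.0, 0.0])           # feasible point: A @ x_star = b
Z = np.array([[-3.0, -4.0],
              [ 1.0,  0.0],
              [ 0.0,  1.0]])                 # null-space basis: A @ Z = 0

H = np.diag([1.0, 2.0, 3.0])                 # Hessian of f(x) = (x1^2 + 2 x2^2 + 3 x3^2)/2
g = H @ x_star                               # gradient of f at x*

# Reduced (unconstrained) problem: minimize g(v) = f(x* + Z v)
v = np.linalg.solve(Z.T @ H @ Z, -Z.T @ g)
x = x_star + Z @ v

print(x)          # constrained minimizer
print(A @ x)      # still feasible: equals b
```

This returns x* = [24, 36, 32]/65, which is feasible and satisfies the KKT condition ∇f(x*) = λA' with λ = 24/65.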

  6. Variable Reduction Method: let A = [B N], where A ∈ R^(m×n) with m < n (assumed), B ∈ R^(m×m) is the basis matrix taken from x* (a basic feasible solution with at most m nonzero variables, corresponding to the columns of B), and N ∈ R^(m×(n−m)). Then Z = [−B⁻¹N; I] is a basis matrix for the null space of A, since AZ = −N + N = 0. Moreover A_r = [B⁻¹; 0] is a right inverse: AA_r = [B N][B⁻¹; 0] = BB⁻¹ = I.

  7. Where did Z come from? A = [1 3 4], x* = [4 0 0]'. With A = [B N], B = [1] and N = [3 4], so Z = [−B⁻¹N; I] = [−1·[3 4]; I] = [−3 −4; 1 0; 0 1].
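The same construction can be sketched in NumPy for this slide's data: split A into [B N], stack −B⁻¹N on an identity block, and confirm AZ = 0.

```python
import numpy as np

A = np.array([[1.0, 3.0, 4.0]])   # slide's example: B = [1], N = [3 4]
m, n = A.shape
B, N = A[:, :m], A[:, m:]

Z = np.vstack([-np.linalg.solve(B, N),   # -B^{-1} N (solve, rather than an explicit inverse)
               np.eye(n - m)])

print(Z)        # -B^{-1}N stacked on I
print(A @ Z)    # zero matrix: columns of Z span Null(A)
```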

  8. General Method: there exists a null space matrix Z ∈ R^(n×r), r ≥ n − m. The feasible region is {x* + Zv}, giving the equivalent “reduced” problem min over v of f(x* + Zv).

  9. Practice Problem: set up and solve a small quadratic program with linear equality constraints by the null space method. (The problem data on this slide was garbled in conversion and could not be recovered.)

  10. Optimality Conditions: assume a feasible point x* and convert to the null space formulation g(v) = f(x* + Zv). The first-order condition is ∇g(v) = Z'∇f(x* + Zv) = Z'∇f(y) = 0, where y = x* + Zv; the reduced Hessian is ∇²g(v) = Z'∇²f(x* + Zv)Z = Z'∇²f(y)Z.

  11. Lemma 14.1 Necessary Conditions (Nash + Sofer): If x* is a local min of f over {x : Ax = b}, and Z is a null space matrix, then Z'∇f(x*) = 0 and Z'∇²f(x*)Z is positive semidefinite. Or, equivalently, in KKT form: ∇f(x*) − A'λ = 0 has a solution, Ax* = b, and Z'∇²f(x*)Z is positive semidefinite.

  12. Lemma 14.2 Sufficient Conditions (Nash + Sofer): If x* satisfies Ax* = b, Z'∇f(x*) = 0, and Z'∇²f(x*)Z is positive definite (where Z is a basis matrix for Null(A)), then x* is a strict local minimizer.

  13. Lemma 14.2 Sufficient Conditions (KKT form): If (x*, λ*) satisfies Ax* = b, ∇f(x*) − A'λ* = 0, and Z'∇²f(x*)Z is positive definite (where Z is a basis matrix for Null(A)), then x* is a strict local minimizer.

  14. Lagrange Multiplier: λ* is called the Lagrange multiplier. It represents the sensitivity of the solution to small perturbations of the constraints: f(x̂) ≈ f(x*) + (x̂ − x*)'∇f(x*) = f(x*) + (x̂ − x*)'A'λ* by the KKT optimality conditions. Now let Ax̂ = b + δ. Then f(x̂) ≈ f(x*) + δ'λ* = f(x*) + Σᵢ δᵢλᵢ* (sum over i = 1, …, m).

  15. Optimality conditions: consider min (x² + 4y²)/2 s.t. x − y = 10. The conditions ∇f(x) − A'λ = 0 and Ax = b become [x; 4y] = λ[1; −1] and x − y = 10, giving x* = 8, y* = −2, λ* = 8.

  16. Optimality conditions: find the KKT point, then check the SOSC. From ∇f(x) − A'λ = 0 and Ax = b, i.e. [x; 4y] = λ[1; −1] and x − y = 10, we get x* = 8, y* = −2, λ* = 8. With Z' = [1 1] and ∇²f(x) = [1 0; 0 4], Z'∇²f(x)Z is positive definite, so the SOSC is satisfied. Or we could just observe that it is a convex program, so the FONC are sufficient.

  17. Linear Equality Constraints - I: min ½(x₁² + 4x₂²) s.t. x₁ − x₂ = 10. The conditions ∇f(x) = A'λ and Ax = b, with ∇f(x) = [x₁; 4x₂] and A = [1 −1], give [x₁; 4x₂] = λ[1; −1] and x₁ − x₂ = 10.

  18. Linear Equality Constraints - II: Solve: x₁ = λ and 4x₂ = −λ ⇒ x₁ = −4x₂. Then x₁ − x₂ = 10 ⇒ −4x₂ − x₂ = 10 ⇒ −5x₂ = 10 ⇒ x₂ = −2, x₁ = 8, λ = 8. So x* = [8; −2], λ* = 8 is the KKT point.
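This KKT point can be found by solving one small linear system, and the same sketch also checks the sensitivity interpretation from slide 14: perturbing the right-hand side b changes the optimal value at the rate λ*. (Python/NumPy sketch; the helper name solve_kkt is ours, not from the slides.)

```python
import numpy as np

def solve_kkt(b):
    # KKT system for min (x1^2 + 4 x2^2)/2  s.t.  x1 - x2 = b:
    #   x1 - lam = 0,   4 x2 + lam = 0,   x1 - x2 = b
    K = np.array([[1.0, 0.0, -1.0],
                  [0.0, 4.0,  1.0],
                  [1.0, -1.0, 0.0]])
    x1, x2, lam = np.linalg.solve(K, [0.0, 0.0, b])
    f = 0.5 * (x1**2 + 4 * x2**2)
    return x1, x2, lam, f

x1, x2, lam, f = solve_kkt(10.0)
print(x1, x2, lam)            # approximately 8, -2, 8

# Sensitivity: df/db should be approximately lambda*
delta = 1e-4
_, _, _, f2 = solve_kkt(10.0 + delta)
print((f2 - f) / delta)       # close to lambda* = 8
```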

  19. Linear Equality Constraints - III: SOSC: for A = [1 -1], Z = [1; 1]. With ∇²f(x) = [1 0; 0 4], Z'∇²f(x)Z = [1 1][1 0; 0 4][1; 1] = 5 > 0, so the SOSC is satisfied and x* is a strict local minimum. The objective is convex, so the KKT conditions are also sufficient.
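The SOSC check above is a one-line matrix product; a NumPy sketch:

```python
import numpy as np

A = np.array([[1.0, -1.0]])
Z = np.array([[1.0], [1.0]])               # basis for Null(A): A @ Z = 0
H = np.array([[1.0, 0.0], [0.0, 4.0]])     # Hessian of (x1^2 + 4 x2^2)/2

reduced = Z.T @ H @ Z                      # reduced Hessian Z' H Z
print(reduced)                             # [[5.]]: positive definite, SOSC holds
```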

  20. Handy ways to compute a null space matrix: the variable reduction method, the orthogonal projection matrix, QR factorization (best numerically), or Z = null(A) in MATLAB.
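The QR route the slide recommends can be sketched in NumPy: take a full QR factorization of A' and use the trailing columns of Q, which are orthonormal and span Null(A). (The helper name null_space_qr and the rank tolerance are our choices for illustration.)

```python
import numpy as np

def null_space_qr(A):
    """Orthonormal basis for Null(A) from a full QR factorization of A'."""
    m, n = A.shape
    Q, R = np.linalg.qr(A.T, mode='complete')      # Q is n x n
    rank = int(np.sum(np.abs(np.diag(R)) > 1e-12)) # numerical rank of A
    return Q[:, rank:]                             # trailing columns span Null(A)

A = np.array([[1.0, 3.0, 5.0],
              [2.0, 4.0, -1.0]])
Z = null_space_qr(A)
print(np.max(np.abs(A @ Z)))    # ~0: columns of Z lie in the null space
```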

  21. Orthogonal Projection Method: use optimization. Minimize the distance between a given point c and the null space of A: min ½‖p − c‖² s.t. Ap = 0. The FONC are ∇f(p*) = A'λ and Ap* = 0, or equivalently (p* − c) = A'λ, Ap* = 0.

  22. Orthogonal Projection Method: the optimality conditions give us the solution. The FONC are (p* − c) = A'λ and Ap* = 0. Multiplying by A: Ap* − Ac = AA'λ ⇒ λ = −(AA')⁻¹Ac ⇒ p* = A'λ + c = −A'(AA')⁻¹Ac + c = (I − A'(AA')⁻¹A)c.

  23. Orthogonal Projection Method: the final result is that I − A'(AA')⁻¹A is a null space matrix for A. Note that the null space matrix is not unique. Try it in MATLAB for A = [1 3 5; 2 4 -1] and compare with null(A) and null(A,'r').
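The same experiment in NumPy for the slide's A: build P = I − A'(AA')⁻¹A and verify it projects onto Null(A).

```python
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [2.0, 4.0, -1.0]])
m, n = A.shape

# Orthogonal projector onto Null(A): P = I - A'(AA')^{-1} A
P = np.eye(n) - A.T @ np.linalg.solve(A @ A.T, A)

print(np.max(np.abs(A @ P)))       # ~0: every column of P lies in Null(A)
print(np.allclose(P, P @ P))       # True: P is idempotent (a projector)
print(np.linalg.matrix_rank(P))    # 1 = n - m, the dimension of Null(A)
```

Unlike the orthonormal basis from null(A), P has n columns but only rank n − m, illustrating the slide's point that a null space matrix is not unique.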

  24. Get the Lagrange multipliers for free! The matrix A_r = A'(AA')⁻¹ satisfies AA_r = AA'(AA')⁻¹ = I, so it is the right inverse matrix for A. For the general problem min f(x) s.t. Ax = b, the multipliers are λ* = A_r'∇f(x*).

  25. Let’s try it. For min f(x) = ½(x₁² + x₂² + x₃² + x₄²) s.t. x₁ + x₂ + x₃ + x₄ = 1, with A = [1 1 1 1], the projection matrix is Z = I − A'(AA')⁻¹A = I − ¼[1 1 1 1]'[1 1 1 1], i.e. the 4×4 matrix with 3/4 on the diagonal and −1/4 everywhere off the diagonal.

  26. Solve the FONC for the Optimal Point: the FONC ∇f(x) − A'λ = 0 gives [x₁; x₂; x₃; x₄] = λ[1; 1; 1; 1], together with x₁ + x₂ + x₃ + x₄ = 1.

  27. Check Optimality Conditions: for x* = [1 1 1 1]'/4 we have ∇f(x*) = [1 1 1 1]'/4, so Z∇f(x*) = 0 (with Z the projection matrix above) and Ax* = b. Using the right inverse, A_r = A'(AA')⁻¹ = [1 1 1 1]'/4, so λ* = A_r'∇f(x*) = 1/4. Clearly ∇f(x*) = A'λ*.
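These checks take a few lines in NumPy:

```python
import numpy as np

A = np.array([[1.0, 1.0, 1.0, 1.0]])
b = np.array([1.0])

x_star = np.full(4, 0.25)     # claimed minimizer of x'x/2 s.t. sum(x) = 1
grad = x_star                 # gradient of f(x) = x'x/2 is x

Z = np.eye(4) - A.T @ np.linalg.solve(A @ A.T, A)    # projection null-space matrix
A_r = A.T @ np.linalg.solve(A @ A.T, np.eye(1))      # right inverse A'(AA')^{-1}

print(np.max(np.abs(Z @ grad)))   # ~0: reduced gradient vanishes
print(A @ x_star)                 # [1.]: feasible
print(A_r.T @ grad)               # [0.25]: the multiplier lambda*
```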

  28. You try it: min f(x) = ½x'Cx s.t. Ax = b, for C = [0 −13 −6 −3; −13 23 −9 3; −6 −9 −12 1; −3 3 1 3], A = [2 1 2 1; 1 1 3 −1], b = [2; 3]. Find the projection matrix; confirm the optimality conditions Z'Cx* = 0 and Ax* = b; find x*; compute the Lagrange multipliers; and check the Lagrangian form of the multipliers.
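One way to check your answer is to solve the full KKT system directly. (A NumPy sketch; C, A, and b below are our reconstruction of the garbled slide data, with C assumed symmetric.)

```python
import numpy as np

# Problem data as reconstructed from the slide
C = np.array([[  0.0, -13.0,  -6.0, -3.0],
              [-13.0,  23.0,  -9.0,  3.0],
              [ -6.0,  -9.0, -12.0,  1.0],
              [ -3.0,   3.0,   1.0,  3.0]])
A = np.array([[2.0, 1.0, 2.0,  1.0],
              [1.0, 1.0, 3.0, -1.0]])
b = np.array([2.0, 3.0])
m, n = A.shape

# KKT system:  C x - A' lam = 0,  A x = b
K = np.block([[C, -A.T],
              [A, np.zeros((m, m))]])
sol = np.linalg.solve(K, np.concatenate([np.zeros(n), b]))
x, lam = sol[:n], sol[n:]

Z = np.eye(n) - A.T @ np.linalg.solve(A @ A.T, A)   # projection null-space matrix
print(np.max(np.abs(Z @ (C @ x))))   # ~0: Z'Cx* = 0
print(A @ x)                          # equals b
print(lam)                            # Lagrange multipliers
```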

  29. Variable Reduction Method (recap): let A = [B N], where A is m×n, B is m×m, and m < n (assumed). Then Z = [−B⁻¹N; I] is a basis matrix for the null space of A, and A_r = [B⁻¹; 0] is a right inverse: AA_r = [B N][B⁻¹; 0] = BB⁻¹ = I.
