

1. DM545 Linear and Integer Programming, Lecture 6: Sensitivity Analysis and Farkas Lemma. Marco Chiarandini, Department of Mathematics & Computer Science, University of Southern Denmark.

2. Outline: 1. Geometric Interpretation, 2. Sensitivity Analysis, 3. Farkas Lemma

3. Outline: 1. Geometric Interpretation, 2. Sensitivity Analysis, 3. Farkas Lemma

4. Geometric Interpretation. Consider the primal problem P:
   max  x_1 + x_2
        2x_1 +  x_2 ≤ 14
       −x_1  + 2x_2 ≤ 8
        2x_1 −  x_2 ≤ 10
        x_1, x_2 ≥ 0
   Optimum: x* = (4, 6), z* = 10. To prove this we need to prove that y* = (3/5, 1/5, 0) is a feasible solution of the dual D:
   min  14y_1 + 8y_2 + 10y_3 = w
        2y_1 −  y_2 + 2y_3 ≥ 1
         y_1 + 2y_2 −  y_3 ≥ 1
        y_1, y_2, y_3 ≥ 0
   and that w* = 10.
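A quick numerical check of these claims (a sketch, not part of the slides), using numpy: x* is feasible for P, y* is feasible for D, and the two objective values coincide, which by weak duality certifies optimality of both.

```python
import numpy as np

# Primal P: max x1 + x2  s.t.  2x1 + x2 <= 14, -x1 + 2x2 <= 8, 2x1 - x2 <= 10, x >= 0
A = np.array([[ 2.0,  1.0],
              [-1.0,  2.0],
              [ 2.0, -1.0]])
b = np.array([14.0, 8.0, 10.0])
c = np.array([1.0, 1.0])

x_star = np.array([4.0, 6.0])        # claimed primal optimum
y_star = np.array([3/5, 1/5, 0.0])   # claimed dual optimum

print(np.all(A @ x_star <= b + 1e-9), np.all(x_star >= 0))    # True True: x* feasible for P
print(np.all(A.T @ y_star >= c - 1e-9), np.all(y_star >= 0))  # True True: y* feasible for D
print(c @ x_star, np.isclose(c @ x_star, b @ y_star))         # 10.0 True: equal objective values
```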

5. Geometric Interpretation (continued). Scaling the first two constraints by y* and adding:
   3/5 · (2x_1 + x_2 ≤ 14)
   1/5 · (−x_1 + 2x_2 ≤ 8)
   gives x_1 + x_2 ≤ 10, so the feasible region of P is a subset of the half-plane x_1 + x_2 ≤ 10 (and hence z* ≤ 10).
   More generally, multipliers v, w ≥ 0 yield the half-plane (2v − w)x_1 + (v + 2w)x_2 ≤ 14v + 8w. The half-planes of this form that contain the feasible region of P and pass through (4, 6) are those with
   2v − w ≥ 1,   v + 2w ≥ 1.
   Examples of boundary lines among those allowed:
   v = 1, w = 0  ⟹  2x_1 + x_2 = 14
   v = 1, w = 1  ⟹  x_1 + 3x_2 = 22
   v = 2, w = 1  ⟹  3x_1 + 4x_2 = 36
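A small sketch (not from the slides) that evaluates the combined half-plane for the pairs (v, w) listed above and for the dual-optimal pair (3/5, 1/5), using exact fractions:

```python
from fractions import Fraction as F

def half_plane(v, w):
    # coefficients and RHS of (2v - w) x1 + (v + 2w) x2 <= 14v + 8w
    return 2*v - w, v + 2*w, 14*v + 8*w

for v, w in [(1, 0), (1, 1), (2, 1), (F(3, 5), F(1, 5))]:
    a1, a2, rhs = half_plane(v, w)
    print(f"v={v}, w={w}:  {a1}*x1 + {a2}*x2 <= {rhs}")
# v=1, w=0:     2*x1 + 1*x2 <= 14   (the first constraint itself)
# v=1, w=1:     1*x1 + 3*x2 <= 22
# v=2, w=1:     3*x1 + 4*x2 <= 36
# v=3/5, w=1/5: 1*x1 + 1*x2 <= 10   (the dual-optimal combination)
# Every such boundary line passes through (4, 6), since (2v-w)*4 + (v+2w)*6 = 14v + 8w.
```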

6. Outline: 1. Geometric Interpretation, 2. Sensitivity Analysis, 3. Farkas Lemma

7. Sensitivity Analysis (aka postoptimality analysis). Instead of solving each modified problem from scratch, exploit the results obtained from solving the original problem
   max { c^T x | Ax = b, l ≤ x ≤ u }   (*)
   (I) Changes to the coefficients of the objective function: max { c̃^T x | Ax = b, l ≤ x ≤ u }  (primal). The optimal solution x* of (*) remains feasible, hence we can restart the simplex from x*.
   (II) Changes to the RHS terms: max { c^T x | Ax = b̃, l ≤ x ≤ u }  (dual). Let x* be the optimal solution of (*). Basic solution x̄ of (II): x̄_N = x*_N, A_B x̄_B = b̃ − A_N x̄_N. x̄ is dual feasible and we can start the dual simplex from there. If b̃ differs from b only slightly, it may be that we are already optimal.
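A minimal sketch of case (II) in numpy (not from the slides; it borrows the standard-form data of the example on the next slides): after changing b the reduced costs are untouched, so only the basic values x_B need to be recomputed and checked against their bounds.

```python
import numpy as np

# max 6x1 + 8x2 with slacks x3, x4:  5x1 + 10x2 + x3 = 60,  4x1 + 4x2 + x4 = 40
A = np.array([[5.0, 10.0, 1.0, 0.0],
              [4.0,  4.0, 0.0, 1.0]])
basis, nonbasis = [1, 0], [2, 3]     # optimal basis: x2, x1
b_tilde = np.array([65.0, 40.0])     # RHS changed from (60, 40) to (60 + 5, 40)

x_N = np.zeros(2)                    # non-basic variables stay at their bound (here 0)
x_B = np.linalg.solve(A[:, basis], b_tilde - A[:, nonbasis] @ x_N)
print(x_B)                           # ~[3. 7.]: x2 = 3, x1 = 7, still >= 0,
                                     # so the old basis remains optimal for the new RHS
```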

8. (III) Introduce a new variable (primal):
   max Σ_{j=1..6} c_j x_j                     →   max Σ_{j=1..7} c_j x_j
   Σ_{j=1..6} a_ij x_j = b_i, i = 1,…,3       →   Σ_{j=1..7} a_ij x_j = b_i, i = 1,…,3
   l_j ≤ x_j ≤ u_j, j = 1,…,6                 →   l_j ≤ x_j ≤ u_j, j = 1,…,7
   If [x*_1, …, x*_6] is feasible, then [x*_1, …, x*_6, 0] is feasible for the enlarged problem, so the primal simplex can be restarted from it.
   (IV) Introduce a new constraint (dual), with new bounded variables x_7, x_8:
   Σ_{j=1..6} a_4j x_j + x_7 = b_4,   Σ_{j=1..6} a_5j x_j + x_8 = b_5,   l_j ≤ x_j ≤ u_j, j = 7, 8
   If [x*_1, …, x*_6] is optimal, then [x*_1, …, x*_6, x*_7, x*_8] with
   x*_7 = b_4 − Σ_{j=1..6} a_4j x*_j,   x*_8 = b_5 − Σ_{j=1..6} a_5j x*_j
   satisfies the new equations; it is dual feasible (the bounds on x_7, x_8 may be violated), so the dual simplex can be restarted from it.

9. Example (I), variation of reduced costs:
   max 6x_1 + 8x_2
       5x_1 + 10x_2 ≤ 60
       4x_1 +  4x_2 ≤ 40
       x_1, x_2 ≥ 0
   Initial tableau:
         | x_1  x_2  x_3  x_4  −z |  b
     x_3 |  5    10   1    0    0 | 60
     x_4 |  4     4   0    1    0 | 40
         |  6     8   0    0    1 |  0
   Final tableau (it gives the possibility to estimate the effect of variations):
         | x_1  x_2   x_3   x_4  −z |   b
     x_2 |  0    1    1/5  −1/4   0 |   2
     x_1 |  1    0   −1/5   1/2   0 |   8
         |  0    0   −2/5  −1     1 | −64
   For a variable in basis the perturbation goes unchanged into its reduced cost. E.g., for max (6 + δ)x_1 + 8x_2:
   c̄_1 = (−2/5)·5 + (−1)·4 + 1·(6 + δ) = δ,
   so we then need to bring the tableau back into canonical form, and hence δ changes the objective value.
   For a variable not in basis, if the perturbation changes the sign of its reduced cost ⟹ it is worth bringing it into the basis ⟹ the δ term propagates to the other columns.
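A sketch (not from the slides) reproducing the objective row with numpy and showing how a perturbation δ of the cost of the basic variable x_1 reappears as its reduced cost:

```python
import numpy as np

A = np.array([[5.0, 10.0, 1.0, 0.0],    # standard form with slacks x3, x4
              [4.0,  4.0, 0.0, 1.0]])
c = np.array([6.0, 8.0, 0.0, 0.0])
basis = [1, 0]                           # optimal basis: x2, x1

y = np.linalg.solve(A[:, basis].T, c[basis])   # multipliers y^T = c_B^T A_B^{-1} = (2/5, 1)
print(c - y @ A)                               # ~[0. 0. -0.4 -1.]: the objective row above

delta = 1.0
c_pert = c.copy()
c_pert[0] += delta                       # perturb the cost of the basic variable x1
print(c_pert - y @ A)                    # ~[1. 0. -0.4 -1.]: the reduced cost of x1 is now delta;
                                         # restoring canonical form (pricing x1 out) then spreads
                                         # delta over the x3, x4 entries and the objective value
```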

10. (II) Changes in the RHS terms:
          | x_1  x_2  x_3  x_4  −z |  b
      x_3 |  5    10   1    0    0 | 60 + δ
      x_4 |  4     4   0    1    0 | 40 + ε
          |  6     8   0    0    1 |  0

          | x_1  x_2   x_3   x_4  −z |   b
      x_2 |  0    1    1/5  −1/4   0 |   2 + (1/5)δ − (1/4)ε
      x_1 |  1    0   −1/5   1/2   0 |   8 − (1/5)δ + (1/2)ε
          |  0    0   −2/5  −1     1 | −64 − (2/5)δ − ε
    (It would be more convenient to augment the second RHS, but let's take ε = 0.)
    If the first RHS becomes 60 + δ, all RHS entries of the final tableau change and we must check feasibility. Which are the multipliers for the first row? k_1 = 1/5, k_2 = −1/4, k_3 = 0.
    Row I:  1/5·(60 + δ) − 1/4·40 + 0·0 = 12 + δ/5 − 10 = 2 + δ/5
    Row II: −1/5·(60 + δ) + 1/2·40 + 0·0 = −12 − δ/5 + 20 = 8 − δ/5
    There is a risk that a RHS entry becomes negative. E.g., if δ = −20 ⟹ the tableau stays optimal but not feasible ⟹ apply the dual simplex.
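A sketch (not from the slides) of the same ranging computation with numpy: the rows of A_B^{-1} are exactly the multipliers used above, and requiring the updated basic values to stay nonnegative gives the range of δ for which the basis remains optimal.

```python
import numpy as np

A_B = np.array([[10.0, 5.0],          # basis columns (x2, x1) of the original constraints
                [ 4.0, 4.0]])
invB = np.linalg.inv(A_B)
print(invB)                           # ~[[ 0.2 -0.25] [-0.2  0.5 ]]: row 1 = (k1, k2) = (1/5, -1/4)

b = np.array([60.0, 40.0])
for delta in (5.0, -20.0):
    x_B = invB @ (b + np.array([delta, 0.0]))   # perturb only the first RHS
    print(delta, x_B)                           # (2 + delta/5, 8 - delta/5)
# delta = -20 gives x_B ~ (-2, 12): the reduced costs are untouched, so the tableau is still
# dual feasible but primal infeasible -> continue with the dual simplex.
# Requiring 2 + delta/5 >= 0 and 8 - delta/5 >= 0 gives -10 <= delta <= 40.
```

These bounds presumably correspond to the −10 and 40 marked in the figure on the next slide.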

11. Graphical representation: [figure omitted] the objective value as a function of δ, with marks at δ = −10 and δ = 40.

12. (III) Add a variable:
    max 5x_0 + 6x_1 + 8x_2
        6x_0 + 5x_1 + 10x_2 ≤ 60
        8x_0 + 4x_1 +  4x_2 ≤ 40
        x_0, x_1, x_2 ≥ 0
    Reduced cost of x_0? c_0 + Σ_i π_i a_{i0} = 1·5 + (−2/5)·6 + (−1)·8 = −27/5
    To make it worth entering the basis:
    • increase its cost, or
    • decrease the amount it uses in constraint II: 5 − (2/5)·6 − a_{20} > 0, i.e. a_{20} < 13/5.
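A sketch of the same pricing computation in numpy (not from the slides), using the dual values y = (2/5, 1) read off the final tableau:

```python
import numpy as np

y = np.array([0.4, 1.0])             # dual values from the final tableau (= -pi)
c0, a0 = 5.0, np.array([6.0, 8.0])   # cost and column of the new variable x0

print(c0 - y @ a0)                   # ~-5.4 = -27/5: not worth bringing x0 into the basis
# To make x0 attractive, either raise c0 above y @ a0 = 10.4, or shrink its
# coefficient in constraint II: with a10 fixed at 6, we need 5 - 0.4*6 - a20 > 0,
# i.e. a20 < 13/5.
```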

13. (IV) Add a constraint:
    max 6x_1 + 8x_2
        5x_1 + 10x_2 ≤ 60
        4x_1 +  4x_2 ≤ 40
        5x_1 +  6x_2 ≤ 50
        x_1, x_2 ≥ 0
    The old final tableau is no longer in canonical form once the new row (with slack x_5) is appended; eliminating x_1 and x_2 from it restores canonical form but leaves a negative RHS, so we need to iterate with the dual simplex:
          | x_1  x_2   x_3   x_4  x_5  −z |   b
      x_2 |  0    1    1/5  −1/4   0    0 |   2
      x_1 |  1    0   −1/5   1/2   0    0 |   8
      x_5 |  0    0   −1/5  −1     1    0 |  −2
          |  0    0   −2/5  −1     0    1 | −64
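A sketch (not from the slides) that derives the x_5 row above with numpy: price the new constraint out against the rows of x_1 and x_2 and observe the negative RHS that calls for the dual simplex.

```python
import numpy as np

x_star = np.array([8.0, 2.0])                    # current optimum (x1, x2)
print(50.0 - np.array([5.0, 6.0]) @ x_star)      # -2.0: (8, 2) violates 5x1 + 6x2 <= 50

# Rows of the enlarged tableau: columns x1 x2 x3 x4 x5 | b
row_x2  = np.array([0.0, 1.0,  0.2, -0.25, 0.0,  2.0])
row_x1  = np.array([1.0, 0.0, -0.2,  0.5,  0.0,  8.0])
row_new = np.array([5.0, 6.0,  0.0,  0.0,  1.0, 50.0])   # new constraint with slack x5

row_x5 = row_new - 5.0 * row_x1 - 6.0 * row_x2
print(row_x5)      # ~[ 0.  0. -0.2 -1.  1. -2.]: canonical form restored, but b < 0,
                   # so one dual simplex iteration is needed to regain feasibility
```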

14. (V) Change in a technological coefficient:
          | x_1    x_2    x_3  x_4  −z |  b
      x_3 |  5   10 + δ    1    0    0 | 60
      x_4 |  4     4       0    1    0 | 40
          |  6     8       0    0    1 |  0
    • first the effect on its own column,
    • then look at c,
    • finally look at b.
          | x_1           x_2              x_3   x_4  −z |   b
      x_2 |  0   (10 + δ)(1/5) + 4(−1/4)   1/5  −1/4   0 |   2
      x_1 |  1   (10 + δ)(−1/5) + 4(1/2)  −1/5   1/2   0 |   8
          |  0   −(2/5)δ                  −2/5  −1     1 | −64
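A sketch (not from the slides) of the column update in numpy, for a concrete δ = 2:

```python
import numpy as np

invB = np.array([[ 0.2, -0.25],      # A_B^{-1} for the basis (x2, x1)
                 [-0.2,  0.5 ]])
y = np.array([0.4, 1.0])             # dual values

delta = 2.0
a2 = np.array([10.0 + delta, 4.0])   # perturbed column of x2

print(invB @ a2)                     # ~[1.4 -0.4] = (1 + delta/5, -delta/5): updated x2 column
print(8.0 - y @ a2)                  # ~-0.8 = -(2/5)*delta: updated reduced cost of x2
# Since x2 is basic, its column is no longer a unit vector: the tableau must first be brought
# back to canonical form, and only then are feasibility (b) and optimality (c) re-checked.
```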

15. The dominant application of LP is mixed integer linear programming. In this context it is extremely important to be able to start from a model instantiated in one form and then apply a sequence of problem modifications (such as row and column additions and deletions, and variable fixings) interspersed with re-solves.

16. Outline: 1. Geometric Interpretation, 2. Sensitivity Analysis, 3. Farkas Lemma

17. Strong Duality: summary of the proof seen earlier, in matrix notation. Assuming that P and D have feasible solutions, there exists an optimal basis B and an optimal solution x* with x_B = A_B^{-1} b.
    The dual solution corresponding to B is y_B^T = c_B^T A_B^{-1}, aka the multipliers for B.
    From the simplex: c̄ = c + πA with π^T = −c_B^T A_B^{-1}, and at optimality c̄_B = 0 for the basic variables and c̄_N ≤ 0 for the non-basic variables.
    Setting y_B = −π we obtain y_B^T A ≥ c^T, and hence y_B is feasible for the dual. What is the value of this dual solution?
    y_B^T b = c_B^T A_B^{-1} b = c_B^T x_B = c^T x*
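A numeric illustration of this argument (a sketch, not from the slides), reusing the example of the sensitivity-analysis section in standard form:

```python
import numpy as np

A = np.array([[5.0, 10.0, 1.0, 0.0],
              [4.0,  4.0, 0.0, 1.0]])
b = np.array([60.0, 40.0])
c = np.array([6.0, 8.0, 0.0, 0.0])
basis = [1, 0]                                   # the optimal basis B

x = np.zeros(4)
x[basis] = np.linalg.solve(A[:, basis], b)       # x_B = A_B^{-1} b  ->  x = (8, 2, 0, 0)
y = np.linalg.solve(A[:, basis].T, c[basis])     # y_B^T = c_B^T A_B^{-1} = (2/5, 1)

print(np.all(c - y @ A <= 1e-9))                 # True: reduced costs <= 0, i.e. y_B^T A >= c^T
print(c @ x, np.isclose(c @ x, y @ b))           # 64.0 True: y_B^T b = c^T x*, so both are optimal
```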

18. We now look at the Farkas Lemma with two objectives:
    • giving another proof of strong duality
    • understanding a certificate of infeasibility

19. Farkas Lemma.
    Lemma (Farkas). Let A ∈ R^{m×n} and b ∈ R^m. Then exactly one of the following holds:
    I.  ∃ x ∈ R^n : Ax = b and x ≥ 0
    II. ∃ y ∈ R^m : y^T A ≥ 0^T and y^T b < 0
    It is easy to see that I and II cannot occur together: if both held, then
    0 ≤ (y^T A) x = y^T (Ax) = y^T b < 0,
    a contradiction (the first inequality uses y^T A ≥ 0^T and x ≥ 0).
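A tiny instance (my own, not from the slides) showing the two alternatives concretely; the vector y is the infeasibility certificate of alternative II, and geometrically it is the separating hyperplane of the next slide:

```python
import numpy as np

# The columns of A generate the cone { (x1 + x2, x2) : x >= 0 }, whose points all have a
# nonnegative second coordinate, so b = (0, -1) cannot be written as Ax with x >= 0.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
b = np.array([0.0, -1.0])

y = np.array([0.0, 1.0])   # certificate for alternative II
print(y @ A)               # [0. 1.] >= 0: all columns of A (hence the cone) lie on one side
print(y @ b)               # -1.0 < 0 : b lies strictly on the other side
```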

20. Geometric interpretation of the Farkas Lemma. Nonnegative linear combinations of the columns a_1, …, a_n of A generate a convex cone
    C = { λ_1 a_1 + … + λ_n a_n | λ_1, …, λ_n ≥ 0 },
    the convex hull of the rays p_i = { λ_i a_i | λ_i ≥ 0 }. (A polyhedral cone is a set of the form { x | Ax ≤ 0 }, the intersection of finitely many half-spaces ax ≤ 0.)
    Either the point b lies in the cone C, or there exists a hyperplane h through the point 0, h = { x ∈ R^m : y^T x = 0 } for some y ∈ R^m, such that all vectors a_1, …, a_n (and thus C) lie on one side and b lies strictly on the other side (i.e., y^T a_i ≥ 0 for all i = 1, …, n and y^T b < 0).

21. Variants of the Farkas Lemma.
    Corollary.
    (i)   Ax = b has a solution x ≥ 0  ⟺  for all y ∈ R^m with y^T A ≥ 0^T, y^T b ≥ 0
    (ii)  Ax ≤ b has a solution x ≥ 0  ⟺  for all y ≥ 0 with y^T A ≥ 0^T, y^T b ≥ 0
    (iii) Ax ≤ b has a solution x ∈ R^n  ⟺  for all y ≥ 0 with y^T A = 0^T, y^T b ≥ 0
    Proof of (i) ⟹ (ii): with Ā = [A | I_m], Ax ≤ b has a solution x ≥ 0 ⟺ Āx̄ = b has a solution x̄ ≥ 0. By (i) this holds ⟺ for all y with y^T Ā ≥ 0^T (that is, y^T A ≥ 0^T and y ≥ 0), y^T b ≥ 0. (Related to the Fourier-Motzkin elimination method.)
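A short sketch (not from the slides) of the reduction used in (i) ⟹ (ii): appending slack variables turns Ax ≤ b, x ≥ 0 into an equality system with nonnegative variables.

```python
import numpy as np

A = np.array([[ 2.0, 1.0],
              [-1.0, 2.0]])
b = np.array([14.0, 8.0])
A_bar = np.hstack([A, np.eye(2)])            # A_bar = [A | I_m]

x = np.array([4.0, 6.0])                     # a solution of Ax <= b with x >= 0
x_bar = np.concatenate([x, b - A @ x])       # append the slack values
print(np.allclose(A_bar @ x_bar, b), np.all(x_bar >= 0))   # True True
# Conversely, y^T A_bar >= 0^T says exactly that y^T A >= 0^T and y >= 0,
# which is why (ii) quantifies over y >= 0.
```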
