Mixed-integer conic optimization and MOSEK

  1. Mixed-integer conic optimization and MOSEK
     Dagstuhl seminar on MINLP, February 20th, 2018
     Sven Wiese, www.mosek.com

  2. What is MOSEK?
     • MOSEK ApS is a Danish company founded in 1997.
     • It creates software for mathematical optimization problems.
     • [Diagram: MOSEK optimization at the center, accessed through the optimizer and the Fusion APIs, covering LP, convex QP, conic QP (SOCP), conic SDP, power cones, exponential cones, and convex NLP.]

  3. What is MOSEK?
     • MOSEK ApS is a Danish company founded in 1997.
     • It creates software for mathematical optimization problems.
     • [Diagram, continued: MIP labels are added to several of the problem classes, indicating that mixed-integer variants are supported as well.]

  4. Linear optimization: a special case of conic optimization
     The classical linear optimization problem:
         minimize    cᵀx
         subject to  Ax = b,  x ≥ 0.
     Pro:
     • Structure is explicit and simple.
     • Data is simple: c, A, b.
     • Structure implies convexity, i.e., convexity is independent of the data.
     • Powerful duality theory, including the Farkas lemma.
     • Smoothness, gradients and Hessians are not an issue.
     Therefore, we have powerful algorithms and software.
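For illustration, here is a minimal sketch of how an LP of this form could be set up and solved with MOSEK's Fusion API for Python; the data c, A and b are made-up placeholder values.

```python
# Sketch: minimize c^T x subject to Ax = b, x >= 0, with illustrative data.
from mosek.fusion import Model, Domain, Expr, Matrix, ObjectiveSense

c = [1.0, 2.0, 0.0]
A = Matrix.dense([[1.0, 1.0, 1.0],
                  [0.0, 1.0, 2.0]])
b = [1.0, 0.5]

with Model("lp") as M:
    # x lives in the nonnegative orthant (the cone in this special case).
    x = M.variable("x", 3, Domain.greaterThan(0.0))
    # Linear equality constraints Ax = b.
    M.constraint(Expr.mul(A, x), Domain.equalsTo(b))
    # Linear objective c^T x.
    M.objective(ObjectiveSense.Minimize, Expr.dot(c, x))
    M.solve()
    print(x.level())
```

In the conic generalizations sketched further below, essentially the only thing that changes is the domain in which x lives.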

  5. Linear optimization: a special case of conic optimization
     Con:
     • It is linear only.

  6. The classical nonlinear optimization problem
     The classical nonlinear optimization problem:
         minimize    f(x)
         subject to  g(x) ≤ 0.
     Pro:
     • It is very general.
     Con:
     • Structure is hidden.
     • How to specify the problem at all in software?
     • How to compute gradients and Hessians if needed?
     • How to exploit structure?
     • Convexity checking: verifying convexity is NP-hard.
     • Solution: disciplined convex modeling by Grant, Boyd and Ye [1] to assure convexity.
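To make the disciplined convex modeling idea concrete, the sketch below uses CVXPY, a DCP-based modeling tool in the spirit of CVX [1]: convexity is certified by construction from composition rules, and the problem is reformulated into conic form before being passed to a solver such as MOSEK. The model and data are illustrative, and a MOSEK installation and license are assumed for the solver call.

```python
# Sketch: a disciplined-convex model. CVXPY verifies convexity from its
# composition rules and reformulates the problem into conic form for the solver.
import cvxpy as cp
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([1.0, 2.0])

x = cp.Variable(2)
objective = cp.Minimize(cp.sum_squares(A @ x - b) + cp.norm1(x))
constraints = [x >= 0, cp.sum(x) <= 1]
problem = cp.Problem(objective, constraints)
problem.solve(solver=cp.MOSEK)  # hands the conic reformulation to MOSEK
print(problem.value, x.value)
```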

  7. A fundamental question
     Is there a class of nonlinear optimization problems that preserves almost all of the good properties of the linear optimization problem?

  8. Conic optimization
     The linear cone problem:
         minimize    cᵀx
         subject to  Ax = b,
                     x ∈ K,
     with K = K_1 × K_2 × · · · × K_K a product of proper cones.
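A minimal sketch of how the cone membership x ∈ K can be expressed in MOSEK's Fusion API for Python; here K is a single three-dimensional quadratic cone, and the data are illustrative.

```python
# Sketch: minimize c^T x subject to Ax = b and x in K, where K is the
# quadratic cone { x : x_1 >= sqrt(x_2^2 + x_3^2) }. Compared to the LP,
# only the cone changes; the data c, A, b keep the same role.
from mosek.fusion import Model, Domain, Expr, Matrix, ObjectiveSense

c = [1.0, 0.0, 0.0]
A = Matrix.dense([[0.0, 1.0, 1.0]])
b = [1.0]

with Model("conic") as M:
    x = M.variable("x", 3, Domain.unbounded())
    M.constraint(x, Domain.inQCone())                 # x in K
    M.constraint(Expr.mul(A, x), Domain.equalsTo(b))  # Ax = b
    M.objective(ObjectiveSense.Minimize, Expr.dot(c, x))
    M.solve()
    print(x.level())  # approximately [0.707, 0.5, 0.5]
```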

  9. The beauty of conic optimization
     • Separation of data and structure:
       • Data: c, A and b.
       • Structure: K.
     • Structural convexity.
     • Duality (almost...).
     • No issues with smoothness and differentiability.
     Lubin et al. [2] show that all 333 convex instances in MINLPLIB2 are conic representable using only 4 types of cones.

  10. Extremely disciplined convex programming
     These 4 cones, both symmetric and non-symmetric ones, extended by another popular cone, are:
     [Diagram: LP, conic QP (SOCP), conic SDP, power cones and exponential cones around MOSEK optimization.]
     Allowing for the nonsymmetric conic formulation leads to extremely disciplined convex programming: simple, yet flexible for modeling, and with efficient numerical algorithms.
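Tying back to the title, here is a sketch of a small mixed-integer conic model in Fusion: integrality is requested through the variable's domain, and the nonlinearity enters only through an exponential-cone membership. The model and its data are invented for illustration.

```python
# Sketch: minimize z_1 + 0.8*y with (z_1, z_2, z_3) in the primal exponential
# cone (z_1 >= z_2 * exp(z_3 / z_2), z_2 > 0), z_2 = 1, z_3 + y >= 2, and
# y integer in {0, ..., 5}. The mixed-integer conic optimizer is invoked
# automatically because an integer variable is present.
from mosek.fusion import Model, Domain, Expr, ObjectiveSense

with Model("mi_expcone") as M:
    z = M.variable("z", 3, Domain.unbounded())
    M.constraint(z, Domain.inPExpCone())             # z in the exponential cone
    y = M.variable("y", 1, Domain.integral(Domain.inRange(0.0, 5.0)))
    M.constraint(z.index(1), Domain.equalsTo(1.0))   # z_2 = 1, so z_1 >= exp(z_3)
    M.constraint(Expr.add(z.index(2), y.index(0)), Domain.greaterThan(2.0))
    M.objective(ObjectiveSense.Minimize,
                Expr.add(z.index(0), Expr.mul(0.8, y.index(0))))
    M.solve()
    print(y.level(), z.level())  # for these data the optimum has y = 2
```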
