Mixed-integer conic optimization and MOSEK
Dagstuhl seminar on MINLP, February 20th, 2018
Sven Wiese, www.mosek.com
What is MOSEK?

• MOSEK ApS is a Danish company founded in 1997.
• Creates software for mathematical optimization problems.

[Diagram: MOSEK optimization capability map covering LP, QP, conic-QP (SOCP), SDP, exponential cones, power cones, and convex NLP, accessed through the optimizer and Fusion APIs]
What is MOSEK?

• MOSEK ApS is a Danish company founded in 1997.
• Creates software for mathematical optimization problems.

[Diagram: the same capability map, now extended with mixed-integer (MIP) variants of the problem classes]
Linear optimization: a special case of conic optimization

The classical linear optimization problem:

    minimize    c^T x
    subject to  Ax = b,
                x ≥ 0.

Pro:
• Structure is explicit and simple.
• Data is simple: c, A, b.
• Structure implies convexity, i.e., convexity is data independent.
• Powerful duality theory, including Farkas' lemma.
• Smoothness, gradients, and Hessians are not an issue.

Therefore, we have powerful algorithms and software.
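The duality theory mentioned above can be made concrete with a tiny example. The following sketch (illustrative toy data, not from the slides) checks primal feasibility, dual feasibility, and strong duality for a candidate primal-dual pair of a small LP in the standard form above:

```python
# Certify optimality of a primal-dual pair for a tiny LP via strong duality.
# Primal:  minimize c^T x  subject to  Ax = b, x >= 0
# Dual:    maximize b^T y  subject to  A^T y <= c
# (Toy data chosen for illustration.)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

c = [1.0, 2.0]
A = [[1.0, 1.0]]
b = [1.0]

x = [1.0, 0.0]   # candidate primal solution
y = [1.0]        # candidate dual solution

# Primal feasibility: Ax = b and x >= 0.
assert [dot(row, x) for row in A] == b and all(xi >= 0 for xi in x)

# Dual feasibility: A^T y <= c.
At_y = [dot([A[i][j] for i in range(len(A))], y) for j in range(len(c))]
assert all(At_y[j] <= c[j] + 1e-9 for j in range(len(c)))

# Strong duality: equal objective values certify optimality of both.
primal_obj = dot(c, x)
dual_obj = dot(b, y)
print(primal_obj, dual_obj)  # both 1.0
```

Matching objective values are exactly the optimality certificate that interior-point LP solvers produce in practice.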
Linear optimization: a special case of conic optimization

Con:
• It is linear only.
The classical nonlinear optimization problem

The classical nonlinear optimization problem:

    minimize    f(x)
    subject to  g(x) ≤ 0.

Pro:
• It is very general.

Con:
• Structure is hidden.
  • How to specify the problem at all in software?
  • How to compute gradients and Hessians if needed?
  • How to exploit structure?
• Convexity checking!
  • Verifying convexity is NP-hard.
  • Solution: disciplined convex modeling by Grant, Boyd and Ye [1] to assure convexity.
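Since verifying convexity exactly is NP-hard, in practice one can only disprove it cheaply. A minimal sketch (not from the slides): sample random pairs of points and test midpoint convexity, f((u+v)/2) ≤ (f(u)+f(v))/2. A single violation proves non-convexity; finding none is merely numerical evidence, which is why disciplined modeling builds convexity in by construction instead.

```python
import random

def midpoint_convexity_violated(f, dim, trials=1000, lo=-10.0, hi=10.0, tol=1e-9):
    """Return True if a random midpoint-convexity violation is found.

    One violation of f((u+v)/2) <= (f(u)+f(v))/2 proves f is not convex;
    finding none is only evidence (exact convexity checking is NP-hard).
    """
    rng = random.Random(0)  # fixed seed for reproducibility
    for _ in range(trials):
        u = [rng.uniform(lo, hi) for _ in range(dim)]
        v = [rng.uniform(lo, hi) for _ in range(dim)]
        mid = [(a + b) / 2 for a, b in zip(u, v)]
        if f(mid) > (f(u) + f(v)) / 2 + tol:
            return True
    return False

convex = lambda x: x[0] ** 2 + x[1] ** 2        # convex paraboloid
nonconvex = lambda x: x[0] ** 2 - x[1] ** 2     # saddle, not convex

print(midpoint_convexity_violated(convex, 2))     # False
print(midpoint_convexity_violated(nonconvex, 2))  # True
```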
A fundamental question

Is there a class of nonlinear optimization problems that preserves almost all of the good properties of the linear optimization problem?
Conic optimization

Linear cone problem:

    minimize    c^T x
    subject to  Ax = b,
                x ∈ K,

with K = K_1 × K_2 × · · · × K_k a product of proper cones.
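The constraint x ∈ K decomposes blockwise: x lies in the product cone iff each block lies in its own cone. A sketch of membership tests for two common proper cones (illustrative helpers, not MOSEK API code):

```python
import math

def in_nonneg_orthant(x, tol=1e-9):
    """x >= 0 componentwise: the cone underlying linear optimization."""
    return all(xi >= -tol for xi in x)

def in_quadratic_cone(x, tol=1e-9):
    """Second-order (Lorentz) cone: x_1 >= ||(x_2, ..., x_n)||_2."""
    return x[0] >= math.sqrt(sum(xi * xi for xi in x[1:])) - tol

# A point is in K = K_1 x K_2 iff each block is in its cone;
# here x = (u, v) with u in the nonnegative orthant and v in a
# three-dimensional quadratic cone (5 >= ||(3, 4)||_2 = 5).
u, v = [1.0, 0.0], [5.0, 3.0, 4.0]
print(in_nonneg_orthant(u) and in_quadratic_cone(v))  # True
```

This separation is what lets a conic solver treat the data (c, A, b) and the structure (K) independently, as the next slide emphasizes.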
The beauty of conic optimization

• Separation of data and structure:
  • Data: c, A and b.
  • Structure: K.
• Structural convexity.
• Duality (almost...).
• No issues with smoothness and differentiability.

Lubin et al. [2] show that all 333 convex instances in MINLPLIB2 are conic representable using only 4 types of cones.
Extremely disciplined convex programming

These 4 cones, including symmetric and non-symmetric ones, extended by another popular cone, are:

[Diagram: LP, conic-QP (SOCP), SDP, exponential cones, and power cones, all handled by MOSEK optimization]

Allowing for the nonsymmetric conic formulation leads to extremely disciplined convex programming: simple, yet flexible for modeling, and with efficient numerical algorithms.
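The two nonsymmetric cones in the diagram can be sketched the same way as the symmetric ones. The definitions below follow the standard forms of the exponential cone and the three-dimensional power cone (illustrative helpers, not MOSEK API code):

```python
import math

def in_exp_cone(x, tol=1e-9):
    """Exponential cone: x1 >= x2 * exp(x3 / x2) with x2 > 0,
    plus its closure ray x1 >= 0, x2 = 0, x3 <= 0."""
    x1, x2, x3 = x
    if x2 > tol:
        return x1 >= x2 * math.exp(x3 / x2) - tol
    return abs(x2) <= tol and x1 >= -tol and x3 <= tol

def in_power_cone(x, alpha, tol=1e-9):
    """Three-dimensional power cone:
    x1^alpha * x2^(1 - alpha) >= |x3|, with x1, x2 >= 0."""
    x1, x2, x3 = x
    if x1 < -tol or x2 < -tol:
        return False
    return max(x1, 0.0) ** alpha * max(x2, 0.0) ** (1 - alpha) >= abs(x3) - tol

# t >= exp(1) is (t, 1, 1) in the exponential cone, and
# sqrt(4 * 9) = 6 >= |x3| is (4, 9, 6) in the power cone with alpha = 0.5.
print(in_exp_cone([math.e, 1.0, 1.0]))      # True
print(in_power_cone([4.0, 9.0, 6.0], 0.5))  # True
```

Together with the linear, quadratic, and semidefinite cones, these five families cover all the convex constructions a disciplined model is allowed to use.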