Context
Formal proofs for Global Optimization:
- Bernstein polynomial methods [Zumkeller thesis 08]: restricted to polynomials
- Certified interval arithmetic in Coq [Melquiond 12]
- Taylor methods in HOL Light [Solovyev thesis 13]: formal verification of floating-point operations
- SMT methods [Gao et al. 12]
- Sums of squares techniques, formalized in HOL Light [Harrison 07] and Coq [Besson 07]: precise methods, but with scalability and robustness issues (numerical); restricted to polynomials
Existing Frameworks
Interval analysis: robust, but subject to the curse of dimensionality.
Existing Frameworks
Lemma 9922699028 from Flyspeck: ∀ x ∈ K, arctan(∂_4 Δ(x) / √(4 x_1 Δ(x))) + l(x) ≥ 0.
Dependency issue using interval calculus: one can bound ∂_4 Δ(x) / √(4 x_1 Δ(x)) and l(x) separately, but the resulting lower bound (−0.87) is too coarse.
One must subdivide K (into sub-boxes K_0, K_1, ..., K_4, ...) to prove the inequality ⇒ curse of dimensionality.
Existing Frameworks
Sums of squares techniques: powerful (global optimality certificates without branching), but not so robust (they handle only moderate-size problems).
Existing Frameworks
Approximation theory (Chebyshev/Taylor models): mandatory for non-polynomial problems, but hard to combine with SOS techniques (degree of the approximation).
Question
Can we develop a new approach that keeps both the robustness of interval methods and the precision of SOS?
Proving Flyspeck inequalities is challenging: they are medium-size and tight.
Answer
Certificates for lower bounds of global optimization problems, using SOS and new ingredients in global optimization:
- maxplus approximation (optimal control)
- nonlinear templates (static analysis)
Verification of these certificates inside Coq.
Implementation of all these techniques in NLCertify.
The General Framework
Given a compact set K and a transcendental function f, bound f* = inf_{x ∈ K} f(x) and prove f* ≥ 0.
1. f is underestimated by a semialgebraic function f_sa.
2. The problem f*_sa := inf_{x ∈ K} f_sa(x) is reduced to a polynomial optimization problem (POP).
3. The POP f*_pop := inf_{(x,z) ∈ K_pop} f_pop(x, z) is solved using a hierarchy of SOS relaxations.
When the relaxations are accurate enough, f* ≥ f*_sa ≥ f*_pop ≥ 0.
Outline
1. Introduction
2. SOS Certificates
3. Maxplus Approximation
4. Nonlinear Templates
5. Formal SOS
6. Conclusion
Polynomial Optimization Problems (POP)
Input data: multivariate polynomials p, g_1, ..., g_m ∈ R[x].
K := { x ∈ R^n : g_1(x) ≥ 0, ..., g_m(x) ≥ 0 } is a semialgebraic set.
How to certify a lower bound of p* := inf_{x ∈ K} p(x)?
Example with the box [4, 6.3504]^6:
g_1 := x_1 − 4, g_2 := 6.3504 − x_1, ..., g_11 := x_6 − 4, g_12 := 6.3504 − x_6,
K := { x ∈ R^6 : g_1(x) ≥ 0, ..., g_12(x) ≥ 0 },
Δ(x) := x_1 x_4 (−x_1 + x_2 + x_3 − x_4 + x_5 + x_6) + x_2 x_5 (x_1 − x_2 + x_3 + x_4 − x_5 + x_6) + x_3 x_6 (x_1 + x_2 − x_3 + x_4 + x_5 − x_6) − x_2 (x_3 x_4 + x_1 x_6) − x_5 (x_1 x_3 + x_4 x_6).
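Not part of the certification pipeline, but a quick sanity check that can be reproduced directly: sampling Δ on the box yields upper bounds on p* = inf_{x ∈ K} Δ(x), and any valid SOS lower bound μ_k must stay below them. A minimal sketch in plain NumPy:

    import numpy as np

    def delta(x):
        # the polynomial Δ(x) of the Flyspeck example above
        x1, x2, x3, x4, x5, x6 = x
        return (x1*x4*(-x1 + x2 + x3 - x4 + x5 + x6)
                + x2*x5*(x1 - x2 + x3 + x4 - x5 + x6)
                + x3*x6*(x1 + x2 - x3 + x4 + x5 - x6)
                - x2*(x3*x4 + x1*x6) - x5*(x1*x3 + x4*x6))

    rng = np.random.default_rng(0)
    samples = rng.uniform(4.0, 6.3504, size=(100000, 6))
    upper = min(delta(x) for x in samples)   # empirical upper bound on inf Δ
    print(upper)                             # any certified μ_k must satisfy μ_k <= upper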
The Cone of Sums of Squares
Let Σ[x] be the cone of sums of squares (SOS).
Let g_0 := 1 and let M(g) be the quadratic module generated by g_0, ..., g_m:
M(g) = { Σ_{j=0}^{m} σ_j(x) g_j(x) : σ_j ∈ Σ[x] }.
When q ∈ M(g), the multipliers σ_0, ..., σ_m form a positivity certificate for q.
Polynomial equalities q = q′ can be checked in Coq: it is much simpler to verify certificates using the sceptical approach.
The Lasserre Hierarchy of SOS Relaxations
K := { x ∈ R^n : g_1(x) ≥ 0, ..., g_m(x) ≥ 0 }, p* := inf_{x ∈ K} p(x)?
Definition: M(g) is Archimedean if there exists a positive constant ρ such that the polynomial x ↦ ρ − ‖x‖²_2 belongs to M(g).
Proposition [Putinar 93]: suppose that M(g) is Archimedean. Then every polynomial strictly positive on K belongs to M(g).
The Lasserre Hierarchy of SOS Relaxations
The search space for σ_0, ..., σ_m ∈ Σ[x] is infinite. Consider the truncated quadratic module
M_k(g) := { Σ_{j=0}^{m} σ_j(x) g_j(x) : σ_j ∈ Σ[x], σ_j g_j ∈ R_{2k}[x] },
so that M_0(g) ⊂ M_1(g) ⊂ M_2(g) ⊂ ... ⊂ M(g).
Hierarchy of SOS programs: μ_k := sup_{μ, σ_0, ..., σ_m} { μ : p(x) − μ ∈ M_k(g) }.
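Concretely, each multiplier σ_j can be parametrized by a positive semidefinite Gram matrix; this is the standard reformulation that makes μ_k computable by semidefinite programming, recalled here for completeness (v_d(x) denotes the column vector of all monomials of degree at most d):

    \mu_k = \sup_{\mu,\, Q_0, \ldots, Q_m} \Big\{ \mu \;:\;
        p(x) - \mu = \sum_{j=0}^{m} v_{d_j}(x)^{\top} Q_j\, v_{d_j}(x)\, g_j(x),
        \quad Q_j \succeq 0 \Big\},
    \qquad d_j := k - \lceil \deg g_j / 2 \rceil .

Identifying the coefficients of both sides yields linear equality constraints on the entries of the matrices Q_j, i.e. a semidefinite program.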
Convergence of the Lasserre Hierarchy
Proposition [Lasserre 01]: let k ≥ k_0 := max{ ⌈deg p / 2⌉, ⌈deg g_1 / 2⌉, ..., ⌈deg g_m / 2⌉ }. The sequence (μ_k)_{k ≥ k_0} is non-decreasing. When M(g) is Archimedean, it converges to p*.
Compute μ_k by solving a semidefinite program (SDP).
External tools: SDP solvers freely available (SDPA, CSDP, ...).
How to Deal with Semialgebraic Expressions?
Let A be the algebra of semialgebraic functions obtained by composition of polynomials with |·|, (·)^{1/p} (p ∈ N_0), +, −, ×, /, sup, inf.
Example: f_sa(x) := ∂_4 Δ(x) / √(4 x_1 Δ(x)).
K := { x ∈ R^n : g_1(x) ≥ 0, ..., g_m(x) ≥ 0 } is a semialgebraic set.
f*_sa := inf_{x ∈ K} f_sa(x)?
How to Deal with Semialgebraic Expressions?
Definition (basic semialgebraic lifting, b.s.a.l.): a semialgebraic function f_sa is said to have a b.s.a.l. if there exist p, s ∈ N, polynomials h_1, ..., h_s ∈ R[x, z_1, ..., z_p] and a basic semialgebraic set
K_pop := { (x, z_1, ..., z_p) ∈ R^{n+p} : x ∈ K, h_1(x, z) ≥ 0, ..., h_s(x, z) ≥ 0 }
such that { (x, f_sa(x)) : x ∈ K } = { (x, z_p) : (x, z) ∈ K_pop }.
b.s.a.l. lemma [Lasserre-Putinar 10]: every well-defined f_sa ∈ A has a basic semialgebraic lifting.
The “No Free Lunch” Rule
The cost depends on the relaxation order k (SOS degree) and on the number of variables n: computing μ_k leads to an SOS program with C(n + 2k, n) variables, i.e. O(n^{2k}) variables at fixed k.
Examples
Previous example: g_1 := x_1 − 4, g_2 := 6.3504 − x_1, ..., g_11 := x_6 − 4, g_12 := 6.3504 − x_6,
K := { x ∈ R^6 : g_1(x) ≥ 0, ..., g_12(x) ≥ 0 },
Δ(x) := x_1 x_4 (−x_1 + x_2 + x_3 − x_4 + x_5 + x_6) + x_2 x_5 (x_1 − x_2 + x_3 + x_4 − x_5 + x_6) + x_3 x_6 (x_1 + x_2 − x_3 + x_4 + x_5 − x_6) − x_2 (x_3 x_4 + x_1 x_6) − x_5 (x_1 x_3 + x_4 x_6).
With SOS of degree at most 4: μ_2 = 128.
Lemma from Flyspeck (inequality ID 4717061266): ∀ x ∈ [4, 6.3504]^6, Δ(x) ≥ 0.
Sparse Variant of SOS Relaxations
Partial remedy: the sparse variant of the SOS relaxations [Waki et al. 04], based on the correlative sparsity pattern (csp) graph of the POP variables.
∂_4 Δ(x) := x_1 (−x_1 + x_2 + x_3 − x_4 + x_5 + x_6) + x_2 x_5 + x_3 x_6 − x_2 x_3 − x_5 x_6.
[Figure: csp graph of ∂_4 Δ on the vertices 1, ..., 6.]
Sparse Variant of SOS Relaxations
From the csp graph G of the POP variables, compute the maximal cliques C_1, ..., C_l of G; let κ be the average size of the cliques. The hierarchy of sparse SOS relaxations then involves C(κ + 2k, κ) variables.
For ∂_4 Δ: C_1 := {1, 4}, C_2 := {1, 2, 3, 5}, C_3 := {1, 3, 5, 6}.
Dense SOS: 210 variables. Sparse SOS: 115 variables.
But this is only a partial remedy!
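The two variable counts on this slide can be reproduced by counting monomials: the dense relaxation uses every monomial of degree at most 2k in the n variables, while the sparse one uses only those supported on some clique, each counted once. A small sketch:

    from itertools import combinations_with_replacement

    def monomials(variables, max_deg):
        # monomials of total degree <= max_deg, encoded as sorted tuples of variable indices
        monos = set()
        for d in range(max_deg + 1):
            monos.update(combinations_with_replacement(sorted(variables), d))
        return monos

    n, two_k = 6, 4
    dense = monomials(range(1, n + 1), two_k)
    cliques = [{1, 4}, {1, 2, 3, 5}, {1, 3, 5, 6}]
    sparse = set().union(*(monomials(c, two_k) for c in cliques))
    print(len(dense), len(sparse))   # 210 115, as on the slide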
Examples
Example from Flyspeck: K := [4, 6.3504]^3 × [6.3504, 8] × [4, 6.3504]^2,
f_sa(x) := ∂_4 Δ(x) / √(4 x_1 Δ(x)).
Two lifting variables z_1, z_2 represent the square root and the division.
Examples
Example from Flyspeck: let z_1 := √(4 x_1 Δ(x)), m_1 := inf_{x ∈ K} z_1(x), M_1 := sup_{x ∈ K} z_1(x), and
K_pop := { (x, z) ∈ R^8 : x ∈ K, h_1(x, z) ≥ 0, ..., h_6(x, z) ≥ 0 }, with
h_1(x, z) := z_1 − m_1,              h_4(x, z) := −z_1² + 4 x_1 Δ(x),
h_2(x, z) := M_1 − z_1,              h_5(x, z) := z_2 z_1 − ∂_4 Δ(x),
h_3(x, z) := z_1² − 4 x_1 Δ(x),      h_6(x, z) := −z_2 z_1 + ∂_4 Δ(x).
Then p* := inf_{(x,z) ∈ K_pop} z_2 = f*_sa. We obtain μ_2 = −0.618 and μ_3 = −0.445.
More complex certificates.
High-degree Polynomial Approximation + SOS
SWF: min_{x ∈ [1, 500]^n} f(x) = − Σ_{i=1}^{n} x_i sin(√x_i).
Classical idea: replace sin(√·) by a degree-d Chebyshev polynomial. This is hard to combine with SOS, indeed:
- small d: lack of accuracy ⇒ expensive branch and bound;
- large d: “no free lunch” rule, with C(n + d, n) SOS variables.
SWF with n = 10, d = 4: 38 min to compute a lower bound of −430 n.
High-degree Polynomial Approximation + SOS
Minimax approximations + sparse SOS are not enough to check the hardest inequalities (multiple variables, multiple semialgebraic liftings).
Maxplus Approximation
Initially introduced to solve optimal control problems [Fleming-McEneaney 00]; further work by [McEneaney 07, Akian-Gaubert-Lakhoua 08, Dower].
The value function is approximated by a “maxplus linear combination” of simple (e.g. quadratic) functions.
Curse of dimensionality reduction [McEneaney-Kluberg, Gaubert-McEneaney-Qu 11, Qu 13]: allowed to solve instances of dimension up to 15 (inaccessible by grid methods).
In our context: approximate transcendental functions.
Maxplus Approximation for Semiconvex Functions
Definition (semiconvex function): let γ ≥ 0. A function φ : R^n → R is said to be γ-semiconvex if the function x ↦ φ(x) + (γ/2)‖x‖²_2 is convex.
Proposition: the set of functions f which can be written as a maxplus linear combination of the basis functions, f = sup_{w ∈ B} (a(w) + w), for some function a : B → R ∪ {−∞}, is precisely the set of lower semicontinuous γ-semiconvex functions.
Maxplus Approximation for Semiconvex Functions
[Figure: arctan on [m, M], bracketed between quadratic estimators: underestimators par⁻_{a_1}, par⁻_{a_2} and overestimators par⁺_{a_1}, par⁺_{a_2} attached to the control points a_1, a_2.]
Maxplus Approximation Error
Theorem [Akian-Gaubert-Lakhoua 08]: let γ ∈ R, η > 0, and let φ be (γ − η)-semiconvex and Lipschitz-continuous on a full-dimensional compact convex subset K ⊂ R^n. Let φ_N denote the best maxplus approximation of φ by N quadratic forms of Hessian −γ I. Then ‖φ − φ_N‖_∞ = O(1/N^{2/n}).
Differentiability is not mandatory, by contrast with Taylor models.
When in addition φ is of class C², the upper bound is tight [Gaubert-McEneaney-Qu 11]:
‖φ − φ_N‖_∞ ∼ (α / N^{2/n}) ( ∫_K [ det(D²(φ)(x) + γ I_n) ]^{1/2} dx )^{2/n}  as N → ∞.
In our case n = 1, so one needs O(1/√ε) basis functions.
Maxplus Approximation Error
Exact parsimonious maxplus representations.
[Figure: a function admitting an exact representation by finitely many quadratic branches, touching it at the control point a.]
Nonlinear Function Representation
Abstract syntax tree representation of multivariate transcendental functions: the leaves are semialgebraic functions of A, the nodes are univariate functions of D or binary operations.
For the “simple” example from Flyspeck, the tree of arctan(∂_4 Δ(x) / √(4 x_1 Δ(x))) + l(x) has + at the root, with children arctan(∂_4 Δ(x) / √(4 x_1 Δ(x))) and l(x).
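One possible encoding of such trees (an illustrative sketch; the class names and the placeholder leaves are not NLCertify's actual datatypes):

    from dataclasses import dataclass
    from typing import Callable, Union
    import math

    @dataclass
    class Leaf:              # semialgebraic function of A, here just a callable
        f: Callable

    @dataclass
    class Unary:             # univariate transcendental node of D (arctan, sin, ...)
        u: Callable
        child: "Node"

    @dataclass
    class Binop:             # binary operation node (+, -, *, /)
        op: Callable
        left: "Node"
        right: "Node"

    Node = Union[Leaf, Unary, Binop]

    # the "simple" Flyspeck example l(x) + arctan(f_sa(x)), with placeholder leaves
    f_sa = lambda x: 0.0     # placeholder for ∂_4 Δ(x) / √(4 x_1 Δ(x))
    l = lambda x: 0.0        # placeholder for the affine part l(x)
    simple = Binop(lambda a, b: a + b, Leaf(l), Unary(math.atan, Leaf(f_sa)))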
Contents of Part 3 (Maxplus Approximation): Maxplus Approximation for Semiconvex Functions; Maxplus Approximation Error; Nonlinear Function Representation; Nonlinear Maxplus Approximation Algorithm; Maxplus Approximation Example; Minimax Approximation / For Comparison; Nonlinear Maxplus Optimization Algorithm; Numerical Results for Flyspeck.
Nonlinear Maxplus Approximation Algorithm (samp_approx)
Input: tree t, box K, SOS relaxation order k, precision p
Output: lower bound m, upper bound M, lower semialgebraic estimator t⁻, upper semialgebraic estimator t⁺
1: if t ∈ A then t⁻ := t, t⁺ := t
2: else if u := root(t) ∈ D with child c then
3:   m_c, M_c, c⁻, c⁺ := samp_approx(c, K, k, p)
4:   I := [m_c, M_c]
5:   u⁻, u⁺ := unary_approx(u, I, c, p)
6:   t⁻, t⁺ := compose_approx(u, u⁻, u⁺, I, c⁻, c⁺)
7: else if bop := root(t) is a binary operation with children c_1 and c_2 then
8:   m_i, M_i, c_i⁻, c_i⁺ := samp_approx(c_i, K, k, p) for i ∈ {1, 2}
9:   t⁻, t⁺ := compose_bop(c_1⁻, c_1⁺, c_2⁻, c_2⁺, bop, [m_2, M_2])
10: end
11: return min_sa(t⁻, K, k), max_sa(t⁺, K, k), t⁻, t⁺
Maxplus Approximation Example
Consider the function arctan on I := [m, M]:
arctan(x) ≥ par⁻_a(x) := −(γ/2)(x − a)² + f′(a)(x − a) + f(a).
Choosing γ = sup_{x ∈ I} −f″(x) always works.
The precision p is the number of control points.
[Figure: arctan on [m, M] with the parabolas par⁻_{a_i} and par⁺_{a_i} at the control points a_1, a_2.]
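A numerical illustration of this construction; the interval, the control points and the grid check are illustrative (so the verification is empirical, not certified):

    import numpy as np

    m, M = 0.3, 1.0                            # illustrative interval I = [m, M]
    xs = np.linspace(m, M, 2001)
    f, df = np.arctan, lambda x: 1.0 / (1.0 + x**2)
    d2f = lambda x: -2.0 * x / (1.0 + x**2) ** 2
    gamma = 1.01 * np.max(-d2f(xs))            # gamma >= sup_I (-f''), small safety factor

    def par_minus(a):
        # quadratic underestimator par^-_a, touching arctan at the control point a
        return lambda x: -0.5 * gamma * (x - a) ** 2 + df(a) * (x - a) + f(a)

    points = [0.4, 0.6, 0.9]                   # p = 3 control points
    under = np.max([par_minus(a)(xs) for a in points], axis=0)   # maxplus combination
    print(np.all(under <= f(xs)), np.max(f(xs) - under))         # True, approximation error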
Minimax Approximation / For Comparison
A more classical approximation method; here the precision is an integer d.
The best uniform degree-d polynomial approximation of u on I is the solution of
min_{h ∈ R_d[x]} ‖u − h‖_∞ = min_{h ∈ R_d[x]} sup_{x ∈ I} |u(x) − h(x)|.
Implementation in Sollya [Chevillard-Joldes-Lauter 10].
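A rough stand-in that avoids Sollya: interpolation at Chebyshev nodes is not the true minimax (Remez) polynomial, but its uniform error is of the same order and easy to measure. A sketch for sin(√·) on [1, 500] (the function of the SWF example):

    import numpy as np

    u = lambda x: np.sin(np.sqrt(x))
    a, b, d = 1.0, 500.0, 4                               # degree-4 approximation on [1, 500]

    tnodes = np.cos((2 * np.arange(d + 1) + 1) * np.pi / (2 * (d + 1)))  # Chebyshev nodes in [-1, 1]
    xnodes = 0.5 * (a + b) + 0.5 * (b - a) * tnodes
    coeffs = np.polyfit(tnodes, u(xnodes), d)             # interpolant in the rescaled variable

    xs = np.linspace(a, b, 20001)
    ts = (2 * xs - (a + b)) / (b - a)
    print(np.max(np.abs(u(xs) - np.polyval(coeffs, ts)))) # approximate sup-norm error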
Nonlinear Maxplus Optimization Algorithm
First iteration:
1. Evaluate t with randeval and obtain a minimizer guess x¹_opt.
2. Compute a_1 := (∂_4 Δ / √(4 x_1 Δ))(x¹_opt) = f_sa(x¹_opt) = 0.84460.
3. Compute m_1 ≤ min_{x ∈ K} ( l(x) + par⁻_{a_1}(f_sa(x)) ).
[Figure: the tree of arctan(f_sa(x)) + l(x), and arctan bracketed from below by par⁻_{a_1} on [m, M].]
Nonlinear Maxplus Optimization Algorithm
Second iteration:
1. For k = 2, m_1 = −0.746 < 0: obtain a new minimizer x²_opt.
2. Compute a_2 := f_sa(x²_opt) = −0.374 and par⁻_{a_2}.
3. Compute m_2 ≤ min_{x ∈ K} ( l(x) + max_{i ∈ {1,2}} { par⁻_{a_i}(f_sa(x)) } ).
[Figure: arctan now bracketed from below by the two parabolas par⁻_{a_1}, par⁻_{a_2}.]
Nonlinear Maxplus Optimization Algorithm
Third iteration:
1. For k = 2, m_2 = −0.112 < 0: obtain a new minimizer x³_opt.
2. Compute a_3 := f_sa(x³_opt) = 0.357 and par⁻_{a_3}.
3. Compute m_3 ≤ min_{x ∈ K} ( l(x) + max_{i ∈ {1,2,3}} { par⁻_{a_i}(f_sa(x)) } ).
[Figure: the three parabolas par⁻_{a_1}, par⁻_{a_2}, par⁻_{a_3} under arctan.]
Nonlinear Maxplus Optimization Algorithm
m_3 = −0.0333 < 0: obtain a new minimizer x⁴_opt and iterate again...
Nonlinear Maxplus Optimization Algorithm
Input: abstract syntax tree t, semialgebraic set K, iter_max (optional argument), precision p
Output: lower bound m
1: s := [argmin(randeval(t))]        ⊲ s ∈ K
2: m := −∞, iter := 0
3: while iter ≤ iter_max do
4:   choose an SOS relaxation order k ≥ k_0
5:   m, M, t⁻, t⁺ := samp_approx(t, K, k, p)
6:   x_opt := guess_argmin(t⁻)       ⊲ t⁻(x_opt) ≃ m
7:   s := s ∪ { x_opt }
8:   p := update_precision(p), iter := iter + 1
9: done
10: return m, x_opt
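A toy univariate instance of this loop, with the SOS steps min_sa / guess_argmin replaced by a grid search (so the sketch only mimics the control flow, not the certification; the inner function s and the constants are illustrative):

    import numpy as np

    # toy problem on K = [0, 3]:  f(x) = l(x) + arctan(s(x)), with s semialgebraic
    l = lambda x: 0.1 * x
    s = lambda x: x**2 - 2.0 * x
    f = lambda x: l(x) + np.arctan(s(x))

    K = np.linspace(0.0, 3.0, 4001)
    ys = np.linspace(s(K).min(), s(K).max(), 4001)        # range I of the inner function
    d2_arctan = lambda y: -2.0 * y / (1.0 + y**2) ** 2
    gamma = 1.01 * np.max(-d2_arctan(ys))                 # gamma >= sup_I (-arctan'')

    def par_minus(a):                                     # maxplus quadratic underestimator
        return lambda y: (-0.5 * gamma * (y - a) ** 2
                          + (y - a) / (1.0 + a**2) + np.arctan(a))

    control = [s(K[np.argmin(f(K))])]                     # initial control point (randeval stand-in)
    for it in range(5):
        t_minus = l(K) + np.max([par_minus(a)(s(K)) for a in control], axis=0)
        m, x_opt = t_minus.min(), K[np.argmin(t_minus)]
        print(it, m)                                      # non-decreasing lower bounds
        control.append(s(x_opt))                          # refine the approximation at x_opt
    print("grid minimum of f:", f(K).min())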
Convergence of the Optimization Algorithm
Let f be a multivariate transcendental function, let t⁻_p be the underestimator of f obtained at precision p, and let x^p_opt be a minimizer of t⁻_p over K.
Theorem: every accumulation point of the sequence (x^p_opt)_p is a global minimizer of f on K.
Ingredients of the proof: convergence of the Lasserre SOS hierarchy; uniform approximation schemes (maxplus/minimax).
Numerical Results for Flyspeck
Branch and bound subdivisions reduce the relaxation gap (#boxes counts the sub-problems); n = 6 variables, SOS of degree 2k = 4; n_D univariate transcendental functions (maxplus for arctan + lifting for √x_i).

Inequality id            | n_D | n_lifting | #boxes | time
9922699028               |  1  |     9     |   47   | 241 s
3318775219               |  1  |     9     |  338   | 26 min
7726998381               |  3  |    15     |   70   | 43 min
7394240696               |  3  |    15     |  351   | 1.8 h
4652969746_1             |  6  |    15     |   81   | 1.3 h
OXLZLEZ 6346351218_2_0   |  6  |    24     |  200   | 5.7 h
Reducing the Number of Lifting Variables
Lifting strategy: n_lifting increases with the number of control points and with the number of components of the semialgebraic functions.
At fixed relaxation order k, the number of SOS variables is in O((n + n_lifting)^{2k}).
Improvements for more scalability:
1. limit the blow-up, at the price of coarsening the semialgebraic estimators;
2. still produce certificates.
Nonlinear Template Abstraction
Linear templates in static analysis [Sankaranarayanan-Sipma-Manna 05]; nonlinear extension [Adje-Gaubert-Goubault 12].
Nonlinear Template Approximation
Invariants of programs are described by parametric families of subsets of R^n of the form S(α) = { x : w_i(x) ≤ α_i, 1 ≤ i ≤ p }, where α ∈ R^p is the parameter and w_1, ..., w_p is the template.
Level sets of the maxplus approximation ⇔ template description.
Special cases of templates (w_i): bound constraints (±x_i) give interval calculus; degree-d minimax polynomials give Chebyshev approximation.
Nonlinear Template Approximation (template_approx)
Input: tree t, box K, SOS relaxation order k, precision p
Output: lower bound m, upper bound M, lower semialgebraic estimator t⁻_2, upper semialgebraic estimator t⁺_2
1: if t ∈ A then t⁻ := t, t⁺ := t
2: else if u := root(t) ∈ D with child c then
3:   m_c, M_c, c⁻, c⁺ := template_approx(c, K, k, p)
4:   I := [m_c, M_c]
5:   u⁻, u⁺ := unary_approx(u, I, c, p)
6:   t⁻, t⁺ := compose_approx(u, u⁻, u⁺, I, c⁻, c⁺)
7: else if bop := root(t) is a binary operation with children c_1 and c_2 then
8:   m_i, M_i, c_i⁻, c_i⁺ := template_approx(c_i, K, k, p) for i ∈ {1, 2}
9:   t⁻, t⁺ := compose_bop(c_1⁻, c_1⁺, c_2⁻, c_2⁺, bop, [m_2, M_2])
10: end
11: t⁻_2 := reduce_lift(t, K, k, p, t⁻), t⁺_2 := −reduce_lift(t, K, k, p, −t⁺)
12: return min_sa(t⁻_2, K, k), max_sa(t⁺_2, K, k), t⁻_2, t⁺_2
How to Construct Templates?
Contents of Part 4 (Nonlinear Templates): Nonlinear Template Abstraction; Nonlinear Template Approximation; Nonlinear Quadratic Templates; Polynomial Estimators for Semialgebraic Functions; Comparison Results on Global Optimization Problems.
Nonlinear Quadratic Templates
Let x_1, ..., x_p ∈ K. Quadratic underestimators of f over K:
f_{x_c, λ′} : K → R,
x ↦ f(x_c) + D(f)(x_c)(x − x_c) + (1/2)(x − x_c)ᵀ D²(f)(x_c)(x − x_c) + (1/2) λ′ ‖x − x_c‖²_2,
with λ′ ≤ λ := min_{x ∈ K} { λ_min( D²(f)(x) − D²(f)(x_c) ) }.
The computation of λ′ can be certified (robust SDP).
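A sketch of this construction for a concrete 2-D function; the derivatives are taken by finite differences and λ is estimated on a grid, which only illustrates the formula and is not the certified robust-SDP computation mentioned above:

    import numpy as np

    f = lambda x: np.sin(x[0]) * np.cos(x[1])       # illustrative transcendental f
    lo, hi = -1.0, 1.0                              # box K = [-1, 1]^2

    def grad(x, h=1e-5):
        e = np.eye(2)
        return np.array([(f(x + h*e[i]) - f(x - h*e[i])) / (2*h) for i in range(2)])

    def hess(x, h=1e-4):
        e, H = np.eye(2), np.zeros((2, 2))
        for i in range(2):
            for j in range(2):
                H[i, j] = (f(x + h*e[i] + h*e[j]) - f(x + h*e[i] - h*e[j])
                           - f(x - h*e[i] + h*e[j]) + f(x - h*e[i] - h*e[j])) / (4*h*h)
        return H

    xc = np.array([0.2, -0.3])                      # template center x_c
    grid = [np.array([a, b]) for a in np.linspace(lo, hi, 21)
                             for b in np.linspace(lo, hi, 21)]
    lam = min(np.linalg.eigvalsh(hess(x) - hess(xc)).min() for x in grid)   # estimate of λ

    def template(x, lam_p=lam):                     # f_{x_c, λ'} with λ' <= λ
        d = x - xc
        return f(xc) + grad(xc) @ d + 0.5 * d @ hess(xc) @ d + 0.5 * lam_p * d @ d

    print(min(f(x) - template(x) for x in grid))    # >= 0 up to grid / finite-difference error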
Polynomial Estimators for Semialgebraic Functions
Inspired from [Lasserre-Thanh 13]. Let f_sa ∈ A be defined on a box K ⊂ R^n and let λ_n be the standard (normalized) Lebesgue measure on R^n.
Best polynomial underestimator h ∈ R_d[x] of f_sa for the L1 norm:
(P_sa)  min_{h ∈ R_d[x]} ∫_K (f_sa − h) dλ_n   s.t.  f_sa − h ≥ 0 on K.
Lemma: Problem (P_sa) has a degree-d polynomial minimizer h_d.
Polynomial Estimators for Semialgebraic Functions
b.s.a.l.: K_pop := { (x, z) ∈ R^{n+p} : g_1(x, z) ≥ 0, ..., g_m(x, z) ≥ 0 }; the quadratic module M(g) is Archimedean.
The optimal solution h_d of (P_sa) is a maximizer of
(P_d)  max_{h ∈ R_d[x]} ∫_{[0,1]^n} h dλ_n   s.t.  (z_p − h) ∈ M(g).
Polynomial Estimators for Semialgebraic Functions
Let m_d be the optimal value of Problem (P_sa) and let h_{d,k} be a maximizer of the SOS relaxation of (P_d) at order k.
Convergence of the SOS hierarchy: the sequence (‖f_sa − h_{d,k}‖_1)_{k ≥ k_0} is non-increasing and converges to m_d; each accumulation point of the sequence (h_{d,k})_{k ≥ k_0} is an optimal solution of Problem (P_sa).
For f_sa(x) := ∂_4 Δ(x) / √(4 x_1 Δ(x)):

d | k | upper bound of ‖f_sa − h_{d,k}‖_1 | bound
2 | 2 | 0.8024 | −1.171
2 | 3 | 0.3709 | −0.4479
4 | 2 | 1.617  | −1.056
4 | 3 | 0.1766 | −0.4493
Polynomial Estimators for Semialgebraic Functions
rad_2 : (x_1, x_2) ↦ (−64 x_1² + 128 x_1 x_2 + 1024 x_1 − 64 x_2² + 1024 x_2 − 4096) / (−8 x_1² + 8 x_1 x_2 + 128 x_1 − 8 x_2² + 128 x_2 − 512).
Linear and quadratic underestimators for rad_2 (k = 3):
[Figure: the degree-1 (d = 1) and degree-2 (d = 2) underestimators of rad_2 over the unit box.]
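The exact underestimators above are obtained by solving (P_d) with SOS; as an uncertified point of comparison, a degree-2 polynomial can also be fitted by least squares on a grid and shifted down until it underestimates rad_2 on that grid. This is only a heuristic stand-in for h_{d,k} (grid-based, no positivity certificate):

    import numpy as np

    def rad2(x1, x2):
        num = -64*x1**2 + 128*x1*x2 + 1024*x1 - 64*x2**2 + 1024*x2 - 4096
        den = -8*x1**2 + 8*x1*x2 + 128*x1 - 8*x2**2 + 128*x2 - 512
        return num / den

    g = np.linspace(0.0, 1.0, 51)
    X1, X2 = np.meshgrid(g, g)
    x1, x2, y = X1.ravel(), X2.ravel(), rad2(X1, X2).ravel()

    # degree-2 monomial basis: 1, x1, x2, x1^2, x1*x2, x2^2
    B = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x1*x2, x2**2])
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    coef[0] -= np.max(B @ coef - y)        # shift down: now B @ coef <= y on the grid
    print("L1-type error on the grid:", np.mean(y - B @ coef))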
Comparison Results on Global Optimization Problems
min_{x ∈ [1, 500]^n} f(x) = − Σ_{i=1}^{n} x_i sin(√x_i),   f* ≈ −418.9 n.
Minimax approximation + SOS, with d = 4 and n = 10: 38 min to certify a lower bound of −430 n.
Poor accuracy of the minimax estimators.
Comparison Results on Global Optimization Problems
min_{x ∈ [1, 500]^n} f(x) = − Σ_{i=1}^{n} x_i sin(√x_i),   f* ≈ −418.9 n.
Interval arithmetic for sin + SOS (n_lifting = 0) versus maxplus approximation of sin(√·) with lifting (n_lifting = 2n):

n  | lower bound | n_lifting | #boxes | time
10 | −430 n      | 0         | 3830   | 129 s
10 | −430 n      | 2n        | 16     | 40 s
Comparison Results on Global Optimization Problems
Same comparison for n = 100 (interval arithmetic for sin + SOS versus maxplus approximation with lifting):

n   | lower bound | n_lifting | #boxes  | time
100 | −440 n      | 0         | > 10000 | > 10 h
100 | −440 n      | 2n        | 274     | 1.9 h
Comparison Results on Global Optimization Problems
min_{x ∈ [1, 500]^n} f(x) = − Σ_{i=1}^{n−1} (x_i + x_{i+1}) sin(√x_i), with maxplus approximation of sin(√·):

n    | lower bound | n_lifting | #boxes | time
1000 | −967 n      | 2n        | 1      | 543 s
1000 | −968 n      | n         | 1      | 272 s
Hybrid Symbolic-Numeric Certification
Certified lower bound of inf_{x ∈ K} p(x)?
At relaxation order k, SOS solvers output a floating-point lower bound μ_k and floating-point SOS σ_0, ..., σ_m.
Projection and rounding by [Parrilo-Peyrl 08]: seek rational SOS σ′_0, ..., σ′_m so that p − μ_k = Σ_{j=0}^{m} σ′_j(x) g_j(x); try again with a lower bound μ′_k ≤ μ_k when it fails.
Hybrid Symbolic-Numeric Certification
Alternative to the projection and rounding of [Parrilo-Peyrl 08]:
- normalized POP (x ∈ [0, 1]^n);
- conversion into rationals: SOS σ̃_0, ..., σ̃_m, lower bound μ̃_k;
- remainder ε_pop(x) := p(x) − μ̃_k − Σ_{j=0}^{m} σ̃_j(x) g_j(x);
- bounding: ∀ x ∈ [0, 1]^n, ε_pop(x) ≥ ε*_pop := Σ_{ε_α ≤ 0} ε_α (the sum of the negative coefficients of ε_pop).
More concise SOS certificates / simpler rounding.
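The bounding step can be reproduced outside Coq on a toy remainder; a sketch using exact rationals (the coefficients below are illustrative). On [0, 1]^n every monomial lies in [0, 1], so the sum of the negative coefficients is a valid lower bound of ε_pop:

    from fractions import Fraction

    # toy remainder eps_pop, given by floating-point coefficients indexed by exponent
    eps_float = {(0,): 3.1e-9, (1,): -2.2e-9, (2,): 1.0e-10, (3,): -4.0e-10}

    # exact conversion into rationals (mirrors the bigQ coefficients used in Coq)
    eps_rat = {alpha: Fraction(c) for alpha, c in eps_float.items()}

    # eps*_pop := sum of the coefficients eps_alpha <= 0, a lower bound of eps_pop on [0,1]^n
    eps_star = sum(c for c in eps_rat.values() if c < 0)
    print(eps_star, float(eps_star))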
Customized Polynomial Ring
Check symbolic polynomial equalities q = q′.
Existing tactic: ring [Grégoire-Mahboubi 05].
Polynomial coefficients: arbitrary-size rationals bigQ [Grégoire-Théry 06].
Checking Polynomial Equalities
Sparse Horner normal form:

    Inductive PolC : Type :=
      | Pc   : bigQ → PolC
      | Pinj : positive → PolC → PolC
      | PX   : PolC → positive → PolC → PolC.

(Pc c) is a constant polynomial; (Pinj i p) shifts the variable indices of p by i; (PX p j q) evaluates to p · x_1^j + q(x_2, ..., x_n). For instance, PX (Pc 3) 2 (PX (Pc 1) 1 (Pc 0)) encodes 3 x_1² + x_2.
SOS certificates are encoded with sparse Horner polynomials.
Bounding the Polynomial Remainder
Normalized POP (x ∈ [0, 1]^n); remainder ε_pop(x) := p(x) − μ̃_k − Σ_{j=0}^{m} σ̃_j(x) g_j(x); bound: ∀ x ∈ [0, 1]^n, ε_pop(x) ≥ ε*_pop := Σ_{ε_α ≤ 0} ε_α.

    Fixpoint lower_bnd (eps_pol : PolC) :=
      match eps_pol with
      | Pc c     ⇒ cmin c zero
      | Pinj _ p ⇒ lower_bnd p
      | PX p _ q ⇒ lower_bnd p +! lower_bnd q
      end.

    Lemma remainder_lemma l eps_pol :
      (forall i, i \in vars → 0 <= l i ∧ l i <= 1) →
      [lower_bnd eps_pol] <= PolCeval l eps_pol.