UTOPIAE
Multi-Objective Optimal Control

Massimiliano Vasile
Aerospace Centre of Excellence, Department of Mechanical and Aerospace Engineering
University of Strathclyde, Glasgow (United Kingdom)

September 1, 2017

Outline: Definitions and Preliminary Ideas · Scalarisation Methods · Necessary Conditions for Optimality · Direct Finite Element Transcription · Solution with Memetic Algorithms · Examples of Application · Final Remarks · References
Pareto Dominance and Efficiency

Pareto Dominance
Consider the vector functions $F : \mathbb{R}^n \to \mathbb{R}^m$, with $F(x) = [f_1(x), f_2(x), \dots, f_i(x), \dots, f_m(x)]^T$, and $g : \mathbb{R}^n \to \mathbb{R}^q$, with $g(x) = [g_1(x), g_2(x), \dots, g_j(x), \dots, g_q(x)]^T$, and the problem
\[
\min_x F \quad \text{s.t.} \quad g(x) \le 0 \tag{MOP}
\]
Given the feasible set $X = \{x \in \mathbb{R}^n \mid g(x) \le 0\}$ and two feasible vectors $x, \hat{x} \in X$, we say that $x$ is dominated by $\hat{x}$ if $f_i(\hat{x}) \le f_i(x)$ for all $i = 1, \dots, m$ and there exists a $k$ such that $f_k(\hat{x}) \ne f_k(x)$. We use the relation $\hat{x} \prec x$ to state that $\hat{x}$ dominates $x$.

Pareto Efficiency
A vector $x^* \in X$ is said to be Pareto efficient, or optimal, with respect to Problem (MOP) if there is no other vector $x \in X$ dominating $x^*$, or:
\[
x \nprec x^*, \quad \forall x \in X
\]

Pareto Set
All non-dominated decision vectors in $X$ form the Pareto set $X_P$ and the corresponding image in criteria space is the Pareto front.
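As a small illustration of the dominance relation defined above (the code, the helper names and the toy objectives below are assumptions added for this example, not material from the slides), the following sketch filters the non-dominated points out of a finite sample of objective vectors:

```python
import numpy as np

def dominates(fa, fb):
    """Return True if objective vector fa dominates fb:
    fa <= fb componentwise and fa != fb in at least one component."""
    fa, fb = np.asarray(fa), np.asarray(fb)
    return np.all(fa <= fb) and np.any(fa != fb)

def non_dominated(F):
    """Return the indices of the non-dominated rows of F (one objective vector per row)."""
    keep = []
    for i, fi in enumerate(F):
        if not any(dominates(fj, fi) for j, fj in enumerate(F) if j != i):
            keep.append(i)
    return keep

if __name__ == "__main__":
    # Toy bi-objective problem: f1 = x^2, f2 = (x - 2)^2, sampled on a grid.
    x = np.linspace(-1.0, 3.0, 41)
    F = np.column_stack([x**2, (x - 2.0)**2])
    idx = non_dominated(F)
    # The non-dominated samples lie in [0, 2], the Pareto set of this toy problem.
    print(x[idx])
```

The surviving samples approximate the Pareto front of the sampled problem; with a finer grid they converge to the image of the Pareto set.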
Karush-Kuhn-Tucker Optimality Conditions [Cha08]

Necessary condition for $x^*$ to be locally optimal.

Theorem (KKT)
If $x^* \in X$ is an efficient solution to problem (MOP), then there exist vectors $\eta \in \mathbb{R}^m$ and $\lambda \in \mathbb{R}^q$ such that:
\[
\sum_{i=1}^{m} \eta_i \nabla f_i(x^*) + \sum_{j=1}^{q} \lambda_j \nabla g_j(x^*) = 0 \tag{1}
\]
\[
\lambda_j g_j(x^*) = 0, \quad j = 1, \dots, q \tag{2}
\]
\[
\lambda_j \ge 0, \quad j = 1, \dots, q \tag{3}
\]
\[
\eta_i \ge 0, \quad i = 1, \dots, m \tag{4}
\]
\[
\exists\, \eta_i > 0 \tag{5}
\]
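Condition (1) can be checked numerically. The sketch below does so for a toy unconstrained bi-objective problem (the objectives, function names and test points are illustrative assumptions, not part of the slides): a point is a candidate Pareto solution when non-negative weights, not all zero, cancel the weighted sum of gradients.

```python
import numpy as np

# Toy unconstrained bi-objective problem in one variable.
def grad_f1(x): return 2.0 * x            # gradient of f1 = x^2
def grad_f2(x): return 2.0 * (x - 2.0)    # gradient of f2 = (x - 2)^2

def kkt_weights(x):
    """Return eta >= 0 (not all zero) with eta1*grad_f1 + eta2*grad_f2 = 0, if they exist."""
    g1, g2 = grad_f1(x), grad_f2(x)
    # A non-trivial cancellation requires the gradients to point in opposite directions.
    if g1 * g2 > 0:
        return None
    eta = np.array([abs(g2), abs(g1)])
    s = eta.sum()
    return eta / s if s > 0 else np.array([0.5, 0.5])

for x_star in (-0.5, 0.0, 1.0, 2.0, 2.5):
    w = kkt_weights(x_star)
    status = "Pareto candidate" if w is not None else "not Pareto optimal"
    print(f"x* = {x_star:4.1f}: {status}, eta = {w}")
```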
Pareto Set and Front

In the unconstrained case the KKT optimality conditions reduce to:
\[
\sum_{i=1}^{m} \eta_i \nabla f_i(x^*) = 0 \tag{6}
\]
\[
\eta_i \ge 0, \quad i = 1, \dots, m \tag{7}
\]
\[
\exists\, \eta_i > 0 \tag{8}
\]
Condition (6) leads to an interesting result (Hillermeier 2001 [Hil01]): the Pareto set is an $(m-1)$-dimensional manifold. This also implies that the Pareto set has zero measure in $\mathbb{R}^n$ when $m \le n$.
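A short worked example (added for illustration; the quadratic objectives below are not from the slides) makes the dimension statement concrete for $m = 2$. Take
\[
f_1(x) = \|x\|^2, \qquad f_2(x) = \|x - e_1\|^2, \qquad x \in \mathbb{R}^n .
\]
Condition (6) with $\eta_1 = \eta$, $\eta_2 = 1 - \eta$ and $\eta \in [0, 1]$ gives
\[
2\eta\, x + 2(1 - \eta)(x - e_1) = 0 \;\Longrightarrow\; x^* = (1 - \eta)\, e_1 ,
\]
so the Pareto set is the segment $\{\, t\, e_1 : t \in [0, 1] \,\}$: a one-dimensional ($m - 1 = 1$) manifold with boundary, of zero measure in $\mathbb{R}^n$ for $n \ge 2$.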
Multi-Objective Optimal Control

Consider the following multi-objective optimal control problem (MOCP):
\[
\begin{aligned}
\min_{u} \; & F \\
\text{s.t.} \; & \dot{x} = h(x, u, p, t) \\
& g(x, u, p, t) \le 0 \\
& \psi(x_0, x_f, t_0, t_f) \le 0 \\
& t \in [t_0, t_f]
\end{aligned} \tag{MOCP}
\]
where $F$ is a vector function of the state variables $x : [t_0, t_f] \to \mathbb{R}^n$, the control variables $u \in L^\infty$, time $t$ and some static parameters $p \in \mathbb{R}^q$. The functions $x$ belong to the Sobolev space $W^{1,\infty}$, the objective functions are $f_i : \mathbb{R}^n \times \mathbb{R}^p \times [t_0, t_f] \to \mathbb{R}$, the dynamics are $h : \mathbb{R}^n \times \mathbb{R}^p \times \mathbb{R}^q \times [t_0, t_f] \to \mathbb{R}^n$, the algebraic constraints are $g : \mathbb{R}^n \times \mathbb{R}^p \times \mathbb{R}^q \times [t_0, t_f] \to \mathbb{R}^s$, and the boundary conditions are $\psi : \mathbb{R}^{2n+2} \to \mathbb{R}^q$. Note that problem (MOCP) is generally non-smooth and can have many local minima.
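As a concrete instance (added for illustration; the double-integrator dynamics and bounds below are assumptions, not taken from the slides), consider steering a double integrator from rest at the origin to rest at position one while trading final time against control effort:
\[
\dot{x}_1 = x_2, \qquad \dot{x}_2 = u, \qquad \dot{x}_3 = u^2, \qquad |u| \le 1,
\]
\[
\psi: \quad x(t_0) = (0, 0, 0)^T, \quad x_1(t_f) = 1, \quad x_2(t_f) = 0,
\]
\[
F = [f_1, f_2]^T = [\, t_f, \; x_3(t_f) \,]^T .
\]
Shortening $t_f$ requires larger controls and hence a larger accumulated effort $x_3(t_f)$, so no single control minimises both objectives and the solution of (MOCP) is a whole Pareto set of control laws.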
MOCP: How to Solve it?

• Option 1 is to attempt the solution of the problem in vector form.
• Option 2 is to find a suitable form of scalarisation and then use the existing machinery for single-objective optimal control problems.
• Option 3 is to use a mix of Option 1 and Option 2.

In the following we introduce some suitable scalarisation techniques and then show how to combine Option 1 and Option 2 into a single method with some desirable theoretical and computational properties.
Pascoletti-Serafini Scalarisation [Eic08]

The scalarisation of Pascoletti and Serafini is based on the idea of descent cones $K$. An optimal (K-minimal) solution to problem (MOP) is a solution to the following constrained single-objective optimisation problem:
\[
\begin{aligned}
\min_{t} \; & t \\
\text{s.t.} \; & a\,t - F(x) + r \in K \\
& g(x) \le 0
\end{aligned} \tag{9}
\]
or, in a more computationally friendly form:
\[
\begin{aligned}
\min_{s} \; & s \\
\text{s.t.} \; & w_j (f_j(x) - z_j) \le s \quad \forall j = 1, \dots, m \\
& g(x) \le 0
\end{aligned} \tag{PS}
\]
A point $\bar{F}$ is K-minimal when:
\[
(\bar{F} - K) \cap F(X) = \{\bar{F}\}
\]
From this definition one can see that a K-minimal point is Pareto efficient.

[Figures: in the objective space $(f_1, f_2)$, the line $r + a\,t$ and the translated cone $K$ intersect the image set $F(X)$; a point $\bar{F} = [f_1, f_2]^T$ is K-minimal when the translated cone touches $F(X)$ only at $\bar{F}$.]
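The (PS) subproblem above is an ordinary nonlinear programme, so a general-purpose NLP solver can be used to trace the front by sweeping the weights. The sketch below does this with SciPy for the same toy bi-objective problem used earlier; the solver choice, the reference point $z$ and the weight sweep are assumptions made for illustration, not prescriptions from the slides.

```python
import numpy as np
from scipy.optimize import minimize

# Toy bi-objective problem in one variable: f1 = x^2, f2 = (x - 2)^2.
def f(x):
    return np.array([x[0]**2, (x[0] - 2.0)**2])

z = np.array([0.0, 0.0])   # reference (utopia) point, z_j < min f_j
m = 2

def solve_ps(w):
    """Solve the (PS) subproblem  min s  s.t.  w_j (f_j(x) - z_j) <= s  for one weight vector."""
    # Decision vector y = [x, s].
    def objective(y):
        return y[-1]
    cons = [{"type": "ineq",
             "fun": lambda y, j=j: y[-1] - w[j] * (f(y[:-1])[j] - z[j])}
            for j in range(m)]
    y0 = np.array([1.0, 10.0])
    res = minimize(objective, y0, constraints=cons)
    return res.x[:-1], f(res.x[:-1])

# Sweep the weights to obtain a discrete approximation of the Pareto front.
for a in np.linspace(0.05, 0.95, 7):
    w = np.array([a, 1.0 - a])
    x_opt, f_opt = solve_ps(w)
    print(f"w = [{a:.2f}, {1 - a:.2f}]  x* = {x_opt[0]: .3f}  F* = {f_opt}")
```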
Chebyshev Scalarisation [Eic08]

Chebyshev scalarisation is based on the idea of descent directions identified by the weights $w$:
\[
\begin{aligned}
\min_{x \in X} \; & \max_{j \in \{1, \dots, m\}} w_j (f_j(x) - z_j) \\
\text{s.t.} \; & g(x) \le 0
\end{aligned} \tag{CS}
\]

Theorem (CS)
A point $(s, x) \in \mathbb{R} \times X$ is a minimal solution of problem (PS) with $z \in \mathbb{R}^m$, $z_j < \min_{x \in X} f_j(x)$, $j = 1, \dots, m$, and $w \in \operatorname{int}(\mathbb{R}^m_+)$, if and only if $x$ is a solution of problem (CS).

From Theorem (CS) one can see that the solutions of problems (PS) and (CS) are equivalent. This is an important property when designing algorithms because, in some cases, the solution of (PS) translates into the solution of (CS).

[Figure: level sets of the scalarising function $g(F)$ in the $(f_1, f_2)$ plane, increasing away from the reference point; the minimum over the image set $F(X)$ is attained where the lowest level set touches $F(X)$.]
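Theorem (CS) can be checked numerically on the toy problem: minimising the (CS) max-function directly (here with a derivative-free method, since the max of smooth functions is non-smooth) should return the same point as the (PS) subproblem for the same weights. This is an illustrative sketch under the same assumptions as the previous example.

```python
import numpy as np
from scipy.optimize import minimize

# Same toy problem and reference point as before: f1 = x^2, f2 = (x - 2)^2, z = 0.
def f(x):
    return np.array([x[0]**2, (x[0] - 2.0)**2])

z = np.zeros(2)

def chebyshev(x, w):
    """(CS) scalarising function: weighted Chebyshev distance from the reference point z."""
    return np.max(w * (f(x) - z))

w = np.array([0.3, 0.7])

# Derivative-free simplex search copes with the non-smooth max.
res = minimize(chebyshev, x0=np.array([1.0]), args=(w,), method="Nelder-Mead")
print("x* =", res.x, "  F(x*) =", f(res.x))
# For this convex toy problem the minimiser coincides with the (PS) solution for the same w,
# illustrating the equivalence stated in Theorem (CS).
```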
(Scalar) Pontryagin Maximum Principle

Given the following optimal control problem in Mayer's form:
\[
\begin{aligned}
\min \; & f(x_f, t_f) \\
\text{s.t.} \; & \dot{x} = h(x, u, t) \\
& g(x, u, t) \ge 0 \\
& \psi(x_0, x_f, t_0, t_f) \ge 0 \\
& t \in [t_0, t_f]
\end{aligned}
\]
If $u^*$ is a locally optimal solution of this problem, then there exist a vector $\lambda \in \mathbb{R}^n$, a vector $\mu \in \mathbb{R}^q$ and a vector $\nu$ such that:
\[
u^* = \operatorname*{argmin}_{u \in U} \left( \lambda^T h(x^*, u, t) + \mu^T g(x^*, u, t) \right)
\]
\[
\lambda^T \nabla_x h(x^*, u^*, t) + \mu^T \nabla_x g(x^*, u^*, t) + \dot{\lambda} = 0
\]
\[
\lambda \ge 0; \quad \mu \ge 0
\]
with transversality conditions:
\[
\nabla_x f + \nu^T \nabla_x \psi = \lambda(t_f), \qquad \nu \ge 0
\]
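A minimal worked example (added for illustration; the single-integrator dynamics and control bounds are assumptions, not content from the slides) shows how these conditions pin down the control:
\[
\min \; x(t_f) \quad \text{s.t.} \quad \dot{x} = u, \quad u \in U = [-1, 1], \quad x(t_0) = x_0, \quad t_f \ \text{fixed}.
\]
With no path constraints ($\mu = 0$) and no terminal constraints ($\nu$ absent), the costate equation gives $\dot{\lambda} = 0$ and the transversality condition gives $\lambda(t_f) = \nabla_x f = 1$, so $\lambda(t) \equiv 1$. The pointwise minimisation then yields
\[
u^*(t) = \operatorname*{argmin}_{u \in [-1, 1]} \lambda\, u = -1 ,
\]
i.e. the control saturates at its lower bound, which indeed drives $x(t_f)$ to its smallest attainable value $x_0 - (t_f - t_0)$.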
Pascoletti-Serafini Scalarised MOCP

Consider each objective function to be $f_j(x_f, t_f)$ and the scalarised multi-objective optimal control problem:
\[
\begin{aligned}
\min_{s_f} \; & s_f \\
\text{s.t.} \; & w_j (f_j(x_f, t_f) - z_j) - s_f \le 0 \quad \forall j = 1, \dots, m \\
& \dot{x} = h(x, u, t) \\
& g(x, u, t) \ge 0 \\
& \psi(x_0, x_f, t_0, t_f) \ge 0 \\
& t \in [t_0, t_f]
\end{aligned} \tag{PSOCP}
\]
If $s$ is a slack variable with final condition $s_f$ and zero time variation $\dot{s} = 0$, then problem (PSOCP) presents itself in a form similar to Mayer's problem. The major difference is the mixed boundary constraint on $x_f$, $t_f$ and $s_f$ for every $j = 1, \dots, m$.
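To make the Mayer-like structure explicit (the augmented-state notation below is added for clarity and is not from the slides), append $s$ to the state vector:
\[
\tilde{x} = \begin{bmatrix} x \\ s \end{bmatrix}, \qquad \dot{\tilde{x}} = \begin{bmatrix} h(x, u, t) \\ 0 \end{bmatrix}, \qquad \min \; s(t_f) = s_f ,
\]
with the $m$ inequalities $w_j (f_j(x_f, t_f) - z_j) - s_f \le 0$ appended to the boundary constraints $\psi$. The cost is then the final value of the extra state, which is exactly Mayer's form, while the extra state couples $x_f$, $t_f$ and $s_f$ through the appended boundary constraints.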
Necessary Conditions for Local Optimality

Theorem (Vasile 2017)
Consider the function $H = \lambda^T h(x, u, t) + \mu^T g(x, u, t)$. If $u^*$ is a locally optimal solution for problem (PSOCP), with associated state vector $x^*$, and $H$ is Fréchet differentiable at $u^*$, then there exist a vector $\eta \in \mathbb{R}^m$, a vector $\lambda \in \mathbb{R}^n$ and a vector $\mu \in \mathbb{R}^q$ such that:
\[
u^* = \operatorname*{argmin}_{u \in U} \left( \lambda^T h(x^*, u, t) + \mu^T g(x^*, u, t) \right)
\]
\[
\lambda^T \nabla_x h(x^*, u^*, t) + \mu^T \nabla_x g(x^*, u^*, t) + \dot{\lambda} = 0
\]
\[
\dot{\lambda}_s = 0
\]
\[
\lambda \ge 0; \quad \mu \ge 0
\]
with transversality conditions:
\[
1 - \sum_{j=1}^{m} \eta_j = \lambda_s(t_f)
\]
\[
\eta^T \nabla_x F + \nu^T \nabla_x \psi = \lambda_x(t_f)
\]
\[
\eta > 0; \quad \nu \ge 0
\]
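A heuristic reading of the two conditions involving $\lambda_s$ (this interpretation is added here for clarity; the formal proof belongs to the cited work): in the augmented Mayer form of the previous slide the Hamiltonian $H$ does not depend on $s$, so the corresponding costate satisfies $\dot{\lambda}_s = -\partial H / \partial s = 0$. Moreover, the cost is $s_f$ and $s_f$ enters each of the $m$ appended boundary constraints with coefficient $-1$ and multiplier $\eta_j$, so the terminal condition on the $s$-costate becomes $\lambda_s(t_f) = 1 - \sum_{j=1}^{m} \eta_j$, which is the first transversality condition above.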