Solving MOOP: Non-Pareto MOEA Approaches
Soft Computing Applications
Debasis Samanta
Indian Institute of Technology Kharagpur
dsamanta@iitkgp.ac.in
22.03.2016
Multi-objective evolutionary algorithm

To distinguish GAs that solve single-objective optimization problems from those that solve MOOPs, the term Evolutionary Algorithm (EA) has been coined. In the research literature it is popularly abbreviated as MOEA, short for Multi-Objective Evolutionary Algorithm. The MOEA framework is shown below; Reproduction is the same as in a GA, but different strategies are followed in Selection.

[Flowchart: MOOP → Initialization of Population → Selection → Convergence Test; on "Yes" the test yields the Solution, on "No" the loop continues through Reproduction back to Selection.]
Difference between GA and MOEA

1. The differences between a GA and an MOEA lie in the input (a single objective vs. multiple objectives) and the output (a single solution vs. trade-off solutions, also called Pareto-optimal solutions).
2. Two major problems are handled in an MOEA:
   - How to accomplish fitness assignment (evaluation), and selection thereafter, in order to guide the search toward the Pareto-optimal set.
   - How to maintain a diverse population in order to prevent premature convergence and achieve a well-distributed Pareto-optimal front.
Classification of MOEA techniques

MOEA Techniques
- A priori approach
  - Aggregation (Ordering): Lexicographic ordering
  - Aggregation (Scalarization): Linear fitness evaluation (SOEA), Non-linear fitness evaluation (SOEA), Goal attainment, Weighted min-max method
- A posteriori approach
  - Independent sampling
  - Criterion selection (VEGA)
  - Pareto selection: Ranking, Ranking and niching, Demes, Elitist
- Hybrid approach
  - Game theory
Classification of MOEA techniques

Note:
- An a priori technique requires knowledge of the relative importance of the objectives prior to the search.
- An a posteriori technique searches for Pareto-optimal solutions among a set of feasible solutions.
MOEA techniques to be discussed

1. A priori approaches
   - Lexicographic ordering
   - Simple weighted approach (SOEA)
2. A posteriori approaches
   - Criterion selection (VEGA)
   - Pareto-based approaches
     - Rank-based approach (MOGA)
     - Rank + niche based approach (NPGA)
     - Non-dominated sorting based approach (NSGA)
     - Elitist non-dominated sorting based approach (NSGA-II)
MOEA techniques to be discussed

1. Non-Pareto based approaches
   - Lexicographic ordering
   - Simple weighted approach (SOEA)
   - Criterion selection (VEGA)
2. Pareto-based approaches
   - Rank-based approach (MOGA)
   - Rank + niche based approach (NPGA)
   - Non-dominated sorting based approach (NSGA)
   - Elitist non-dominated sorting based approach (NSGA-II)
Lexicographic Ordering
Lexicographic ordering method

Reference: "Compaction of Symbolic Layout using Genetic Algorithms" by M. P. Fourman, in Proceedings of the 1st International Conference on Genetic Algorithms, pages 141-153, 1985.

It is an a priori technique based on the principle of "aggregation with ordering".
Lexicographic ordering method

Suppose a MOOP with k objectives and n constraints over a decision variable x is stated as:

    Minimize f = [f1, f2, ..., fk]
    Subject to gj(x) ≤ cj, where j = 1, 2, ..., n

1. The objectives are ranked in the order of their importance (done by the programmer). Suppose the objectives are arranged in the following order:

       f = [f1 < f2 < f3 < ... < fk]

   Here, fi < fj implies that fi is of higher importance than fj.
Lexicographic ordering method

2. The optimum solution x̄* is then obtained by minimizing one objective function at a time, as follows:

   (a) Minimize f1(x)
       Subject to gj(x) ≤ cj, j = 1, 2, ..., n
       Let its solution be x̄*1, that is, f*1 = f1(x̄*1).

   (b) Minimize f2(x)
       Subject to gj(x) ≤ cj, j = 1, 2, ..., n
                  f1(x) = f*1
       Let its solution be x̄*2, that is, f*2 = f2(x̄*2).

   (c) At the i-th step, we have
       Minimize fi(x)
       Subject to gj(x) ≤ cj, j = 1, 2, ..., n
                  fl(x) = f*l, l = 1, 2, ..., i-1
Lexicographic ordering method

This procedure is repeated until all k objectives have been considered in the order of their importance. The solution obtained at the end is x̄*k, that is, f*k = fk(x̄*k). This is taken as the desired solution x̄* of the given multi-objective optimization problem.
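The stepwise procedure above can be sketched over a finite candidate set: minimize each objective in priority order, keeping only those candidates that stay (within a tolerance) at the best value found for every more important objective. The objective functions, the grid of candidates, and the tolerance `eps` below are illustrative assumptions, not part of the original method description.

```python
def lexicographic_minimize(candidates, objectives, eps=1e-9):
    """Minimize each objective in priority order; after each step, keep only
    candidates within eps of the best value f_i* found for that objective."""
    pool = list(candidates)
    for f in objectives:                      # objectives sorted by importance
        best = min(f(x) for x in pool)        # f_i* over the surviving pool
        pool = [x for x in pool if f(x) <= best + eps]
    return pool[0]                            # any survivor is a solution

# Two objectives over a 1-D decision variable sampled on a grid
f1 = lambda x: (x - 2.0) ** 2                 # highest priority
f2 = lambda x: (x - 3.0) ** 2                 # used only to break ties in f1
grid = [i / 100.0 for i in range(0, 501)]     # x in [0, 5]
x_star = lexicographic_minimize(grid, [f1, f2])
print(x_star)  # 2.0 -- f1 alone already pins the solution down
```

Note that f2 influences the result only if several candidates tie on f1, which is exactly why this method returns a single solution rather than a front.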
Remarks on lexicographic ordering method

Remarks:
- Deciding the priorities (i.e. ranks) of the objective functions is an issue; the solution may vary if a different ordering is taken.
- Different strategies can be followed to address the above issue:
  1. Random selection of an objective function at each run.
  2. A naive approach: try all k! orderings of the k objective functions and select the best observed result.

Note: The method produces a single solution rather than a set of Pareto-optimal solutions.
Single-Objective Evolutionary Algorithm
SOEA: Single-Objective Evolutionary Algorithm

This is an a priori technique based on the principle of "linear aggregation of functions". It is alternatively termed SOEA (Single-Objective Evolutionary Algorithm). In the literature it is also called the weighted sum approach. In fact, it is a naive approach to solving a MOOP.
SOEA approach to solve MOOPs

This method adds all the objective functions together, using a different weighting coefficient for each objective. This means that the multi-objective optimization problem is transformed into a scalar optimization problem. In other words, to optimize, say, n objective functions f1, f2, ..., fn, it computes fitness as

    fitness = Σ(i=1 to n) wi × fi(x)

where wi ≥ 0 for i = 1, 2, ..., n are the weighting coefficients representing the relative importance of the objectives. It is usually assumed that Σ(i=1 to n) wi = 1.
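The scalarized fitness above can be written directly; the particular objectives and equal weights used here are illustrative assumptions.

```python
def weighted_sum_fitness(x, objectives, weights):
    """Scalar fitness: sum of w_i * f_i(x), with w_i >= 0 and sum(w_i) = 1."""
    assert all(w >= 0 for w in weights), "weights must be non-negative"
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(w * f(x) for w, f in zip(weights, objectives))

f1 = lambda x: x ** 2            # first objective
f2 = lambda x: (x - 2.0) ** 2    # second objective

# Equal importance given to both objectives
fitness = weighted_sum_fitness(1.0, [f1, f2], [0.5, 0.5])
print(fitness)  # 0.5*1.0 + 0.5*1.0 = 1.0
```

A GA then maximizes or minimizes this single scalar exactly as in the single-objective case, which is why SOEA fits the Simple GA framework unchanged.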
Comments on SOEA

1. This is the simplest approach and works in the same framework as the Simple GA.
2. The results of solving an optimization problem can vary significantly as the weighting coefficients change.
3. In other words, it produces different solutions for different values of the wi's.
4. Since very little is usually known about how to choose these coefficients, the search may end up at a local optimum.
Local optimum solution in SOEA

[Figure: objective space with f1 and f2 both minimized, showing the true Pareto front and a local "Pareto-front 2"; the weighted line w1·f1 + w2·f2 locates a point on the local front rather than the true one.]
Comments on SOEA

5. As a way out of this, it is necessary to solve the same problem for many different values of the wi's.
6. The weighting coefficients do not proportionally reflect the relative importance of the objectives; they are only factors which, when varied, locate different points in the Pareto set.
7. This method depends not only on the wi values but also on the units in which the functions are expressed.
8. In that case, we have to scale the objective values, that is,

       fitness = Σ(i=1 to n) wi × fi(x) × ci

   where the ci's are constant multipliers that scale the objectives properly.
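The scaled fitness in point 8 can be sketched as follows. The slides do not prescribe how the ci's are chosen, so this example assumes one common heuristic: normalize each objective by its largest magnitude over a sample of the search space, so that objectives expressed in very different units contribute comparably.

```python
def scaled_weighted_fitness(x, objectives, weights, scales):
    """Scalar fitness with per-objective scale factors: sum of w_i * c_i * f_i(x)."""
    return sum(w * c * f(x) for w, c, f in zip(weights, scales, objectives))

f_cost = lambda x: 1000.0 * x    # objective in large units (say, rupees)
f_time = lambda x: 0.001 * x     # objective in tiny units (say, hours)
sample = [i / 10.0 for i in range(1, 11)]   # sample of the decision space

# Assumed heuristic: c_i = 1 / max|f_i| over the sample, mapping each
# objective roughly onto [0, 1] before the weights are applied.
scales = [1.0 / max(abs(f(x)) for x in sample) for f in (f_cost, f_time)]

val = scaled_weighted_fitness(0.5, [f_cost, f_time], [0.5, 0.5], scales)
print(val)  # after scaling, both objectives contribute equally (about 0.5)
```

Without the ci's, f_cost would dominate the sum by six orders of magnitude and f_time would be effectively ignored, which is the unit-dependence problem point 7 warns about.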
Naive approach: Weighted sum approach

9. The technique cannot be used to find Pareto-optimal solutions which lie on the non-convex portion of the Pareto-optimal front. In that case, it gives only one solution, which might be on the Pareto front.

[Figure: the feasible objective space with f1 and f2 minimized; the Pareto-optimal front is non-convex, and the single SOEA solution sits at one point of it.]
Vector Evaluated Genetic Algorithm