Multiple objective function optimization



  1. Multiple objective function optimization. R.T. Marler, J.S. Arora, "Survey of multi-objective optimization methods for engineering," Structural and Multidisciplinary Optimization, Vol. 26, No. 6, April 2004, pp. 369-395.

  2. Multiple Objective Functions: assume all f, g, h are differentiable
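For reference in the later slides, a standard statement of the problem (notation assumed from the Marler-Arora survey: k objectives F_i, inequality constraints g_j, equality constraints h_l):

```latex
\min_{x \in \mathbb{R}^{n}} \; \mathbf{F}(x) = \left[ F_1(x), F_2(x), \ldots, F_k(x) \right]^{T}
\quad \text{s.t.} \quad
g_j(x) \le 0, \; j = 1, \ldots, m, \qquad
h_l(x) = 0, \; l = 1, \ldots, e .
```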

  3. Preliminaries: feasible design space - satisfies all constraints; feasible criterion space - objective function values of the feasible design space region; preferences - the user's opinion about points in criterion space; scalarization methods vs. vector methods

  4. Rugged fitness landscape; sensitivity issue. http://www.calresco.org/lucas/pmo.htm

  5. Economic resources example: money M(t+1) = a*M(t) + b*I(t) + c*T(t); ideas I(t+1) = d*I(t) + e*T(t) + f*M(t); time T(t+1) = g*T(t) + h*M(t) + j*I(t). Non-linear cross-coupling; strange attractors. http://www.calresco.org/lucas/pmo.htm
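A minimal sketch of iterating the coupled update rules above; the coefficient and initial values are illustrative assumptions, not values from the slide:

```python
# Iterate the coupled money/ideas/time update rules from the slide.
a, b, c = 0.9, 0.05, 0.02   # money coefficients (assumed)
d, e, f = 0.8, 0.10, 0.05   # ideas coefficients (assumed)
g, h, j = 0.7, 0.10, 0.10   # time coefficients (assumed)

M, I, T = 1.0, 1.0, 1.0     # initial state (assumed)
for step in range(100):
    # All three updates use the previous step's values simultaneously.
    M, I, T = (a * M + b * I + c * T,
               d * I + e * T + f * M,
               g * T + h * M + j * I)
print(M, I, T)
```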

  6. Organization: a priori articulation of preferences; a posteriori articulation of preferences; progressive articulation of preferences; genetic algorithms

  7. Utopia (ideal) point F0: the point that optimizes all objective functions; often doesn't exist. Compromise solution: one or more objective functions not optimal, as close as possible to the utopia point
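In the survey's notation, the utopia point collects each objective's individual minimum over the feasible set (a sketch of the usual definition):

```latex
F_i^{0} = \min_{x \,\text{feasible}} F_i(x), \qquad
\mathbf{F}^{0} = \left[ F_1^{0}, F_2^{0}, \ldots, F_k^{0} \right]^{T} .
```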

  8. x1 is superior to x2 iff x1 dominates x2, written x1 > x2
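A common formal statement of the dominance relation the slide writes as "x1 > x2" (minimization assumed):

```latex
x^{1} > x^{2} \;\Longleftrightarrow\;
F_i(x^{1}) \le F_i(x^{2}) \;\; \forall i \in \{1, \ldots, k\}
\;\;\text{and}\;\;
F_j(x^{1}) < F_j(x^{2}) \;\;\text{for at least one } j .
```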

  9. Pareto optimal solution: a point is Pareto optimal if there does not exist another feasible design (objective vector) for which all objective functions are better than or equal and at least one objective function is strictly better; i.e., there is no x' such that x' > x; i.e., it is not dominated by any other point
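As a concrete illustration of the non-dominance check, a small sketch that filters a finite list of candidate objective vectors down to its non-dominated subset (minimization assumed; all names and values are illustrative):

```python
def dominates(f1, f2):
    """True if objective vector f1 dominates f2 (minimization of every objective)."""
    return all(a <= b for a, b in zip(f1, f2)) and any(a < b for a, b in zip(f1, f2))

def pareto_filter(points):
    """Keep only the points that no other point dominates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Example with two objectives, both minimized.
candidates = [(1.0, 5.0), (2.0, 3.0), (3.0, 3.0), (4.0, 1.0)]
print(pareto_filter(candidates))   # (3.0, 3.0) is dominated by (2.0, 3.0)
```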

  10. Weakly Pareto Optimal: no other point is better in all objective values. Properly Pareto Optimal
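One standard way to write the weak form of the condition (a sketch; minimization assumed): a weakly Pareto optimal point only rules out simultaneous improvement of every objective.

```latex
x^{*} \text{ weakly Pareto optimal} \;\Longleftrightarrow\;
\nexists \, x \text{ feasible with } F_i(x) < F_i(x^{*}) \;\; \forall i \in \{1, \ldots, k\} .
```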

  11. Pareto optimal set: the set of all Pareto optimal points; possibly an infinite set. Various approaches: identify the Pareto optimal set; identify some subset of the optimal set; seek a single final point

  12. A method for solving multiple objective optimization provides a necessary condition for Pareto optimality and/or a sufficient condition for Pareto optimality

  13. Common function transformation methods to remove dimensions or balance magnitude differences
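A typical transformation of this kind rescales each objective by its utopia value and a worst-case value (a sketch of one common choice; other variants divide by F_i^0 alone or by the maximum alone):

```latex
F_i^{\text{trans}}(x) = \frac{F_i(x) - F_i^{0}}{F_i^{\max} - F_i^{0}},
\qquad 0 \le F_i^{\text{trans}}(x) \le 1 .
```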

  14. Methods with a priori articulation of preferences Allow user to specify preferences for, or relative importance of, objective functions

  15. Weighted Sum Method: sufficient for Pareto optimality; no guarantee the final result is acceptable; impossible to find points in non-convex sections; does not give an even distribution of points
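The scalarized problem behind the slide (a sketch; the w_i are the user-chosen weights, usually taken nonnegative and summing to one):

```latex
\min_{x} \; U(x) = \sum_{i=1}^{k} w_i \, F_i(x),
\qquad w_i \ge 0, \;\; \sum_{i=1}^{k} w_i = 1 .
```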

  16. Weighted global criterion method
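A common form of the weighted global criterion, written with the utopia point F^0 and an exponent p (a sketch following the usual statement in the survey):

```latex
\min_{x} \; U(x) = \left\{ \sum_{i=1}^{k} w_i \left[ F_i(x) - F_i^{0} \right]^{p} \right\}^{1/p} .
```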

  17. Lexicographic Method: objective functions arranged in order of importance; solve the following optimization problems one at a time
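The sequence of problems, written out (a sketch): at step i, minimize the i-th most important objective while not worsening the objectives already optimized, where x_j^* denotes the minimizer found at step j:

```latex
\min_{x} \; F_i(x)
\quad \text{s.t.} \quad
F_j(x) \le F_j(x_j^{*}), \;\; j = 1, \ldots, i-1,
\qquad i = 1, \ldots, k .
```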

  18. Goal Programming Method
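A sketch of the usual goal-programming formulation, with goals b_j and deviation variables d_j^+ and d_j^- (symbols assumed here, not taken from the slide):

```latex
\min_{x, \, d^{+}, \, d^{-}} \; \sum_{j=1}^{k} \left( d_j^{+} + d_j^{-} \right)
\quad \text{s.t.} \quad
F_j(x) + d_j^{-} - d_j^{+} = b_j, \qquad d_j^{+}, d_j^{-} \ge 0 .
```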

  19. Goal Attainment Method: computationally faster than typical goal programming methods
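A sketch of the goal attainment formulation: a single scalar gamma measures how far the goals b_i are over- or under-attained, scaled by weights w_i (symbols assumed):

```latex
\min_{x, \, \gamma} \; \gamma
\quad \text{s.t.} \quad
F_i(x) - w_i \, \gamma \le b_i, \qquad i = 1, \ldots, k .
```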

  20. Physical Programming: class function for each metric, either monotonically increasing, monotonically decreasing, or unimodal; specify numeric ranges for degrees of preference (desirable, tolerable, undesirable, etc.)

  21. Methods for a posteriori articulation of preferences: generate-first, choose-later approaches; generate a representative Pareto optimal set; the user selects from a palette of solutions

  22. Physical Programming: systematically vary parameters to traverse the criterion space

  23. Normal boundary intersection method
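A sketch of the NBI subproblem as it is usually written: starting from a point Phi*beta on the utopia hyperplane (Phi is the pay-off matrix of the individual minima, beta a convex-combination weight vector), search along the normal direction n-hat toward the Pareto boundary:

```latex
\max_{x, \, \lambda} \; \lambda
\quad \text{s.t.} \quad
\Phi \beta + \lambda \hat{n} = \mathbf{F}(x) - \mathbf{F}^{0}, \qquad x \text{ feasible}.
```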

  24. Normal constraint method: determine the utopia point; normalize the objective functions; the individual minima of the objective functions form the vertices of the utopia hyperplane

  25. Methods with no articulation of preferences: similar to a priori techniques with no weights; global criterion methods with w_i = 1.0

  26. Min-max method: treat as a single objective function; provides a weakly Pareto optimal point
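The scalar problem being solved (a sketch of the weighted min-max form, using the utopia point F^0 and weights w_i):

```latex
\min_{x} \; \max_{i} \; \left\{ w_i \left[ F_i(x) - F_i^{0} \right] \right\} .
```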

  27. Objective sum method: used to avoid additional constraints and discontinuities

  28. Nash arbitration and objective product method: maximize the objective product, where s_i >= F_i(x)
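The product being maximized appeared as a figure on the original slide; a sketch of the standard statement, with s_i an upper bound (worst acceptable value) for objective i:

```latex
\max_{x} \; \prod_{i=1}^{k} \left( s_i - F_i(x) \right), \qquad s_i \ge F_i(x) .
```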

  29. Rao's method: normalize so that F_i^norm is between zero and one, with F_i^norm = 1 the worst possible

  30. Genetic Algorithms: no derivative information needed; global optimization; e.g., generate sub-populations by optimizing one objective function each
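A minimal sketch of the sub-population idea mentioned on the slide (VEGA-style selection): each generation, part of the mating pool is selected according to a single objective, then shuffled and recombined. The population size, operators, and objective functions below are illustrative assumptions, not the deck's method:

```python
import random

def evolve(objectives, pop, generations=50, mutation=0.1):
    """Sketch: one sub-population selected per objective, then recombine."""
    k = len(objectives)
    for _ in range(generations):
        sub_size = len(pop) // k
        pool = []
        for f in objectives:                      # select by one objective at a time
            pool += sorted(pop, key=f)[:sub_size]
        random.shuffle(pool)
        pop = [mutate(crossover(random.choice(pool), random.choice(pool)), mutation)
               for _ in range(len(pop))]
    return pop

def crossover(a, b):
    # Uniform crossover: each coordinate taken from one parent at random.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(x, rate):
    # Gaussian perturbation applied coordinate-wise with the given probability.
    return [xi + random.gauss(0, 1) if random.random() < rate else xi for xi in x]

# Example: two competing objectives on a 2-D design vector.
f1 = lambda x: (x[0] - 1) ** 2 + x[1] ** 2
f2 = lambda x: x[0] ** 2 + (x[1] - 1) ** 2
start = [[random.uniform(-2, 2), random.uniform(-2, 2)] for _ in range(40)]
final = evolve([f1, f2], start)
```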

  31. (Figure) Directions in the shaded area reduce both objective functions
