

  1. Tuning Optimization Algorithms under Multiple Objective Function Evaluation Budgets
Antoine S. Dymond
Department of Mechanical and Aeronautical Engineering, University of Pretoria, South Africa
Departmental Public Defence, 25 July 2014
A. Dymond, Tuning Optimization Algorithms, 25 July 2014, 1 / 20

  2. Introduction: Numerical optimization
Numerical optimization forms a pivotal part of many design processes. It has three parts:
- modeling
- searching for the optimum of the generated model
- validation
Focus of this PhD: optimization algorithms for searching for the model's optimum.

  3. Introduction: Optimization considerations
No one optimization algorithm works for all problems (No Free Lunch [4]). The algorithm and its control parameter values (CPVs) need to be selected according to:
- objective function characteristics
- constraints imposed
- termination criteria, i.e. objective function evaluation (OFE) budgets

  4. Introduction: Goal and contributions
Goal: aid practitioners in selecting an optimization algorithm and CPVs appropriate for the problem at hand.
Approach: present tools for determining algorithms and CPVs well suited to representative* testing problems.
Contributions:
1. tMOPSO
2. MOTA
3. Benchmarking via tunability
* suspected of being representative...

  5. tMOPSO: tuning multi-objective particle swarm optimization algorithm
Tuning entails changing an algorithm's settings (CPVs) so as to improve performance. tMOPSO tunes an algorithm to a single criterion across multiple OFE budgets. Best CPVs per budget:

  OFE budget | N   | F   | Cr
  -----------+-----+-----+-----
  20         | 20  | 0.7 | 0.2
  50         | 5   | 0.5 | 0.1
  100        | 10  | 0.9 | 0.4
  ...        | ... | ... | ...
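The per-budget idea can be sketched with a toy example. The optimizer below is a hypothetical stand-in (not tMOPSO or any algorithm from the thesis): the point is only that one run records best-so-far performance at every OFE count, so performance under all budgets of interest is read off from the same data rather than re-running per budget.

```python
import random

def toy_optimizer(step, budget, seed=0):
    """Hypothetical optimizer being tuned: hill climbing on f(x) = x^2,
    with perturbation size `step` as its single CPV. Returns the
    best-so-far objective value after each OFE (1..budget)."""
    rng = random.Random(seed)
    x, best = 5.0, 25.0
    history = []
    for _ in range(budget):
        cand = x + rng.uniform(-step, step)
        if cand * cand < best:
            x, best = cand, cand * cand
        history.append(best)
    return history

# One run per CPV setting yields performance at *all* OFE budgets at once,
# which is why multi-budget tuning can beat one tuning problem per budget.
budgets = [20, 50, 100]
for step in (0.1, 0.5, 2.0):
    h = toy_optimizer(step, max(budgets))
    print(step, [round(h[b - 1], 4) for b in budgets])
```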

  6. tMOPSO: Multi-objective optimization

minimize F(x) = [ f_1(x), f_2(x), ..., f_{n_f}(x) ]

Pareto dominance: x_1 ≺ x_2 when
  f_k(x_1) ≤ f_k(x_2) for all k ∈ {1, 2, ..., n_f}, and
  f_k(x_1) < f_k(x_2) for at least one k ∈ {1, 2, ..., n_f}.

[Figure: decision space X mapped by F(x), ∀x ∈ X, into the (f_1, f_2) objective space, with Pareto front P]
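The dominance relation above translates directly to code; a minimal check for minimization:

```python
def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b (minimization):
    f_a is no worse in every objective and strictly better in at least one."""
    return (all(a <= b for a, b in zip(f_a, f_b))
            and any(a < b for a, b in zip(f_a, f_b)))

assert dominates([1.0, 2.0], [2.0, 3.0])      # better in both objectives
assert not dominates([1.0, 3.0], [2.0, 2.0])  # trade-off: neither dominates
assert not dominates([1.0, 2.0], [1.0, 2.0])  # equal vectors do not dominate
```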

  7. tMOPSO: contribution
tMOPSO combines into one algorithm:
- multi-objective tuning according to speed vs. accuracy
- history information
- noise handling for tuning stochastic algorithms
- efficient Pareto archives

[Figure: accuracy vs. -speed trade-off fronts a, b, c]

  8. tMOPSO: Gauging tMOPSO
The conducted numerical experiments showed that tMOPSO is better than or equivalent to existing techniques. In particular, for the conducted experiments:
- tMOPSO outperformed the FBM algorithm
- tMOPSO was found to be a more efficient alternative than setting up numerous single-OFE-budget tuning problems

  9. tMOPSO: limitations
tMOPSO tunes according to one performance measure under multiple OFE budgets. This can be problematic when tuning to representative testing problems: an average performance measure may result in over-tuning. The risk can be greatly reduced through many-objective tuning.

  10. MOTA: many objective tuning algorithm
MOTA is designed to tune an algorithm to multiple performance criteria over multiple OFE budgets.
Contribution: this has not been done before...
Applications:
- tuning to a problem suite multi-objectively (lower risk of over-tuning)
- tuning multi-objective algorithms

  11. MOTA: many objective optimization
Many objective = 4 or more objectives. Pareto dominance alone is not enough: the Pareto front's size grows exponentially with the number of objectives. MOTA uses bi-objective decomposition, which is well suited for many-objective tuning under multiple OFE budgets:
- history information
- efficient Pareto operations

  12. MOTA: Numerical experiments
The conducted numerical experiments consisted of tuning NSGA-II [1] and MOEA/D [5]. MOTA's design is successful: it is efficient at many-objective tuning, having been built from the ground up as a many-objective tuning algorithm!

  13. Which algorithm to tune?
MOTA and tMOPSO are methods for tuning a selected algorithm. Question: which algorithm to tune?

  14. Benchmarking via tunability: Motivation
Normal practice: benchmark algorithms using default CPVs on standard test suite(s).
Issue: the CPV choices available to the practitioner are not incorporated. CPVs are important: they allow application to a vast range of objective function characteristics, constraints, and termination criteria (OFE budgets!).

  15. Benchmarking via tunability: Algorithms versus parameters
Deterministic search process: the basic building block. Sensitive to
- the objective function
- constraints
- termination criteria
Algorithm: a set of search processes unified by a central idea. CPVs determine which deterministic process is executed.

  16. Benchmarking via tunability: Description
Premise: an algorithm is well suited to a problem if it is easy to find CPVs resulting in favourable performance. Tuning effort must be incorporated!
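A toy gauge of that premise (an illustrative sketch, not the thesis's actual benchmarking procedure; `toy_run`, its CPV, and the sampling range are all assumptions made up for the example): sample CPVs at random and measure how often they yield favourable performance within a given OFE budget.

```python
import random

def toy_run(step, budget, rng):
    """Hypothetical optimizer being benchmarked: hill climbing on
    f(x) = x^2 with a single CPV, the perturbation size `step`."""
    x, best = 5.0, 25.0
    for _ in range(budget):
        cand = x + rng.uniform(-step, step)
        if cand * cand < best:
            x, best = cand, cand * cand
    return best

def tunability(target, budget, trials=200, seed=1):
    """Fraction of randomly sampled CPVs that reach `target` within
    `budget` OFEs. An algorithm that is easy to tune for the problem,
    i.e. well suited to it under the premise above, scores high."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        step = rng.uniform(0.01, 3.0)   # illustrative CPV sampling range
        hits += toy_run(step, budget, rng) <= target
    return hits / trials

print(tunability(target=0.1, budget=100))
```

Under this gauge, an algorithm can look strong with hand-picked default CPVs yet score poorly if only a narrow sliver of its CPV space performs well.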

  18. Benchmarking via tunability: Demonstration
Numerical experiments showed that benchmarking via tunability is an effective method. DE [3] versus EGO [2]:
- Default CPVs: EGO better at low OFE budgets, DE better at higher OFE budgets.
- Benchmarking via tunability: the OFE budget is largely insignificant; EGO is better for problems compatible with its surrogate meta-model.

  19. Conclusion
Select an algorithm and CPVs appropriate for the problem at hand, considering:
- objective function characteristics
- constraints imposed
- termination criteria (OFE budget)
Tuning to testing problems suspected of being representative can assist in this regard.
Contributions:
- tMOPSO, for tuning to a single criterion under multiple OFE budgets.
- MOTA, for tuning to multiple criteria under multiple OFE budgets.
- Benchmarking via tunability, to help select the algorithm to tune.

  20. Acknowledgements
Family, friends, Prof. Heyns and Prof. Kok, NRF, GNU/Linux, CHPC, and the giants upon whose shoulders we stand.
"The first principle is that you must not fool yourself, and you are the easiest person to fool."
Richard P. Feynman

  21. Bibliography
[1] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2):182-197, 2002.
[2] D. R. Jones, M. Schonlau, and W. J. Welch. Efficient global optimization of expensive black-box functions. Journal of Global Optimization, 13(4):455-492, 1998.
[3] R. Storn and K. Price. Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11(4):341-359, 1997.
[4] D. H. Wolpert and W. G. Macready. No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1):67-82, 1997.
[5] Q. Zhang and H. Li. MOEA/D: a multiobjective evolutionary algorithm based on decomposition. IEEE Transactions on Evolutionary Computation, 11(6):712-731, 2007.
