

  1. Genetic Algorithms for Optimization of Noisy Fitness Functions and Adaptation to Changing Environments
     Hajime Kita (1) and Yasuhito Sano (2)
     (1) Kyoto University, (2) Nissan Motor Co. Ltd.
     SMAPIP, July 2003

  2. Outline of Talk
     • Optimization of Uncertain Functions
       – Problems
       – Applications
     • Approach: Genetic Algorithms
     • Genetic Algorithms for Noisy Fitness Functions
       – Memory-based Fitness Estimation GA
     • Genetic Algorithms for Adaptation to Changing Environments
       – Genetic Algorithm using Sub-Population

  3. Optimization of Uncertain Functions
     Problems
     • Optimization problem having uncertain parameters δ:
         min_x f(x, δ)
     • Optimization of a noisy fitness function:
         min_x ⟨f(x) + δ⟩_δ
     • Adaptation to changing environments:
         min_{x_t} f(x_t, δ_t)  for t = 1, 2, ...
       where δ_t is unknown, or predictable only to some extent, when deciding x_t.

  4. Applications
     Online adaptation of systems working in the real world:
     • Systems that are difficult to simulate precisely.
     • Optimization through experiments.
     • Online adaptation during use.
     Simulation-based optimization of large and complex systems:
     • Optimization through simulation using random numbers.
     • Random fluctuation of the observed performance.
     • Ex.: traffic, communication and production systems.

  5. Challenges
     • The problems themselves are more difficult than ordinary optimization problems.
     • The available number of fitness evaluations is severely restricted in practical applications.
     • Trade-off between
       – 'to know more about the system (for good estimation)' and
       – 'to behave better in the system (for good optimization)'.

  6. Example (Noisy Fitness, Online Adaptation): Engine Control for a Motorcycle
     • Task: improve the response of the engine by dynamic control of the air-fuel ratio.
       – Acceleration operation of the throttle and the time lag of fuel injection cause a power drop of the engine.
       – Dynamic compensation of fuel injection for acceleration.
     • Challenges:
       – Difficulty in constructing a precise simulator.
       – Large noise in the observation of acceleration.
       – Optimization within 1000 evaluations.

  7. [Figures] (a) Intake tube of an engine (throttle plate, fuel injector, air and fuel flow). (b) Evaluation system: mean-value engine model (MVEM; engine speed, throttle angle) combined with a neural network and a limiter. (c) Simulation result: best fitness over 2000 evaluations, comparing the noiseless GA, MFEGA, Sample-3 GA, Sample-10 GA and the standard GA.

  8. Example (Noisy Fitness, Simulation-based Optimization): Multi-Car Elevator Control
     • Multi-car elevator: an elevator system having several cars in a single elevator shaft, driven by linear motors.
     • The applicability of conventional elevator group control is limited.
     • Task: design the controller through simulation-based optimization.
     • Challenges:
       – The discrete event simulation uses random numbers.
       – The observable variables are imperfect.
       – A single simulation run takes about 30 seconds.

  9. [Figure] Multi-car elevator system: hall operation panels, zones 1 and 2 with up and down cars, terminal floor, garage floor and escape.

  10. [Figures] Performance histograms for (a) MFEGA, (b) Sample-5 GA and (c) the standard GA, and (d) performance over generations on the 11-dim/22-dim problem (22-dim runs with seeds 27, 37 and 47).

  11. Example (Changing Environment): Boat Engine Control
      • Operation of a motor boat: steering and engine throttle.
      • The dynamics of the boat change largely depending on steering.
      • Task: achieve a steady velocity during steering by engine control.
      • Challenges:
        – Detecting the driver's intention to steer.
        – Optimization of engine control in several operation modes (go straight, turn, do slalom).
        – Limited number of evaluations.

  12. Control scheme in a changing environment:
      START → Prior estimation of the environment → Choice of a control scheme from the population → Adoption of the control scheme → Operation of the system → Evaluation of the performance → Posterior estimation of the environment → Update of the population of control schemes → (repeat)

  13. [Figures] (a) Model of the environment (straight, turn, slalom). (b) Sample data: estimated environment vs. true environment over time. (c) Result of GASP: average fitness f_t over the artificial evaluations, compared with the optima of environments E1, E2 and E3.

  14. Approach: Genetic Algorithms
      Genetic algorithms (GAs): optimization, adaptation and learning algorithms inspired by the natural-selection theory of evolution.
      • Generate an initial population of solution candidates.
      • Repeat:
        – Generate new individuals using crossover and mutation operations.
        – Evaluate the individuals by the fitness function.
        – Select good individuals as survivors.
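The loop above can be sketched as follows; this is a minimal illustration, not the talk's algorithm. The `sphere` fitness, the uniform crossover, the Gaussian mutation width and the truncation selection are all illustrative assumptions.

```python
import random

def sphere(x):
    # Illustrative fitness to minimize: sum of squares (optimum at 0).
    return sum(v * v for v in x)

def simple_ga(fitness, dim=3, pop_size=20, generations=50, seed=0):
    rng = random.Random(seed)
    # Generate an initial population of solution candidates.
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        children = []
        for _ in range(pop_size):
            p1, p2 = rng.sample(pop, 2)
            # Crossover: each gene taken from one of the two parents.
            child = [rng.choice(pair) for pair in zip(p1, p2)]
            # Mutation: small Gaussian perturbation of each gene.
            child = [v + rng.gauss(0, 0.1) for v in child]
            children.append(child)
        # Evaluate by the fitness function and keep good individuals as survivors.
        pop = sorted(pop + children, key=fitness)[:pop_size]
    return pop[0]

best = simple_ga(sphere)
```

With truncation selection on this smooth fitness, the best individual approaches the optimum within a few dozen generations.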

  15. Advantages of GAs
      • A direct optimization method that uses only fitness function values,
      • stochastic search for global optimization, and
      • the robustness of population-based search.

  16. GAs for Noisy Fitness Functions
      Problem:
          min_x ⟨F(x)⟩_δ,  F(x) = f(x) + δ
      where x is a continuous decision variable, F(x) the observed fitness value, f(x) the true fitness function, and δ additive noise with ⟨δ⟩_δ = 0.

  17. GA Approaches to Noisy Fitness
      Application of conventional GAs:
      • Uses the self-averaging nature of population-based search [Tsutsui, Ghosh 1997].
      • Requires large population sizes and takes a long time to converge.
      GA with multiple sampling:
      • Samples the fitness value several times for each individual and uses the mean [Fitzpatrick, Grefenstette 1988].
      • Reduces the variance without specific assumptions on the fitness function.
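The multiple-sampling idea can be sketched as below; the quadratic true fitness, the noise level and the function names are illustrative assumptions, not from the talk. Averaging n observations cuts the noise variance from σ² to σ²/n, at n times the evaluation cost.

```python
import random

rng = random.Random(1)

def noisy_fitness(x, sigma=1.0):
    # Observation F(x) = f(x) + delta, with illustrative true f(x) = x^2
    # and additive zero-mean Gaussian noise delta.
    return x * x + rng.gauss(0, sigma)

def sampled_fitness(x, n_samples=10):
    # Multiple sampling: evaluate the same individual n times and use
    # the mean, reducing the noise variance by a factor of n.
    return sum(noisy_fitness(x) for _ in range(n_samples)) / n_samples
```

For example, with 1000 samples at x = 2 the estimate lies very close to the true value f(2) = 4.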

  18. • Improvements by adjusting and restricting the sampling [Branke 1998, 1998].
      • Still requires a large number of fitness evaluations.
      Referring to fitness values of other individuals:
      • Evaluate an individual using the fitness values of nearby individuals.
      • Requires some assumption (model) of the fitness function.
      • Using the fitness values of parents [Tamaki, Arai 1997; Tanooka et al. 1999]: systematic error due to selection.
      • Using nearby individuals [Branke 1998].
      • Memory-based Fitness Estimation GA (MFEGA) [Sano, Kita 2000].

  19. Memory-based Fitness Estimation GA
      Concept of MFEGA:
      • Aim: use the fitness values sampled during the search as far as possible.
      • Store the sampled fitness values in memory as a search history.
      • Introduce a simple stochastic model of the fitness values for estimation.
      • Estimate the fitness values at points of interest using the history, for the selection operation of the GA.

  20. A Stochastic Model of Fitness Functions
      • The fitness values of individuals h are distributed randomly around the fitness value at the point of interest x.
      • The variance of the fitness value depends only on the distance d from the point of interest x:
          f(h) ~ N(f(x), k d)
          δ ~ N(0, σ²)
          F(h) = f(h) + δ ~ N(f(x), k d + σ²)
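The model above is easy to check empirically; a minimal sketch, with illustrative parameter values k = 2 and σ² = 0.5 (the function name and constants are assumptions):

```python
import math
import random

rng = random.Random(7)
k, sigma2 = 2.0, 0.5  # illustrative model parameters

def draw_F(fx, d):
    # Draw one observed fitness F(h) at distance d from the point of
    # interest, per the model: F(h) ~ N(f(x), k*d + sigma^2).
    return rng.gauss(fx, math.sqrt(k * d + sigma2))

# At distance d = 1 the model predicts mean f(x) and variance k + sigma^2 = 2.5.
samples = [draw_F(10.0, 1.0) for _ in range(20000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```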

  21. [Figure] Stochastic model of noisy fitness: the observation F(h) of an individual h at distance d from x scatters around F(x) with variance kd + σ².

  22. Estimation of Fitness
      ML estimation:
          f̃(x) = [ F(x) + Σ_{l=2}^{H} F(h_l) / (k′ d_l + 1) ] / [ 1 + Σ_{l=2}^{H} 1 / (k′ d_l + 1) ]
      where h_l, l = 1, ..., H are the H individuals in the search history and k′ = k/σ².
      Estimation of k′: ML estimation, assuming f(x) to be the mean of the fitness values of the solutions near the best solution found.
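This estimator is an inverse-variance weighted average: under the model, an observation at distance d has variance k d + σ², so its weight relative to the direct observation F(x) is 1/(k′d + 1). A minimal sketch (the function name and the (distance, fitness) history representation are assumptions for illustration):

```python
def mfe_estimate(F_x, history, k_prime):
    """Memory-based fitness estimate at a point x.

    F_x     : observed fitness at x itself
    history : list of (d_l, F_l) pairs, the distance to and observed
              fitness of each stored individual in the search history
    k_prime : k / sigma^2, the ratio of the model parameters
    """
    # Weight each history entry by 1 / (k' * d + 1), i.e. inversely to
    # its model variance relative to the observation at x.
    num = F_x + sum(F_l / (k_prime * d + 1.0) for d, F_l in history)
    den = 1.0 + sum(1.0 / (k_prime * d + 1.0) for d, F_l in history)
    return num / den
```

Sanity checks: an empty history returns F(x) unchanged; an entry at distance 0 is averaged with F(x) at equal weight; larger distances (or larger k′) shrink an entry's influence.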

  23. Prototype Algorithm of the MFEGA
      Use UNDX [Ono, Kobayashi 1997] for crossover; no mutation.
      (Initialization)
      1. Initialize the population of M individuals x_1, ..., x_M randomly.
      2. Let the evaluation counter e = 0. Set the maximum number of evaluations to E.
      3. Let the history H = ∅.
      (Main loop)
      4. Choose x_p1 and x_p2 from the population as parents.
      5. Produce x_c1, ..., x_cC by applying the crossover to the parents.
      6. Let y_1 = x_p1, y_2 = x_p2 and y_{i+2} = x_ci, i = 1, ..., C. Call F = {y_1, ..., y_{C+2}} a family.
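Steps 1-6 can be sketched as below. A simple blend crossover stands in for UNDX (whose details are not given on the slide), and the population size, child count and search range are illustrative assumptions.

```python
import random

def init_and_family(M=10, C=4, dim=2, seed=3):
    # Steps 1-6 of the prototype MFEGA; blend crossover replaces UNDX.
    rng = random.Random(seed)
    # 1. Initialize the population of M individuals randomly.
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(M)]
    # 2. Evaluation counter e = 0, maximum number of evaluations E.
    e, E = 0, 1000
    # 3. Empty search history.
    history = []
    # 4. Choose two parents from the population.
    p1, p2 = rng.sample(pop, 2)
    # 5. Produce C children by crossover (blend of the two parents).
    children = []
    for _ in range(C):
        alpha = rng.random()
        children.append([alpha * a + (1 - alpha) * b for a, b in zip(p1, p2)])
    # 6. The family: both parents plus the C children.
    family = [p1, p2] + children
    return pop, history, family

pop, history, family = init_and_family()
```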
