
DM63 – Heuristics for Combinatorial Optimization
Lecture 10: Evolutionary Algorithms; Swarm Intelligence and Ant Colony Optimization
Marco Chiarandini

Outline
1. Evolutionary Algorithms
2. Swarm Intelligence and Ant Colony Optimization
   Background
   Ant Colony Optimization: the Metaheuristic


Solution representation

◮ neat separation between the solution encoding or representation (genotype) and the actual variables (phenotype)
◮ a solution s ∈ S is represented by a string: the genotype set is made of strings of length l whose elements are symbols from an alphabet A, such that there exists a map c : A^l → S
◮ the elements of the strings are the genes; the values the elements can take are the alleles
◮ the search space is then X ⊆ A^l
◮ if the strings are members of a population they are called chromosomes, and their recombination is called crossover
◮ strings are evaluated by f(c(x)) = g(x), which gives them a fitness (a small sketch of this mapping follows below)
⇒ binary representation is appealing but not always good (e.g., in constrained problems binary crossovers might not be good)
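As an illustration (not from the original slides), here is a minimal Python sketch of the genotype–phenotype separation just described. The toy subset-selection objective, the item values, and all names are assumptions made for the example:

    import random

    ALPHABET = (0, 1)                       # the alphabet A
    L = 8                                   # string length l
    ITEM_VALUES = [4, 7, 1, 9, 3, 5, 8, 2]  # toy problem data (assumed)

    def decode(genotype):
        # the map c : A^l -> S; here a bit string selects a subset of items
        return [i for i, gene in enumerate(genotype) if gene == 1]

    def fitness(genotype):
        # f(c(x)) = g(x): evaluate the decoded phenotype
        return sum(ITEM_VALUES[i] for i in decode(genotype))

    chromosome = [random.choice(ALPHABET) for _ in range(L)]  # one genotype
    print(chromosome, decode(chromosome), fitness(chromosome))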

Conjectures on the goodness of EAs

◮ schema: a subset of A^l where strings have a set of variables fixed. Ex.: 1**1
◮ exploit the intrinsic parallelism of schemata
◮ Schema Theorem: E[N(S, t+1)] ≥ (F(S, t) / F̄(t)) · N(S, t) · [1 − ε(S, t)], where F̄(t) is the average fitness of the population and ε(S, t) is the probability that crossover or mutation disrupts the schema S
◮ "a method for solving all problems" ⇒ disproved by the No Free Lunch Theorems
◮ building block hypothesis

Initial Population

◮ Which size? Trade-off.
◮ Minimum size: connectivity by recombination is achieved if at least one instance of every allele is guaranteed to be present at each locus. Ex., if binary: P*_2 = (1 − 0.5^(M−1))^l; for l = 50, a population of M = 17 is sufficient to guarantee P*_2 > 99.9%.
◮ Often: independent, uninformed random picking from the given search space.
◮ Attempt to cover the search space as well as possible, e.g., Latin hypercube sampling.
◮ But: can also use multiple runs of a construction heuristic.

Selection

Main idea: selection should be related to fitness.
◮ Fitness-proportionate selection (roulette-wheel method): p_i = f_i / Σ_j f_j
◮ Tournament selection: a set of chromosomes is chosen and compared, and the best chromosome is selected.
◮ Rank-based selection and selection pressure.
(Both the roulette-wheel and the tournament scheme are sketched below.)
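A hedged Python sketch of the two selection schemes named above. It assumes fitness values are non-negative with a positive sum and that higher is better; the function names are my own:

    import random

    def roulette_wheel(population, fitnesses):
        # fitness-proportionate selection: p_i = f_i / sum_j f_j
        # assumes all f_j >= 0 and sum(fitnesses) > 0
        total = sum(fitnesses)
        r = random.uniform(0, total)
        acc = 0.0
        for individual, f in zip(population, fitnesses):
            acc += f
            if acc >= r:
                return individual
        return population[-1]  # guard against floating-point round-off

    def tournament(population, fitnesses, k=3):
        # sample k chromosomes (k <= population size), return the best one
        contenders = random.sample(range(len(population)), k)
        return population[max(contenders, key=lambda i: fitnesses[i])]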

Recombination (Crossover)

◮ Binary or assignment representations:
  ◮ one-point, two-point, m-point crossover (positional bias rather than distributional bias)
  ◮ uniform crossover (through a mask controlled by a Bernoulli parameter p)
◮ Non-linear representations:
  ◮ (permutations) partially mapped crossover
  ◮ (permutations) mask-based crossover
◮ More commonly, ad hoc crossovers are used, as this appears to be a crucial feature of success.
◮ Two offspring are generally generated.
◮ The crossover rate controls the application of crossover. It may be adaptive: high at the start and low near convergence.
(One-point and uniform crossover are sketched at the end of this slide group.)

Mutation

◮ Goal: introduce relatively small perturbations in the candidate solutions of the current population + the offspring obtained from recombination.
◮ Typically, perturbations are applied stochastically and independently to each candidate solution; the amount of perturbation is controlled by the mutation rate.
◮ The mutation rate controls the application of bit-wise mutations. It may be adaptive: low at the start and high near convergence.
◮ Possible implementation through a Poisson variable which determines the number m of genes that change allele (sketched below).
◮ Can also use a subsidiary selection function to determine the subset of candidate solutions to which mutation is applied.
◮ The role of mutation (as compared to recombination) in high-performance evolutionary algorithms has often been underestimated.

Subsidiary perturbative search

◮ Often useful and necessary for obtaining high-quality candidate solutions.
◮ Typically consists of selecting some or all individuals in the given population and applying an iterative improvement procedure to each element of this set independently.
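A minimal sketch of the two binary crossovers named above, for genotypes given as lists of bits; names and the default p = 0.5 are assumptions:

    import random

    def one_point_crossover(parent1, parent2):
        # cut both parents at one random interior point and swap the tails
        cut = random.randrange(1, len(parent1))
        return parent1[:cut] + parent2[cut:], parent2[:cut] + parent1[cut:]

    def uniform_crossover(parent1, parent2, p=0.5):
        # a per-gene Bernoulli(p) mask decides which parent contributes each gene;
        # two complementary offspring are produced
        mask = [random.random() < p for _ in parent1]
        child1 = [a if m else b for m, a, b in zip(mask, parent1, parent2)]
        child2 = [b if m else a for m, a, b in zip(mask, parent1, parent2)]
        return child1, child2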

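A sketch of the mutation operator for binary genotypes. The first version flips each gene with an independent Bernoulli trial; the second follows the slide's suggestion of drawing the number of mutated genes from a Poisson variable (here via numpy, which is my choice, as is the default rate):

    import random
    import numpy as np

    _rng = np.random.default_rng()

    def mutate_bitwise(genotype, rate=0.01):
        # flip each gene independently with probability `rate`
        return [1 - gene if random.random() < rate else gene for gene in genotype]

    def mutate_poisson(genotype, rate=0.01):
        # draw m ~ Poisson(rate * l), then flip m randomly chosen genes
        child = list(genotype)
        m = min(int(_rng.poisson(rate * len(child))), len(child))
        for i in random.sample(range(len(child)), m):
            child[i] = 1 - child[i]
        return child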
New Population

◮ Determines the population for the next cycle (generation) of the algorithm by selecting individual candidate solutions from the current population + the new candidate solutions obtained from recombination, mutation (+ subsidiary perturbative search): the (µ, λ) and (µ + λ) schemes (both sketched after this slide group).
◮ Goal: obtain a population of high-quality solutions while maintaining population diversity.
◮ Selection is based on the evaluation function (fitness) of candidate solutions, such that better candidate solutions have a higher chance of 'surviving' the selection process.
◮ It is often beneficial to use elitist selection strategies, which ensure that the best candidate solutions are always selected.
◮ Most commonly used: steady state, in which only one new chromosome is generated at each iteration.
◮ Diversity is checked and duplicates are avoided.

Example: a memetic algorithm for the TSP

◮ Search space: the set of Hamiltonian cycles. Note: tours can be represented as permutations of vertex indexes.
◮ Initialization: by a randomized greedy heuristic (a partial tour of n/4 vertices is constructed randomly before completing it greedily).
◮ Recombination: greedy recombination operator GX applied to n/2 pairs of tours chosen randomly:
  1) copy common edges (param. p_e)
  2) add new short edges (param. p_n)
  3) copy edges from the parents ordered by increasing length (param. p_c)
  4) complete using the randomized greedy heuristic.
◮ Subsidiary perturbative search: an LK (Lin–Kernighan) variant.
◮ Mutation: apply a double-bridge move to tours chosen uniformly at random (see the sketch below).
◮ Selection: selects the µ best tours from the current population of µ + λ tours (= a simple elitist selection mechanism).
◮ Restart operator: applied whenever the average bond distance in the population falls below 10.

Types of evolutionary algorithms

◮ Genetic Algorithms (GAs) [Holland, 1975; Goldberg, 1989]:
  ◮ have been applied to a very broad range of (mostly discrete) combinatorial problems;
  ◮ often encode candidate solutions as bit strings of fixed length, which is now known to be disadvantageous for combinatorial problems such as the TSP.
◮ Evolution Strategies [Rechenberg, 1973; Schwefel, 1981]:
  ◮ originally developed for (continuous) numerical optimization problems;
  ◮ operate on more natural representations of candidate solutions;
  ◮ use self-adaptation of the perturbation strength achieved by mutation;
  ◮ typically use elitist deterministic selection.
◮ Evolutionary Programming [Fogel et al., 1966]:
  ◮ similar to Evolution Strategies (developed independently), but typically does not make use of recombination and uses stochastic selection based on tournament mechanisms;
  ◮ often seeks to adapt the program to the problem rather than the solutions.
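A sketch of the two replacement schemes and of the double-bridge move named above. It assumes a fitness function where higher is better (for the TSP one would pass, e.g., the negated tour length) and tours of at least four vertices; all names are mine:

    import random

    def select_comma(offspring, fitness_fn, mu):
        # (mu, lambda): the next generation is the mu best offspring only
        return sorted(offspring, key=fitness_fn, reverse=True)[:mu]

    def select_plus(parents, offspring, fitness_fn, mu):
        # (mu + lambda): parents compete with offspring -- an elitist scheme
        return sorted(parents + offspring, key=fitness_fn, reverse=True)[:mu]

    def double_bridge(tour):
        # cut the tour into segments A|B|C|D at three random interior points
        # and reconnect them as A-C-B-D: the classic 4-opt perturbation
        i, j, k = sorted(random.sample(range(1, len(tour)), 3))
        return tour[:i] + tour[j:k] + tour[i:j] + tour[k:]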

Theoretical studies

◮ Through Markov-chain modelling, some versions of evolutionary algorithms can be shown to converge with probability 1 to the best possible solutions in the limit [Fogel, 1992; Rudolph, 1994].
◮ Convergence rates are known on mathematically tractable functions or via local approximations [Bäck and Hoffmeister, 2004; Beyer, 2001].
◮ "No Free Lunch Theorem" [Wolpert and Macready, 1997]: on average, within some assumptions, blind random search is as good at finding the minimum of all functions as hill climbing.

However:
◮ These theoretical findings are not very practical.
◮ EAs are made to produce useful solutions rather than perfect solutions.

Research Goals

◮ Analyzing classes of optimization problems and determining the best components for evolutionary algorithms.
◮ Applying evolutionary algorithms to problems that are dynamically changing.
◮ Gaining theoretical insights for the choice of components.

Outline

1. Evolutionary Algorithms
2. Swarm Intelligence and Ant Colony Optimization
   Background
   Ant Colony Optimization: the Metaheuristic
