  1. Particle Swarm Optimization (PSO): Adaptive Swarms for Optimization
Christian Jacob, Department of Computer Science, University of Calgary
Emergent Computing — CPSC 565 — Winter 2003

  2. PSO: An Overview
• Developed by
  – Russ Eberhart, Purdue School of Engineering and Technology, Indianapolis
  – Jim Kennedy, Bureau of Labor Statistics, Washington, DC
• A concept for optimizing non-linear functions using particle swarm methodology
• Has roots in Artificial Life and Evolutionary Computation
• Simple concept
• Easy to implement
• Computationally efficient
• Effective on a wide variety of problems

  3. Evolution of Concept and Paradigms
• Discovered through simplified social model simulation
• Related to bird flocking, fish schooling, and swarming theory
• Related to evolutionary computation:
  – Genetic algorithms
  – Evolution strategies
• Kennedy developed the “cornfield vector” for birds seeking food
• Bird flock became swarm
• Expanded to multi-dimensional search
• Incorporated acceleration by distance
• Paradigm simplified

  4. Flocks, Herds, and Schools
• Separation: avoid collisions
• Alignment: match neighbours’ velocity and orientation
• Cohesion: steer toward the center

  5. PSO Algorithm
1. Initialize the population in hyperspace (stochastically assign locations and velocities).
2. Evaluate the fitness of individual particles.
3. Keep track of the location where each individual had its highest fitness.
4. Modify velocities based on the previous best and global (or neighbourhood) best positions (neighbourhoods don’t change).
5. Terminate if some condition is met.
6. Otherwise, go to step 2.
Fly solutions through problem space …
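The six steps above can be sketched as a minimal global-best PSO loop. This is an illustration, not code from the lecture: the sphere test function, the parameter values, and the initialization ranges are all assumptions.

```python
import random

def pso(fitness, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, vmax=1.0):
    """Minimal global-best PSO sketch (minimization); all parameters are illustrative."""
    # 1. Initialize population in hyperspace: stochastic positions and velocities.
    x = [[random.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    v = [[random.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]                      # 3. each particle's best position
    pbest_f = [fitness(xi) for xi in x]              # ... and its fitness
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]         # global best so far
    for _ in range(iters):                           # 5./6. loop until the budget runs out
        for i in range(n_particles):
            for d in range(dim):
                # 4. Velocity update toward personal and global best.
                v[i][d] = (w * v[i][d]
                           + c1 * random.random() * (pbest[i][d] - x[i][d])
                           + c2 * random.random() * (gbest[d] - x[i][d]))
                v[i][d] = max(-vmax, min(vmax, v[i][d]))   # clamp to V_max
                x[i][d] += v[i][d]                         # fly through problem space
            # 2./3. Evaluate fitness and track bests.
            f = fitness(x[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = x[i][:], f
    return gbest, gbest_f

# Usage: minimize the 2-D sphere function, whose optimum is 0 at the origin.
best, best_f = pso(lambda p: sum(u * u for u in p))
```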

  6. Particle Swarm Optimization: DEMO

  7. Particle Swarms
• Sociocognitive space:
  – High-dimensional
  – Abstract: attitudes, behaviours, cognition
  – Heterogeneous with respect to evaluation (dissonance)
  – Multiple individuals
• An individual is characterized by:
  – Position = “mental state”: x_i
  – Changes = velocity: v_i

  8. Particle Swarms: “Code”
• Individuals (particles) learn from their own experience:
  – v_i := v_i + φ()·(p_i − x_i)
  – x_i := x_i + v_i
• x_i: current position of individual i
• v_i: current velocity of individual i
• p_i: so-far best position of individual i
• (p_i − x_i): acceleration towards the previous best
• φ(): generates a random positive number
• This formula, iterated over time, causes the individual’s trajectory to oscillate around its previous best point p_i in sociocognitive space.
• The velocity of individual i is stochastically adjusted depending on previous successes.
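As a sketch, one iteration of this "cognition-only" rule in Python. The function name and the range of the random factor are illustrative assumptions; here φ() is drawn uniformly from (0, 2).

```python
import random

def cognition_only_step(x_i, v_i, p_i, phi_max=2.0):
    """One update of the rule v_i := v_i + phi()*(p_i - x_i); x_i := x_i + v_i."""
    phi = random.uniform(0.0, phi_max)   # phi(): random positive number (range assumed)
    v_i = v_i + phi * (p_i - x_i)        # accelerate toward the previous best
    x_i = x_i + v_i
    return x_i, v_i

# Iterate from the origin with previous best p_i = 3.0; the trajectory
# oscillates around p_i and can grow without some limiting mechanism.
x, v = 0.0, 0.0
for _ in range(20):
    x, v = cognition_only_step(x, v, p_i=3.0)
```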

  9. Particle Swarms with Neighbourhood Best
• Sociocognitive space can contain many individuals that influence one another:
  – v_i := v_i + φ_1()·(p_i − x_i) + φ_2()·(p_g − x_i)
  – x_i := x_i + v_i
• p_g: previous best position in the population
• (p_i − x_i): acceleration towards the personal previous best
• (p_g − x_i): acceleration towards the global best
• φ_1(), φ_2(): generate random positive numbers
• Evaluate your present position.
• Compare it to your previous best and neighbourhood best.
• Imitate self and others.
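A neighbourhood-best ("lbest") swarm needs a way to pick the best position among a particle's neighbours. The slide does not fix a topology; a common choice in the PSO literature is a ring, sketched here under the assumption that fitness is minimized.

```python
def ring_neighbourhood_best(pbest_f, i, k=1):
    """Index of the best personal-best fitness among particle i's ring
    neighbours (indices i-k .. i+k, wrapping around the population).
    The ring topology and radius k are illustrative choices."""
    n = len(pbest_f)
    idxs = [(i + d) % n for d in range(-k, k + 1)]      # i and its k neighbours each side
    return min(idxs, key=lambda j: pbest_f[j])          # best = lowest fitness

# Usage: with personal-best fitnesses [5, 1, 4, 0, 3], particle 0's
# neighbourhood {4, 0, 1} is led by particle 1.
leader = ring_neighbourhood_best([5, 1, 4, 0, 3], i=0)
```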

  10. General PSO Update Algorithm
• Global version:
  – v_id := w·v_id + c_1·φ_1()·(p_id − x_id) + c_2·φ_2()·(p_gd − x_id)
  – x_id := x_id + v_id
• d: dimension index
• c_1, c_2: positive constants that set exploration vs. exploitation
• w: inertia weight
• φ_1(), φ_2(): generate random positive numbers
• For the neighbourhood version: change p_gd to p_ld.
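The general update for a single dimension d of particle i can be written as a small helper. The defaults w = 0.729 and c1 = c2 = 1.494 are common values from the PSO literature, not from the slide.

```python
import random

def velocity_update(v_id, x_id, p_id, p_gd, w=0.729, c1=1.494, c2=1.494):
    """One-dimensional global-version PSO update (parameter values illustrative)."""
    v_id = (w * v_id
            + c1 * random.random() * (p_id - x_id)    # cognitive term: own best
            + c2 * random.random() * (p_gd - x_id))   # social term: global best
    x_id = x_id + v_id
    return x_id, v_id
```

For the neighbourhood version, pass the neighbourhood best p_ld in place of p_gd; the arithmetic is unchanged.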

  11. The “Drunkard’s Walk”
• A particle will explode out of control if its velocity is not limited in some way. Three methods are widely used:
• V_max:
  – v_i := v_i + φ_1()·(p_i − x_i) + φ_2()·(p_g − x_i)
  – if v_i > V_max then v_i := V_max
  – else if v_i < −V_max then v_i := −V_max
• Inertia weight a:
  – v_i := a·v_i + φ_1()·(p_i − x_i) + φ_2()·(p_g − x_i)
• Constriction coefficient χ:
  – v_i := χ·(v_i + φ_1()·(p_i − x_i) + φ_2()·(p_g − x_i))
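Each of the three limiting methods is a one-line helper. The function names are illustrative, and r1, r2 stand for the random positive draws in the update rules above.

```python
def clamp_vmax(v, vmax):
    """Method 1: clamp a velocity component to the interval [-V_max, V_max]."""
    return max(-vmax, min(vmax, v))

def inertia_step(v, x, p_i, p_g, a, r1, r2):
    """Method 2: inertia weight a damps only the old velocity."""
    return a * v + r1 * (p_i - x) + r2 * (p_g - x)

def constriction_step(v, x, p_i, p_g, chi, r1, r2):
    """Method 3: constriction coefficient chi scales the entire update."""
    return chi * (v + r1 * (p_i - x) + r2 * (p_g - x))
```

Note the difference between methods 2 and 3: the inertia weight leaves the attraction terms untouched, while the constriction coefficient shrinks the whole new velocity.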

  12. Important Parameters: V_max
• An important parameter in PSO; typically the only one adjusted
• Clamps particles’ velocities on each dimension
• Determines the fineness with which regions are searched:
  – If too high, particles can fly past optimal solutions
  – If too low, particles can get stuck in local minima
• Set V_max to the dynamic range of the variables.

  13. Important Parameters: Inertia Weight a
• Inertia weight a: v_i := a·v_i + φ_1()·(p_i − x_i) + φ_2()·(p_g − x_i)
• It seems possible that one can get rid of tuning V_max by setting it equal to the dynamic range of each variable and relying on a instead.
• Then a must be selected carefully and/or decreased over the run.
• Hence, the inertia weight a seems to have attributes of the temperature in simulated annealing.
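A common way to "decrease a over the run" is a linear schedule. The endpoints 0.9 and 0.4 are a convention from the PSO literature, not from the slide; this is the annealing-like behaviour the last bullet alludes to.

```python
def linear_inertia(t, t_max, a_start=0.9, a_end=0.4):
    """Inertia weight at step t of t_max, decreased linearly from a_start
    to a_end (endpoint values are common literature choices)."""
    return a_start - (a_start - a_end) * t / t_max

# Usage: early steps favour exploration (large a), late steps exploitation.
```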

  14. Evolutionary Computation & Particle Swarms
• Culture as evolution (anthropology)
• Adaptation / learning
• Memetics
• Evolutionary epistemology
• Change vs. selection
• Fitness and dissonance
• Cooperation vs. competition:
  – Evolution = competitive struggle
  – PS = cooperation inherent
• PS = 5th EC paradigm?

  15. Basic Principles of Swarm Intelligence
• Proximity principle: the population should be able to carry out simple space and time computations.
• Quality principle: the population should be able to respond to quality factors in the environment.
• Diverse response principle: the population should not commit its activities along excessively narrow channels.
• Stability principle: the population should not change its mode of behaviour every time the environment changes.
• Adaptability principle: the population must be able to change its behaviour mode when it’s worth the computational price.

  16. Adherence to Swarm Intelligence Principles
• Proximity: n-dimensional space calculations carried out over a series of time steps
• Quality: the population responds to the quality factors p_best and g_best (or l_best)
• Diverse response: responses are allocated between p_best and g_best (or l_best)
• Stability: the population changes state only when g_best (or l_best) changes
• Adaptability: the population does change state when g_best (or l_best) changes

  17. Enhancements of PSO
• The elitist concept from GAs might be helpful in PSO.
  – Carry the global best particle into the next generation?
• Could incorporate a Gaussian distribution into the stochastic velocity changes.
  – The variance might then act like the inertia weight.
  – Put noise on the decrease of inertia weights (better convergence).
• Could assign V_max on a parameter-by-parameter basis.
  – Analogous to controlling the severity of mutation in GAs & EP.

  18. GAs vs. PSO: Crossover
• PSO does not have crossover.
• Acceleration toward the personal and global best is a similar concept.
• Particles midway between swarms also exhibit crossover features.
• The recombination operator in evolution strategies may be more analogous.
