SWARM INTELLIGENCE
Milad Abolhassani
Supervisor: Hamid Mir Vaziri
WHY?
WHAT KIND OF PROBLEMS?
Optimization
Modeling
Simulation
OPTIMIZATION
MODELING
SIMULATION
OPTIMIZATION
EXHAUSTIVE SEARCH
OTHER METHODS
Analytical
Uninformed
Informed
METAHEURISTIC
LOCAL & GLOBAL OPTIMUM
COMPLEX SPACES
EXPLORATION
EXPLOITATION
CATEGORIES
SWARM
GOAL
To model their simple behaviors in order to learn about more complex behaviors.
SIGN-BASED ALGORITHMS
STEPS
1. Initialize the memory
2. Generate a solution
3. Calculate the fitness of the generated solution
4. Continue for the whole population
5. Update the signs memory
6. Repeat until the stop condition is met
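The six steps above can be sketched as a loop over a shared "signs" memory. This is a minimal toy example (not from the slides): the memory holds one probability per bit, solutions are bit strings, and fitness is the number of ones; the function name and all parameters are hypothetical.

```python
import random

def sign_based_search(n_bits=12, pop_size=20, iters=40, evap=0.1, seed=0):
    """Toy sign-based search maximizing the number of 1-bits."""
    rng = random.Random(seed)
    memory = [0.5] * n_bits              # 1. initialize the signs memory
    best, best_fit = None, -1
    for _ in range(iters):               # 6. repeat until stop condition
        solutions = []
        for _ in range(pop_size):        # 4. continue for the whole population
            # 2. generate a solution guided by the memory
            sol = [1 if rng.random() < memory[i] else 0 for i in range(n_bits)]
            fit = sum(sol)               # 3. calculate its fitness
            solutions.append((fit, sol))
            if fit > best_fit:
                best_fit, best = fit, sol
        # 5. update the signs memory: evaporate, then reinforce
        #    with the iteration's best solution
        it_best = max(solutions)[1]
        for i in range(n_bits):
            memory[i] = (1 - evap) * memory[i] + evap * it_best[i]
    return best, best_fit

best, fit = sign_based_search()
```

The memory plays the role of the "signs" (e.g. pheromone trails in ACO): good solutions bias where the next generation searches.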
ACO (Ant Colony Optimization)
Marco Dorigo (1992)
Finding good paths through graphs
HOW DOES IT WORK?
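The "how it works" walkthrough can be condensed into a small Ant System for a tiny TSP instance. This is an illustrative sketch, not the slides' implementation; the function name, parameters, and the 4-city distance matrix are all hypothetical.

```python
import random

def aco_tsp(dist, n_ants=10, iters=50, evap=0.5, q=1.0,
            alpha=1.0, beta=2.0, seed=4):
    """Minimal Ant System for the TSP on a distance matrix."""
    n = len(dist)
    rng = random.Random(seed)
    tau = [[1.0] * n for _ in range(n)]       # pheromone on each edge
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(n_ants):
            tour = [rng.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                cand = [j for j in range(n) if j not in tour]
                # edge probability ~ pheromone^alpha * (1/distance)^beta
                wts = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                       for j in cand]
                tour.append(rng.choices(cand, weights=wts)[0])
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((length, tour))
            if length < best_len:
                best_len, best_tour = length, tour
        for i in range(n):                    # evaporate pheromone everywhere
            for j in range(n):
                tau[i][j] *= (1 - evap)
        for length, tour in tours:            # deposit pheromone on used edges
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += q / length
                tau[j][i] += q / length
    return best_tour, best_len

# Tiny symmetric 4-city instance (hypothetical distances)
D = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
tour, length = aco_tsp(D)
```

Evaporation keeps old trails from dominating, while deposits proportional to 1/length reinforce shorter tours — the mechanism behind the advantages listed on the next slide.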
ACO ADVANTAGES
Searches with a population in parallel
Can rapidly discover good solutions
Can adapt to changes in the graph
ACO DISADVANTAGES
Prone to stagnation
Premature convergence
Uncertain convergence time
Long calculation time
Solutions might be far from the optimum
IMITATION-BASED ALGORITHMS
STEPS
1. Initialize the parameters
2. Initialize the population
3. Move the particles
4. Calculate the fitness
5. Update the particles' memories
6. Repeat until the stop condition is met
PSO (Particle Swarm Optimization)
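The imitation-based steps map directly onto a minimal PSO. This is a sketch, not the slides' code: the function name, the inertia/attraction coefficients (w, c1, c2), and the sphere test function are assumptions.

```python
import random

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5,
        lo=-5.0, hi=5.0, seed=1):
    """Minimal PSO minimizing f over [lo, hi]^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]                 # 2. init population
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                         # each particle's memory
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]        # the swarm's memory
    for _ in range(iters):                              # 6. repeat
        for i in range(n_particles):
            for d in range(dim):                        # 3. move the particle
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])                             # 4. fitness
            if val < pbest_val[i]:                      # 5. update memories
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Sphere function: global minimum 0 at the origin
best, val = pso(lambda x: sum(v * v for v in x))
```

Each particle imitates both its own best memory (pbest) and the swarm's best (gbest), which is why the parameter list stays so short — one of the advantages on the next slide.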
PSO ADVANTAGES
Fast
Easy to implement
No complex calculations
Few parameters to tune
PSO DISADVANTAGES
Prone to premature convergence
LET'S HAVE A LOOK AT OTHER ALGORITHMS
HARMONY SEARCH
Initialize the Harmony Memory (randomly)
Improvise a NEW harmony
If NEW is better than min(HM)
Replace(min(HM), NEW)
Loop until the end condition is met
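The improvise/replace loop above can be sketched as follows. This is an illustrative minimization variant (so the worst harmony is the maximum, not min(HM)); the function name and the HMCR/PAR/bandwidth values are assumptions.

```python
import random

def harmony_search(f, dim=2, hms=10, hmcr=0.9, par=0.3, bw=0.05,
                   lo=-5.0, hi=5.0, iters=2000, seed=2):
    """Minimal Harmony Search minimizing f over [lo, hi]^dim."""
    rng = random.Random(seed)
    # Initialize the Harmony Memory randomly
    hm = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    fits = [f(h) for h in hm]
    for _ in range(iters):                   # loop until the end condition
        new = []
        for d in range(dim):                 # improvise a NEW harmony
            if rng.random() < hmcr:          # pick a note from memory...
                x = hm[rng.randrange(hms)][d]
                if rng.random() < par:       # ...and maybe pitch-adjust it
                    x += rng.uniform(-bw, bw)
            else:                            # ...or play a fully random note
                x = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, x)))
        fit = f(new)
        worst = max(range(hms), key=lambda i: fits[i])
        if fit < fits[worst]:                # if NEW beats the worst, replace it
            hm[worst], fits[worst] = new, fit
    best = min(range(hms), key=lambda i: fits[i])
    return hm[best], fits[best]

best, val = harmony_search(lambda x: sum(v * v for v in x))
```

Because every improvised note can come from any stored harmony, each new solution considers all existing solutions at once — the last advantage on the next slide.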
HS ADVANTAGES
Quick convergence
Easy implementation
Fewer adjustable parameters
Fewer mathematical requirements
Generates a new solution after considering all of the existing solutions
HS DISADVANTAGES
Premature convergence
ICA (Imperialist Competitive Algorithm)
ICA PROS & CONS
PROS
Good speed
Solutions comparable to or better than other metaheuristic algorithms
CONS
Complex implementation
GWO (Grey Wolf Optimizer)
Mimics the leadership hierarchy of grey wolves
GWO HIERARCHY
SOCIAL BEHAVIOR OF GREY WOLVES
Tracking, chasing, and approaching the prey.
Pursuing, encircling, and harassing the prey until it stops moving.
Attacking the prey.
GWO: ENCIRCLING PREY
GWO: ATTACK
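The encircling and attack slides are usually backed by the standard GWO update equations (the symbols below follow the common formulation, not anything defined on these slides):

```latex
\vec{D} = \left| \vec{C} \cdot \vec{X}_p(t) - \vec{X}(t) \right|, \qquad
\vec{X}(t+1) = \vec{X}_p(t) - \vec{A} \cdot \vec{D}
```

where $\vec{X}_p$ is the prey's position, $\vec{X}$ a wolf's position, $\vec{A} = 2a\,\vec{r}_1 - a$ and $\vec{C} = 2\,\vec{r}_2$ with $\vec{r}_1, \vec{r}_2$ random in $[0,1]$, and $a$ decreasing linearly from 2 to 0 over the iterations; $|A| < 1$ drives the wolves to attack (exploit), $|A| > 1$ to diverge and explore.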
TO SUM UP:
Creating a random population of grey wolves
Alpha, beta, and delta wolves estimate the probable position of the prey
Each candidate solution updates its distance from the prey
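The summary above translates into a compact loop: the three best wolves (alpha, beta, delta) each propose a prey position, and every wolf moves to their average. A minimal sketch, assuming a sphere test function and hypothetical parameter values:

```python
import random

def gwo(f, dim=2, n_wolves=12, iters=200, lo=-5.0, hi=5.0, seed=3):
    """Minimal Grey Wolf Optimizer minimizing f over [lo, hi]^dim."""
    rng = random.Random(seed)
    # Create a random population of grey wolves
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)]
              for _ in range(n_wolves)]
    for t in range(iters):
        wolves.sort(key=f)                    # alpha, beta, delta lead the pack
        alpha, beta, delta = (w[:] for w in wolves[:3])
        a = 2.0 * (1 - t / iters)             # 'a' decreases linearly 2 -> 0
        for w in wolves:
            for d in range(dim):
                x = 0.0
                # Each leader estimates the probable prey position
                for leader in (alpha, beta, delta):
                    A = 2 * a * rng.random() - a
                    C = 2 * rng.random()
                    D = abs(C * leader[d] - w[d])   # distance to this leader
                    x += leader[d] - A * D          # leader's estimate
                # Each candidate updates toward the averaged estimate
                w[d] = min(hi, max(lo, x / 3.0))
    best = min(wolves, key=f)
    return best, f(best)

best, val = gwo(lambda x: sum(v * v for v in x))
```

As `a` shrinks, |A| drops below 1 and the pack switches from exploring around the leaders to closing in on them — the attack phase.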