Particle swarm optimization (PSO) is a heuristic global optimization method based on swarm intelligence. It originated from research on the movement behaviour of bird flocks and fish schools. The algorithm is widely used and has developed rapidly because it is easy to implement and has few parameters to tune. This presentation outlines the main idea of PSO, summarizes its advantages and shortcomings, and surveys some improved versions of PSO, the current state of research, and future research issues.
4. Origins
• Reynolds proposed a behavioral model in
which each agent follows 3 rules:
o Separation - Agents that are too close to each
other move apart
o Alignment - Each agent steers towards the
average heading of its neighbours
o Cohesion - Each agent tries to move towards
the average position of its neighbours
5. The Idea
• J. Kennedy and R. Eberhart included a ‘roost’ in a
simplified Reynolds-like simulation so that:
o Each agent was attracted towards the location of
the roost.
o Each agent ‘remembered’ the position where it
had been closest to the roost.
o Each agent shared information with its neighbors
(originally, all other agents) about its closest
location to the roost.
6. The Idea
• Eventually, all the agents ‘landed’ on the
roost.
• What if the distance to the roost were replaced
by an unknown objective function? Would the
agents ‘land’ on its minimum?
• Eberhart and Kennedy recognized the
suitability of this technique for optimization
and thus developed PSO in 1995.
7. Basics
• Main Idea: Mimic bird flocking or fish schooling.
• Nature → Algorithm:
o Bird or fish → Particle
o Explore the environment (3D) in search of food → Explore the objective space (nD) in search of good function values
o Exchange information by acoustical or optical means → Exchange information by sharing the positions of promising locations
8. Fundamentals
• PSO is a robust stochastic optimization
technique based on the movement and
intelligence of swarms.
• Each particle is searching for the optimum.
• Each particle is moving and hence has a
velocity.
9. Parameters
• Each particle keeps track of its coordinates in
the solution space which are associated with
the best solution (fitness) that has been
achieved so far by that particle. This value is
called personal best, pbest.
• But this would not be much good on its own;
particles need help in figuring out where to
search.
10. Parameters
• Thus, another best value that is tracked by the
PSO is the best value obtained so far by any
particle in the neighborhood of that particle.
This value is called gbest.
• The basic concept of PSO lies in accelerating
each particle toward its pbest and the gbest
locations, with a random weighted
acceleration at each time step.
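The random weighted acceleration toward pbest and gbest described above is usually expressed as a velocity update followed by a position update. Below is a minimal Python sketch for a single particle; the inertia weight w and the coefficients c1, c2 are common default choices, not values fixed by these slides.

```python
import random

def update_particle(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One PSO step for one particle.

    v_d = w*v_d + c1*r1*(pbest_d - x_d) + c2*r2*(gbest_d - x_d)
    x_d = x_d + v_d
    with fresh random weights r1, r2 in [0, 1) per dimension.
    """
    new_x, new_v = [], []
    for d in range(len(x)):
        r1, r2 = random.random(), random.random()
        vd = w * v[d] + c1 * r1 * (pbest[d] - x[d]) + c2 * r2 * (gbest[d] - x[d])
        new_v.append(vd)
        new_x.append(x[d] + vd)
    return new_x, new_v
```

Note that when a particle sits exactly on both its pbest and the gbest, the attraction terms vanish and only the inertia term w*v remains, so the particle keeps drifting and exploring.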
11. Cooperation theory
• The particles in the swarm co-operate. They
exchange information about what they’ve
discovered in the places they have visited.
• The co-operation is very simple. In basic PSO
it is like this:
o A particle has a neighborhood associated with it.
o A particle knows the fitnesses of those in its
neighborhood, and uses the position of the one with
best fitness.
o This position is simply used to adjust the particle’s
velocity.
13. Applications of PSO
• Neural Networks
• Telecommunication
• Data mining
• Signal Processing
• Combinatorial Optimization
• and many others.
14. The Algorithm
For each particle
    Initialize particle
End
Do
    For each particle
        Calculate fitness value
        If the fitness value is better than its personal best (pBest)
            Set the current value as the new pBest
    End
    Choose the particle with the best fitness value of all particles as gBest
    For each particle
        Calculate particle velocity
        Update particle position
    End
While maximum iterations or minimum error criterion is not attained
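The pseudocode above can be sketched as a runnable global-best PSO in Python. The parameter values, the search bounds, and the use of a fixed iteration count as the stopping criterion are illustrative assumptions.

```python
import random

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
        bounds=(-5.0, 5.0)):
    """Minimize f over [lo, hi]^dim with global-best PSO."""
    lo, hi = bounds
    # Initialize particles with random positions and zero velocities
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]            # personal best positions (pBest)
    Pf = [f(x) for x in X]           # personal best fitness values
    gf = min(Pf)                     # global best fitness (gBest)
    g = P[Pf.index(gf)][:]           # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive + social terms
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (g[d] - X[i][d]))
                X[i][d] += V[i][d]   # position update
            fx = f(X[i])
            if fx < Pf[i]:           # improve pBest, and possibly gBest
                Pf[i], P[i] = fx, X[i][:]
                if fx < gf:
                    gf, g = fx, X[i][:]
    return g, gf
```

For example, minimizing the sphere function `f(x) = sum(x_d**2)` in two dimensions should drive the best fitness close to zero within a few hundred iterations.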
16. References
• Marco A. Montes de Oca, "Particle Swarm Optimization: Introduction", IRIDIA-CoDE, Université Libre de Bruxelles (U.L.B.), May 7, 2007
• Maurice Clerc, "Particle Swarm Optimization"
• PSO was invented by Russ Eberhart (electrical engineer) and James Kennedy (social psychologist) in the USA
• Wikipedia