
Simplified particle swarm optimization algorithm

Introduction

In natural systems, we commonly observe mechanisms in which natural agents appear to be organized in a rational and ordered way. Social insects, such as ants and bees, are especially interesting examples and exhibit some remarkable characteristics. Despite the extremely low intellectual capacity of the individuals, the colony can solve surprisingly complex problems while searching for food (DORIGO et al., 1996; KARABOGA; AKAY, 2009; KARABOGA; BASTURK, 2007).

Some of these natural agents are the main inspiration for interesting and powerful algorithms used in the search for optimal solutions of highly non-trivial problems (CHENG et al., 2009; KARABOGA, 2009; KARABOGA; BASTURK, 2007; TOKSARI, 2006; WANG et al., 2007), such as finding the global minimum of nonlinear functions, truss optimization, the classical traveling salesman problem, electric power systems, traffic flow, polymer design, Schottky-barrier estimation in diode models and several other applications (ALRASHIDI; EL-HAWARY, 2009; HUANG et al., 2003; MARTINS et al., 2008; TEODOROVIC, 2003; WANG; YE, 2009).

This field of study is known as "swarm intelligence" and has attracted an increasing number of researchers since the proposal of the Particle Swarm Optimization (PSO) algorithm and of the Ant Colony Optimization (ACO) algorithm (DORIGO et al., 1996).

In the specific case of PSO, the algorithm can be understood as the application of an updating process to a set of particles moving throughout the search space. In the traditional version, each particle is defined by three vectors: its position in the D-dimensional search space $x_i$, its individual best position $p_i$, and its velocity $v_i$. Particles are initialized with random vectors for their initial positions and velocities. At each step, the velocities and positions of the entire swarm are updated according to the rules described below.
PSO Algorithm -- the updating process

for each time step t do
 for each particle i in the swarm do
  update the velocity: $v_i \leftarrow v_i + c\,e_1\,(p_i - x_i) + c\,e_2\,(p_g - x_i)$
  update the position: $x_i \leftarrow x_i + v_i$
  calculate the particle fitness $f(x_i)$
  update $p_i$ and $p_g$
 end for
end for


where:

$c$ is a constant with the value 2.0, $e_1$ and $e_2$ are random numbers, and $p_g$ is the best position found among the particle's neighbors.
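
As an illustration of this updating process, the sketch below implements one iteration of the classical PSO update in Python. It is a minimal sketch, not the authors' implementation: the function name pso_step, the use of NumPy arrays, the uniform sampling of $e_1$ and $e_2$ in [0, 1] and the fully connected neighborhood (a single global best $p_g$) are assumptions introduced here for illustration.

import numpy as np

def pso_step(x, v, p, p_g, f, c=2.0):
    # One iteration of the classical PSO update described above (minimization).
    # x   : (N, D) particle positions
    # v   : (N, D) particle velocities
    # p   : (N, D) individual best positions
    # p_g : (D,)   best position of the neighborhood (assumed global here)
    # f   : objective function mapping a (D,) position to a scalar fitness
    n, d = x.shape
    e1 = np.random.rand(n, d)  # random numbers, assumed uniform in [0, 1]
    e2 = np.random.rand(n, d)
    v = v + c * e1 * (p - x) + c * e2 * (p_g - x)  # velocity update
    x = x + v                                      # position update
    for i in range(n):                             # update individual bests
        if f(x[i]) < f(p[i]):
            p[i] = x[i]
    p_g = min(p, key=f).copy()                     # update the global best
    return x, v, p, p_g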

The main simplification consists in suppressing the velocity update present in the original PSO. In this way, some computational effort may be saved while a reasonable performance (i.e., comparable to that of the PSO) is obtained in the optimization of nonlinear functions. Furthermore, the implementation of the algorithm described in the next section is straightforward, making it easy to work with.

Material and methods

Simplified Particle Swarm Optimization algorithm

The proposed algorithm, which is presented in Figure 1, can be understood as a simplified version of the PSO algorithm and can be used to find the global minimum of nonlinear functions. Initially, a swarm of particles is defined and randomly distributed over the search space; an updating strategy is then applied at each step to define the new positions of the particles. This updating process follows the relation given by Equation 1:

$x_{i+1} = x_i + (R + S_i)\,w_i$ (1)

where:

$x_i$ represents the position vector of particle i, R is a random number between -1 and +1, and $w_i$ is a vector pointing in the direction of the line joining point i and $x_{Global}$, with magnitude $|w| = (\text{domain size})/(\text{number of particles})$. $S_i$ is determined according to the relative position of the particle and the best result obtained so far in the simulation, called $x_{Global}$, and is calculated by Equations (2a), (2b) and (2c), as follows:

$x_i - x_{Global} < 0 \Rightarrow S_i = -\,|x_i - x_{Global}|$ (2a)

$x_i - x_{Global} > 0 \Rightarrow S_i = +\,|x_i - x_{Global}|$ (2b)

$x_i - x_{Global} = 0 \Rightarrow S_i = 0$ (2c)

The calculation of $S_i$ is of fundamental importance to the proposed algorithm, since it is the feature that makes the particles move in a non-symmetric way. The updated position tends to be located closer to the global solution than its predecessor. This behavior allows a fast concentration of particles around the "instantaneous" best solution until the maximum number of iterations is reached, or until a new best solution ($x_{best}$) is found. Thus, this "directed random walk" near $x_{best}$ can rapidly improve the obtained solution when it is not yet the global minimum.
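
For illustration, a Python transcription of Equations 1 and 2a-2c is sketched below. The article leaves some multidimensional details implicit, so the coordinate-wise treatment of $S_i$, the single draw of R per particle, the orientation of $w_i$ toward $x_{Global}$ and the array layout are assumptions made here; treat it as a sketch rather than the authors' Fortran 90 implementation.

import numpy as np

def simplified_pso_step(x, x_global, f, domain_size):
    # One update of the simplified algorithm (Equations 1 and 2a-2c), minimization.
    # x           : (N, D) particle positions
    # x_global    : (D,)   best position found so far
    # f           : objective function, (D,) -> scalar
    # domain_size : length of the search interval in each dimension
    n, d = x.shape
    diff = x - x_global
    # Equations (2a)-(2c): S_i keeps the magnitude |x_i - x_Global|, with the sign
    # chosen from the relative position of the particle (assumed coordinate-wise).
    s = np.where(diff < 0, -np.abs(diff), np.where(diff > 0, np.abs(diff), 0.0))
    # w_i: vector along the line joining x_i and x_Global (assumed to point toward
    # x_Global), scaled to |w| = (domain size) / (number of particles).
    direction = x_global - x
    norm = np.linalg.norm(direction, axis=1, keepdims=True)
    w = np.divide(direction, norm, out=np.zeros_like(direction), where=norm > 0)
    w *= domain_size / n
    r = np.random.uniform(-1.0, 1.0, size=(n, 1))  # R in [-1, +1], one draw per particle
    x_new = x + (r + s) * w                        # Equation (1)
    best = min(x_new, key=f)                       # keep the best position found so far
    if f(best) < f(x_global):
        x_global = best.copy()
    return x_new, x_global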

[FIGURE 1 OMITTED]

Algorithm test and validation

The aim of the test and validation step is to discuss the effect of the swarm (or colony) size on the performance of the proposed algorithm in the search for the global minimum of a nonlinear function. Since the main interest is the optimization of functions, it is necessary to define some nonlinear benchmark functions so that the obtained results can be compared with previous studies (KARABOGA; BASTURK, 2007; TOKSARI, 2006). The chosen functions are presented in Table 1.

The benchmark nonlinear functions defined by the analytical expressions in Table 1 are shown in Figure 2. These functions, taken here as objective functions, were plotted over the ranges also given in Table 1.
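
The two benchmarks whose analytical expressions appear explicitly in Table 1 can be transcribed directly; the sketch below is such a transcription in Python (the function names schwefel_2d and schaffer_2d are ours, and the two-dimensional form follows the expressions in the table).

import numpy as np

def schwefel_2d(x):
    # Two-dimensional Schwefel function from Table 1 (minimum 0 on [-500, 500]^2).
    return (837.9658
            - x[0] * np.sin(np.sqrt(abs(x[0])))
            - x[1] * np.sin(np.sqrt(abs(x[1]))))

def schaffer_2d(x):
    # Two-dimensional Schaffer function from Table 1 (minimum 0 on [-3, 3]^2).
    r2 = x[0] ** 2 + x[1] ** 2
    return 0.5 + (np.sin(np.sqrt(r2)) ** 2 - 0.5) / (1.0 + 0.001 * r2) ** 2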

In fact, the idea of an asymmetric random draw of the direction of the next step is the central feature of the algorithm. This asymmetry can be described as an "attraction", or a scent in analogy with the ant and bee algorithms. The exclusive source of this scent is always the best solution found so far in the simulation, and this source changes immediately whenever the best solution is improved.

[FIGURE 2 OMITTED]

Results and discussion

Several tests were carried out to compare the performance of the proposed algorithm with the well-known ACO (Ant Colony Optimization) and ABC (Artificial Bee Colony) algorithms. For this purpose, a computational code was developed using the Fortran 90 programming language. This software makes it possible to test the proposed algorithm on the task of finding the global minimum of the benchmark functions presented in Table 1.

Figures 3 to 6 present the results of the global minimum search using the benchmark functions shown in Table 1. In these graphs, the evolution of the global minimum result (y axis) is plotted as a function of the number of iteration cycles (x axis).

[FIGURE 3 OMITTED]

[FIGURE 4 OMITTED]

In addition, we may observe that as the colony size increases, the convergence of the global minimum result becomes faster. In fact, for colonies with a number of particles higher than a given threshold, which depends on the function considered, the global minimum found can be orders of magnitude smaller than before. On the other hand, when the colony size is larger than this critical size, increasing it further does not significantly improve the performance. This last characteristic shows that the analysis of such size effects can be an important task when evaluating the performance of a new algorithm, and that tests should be carried out using colony sizes that optimize the performance for the specific problem.
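
A size-effect test of this kind can be scripted in a few lines. The loop below reuses the illustrative Python sketches given earlier (simplified_pso_step and schwefel_2d, which are our own transcriptions, not the authors' Fortran 90 code) to record the best value reached after a fixed number of iteration cycles for several colony sizes; the clipping of particles to the search domain is also an assumption, since the article does not specify the boundary handling.

import numpy as np

def run(f, bounds, n_particles, n_iterations):
    # Run the simplified algorithm and return the best value found.
    lo, hi = bounds
    x = np.random.uniform(lo, hi, size=(n_particles, 2))  # random initial swarm
    x_global = min(x, key=f).copy()
    for _ in range(n_iterations):
        x, x_global = simplified_pso_step(x, x_global, f, domain_size=hi - lo)
        x = np.clip(x, lo, hi)  # keep particles inside the search domain (assumption)
    return f(x_global)

# Compare colony sizes on the Schwefel benchmark, as in the size-effect analysis above.
for n in (10, 50, 100, 500):
    print(n, run(schwefel_2d, (-500.0, 500.0), n_particles=n, n_iterations=1000))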

[FIGURE 5 OMITTED]

[FIGURE 6 OMITTED]

Table 2 allows a comparison of the numerical results obtained in the simulations carried out with the benchmark functions. The simulations in Figures 3 to 6 were chosen to represent typical results for specific parameters. Several tests were performed using different seeds for the random number generator in order to ensure that the results shown are typical rather than pathological.

Conclusion

A simplified version of the PSO algorithm was presented. The version proposed here is very simple to implement, and its performance is comparable to that of other "colony" algorithms.

Four nonlinear "benchmark" functions were selected to test the algorithm in the search for the global minimum. This kind of test is of great importance for functions, such as the Schwefel, Ackley and Schaffer functions, whose global minimum is difficult to reveal by traditional methods.

Moreover, there is an optimal colony size (i.e., number of particles) that optimizes the convergence and provides good precision in locating the global minimum.

The results shown here, despite their simplicity, indicate that even simple algorithms can achieve good performance when solving non-trivial problems. Another aspect is the simple implementation, which makes it a good "first try" algorithm, at least for demonstrating the power of swarm algorithms.

DOI: 10.4025/actascitechnol.v34i1.9679

References

ALRASHIDI, M. R.; EL-HAWARY, M. E. A survey of Particle Swarm Optimization applications in electric power systems. IEEE Transactions on Evolutionary Computation, v. 13, n. 4, p. 913-918, 2009.

CHENG, F. Q.; YE, F. F.; YANG, J. G. An ant colony optimization algorithm for partner selection in virtual enterprises. International Journal of Materials and Product Technology, v. 34, n. 3, p. 227-240, 2009.

DORIGO, M.; MANIEZZO, V.; COLORNI, A. Ant system: optimization by a colony of cooperating agents. IEEE Transactions on Systems Man and Cybernetics Part B-Cybernetics, v. 26, n. 1, p. 29-41, 1996.

HUANG, L.; ZHOU, C. G.; WANG, K. P. Hybrid ant colony algorithm for traveling salesman problem. Progress in Natural Science, v. 13, n. 4, p. 295-299, 2003.

KARABOGA, D.; AKAY, B. A comparative study of artificial bee colony algorithm. Applied Mathematics and Computation, v. 214, n. 1, p. 108-132, 2009.

KARABOGA, D.; BASTURK, B. A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm. Journal of Global Optimization, v. 39, n. 3, p. 459-471, 2007.

KARABOGA, N. A new design method based on artificial bee colony algorithm for digital IIR filters. Journal of the Franklin Institute-Engineering and Applied Mathematics, v. 346, n. 4, p. 328-348, 2009.

MARTINS, B. V. C.; BRUNETTO, G.; SATO, F. Designing conducting polymers using bioinspired ant algorithms. Chemical Physics Letters, v. 453, n. 4-6, p. 290-295, 2008.

TEODOROVIC, D. Transport modeling by multiagent systems: a swarm intelligence approach. Transportation Planning and Technology, v. 26, n. 4, p. 289-312, 2003.

TOKSARI, M. D. Ant colony optimization for finding the global minimum. Applied Mathematics and Computation, v. 176, n. 1, p. 308-316, 2006.

WANG, J.; LIU, Y. H.; TIAN, D. X. An ant colony algorithm with global adaptive optimization. Journal of Computational and Theoretical Nanoscience, v. 4, n. 7-8, p. 1283-1289, 2007.

WANG, K.; YE, M. Parameter estimation of Schottky-barrier diode model by Particle Swarm Optimization. International Journal of Modern Physics C, v. 20, n. 5, p. 687-699, 2009.

Received on March 18, 2009.

Accepted on July 21, 2010.

License information: This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Ricardo Paupitz Barbosa dos Santos (2), Carlos Humberto Martins (1) * and Fabio Lucio Santos (3)

(1) Departamento de Engenharia Civil, Universidade Estadual de Maringa, Av. Colombo, 5790, 87020-900, Maringa, Parana, Brazil. (2) Departamento de Engenharia Agricola, Universidade Estadual de Maringa, Maringa, Parana, Brazil. (3) Departamento de Engenharia Agricola, Universidade Federal de Vicosa, Vicosa, Minas Gerais, Brazil. * Author for correspondence. E-mail: chmartins@uem.br
Table 1. Benchmark nonlinear functions employed to test the proposed swarm optimization algorithm.

Function    Analytical expression                                                                                  Range          Minimum

Sphere      $F_1 = x_1^2 + x_2^2$                                                                                  [-10, 10]      0

Schwefel    $F_2 = 837.9658 - x_1 \sin(\sqrt{|x_1|}) - x_2 \sin(\sqrt{|x_2|})$                                     [-500, 500]    0

Schaffer    $F_4 = 0.5 + \dfrac{\sin^2\!\left(\sqrt{x_1^2 + x_2^2}\right) - 0.5}{\left(1 + 0.001\,(x_1^2 + x_2^2)\right)^2}$   [-3, 3]   0

Ackley      $F_3 = -20\exp\!\left(-0.2\sqrt{\tfrac{1}{2}(x_1^2 + x_2^2)}\right) - \exp\!\left(\tfrac{1}{2}(\cos 2\pi x_1 + \cos 2\pi x_2)\right) + 20 + e$   [-20, 20]   0