
2009 International Conference on Computer Technology and Development, Kota Kinabalu, Malaysia, 13-15 November 2009


Niching with Sub-swarm based Particle Swarm Optimization

Muhammad Rashid, Abdul Rauf Baig and Kashif Zafar Department of Computer Science

National University of Computer and Emerging Sciences Islamabad, Pakistan

[email protected], [email protected], [email protected]

Abstract— In this study we present a sub-swarm based particle swarm optimization algorithm for niching (NSPSO). The NSPSO algorithm is capable of locating and maintaining a sufficient number of niches throughout the execution of the algorithm. The niches which are identified are then exploited by using a sub-swarm strategy which tries to refine the niche and converge to an optimum solution. NSPSO is capable of locating multiple solutions and is well suited for multimodal optimization problems. From the experimentation results, we have observed that NSPSO is quite efficient in locating both global and local optima. We present a comparison of the performance of NSPSO with NichePSO and SPSO.

Keywords-niche; sub-swarm; swarm intelligence; particle swarm optimization; multimodal functions; optimization

I. INTRODUCTION

The original Particle Swarm Optimization (PSO) was

developed in 1995 [7]. The PSO algorithm consists of a swarm of particles, each having a position and a velocity. Each particle is a potential solution to the optimization problem, and thus the swarm represents a collection of candidate solutions. PSO searches for the optimal solution by moving the particles and evaluating the fitness of their new positions. Over the past decade PSO has undergone many improvements and enhancements. The majority of these modifications have been made to increase the diversity of the swarm in order to improve the convergence of PSO. They include the introduction of an inertia weight, velocity clamping, velocity constriction, different ways of determining the personal best and global best positions, and different velocity models.

In addition to the modifications made to the basic PSO algorithm, a variety of PSO variations have also been developed. These include sub-swarm based PSO algorithms and PSO with niching capabilities. Sub-swarm based PSO has been used to locate single solutions to unimodal optimization problems. For multimodal optimization problems, however, niche based approaches have been preferred because of their capability to locate multiple solutions. Those PSO variations in which particles are grouped into sub-swarms are called sub-swarm based PSO. These sub-swarms can exist in either cooperative or competitive mode [5]. Examples of sub-swarm based PSO algorithms include the Hybrid Particle Swarm Optimiser with Breeding and Subpopulations by Lovberg et al. [10], Multi-phase PSO (MPPSO) by Al-Kazemi and Mohan [1], Life-cycle PSO

(LCPSO) by Krink and Lovberg [9], Clustering based PSO with Stereotyping by Kennedy [8], Clustering based PSO by Thompson et al. [14], Cooperative Split PSO (CPSO-Sk) by Van den Bergh and Engelbrecht [15], and Predator-Prey PSO by Silva et al. [13].

Niching algorithms are algorithms capable of locating multiple solutions to a problem. Niches can be defined as partitions of an environment, each representing one solution to a problem. Speciation is the process of finding multiple niches or solutions. Species are the partitions of a population competing within an environment; they are the groups of particles which converge on a single niche [5]. Examples of niching algorithms include the Sequential Niching PSO employed by Kassabalidis et al. [6], PSO with Objective Function Stretching by Parsopoulos et al. [12], and nbest PSO by Brits et al. [3]. Some PSO variants are capable of finding multiple solutions to multimodal problems by employing a sub-swarm based niching approach. These include NichePSO presented by Brits et al. [4] and Species based PSO proposed by Parrott and Li [11].

II. NICHING WITH SUB-SWARM BASED PARTICLE SWARM OPTIMIZATION

In this study we present extensions to the particle swarm optimization algorithm that allow it to discover niches and locate multiple solutions to multimodal optimization problems. The proposed niching with sub-swarm based particle swarm optimization (NSPSO) algorithm locates niches by employing a scouting technique inspired by the food foraging behavior of honey bees. This technique for identifying promising regions was used by Baig and Rashid [2]; here we adopt it to identify niches. Once the niches have been identified, we create sub-swarms in those niches and allow each sub-swarm to refine its solution. Once a sub-swarm has converged, we record its best position as one of the many solutions to the multimodal optimization problem.

In NSPSO we have a total population of s particles. The algorithm starts by randomly initializing all s particles within the search space. The fitness of these particles is then evaluated and the particles are sorted according to their fitness. We then select the n best particles and use them to create sub-swarms. The sub-swarms are created by randomly placing m/n particles within a radius r of each of the n best particles. Thus these randomly initialized particles serve the purpose of finding niches for the first iteration. We also

2009 International Conference on Computer Technology and Development

978-0-7695-3892-1/09 $26.00 © 2009 IEEE

DOI 10.1109/ICCTD.2009.30



check to ensure that overlapping sub-swarms are not created. Moreover, not all of the s particles are assigned to sub-swarms. Instead we assign a total of m particles to these n sub-swarms; the remaining s - m particles are randomly placed within the search space.

Each of the sub-swarms now starts to search its niche. There is no information sharing between the sub-swarms. The s - m particles which were randomly initialized also explore the search space; the only difference is that they have no social component and explore the search space under the influence of their cognitive component alone. All the sub-swarms and random particles explore the search space in parallel. The sub-swarms try to exploit the already found niches by further refining their solutions, while the random particles try to find other niches which can then be exploited by subsequent sub-swarms.

After a user defined number of iterations has elapsed, the best particles from each sub-swarm are sorted according to their fitness along with the random particles. If one of the random particles has better fitness than the best particles of the sub-swarms, a new sub-swarm is created around that random particle. Similarly, sub-swarms which do not exhibit good fitness are dissolved and replaced by other sub-swarms. A check is also maintained to determine whether a particular sub-swarm has converged. If a sub-swarm's best position does not improve for some time, it is considered to have converged; its best position is recorded as one of the potential solutions to the problem, the sub-swarm is dissolved, and its particles are allocated to other sub-swarms.
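The flow described in this section can be sketched in code. The following is a simplified illustration only, not the authors' implementation: the per-iteration velocity and position update rules are omitted, maximization is assumed, the stall threshold and all helper names are invented for the sketch, and the check against overlapping sub-swarms is left out for brevity.

```python
import random

def nspso_sketch(fitness, dim, bounds, s=20, n=5, m=15, r=0.05,
                 iterations=2000, check_every=50):
    """Simplified sketch of the NSPSO flow (maximization assumed).

    s: total particles; n: sub-swarms; m: particles assigned to
    sub-swarms (m/n per sub-swarm); r: niche radius. The remaining
    s - m particles explore the search space at random.
    """
    lo, hi = bounds
    rand_point = lambda: [random.uniform(lo, hi) for _ in range(dim)]

    # 1. Randomly initialize all s particles and sort them by fitness.
    particles = sorted((rand_point() for _ in range(s)),
                       key=fitness, reverse=True)

    # 2. Seed n sub-swarms around the n best particles: m/n particles
    #    each, placed within radius r of the seed (overlap check omitted).
    subswarms = []
    for seed in particles[:n]:
        members = [[x + random.uniform(-r, r) for x in seed]
                   for _ in range(m // n)]
        subswarms.append({'members': members, 'best': seed, 'stalled': 0})

    # 3. The remaining s - m particles explore at random (cognitive
    #    component only; no social component).
    explorers = [rand_point() for _ in range(s - m)]

    solutions = []
    for it in range(1, iterations + 1):
        # ... per-iteration velocity/position updates for sub-swarm
        #     members and explorers would go here ...
        if it % check_every == 0:
            for sw in subswarms:
                best = max(sw['members'], key=fitness)
                if fitness(best) > fitness(sw['best']):
                    sw['best'], sw['stalled'] = best, 0
                else:
                    sw['stalled'] += 1
                # A stalled sub-swarm is treated as converged: record its
                # best position and re-seed a fresh sub-swarm elsewhere.
                if sw['stalled'] >= 2:  # threshold is an assumption
                    solutions.append(sw['best'])
                    seed = rand_point()
                    sw['members'] = [[x + random.uniform(-r, r) for x in seed]
                                     for _ in range(m // n)]
                    sw['best'], sw['stalled'] = seed, 0
    return solutions
```

The sketch preserves the structure of the algorithm (initialize, seed sub-swarms, explore in parallel, periodically check and replace sub-swarms, record converged bests) while leaving the standard PSO update equations as a placeholder.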

III. EXPERIMENTATION

The niching algorithms are designed to find multiple

solutions to a problem. In order to measure the performance of niching algorithms, we must employ performance measures which highlight and compare the niching capabilities of the algorithms. We have used five performance measures to evaluate NSPSO, some of which have been suggested in [5]:

(i) The accuracy of the solutions found, calculated as the average error of all solutions found; when calculating the error, the optimum closest to the solution found is used. A lower average error suggests a more accurate algorithm.

(ii) The convergence of the algorithm, calculated by executing the algorithm many times and taking the percentage of runs in which the algorithm located all of the optima. A higher value suggests a more convergent algorithm.

(iii) The total number of niches found. A large number of niches suggests that the algorithm is good at finding niches, and vice versa.

(iv) The number of global optima found: the closer this number is to the actual number of global optima present, the better the performance of the algorithm.

(v) The number of local optima found: the closer this number is to the actual number of local optima present, the better the performance of the algorithm.
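Measure (i) can be made concrete as follows; the function name is illustrative, and one-dimensional positions with known optimum locations are assumed.

```python
def niching_accuracy(solutions, optima, fitness):
    """Average error of the found solutions (measure (i)): each
    solution is scored against the optimum closest to it in the
    search space, and the fitness errors are averaged."""
    errors = []
    for sol in solutions:
        # Use the optimum closest to this solution in position space.
        closest = min(optima, key=lambda opt: abs(opt - sol))
        errors.append(abs(fitness(closest) - fitness(sol)))
    return sum(errors) / len(errors)
```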

In order to test the performance of NSPSO we have made use of 5 test functions. These are given in Table I. The same test functions have been used before in [4, 11] and are used here to allow a fair comparison with NichePSO and SPSO.
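For reference, the five test functions of Table I can be written directly in code; the forms below follow the standard definitions used in [4, 11].

```python
import math

def f1(x):
    """F1: five equally spaced global maxima on [0, 1]."""
    return math.sin(5 * math.pi * x) ** 6

def f2(x):
    """F2: one global and four local maxima on [0, 1]."""
    return math.exp(-2 * math.log(2) * ((x - 0.1) / 0.8) ** 2) * \
           math.sin(5 * math.pi * x) ** 6

def f3(x):
    """F3: five unevenly spaced global maxima on [0, 1]."""
    return math.sin(5 * math.pi * (x ** 0.75 - 0.05)) ** 6

def f4(x):
    """F4: one global and four local maxima, unevenly spaced, on [0, 1]."""
    return math.exp(-2 * math.log(2) * ((x - 0.08) / 0.854) ** 2) * \
           math.sin(5 * math.pi * (x ** 0.75 - 0.05)) ** 6

def f5(x, y):
    """F5: Himmelblau-type function with four global maxima on [-6, 6]^2."""
    return 200 - (x * x + y - 11) ** 2 - (x + y * y - 7) ** 2
```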

NSPSO was executed 50 times for each of the test functions, and the results presented here are averages over the 50 runs. This is done to obtain a fair estimate of the algorithm's capabilities. For each run a total of 2000 iterations was allowed. After every 50 iterations the sub-swarms were checked for convergence and compared with the randomly exploring particles to determine whether each sub-swarm should be continued or dissolved. The number of particles used is 20; some of these randomly explore the search space while the others form sub-swarms and exploit niches. The number of sub-swarms is set to 5, meaning that at any given time there are 5 sub-swarms running in parallel to exploit the discovered niches. The number of particles available for sub-swarm creation is 15, with 3 particles assigned to each sub-swarm. The number of particles which randomly explore the whole search space is 5. The neighborhood radius for sub-swarm creation is set to 0.05 in each dimension.
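The experimental setup described above amounts to the following parameter set; the dictionary and its key names are illustrative, not from the paper.

```python
# Experimental parameters from Section III (key names illustrative).
NSPSO_PARAMS = {
    "runs": 50,            # independent runs per test function
    "iterations": 2000,    # iterations per run
    "check_interval": 50,  # convergence / replacement check period
    "s": 20,               # total particles
    "n": 5,                # parallel sub-swarms
    "m": 15,               # particles assigned to sub-swarms (m/n = 3 each)
    "explorers": 5,        # s - m randomly exploring particles
    "r": 0.05,             # sub-swarm creation radius per dimension
}
```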

IV. RESULTS

Table II presents the results obtained for the accuracy of

NSPSO. The accuracy of NSPSO for both global and local optima is presented. The accuracy was calculated by taking the error of the particle closest to each optimum and averaging the results over 50 runs. The accuracies of SPSO [11] and NichePSO [4] in finding global optima are compared with that of NSPSO. NSPSO was able to find the global optima with higher accuracy than NichePSO on all 5 test functions. NSPSO also performed better than SPSO on all test functions

TABLE I. TEST FUNCTIONS

Function | Range | Optima [Global (local)]
F1(x) = sin^6(5πx) | [0,1] | 5 (-)
F2(x) = exp(-2 log(2) ((x - 0.1)/0.8)^2) · sin^6(5πx) | [0,1] | 1 (4)
F3(x) = sin^6(5π(x^(3/4) - 0.05)) | [0,1] | 5 (-)
F4(x) = exp(-2 log(2) ((x - 0.08)/0.854)^2) · sin^6(5π(x^(3/4) - 0.05)) | [0,1] | 1 (4)
F5(x,y) = 200 - (x^2 + y - 11)^2 - (x + y^2 - 7)^2 | [-6,6] | 4 (-)



except F1, where the performance of NSPSO is worse than that of SPSO but still considerably good. For functions F2 and F4, which have local optima in addition to global optima, NSPSO was able to locate the local optima with high accuracy as well.

TABLE II. RESULTS ON ACCURACY OF NSPSO (MEAN AND STD. DEV.) AFTER 2000 ITERATIONS (AVERAGED OVER 50 RUNS)

Function | NSPSO (Global) | NSPSO (Local)
F1 | 3.67E-15 ± 6.75E-15 | -
F2 | 0.00 ± 0.00 | 1.98E-02 ± 2.22E-02
F3 | 1.02E-15 ± 1.94E-15 | -
F4 | 1.45E-11 ± 1.25E-17 | 1.83E-02 ± 2.49E-02
F5 | 4.81E-31 ± 1.56E-31 | -

Table III presents the results obtained for the

convergence of NSPSO. NSPSO was able to find the global optima for all 5 test functions 100% of the time. For test functions F2 and F4, which also have local optima, NSPSO was able to locate all of the optima, both global and local, 56% of the time for F2 and 68% of the time for F4. SPSO [11] and NichePSO [4] also reported finding the global optima 100% of the time for all test functions.

TABLE III. RESULTS ON CONVERGENCE (PERCENTAGE OF RUNS OUT OF 50 WHICH CONVERGED)

Function | NSPSO [Global (All)]
F1 | 100% (-)
F2 | 100% (56%)
F3 | 100% (-)
F4 | 100% (68%)
F5 | 100% (-)

TABLE IV. RESULTS ON NICHES, GLOBAL OPTIMA AND LOCAL OPTIMA FOUND (AVERAGED OVER 50 RUNS)

Function | Niches Found | Global Optima Found | Local Optima Found
F1 | 16.68 | 5 | -
F2 | 6.16 | 1 | 3.54
F3 | 10.24 | 5 | -
F4 | 4.74 | 1 | 3.62
F5 | 8.6 | 4 | -

In Table IV we present the average number of niches that

NSPSO was able to find. For all 5 test functions NSPSO was able to identify significantly more niches than the number of optima present. This illustrates that NSPSO is well suited for multimodal optimization and is capable of finding multiple solutions. Table IV also shows the average number of global and local optima that NSPSO was able to find. We see that NSPSO successfully located all of the global optima for all 5 test functions. For functions F2 and F4, NSPSO on average found over 3 out of 4 of the local optima.

V. CONCLUSION

We have presented a sub-swarm based niching PSO

algorithm (NSPSO) which uses niching to discover promising regions in the search space and refines them with the help of sub-swarms. NSPSO has the characteristic of

being able to locate all global optima very quickly. It also continues to locate other local optima and niches after the global optima have been found. Based on the performance measures obtained from 5 test functions, NSPSO is an efficient algorithm for multimodal optimization problems and is good at finding and maintaining a large number of niches. In the future we intend to enhance NSPSO to cater for dynamic multimodal optimization problems.

REFERENCES

[1] B. Al-Kazemi and C.K. Mohan. Multi-Phase Discrete Particle Swarm

Optimization. Proc. of the International Workshop on Frontiers in Evolutionary Algorithms, pp. 622-625, 2002.

[2] A.R. Baig and M. Rashid, Honey bee foraging algorithm for multimodal & dynamic optimization problems. Proc. of the Genetic and Evolutionary Computation Conference, pp. 169, 2007.

[3] R. Brits, A.P. Engelbrecht, and F. van den Bergh. Solving Systems of Unconstrained Equations using Particle Swarm Optimization. Proc. of the IEEE Conference on Systems, Man, and Cybernetics, volume 3, pp. 102-107, Oct 2002.

[4] R. Brits, A.P. Engelbrecht, and F. van den Bergh. A Niching Particle Swarm Optimizer. Proc. of the Fourth Asia-Pacific Conference on Simulated Evolution and Learning, pp. 692-696, 2002.

[5] A.P. Engelbrecht, Fundamentals of Computational Swarm Intelligence. John Wiley & Sons, 2006.

[6] L.N. Kassabalidis, M.A. El-Sharkawi, R.J. Marks, L.S. Moulin, and A.P. Alves da Silva. Dynamic Security Border Identification using Enhanced Particle Swarm Optimization. IEEE Transactions on Power Systems, 17(3):723-729, August 2002.

[7] J. Kennedy and R.C. Eberhart. Particle Swarm Optimization. Proc. of the IEEE International Joint Conference on Neural Networks, pp. 1942-1948. IEEE Press, 1995.

[8] J. Kennedy. Stereotyping: Improving Particle Swarm Performance with Cluster Analysis. Proc. of the IEEE Congress on Evolutionary Computation, volume 2, pp. 1507-1512, July 2000.

[9] T. Krink and M. Lovberg. The Life Cycle Model: Combining Particle Swarm Optimisation, Genetic Algorithms and Hill Climbers. Proc. of the Parallel Problem Solving from Nature Conference, In: Lecture Notes in Computer Science, volume 2439, pp. 621-630. Springer-Verlag, 2002.

[10] M. Lovberg, T.K. Rasmussen, and T. Krink. Hybrid Particle Swarm Optimiser with Breeding and Subpopulations. Proc. of the Genetic and Evolutionary Computation Conference, pp. 469-476, 2001.

[11] D. Parrott and X. Li, Locating and tracking multiple dynamic optima by a particle swarm model using speciation. IEEE Transaction on Evolutionary Computation. 10(4):440-458, 2006.

[12] K.E. Parsopoulos, V.P. Plagianakos, G.D. Magoulas, and M.N. Vrahatis. Stretching Technique for Obtaining Global Minimizers through Particle Swarm Optimization. Proc. of the IEEE Workshop on Particle Swarm Optimization, pp. 22-29, 2001.

[13] A. Silva, A. Neves, and E. Costa. An Empirical Comparison of Particle Swarm and Predator Prey Optimisation. Proc. of the Thirteenth Irish Conference on Artificial Intelligence and Cognitive Science, In: Lecture Notes in Artificial Intelligence, volume 2464, pp. 103-110. Springer-Verlag, 2002.

[14] B.B. Thompson, R.J. Marks, M.A. El-Sharkawi, W.J. Fox, and R.T. Miyamoto. Inversion of Neural Network Underwater Acoustic Model for Estimation of Bottom Parameters using Modified Particle Swarm Optimizer. Proc. of the International Joint Conference on Neural Networks, pp. 1306, 2003.

[15] F. van den Bergh and A.P. Engelbrecht. Cooperative Learning in Neural Networks using Particle Swarm Optimizers. South African Computer Journal, 26:84-90, 2000.
