
An efficient hybrid Particle Swarm and Swallow Swarm Optimization algorithm


Computers and Structures 143 (2014) 40–59

Contents lists available at ScienceDirect

Computers and Structures

journal homepage: www.elsevier.com/locate/compstruc

An efficient hybrid Particle Swarm and Swallow Swarm Optimization algorithm

http://dx.doi.org/10.1016/j.compstruc.2014.07.012
0045-7949/© 2014 Elsevier Ltd. All rights reserved.

* Corresponding author. Tel.: +98 21 77240104; fax: +98 21 77240398. E-mail address: [email protected] (A. Kaveh).

A. Kaveh*, T. Bakhshpoori, E. Afshari
Centre of Excellence for Fundamental Studies in Structural Engineering, School of Civil Engineering, Iran University of Science and Technology, Narmak, Tehran 16, Iran

Article info

Article history: Received 30 September 2013; Accepted 13 July 2014

Keywords: Particle Swarm Optimization; Swallow Swarm Optimization; Hybrid metaheuristic algorithms

Abstract

In this article, search mechanisms of Swallow Swarm Optimization (SSO) are implemented in the framework of Particle Swarm Optimization (PSO) to form the Hybrid Particle Swallow Swarm Optimization (HPSSO) algorithm. The new algorithm is tested by solving eleven mathematical optimization problems and six truss weight minimization problems. HPSSO is compared to the standard PSO and some of its advanced variants. Optimization results demonstrate the efficiency of the proposed algorithm, which outperforms the PSO variants taken as the basis of comparison and is very competitive with other state-of-the-art metaheuristic optimization methods. Here, a good balance between global and local searches is achieved.

© 2014 Elsevier Ltd. All rights reserved.

1. Introduction

Gradient-based optimization algorithms may have significant drawbacks such as complex derivatives, sensitivity to initial design, the need to deal with continuous search spaces, and a large amount of required memory [1]. Metaheuristic optimization algorithms, mostly reproducing natural phenomena, were developed to overcome these disadvantages and gained increasing popularity. For example, Genetic Algorithms (GA) [2], Simulated Annealing (SA) [3], Ant Colony Optimization (ACO) [4], Particle Swarm Optimization (PSO) [5], Harmony Search (HS) [1,6], Big Bang-Big Crunch (BB-BC) [7], Charged System Search (CSS) [8], Imperialist Competitive Algorithm (ICA) [9], Cuckoo Search Algorithm (CS) [10], teaching-learning-based optimization algorithm (TLBO) [11], and Mine Blast Algorithm (MBA) [12] have widely been used in structural optimization.

Since the best results found in many real-life and classical optimization problems were obtained using hybrid metaheuristic formulations [13], this study presents the Hybrid Particle Swallow Swarm Optimization (HPSSO) algorithm that combines Swallow Swarm Optimization (SSO) and Particle Swarm Optimization (PSO).

PSO incorporates swarming behaviors observed in flocks of birds, schools of fish, or swarms of bees, and even human social behavior, from which Kennedy and Eberhart [14] initially drew the idea. A population of candidate designs is randomly generated and progressively updated in each iteration. The search is based on the idea that particles move through the search space from their current positions with velocities dynamically adjusted according to their current velocity, best self-experienced position and best global-experienced position, with some degree of randomness.

Neshat et al. [15] very recently presented a new swarm intelligence-based technique, the Swallow Swarm Optimization (SSO) algorithm, that mimics swallow swarm movements and other swallow behaviors. The algorithm shares common features with PSO, but also presents significant differences [15]. In particular, the population is divided into internal subcolonies, each including a local leader. Particle velocity is updated as in PSO but with an additional component taken from the best position experienced within the subcolony to which each individual belongs. Furthermore, SSO utilizes aimless particles to further explore the design space. In the real colony, these swallows increase the chance of finding food outside the internal areas.

Convergence speed and global search ability are the most important criteria to evaluate the performance of metaheuristic optimization algorithms. In standard PSO, all particles learn from the best global-experienced particle to update their velocity and position. Hence, PSO shows fast convergence [16]. In order to improve PSO performance, several PSO variants were developed in the literature. These approaches include [16]: (1) tuning the control parameters so as to maintain the balance between local search and global search; (2) designing different neighborhood topologies to replace the traditional global topology; (3) hybridizing PSO with auxiliary search techniques; and (4) using multi-swarm techniques. A comprehensive review of the PSO algorithm was recently presented by Banks et al. [17,18]. It appears that avoiding premature convergence is still a challenging task. In order to overcome this problem, the HPSSO algorithm is developed that attempts to incorporate the most important features of SSO in the PSO formulation. HPSSO provides a mechanism for particles to learn not only from the best global-experienced particle but also from other promising particles. Assigning predetermined tasks to some particles is another feature adopted here.

Fig. 1. Types of particles and movements of explorer particles.

In order to demonstrate the efficiency and robustness of the proposed approach, HPSSO is applied to eleven mathematical optimization problems and six weight minimization problems of truss structures including sizing variables. The latter class of problems is often taken as a benchmark in constrained global optimization because of the presence of many design variables (i.e. large search spaces) and nonlinear constraints [9].

Optimization results indicate that HPSSO is competitive with other popular metaheuristic methods and significantly outperforms standard PSO and an advanced PSO variant in truss problems. In the mathematical optimization problems, HPSSO is also more efficient than a powerful variant of PSO developed very recently.

The rest of the paper is structured as follows. Section 2 describes the HPSSO algorithm besides outlining the PSO and SSO algorithms. Results of mathematical optimization problems are discussed in Section 3, while Section 4 is concerned with truss design problems. Section 5 summarizes the main findings of this study.

2. Formulation of optimization algorithms

2.1. Particle Swarm Optimization (PSO)

PSO is a population-based metaheuristic algorithm developed by Kennedy and Eberhart [14] that simulates social behaviors of animals. Similar to other metaheuristic methods, PSO is initialized with a population of random designs, named particles, that are updated in each generation to search for the optimum. Each particle is associated with a velocity vector adaptively changed in the optimization process. Particles move through the search space from their current positions with velocity vectors that are dynamically adjusted according to their current velocity, best self-experienced position and the best global-experienced position. The PSO algorithm defines simple rules governing the search of each particle as follows:

X_i^{k+1} = X_i^k + V_i^{k+1}
V_i^{k+1} = ω V_i^k + c_1 r_1 (P_i^k - X_i^k) + c_2 r_2 (P_g^k - X_i^k)    (1)

The new position of particle X_i^{k+1} is obtained by adding the new velocity V_i^{k+1} to the current position X_i^k. V_i^k, P_i^k and P_g^k are the previous velocity, the best position visited by the particle itself and the best solution the swarm has found so far, respectively. ω is an inertia weight to control the influence of the previous velocity, r_1 and r_2 are two random numbers uniformly distributed in the range (0,1), and c_1 and c_2 are two learning factors which control the influence of the cognitive and social components.
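As a concrete illustration, the update rule of Eq. (1) can be sketched in a few lines of Python. The function name `pso_step` and the array layout are our own illustrative choices, not part of the paper; the inertia weight and learning factors follow the settings discussed later in the text.

```python
import numpy as np

def pso_step(X, V, P_best, P_g, w=0.9, c1=2.0, c2=2.0, rng=None):
    """One PSO iteration per Eq. (1): update velocities, then positions.

    X, V   : (n_particles, n_dims) current positions and velocities
    P_best : (n_particles, n_dims) best position visited by each particle
    P_g    : (n_dims,) best position found by the whole swarm
    """
    rng = rng or np.random.default_rng()
    r1 = rng.random(X.shape)  # uniform in (0, 1), drawn per component
    r2 = rng.random(X.shape)
    V_new = w * V + c1 * r1 * (P_best - X) + c2 * r2 * (P_g - X)
    X_new = X + V_new
    return X_new, V_new
```

Note that `r1` and `r2` are drawn componentwise here; drawing one scalar per particle is an equally common reading of Eq. (1).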

2.2. Swallow Swarm Optimization (SSO)

SSO has been developed recently by Neshat et al. [15] as a new swarm intelligence-based algorithm reproducing the behavior of swallow swarms. Studies conducted on various species of swallows revealed peculiar features that have been taken as the basis of the SSO algorithm: the very social life and migration in large groups; high-speed flying, which can affect convergence speed; a few floating swallows that fly out of the colony or between subcolonies to search for food and inform the rest of the swarm about food sources or the attack of hunters; and organization of the swarm into several subcolonies, each of which has an experienced leader.

SSO has common features with PSO but also several significant differences [15]. An initial population of particles is randomly generated and progressively updated in the optimization process. Three types of particles are considered: leader, explorer and aimless particles. Leader particles are categorized into two types: Local Leaders (LL), which conduct the related internal subcolonies and indicate a local optimum point, and the Head Leader (HL), which is responsible for the leadership of the entire colony and indicates the global optimum point. Explorer particles, which represent the largest part of the population, take care of the exploration of the design space. In each optimization iteration (k), particles play different roles according to their type.

Each swallow arriving at an extreme point emits a special sound to guide the group toward it. If that place is the best in the design space, that particle becomes the Head Leader, HL^k. If the particle reaches a good position (yet not the best) compared with its neighboring particles, it is chosen as a Local Leader, LL^k. Otherwise, the particle is an explorer and has to change its position in the search space. The new position of an explorer particle, X_i^{k+1}, is obtained by adding a change velocity V_i^{k+1} to the current position X_i^k, considering VHL_i^{k+1} (change velocity vector of the particle toward the Head Leader) and VLL_i^{k+1} (change velocity vector of the particle toward the Local Leader). The change velocities of explorer particles toward the head leader and the corresponding local leader are dynamically adjusted according to the current velocity vectors of the particle toward the leaders (VHL_i^k and VLL_i^k, respectively), the best self-experienced position (Xbest_i^k), and the leaders' positions (HL^k and LL^k, respectively). This is shown schematically in Fig. 1 and modeled mathematically as follows:

X_i^{k+1} = X_i^k + V_i^{k+1}
V_i^{k+1} = VHL_i^{k+1} + VLL_i^{k+1}
VHL_i^{k+1} = VHL_i^k + α_HL rand() (Xbest_i^k - X_i^k) + β_HL rand() (HL^k - X_i^k)
VLL_i^{k+1} = VLL_i^k + α_LL rand() (Xbest_i^k - X_i^k) + β_LL rand() (LL_i^k - X_i^k)    (2)
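A minimal sketch of the explorer update of Eq. (2). In the paper the acceleration coefficients α_HL, β_HL, α_LL and β_LL are adaptively defined [15]; here they are fixed scalars purely for illustration, and the function name is ours.

```python
import numpy as np

def sso_explorer_step(X, VHL, VLL, X_best, HL, LL,
                      a_HL=1.0, b_HL=1.0, a_LL=1.0, b_LL=1.0, rng=None):
    """One SSO explorer move per Eq. (2).

    X        : (n, d) explorer positions
    VHL, VLL : (n, d) velocity components toward head/local leaders
    X_best   : (n, d) best self-experienced positions
    HL       : (d,) head-leader position; LL : (n, d) local leader of each explorer
    """
    rng = rng or np.random.default_rng()
    VHL_new = (VHL + a_HL * rng.random(X.shape) * (X_best - X)
                   + b_HL * rng.random(X.shape) * (HL - X))
    VLL_new = (VLL + a_LL * rng.random(X.shape) * (X_best - X)
                   + b_LL * rng.random(X.shape) * (LL - X))
    X_new = X + VHL_new + VLL_new  # total change velocity is the sum of both
    return X_new, VHL_new, VLL_new
```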


Fig. 2. Schematic representation of the transition from the current iteration to the subsequent iteration.

Fig. 3. Flowchart of the HPSSO algorithm.


Table 1. Details of the mathematical optimization problems.

Test function | n | Optimum | Domain | Name
f1(x) = Σ_{i=1}^{n} x_i^2 | 30 | 0 | ±5.12 | Sphere
f2(x) = Σ_{i=1}^{n-1} [100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2] | 30 | 0 | ±50 | Rosenbrock
f3(x) = Σ_{i=1}^{n} |x_i| + Π_{i=1}^{n} |x_i| | 30 | 0 | ±10 | Schwefel's P2.22
f4(x) = Σ_{i=1}^{n} (Σ_{j=1}^{i} x_j)^2 | 30 | 0 | ±100 | Quadric
f5(x) = Σ_{i=1}^{n} (⌊x_i + 0.5⌋)^2 | 30 | 0 | ±100 | Step
f6(x) = Σ_{i=1}^{n} i x_i^4 + rand[0,1) | 30 | 0 | ±1.28 | Quadric noise
f7(x) = 20 + e - 20 exp(-0.2 √((1/n) Σ_{i=1}^{n} x_i^2)) - exp((1/n) Σ_{i=1}^{n} cos 2πx_i) | 30 | 0 | ±32 | Ackley
f8(x) = (1/4000) Σ_{i=1}^{n} x_i^2 - Π_{i=1}^{n} cos(x_i/√i) + 1 | 30 | 0 | ±600 | Griewank
f9(x) = Σ_{i=1}^{n} [x_i^2 - 10 cos(2πx_i) + 10] | 30 | 0 | ±5.12 | Rastrigin
f10(x) = Σ_{i=1}^{n} [y_i^2 - 10 cos(2πy_i) + 10] | 30 | 0 | ±5.12 | Non-continuous Rastrigin
f11(x) = (π/n) {10 sin^2(πy_1) + Σ_{i=1}^{n-1} (y_i - 1)^2 [1 + 10 sin^2(πy_{i+1})] + (y_n - 1)^2} + Σ_{i=1}^{n} u(x_i, 10, 100, 4), with y_i = 1 + (x_i + 1)/4 | 30 | 0 | ±50 | Generalized penalized

In f10: y_i = x_i if |x_i| < 0.5, and y_i = round(2x_i)/2 if |x_i| ≥ 0.5.
In f11: u(x_i, a, k, m) = k(x_i - a)^m if x_i > a; 0 if -a ≤ x_i ≤ a; k(-x_i - a)^m if x_i < -a.
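The definitions in Table 1 translate directly into code. A sketch of three of the benchmarks (function names ours; each has its global minimum of 0 at the origin):

```python
import numpy as np

def sphere(x):
    """f1: unimodal Sphere function."""
    return float(np.sum(x ** 2))

def ackley(x):
    """f7: multimodal Ackley function."""
    n = x.size
    return float(20.0 + np.e
                 - 20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
                 - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n))

def rastrigin(x):
    """f9: multimodal Rastrigin function with many local optima."""
    return float(np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))
```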

Fig. 4. Results of HPSSO parameter tuning analysis (convergence curves of -Log(Fitness) versus number of FEs for c3 = 0.2, 0.5, 1, 2, 3, 4 and 5) carried out for: (a) Sphere problem; and (b) Ackley problem.

Fig. 5. Results of HPSSO parameter tuning analysis carried out for the Sphere problem with N = 50: (a) variation of average computational cost with respect to the number of subcolonies and aimless particles; (b) variation of lowest computational cost with respect to the number of subcolonies and aimless particles; (c) variation of average optimized cost; and (d) variation of best optimized cost.


Table 2. Comparison of optimization results obtained by HPSSO and other metaheuristic methods in the mathematical optimization problems (SSO from Neshat et al. [15]; PSO and ALC-PSO from Chen et al. [16]; HPSSO from the present work).

Function | SSO (FEs = 2 × 10^4) | PSO (2 × 10^5) | ALC-PSO (2 × 10^5) | HPSSO (1000) | HPSSO (5000) | HPSSO (2 × 10^4) | HPSSO (2 × 10^5)
Sphere f1 | 0 | 2.016e-56 | 8.206e-172 | 3.87e-06 | 6.137e-15 | 4.239e-33 | 1.879e-37
Rosenbrock f2 | 2.4373e-1 | 1.439e-2 | 3.729e-7 | 0.03014 | 4.327e-12 | 4.020e-26 | 1.294e-30
Schwefel's P2.22 f3 | 1.58e-78 | 8.572e-38 | 1.121e-98 | 2.748e-122 | 6.173e-244 | 0 | 0
Quadric f4 | 4.16e-15 | 2.941e-3 | 8.267e-14 | 1.573e-4 | 2.372e-11 | 3.348e-31 | 3.285e-35
Step f5 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Quadric noise f6 | 2.86e-3 | – | – | 2.212e-3 | 6.670e-05 | 4.778e-05 | 3.268e-06
Ackley f7 | 4.7025e-12 | 7.994e-15 | 7.994e-15 | 3.573e-4 | 1.203e-07 | 8.882e-16 | 8.882e-16
Griewank f8 | 4.8516e-8 | 0 | 0 | 1.486e-08 | 0 | 0 | 0
Rastrigin f9 | 1.8104e-10 | 14.92 | 7.105e-15 | 7.017e-4 | 6.221e-12 | 0 | 0
Non-continuous Rastrigin f10 | 6.04e-19 | 1 | 0 | 0.128 | 1.192e-10 | 0 | 0
Generalized penalized f11 | 1.84e-31 | 1.570e-32 | 1.570e-32 | 1.816e-06 | 1.522e-13 | 1.636e-30 | 1.571e-32

Table 3. Comparison of robustness and reliability of HPSSO and ALC-PSO in the mathematical optimization problems.

Function | Statistic | ALC-PSO [16] | HPSSO
f1 | Best | 1.135e-172 | 1.879e-37
f1 | Mean | 1.677e-161 | 1.121e-32
f1 | Dev | 8.206e-161 | 3.829e-32
f2 | Best | 3.729e-7 | 1.294e-30
f2 | Mean | 7.613 | 2.801e-28
f2 | Dev | 6.658 | 7.624e-8
f3 | Best | 1.121e-98 | 0
f3 | Mean | 1.161e-90 | 5.193e-272
f3 | Dev | 4.146e-90 | 7.363e-283
f4 | Best | 8.267e-14 | 3.285e-35
f4 | Mean | 1.792e-11 | 1.970e-31
f4 | Dev | 3.538e-11 | 9.778e-31
f5 | Best | 0 | 0
f5 | Mean | 0 | 0
f5 | Dev | 0 | 0
f7 | Best | 7.694e-15 | 8.882e-16
f7 | Mean | 1.148e-14 | 2.783e-15
f7 | Dev | 2.941e-15 | 2.196e-15
f8 | Best | 0 | 0
f8 | Mean | 1.221e-2 | 0
f8 | Dev | 1.577e-2 | 0
f9 | Best | 7.105e-15 | 0
f9 | Mean | 2.528e-14 | 0
f9 | Dev | 1.376e-14 | 0
f10 | Best | 0 | 0
f10 | Mean | 1.251e-11 | 7.105e-16
f10 | Dev | 6.752e-11 | 3.826e-15
f11 | Best | 1.570e-32 | 1.571e-32
f11 | Mean | 4.393e-32 | 5.889e-23
f11 | Dev | 7.602e-32 | 3.171e-22

Fig. 6. Comparison of convergence curves relative to the best design and the average of 30 optimization runs: (a) f1; (b) f7; (c) f9; and (d) f12.


Fig. 7. Global search speed and local search speed of the HPSSO algorithm (convergence of the best particle, worst particle and population average): (a) f1; (b) f7; and (c) f12.

Table 4. Results of sensitivity analysis carried out to find the best combination of internal population parameters of HPSSO for the 25-bar truss problem.

        Nll=2    Nll=3    Nll=5    Nll=7    Nll=10   Nll=15   Nll=20
N=20    545.3354 545.1735 545.1850 545.2246 –        –        –
N=30    545.2539 545.1767 545.1665 545.1888 545.2000 –        –
N=50    545.2127 545.1968 545.1836 545.1831 545.1862 545.2001 545.2571
N=100   545.2356 545.2220 545.2033 545.1773 545.1733 545.1887 545.2134
N=150   545.3216 545.2592 545.2376 545.2121 545.1802 545.1742 545.1990


where α_HL, β_HL, α_LL and β_LL are acceleration control coefficients adaptively defined in [15], while rand() is a random number uniformly distributed in (0,1).

Aimless particles o_i also carry out exploration but have nothing to do with the head leader or local leaders. They simply move back and forth with respect to their previous positions, displacing by a random fraction of the allowable step defined by the upper and lower bounds of the design variables. That is:

o_i^{k+1} = o_i^k + rand({-1,1}) × [rand(min_s, max_s) / (1 + rand())]    (3)
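Eq. (3) as a short sketch, under the assumption that rand(min_s, max_s) denotes a uniform draw between the variable bounds and rand({-1,1}) a random sign; the function name is ours.

```python
import numpy as np

def sso_aimless_step(o, min_s, max_s, rng=None):
    """Random back-and-forth move of an SSO aimless particle per Eq. (3)."""
    rng = rng or np.random.default_rng()
    sign = rng.choice([-1.0, 1.0], size=o.shape)    # rand({-1, 1})
    step = rng.uniform(min_s, max_s, size=o.shape)  # rand(min_s, max_s)
    # the (1 + rand()) denominator damps the step to a random fraction
    return o + sign * step / (1.0 + rng.random(o.shape))
```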

2.3. Hybrid Particle Swallow Swarm Optimization (HPSSO)

HPSSO adds two important features of SSO to the basic PSO formulation: a specific number of subcolonies, and a certain number of particles devoted to specific tasks. Similar to SSO, there are leaders (a global leader and local leaders), explorers, and aimless particles. The size of population N is specified along with the number of subcolonies Nsubcolony and aimless particles Naimless; the number of explorer particles (Ne.p) is determined as a consequence. HPSSO starts with a set of particles randomly positioned in the design space and with random velocities. The position and velocity of each particle are progressively updated to search the optimum.


Table 5. Effect of aimless particles search strategy in truss optimization problems.

Benchmark case | Scenario | Best (lb) | Average (lb) | Worst (lb)
10-bar truss, Case 1 | Global | 5060.86 | 5064.46 | 5081.63
10-bar truss, Case 1 | Local | 5060.86 | 5062.28 | 5076.92
10-bar truss, Case 2 | Global | 4676.97 | 4677.46 | 4681.32
10-bar truss, Case 2 | Local | 4676.95 | 4677.38 | 4679.72
22-bar truss | Global | 1023.98 | 1026.80 | 1057.43
22-bar truss | Local | 1023.99 | 1027.60 | 1052.05
25-bar truss | Global | 545.16 | 545.69 | 547.12
25-bar truss | Local | 545.16 | 545.56 | 546.99
72-bar truss | Global | 363.8617 | 364.06 | 365.86
72-bar truss | Local | 363.86 | 364.07 | 364.97
120-bar truss | Global | 33250.30 | 33259.96 | 33283.41
120-bar truss | Local | 33250.05 | 33260.70 | 33307.16
200-bar truss | Global | 25831.12 | 28047.89 | 34255.55
200-bar truss | Local | 25698.85 | 28386.72 | 34808.78

Fig. 8. Schematic of the planar 10-bar truss.


In each iteration, particles are sorted based on the value of the cost function (usually a pseudo-cost function including penalty terms, or the fitness). The best particle is set as the head leader, and the Nsubcolony subsequent particles are set as local leaders, going from top to bottom. The Naimless particles are then selected from the worst ones, going from bottom to top. The remaining particles are set as explorers. The search of each explorer particle is performed by adding the updated velocity vector to the current position of that particle. Compared with PSO, the velocity vector includes an extra term to account for the contribution of local leaders:

X_i^{k+1} = X_i^k + V_i^{k+1}
V_i^{k+1} = ω V_i^k + c_1 r_1 (P_i^k - X_i^k) + c_2 r_2 (P_g^k - X_i^k) + c_3 r_3 (P_{l(i)}^k - X_i^k)    (4)

where P_{l(i)}^k is the local leader of the subcolony including the ith particle, r_3 is a uniform random number in the (0,1) interval and c_3 is the learning factor controlling the influence of the proximity cognition. In each iteration, the position of a particle included in a subcolony can change so as to move away from the current local leader and join the leader of another group. The distance between each explorer particle and the local leaders is used to determine the related subcolony, so that each explorer particle is placed near the closest local leader. That is:

dist_{i,j} = |X_i - P_{l,j}|,   i = 1, 2, ..., N_{e.p},   j = 1, 2, ..., N_{subcolony}
dist_{i,j} = √[(X_i^1 - P_{l,j}^1)^2 + (X_i^2 - P_{l,j}^2)^2 + ... + (X_i^{ng} - P_{l,j}^{ng})^2]    (5)

where ng is the number of design variables and dist_{i,j} is the distance between the ith explorer particle and the jth local leader.
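The nearest-leader assignment of Eq. (5) vectorizes naturally; `assign_subcolonies` is our illustrative name for the operation, not one the paper uses.

```python
import numpy as np

def assign_subcolonies(X_explorers, local_leaders):
    """Assign each explorer to its nearest local leader per Eq. (5).

    X_explorers   : (Ne_p, ng) explorer positions
    local_leaders : (N_subcolony, ng) local-leader positions
    Returns an (Ne_p,) array of subcolony indices.
    """
    # Euclidean distance from every explorer to every local leader
    diff = X_explorers[:, None, :] - local_leaders[None, :, :]
    dist = np.linalg.norm(diff, axis=2)      # shape (Ne_p, N_subcolony)
    return np.argmin(dist, axis=1)           # closest leader per explorer
```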

Three possible options are considered for aimless particles: (i) they perform just a random search, in the same way as in SSO; (ii) they perform a local search in the neighborhood of the local leaders; (iii) they perform a dynamic search in the neighborhood of the global leader. If option (ii) is chosen, the number of aimless particles coincides with the number of subcolonies and hence one aimless particle is defined for each subcolony. In this case, the distance between the worst particles and the local leaders is the criterion used to assign each aimless particle to its corresponding subcolony. This strategy is most effective in truss optimization problems, while making aimless particles search in the neighborhood of the global leader works better for mathematical optimization problems.

Aimless particles perform their search in the neighborhood of the local leader of their subcolony or the head leader of the population according to the following rules:

o_i^{k+1} = P_{l(i)}^k + rand(-1,1) × [λ_k (max_s - min_s)]   (second scenario)
o_i^{k+1} = P_g^k + rand(-1,1) × [λ_k (max_s - min_s)]   (third scenario),   i = 1, 2, 3, ..., N_aimless    (6)

where rand(-1,1) is a uniform random number between -1 and 1; min_s and max_s, respectively, are the lower and upper bounds of the design variables; λ_k is a parameter defined to generate the effective search range about the global optimum or the local leaders. That is:

λ_k = λ_max - (λ_max - λ_min) × iter / iter_max    (7)

where λ_max and λ_min, respectively, are the values of λ in the first and last iterations of the algorithm, set in the present study as 0.01 and 0.001; iter is the number of the current iteration; iter_max is the total number of optimization iterations.
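Eqs. (6) and (7) together define a local search whose range shrinks as iterations progress. A sketch (function names ours; the choice between the second and third scenario is reduced to passing the chosen leader position as `center`):

```python
import numpy as np

def lambda_k(it, it_max, lam_max=0.01, lam_min=0.001):
    """Linearly shrinking search-range factor of Eq. (7)."""
    return lam_max - (lam_max - lam_min) * it / it_max

def aimless_local_search(center, it, it_max, min_s, max_s, rng=None):
    """Eq. (6): sample around a leader (local or head) within a shrinking range.

    center : (d,) position of the chosen leader (P_l(i) or P_g)
    """
    rng = rng or np.random.default_rng()
    radius = lambda_k(it, it_max) * (max_s - min_s)
    return center + rng.uniform(-1.0, 1.0, size=center.shape) * radius
```

With the paper's settings (λ from 0.01 down to 0.001), the sampled neighborhood starts at 1% of the variable range and ends at 0.1%.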

Fig. 2 illustrates the transition between two consecutive generations. The population is updated by: (i) copying the head leader and local leaders from one generation to the next (in some way, this can be interpreted as an elitist strategy); (ii) performing the search with explorer particles to move the population toward the best regions of the design space (exploration phase or global search); and (iii) performing a dynamic local search with aimless particles in the neighborhood of the head leader or local leaders. The flowchart of the HPSSO algorithm is presented in Fig. 3.
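The sort-and-partition step of the generation update can be sketched as follows (names ours; lower pseudo-cost is better, mirroring the ranking described above):

```python
import numpy as np

def assign_roles(fitness, n_subcolony, n_aimless):
    """Partition the population into head leader, local leaders,
    explorers and aimless particles, as in the HPSSO generation update.

    fitness : (N,) pseudo-cost values (lower is better)
    Returns (head, locals_, explorers, aimless) as particle-index arrays.
    """
    order = np.argsort(fitness)               # best particle first
    head = order[0]                           # head leader: best particle
    locals_ = order[1:1 + n_subcolony]        # next-best become local leaders
    aimless = order[len(order) - n_aimless:]  # worst particles become aimless
    explorers = order[1 + n_subcolony:len(order) - n_aimless]
    return head, locals_, explorers, aimless
```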

3. Mathematical optimization problems

Eleven mathematical optimization problems were solved in order to test the numerical efficiency of the HPSSO algorithm developed in this research. Table 1 summarizes the optimization problems, reporting the number of design variables (set equal to 30), the target optimum cost, the bounds of the design variables, and the denomination of each test problem in the optimization literature. The first six problems regard unimodal functions that were selected to check whether HPSSO could maintain the fast-convergence property of standard PSO. The other five problems regard multimodal functions with many local optima and were selected to check whether HPSSO has global search capability and can avoid premature convergence to local optima [16].

HPSSO includes seven internal parameters. The first four parameters are the inertia weight, controlling the influence of the velocity previously possessed by the particle, and three learning factors, controlling the influence of the social, self and proximity cognition mechanisms. The inertia weight ω can range between 0 and 1 and strongly affects convergence behavior. By properly selecting ω, it is possible to achieve a good balance between global and local exploration abilities, thus finding better designs.


Table 6. Comparison of optimization results obtained by HPSSO and other metaheuristic methods in the 10-bar truss problem. Optimal cross-sectional areas (in^2). Methods: HPSACO (Kaveh and Talatahari [23]), ABC-AP (Sonmez [24]), SAHS (Degertekin [6]), TLBO (Degertekin [11]), PSO and MSPSO (Talatahari et al. [20]), HPSSO (present work).

Case 1
Group | HPSACO | ABC-AP | SAHS | TLBO | PSO | MSPSO | HPSSO
A1  | 30.307  | 30.548  | 30.394  | 30.4286 | 30.602  | 30.5257 | 30.53838
A2  | 0.1000  | 0.1000  | 0.1000  | 0.1     | 0.1     | 0.1001  | 0.1
A3  | 23.4340 | 23.1800 | 23.0980 | 23.2436 | 23.9005 | 23.2250 | 23.15103
A4  | 15.5050 | 15.2180 | 15.4910 | 15.3677 | 14.7888 | 15.4114 | 15.20566
A5  | 0.1000  | 0.1000  | 0.1000  | 0.1     | 0.1000  | 0.1001  | 0.1
A6  | 0.5241  | 0.5510  | 0.5290  | 0.5751  | 0.1000  | 0.5583  | 0.548897
A7  | 7.4365  | 7.4630  | 7.4880  | 7.4404  | 8.5553  | 7.4395  | 7.465322
A8  | 21.0790 | 21.0580 | 21.1890 | 20.9665 | 21.0431 | 20.9172 | 21.06437
A9  | 21.2290 | 21.5010 | 21.3420 | 21.5330 | 20.8093 | 21.5098 | 21.52935
A10 | 0.1000  | 0.1000  | 0.1000  | 0.1000  | 0.1000  | 0.1000  | 0.1
Best weight (lb) | 5056.56* | 5060.880 | 5061.42 | 5060.96 | 5076.7 | 5061 | 5060.86
Number of structural analyses | 10,650 | 500 × 10^3 | 7081 | 16,872 | N/A | N/A | 14,118

Case 2
Group | HPSACO | ABC-AP | SAHS | TLBO | PSO | MSPSO | HPSSO
A1  | 23.1940 | 23.4692 | 23.5250 | 23.5240 | 23.9324 | 23.4432 | 23.52377
A2  | 0.1000  | 0.1005  | 0.1000  | 0.1000  | 0.1000  | 0.1000  | 0.1
A3  | 24.5850 | 25.2393 | 25.4290 | 25.4410 | 25.2478 | 25.3718 | 25.36864
A4  | 14.2210 | 14.3540 | 14.4880 | 14.4790 | 14.1791 | 14.1360 | 14.37799
A5  | 0.1000  | 0.1001  | 0.1000  | 0.1000  | 0.1000  | 0.1000  | 0.1
A6  | 1.9690  | 1.9701  | 1.9920  | 1.9950  | 1.9701  | 1.9699  | 1.969732
A7  | 12.4890 | 12.4128 | 12.3520 | 12.3340 | 12.5097 | 12.4335 | 12.3678
A8  | 12.9250 | 12.8925 | 12.6980 | 12.6890 | 13.0379 | 13.0173 | 12.79722
A9  | 20.9520 | 20.3343 | 20.3410 | 20.3540 | 19.9002 | 20.2717 | 20.32577
A10 | 0.1010  | 0.1000  | 0.1000  | 0.1000  | 0.1000  | 0.1000  | 0.1
Best weight (lb) | 4675.78** | 4677.077 | 4678.84 | 4678.31 | 4678 | 4677.26 | 4676.95
Number of structural analyses | 9925 | 500 × 10^3 | 7267 | 14,857 | N/A | N/A | 14,406

* HPSACO violates the design constraints by 0.099% [11].
** HPSACO violates the design constraints by 0.079% [11].



Table 7. Comparison of robustness and reliability of HPSSO and other metaheuristic methods in the 10-bar truss problem. Weights are in lb.

Case 1

| Algorithm | Best | Average | Worst | Best–average difference (%) | Best–worst difference (%) | SD |
|---|---|---|---|---|---|---|
| SAHS [6] | 5061.42 | 5061.95 | 5063.39 | 0.0105 | 0.0389 | 0.71 |
| TLBO [11] | 5060.96 | 5062.08 | 5063.23 | 0.0221 | 0.0449 | 0.79 |
| PSO [20] | 5076.70 | 5354.87 | 7076.70 | 5.47 | 39.39 | 508.77 |
| MSPSO [20] | 5061.00 | 5064.46 | 5078.00 | 0.07 | 0.33 | 5.72 |
| HPSSO | 5060.86 | 5062.28 | 5076.90 | 0.028 | 0.3159 | 4.325 |

Case 2

| Algorithm | Best | Average | Worst | Best–average difference (%) | Best–worst difference (%) | SD |
|---|---|---|---|---|---|---|
| SAHS [6] | 4678.84 | 4680.08 | 4682.26 | 0.0265 | 0.0731 | 1.89 |
| TLBO [11] | 4678.31 | 4680.12 | 4681.23 | 0.0387 | 0.0624 | 1.016 |
| PSO [20] | 4678.00 | 4979.320 | 6759.84 | 6.44 | 44.5 | 453.33 |
| MSPSO [20] | 4677.26 | 4681.45 | 4687.50 | 0.08 | 0.22 | 2.19 |
| HPSSO | 4676.95 | 4677.38 | 4679.72 | 0.0092 | 0.059 | 0.46354 |


study, ω was reduced linearly from 0.9 to 0.4 during the optimization process. As shown by Eberhart and Shi [19], this leads to better results. Self and social learning factors (c1 and c2) were set equal to 2 in order to make the average velocity change coefficient close to one [20].
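The parameter roles described above can be sketched as follows. This is an illustrative reconstruction, not the paper's exact update rule: the function names are ours, and the proximity term toward the subcolony's local leader is written by analogy with the self and social terms.

```python
import random

def inertia_weight(it, max_it, w_max=0.9, w_min=0.4):
    """Linear decrease of the inertia weight from w_max to w_min."""
    return w_max - (w_max - w_min) * it / max_it

def update_velocity(v, x, p_best, g_best, l_best, w, c1=2.0, c2=2.0, c3=2.0):
    """PSO-style velocity update extended by a proximity term toward the
    local (subcolony) leader l_best, the SSO ingredient that HPSSO adds."""
    r1, r2, r3 = (random.random() for _ in range(3))
    return (w * v
            + c1 * r1 * (p_best - x)   # self (cognitive) term
            + c2 * r2 * (g_best - x)   # social term
            + c3 * r3 * (l_best - x))  # proximity term
```

With c3 set very small the proximity term vanishes and the update reduces to standard PSO, consistent with the behavior reported for the tuning runs.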

The unimodal Sphere function and the multimodal Ackley function (denoted as f1 and f7 in Table 1, respectively) were selected for tuning the proximity cognition parameter c3. These functions were minimized with HPSSO (N = 30, Nsubcolony = 5, and Naimless = 3) by performing ten independent runs with randomly generated initial solutions and setting different values of c3. Convergence curves relative to the best optimization runs are presented in Fig. 4 with respect to the number of function evaluations (FEs). The presence of non-continuous step-like movements reveals that setting c3 to a very high value leads to instability in exploring the search space. Fast convergence indicates that setting c3 to a very small value leads to a performance identical to the standard PSO algorithm. HPSSO was most efficient in the case c3 = 2: this is very similar to the findings of Kennedy and Eberhart on the self and social cognition parameters [14].

The other three internal parameters of HPSSO describe the structure of the population: the population size N, the number of

Fig. 9. Schematic of the spatial 22-bar truss.

local leaders or number of subcolonies Nsubcolony, and the number of aimless particles Naimless. Option (iii) was selected for aimless particles, which carry out their search in the neighborhood of the head leader. The effect of the population parameters was investigated for the unimodal Sphere function and the multimodal Ackley function (respectively denoted as f1 and f7 in Table 1). These two problems were solved for N = 15, 30, 50, 100, 150 considering different values of Nsubcolony (here denoted as NLL) and Naimless; 100 independent optimization runs were performed for each case.
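The population structure (head leader, subcolony local leaders, aimless particles) might be organized as sketched below. The rank-and-split bookkeeping is our assumption for illustration only; the paper defines the exact SSO grouping rules.

```python
def organize_swarm(particles, costs, n_subcolonies, n_aimless):
    """Illustrative sketch (names are ours, not the paper's): rank the swarm
    by cost, reserve the worst n_aimless members as aimless particles, split
    the rest into n_subcolonies groups, and take each group's best member as
    its local leader. The overall best particle is the head leader."""
    order = sorted(range(len(particles)), key=lambda i: costs[i])
    cut = len(order) - n_aimless
    explorers, aimless = order[:cut], order[cut:]
    head_leader = explorers[0]
    size = len(explorers) // n_subcolonies
    subcolonies = [explorers[k * size:(k + 1) * size] for k in range(n_subcolonies)]
    subcolonies[-1].extend(explorers[n_subcolonies * size:])  # leftovers
    local_leaders = [min(g, key=lambda i: costs[i]) for g in subcolonies]
    return head_leader, subcolonies, local_leaders, aimless
```

With N = 30, Nsubcolony = 5 and Naimless = 3 (the setting used for the tuning runs), this yields five subcolonies of explorer particles plus three aimless particles.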

Fig. 5 summarizes the optimization results with respect to the number of aimless particles and subcolonies. The plots represent the best solution and the average of the independent optimization runs. It appears that the convergence speed (expressed in terms of the number of function evaluations required in the optimization process) is insensitive to the number of aimless particles. However, increasing the number of subcolonies reduced convergence speed. This behavior was seen regardless of population size. Fig. 5c shows that the average optimized cost became less sensitive to Naimless once this parameter exceeded one tenth of the population size. The same conclusion can be drawn for the best optimized cost. Such a behavior was observed regardless of population size. In general, increasing the population size allows the number of function evaluations



Table 8. Comparison of optimization results obtained by HPSSO and other metaheuristic methods in the 22-bar truss problem. Optimal cross-sectional areas are in in². (HS: Lee and Geem [1]; PSO and MSPSO: Talatahari et al. [20]; HPSSO: present study.)

| Element group | HS | PSO | MSPSO | HPSSO |
|---|---|---|---|---|
| 1 A1–A4 | 2.588 | 2.5799 | 2.6320 | 2.620593 |
| 2 A5–A6 | 1.083 | 1.1312 | 1.1952 | 1.206836 |
| 3 A7–A8 | 0.363 | 0.3472 | 0.3541 | 0.355719 |
| 4 A9–A10 | 0.422 | 0.4212 | 0.4145 | 0.419223 |
| 5 A11–A14 | 2.827 | 2.8330 | 2.7644 | 2.783028 |
| 6 A15–A18 | 2.055 | 2.0946 | 2.0297 | 2.082686 |
| 7 A19–A22 | 2.044 | 2.0205 | 2.0909 | 2.029553 |
| Best weight (lb) | 1022.23 | 1024 | 1024 | 1023.9857 |
| Number of structural analyses | 10,000 | 25,000 | 12,500 | 14,406 |

Table 9. Comparison of robustness and reliability of HPSSO and other metaheuristic methods in the 22-bar truss problem. Weights are in lb.

| Algorithm | Best | Average | Worst | Best–average difference (%) | Best–worst difference (%) | SD |
|---|---|---|---|---|---|---|
| PSO [20] | 1024 | 1033.790 | 1093.120 | 0.95 | 6.75 | 17.29 |
| MSPSO [20] | 1024 | 1028.550 | 1049.180 | 0.44 | 2.46 | 6.63 |
| HPSSO | 1023.9857 | 1027.599 | 1052.048 | 0.3528 | 2.7405 | 6.357 |


Fig. 10. Schematic of the spatial 25-bar tower.


to be reduced but may result in worse optimized designs. This was also observed in the case of the Ackley function.

Based on the sensitivity study described above, the internal parameters of HPSSO were set as follows: N = 30, Nsubcolony = 5, and Naimless = 3. The inertia weight ω decreased linearly from 0.9 to 0.4. Self, social and proximity learning factors were set as c1 = c2 = c3 = 2. Since the HPSSO search process is governed by random rules, each test problem was solved by carrying out 30 independent optimization runs to obtain statistically significant results.

Finally, each test problem was solved for different values of the maximum number of function evaluations in order to check the ability of HPSSO to balance convergence rate and accuracy.

The proposed algorithm was compared with SSO, standard PSO and an advanced PSO variant with an aging leader and challengers (ALC-PSO). Chen et al. [16] recently developed the ALC-PSO algorithm, in which the leader of the swarm is assigned a growing age and a life span; other individuals are allowed to challenge


Table 10. Comparison of optimization results obtained by HPSSO and other metaheuristic methods in the 25-bar tower problem. Optimal cross-sectional areas are in in². (HPSACO: Kaveh and Talatahari [23]; HBB–BC: Kaveh and Talatahari [25]; SAHS: Degertekin [6]; TLBO: Degertekin [11]; PSO and MSPSO: Talatahari et al. [20]; HPSSO: present work.)

| Element group | HPSACO | HBB–BC | SAHS | TLBO | PSO | MSPSO | HPSSO |
|---|---|---|---|---|---|---|---|
| 1 A1 | 0.0100 | 2.6622 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.01 |
| 2 A2–A5 | 2.0540 | 1.9930 | 2.0740 | 2.0712 | 1.9503 | 1.9848 | 1.9907 |
| 3 A6–A9 | 3.0080 | 3.0560 | 2.9610 | 2.9570 | 3.0408 | 2.9956 | 2.9881 |
| 4 A10–A11 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.0100 |
| 5 A12–A13 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.0100 |
| 6 A14–A17 | 0.6790 | 0.6650 | 0.6910 | 0.6891 | 0.6929 | 0.6852 | 0.6824 |
| 7 A18–A21 | 1.6110 | 1.6420 | 1.6170 | 1.6209 | 1.6866 | 1.6778 | 1.6764 |
| 8 A22–A25 | 2.6780 | 2.6790 | 2.6740 | 2.6768 | 2.6362 | 2.6599 | 2.6656 |
| Best weight (lb) | 544.99 | 545.16 | 545.12 | 545.09 | 545.22 | 545.16 | 545.164 |
| Constraint tolerance (%) [11] | 3.52 | 2.06 | None | None | None | None | None |
| Number of structural analyses | 9875 | 12,500 | 9051 | 15,318 | 18,400 | 10,800 | 13,326 |


the leadership when the leader becomes aged. By comparing ALC-PSO with eight popular PSO variants, it was found that ALC-PSO has a good balance between global and local exploration abilities, which allows better optimized designs to be found [16]. Therefore, ALC-PSO was taken as an effective basis of comparison to evaluate the performance of HPSSO.

Table 2 compares the optimized costs found by each algorithm. It can be seen that HPSSO always found the best design except for the Sphere function problem. In the Generalized Penalized problem, HPSSO was less efficient than SSO when the number of function evaluations was 2 × 10⁴, and behaved practically the same as ALC-PSO. Table 3 provides the detailed results of the 30 independent optimization runs carried out with HPSSO and ALC-PSO. It appears that HPSSO is very competitive with ALC-PSO and even more reliable than the reference algorithm except for the Generalized Penalized problem. It should be noted that in the Rosenbrock function case (f2), HPSSO converges to the global optimum without being trapped in the local optimum with cost (n − 1) at which all design variables become equal to 0. This was observed in all 30 independent runs performed in this study.

Fig. 6 presents the convergence history of HPSSO for the best observed results and the average of the 30 independent runs. Plots are relative to test problems f1, f7, f9, and f12. The presence of step-like movements indicates how HPSSO can escape from local optima and find better designs, thus avoiding premature convergence.

In order to further analyze convergence behavior, the optimization histories of the best and worst particles, and the average optimization history of all particles, corresponding to a single independent run of HPSSO, are presented in Fig. 7. It appears that HPSSO reaches an effective balance between global search and local search. It was always seen that the difference between the average of all particles and the worst particle is much smaller than the difference between the average and the best particle. Another important aspect to be underlined is that in the early optimization cycles, where randomness plays the main role, the distance between the best particle diagram and both the average and worst particle diagrams is small. HPSSO tries to increase this distance in the subsequent iterations: this demonstrates the diversification capability of the optimization search. The distance then decreases again as the search process progresses and finally becomes negligible upon reaching the optimum design: this demonstrates the intensification ability of HPSSO. Such a behavior was observed in all test problems. This discussion demonstrates that HPSSO achieves an effective balance between diversification and intensification.

4. Truss design problems

The HPSSO algorithm developed in this research was further tested in six classical weight minimization problems: a planar 10-bar truss, a spatial 22-bar truss, a spatial 25-bar transmission tower, a spatial 72-bar truss, a 120-bar dome shaped truss and a planar 200-bar truss. The problems included 10, 7, 8, 16, 7 and 29 continuous sizing variables, respectively. Optimization results were compared with the literature, in particular with advanced PSO variants. For example, Talatahari et al. [20] recently developed a multi-stage particle swarm optimization algorithm (MSPSO) including three auxiliary improving mechanisms to enhance efficiency and reliability with respect to standard PSO.

The best combination of internal parameters was determined by carrying out a sensitivity analysis on the 25-bar tower problem: the population included 30 particles, 5 subcolonies and 5 aimless particles; c1 and c3 were set equal to 0.8, c2 was set equal to 2, and the inertia weight decreased from 0.9 to 0.7. For example, Table 4 presents the results of the sensitivity analysis of 50 independent


Table 11. Comparison of robustness and reliability of HPSSO and other metaheuristic methods in the 25-bar tower problem. Weights are in lb.

| Algorithm | Best | Average | Worst | Best–average difference (%) | Best–worst difference (%) | SD |
|---|---|---|---|---|---|---|
| HPSACO [23] | 544.9900 | 545.52 | N/A | 0.0972 | — | 0.315 |
| HBB–BC [25] | 545.1600 | 545.66 | N/A | 0.0917 | — | 0.367 |
| SAHS [6] | 545.1200 | 545.94 | 545.94 | 0.1504 | 0.1504 | 0.91 |
| TLBO [11] | 545.0900 | 545.41 | 546.33 | 0.0587 | 0.2275 | 0.42 |
| PSO [20] | 545.220 | 549.960 | 594.530 | 0.87 | 9.04 | 9.91 |
| MSPSO [20] | 545.160 | 546.030 | 548.780 | 0.16 | 0.66 | 0.8 |
| HPSSO | 545.164 | 545.556 | 546.990 | 0.0718 | 0.3349 | 0.432 |


Fig. 11. Convergence curves recorded for the spatial 25-bar tower: (a) comparison of convergence rates between algorithms; (b) comparison of best and worst particles and the average of all particles for HPSSO.


runs, carried out to adjust the population parameters. It can be seen that setting the number of subcolonies close to one tenth of the population size leads to lighter designs. The lightest design is obtained for N = 30 and Nsubcolony = 5. The stopping criterion was based on a maximum number of optimization iterations set equal to 600. Aimless particles performed their searches in the neighborhood of local leaders. It can be seen that the best combination of internal parameters chosen for the truss design problems is quite different from its counterpart set in the mathematical optimization problems (where ω decreased linearly from 0.9 to 0.4 and the self, social and proximity learning factors were set as c1 = c2 = c3 = 2). The inertia weight parameter is increased, while the self and proximity learning factors are reduced. Reducing the proximity learning factor is likely due to the different search strategy utilized for the aimless particles: in the truss design problems, aimless particles learn from local leaders. In this way, the promising regions are explored more thoroughly to find better solutions, and the movements of explorer particles towards local leaders are reduced. Increasing the inertia weight while reducing the self-learning factor does not change the global search ability. Such changes in the parameter tuning process reduce the step sizes and avoid extra-large steps in the non-smooth constrained search spaces.

In order to investigate the effect of the initial population on the optimization process, each test problem was solved independently 50 times, each time starting from a different randomly generated population. This accounts for the random nature of the HPSSO algorithm. The optimization code was implemented in the MATLAB environment. Structural analyses entailed by the optimization process were performed by means of the direct stiffness method [21].

Table 5 compares the statistical results (i.e. best, average, and worst optimized weight) for the truss test problems. It appears that making aimless particles search in the neighborhood of local leaders is a more effective strategy than searching in the neighborhood of the head leader. However, this local search became less effective in terms of average and worst weight as the complexity of the optimization problem increased. In summary, dynamic local search about the positions of local leaders also supports the global search ability of HPSSO.

4.1. Statement of the weight minimization truss problem

The weight minimization problem of a truss structure can be stated as follows:

\[
\text{Find } \{X\} = [x_1, x_2, \ldots, x_{n_g}], \quad x_i \in D
\]
\[
\text{to minimize } W(\{X\}) = \sum_{i=1}^{n_g} x_i \sum_{j=1}^{nm(i)} \rho_j L_j
\]
\[
\text{subject to } g_j(\{X\}) \le 0, \quad j = 1, 2, \ldots, n \qquad (8)
\]

where {X} is the set of design variables; n_g is the number of member groups (i.e. the number of optimization variables) defined according to structural symmetry; D represents the design space including the cross-sectional areas of truss elements, which can take discrete or continuous values; W({X}) is the weight of the structure; nm(i) is the number of members included in the ith group; ρ_j and L_j are respectively the material density and the length of the jth member included in the ith group; g_j({X}) denote the n optimization constraints.
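As a minimal sketch, the objective of Eq. (8) can be evaluated as follows (function and argument names are illustrative):

```python
def truss_weight(areas, groups, rho, lengths):
    """W({X}) = sum_i x_i * sum_{j in group i} rho_j * L_j, as in Eq. (8).
    areas: one cross-sectional area per member group; groups: member
    indices per group; rho, lengths: per-member density and length."""
    return sum(a * sum(rho[j] * lengths[j] for j in members)
               for a, members in zip(areas, groups))
```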

In order to handle optimization constraints, a penalty approach was utilized in this study by introducing the following pseudo-cost function:



Fig. 12. Schematic of the 72-bar spatial truss.

Table 12. Comparison of optimization results obtained by HPSSO and other metaheuristic methods in the 72-bar truss problem. Optimal cross-sectional areas are in in². (ABC-AP: Sonmez [24]; SAHS: Degertekin [6]; TLBO: Degertekin [11]; PSO and MSPSO: Talatahari et al. [20]; HPSSO: present work.)

| Element group | ABC-AP | SAHS | TLBO | PSO | MSPSO | HPSSO |
|---|---|---|---|---|---|---|
| 1 A1–A4 | 1.8907 | 1.8890 | 1.8929 | 1.9067 | 1.9005 | 1.89326 |
| 2 A5–A12 | 0.5166 | 0.5200 | 0.5160 | 0.5311 | 0.5056 | 0.511132 |
| 3 A13–A16 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.01 |
| 4 A17–A18 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.01 |
| 5 A19–A22 | 1.2968 | 1.2890 | 1.2917 | 1.3154 | 1.2914 | 1.291227 |
| 6 A23–A30 | 0.5191 | 0.5240 | 0.5176 | 0.5168 | 0.5158 | 0.515116 |
| 7 A31–A34 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.01 |
| 8 A35–A36 | 0.0101 | 0.0100 | 0.0100 | 0.0100 | 0.0100 | 0.01 |
| 9 A37–A40 | 0.5208 | 0.5390 | 0.5229 | 0.5345 | 0.5178 | 0.536082 |
| 10 A41–A48 | 0.5178 | 0.5190 | 0.5193 | 0.5055 | 0.5188 | 0.521176 |
| 11 A49–A52 | 0.0100 | 0.0150 | 0.0100 | 0.0114 | 0.0108 | 0.010019 |
| 12 A53–A54 | 0.1048 | 0.1050 | 0.0997 | 0.1055 | 0.1165 | 0.110878 |
| 13 A55–A58 | 0.1675 | 0.1670 | 0.1680 | 0.1672 | 0.1659 | 0.166691 |
| 14 A59–A66 | 0.5346 | 0.5320 | 0.5359 | 0.5322 | 0.5479 | 0.533984 |
| 15 A67–A70 | 0.4443 | 0.4250 | 0.4457 | 0.4422 | 0.4437 | 0.453724 |
| 16 A71–A72 | 0.5803 | 0.5790 | 0.5818 | 0.5591 | 0.5619 | 0.574581 |
| Best weight (lb) | 364 | 364.0500 | 363.8410 | 364 | 363.9 | 363.8581 |
| Number of structural analyses | 400 × 10³ | 12,852 | 17,954 | N/A | 18,400 | 13,086 |


\[
f_{cost}(\{X\}) = (1 + \varepsilon_1 \, t)^{\varepsilon_2} \times W(\{X\}), \qquad t = \sum_{j=1}^{n} \max[0, g_j(\{X\})] \qquad (9)
\]

where t is the total constraint violation. Constants ε1 and ε2 must be selected considering the exploration and exploitation rates of the search space. In this study, ε1 was set equal to one, while ε2 was selected so as to decrease the total penalty while still reducing cross-sectional areas. Thus, ε2 increased from the value of 1.5 set in the first steps of the search process to the value of 3 set toward the end of the optimization process.
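A sketch of the pseudo-cost of Eq. (9), together with the ε2 ramp from 1.5 to 3 described above. The linear form of the ramp is our assumption; the paper only states the two endpoint values.

```python
def pseudo_cost(weight, constraint_values, eps1=1.0, eps2=1.5):
    """f_cost = (1 + eps1 * t)**eps2 * W, Eq. (9); t sums only the
    positive (violated) constraint values g_j({X})."""
    t = sum(max(0.0, g) for g in constraint_values)
    return (1.0 + eps1 * t) ** eps2 * weight

def eps2_schedule(it, max_it, start=1.5, end=3.0):
    # Linear ramp between the endpoints stated in the text (assumed form).
    return start + (end - start) * it / max_it
```

A feasible design (all g_j ≤ 0) keeps its raw weight, while violations inflate the cost more and more aggressively as ε2 grows late in the run.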

Stress limits on truss members were imposed according to ASD-AISC [22]:


Table 13. Comparison of robustness and reliability of HPSSO and other metaheuristic methods in the 72-bar truss problem. Weights are in lb.

| Algorithm | Best | Average | Worst | Best–average difference (%) | Best–worst difference (%) | SD |
|---|---|---|---|---|---|---|
| SAHS [6] | 364.05 | 366.57 | 369.15 | 0.6922 | 1.4009 | 2.02 |
| TLBO [11] | 363.841 | 364.42 | 365.01 | 0.1591 | 0.3213 | 0.49 |
| PSO [20] | 364.000 | 412.610 | 662.340 | 13.35 | 81.96 | 64.9 |
| MSPSO [20] | 363.900 | 364.350 | 365.850 | 0.12 | 0.53 | 0.32 |
| HPSSO | 363.858 | 364.065 | 364.966 | 0.0569 | 0.3045 | 0.305 |


Fig. 13. Convergence curves recorded for the spatial 72-bar truss: (a) comparison of convergence rates between algorithms; (b) comparison of best and worst particles and the average of all particles.


\[
\sigma_i =
\begin{cases}
0.6 F_y & \text{for } \sigma_i \ge 0 \\
\sigma_i^{-} & \text{for } \sigma_i < 0
\end{cases}
\qquad (10)
\]
\[
\sigma_i^{-} =
\begin{cases}
\left[\left(1 - \dfrac{\lambda_i^2}{2C_c^2}\right) F_y\right] \Big/ \left[\dfrac{5}{3} + \dfrac{3\lambda_i}{8C_c} - \dfrac{\lambda_i^3}{8C_c^3}\right] & \text{for } \lambda_i < C_c \\[2mm]
\dfrac{12\pi^2 E}{23\lambda_i^2} & \text{for } \lambda_i \ge C_c
\end{cases}
\qquad (11)
\]

where E is the modulus of elasticity; F_y is the yield stress; λ_i is the slenderness ratio (λ_i = k L_i / r_i); C_c is the slenderness ratio separating the elastic and inelastic buckling regions (C_c = √(2π²E / F_y)); k is the effective length factor; L_i is the length and r_i the corresponding radius of gyration of the ith element. The radius of gyration can be related to the cross-sectional area as r_i = a A_i^b, where the constants a and b depend on the type of element cross section (for example, pipes, angles, and tees). In this study, pipe sections (a = 0.4993 and b = 0.6777) were adopted for the bars [1].
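The allowable-stress limits of Eqs. (10) and (11) can be evaluated as in the following sketch (function name and argument conventions are ours):

```python
import math

def allowable_stress(sigma, lam, E, Fy):
    """ASD-AISC limits of Eqs. (10)-(11): 0.6*Fy in tension; in compression,
    the inelastic formula below C_c and the Euler-type formula above it.
    sigma: member stress; lam: slenderness ratio k*L/r."""
    if sigma >= 0:                       # tension member
        return 0.6 * Fy
    cc = math.sqrt(2 * math.pi ** 2 * E / Fy)
    if lam < cc:                         # inelastic buckling range
        num = (1 - lam ** 2 / (2 * cc ** 2)) * Fy
        den = 5 / 3 + 3 * lam / (8 * cc) - lam ** 3 / (8 * cc ** 3)
        return num / den
    return 12 * math.pi ** 2 * E / (23 * lam ** 2)  # elastic (Euler) range
```

At λ = 0 the compression formula reduces to F_y / (5/3) = 0.6 F_y, matching the tension limit, which is a quick consistency check on Eq. (11).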

Optimization constraints on nodal displacements were set as follows:

\[
\delta_i - \delta_i^{u} \le 0, \quad i = 1, 2, \ldots, nn \qquad (12)
\]

where δ_i is the displacement of the ith node of the truss, δ_i^u is the corresponding allowable displacement, and nn is the number of nodes.

4.2. Planar 10-bar truss

This test case has frequently been used in structural design optimization to benchmark optimization algorithms. The optimization problem formulation is described in detail in [23]. The truss geometry, including node and element numbering, the loading conditions (two variants are considered) and the kinematic constraints, is shown in Fig. 8.

Table 6 presents the best optimized designs found by HPSSO for the two problem variants and the corresponding numbers of structural analyses. The present algorithm is compared with PSO variants (standard PSO and multi-stage particle swarm optimization (MSPSO)) [20], a hybrid scheme of Particle Swarm Optimizer, Ant Colony Strategy and Harmony Search (HPSACO) [23], the Artificial Bee Colony algorithm with an adaptive penalty function approach (ABC-AP) [24], the Self Adaptive Harmony Search algorithm (SAHS), an advanced version of the Harmony Search algorithm presented by Degertekin [6], and the Teaching Learning Based Optimization algorithm (TLBO) [11]. In both loading cases, HPSSO is the most efficient algorithm, since the lightest design obtained by HPSACO slightly violated the design constraints while the design obtained by HPSSO is feasible. It should be noted that HPSSO requires 14,118 structural analyses (with the stopping criterion set to 600 iterations), about twice the number of analyses required by SAHS (the fastest optimizer overall). Nearly the same optimized designs can be reached with fewer analyses by adopting a smaller iteration limit. For example, by setting the limit number of optimization iterations equal to 200, HPSSO found optimized weights of 5060.875 and 4677.15 lb within 4806 and 4686 structural analyses for load cases 1 and 2, respectively.

Table 7 presents the optimization results obtained in 50 independent runs carried out from different randomly generated initial populations. HPSSO always outperformed standard PSO and MSPSO. Furthermore, SAHS and HPSSO were the fastest optimizers in load cases 1 and 2, respectively.

4.3. Spatial 22-bar truss

The second structural optimization problem solved in this study regards the spatial 22-bar truss shown in Fig. 9. This test case, described in detail in Ref. [1], was previously studied by Lee and Geem [1] using the Harmony Search (HS) algorithm, and by Talatahari et al. [20] using standard PSO and a multi-stage particle swarm



Fig. 14. Schematic of the 120-bar dome.


optimization (MSPSO) algorithm. The optimized designs found by the different algorithms are compared in Table 8, which also shows the corresponding structural weights and the required numbers of structural analyses. It can be seen that HS required only 10,000 analyses (about 40% fewer than the present algorithm) to find the best design overall, slightly lighter than those obtained by HPSSO, standard PSO and MSPSO, which were practically the same. HPSSO required far fewer structural analyses than PSO but 15% more than MSPSO. Statistical results of the independent optimization runs are presented in Table 9. HPSSO was much more robust than standard PSO and outperformed MSPSO except for the worst optimized design.

4.4. Spatial 25-bar tower

The third structural optimization problem solved in this research was the weight minimization of the spatial 25-bar truss schematized in Fig. 10. This is a very well known test problem. Table 10 compares the optimized design found by HPSSO with those found by HPSACO [23], the hybrid Big Bang–Big Crunch algorithm (HBB–BC) [25], SAHS [6], TLBO [11], standard PSO and MSPSO [20]. The lightest design is obtained by the TLBO algorithm and is 0.074 lb lighter than that found by HPSSO. SAHS is overall the most efficient optimizer considering both convergence speed and structural weight: in particular, HPSSO requires 4275 (i.e. 45%)


Table 14. Comparison of optimization results obtained by HPSSO and other metaheuristic methods in the 120-bar dome problem. Optimal cross-sectional areas are in in². (HPSACO: Kaveh and Talatahari [23]; CSS: Kaveh and Talatahari [8]; ICA: Kaveh and Talatahari [9]; CS: Kaveh and Bakhshpoori [10]; PSO and MSPSO: Talatahari et al. [20]; HPSSO: present work.)

| Element group | HPSACO | CSS | ICA | CS | PSO | MSPSO | HPSSO |
|---|---|---|---|---|---|---|---|
| 1 A1 | 3.095 | 3.027 | 3.0275 | 3.0244 | 3.0252 | 3.0244 | 3.024139 |
| 2 A2 | 14.405 | 14.606 | 14.4596 | 14.7168 | 14.8108 | 14.7804 | 14.78086 |
| 3 A3 | 5.020 | 5.044 | 5.2446 | 5.0800 | 5.1531 | 5.0567 | 5.052164 |
| 4 A4 | 3.352 | 3.139 | 3.1413 | 3.1374 | 3.1345 | 3.1359 | 3.136943 |
| 5 A5 | 8.631 | 8.543 | 8.4541 | 8.5012 | 8.3977 | 8.4830 | 8.500353 |
| 6 A6 | 3.432 | 3.367 | 3.3567 | 3.3019 | 3.2933 | 3.3104 | 3.288777 |
| 7 A7 | 2.499 | 2.497 | 2.4947 | 2.4965 | 2.4955 | 2.4977 | 2.49688 |
| Best weight (lb) | 33,248.9 | 33,251.9 | 33,256.2 | 33,250.42 | 33,251.95 | 33,251.22 | 33,250.05 |
| Average weight (lb) | N/A | N/A | N/A | 33,253.28 | 33,666.04 | 33,257.29 | 33,260.700 |
| Number of structural analyses | 10,000 | 7000 | 6000 | 6300 | 150,000 | 15,000 | 13,422 |


structural analyses more than SAHS. It can be seen that HPSSO is very competitive with MSPSO and standard PSO: in fact, the number of structural analyses required by HPSSO lies in the range defined by MSPSO and standard PSO.

Statistical results of 50 independent runs are compared in Table 11. The present code was more robust than PSO and MSPSO except for the best weight, which was marginally different from that found by MSPSO. It can be seen that HPSSO is quite competitive with SAHS and TLBO.

Fig. 11a compares the convergence curves of the best observed runs of HPSACO [23], SAHS [6], TLBO [11], standard PSO and MSPSO [20] with that of HPSSO; all runs started from different randomly generated populations. HPSSO is faster than standard PSO and MSPSO and is competitive with SAHS and TLBO. The best convergence rate is obtained by HPSACO. In order to further evaluate algorithm performance, Fig. 11b shows the optimization histories, for the best run, of the best particle, the average of all particles, and the worst particle. The present algorithm tries to balance global and local searches. Although the optimization histories did not converge to the same point, they are quite close to each other. The large difference between the worst particle diagram and the other two diagrams may derive from the constrained nature of the problem.

4.5. Spatial 72-bar truss

Fig. 12 shows the schematic of the spatial 72-bar truss (the numbering of nodes and elements and the element grouping are indicated in the figure). Detailed information on this test problem is given in [23]. Table 12 compares the optimization results of HPSSO with standard PSO and MSPSO [20], ABC-AP [24], SAHS [6] and TLBO [11]. It can be seen that the lightest design is obtained by TLBO and is 0.017 lb lighter than the design obtained by HPSSO. The present algorithm designed a lighter structure than MSPSO with far fewer structural analyses. It appears that HPSSO is overall the most efficient optimization algorithm considering convergence speed and structural weight. Statistical results of 50 independent runs for HPSSO, SAHS, TLBO, standard PSO and MSPSO are presented in Table 13. HPSSO was by far the most robust and reliable algorithm.

Fig. 13a shows the convergence curves of the best observed runs of SAHS [6], TLBO [11], standard PSO and MSPSO [20] together with that of HPSSO; all runs started from different randomly generated populations. The present algorithm converges to the optimum more quickly than the other PSO variants and is competitive with SAHS and TLBO. Fig. 13b shows the optimization histories of the best particle, the average of all particles and the worst particle. It can be seen that HPSSO attempts to balance global and local search abilities and reduces the distance between the three diagrams to a negligible amount when approaching the optimum design. Therefore, HPSSO preserves its global search ability until the final iterations of the optimization process. Although the optimization histories did not converge to the same point, they are quite close to each other. The large difference between the worst particle diagram and the other two diagrams may derive from the constrained nature of the problem.

4.6. 120-bar dome truss

The 120-bar dome truss optimized in this test case is schematized in Fig. 14. For the sake of clarity, not all element groups are numbered in the figure. Because of structural symmetry, the 120 members were divided into seven groups. Stress constraints were defined by Eqs. (10) and (11), and displacement limitations


Table 15. Comparison of robustness and reliability of HPSSO and other metaheuristic methods in the 120-bar dome problem. Weights are in lb.

| Algorithm | Best | Average | Worst | Best–average difference (%) | Best–worst difference (%) | SD |
|---|---|---|---|---|---|---|
| PSO [20] | 33251.96 | 33666.04 | 40231.33 | 1.24 | 21 | 1031.3 |
| MSPSO [20] | 33251.22 | 33257.29 | 33269.13 | 0.02 | 0.05 | 4.29 |
| HPSSO | 33250.05 | 33260.70 | 33307.16 | 0.032 | 0.17 | 10.49 |


Fig. 15. Schematic of the planar 200-bar truss.


were imposed on all nodes in the x, y and z coordinate directions. Further details on this optimization problem can be found in [10]. The structure was previously optimized by HPSACO [23], the Charged System Search algorithm (CSS) [8], the Imperialist Competitive Algorithm


Table 16
Comparison of optimization results obtained by HPSSO and other metaheuristic methods in the 200-bar truss problem.

Element group   Optimal cross-sectional areas (in2)
                HPSACO [23]   CMLPSA [3]   SAHS [6]   TLBO [11]   HPSSO (present work)
1               0.1033        0.1468       0.1540     0.1460      0.1213
2               0.9184        0.9400       0.9410     0.9410      0.9426
3               0.1202        0.1000       0.1000     0.1000      0.1220
4               0.1009        0.1000       0.1000     0.1010      0.1000
5               1.8664        1.9400       1.9420     1.9410      2.0143
6               0.2826        0.2962       0.3010     0.2960      0.2800
7               0.1000        0.1000       0.1000     0.1000      0.1589
8               2.9683        3.1042       3.1080     3.1210      3.0666
9               0.1000        0.1000       0.1000     0.1000      0.1002
10              3.9456        4.1042       4.1060     4.1730      4.0418
11              0.3742        0.4034       0.4090     0.4010      0.4142
12              0.4501        0.1912       0.1910     0.1810      0.4852
13              4.9603        5.4284       5.4280     5.4230      5.4196
14              1.0738        0.1000       0.1000     0.1000      0.1000
15              5.9785        6.4284       6.4270     6.4220      6.3749
16              0.7863        0.5734       0.5810     0.5710      0.6813
17              0.7374        0.1327       0.1510     0.1560      0.1576
18              7.3809        7.9717       7.9730     7.9580      8.1447
19              0.6674        0.1000       0.1000     0.1000      0.1000
20              8.3000        8.9717       8.9740     8.9580      9.0920
21              1.1967        0.7049       0.7190     0.7200      0.7462
22              1.0000        0.4196       0.4220     0.4780      0.2114
23              10.8262       10.8636      10.8920    10.8970     10.9587
24              0.1000        0.1000       0.1000     0.1000      0.1000
25              11.6976       11.8606      11.8870    11.8970     11.9832
26              1.3880        1.0339       1.0400     1.0800      0.9241
27              4.9523        6.6818       6.6460     6.4620      6.7676
28              8.8000        10.8113      10.8040    10.7990     10.9639
29              14.6645       13.8404      13.8700    13.9220     13.8186

Weight (lb)                    25156.5     25445.63   25491.9     25488.15    25698.85
Average weight (lb)            N/A         N/A        25610.2     25533.14    28386.72
SD (lb)                        N/A         N/A        141.85      27.44       2403
Number of structural analyses  9875        9650       19670       28059       14406


(ICA) [9], the Cuckoo Search algorithm (CS) [10], standard PSO and MSPSO [20].

Table 14 compares the optimization results of HPSSO with those of the other methods. HPSSO yields the best design reported so far. The present algorithm is more efficient than MSPSO in terms of both optimized weight and convergence rate. HPSSO yields a lighter design than CS, previously the best result reported in the optimization literature, at the expense of a significantly higher computational cost.

The statistical results of 50 independent runs are provided in Table 15 for HPSSO, standard PSO and MSPSO. The present code was more reliable and robust than standard PSO but less efficient than MSPSO.

4.7. Planar 200-bar truss

The planar 200-bar truss optimized in the last test problem is shown in Fig. 15. The elastic modulus of the material is 30,000 ksi and its density is 0.283 lb/in3. The allowable stress for all members is 10 ksi (the same in tension and compression). No displacement constraints are included in the optimization process. The structure is divided into 29 groups of elements. The minimum cross-sectional area of all design variables is 0.1 in2. This truss is subjected to three independent loading conditions. Further details on this optimization problem can be found in [11]. Table 16 presents the optimum designs obtained by HPSSO, HPSACO [23], a Corrected Multi-Level and Multi-Point Simulated Annealing algorithm (CMLPSA) [3], SAHS [6] and TLBO [11]. TLBO achieved the lightest structure amongst the feasible or almost feasible optimized designs. The design optimized by HPSSO weighs 25,698.85 lb, hence only 0.4% and 0.3% heavier than those obtained by SAHS and TLBO, respectively. However, the present algorithm completed the optimization process with far fewer structural analyses. Convergence curves of the best optimization runs carried out for HPSSO, SAHS, TLBO and HPSACO are compared in Fig. 16a. It can be seen that the present algorithm is apparently slower than the other optimization algorithms; HPSACO was the fastest optimizer. However, the continuous step-like movements of HPSSO show how it could bypass local optima. Whilst such convergence behavior may suggest including more optimization iterations in the termination criterion, the present results demonstrate that performing more than 600 iterations does not significantly improve structural weight.
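The penalized weight plotted in Fig. 16 combines the structural weight with a penalty on constraint violations. The sketch below shows one common penalization scheme using the stated material data (density 0.283 lb/in3, allowable stress 10 ksi); the member stresses would come from a structural analysis, and the exact penalty form used by HPSSO may differ.

```python
def penalized_weight(areas, lengths, stresses, rho=0.283, sigma_allow=10.0,
                     penalty=2.0):
    """Structural weight (rho * A * L summed over members) inflated by a
    penalty on normalized stress violations -- a common constraint-handling
    scheme for truss sizing problems (illustrative, not HPSSO's exact rule)."""
    weight = sum(rho * a, ) if False else sum(rho * a * l
                                              for a, l in zip(areas, lengths))
    violation = sum(max(0.0, abs(s) / sigma_allow - 1.0) for s in stresses)
    return weight * (1.0 + penalty * violation)

# two-member toy example: areas (in2), lengths (in), member stresses (ksi)
w_ok  = penalized_weight([1.0, 2.0], [100.0, 100.0], [8.0, -9.0])
w_bad = penalized_weight([1.0, 2.0], [100.0, 100.0], [12.0, -9.0])
assert abs(w_ok - 0.283 * 300.0) < 1e-9   # feasible: no penalty applied
assert w_bad > w_ok                        # overstress inflates the weight
```

A feasible design keeps its true weight, so the penalized and true curves coincide near the optimum, as Fig. 16 shows.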

Convergence behavior of metaheuristic algorithms may also depend on the ratio between population size and the number of optimization variables. In this regard, Fig. 16b shows the convergence curves of the best optimization run over 30 independent runs carried out with different population sizes (the numbers of subcolonies were set according to Table 4). The best design was obtained for N = 30. Furthermore, using a population with fewer than 50 particles does not significantly affect the convergence rate.
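The population-size study of Fig. 16b amounts to re-running the optimizer with different swarm sizes under the same evaluation budget. A toy sketch with a random-search stand-in for the optimizer (the `optimize` and `sweep` helpers are hypothetical, illustrative only):

```python
import random

def optimize(obj, dim, pop_size, budget, seed=0):
    """Toy random-search stand-in for an optimizer: spends `budget` function
    evaluations in generations of `pop_size` and returns the best cost found."""
    rng = random.Random(seed)
    best = float("inf")
    evals = 0
    while evals + pop_size <= budget:
        generation = [[rng.uniform(-5, 5) for _ in range(dim)]
                      for _ in range(pop_size)]
        best = min(best, min(obj(p) for p in generation))
        evals += pop_size
    return best

def sweep(obj, dim, budget, sizes=(20, 30, 50, 100)):
    """Give every population size the same evaluation budget, as in Fig. 16b."""
    return {n: optimize(obj, dim, n, budget) for n in sizes}

results = sweep(lambda x: sum(xi * xi for xi in x), dim=3, budget=3000)
assert set(results) == {20, 30, 50, 100}
```

Holding the budget fixed (rather than the iteration count) is what makes the comparison between population sizes fair.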

Further evidence of the search performance of HPSSO can be gathered from Fig. 17, which plots the number of structural analyses required in the optimization process against the degree of complexity of the test problems (i.e. the number of design variables, the size of the search space, and the numbers of stress and displacement constraints, equal to the numbers of members and nodes, respectively). The present algorithm is compared with SAHS, TLBO and MSPSO. Data for the 10-bar truss problem were averaged over the two loading cases. While the computational cost of the


[Figure omitted: penalized weight (lb) vs. number of structural analyses; panel (a) compares HPSSO, HPSACO, TLBO and SAHS; panel (b) compares HPSSO runs with population sizes of 20, 30, 50 and 100.]

Fig. 16. Convergence curves recorded for the planar 200-bar truss: (a) comparison of convergence rates between algorithms; and (b) comparison of HPSSO convergence rates for different values of the internal population parameters.

[Figure omitted: number of structural analyses vs. truss test problem (number of design variables in parentheses); curves for HPSSO, SAHS, TLBO and MSPSO.]

Fig. 17. Sensitivity of the computational cost of different optimizers with respect to problem complexity.


other algorithms increases with problem complexity, HPSSO remains effective throughout the optimization process and always achieves an efficient balance between local and global search abilities regardless of the size of the optimization problem.

5. Conclusion

A new efficient hybrid swarm-intelligence-based algorithm combining Swallow Swarm Optimization and Particle Swarm Optimization was developed in this research. Since the population is divided into subcolonies, particles can learn not only from the best globally experienced particle but also from the best particle of each subcolony. The new HPSSO algorithm includes elitism, as it selects and preserves the best particles (i.e. the global and local leaders) when updating the population; strikes a good balance between global and local search, as explorer particles can learn from self, social and proximity cognition; and utilizes aimless particles to further adjust local search ability.
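The update rule summarized above, with self, social and proximity cognition plus aimless particles, can be sketched schematically; the coefficients and the wandering range are illustrative, not the calibrated HPSSO parameters.

```python
import random

rng = random.Random(42)

def update_velocity(x, v, pbest, gbest, lbest, w=0.7, c1=1.5, c2=1.5, c3=1.5):
    """PSO-style velocity update with an extra 'local leader' term (lbest),
    the best position in the particle's subcolony -- schematic coefficients."""
    return [w * vi
            + c1 * rng.random() * (pb - xi)   # self cognition
            + c2 * rng.random() * (gb - xi)   # social cognition (global leader)
            + c3 * rng.random() * (lb - xi)   # proximity cognition (local leader)
            for xi, vi, pb, gb, lb in zip(x, v, pbest, gbest, lbest)]

def aimless_step(x, lo=-5.0, hi=5.0):
    """Aimless particles ignore the leaders and wander randomly,
    preserving exploration of the design space."""
    return [rng.uniform(lo, hi) for _ in x]

x, v = [1.0, 2.0], [0.0, 0.0]
v = update_velocity(x, v, pbest=[0.5, 1.5], gbest=[0.0, 0.0], lbest=[0.2, 1.0])
x = [xi + vi for xi, vi in zip(x, v)]
assert len(x) == 2
```

When all three leader terms vanish (particle already at every leader), only the inertia term `w * v` survives, which is the limiting behavior the elitism mechanism exploits.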

The new algorithm was tested on eleven mathematical optimization problems and six truss design optimization problems with continuous sizing variables. Numerical results demonstrated the efficiency of the proposed optimization algorithm, which outperformed standard PSO and state-of-the-art variants of PSO and was very competitive with other metaheuristic algorithms.

Acknowledgement

The first author is grateful to the Iran National Science Foundation for its support.

References

[1] Lee KS, Geem ZW. A new structural optimization method based on the harmony search algorithm. Comput Struct 2004;82:781–98.

[2] Rajeev S, Krishnamoorthy CS. Discrete optimization of structures using genetic algorithms. J Struct Eng 1992;118:1233–50.

[3] Lamberti L. An efficient simulated annealing algorithm for design optimization of truss structures. Comput Struct 2008;86:1936–53.

[4] Camp CV, Bichon BJ. Design of space trusses using ant colony optimization. J Struct Eng 2004;130:741–51.

[5] Kaveh A, Talatahari S. A particle swarm ant colony optimization for truss structures with discrete variables. J Constr Steel Res 2009;65:1558–68.

[6] Degertekin SO. Improved harmony search algorithms for sizing optimization of truss structures. Comput Struct 2012;92–93:229–41.

[7] Camp CV. Design of space trusses using big bang–big crunch optimization. J Struct Eng 2007;133:999–1008.

[8] Kaveh A, Talatahari S. Optimal design of skeletal structures via the charged system search algorithm. Struct Multidiscip Optim 2010;41:893–911.

[9] Kaveh A, Talatahari S. Optimum design of skeletal structures using imperialist competitive algorithm. Comput Struct 2010;88:1220–9.

[10] Kaveh A, Bakhshpoori T. Optimum design of space trusses using cuckoo search algorithm with Levy flights. Iran J Sci Technol Trans B Eng 2013;37:1–15.

[11] Degertekin SO. Sizing truss structures using teaching–learning-based optimization. Comput Struct 2013;119:177–88.

[12] Sadollah A, Bahreininejad A, Eskandar H, Hamdi M. Mine blast algorithm for optimization of truss structures with discrete variables. Comput Struct 2012;102:49–63.

[13] Talbi EG. Metaheuristics: from design to implementation. Hoboken, New Jersey: John Wiley & Sons; 2009.

[14] Kennedy J, Eberhart R. Swarm intelligence. San Francisco: Morgan Kaufmann; 2001.

[15] Neshat M, Sepidnam G, Sargolzaei M. Swallow swarm optimization algorithm: a new method to optimization. Neural Comput Appl 2013;23:429–54.

[16] Chen WN, Zhang J, Lin Y, Chen N, Zhan ZH, Chung HSH, et al. Particle swarm optimization with an aging leader and challengers. IEEE Trans Evol Comput 2013;17:241–58.

[17] Banks A, Vincent J, Anyakoha C. A review of particle swarm optimization. Part I: background and development. Nat Comput 2007;6:467–84.

[18] Banks A, Vincent J, Anyakoha C. A review of particle swarm optimization. Part II: hybridisation, combinatorial, multicriteria and constrained optimization, and indicative applications. Nat Comput 2008;7:109–24.

[19] Eberhart RC, Shi Y. Comparing inertia weights and constriction factors in particle swarm optimization. In: Proceedings of the 2000 congress on evolutionary computation; 2000. p. 84–8.

[20] Talatahari S, Kheirollahi M, Farahmandpour C, Gandomi AH. A multi-stage particle swarm for optimum design of truss structures. Neural Comput Appl 2013;23:1297–309.

[21] Kaveh A. Optimal structural analysis. 2nd ed. UK: John Wiley & Sons; 2006.

[22] American Institute of Steel Construction (AISC). Manual of steel construction – allowable stress design. 9th ed. Chicago: AISC; 1989.

[23] Kaveh A, Talatahari S. Particle swarm optimizer, ant colony strategy and harmony search scheme hybridized for optimization of truss structures. Comput Struct 2009;87:267–83.

[24] Sonmez M. Artificial bee colony algorithm for optimization of truss structures. Appl Soft Comput 2011;11:2406–18.

[25] Kaveh A, Talatahari S. Size optimization of space trusses using Big Bang–Big Crunch algorithm. Comput Struct 2009;87:1129–40.