



Research Article
Grey Wolf Optimizer Based on Powell Local Optimization Method for Clustering Analysis

Sen Zhang (1) and Yongquan Zhou (1,2)

(1) College of Information Science and Engineering, Guangxi University for Nationalities, Nanning 530006, China
(2) Guangxi High School Key Laboratory of Complex System and Computational Intelligence, Nanning 530006, China

Correspondence should be addressed to Yongquan Zhou; yongquanzhou@126.com

Received 25 July 2015; Accepted 11 October 2015

Academic Editor: Lawrence P. Horwitz

Copyright © 2015 S. Zhang and Y. Zhou. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

One heuristic evolutionary algorithm recently proposed is the grey wolf optimizer (GWO), inspired by the leadership hierarchy and hunting mechanism of grey wolves in nature. This paper presents an extended GWO algorithm based on the Powell local optimization method, which we call PGWO. The PGWO algorithm significantly improves the original GWO in solving complex optimization problems. Clustering is a popular data analysis and data mining technique; hence, PGWO can be applied to clustering problems. In this study, the PGWO algorithm is first tested on seven benchmark functions and then used for data clustering on nine data sets. Compared to other state-of-the-art evolutionary algorithms, the results on the benchmarks and on data clustering demonstrate the superior performance of the PGWO algorithm.

1. Introduction

Swarm Intelligence (SI) has received much attention. Many SI algorithms have been proposed, such as the genetic algorithm (GA) [1], particle swarm optimization (PSO) [2], Differential Evolution (DE) [3, 4], and evolutionary programming (EP) [5]. Although these algorithms are capable of solving complex optimization problems, the well-known No Free Lunch (NFL) theorem has proven that there is no optimization algorithm capable of solving all optimization problems [6], which leaves room for researchers to propose new algorithms. Some of the recent algorithms are the grey wolf optimizer (GWO) [7], Multiverse Optimizer (MVO) [8], Ant Lion Optimizer (ALO) [9], artificial bee colony (ABC) algorithm [10], Firefly algorithm (FA) [11], Cuckoo search (CS) algorithm [12], Cuckoo Optimization Algorithm (COA) [13], gravitational search algorithm (GSA) [14], and Adaptive Gbest-guided Gravitational Search Algorithm (GGSA) [15].

In this paper, we concentrate on GWO, developed by Mirjalili et al. [7] in 2014 based on simulating the hunting behavior and social leadership of grey wolves in nature. Numerical comparisons showed that the performance of GWO is competitive with that of other population-based algorithms.

Because it is simple, easy to implement, and has few control parameters, GWO has attracted much attention and has been used to solve a number of practical optimization problems [16–18].

However, like other stochastic optimization algorithms such as PSO and GA, the GWO algorithm exhibits poor convergence behavior during exploitation as the dimension of the search space grows [19, 20]. Therefore, our work focuses on increasing the local search ability of the GWO algorithm. According to [21], many direct search methods are fairly fast optimizers with a strong local search ability. Powell's method [22] is one of these direct search methods. In order to exploit the good convergence property of Powell's method, we propose a hybrid metaheuristic, the grey wolf optimizer based on the Powell local optimization method (PGWO). Compared to other state-of-the-art evolutionary algorithms, PGWO performs significantly better.

Cluster analysis, or clustering, is the task of grouping similar objects or multidimensional data vectors into a number of clusters or groups. Clustering analysis is an unsupervised learning process, without a priori knowledge of the clusters, and a clustering algorithm performs automatic classification based on the distance or similarity between samples.



As a result of this characteristic, clustering techniques have been applied to a wide range of problems such as data mining [23], data analysis, pattern recognition [24], and image segmentation [25]. Traditional clustering methods can be classified as partitioning methods, hierarchical methods, density-based methods, and grid-based methods [26]. In this paper, we concentrate on partitioning methods. A partitioning clustering method divides data vectors into a predefined number of clusters by optimizing a certain criterion. K-means is the most popular partitioning clustering method [27, 28]. However, K-means highly depends on the initial states and often falls into local optima. Many methods have been proposed to overcome this problem. For example, the K-harmonic mean algorithm has been proposed for clustering instead of K-means in [29]. A simulated annealing- (SA-) based approach has been developed in [30]. A tabu search- (TS-) based method was introduced in [31, 32]. Genetic algorithm- (GA-) based methods were presented in [33–36]. Fathian et al. [37] developed a clustering algorithm based on honey-bee mating optimization (HBMO). Particle swarm optimization (PSO) was applied to clustering in [38]. Hatamlou et al. employed a big bang-big crunch algorithm for data clustering in [39]. Karaboga and Ozturk presented a novel clustering approach based on the artificial bee colony (ABC) algorithm in [40]. Data clustering based on the gravitational search algorithm was presented in [41, 42]. In 1991, Colorni et al. presented the ant colony optimization (ACO) algorithm based on the behavior of ants seeking a path between their colony and a source of food; Shelokar and Kao then solved the clustering problem using the ACO algorithm [43, 44]. Kao et al. presented a hybrid approach combining the K-means algorithm, Nelder-Mead simplex search, and PSO for clustering analysis [45]. Niknam et al. presented a hybrid evolutionary algorithm based on PSO and SA to solve the clustering problem [46]. However, every algorithm has drawbacks: the K-means algorithm gets stuck in local optima; convergence is highly dependent on the initial positions in the case of the genetic algorithm; in ACO, the solution vector degrades as the number of iterations increases; and Kao et al. [45] and Niknam et al. [46] stated that PSO gives better clustering results on one-dimensional and small data sets but does not give good results on large data sets. In this paper, the PGWO algorithm is used to solve the clustering problem, tested on nine data sets. This algorithm takes advantage of both GWO and Powell's method: the initial process is started by GWO, which searches the whole space for a global solution; when a global solution is found, the clustering switches to Powell's method for faster convergence to finish the clustering process. As can be seen from the simulation results, the proposed algorithm not only has a higher convergence speed but also finds better solutions than the other algorithms across the majority of data sets, whether small or large.

The rest of the paper is organized as follows: Section 2 presents a brief introduction to GWO. Section 3 discusses the basic principles of GWO based on the Powell local optimization method. The experimental results on test functions and data clustering are shown in Sections 4 and 5, respectively. Finally, Section 6 concludes the work and suggests some directions for future studies.

2. Grey Wolf Optimizer (GWO)

The GWO algorithm, proposed by Mirjalili et al. (2014) [7], is inspired by the hunting behavior and social leadership of grey wolves in nature. As in other metaheuristics, the search begins with a population of randomly generated wolves (candidate solutions). To formulate the social hierarchy of wolves when designing GWO, the population is split into four groups: alpha (α), beta (β), delta (δ), and omega (ω). Over the course of iterations, the first three best solutions are called α, β, and δ, respectively. The rest of the candidate solutions are named ω. In this algorithm, the hunting (optimization) is guided by α, β, and δ. The ω wolves are required to encircle α, β, and δ so as to find better solutions. The encircling process can be modeled as follows [7]:

$$\vec{D} = \left| \vec{C} \cdot \vec{X}_p(t) - \vec{X}(t) \right|, \qquad \vec{X}(t+1) = \vec{X}_p(t) - \vec{A} \cdot \vec{D} \tag{1}$$

where t indicates the current iteration, C = 2·r_2, A = 2a·r_1 − a, X_p is the position vector of the prey, X indicates the position vector of a grey wolf, a is gradually decreased from 2 to 0, and r_1 and r_2 are random vectors over the range [0, 1].

In order to mathematically simulate the hunting behavior of grey wolves, the GWO algorithm always assumes that α, β, and δ have better knowledge about the position of the prey (the optimum). Therefore, the positions of the first three best solutions (α, β, δ) obtained so far are saved, and the other wolves (ω) are obliged to reposition with respect to α, β, and δ. The mathematical model of readjusting the positions of the ω wolves is presented as follows [7]:

$$\vec{D}_\alpha = \left| \vec{C}_1 \cdot \vec{X}_\alpha - \vec{X} \right|, \quad \vec{D}_\beta = \left| \vec{C}_2 \cdot \vec{X}_\beta - \vec{X} \right|, \quad \vec{D}_\delta = \left| \vec{C}_3 \cdot \vec{X}_\delta - \vec{X} \right| \tag{2}$$

$$\vec{X}_1 = \vec{X}_\alpha - \vec{A}_1 \cdot \vec{D}_\alpha, \quad \vec{X}_2 = \vec{X}_\beta - \vec{A}_2 \cdot \vec{D}_\beta, \quad \vec{X}_3 = \vec{X}_\delta - \vec{A}_3 \cdot \vec{D}_\delta \tag{3}$$

$$\vec{X}(t+1) = \frac{\vec{X}_1 + \vec{X}_2 + \vec{X}_3}{3} \tag{4}$$

where X_α is the position of the alpha, X_β is the position of the beta, X_δ is the position of the delta, A_1, A_2, A_3 and C_1, C_2, C_3 are random vectors (computed as in (1)), X is the position of the current solution, and t indicates the number of iterations.


Initialize the grey wolf population X_i (i = 1, 2, ..., n) and parameters
Calculate the fitness of the population
Find the first three agents X_α, X_β, X_δ
While (t < Max number of iterations)
    Update the position of each search agent by (4)
    Calculate the fitness of the population
    Update X_α, X_β, X_δ
    t = t + 1
End While
Return X_α

Algorithm 1: GWO algorithm.
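To make the update rule concrete, the following is a minimal NumPy sketch of the loop in Algorithm 1, implementing (1)-(4). The function name `gwo`, its default bounds, and the sphere demo objective are illustrative choices, and for brevity the three leaders are re-selected from the current pack each iteration rather than kept as best-so-far solutions.

```python
import numpy as np

def gwo(fitness, dim, n=30, max_iter=500, lb=-100.0, ub=100.0):
    """Minimal GWO sketch: wolves encircle the three best agents, eqs. (1)-(4)."""
    X = np.random.uniform(lb, ub, (n, dim))           # random initial pack
    for t in range(max_iter):
        f = np.array([fitness(x) for x in X])
        alpha, beta, delta = X[np.argsort(f)[:3]]     # three fittest wolves
        a = 2.0 * (1.0 - t / max_iter)                # a decreases linearly 2 -> 0
        parts = []
        for leader in (alpha, beta, delta):
            A = 2.0 * a * np.random.rand(n, dim) - a  # A = 2a*r1 - a
            C = 2.0 * np.random.rand(n, dim)          # C = 2*r2
            D = np.abs(C * leader - X)                # eq. (2)
            parts.append(leader - A * D)              # eq. (3)
        X = np.clip(sum(parts) / 3.0, lb, ub)         # eq. (4), kept in bounds
    f = np.array([fitness(x) for x in X])
    return X[np.argmin(f)], f.min()

# Demo on the sphere function F1
best_x, best_f = gwo(lambda x: float(np.sum(x * x)), dim=30)
print(best_f)
```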

Initialize the starting point X_1, the independent vectors d_i = e_i (i = 1, ..., D), and the tolerance ε for the stopping criterion; set f(1) = f(X_1), X_c = X_1, k = 1
While (stopping criterion is not met, namely |Δf| > ε) do
    For i = 1 to D do
        If (k ≥ 2) then
            d_i = d_(i+1)
        End If
        X_(i+1) = X_i + λ_i d_i, where λ_i is determined by minimizing f(X_(i+1))
    End For
    d_(D+1) = sum_{i=1}^{D} λ_i d_i = X_(D+1) − X_1
    X_c(k+1) = X_(D+1) + λ_k d_(D+1)
    f(k+1) = f(X_c(k+1))
    k = k + 1, X_1 = X_c(k), Δf = f(k) − f(k−1)
End While

Algorithm 2: Powell's method.
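In practice the local search step can be obtained off the shelf: the sketch below uses SciPy's derivative-free Powell implementation as a stand-in for Algorithm 2 rather than a transcription of it. The wrapper name `powell_refine` and its tolerance and iteration limits are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def powell_refine(fitness, x0, ftol=1e-6, maxiter=100):
    """Refine a starting point with Powell's derivative-free direction-set method."""
    res = minimize(fitness, x0, method="Powell",
                   options={"ftol": ftol, "maxiter": maxiter})
    return res.x, res.fun

# Demo: polish a rough point on a shifted sphere; the minimizer is [1, 1, 1, 1, 1]
x, fx = powell_refine(lambda v: float(np.sum((v - 1.0) ** 2)), np.zeros(5))
print(x.round(4), fx)
```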

In these formulas, it may also be observed that the two vectors A and C oblige the GWO algorithm to explore and exploit the search space. With decreasing A, half of the iterations are devoted to exploration (|A| ≥ 1) and the other half are dedicated to exploitation (|A| < 1). The range of C is 0 ≤ C ≤ 2; the vector C also improves exploration when C > 1, and exploitation is emphasized when C < 1. Note here that A is decreased linearly over the course of the iterations, whereas C is generated randomly, whose aim is to emphasize exploration/exploitation at any stage, avoiding local optima. The main steps of the grey wolf optimizer are given in Algorithm 1.

3. Grey Wolf Optimizer Based on Powell Local Optimization Method (PGWO)

3.1. Local Search Based on Powell's Method. Powell's method, strictly Powell's conjugate direction method, is an algorithm proposed by Powell [22] for finding a local minimum of a function. The function need not be differentiable, and no derivatives are taken. The method successively performs a bidirectional line search along each search vector to pursue the minimum of the function. The new position is represented as a linear combination of the search vectors, and the new displacement vector is added to the search vector list as a new search vector. Correspondingly, the search vector which contributed most to the new direction is deleted from the search vector list. The algorithm iterates until no significant improvement is made. The detailed steps of Powell's method are presented in Algorithm 2 [47].

3.2. The Proposed Algorithm. Over the last years, many research results have been published in the field of evolutionary algorithms [7], and the results show that hybridizations of metaheuristics and local search techniques are exceptionally successful. Successful hybridizations have been proposed especially in combinatorial and discrete solution spaces [48, 49]. Interestingly, few results have been introduced for real-valued solution spaces, an astonishing fact, as many direct search methods are fairly fast optimizers [21]. Therefore, in order to overcome the shortcomings of the GWO algorithm, including slow convergence speed, easily falling into local optima, low computation accuracy, and a low success rate of convergence [19, 20], a GWO algorithm based on the Powell local optimization method is proposed. It adopts the powerful local optimization ability of Powell's method and embeds it into GWO as a local search operator. In this case, the proposed method has the potential to provide superior results compared to other state-of-the-art evolutionary algorithms. The general steps of PGWO are presented in Algorithm 3, in which p indicates the performing probability of Powell's method and is set to 0.5.


Initialize the grey wolf population X_i (i = 1, 2, ..., n) and parameters
Calculate the fitness of the population
Find the first three agents X_α, X_β, X_δ
While (t < Max number of iterations)
    Update the position of each search agent by (4)
    If rand > p, choose the current best solution X_α as a starting point and generate a new solution X'_α by Powell's method as illustrated in Algorithm 2; if f(X'_α) < f(X_α), replace X_α with X'_α, otherwise keep X_α and proceed to the fitness calculation
    Calculate the fitness of the population
    Update X_α, X_β, X_δ
    t = t + 1
End While
Return X_α

Algorithm 3: PGWO algorithm.
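A sketch of the hybrid step follows: it reuses the GWO update from the earlier sketch and, with the stated probability, hands the current alpha wolf to Powell's method, accepting the refined point only if it improves the fitness. The name `pgwo`, the use of SciPy's Powell routine in place of Algorithm 2, and the exact placement of the local search within the loop are assumptions of this sketch.

```python
import numpy as np
from scipy.optimize import minimize

def pgwo(fitness, dim, n=30, max_iter=500, p=0.5, lb=-100.0, ub=100.0):
    """PGWO sketch: GWO position update plus probabilistic Powell refinement of X_alpha."""
    X = np.random.uniform(lb, ub, (n, dim))
    for t in range(max_iter):
        f = np.array([fitness(x) for x in X])
        order = np.argsort(f)
        if np.random.rand() > p:                      # Powell local search on the alpha wolf
            res = minimize(fitness, X[order[0]], method="Powell",
                           options={"maxiter": 50})
            if res.fun < f[order[0]]:                 # accept only improvements
                X[order[0]] = np.clip(res.x, lb, ub)
                f[order[0]] = res.fun
        alpha, beta, delta = X[order[:3]]
        a = 2.0 * (1.0 - t / max_iter)                # a decreases linearly 2 -> 0
        parts = []
        for leader in (alpha, beta, delta):
            A = 2.0 * a * np.random.rand(n, dim) - a
            C = 2.0 * np.random.rand(n, dim)
            parts.append(leader - A * np.abs(C * leader - X))
        X = np.clip(sum(parts) / 3.0, lb, ub)
    f = np.array([fitness(x) for x in X])
    return X[np.argmin(f)], f.min()
```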

Table 1: Benchmark functions.

Sphere: $F_1 = \sum_{i=1}^{n} x_i^2$. Dim = 30; Range = [-100, 100]; $f_{\min} = 0$.

Rosenbrock: $F_2 = \sum_{i=1}^{n-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right]$. Dim = 30; Range = [-30, 30]; $f_{\min} = 0$.

Ackley: $F_3 = -20 \exp\left(-0.2 \sqrt{\tfrac{1}{n} \sum_{i=1}^{n} x_i^2}\right) - \exp\left(\tfrac{1}{n} \sum_{i=1}^{n} \cos(2\pi x_i)\right) + 20 + e$. Dim = 30; Range = [-32, 32]; $f_{\min} = 0$.

Penalized: $F_4 = \frac{\pi}{n} \left\{ 10 \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 \left[ 1 + 10 \sin^2(\pi y_{i+1}) \right] + (y_n - 1)^2 \right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, where $y_i = 1 + \frac{x_i + 1}{4}$ and
$u(x_i, a, k, m) = \begin{cases} k (x_i - a)^m, & x_i > a \\ 0, & -a < x_i < a \\ k (-x_i - a)^m, & x_i < -a \end{cases}$
Dim = 30; Range = [-50, 50]; $f_{\min} = 0$.

Penalized 2: $F_5 = 0.1 \left\{ \sin^2(3\pi x_1) + \sum_{i=1}^{n} (x_i - 1)^2 \left[ 1 + \sin^2(3\pi x_i + 1) \right] + (x_n - 1)^2 \left[ 1 + \sin^2(2\pi x_n) \right] \right\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$. Dim = 30; Range = [-50, 50]; $f_{\min} = 0$.

Kowalik: $F_6 = \sum_{i=1}^{11} \left[ a_i - \frac{x_1 (b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4} \right]^2$. Dim = 4; Range = [-5, 5]; $f_{\min} = 0.0003075$.

Goldstein-Price: $F_7 = \left[ 1 + (x_1 + x_2 + 1)^2 (19 - 14 x_1 + 3 x_1^2 - 14 x_2 + 6 x_1 x_2 + 3 x_2^2) \right] \times \left[ 30 + (2 x_1 - 3 x_2)^2 (18 - 32 x_1 + 12 x_1^2 + 48 x_2 - 36 x_1 x_2 + 27 x_2^2) \right]$. Dim = 2; Range = [-2, 2]; $f_{\min} = 3$.

We also tried varying the number of grey wolves (population size n) and the performing probability of Powell's method. From our simulations, we found that n = 30 and p = 0.5 are sufficient for most optimization problems, so we use fixed n = 30 and p = 0.5 in the rest of the simulations. In the following section, various benchmark functions are employed to investigate the efficiency of the PGWO algorithm (see Algorithm 3).

4. Experimental Results and Discussion

4.1. Simulation Platform. All algorithms are tested in Matlab R2012a (7.14), and experiments are executed on an AMD Athlon(tm) II X4 640 Processor, 3.00 GHz PC with 3 GB RAM, running Windows 7.

4.2. Benchmark Functions. In order to evaluate the performance of the PGWO algorithm, seven standard benchmark functions are employed [50, 51]. These benchmark functions can be divided into two different groups: unimodal and multimodal functions. Table 1 lists the functions, where Dim indicates the dimension of the function, Range is the boundary of the function's search space, and f_min is the minimum return value of the function. The first two functions (F1-F2) are unimodal, (F3–F5) are multimodal functions, and the last two functions (F6-F7) are fixed-dimension multimodal benchmark functions.


Table 2: Initial parameters of the algorithms.

PGWO: population size 30; a linearly decreased from 2 to 0; performing probability of Powell's method (P) = 0.5; max iteration 500 (benchmark function tests) or 200 (data set tests); stopping criterion: max iteration.

GWO: population size 30; a linearly decreased from 2 to 0; max iteration 500 (benchmark function tests) or 200 (data set tests); stopping criterion: max iteration.

PSO: population size 30; C1 = C2 = 1.4962; W = 0.7298; max iteration 500 (benchmark function tests) or 200 (data set tests); stopping criterion: max iteration.

ABC: population size 30; limit = 10; max iteration 500 (benchmark function tests) or 200 (data set tests); stopping criterion: max iteration.

CS: population size 30; p_a = 0.25; max iteration 500 (benchmark function tests) or 200 (data set tests); stopping criterion: max iteration.

GGSA: population size 30; c'_1 = (-2t^3/T^3) + 2; c'_2 = 2t^3/T^3; G_0 = 1; α = 20; max iteration 500 (benchmark function tests) or 200 (data set tests); stopping criterion: max iteration.

Here T indicates the maximum number of iterations and t is the current iteration.

There are some parameters that should be initialized before running; Table 2 lists the initial values for these algorithms.

The experimental results are presented in Table 3. The results are averaged over 20 independent runs; in the original typeset table, bold results mean that PGWO is better, while underlined results mean that the other algorithm is better. Best, Worst, Mean, and Std represent the optimal fitness value, worst fitness value, mean fitness value, and standard deviation, respectively. Note that the Matlab code of the GGSA algorithm is available at http://www.alimirjalili.com/Projects.html.

To improve the performance evaluation of evolutionary algorithms, statistical tests should be conducted [52]. In order to determine whether the results of PGWO differ from the best results of the other algorithms in a statistical sense, a nonparametric test known as Wilcoxon's rank-sum test [53, 54] is performed at the 5% significance level. The p values calculated by Wilcoxon's rank-sum test comparing PGWO and the other algorithms over all the benchmark functions are given in Table 4. According to [52], p values < 0.05 can be considered sufficient evidence against the null hypothesis.
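The significance test itself is one line with SciPy; the sketch below compares two sets of 20 final fitness values with the two-sided rank-sum test. The synthetic run data are placeholders, not results from the paper.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(1)
# Placeholder stand-ins for 20 final fitness values per algorithm
pgwo_runs = rng.normal(1.0e-3, 1.0e-4, 20)
gwo_runs = rng.normal(5.0e-3, 1.0e-3, 20)

stat, p_value = ranksums(pgwo_runs, gwo_runs)
print(p_value, p_value < 0.05)   # p < 0.05 -> reject the null hypothesis
```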

4.3. Comparison of Experimental Results. As shown in Table 3, PGWO has the best result for F1. For F2, PGWO provided better results than the other algorithms across Best, Worst, and Mean, while the Std of GWO is better than that of PGWO. The unimodal benchmark functions have only one global solution without any local optima, so they are very suitable for examining exploitation. Hence, these results indicate that the proposed method provides greatly improved exploitation compared to the other algorithms.

However, it should be noticed that multimodal benchmark functions have many local minima. For these, the final results are more important because such functions reflect the ability of the algorithm to escape from poor local optima and obtain the global optimum. We have tested the experiments on F3–F5. As seen from Table 3, PGWO provides the best results.

F6-F7 are fixed-dimension multimodal benchmark functions with only a few local minima, and the dimensions of the functions are also small. In this case, it is hard to judge the property of the algorithms. The major difference compared with functions F3–F5 is that functions F6-F7 appear to be simpler due to their low dimension and smaller number of local minima.


Table 3: Results comparison of different optimization algorithms over 20 independent runs.

Function  Criteria  GGSA         CS           ABC          PSO          GWO          PGWO
F1        Best      1.0133E-17   3.7450E+00   2.1820E+01   1.7964E-06   4.7447E-29   2.5422E-54
          Worst     1.0181E-16   2.4776E+01   1.8094E+02   2.2907E-04   2.2227E-26   7.9888E-53
          Mean      5.1858E-17   8.8292E+00   8.1908E+01   6.5181E-05   1.8269E-27   1.6276E-53
          Std       2.6953E-17   4.7633E+00   4.3834E+01   6.9891E-05   4.8884E-27   1.7834E-53
F2        Best      2.3974E+01   3.0920E+02   4.4296E+03   2.4432E+01   2.6182E+01   4.9374E+00
          Worst     3.4411E+02   2.2219E+03   4.5866E+04   4.1217E+02   2.8769E+01   1.5466E+01
          Mean      7.7481E+01   5.5445E+02   1.5193E+04   1.2465E+02   2.7507E+01   1.1692E+01
          Std       7.4751E+01   4.1835E+02   9.7239E+03   1.4004E+02   6.7488E-01   2.0844E+00
F3        Best      2.3591E-09   4.0632E+00   9.2131E+00   6.3889E-03   7.9048E-14   4.4409E-15
          Worst     9.8704E-09   1.0603E+01   1.6954E+01   4.9535E+00   1.5010E-13   4.4409E-15
          Mean      5.0646E-09   6.0677E+00   1.3090E+01   2.0423E+00   1.0498E-13   4.4409E-15
          Std       1.9095E-09   1.5421E+00   2.2882E+00   1.1464E+00   1.7485E-14   0
F4        Best      5.4733E-01   2.0863E+00   7.5357E-01   2.4473E-03   1.8626E-02   1.6432E-32
          Worst     4.6556E+00   4.1099E+00   5.5219E+00   3.6426E+00   1.2149E-01   1.6432E-32
          Mean      1.9728E+00   3.2711E+00   2.0544E+00   8.5739E-01   5.4300E-02   1.6432E-32
          Std       1.0488E+00   5.9739E-01   1.3358E+00   1.0326E+00   2.9934E-02   2.8080E-48
F5        Best      2.1024E-02   2.7015E+00   2.2507E+00   6.3080E-06   3.9444E-01   1.3498E-32
          Worst     2.4387E+01   1.2516E+01   1.1090E+02   3.6461E+00   8.5465E-01   1.9625E-01
          Mean      7.7881E+00   6.2644E+00   1.1124E+01   3.7285E-01   6.7323E-01   4.5918E-02
          Std       6.6733E+00   2.5624E+00   2.3792E+01   9.3480E-01   1.4330E-01   6.0835E-02
F6        Best      1.2358E-03   3.1550E-04   8.3768E-04   3.0749E-04   3.0750E-04   3.0752E-04
          Worst     1.2219E-02   6.0087E-04   3.4810E-03   1.9361E-03   2.0363E-02   1.5943E-03
          Mean      4.3558E-03   4.3266E-04   1.6421E-03   9.3309E-04   6.4067E-03   4.5499E-04
          Std       2.6046E-03   8.5397E-05   7.2141E-04   4.1733E-04   9.3762E-03   3.4265E-04
F7        Best      3.0000E+00   3.0000E+00   3.0000E+00   3.0000E+00   3.0000E+00   3.0000E+00
          Worst     3.0000E+00   3.0000E+00   3.0620E+00   3.0000E+00   8.4000E+01   3.0000E+00
          Mean      3.0000E+00   3.0000E+00   3.0136E+00   3.0000E+00   1.1100E+01   3.0000E+00
          Std       2.5794E-15   1.5717E-15   1.9505E-02   1.2310E-15   2.4931E+01   9.2458E-07

Table 4: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, and GGSA over the benchmark functions (values ≥ 0.05 indicate no significant difference).

PGWO vs.  GWO        PSO        ABC        CS         GGSA
F1        6.80E-08   6.80E-08   6.80E-08   6.80E-08   6.80E-08
F2        6.80E-08   6.80E-08   6.80E-08   6.80E-08   6.80E-08
F3        7.66E-09   8.01E-09   8.01E-09   8.01E-09   8.01E-09
F4        8.01E-09   8.01E-09   8.01E-09   8.01E-09   8.01E-09
F5        5.97E-08   0.295606   5.97E-08   5.97E-08   4.67E-07
F6        0.003057   0.000375   1.38E-06   0.001782   1.06E-07
F7        0.0002     4.93E-08   6.80E-08   6.19E-08   6.49E-08

For F6, CS provided better results than the other algorithms across Mean, Std, and Worst. For F7, the majority of the algorithms can find the optimal solution, but PSO is more stable than the other algorithms in terms of Std.

The p values of Wilcoxon's rank-sum test in Table 4 show that the results of PGWO on F5 are not significantly better than PSO. However, PGWO achieves significant improvement on all remaining benchmark functions compared to the other algorithms. Therefore, this is evidence that the proposed algorithm has high performance in dealing with unimodal, multimodal, and fixed-dimension multimodal benchmark functions.

The convergence curves of the six algorithms are illustrated in Figure 1. As can be seen in these figures, PGWO has a much better convergence rate than the other algorithms on all benchmark functions.

According to this comprehensive comparative study and discussion, these results show that the proposed algorithm is able to significantly improve the performance of GWO and overcome its major shortcomings. For that reason, in the next section we apply the PGWO algorithm to solve clustering problems.


[Figure 1: The convergence curves of the average fitness value over 20 independent runs. Panels F1–F7; x-axis: iteration (0–500); y-axis: average best so far (log scale); legend: GGSA, CS, ABC, PSO, GWO, PGWO.]


5. Data Clustering

5.1. The Mathematical Description of Data Clustering. The clustering sample set is X = {X_i, i = 1, 2, ..., n}, where X_i is a p-dimensional vector. The clustering problem is to find a partition C = {C_1, C_2, ..., C_m} satisfying the following conditions [26]:

(1) $X = \bigcup_{i=1}^{m} C_i$;
(2) $C_i \neq \emptyset$, $i = 1, 2, \ldots, m$;
(3) $C_i \cap C_j = \emptyset$, $i, j = 1, 2, \ldots, m$, $i \neq j$.

5.2. The Clustering Criterion. Clustering is the process of grouping a set of data objects into a number of clusters or groups. The aim of clustering is to make the data within a cluster highly similar while being very dissimilar to objects in other clusters. Dissimilarities and similarities are evaluated based on the attributes of the data sets using a distance metric. The most popular distance metric is the Euclidean distance [55]. Let $i = (x_{i1}, x_{i2}, \ldots, x_{ip})$ and $j = (x_{j1}, x_{j2}, \ldots, x_{jp})$ be two objects described by p numeric attributes; the Euclidean distance between objects i and j is

$$d(i, j) = \sqrt{(x_{i1} - x_{j1})^2 + (x_{i2} - x_{j2})^2 + \cdots + (x_{ip} - x_{jp})^2} \tag{5}$$

For given N objects, the clustering problem is to minimize the sum of squared Euclidean distances between objects and cluster centers, allocating each object to one of k cluster centers [55]. Clustering aims at finding the cluster centers by minimizing the objective function, defined as follows [26]:

$$J_c = \sum_{k=1}^{m} \sum_{X_i \in C_k} d(X_i, Z_k) \tag{6}$$

where m indicates the number of clusters, C_k indicates the kth cluster, Z_k indicates the kth cluster center, and d(X_i, Z_k) indicates the distance of the sample to the corresponding cluster center, namely $d(X_i, Z_k) = \|X_i - Z_k\|$.

5.3. PGWO Algorithm on Clustering. In clustering analysis, each element in the data set is a p-dimensional vector. Moreover, the position of a grey wolf represents the k cluster centers, so each grey wolf is a k*p-dimensional vector. For each grey wolf i, its position is denoted as a vector X_i = (x_{i1}, x_{i2}, ..., x_{i,k*p}). In the initialization phase, we use the maximum and minimum values of each component of the data set (which is to be grouped) as the initialization search scope of the grey wolves, and the initial solutions are randomly generated in this range. We use (6) to calculate the fitness of the grey wolf individuals; the main steps of the fitness function are shown in Algorithm 4.

5.4. Data Clustering Experimental Results and Discussion. In order to verify the performance of the proposed PGWO approach for clustering, we compare the results of the K-means, GGSA, CS, ABC, PSO, GWO, and PGWO clustering algorithms using nine different data sets selected from the UCI machine learning repository [56].

Artificial data set (N = 600, d = 2, k = 4) is a two-featured problem with four unique classes.


(1) For each data vector x_i
(2)     Calculate the Euclidean distance by (5)
(3)     Assign x_i to the closest cluster center
(4)     Calculate the measure function by (6)
(5) End For
(6) Return the value of the fitness function

Algorithm 4: Main steps of the fitness function.
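In code, Algorithm 4 amounts to decoding a wolf's k*p position vector into k centers, assigning each sample to its nearest center by (5), and summing the distances as in (6). A NumPy sketch follows, with an illustrative random data set standing in for a real one:

```python
import numpy as np

def clustering_fitness(wolf, data, k):
    """Eq. (6): sum of Euclidean distances from samples to their nearest center."""
    centers = wolf.reshape(k, -1)                     # decode k*p vector -> k centers
    # Pairwise distances, eq. (5): result has shape (n_samples, k)
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    return float(dists.min(axis=1).sum())             # nearest-center assignment

# Demo with an Iris-sized problem: 150 samples, p = 4 features, k = 3 clusters
data = np.random.rand(150, 4)
wolf = np.random.rand(3 * 4)                          # one grey wolf position
print(clustering_fitness(wolf, data, k=3))
```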

A total of 600 patterns were drawn from four independent bivariate normal distributions, where classes were distributed according to

$$N_2\left(\mu = \begin{pmatrix} m_i \\ 0 \end{pmatrix},\; \Sigma = \begin{bmatrix} 0.5 & 0.05 \\ 0.05 & 0.5 \end{bmatrix}\right) \tag{7}$$

where i = 1, 2, 3, 4; m_1 = −3, m_2 = 0, m_3 = 3, and m_4 = 6. μ and Σ are the mean vector and covariance matrix, respectively [45, 57].
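A sketch of generating this artificial data set per (7), assuming the 600 patterns are split evenly across the four classes (150 each), which the text does not state explicitly:

```python
import numpy as np

rng = np.random.default_rng(0)
cov = np.array([[0.50, 0.05],
                [0.05, 0.50]])                 # shared covariance matrix Sigma
means = [(-3.0, 0.0), (0.0, 0.0), (3.0, 0.0), (6.0, 0.0)]   # mu_i = (m_i, 0)

# 150 points per class is an assumed even split giving N = 600, d = 2, k = 4
art = np.vstack([rng.multivariate_normal(mu, cov, 150) for mu in means])
labels = np.repeat(np.arange(4), 150)
print(art.shape, labels.shape)                 # (600, 2) (600,)
```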

Iris data (N = 150, d = 4, K = 3) is a data set with 150 random samples of flowers from the Iris species setosa, versicolor, and virginica, collected by Anderson [58]. From each species there are 50 observations of sepal length, sepal width, petal length, and petal width in cm. This data set was used by Fisher [59] in his initiation of the linear-discriminant-function technique [28, 56, 57].

Wisconsin breast cancer (N = 683, d = 9, K = 2) consists of 683 objects characterized by nine features: clump thickness, cell size uniformity, cell shape uniformity, marginal adhesion, single epithelial cell size, bare nuclei, bland chromatin, normal nucleoli, and mitoses. There are two categories in the data: malignant (444 objects) and benign (239 objects) [28, 56, 57].

Contraceptive method choice (N = 1473, d = 10, K = 3), CMC for short, is a subset of the 1987 National Indonesia Contraceptive Prevalence Survey. The samples are married women who either were not pregnant or did not know if they were at the time of interview. The problem is to predict the current contraceptive method choice (no use, long-term methods, or short-term methods) of a woman based on her demographic and socioeconomic characteristics [28, 56, 57].

Seeds data (N = 210, d = 7, K = 3) consists of 210 patterns belonging to three different varieties of wheat: Kama, Rosa, and Canadian. From each species there are 70 observations of area A, perimeter P, compactness C (C = 4πA/P²), length of kernel, width of kernel, asymmetry coefficient, and length of kernel groove [56].

Statlog (Heart) data (N = 270, d = 13, K = 2) is a heart disease database similar to a database already present in the repository (heart disease databases), but in a slightly different form [56].

Wine data (N = 178, d = 13, K = 3) is also taken from the MCI laboratory. These data are the results of a chemical analysis of wines grown in the same region in Italy but derived from three different cultivars. The analysis determined the quantities of 13 constituents found in each of the three types of wines. There are 178 instances with 13 numeric attributes in the wine data set. All attributes are continuous, and there are no missing attribute values [28, 56, 57].

Balance scale data (N = 625, d = 4, K = 3) was generated to model psychological experimental results. Each example is classified as having the balance scale tip to the right, tip to the left, or be balanced. The attributes are the left weight, the left distance, the right weight, and the right distance. The correct way to find the class is the greater of (left-distance × left-weight) and (right-distance × right-weight); if they are equal, it is balanced [56].

Haberman's Survival (N = 306, d = 3, K = 2) contains cases from a study conducted between 1958 and 1970 at the University of Chicago's Billings Hospital on the survival of patients who had undergone surgery for breast cancer. It records two survival statuses of patients, with the age of the patient at the time of operation, the patient's year of operation, and the number of positive axillary nodes detected [56].

The results of the comparison of intracluster distances for the seven clustering algorithms over 20 independent runs are shown in Table 5. Table 6 reports the p values produced by Wilcoxon's rank-sum test comparing PGWO and the other algorithms over all the data sets.

For the Art data set, the optimum value, the worst value, and the average value of PGWO and GGSA are all 5.1390E+02, while the standard deviation of GGSA is better than that of PGWO. PSO only attains the optimum solution 5.1390E+02.

For the Iris, Cancer, and Seeds data sets, PGWO, PSO, and GGSA provide the optimum value in comparison to those obtained by the other methods. However, the worst value, the average value, and the standard deviation of PGWO are superior to those of the other methods.

For the Heart data set, PGWO, PSO, and CS all find the optimum solution 1.0623E+04; that means they all can find the global solution. However, PGWO is slightly better in terms of the worst value, the average value, and the standard deviation.

For the CMC and Wine data sets, Table 5 shows that the best, worst, average, and standard deviation values of the fitness function for the PGWO algorithm are much smaller than those of the other six methods. The PGWO clustering algorithm is capable of providing the same partition of the data points in all runs.

For the balance scale data set, the optimum values of the fitness function for PGWO, PSO, and GGSA are 1.4238E+03; that means they all can find the global solution. The optimum values of the fitness function for GWO and K-means are 1.4239E+03, close to PGWO, PSO, and GGSA. However, the standard deviation values of GWO, K-means, PGWO, PSO, GGSA, and CS are 6.8756E-01, 3.6565E+00, 8.6619E-01, 9.7396E-01, 1.6015E+00, and 7.3239E-01, respectively; from the standard deviation, we can see that GWO is more stable than the other methods.

For Haberman's Survival data set, the optimum value, the worst value, and the average value of the fitness function for CS and PSO are almost the same. The results of the CS and PSO algorithms are better than those of the other methods.



Table 5: Comparison of intracluster distances for the seven clustering algorithms over 20 independent runs.

Data set           Criteria  K-means      GGSA         CS           ABC          PSO          GWO          PGWO
Art                Best      5.1454E+02   5.1390E+02   5.1391E+02   5.1752E+02   5.1390E+02   5.1455E+02   5.1390E+02
                   Worst     9.0474E+02   5.1390E+02   5.3213E+02   5.5298E+02   8.9248E+02   5.2058E+02   5.1390E+02
                   Mean      6.0974E+02   5.1390E+02   5.1597E+02   5.2620E+02   5.3283E+02   5.1687E+02   5.1390E+02
                   Std       1.6903E+02   1.5430E-13   4.8536E+00   9.7988E+00   8.4652E+01   1.8768E+00   6.1243E-08
Iris               Best      9.7190E+01   9.6655E+01   9.6659E+01   9.7616E+01   9.6655E+01   9.6835E+01   9.6655E+01
                   Worst     1.2318E+02   1.1839E+02   9.7684E+01   1.0401E+02   1.2767E+02   1.2108E+02   9.6655E+01
                   Mean      1.0226E+02   9.8865E+01   9.6862E+01   9.9710E+01   1.0596E+02   9.9903E+01   9.6655E+01
                   Std       1.0290E+01   5.5454E+00   3.1103E-01   1.8460E+00   1.4578E+01   6.7482E+00   8.5869E-10
Cancer             Best      2.9763E+03   2.9644E+03   2.9645E+03   2.9887E+03   2.9644E+03   2.9645E+03   2.9644E+03
                   Worst     2.9884E+03   3.0611E+03   2.9677E+03   3.2650E+03   4.7288E+03   2.9650E+03   2.9644E+03
                   Mean      2.9830E+03   2.9715E+03   2.9651E+03   3.0760E+03   3.4055E+03   2.9647E+03   2.9644E+03
                   Std       4.8278E+00   2.1284E+01   7.3222E-01   6.5283E+01   7.8385E+02   1.1452E-01   1.6118E-08
CMC                Best      5.7016E+03   5.6938E+03   5.7054E+03   5.8219E+03   5.6942E+03   5.8236E+03   5.6937E+03
                   Worst     5.7053E+03   5.7111E+03   5.7762E+03   6.3743E+03   7.1580E+03   6.0911E+03   5.6937E+03
                   Mean      5.7040E+03   5.6971E+03   5.7263E+03   6.0868E+03   5.7690E+03   5.9239E+03   5.6937E+03
                   Std       1.1022E+00   3.7872E+00   2.1119E+01   1.7292E+02   3.2694E+02   9.0635E+01   9.8777E-04
Seeds              Best      3.1322E+02   3.1180E+02   3.1187E+02   3.1573E+02   3.1180E+02   3.1323E+02   3.1180E+02
                   Worst     3.1373E+02   3.1354E+02   3.1746E+02   3.4997E+02   4.2035E+02   3.1782E+02   3.1180E+02
                   Mean      3.1337E+02   3.1189E+02   3.1347E+02   3.3161E+02   3.2813E+02   3.1475E+02   3.1180E+02
                   Std       2.3686E-01   3.8862E-01   1.6383E+00   1.0943E+01   3.9697E+01   1.5716E+00   1.7815E-09
Heart              Best      1.0682E+04   1.0749E+04   1.0623E+04   1.0683E+04   1.0623E+04   1.0637E+04   1.0623E+04
                   Worst     1.0701E+04   1.2684E+04   1.0625E+04   1.1622E+04   1.0626E+04   1.0683E+04   1.0624E+04
                   Mean      1.0693E+04   1.1542E+04   1.0624E+04   1.0920E+04   1.0624E+04   1.0657E+04   1.0623E+04
                   Std       8.1672E+00   5.7044E+02   4.6239E-01   2.5921E+02   7.6866E-01   1.3817E+01   2.3235E-01
Wine               Best      1.6385E+04   1.6493E+04   1.6296E+04   1.6566E+04   1.6294E+04   1.6316E+04   1.6292E+04
                   Worst     1.8437E+04   2.0245E+04   1.6311E+04   1.7668E+04   1.6312E+04   1.6371E+04   1.6294E+04
                   Mean      1.6974E+04   1.7999E+04   1.6301E+04   1.7010E+04   1.6297E+04   1.6345E+04   1.6293E+04
                   Std       8.6757E+02   1.1562E+03   4.5677E+00   3.6824E+02   4.0149E+00   1.4836E+01   5.2338E-01
Balance scale      Best      1.4239E+03   1.4238E+03   1.4256E+03   1.4265E+03   1.4238E+03   1.4239E+03   1.4238E+03
                   Worst     1.4337E+03   1.4291E+03   1.4285E+03   1.4310E+03   1.4262E+03   1.4260E+03   1.4257E+03
                   Mean      1.4275E+03   1.4259E+03   1.4268E+03   1.4282E+03   1.4248E+03   1.4243E+03   1.4245E+03
                   Std       3.6565E+00   1.6015E+00   7.3239E-01   1.1831E+00   9.7396E-01   6.8756E-01   8.6619E-01
Haberman-Survival  Best      2.6251E+03   2.5670E+03   2.5670E+03   2.5671E+03   2.5670E+03   2.5673E+03   2.5670E+03
                   Worst     3.1966E+03   2.6226E+03   2.5670E+03   2.5709E+03   2.5678E+03   2.6686E+03   2.5678E+03
                   Mean      2.6554E+03   2.5702E+03   2.5670E+03   2.5679E+03   2.5671E+03   2.5898E+03   2.5673E+03
                   Std       1.2740E+02   1.2354E+01   1.7590E-06   8.3107E-01   3.0624E-01   2.3346E+01   4.0711E-01

Table 6: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, GGSA, and K-means over the data sets (values ≥ 0.05 indicate no significant difference).

PGWO vs.           GWO        PSO        ABC        CS         GGSA       K-means
Art                6.80E-08   1.60E-05   6.80E-08   6.80E-08   6.11E-08   4.49E-08
Iris               6.75E-08   1.19E-06   6.75E-08   6.75E-08   0.107346   5.17E-08
Cancer             6.76E-08   6.76E-08   6.76E-08   6.76E-08   1.20E-06   5.98E-08
CMC                6.69E-08   6.69E-08   6.69E-08   6.69E-08   6.69E-08   3.39E-08
Seeds              6.80E-08   6.80E-08   6.80E-08   6.80E-08   1.23E-03   4.81E-08
Heart              6.61E-08   2.02E-06   6.61E-08   5.87E-07   6.61E-08   6.07E-08
Wine               6.80E-08   1.23E-07   6.80E-08   6.80E-08   6.80E-08   6.63E-08
Balance scale      0.067868   0.007712   6.80E-08   1.06E-07   0.003966   6.01E-07
Haberman-Survival  6.90E-07   0.524948   1.78E-03   0.860418   0.090604   6.26E-08


[Figure 2: Convergence curves of clustering on the data sets over 20 independent runs. Panels: Art, Iris, Cancer, CMC, Seeds, Heart, Wine, Balance scale, Haberman's Survival; x-axis: iteration (0–200); y-axis: average best so far; legend: GGSA, CS, ABC, PSO, GWO, PGWO.]

The p values in Table 6 show that the results of PGWO are significantly better on the Art, Cancer, CMC, Seeds, Heart, and Wine data sets. From Table 6 we conclude that, for the Iris data set, PGWO performs superior to the other algorithms except for GGSA. Since the p value for the balance scale data set of PGWO versus GWO is more than 0.05, there is no statistical difference between the two. For Haberman's Survival data set, comparing PGWO with the other algorithms, we can conclude that PGWO performs significantly better in three out of six pairwise comparisons. So it can be claimed that PGWO provides better results than the other algorithms across the majority of the data sets.

Figure 2 shows the convergence curves of clustering on the data sets over 20 independent runs. As can be seen from the figure, the convergence speed of PGWO is the fastest.

Figure 3 shows ANOVA tests of clustering on the data sets over 20 independent runs. As seen from Figure 3, PGWO is very stable for the majority of the data sets. For the Art, Seeds, and CMC data sets, PGWO, PSO, and GGSA can obtain relatively stable optimal values. For the Heart and Wine data sets, the stability of PGWO, PSO, and CS is outstanding. For the Cancer data set, most of the algorithms can obtain the stable optimal value, except for the ABC and PSO algorithms. For the Iris data set, we can clearly see that PGWO is better in terms of stability. For the balance scale data set, GWO obtains a relatively stable optimal value. For Haberman's Survival data set, the stability of CS and PSO is the best, but PGWO and GGSA follow them closely.

Clustering results of the Art, Iris, and Survival data sets by the PGWO algorithm are presented in Figure 4, which visualizes them clearly. It can be seen from Figure 4 that the PGWO algorithm performs very well on the Art, Iris, and Survival data sets.

In summary, the results show that the proposed method successfully outperforms the other algorithms across the majority of benchmark functions. Furthermore, the test results on clustering problems show that PGWO is able to provide very competitive results. Therefore, it appears from this comparative study that the proposed method has merit in the field of evolutionary algorithms and optimization.


[Figure 3: ANOVA tests of clustering on the data sets over 20 independent runs. Panels: Art, Iris, Cancer, CMC, Seeds, Heart, Wine, Balance scale, Haberman's Survival; x-axis: algorithms (GGSA, CS, ABC, PSO, GWO, PGWO); y-axis: fitness value.]


6. Conclusion and Future Works

In order to apply the grey wolf optimizer to solve complex optimization problems efficiently, this paper proposed a novel grey wolf optimizer based on the Powell local optimization method, namely PGWO. In PGWO, the original GWO algorithm is first applied to shrink the search region to a more promising area; thereafter, Powell's method is implemented as a critical complement, performing a local search to exploit the limited area intensively and obtain better solutions. PGWO attempts to take advantage of the merits of both GWO and Powell's method in order to avoid all grey wolves getting trapped in inferior local optimal regions. PGWO enables the grey wolves to have more diverse exemplars to learn from, as the grey wolves are updated each generation and also form new grey wolves to search a larger search space. With both techniques combined, PGWO can balance exploration and exploitation and effectively solve complex problems. The experimental results show the effectiveness of Powell's method in terms of solution quality and convergence speed. The proposed algorithm was benchmarked on seven well-known test functions, and the results were compared with GGSA, CS, ABC, PSO, and GWO. The results show that the PGWO algorithm is capable of providing very competitive results compared to these famous metaheuristics. Because of the superior performance of the PGWO algorithm, we used it to solve clustering problems. The algorithm has been tested on an artificial data set and eight real data sets. To justify the performance of the PGWO algorithm on clustering problems, we compared it with the original GWO, GGSA, CS, ABC, PSO, and K-means. The results prove that the PGWO algorithm is able to significantly outperform the others on the majority of the data sets in terms of average value and standard deviation of the fitness function.


[Figure 4: The original data distribution of the Art, Iris, and Survival data sets (left) and the clustering results by the PGWO algorithm (right).]


Moreover, the experimental results demonstrate that the proposed PGWO algorithm can be considered a feasible and efficient method for solving optimization problems.

Our future work will focus on two issues. On one hand, we will apply the proposed PGWO algorithm to higher-dimensional problems and larger numbers of patterns. On the other hand, the PGWO clustering algorithm will be extended to dynamically determine the optimal number of clusters.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grants nos. 61165015, 61463007, and 61563008 and the Project of Guangxi University for Nationalities Science Foundation under Grant no. 2012MDZD037.



on the distance or similarity between samples. As a result of this characteristic, clustering techniques have been applied to a wide range of problems, such as data mining [23], data analysis, pattern recognition [24], and image segmentation [25]. The traditional clustering methods can be simply classified as partitioning methods, hierarchical methods, density-based methods, and grid-based methods [26]. In this paper we concentrate on partitioning methods. A partitioning clustering method divides data vectors into a predefined number of clusters through optimizing a certain criterion. K-means is the most popular partitioning clustering method [27, 28]. However, K-means highly depends on the initial states and often falls into local optima. In order to overcome this problem, many methods have been proposed. For example, a K-harmonic mean algorithm has been proposed for clustering instead of K-means in [29]. A simulated annealing- (SA-) based approach has been developed in [30]. A tabu search- (TS-) based method was introduced in [31, 32]. Genetic algorithm- (GA-) based methods were presented in [33-36]. Fathian et al. [37] developed a clustering algorithm based on honey-bee mating optimization (HBMO). Particle swarm optimization (PSO) is applied for clustering in [38]. Hatamlou et al. employed a big bang-big crunch algorithm for data clustering in [39]. Karaboga and Ozturk presented a novel clustering approach based on the artificial bee colony (ABC) algorithm in [40]. Data clustering based on the gravitational search algorithm was presented in [41, 42]. In 1991, Colorni et al. presented the ant colony optimization (ACO) algorithm, based on the behavior of ants seeking a path between their colony and a source of food; Shelokar and Kao then solved the clustering problem using the ACO algorithm [43, 44]. Kao et al. presented a hybrid approach combining the K-means algorithm, Nelder-Mead simplex search, and PSO for clustering analysis [45]. Niknam et al. presented a hybrid evolutionary algorithm based on PSO and SA to solve the clustering problem [46]. But every algorithm has some drawbacks: for example, the K-means algorithm gets stuck in local optima; in the case of the genetic algorithm, convergence is highly dependent on the initial positions; in ACO, the solution vector is affected as the number of iterations increases. Kao et al. [45] and Niknam et al. [46] stated that PSO gives better clustering results when it is applied to one-dimensional and small data sets, but when applied to large data sets it does not give good results. In this paper, the PGWO algorithm is used to solve the clustering problem, tested on nine data sets. This algorithm takes advantage of both GWO and Powell's method: the process is started by GWO, which searches the whole space for a global solution, and once a promising solution is found the search switches to Powell's method for faster convergence to finish the clustering process. As can be seen from the simulation results, the proposed algorithm not only has a higher convergence speed but also finds better solutions than the other algorithms across the majority of data sets, whether small or large.

The rest of the paper is organized as follows. Section 2 presents a brief introduction to GWO. Section 3 discusses the basic principles of GWO based on Powell local optimization method. The experimental results of test functions and data clustering are shown in Sections 4 and 5, respectively. Finally, Section 6 concludes the work and suggests some directions for future studies.

2. Grey Wolf Optimizer (GWO)

The GWO algorithm, proposed by Mirjalili et al. (2014) [7], is inspired by the hunting behavior and social leadership of grey wolves in nature. As in other metaheuristics, the search begins with a population of randomly generated wolves (candidate solutions). To formulate the social hierarchy of wolves when designing GWO, the population is split into four groups: alpha (α), beta (β), delta (δ), and omega (ω). Over the course of iterations, the first three best solutions are called α, β, and δ, respectively; the rest of the candidate solutions are named ω. The hunting (optimization) is guided by α, β, and δ, and the ω wolves are required to encircle α, β, and δ so as to find better solutions. The encircling process can be modeled as follows [7]:

$$\vec{D} = \left| \vec{C} \cdot \vec{X}_p(t) - \vec{X}(t) \right|, \qquad \vec{X}(t+1) = \vec{X}_p(t) - \vec{A} \cdot \vec{D}, \quad (1)$$

where $t$ indicates the current iteration, $\vec{A} = 2\vec{a} \cdot \vec{r}_1 - \vec{a}$, $\vec{C} = 2 \cdot \vec{r}_2$, $\vec{X}_p$ is the position vector of the prey, $\vec{X}$ indicates the position vector of a grey wolf, $\vec{a}$ is gradually decreased from 2 to 0, and $\vec{r}_1$ and $\vec{r}_2$ are random vectors over the range $[0, 1]$.

In order to mathematically simulate the hunting behavior of grey wolves, the GWO algorithm always assumes that α, β, and δ have better knowledge about the position of the prey (the optimum). Therefore, the positions of the first three best solutions (α, β, δ) obtained so far are saved, and the other wolves (ω) are obliged to reposition with respect to α, β, and δ. The mathematical model of readjusting the positions of the ω wolves is presented as follows [7]:

$$\vec{D}_\alpha = \left| \vec{C}_1 \cdot \vec{X}_\alpha - \vec{X} \right|, \qquad \vec{D}_\beta = \left| \vec{C}_2 \cdot \vec{X}_\beta - \vec{X} \right|, \qquad \vec{D}_\delta = \left| \vec{C}_3 \cdot \vec{X}_\delta - \vec{X} \right|, \quad (2)$$

$$\vec{X}_1 = \vec{X}_\alpha - \vec{A}_1 \cdot \vec{D}_\alpha, \qquad \vec{X}_2 = \vec{X}_\beta - \vec{A}_2 \cdot \vec{D}_\beta, \qquad \vec{X}_3 = \vec{X}_\delta - \vec{A}_3 \cdot \vec{D}_\delta, \quad (3)$$

$$\vec{X}(t+1) = \frac{\vec{X}_1 + \vec{X}_2 + \vec{X}_3}{3}, \quad (4)$$

where $\vec{X}_\alpha$ is the position of the alpha, $\vec{X}_\beta$ is the position of the beta, $\vec{X}_\delta$ is the position of the delta, $\vec{A}_1$, $\vec{A}_2$, $\vec{A}_3$ and $\vec{C}_1$, $\vec{C}_2$, $\vec{C}_3$ are all random vectors, $\vec{X}$ is the position of the current solution, and $t$ indicates the number of iterations.


Initialize the grey wolf population X_i (i = 1, 2, ..., n) and parameters
Calculate the fitness of the population
Find the first three agents X_α, X_β, X_δ
While (t < max number of iterations)
    Update the position of the current search agent by (4)
    Calculate the fitness of the population
    Update X_α, X_β, X_δ
    t = t + 1
End while
Return X_α

Algorithm 1: GWO algorithm.
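To make the update rules concrete, the following is a minimal NumPy sketch of one GWO iteration implementing (2)-(4); the function name gwo_step and the fitness callback are illustrative conventions, not code from the paper.

```python
import numpy as np

def gwo_step(wolves, fitness, a):
    """One GWO iteration: move each wolf toward alpha, beta, and delta (eqs. (2)-(4))."""
    scores = np.array([fitness(w) for w in wolves])
    leaders = wolves[np.argsort(scores)[:3]]          # alpha, beta, delta
    new_wolves = np.empty_like(wolves)
    for i, x in enumerate(wolves):
        candidates = []
        for leader in leaders:
            r1 = np.random.rand(x.size)
            r2 = np.random.rand(x.size)
            A = 2.0 * a * r1 - a                      # A = 2a*r1 - a
            C = 2.0 * r2                              # C = 2*r2
            D = np.abs(C * leader - x)                # eq. (2)
            candidates.append(leader - A * D)         # eq. (3)
        new_wolves[i] = sum(candidates) / 3.0         # eq. (4)
    return new_wolves
```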

Initialize the starting point X_1, the independent direction vectors d_i = e_i (i = 1, ..., D), and the tolerance ε for the stopping criterion; set f(1) = f(X_1), X_c = X_1, k = 1
While (the stopping criterion is not met, namely |Δf| > ε) do
    For i = 1 to D do
        If (k ≥ 2) then d_i = d_{i+1} End If
        X_{i+1} = X_i + λ_i d_i, where λ_i is determined by minimizing f(X_{i+1})
    End For
    d_{D+1} = Σ_{i=1}^{D} λ_i d_i = X_{D+1} − X_1
    X_c(k+1) = X_{D+1} + λ_k d_{D+1}, f(k+1) = f(X_c(k+1))
    k = k + 1, X_1 = X_c(k), Δf = f(k) − f(k−1)
End While

Algorithm 2: Powell's method.
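In practice, Algorithm 2 need not be reimplemented from scratch: SciPy ships a derivative-free Powell search that can serve as the local refinement step. The sketch below is a minimal stand-in, assuming box bounds given as lo/hi arrays and illustrative tolerance settings.

```python
import numpy as np
from scipy.optimize import minimize

def powell_refine(f, x0, lo, hi, max_iter=100, tol=1e-8):
    """Refine x0 with Powell's conjugate-direction method (no derivatives required)."""
    res = minimize(f, x0, method="Powell",
                   options={"maxiter": max_iter, "xtol": tol, "ftol": tol})
    x = np.clip(res.x, lo, hi)   # keep the refined point inside the search region
    return x, f(x)
```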

In these formulas it may also be observed that there are two vectors, A and C, obliging the GWO algorithm to explore and exploit the search space. With decreasing A, half of the iterations are devoted to exploration (|A| ≥ 1) and the other half are dedicated to exploitation (|A| < 1). The range of C is 0 ≤ C ≤ 2; the vector C also improves exploration when C > 1, and exploitation is emphasized when C < 1. Note here that A is decreased linearly over the course of the iterations, whereas C is generated randomly, whose aim is to emphasize exploration or exploitation at any stage, avoiding local optima. The main steps of the grey wolf optimizer are given in Algorithm 1.

3. Grey Wolf Optimizer Based on Powell Local Optimization Method (PGWO)

3.1. Local Search Based on Powell's Method. Powell's method, strictly Powell's conjugate direction method, is an algorithm proposed by Powell [22] for finding a local minimum of a function. The function need not be differentiable, and no derivatives are taken. The method successively performs a bidirectional search along each search vector to pursue the minimum of the function. The new position is represented as a linear combination of the search vectors, and the new displacement vector is added to the search vector list as a new search vector. Correspondingly, the most successful vector, which contributed most to the new direction, is deleted from the search vector list. The algorithm iterates until no significant improvement is made. The detailed steps of Powell's method are presented in Algorithm 2 [47].

3.2. The Proposed Algorithm. Over the last years, many research results have been published in the field of evolutionary algorithms [7], and they show that hybridizations of metaheuristics and local search techniques are exceptionally successful. Such hybridizations have been proposed especially in combinatorial and discrete solution spaces [48, 49]. Interestingly, for real-valued solution spaces few results have been introduced yet, an astonishing fact as many direct search methods are fairly fast optimizers [21]. Therefore, in order to overcome the shortcomings of the GWO algorithm, including slow convergence speed, easily falling into local optima, low computation accuracy, and low success rate of convergence [19, 20], a GWO algorithm based on Powell local optimization method is proposed. It adopts the powerful local optimization ability of Powell's method and embeds it into GWO as a local search operator. In this case, the proposed method has the potential to provide superior results compared to other state-of-the-art evolutionary algorithms. The general steps of PGWO are presented in Algorithm 3, in which p indicates the performing probability of Powell's method.


Initialize the grey wolf population X_i (i = 1, 2, ..., n) and parameters
Calculate the fitness of the population
Find the first three agents X_α, X_β, X_δ
While (t < max number of iterations)
    Update the position of the current search agent by (4)
    If rand > p, choose the current best solution X_α as a starting point and generate a new solution X'_α by Powell's method as illustrated in Algorithm 2; if f(X'_α) < f(X_α), replace X_α with X'_α, otherwise keep X_α
    Calculate the fitness of the population
    Update X_α, X_β, X_δ
    t = t + 1
End While
Return X_α

Algorithm 3: PGWO algorithm.
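Putting the pieces together, a minimal sketch of the PGWO main loop could look as follows; it reuses the hypothetical gwo_step and powell_refine helpers from the previous sketches and fixes n = 30 and p = 0.5 as in the paper.

```python
import numpy as np

def pgwo(fitness, dim, lo, hi, n=30, p=0.5, max_iter=500):
    wolves = np.random.uniform(lo, hi, size=(n, dim))   # random initial pack
    for t in range(max_iter):
        a = 2.0 * (1.0 - t / max_iter)                  # a decreases linearly 2 -> 0
        wolves = np.clip(gwo_step(wolves, fitness, a), lo, hi)
        scores = np.array([fitness(w) for w in wolves])
        best = int(np.argmin(scores))
        if np.random.rand() > p:                        # Powell local search on X_alpha
            x_new, f_new = powell_refine(fitness, wolves[best], lo, hi)
            if f_new < scores[best]:                    # accept only improvements
                wolves[best] = x_new
    scores = np.array([fitness(w) for w in wolves])
    return wolves[int(np.argmin(scores))]               # return X_alpha
```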

Table 1: Benchmark functions.

F1 (Sphere): $F_1 = \sum_{i=1}^{n} x_i^2$; Dim = 30; Range = [-100, 100]; f_min = 0.

F2 (Rosenbrock): $F_2 = \sum_{i=1}^{n-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right]$; Dim = 30; Range = [-30, 30]; f_min = 0.

F3 (Ackley): $F_3 = -20 \exp\left(-0.2 \sqrt{\frac{1}{n} \sum_{i=1}^{n} x_i^2}\right) - \exp\left(\frac{1}{n} \sum_{i=1}^{n} \cos(2\pi x_i)\right) + 20 + e$; Dim = 30; Range = [-32, 32]; f_min = 0.

F4 (Penalized): $F_4 = \frac{\pi}{n} \left\{ 10 \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 \left[ 1 + 10 \sin^2(\pi y_{i+1}) \right] + (y_n - 1)^2 \right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, where $y_i = 1 + \frac{x_i + 1}{4}$ and
$u(x_i, a, k, m) = \begin{cases} k (x_i - a)^m, & x_i > a \\ 0, & -a \le x_i \le a \\ k (-x_i - a)^m, & x_i < -a \end{cases}$; Dim = 30; Range = [-50, 50]; f_min = 0.

F5 (Penalized 2): $F_5 = 0.1 \left\{ \sin^2(3\pi x_1) + \sum_{i=1}^{n} (x_i - 1)^2 \left[ 1 + \sin^2(3\pi x_i + 1) \right] + (x_n - 1)^2 \left[ 1 + \sin^2(2\pi x_n) \right] \right\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$; Dim = 30; Range = [-50, 50]; f_min = 0.

F6 (Kowalik): $F_6 = \sum_{i=1}^{11} \left[ a_i - \frac{x_1 (b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4} \right]^2$; Dim = 4; Range = [-5, 5]; f_min = 0.0003075.

F7 (Goldstein-Price): $F_7 = \left[ 1 + (x_1 + x_2 + 1)^2 (19 - 14 x_1 + 3 x_1^2 - 14 x_2 + 6 x_1 x_2 + 3 x_2^2) \right] \times \left[ 30 + (2 x_1 - 3 x_2)^2 (18 - 32 x_1 + 12 x_1^2 + 48 x_2 - 36 x_1 x_2 + 27 x_2^2) \right]$; Dim = 2; Range = [-2, 2]; f_min = 3.

The performing probability p of Powell's method is set to 0.5. We also tried varying the number of grey wolves (population size n) and the performing probability of Powell's method; from our simulations we found that n = 30 and p = 0.5 are sufficient for most optimization problems, so we use fixed n = 30 and p = 0.5 in the rest of the simulations. In the following section, various benchmark functions are employed to investigate the efficiency of the PGWO algorithm (see Algorithm 3).

4. Experimental Results and Discussion

4.1. Simulation Platform. All algorithms are tested in Matlab R2012a (7.14), and the experiments are executed on an AMD Athlon(tm) II X4 640 Processor, 3.00 GHz PC with 3 GB RAM; the operating system is Windows 7.

4.2. Benchmark Functions. In order to evaluate the performance of the PGWO algorithm, seven standard benchmark functions are employed [50, 51]. These benchmark functions can be divided into two different groups: unimodal and multimodal functions. Table 1 lists the functions, where Dim indicates the dimension of the function, Range is the boundary of the function's search space, and f_min is the minimum return value of the function. The first two functions (F1-F2) are unimodal, (F3-F5) are multimodal functions, and the last two functions (F6-F7) are fixed-dimension multimodal benchmark functions.
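For reference, the first three benchmarks of Table 1 are straightforward to express directly; below is a short NumPy sketch of F1-F3, vectorized over a single solution vector x (the function names are ours, not from the paper).

```python
import numpy as np

def sphere(x):        # F1: unimodal, f_min = 0 at x = 0
    return np.sum(x ** 2)

def rosenbrock(x):    # F2: unimodal, f_min = 0 at x = (1, ..., 1)
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1.0) ** 2)

def ackley(x):        # F3: multimodal, f_min = 0 at x = 0
    n = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n) + 20.0 + np.e)
```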


Table 2: Initial parameters of algorithms.

PGWO: population size 30; a linearly decreased from 2 to 0; performing probability of Powell's method p = 0.5; max iteration 500 (benchmark functions test), 200 (data sets test); stopping criterion: max iteration.
GWO: population size 30; a linearly decreased from 2 to 0; max iteration 500 (benchmark functions test), 200 (data sets test); stopping criterion: max iteration.
PSO: population size 30; C1 = C2 = 1.4962; W = 0.7298; max iteration 500 (benchmark functions test), 200 (data sets test); stopping criterion: max iteration.
ABC: population size 30; limit = 10; max iteration 500 (benchmark functions test), 200 (data sets test); stopping criterion: max iteration.
CS: population size 30; p_a = 0.25; max iteration 500 (benchmark functions test), 200 (data sets test); stopping criterion: max iteration.
GGSA: population size 30; $c_1' = -2t^3/T^3 + 2$; $c_2' = 2t^3/T^3$; $G_0 = 1$; $\alpha = 20$; max iteration 500 (benchmark functions test), 200 (data sets test); stopping criterion: max iteration.

Here T indicates the maximum number of iterations and t is the current iteration.

There are some parameters that should be initialized before running; Table 2 gives the initial values used for these algorithms.

The experimental results are presented in Table 3. The results are averaged over 20 independent runs; bold results mean that PGWO is better, while underlined results mean that the other algorithm is better. Best, Worst, Mean, and Std represent the optimal fitness value, worst fitness value, mean fitness value, and standard deviation, respectively. Note that the Matlab code of the GGSA algorithm is available at http://www.alimirjalili.com/Projects.html.

To improve the performance evaluation of evolutionary algorithms, statistical tests should be conducted [52]. In order to determine whether the results of PGWO differ from the best results of the other algorithms in a statistical sense, a nonparametric test known as Wilcoxon's rank-sum test [53, 54] is performed at the 5% significance level. The p values calculated by Wilcoxon's rank-sum test comparing PGWO and the other algorithms over all the benchmark functions are given in Table 4. According to [52], p values < 0.05 can be considered as sufficient evidence against the null hypothesis.
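For illustration, such p values can be computed with SciPy's rank-sum implementation; the two arrays below are synthetic stand-ins for the 20 per-run final fitness values of two algorithms, not the paper's data.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
pgwo_runs = rng.normal(1e-3, 1e-4, size=20)    # stand-in: 20 runs of PGWO
other_runs = rng.normal(5e-3, 1e-3, size=20)   # stand-in: 20 runs of a competitor

stat, p_value = ranksums(pgwo_runs, other_runs)
print(p_value < 0.05)  # True means the difference is significant at the 5% level
```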

4.3. Comparison of Experimental Results. As shown in Table 3, PGWO has the best result for F1. For F2, PGWO provided better results than the other algorithms across Best, Worst, and Mean, while the Std of GWO is better than that of PGWO. The unimodal benchmark functions have only one global solution without any local optima, so they are very suitable for examining exploitation. Hence, these results indicate that the proposed method provides greatly improved exploitation compared to the other algorithms.

However, it should be noted that multimodal benchmark functions have many local minima, and the final results are more important because these functions reflect the ability of an algorithm to escape from poor local optima and obtain the global optimum. We have tested the experiments on F3-F5; as seen from Table 3, PGWO provides the best results.

F6 and F7 are fixed-dimension multimodal benchmark functions with only a few local minima, and the dimensions of the functions are also small. In this case, it is hard to judge the property of the algorithms. The major difference compared with functions F3-F5 is that functions F6-F7 appear to be simpler than F3-F5 due to their low dimension and smaller number of local minima.


Table 3: Results comparison of different optimal algorithms for 20 independent runs (columns: GGSA, CS, ABC, PSO, GWO, PGWO).

F1 Best: 1.0133E-17, 3.7450E+00, 2.1820E+01, 1.7964E-06, 4.7447E-29, 2.5422E-54
F1 Worst: 1.0181E-16, 2.4776E+01, 1.8094E+02, 2.2907E-04, 2.2227E-26, 7.9888E-53
F1 Mean: 5.1858E-17, 8.8292E+00, 8.1908E+01, 6.5181E-05, 1.8269E-27, 1.6276E-53
F1 Std: 2.6953E-17, 4.7633E+00, 4.3834E+01, 6.9891E-05, 4.8884E-27, 1.7834E-53

F2 Best: 2.3974E+01, 3.0920E+02, 4.4296E+03, 2.4432E+01, 2.6182E+01, 4.9374E+00
F2 Worst: 3.4411E+02, 2.2219E+03, 4.5866E+04, 4.1217E+02, 2.8769E+01, 1.5466E+01
F2 Mean: 7.7481E+01, 5.5445E+02, 1.5193E+04, 1.2465E+02, 2.7507E+01, 1.1692E+01
F2 Std: 7.4751E+01, 4.1835E+02, 9.7239E+03, 1.4004E+02, 6.7488E-01, 2.0844E+00

F3 Best: 2.3591E-09, 4.0632E+00, 9.2131E+00, 6.3889E-03, 7.9048E-14, 4.4409E-15
F3 Worst: 9.8704E-09, 1.0603E+01, 1.6954E+01, 4.9535E+00, 1.5010E-13, 4.4409E-15
F3 Mean: 5.0646E-09, 6.0677E+00, 1.3090E+01, 2.0423E+00, 1.0498E-13, 4.4409E-15
F3 Std: 1.9095E-09, 1.5421E+00, 2.2882E+00, 1.1464E+00, 1.7485E-14, 0

F4 Best: 5.4733E-01, 2.0863E+00, 7.5357E-01, 2.4473E-03, 1.8626E-02, 1.6432E-32
F4 Worst: 4.6556E+00, 4.1099E+00, 5.5219E+00, 3.6426E+00, 1.2149E-01, 1.6432E-32
F4 Mean: 1.9728E+00, 3.2711E+00, 2.0544E+00, 8.5739E-01, 5.4300E-02, 1.6432E-32
F4 Std: 1.0488E+00, 5.9739E-01, 1.3358E+00, 1.0326E+00, 2.9934E-02, 2.8080E-48

F5 Best: 2.1024E-02, 2.7015E+00, 2.2507E+00, 6.3080E-06, 3.9444E-01, 1.3498E-32
F5 Worst: 2.4387E+01, 1.2516E+01, 1.1090E+02, 3.6461E+00, 8.5465E-01, 1.9625E-01
F5 Mean: 7.7881E+00, 6.2644E+00, 1.1124E+01, 3.7285E-01, 6.7323E-01, 4.5918E-02
F5 Std: 6.6733E+00, 2.5624E+00, 2.3792E+01, 9.3480E-01, 1.4330E-01, 6.0835E-02

F6 Best: 1.2358E-03, 3.1550E-04, 8.3768E-04, 3.0749E-04, 3.0750E-04, 3.0752E-04
F6 Worst: 1.2219E-02, 6.0087E-04, 3.4810E-03, 1.9361E-03, 2.0363E-02, 1.5943E-03
F6 Mean: 4.3558E-03, 4.3266E-04, 1.6421E-03, 9.3309E-04, 6.4067E-03, 4.5499E-04
F6 Std: 2.6046E-03, 8.5397E-05, 7.2141E-04, 4.1733E-04, 9.3762E-03, 3.4265E-04

F7 Best: 3.0000E+00, 3.0000E+00, 3.0000E+00, 3.0000E+00, 3.0000E+00, 3.0000E+00
F7 Worst: 3.0000E+00, 3.0000E+00, 3.0620E+00, 3.0000E+00, 8.4000E+01, 3.0000E+00
F7 Mean: 3.0000E+00, 3.0000E+00, 3.0136E+00, 3.0000E+00, 1.1100E+01, 3.0000E+00
F7 Std: 2.5794E-15, 1.5717E-15, 1.9505E-02, 1.2310E-15, 2.4931E+01, 9.2458E-07

Table 4: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, and GGSA over the benchmark functions (values with p ≥ 0.05 were underlined in the original).

PGWO vs.: GWO, PSO, ABC, CS, GGSA
F1: 6.80E-08, 6.80E-08, 6.80E-08, 6.80E-08, 6.80E-08
F2: 6.80E-08, 6.80E-08, 6.80E-08, 6.80E-08, 6.80E-08
F3: 7.66E-09, 8.01E-09, 8.01E-09, 8.01E-09, 8.01E-09
F4: 8.01E-09, 8.01E-09, 8.01E-09, 8.01E-09, 8.01E-09
F5: 5.97E-08, 0.295606, 5.97E-08, 5.97E-08, 4.67E-07
F6: 0.003057, 0.000375, 1.38E-06, 0.001782, 1.06E-07
F7: 0.0002, 4.93E-08, 6.80E-08, 6.19E-08, 6.49E-08

For F6, CS provided better results than the other algorithms across Mean, Std, and Worst. For F7, the majority of the algorithms can find the optimal solution, but PSO is more stable than the other algorithms in terms of Std.

The p values of Wilcoxon's rank-sum test in Table 4 show that the results of PGWO on F5 are not significantly better than those of PSO. However, PGWO achieves significant improvement on all remaining benchmark functions compared to the other algorithms. Therefore, this is evidence that the proposed algorithm has high performance in dealing with unimodal, multimodal, and fixed-dimension multimodal benchmark functions.

The convergence curves of the six algorithms are illustrated in Figure 1. As can be seen in these figures, PGWO has a much better convergence rate than the other algorithms on all benchmark functions.

According to this comprehensive comparative study and the discussions above, the proposed algorithm is able to significantly improve the performance of GWO and overcome its major shortcomings.

[Figure 1: The convergence curves of the average fitness value (average best-so-far versus iteration, log scale) over 20 independent runs on F1-F7 for GGSA, CS, ABC, PSO, GWO, and PGWO.]

For that reason, in the next section we apply the PGWO algorithm to solve the clustering problem.

5. Data Clustering

5.1. The Mathematical Description of Data Clustering. The clustering sample set is X = {X_i, i = 1, 2, ..., n}, where X_i is a p-dimensional vector. The clustering problem is to find a partition C = {C_1, C_2, ..., C_m} satisfying the following conditions [26]:

(1) $X = \bigcup_{i=1}^{m} C_i$;
(2) $C_i \neq \emptyset$, $i = 1, 2, \ldots, m$;
(3) $C_i \cap C_j = \emptyset$, $i, j = 1, 2, \ldots, m$, $i \neq j$.

5.2. The Clustering Criterion. Clustering is the process of grouping a set of data objects into a number of clusters or groups. The aim of clustering is to make the data within a cluster have high similarity while being very dissimilar to objects in other clusters. Dissimilarities and similarities are evaluated based on the attributes of the data sets using a distance metric. The most popular distance metric is the Euclidean distance [55]. Let $i = (x_{i1}, x_{i2}, \ldots, x_{ip})$ and $j = (x_{j1}, x_{j2}, \ldots, x_{jp})$ be two objects described by $p$ numeric attributes; the Euclidean distance between objects $i$ and $j$ is

$$d(i, j) = \sqrt{(x_{i1} - x_{j1})^2 + (x_{i2} - x_{j2})^2 + \cdots + (x_{ip} - x_{jp})^2}. \quad (5)$$

For given $N$ objects, the clustering problem is to minimize the sum of squared Euclidean distances between the objects and their allocated cluster centers, assigning each object to one of $k$ cluster centers [55]. Clustering thus aims at finding the cluster centers by minimizing the objective function, which is defined as follows [26]:

$$J_c = \sum_{k=1}^{m} \sum_{X_i \in C_k} d(X_i, Z_k), \quad (6)$$

where $m$ indicates the number of clusters, $C_k$ indicates the $k$th cluster, $Z_k$ indicates the center of the $k$th cluster, and $d(X_i, Z_k)$ indicates the distance of sample $X_i$ to the corresponding cluster center, namely $d(X_i, Z_k) = \| X_i - Z_k \|$.

5.3. PGWO Algorithm on Clustering. In clustering analysis, each element in the data set is a p-dimensional vector. Moreover, the actual position of a grey wolf represents the k cluster centers, so each grey wolf encodes a k × p dimensional vector; for each grey wolf i, its position is denoted as a vector X_i = (x_{i1}, x_{i2}, ..., x_{i,k×p}). In the initialization phase, we use the maximum and minimum values of each component of the data set (which is to be grouped) as the initialization search scope of the grey wolves, and the initial solutions are randomly generated in this range. We use (6) to calculate the fitness of the grey wolves' individuals; the main steps of the fitness function are shown in Algorithm 4.

5.4. Data Clustering Experimental Results and Discussion. In order to verify the performance of the proposed PGWO approach for clustering, we compare the results of the K-means, GGSA, CS, ABC, PSO, GWO, and PGWO clustering algorithms using nine different data sets that are selected from the UCI machine learning repository [56].

Artificial data set (N = 600, d = 2, and k = 4) is a two-featured problem with four unique classes.


(1) For each data vector x_i
(2)     Calculate the Euclidean distance by (5)
(3)     Assign x_i to the closest cluster center
(4)     Calculate the measure function by (6)
(5) End For
(6) Return the value of the fitness function

Algorithm 4: Main steps of the fitness function.
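A compact NumPy version of this fitness evaluation, including the decoding of a wolf's position into k cluster centers of dimension p as described in Section 5.3, might look like the following sketch (names are illustrative).

```python
import numpy as np

def clustering_fitness(wolf, data, k):
    """Sum of Euclidean distances from each point to its nearest center (eq. (6))."""
    n_points, p = data.shape
    centers = wolf.reshape(k, p)               # decode the k*p vector into k centers
    # (n_points, k) matrix of distances ||x_i - z_j|| computed by broadcasting
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    return dists.min(axis=1).sum()             # each point joins its closest center
```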

A total of 600 patterns were drawn from four independent bivariate normal distributions, where the classes were distributed according to

$$N_2 \left( \mu = \begin{pmatrix} m_i \\ 0 \end{pmatrix}, \ \Sigma = \begin{bmatrix} 0.5 & 0.05 \\ 0.05 & 0.5 \end{bmatrix} \right), \quad (7)$$

where $i = 1, 2, 3, 4$, $m_1 = -3$, $m_2 = 0$, $m_3 = 3$, and $m_4 = 6$; $\mu$ and $\Sigma$ are the mean vector and covariance matrix, respectively [45, 57].
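Under these distribution parameters, the artificial data set can be regenerated in a few lines of NumPy (the random seed and the even 150-per-class split are our assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)
cov = np.array([[0.5, 0.05],
                [0.05, 0.5]])                  # covariance matrix from eq. (7)
means = [-3.0, 0.0, 3.0, 6.0]                  # m_1, ..., m_4
art = np.vstack([rng.multivariate_normal([m, 0.0], cov, size=150) for m in means])
labels = np.repeat(np.arange(4), 150)          # N = 600 points, d = 2, k = 4
```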

Iris data (N = 150, d = 4, and K = 3) is a data set with 150 random samples of flowers from the Iris species setosa, versicolor, and virginica, collected by Anderson [58]. From each species there are 50 observations of sepal length, sepal width, petal length, and petal width in cm. This data set was used by Fisher [59] in his initiation of the linear-discriminant-function technique [28, 56, 57].

Wisconsin breast cancer (N = 683, d = 9, and K = 2) consists of 683 objects characterized by nine features: clump thickness, cell size uniformity, cell shape uniformity, marginal adhesion, single epithelial cell size, bare nuclei, bland chromatin, normal nucleoli, and mitoses. There are two categories in the data: malignant (444 objects) and benign (239 objects) [28, 56, 57].

Contraceptive method choice (N = 1473, d = 10, and K = 3), CMC for short, is a data set that is a subset of the 1987 National Indonesia Contraceptive Prevalence Survey. The samples are married women who either were not pregnant or did not know if they were at the time of interview. The problem is to predict the current contraceptive method choice (no use, long-term methods, or short-term methods) of a woman based on her demographic and socioeconomic characteristics [28, 56, 57].

Seeds data (N = 210, d = 7, and K = 3) is a data set that consists of 210 patterns belonging to three different varieties of wheat: Kama, Rosa, and Canadian. From each species there are 70 observations of area A, perimeter P, compactness C (C = 4πA/P²), length of kernel, width of kernel, asymmetry coefficient, and length of kernel groove [56].

Statlog (Heart) data (N = 270, d = 13, and K = 2) is a heart disease database similar to a database already present in the repository (heart disease databases), but in a slightly different form [56].

Wine data (N = 178, d = 13, and K = 3) is also taken from the MCI laboratory. These data are the results of a chemical analysis of wines grown in the same region in Italy but derived from three different cultivars. The analysis determined the quantities of 13 constituents found in each of the three types of wines. There are 178 instances with 13 numeric attributes in the wine data set. All attributes are continuous, and there are no missing attribute values [28, 56, 57].

Balance scale data (N = 625, d = 4, and K = 3) is a data set that was generated to model psychological experimental results. Each example is classified as having the balance scale tip to the right, tip to the left, or be balanced. The attributes are the left weight, the left distance, the right weight, and the right distance. The correct way to find the class is the greater of (left-distance × left-weight) and (right-distance × right-weight); if they are equal, it is balanced [56].

Haberman's Survival (N = 306, d = 3, and K = 2) is a data set that contains cases from a study that was conducted between 1958 and 1970 at the University of Chicago's Billings Hospital on the survival of patients who had undergone surgery for breast cancer. It records two survival statuses of patients, along with the age of the patient at the time of operation, the patient's year of operation, and the number of positive axillary nodes detected [56].

The comparison of intracluster distances for the seven clustering algorithms over 20 independent runs is shown in Table 5. Table 6 reports the p values produced by Wilcoxon's rank-sum test comparing PGWO and the other algorithms over all the data sets.

For the Art data set, the optimum value, the worst value, and the average value of PGWO and GGSA are all 5.1390E+02, while the standard deviation of GGSA is better than that of PGWO. PSO only attains the optimum solution 5.1390E+02.

For the Iris, Cancer, and Seeds data sets, PGWO, PSO, and GGSA provide the optimum value in comparison to those obtained by the other methods. However, the worst value, the average value, and the standard deviation of PGWO are superior to those of the other methods.

For the Heart data set, PGWO, PSO, and CS all find the optimum solution 1.0623E+04, which means they all can find the global solution. However, PGWO is slightly better in terms of the worst value, the average value, and the standard deviation.

For the CMC and Wine data sets, Table 5 shows that the best, worst, average, and standard deviation values of the fitness function for the PGWO algorithm are much smaller than those of the other six methods. The PGWO clustering algorithm is capable of providing the same partition of the data points in all runs.

For the balance scale data set, the optimum values of the fitness function for PGWO, PSO, and GGSA are 1.4238E+03, which means they all can find the global solution. The optimum values of the fitness function for GWO and K-means are 1.4239E+03, a result close to PGWO, PSO, and GGSA. However, the standard deviation values of GWO, K-means, PGWO, PSO, GGSA, and CS are 6.8756E-01, 3.6565E+00, 8.6619E-01, 9.7396E-01, 1.6015E+00, and 7.3239E-01, respectively; from the standard deviation we can see that GWO is more stable than the other methods.

For Haberman's Survival data set, the optimum value, the worst value, and the average value of the fitness function for CS and PSO are almost the same, and the results of the CS and PSO algorithms are better than those of the other methods.

The p values in Table 6 show that the results of PGWO are significantly better for the Art, Cancer, CMC, Seeds, Heart, and Wine data sets.


Table 5: Comparison of intracluster distances for the seven clustering algorithms over 20 independent runs (columns: K-means, GGSA, CS, ABC, PSO, GWO, PGWO).

Art Best: 5.1454E+02, 5.1390E+02, 5.1391E+02, 5.1752E+02, 5.1390E+02, 5.1455E+02, 5.1390E+02
Art Worst: 9.0474E+02, 5.1390E+02, 5.3213E+02, 5.5298E+02, 8.9248E+02, 5.2058E+02, 5.1390E+02
Art Mean: 6.0974E+02, 5.1390E+02, 5.1597E+02, 5.2620E+02, 5.3283E+02, 5.1687E+02, 5.1390E+02
Art Std: 1.6903E+02, 1.5430E-13, 4.8536E+00, 9.7988E+00, 8.4652E+01, 1.8768E+00, 6.1243E-08

Iris Best: 9.7190E+01, 9.6655E+01, 9.6659E+01, 9.7616E+01, 9.6655E+01, 9.6835E+01, 9.6655E+01
Iris Worst: 1.2318E+02, 1.1839E+02, 9.7684E+01, 1.0401E+02, 1.2767E+02, 1.2108E+02, 9.6655E+01
Iris Mean: 1.0226E+02, 9.8865E+01, 9.6862E+01, 9.9710E+01, 1.0596E+02, 9.9903E+01, 9.6655E+01
Iris Std: 1.0290E+01, 5.5454E+00, 3.1103E-01, 1.8460E+00, 1.4578E+01, 6.7482E+00, 8.5869E-10

Cancer Best: 2.9763E+03, 2.9644E+03, 2.9645E+03, 2.9887E+03, 2.9644E+03, 2.9645E+03, 2.9644E+03
Cancer Worst: 2.9884E+03, 3.0611E+03, 2.9677E+03, 3.2650E+03, 4.7288E+03, 2.9650E+03, 2.9644E+03
Cancer Mean: 2.9830E+03, 2.9715E+03, 2.9651E+03, 3.0760E+03, 3.4055E+03, 2.9647E+03, 2.9644E+03
Cancer Std: 4.8278E+00, 2.1284E+01, 7.3222E-01, 6.5283E+01, 7.8385E+02, 1.1452E-01, 1.6118E-08

CMC Best: 5.7016E+03, 5.6938E+03, 5.7054E+03, 5.8219E+03, 5.6942E+03, 5.8236E+03, 5.6937E+03
CMC Worst: 5.7053E+03, 5.7111E+03, 5.7762E+03, 6.3743E+03, 7.1580E+03, 6.0911E+03, 5.6937E+03
CMC Mean: 5.7040E+03, 5.6971E+03, 5.7263E+03, 6.0868E+03, 5.7690E+03, 5.9239E+03, 5.6937E+03
CMC Std: 1.1022E+00, 3.7872E+00, 2.1119E+01, 1.7292E+02, 3.2694E+02, 9.0635E+01, 9.8777E-04

Seeds Best: 3.1322E+02, 3.1180E+02, 3.1187E+02, 3.1573E+02, 3.1180E+02, 3.1323E+02, 3.1180E+02
Seeds Worst: 3.1373E+02, 3.1354E+02, 3.1746E+02, 3.4997E+02, 4.2035E+02, 3.1782E+02, 3.1180E+02
Seeds Mean: 3.1337E+02, 3.1189E+02, 3.1347E+02, 3.3161E+02, 3.2813E+02, 3.1475E+02, 3.1180E+02
Seeds Std: 2.3686E-01, 3.8862E-01, 1.6383E+00, 1.0943E+01, 3.9697E+01, 1.5716E+00, 1.7815E-09

Heart Best: 1.0682E+04, 1.0749E+04, 1.0623E+04, 1.0683E+04, 1.0623E+04, 1.0637E+04, 1.0623E+04
Heart Worst: 1.0701E+04, 1.2684E+04, 1.0625E+04, 1.1622E+04, 1.0626E+04, 1.0683E+04, 1.0624E+04
Heart Mean: 1.0693E+04, 1.1542E+04, 1.0624E+04, 1.0920E+04, 1.0624E+04, 1.0657E+04, 1.0623E+04
Heart Std: 8.1672E+00, 5.7044E+02, 4.6239E-01, 2.5921E+02, 7.6866E-01, 1.3817E+01, 2.3235E-01

Wine Best: 1.6385E+04, 1.6493E+04, 1.6296E+04, 1.6566E+04, 1.6294E+04, 1.6316E+04, 1.6292E+04
Wine Worst: 1.8437E+04, 2.0245E+04, 1.6311E+04, 1.7668E+04, 1.6312E+04, 1.6371E+04, 1.6294E+04
Wine Mean: 1.6974E+04, 1.7999E+04, 1.6301E+04, 1.7010E+04, 1.6297E+04, 1.6345E+04, 1.6293E+04
Wine Std: 8.6757E+02, 1.1562E+03, 4.5677E+00, 3.6824E+02, 4.0149E+00, 1.4836E+01, 5.2338E-01

Balance scale Best: 1.4239E+03, 1.4238E+03, 1.4256E+03, 1.4265E+03, 1.4238E+03, 1.4239E+03, 1.4238E+03
Balance scale Worst: 1.4337E+03, 1.4291E+03, 1.4285E+03, 1.4310E+03, 1.4262E+03, 1.4260E+03, 1.4257E+03
Balance scale Mean: 1.4275E+03, 1.4259E+03, 1.4268E+03, 1.4282E+03, 1.4248E+03, 1.4243E+03, 1.4245E+03
Balance scale Std: 3.6565E+00, 1.6015E+00, 7.3239E-01, 1.1831E+00, 9.7396E-01, 6.8756E-01, 8.6619E-01

Haberman-Survival Best: 2.6251E+03, 2.5670E+03, 2.5670E+03, 2.5671E+03, 2.5670E+03, 2.5673E+03, 2.5670E+03
Haberman-Survival Worst: 3.1966E+03, 2.6226E+03, 2.5670E+03, 2.5709E+03, 2.5678E+03, 2.6686E+03, 2.5678E+03
Haberman-Survival Mean: 2.6554E+03, 2.5702E+03, 2.5670E+03, 2.5679E+03, 2.5671E+03, 2.5898E+03, 2.5673E+03
Haberman-Survival Std: 1.2740E+02, 1.2354E+01, 1.7590E-06, 8.3107E-01, 3.0624E-01, 2.3346E+01, 4.0711E-01

Table 6: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, GGSA, and K-means over the data sets (values with p ≥ 0.05 were underlined in the original).

PGWO vs.: GWO, PSO, ABC, CS, GGSA, K-means
Art: 6.80E-08, 1.60E-05, 6.80E-08, 6.80E-08, 6.11E-08, 4.49E-08
Iris: 6.75E-08, 1.19E-06, 6.75E-08, 6.75E-08, 0.107346, 5.17E-08
Cancer: 6.76E-08, 6.76E-08, 6.76E-08, 6.76E-08, 1.2E-06, 5.98E-08
CMC: 6.69E-08, 6.69E-08, 6.69E-08, 6.69E-08, 6.69E-08, 3.39E-08
Seeds: 6.80E-08, 6.80E-08, 6.80E-08, 6.80E-08, 1.23E-03, 4.81E-08
Heart: 6.61E-08, 2.02E-06, 6.61E-08, 5.87E-07, 6.61E-08, 6.07E-08
Wine: 6.80E-08, 1.23E-07, 6.80E-08, 6.80E-08, 6.80E-08, 6.63E-08
Balance scale: 0.067868, 0.007712, 6.8E-08, 1.06E-07, 0.003966, 6.01E-07
Haberman-Survival: 6.90E-07, 0.524948, 1.78E-03, 0.860418, 0.090604, 6.26E-08

[Figure 2: Convergence curves of clustering (average best-so-far versus iteration) on the Art, Iris, Cancer, CMC, Seeds, Heart, Wine, balance scale, and Haberman's Survival data sets over 20 independent runs for GGSA, CS, ABC, PSO, GWO, and PGWO.]

From Table 6 we conclude that for the Iris data set PGWO performs superior to the other algorithms except GGSA. Since the p value of PGWO versus GWO for the balance scale data set is greater than 0.05, there is no statistical difference between the two. For Haberman's Survival data set, comparing PGWO with the other algorithms, we can conclude that PGWO performs significantly better in three out of the six pairwise comparisons. So it can be claimed that PGWO provides better results than the other algorithms across the majority of the data sets.

Figure 2 shows the convergence curves of clustering on the data sets over 20 independent runs. As can be seen from the figure, the convergence speed of PGWO is the fastest.

Figure 3 shows ANOVA tests of clustering on the data sets over 20 independent runs. As seen from Figure 3, PGWO is very stable for the majority of the data sets. For the Art, Seeds, and CMC data sets, PGWO, PSO, and GGSA obtain relatively stable optimal values. For the Heart and Wine data sets, the stability of PGWO, PSO, and CS is outstanding. For the Cancer data set, most of the algorithms obtain the stable optimal value, except for the ABC and PSO algorithms. For the Iris data set, we can clearly see that PGWO is better in terms of stability. For the balance scale data set, GWO obtains the relatively stable optimal value. For Haberman's Survival data set, the stability of CS and PSO is the best, but PGWO and GGSA follow them closely.

Clustering results of the Art, Iris, and Survival data sets by the PGWO algorithm are presented in Figure 4, which visualizes them clearly. It can be seen from Figure 4 that the PGWO algorithm possesses a superior effect on the Art, Iris, and Survival data sets.

In summary, the results show that the proposed method successfully outperforms the other algorithms across the majority of benchmark functions. Furthermore, the test results on clustering problems show that PGWO is able to provide very competitive results. Therefore, it appears from this comparative study that the proposed method has merit in the field of evolutionary algorithms and optimization.

[Figure 3: ANOVA tests (box plots of the fitness values of GGSA, CS, ABC, PSO, GWO, and PGWO) of clustering on the data sets over 20 independent runs.]


6. Conclusion and Future Works

In order to apply the grey wolf optimizer to solve complex optimization problems efficiently, this paper proposed a novel grey wolf optimizer based on Powell local optimization method, namely PGWO. In PGWO, at first the original GWO algorithm is applied to shrink the search region to a more promising area; thereafter, Powell's method is implemented as a critical complement to perform the local search and exploit the limited area intensively to get better solutions. PGWO makes an attempt at taking the merits of GWO and Powell's method in order to avoid all grey wolves getting trapped in inferior local optimal regions. PGWO enables the grey wolves to have more diverse exemplars to learn from, as the grey wolves are updated each generation and also form new grey wolves to search in a larger search space. With both techniques combined, PGWO can balance exploration and exploitation and effectively solve complex problems. The experimental results show the effectiveness of Powell's method in terms of solution quality and convergence speed. The proposed algorithm was benchmarked on seven well-known test functions, and the results were compared with GGSA, CS, ABC, PSO, and GWO; they show that the PGWO algorithm is capable of providing very competitive results compared to these famous metaheuristics. Because of the superior performance of the PGWO algorithm, we used it to solve clustering problems. The algorithm has been tested on an artificial data set and eight real data sets. To evaluate the performance of the PGWO algorithm on clustering problems, we compared it with the original GWO, GGSA, CS, ABC, PSO, and K-means. The results prove that the PGWO algorithm is able to significantly outperform the others on the majority of the data sets in terms

[Figure 4: The original data distributions of the Art, Iris, and Haberman's Survival data sets (left) and the clustering results by the PGWO algorithm (right).]

of average values and standard deviations of the fitness function. Moreover, the experimental results demonstrate that the proposed PGWO algorithm can be considered a feasible and efficient method to solve optimization problems.

Our future work will focus on two issues. On one hand, we will apply the proposed PGWO algorithm to higher dimensional problems and larger numbers of patterns. On the other hand, the PGWO clustering algorithm will also be extended to dynamically determine the optimal number of clusters.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grants nos. 61165015, 61463007, and 61563008 and the Project of Guangxi University for Nationalities Science Foundation under Grant no. 2012MDZD037.

References

[1] J. H. Holland, "Genetic algorithms," Scientific American, vol. 267, no. 1, pp. 66-72, 1992.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942-1948, IEEE, Perth, Australia, November-December 1995.
[3] K. Price and R. Storn, "Differential evolution," Dr. Dobb's Journal, vol. 22, pp. 18-20, 1997.
[4] K. V. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, New York, NY, USA, 2005.
[5] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.
[6] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67-82, 1997.
[7] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.
[8] S. Mirjalili, S. M. Mirjalili, and A. Hatamlou, "Multi-verse optimizer: a nature-inspired algorithm for global optimization," Neural Computing and Applications, 2015.
[9] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80-98, 2015.
[10] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459-471, 2007.
[11] X.-S. Yang, "Firefly algorithm, Levy flights and global optimization," in Research and Development in Intelligent Systems XXVI, pp. 209-218, Springer, London, UK, 2010.
[12] X.-S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210-214, IEEE, Coimbatore, India, December 2009.
[13] R. Rajabioun, "Cuckoo optimization algorithm," Applied Soft Computing, vol. 11, no. 8, pp. 5508-5518, 2011.
[14] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, "GSA: a gravitational search algorithm," Information Sciences, vol. 179, no. 13, pp. 2232-2248, 2009.
[15] S. Mirjalili and A. Lewis, "Adaptive gbest-guided gravitational search algorithm," Neural Computing and Applications, vol. 25, no. 7, pp. 1569-1584, 2014.
[16] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286-292, 2015.
[17] X. H. Song, L. Tang, S. T. Zhao et al., "Grey Wolf Optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147-157, 2015.
[18] G. M. Komaki and V. Kayvanfar, "Grey Wolf Optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109-120, 2015.
[19] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257-1263, 2015.
[20] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using Grey Wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411-429, 2015.
[21] O. Kramer, "Iterated local search with Powell's method: a memetic algorithm for continuous global optimization," Memetic Computing, vol. 2, no. 1, pp. 69-83, 2010.
[22] M. J. D. Powell, "Restart procedures for the conjugate gradient method," Mathematical Programming, vol. 12, no. 1, pp. 241-254, 1977.
[23] I. E. Evangelou, D. G. Hadjimitsis, A. A. Lazakidou, and C. Clayton, "Data mining and knowledge discovery in complex image data using artificial neural networks," in Proceedings of the Workshop on Complex Reasoning on Geographical Data, Paphos, Cyprus, 2001.
[24] M. S. Kamel and S. Z. Selim, "New algorithms for solving the fuzzy clustering problem," Pattern Recognition, vol. 27, no. 3, pp. 421-428, 1994.
[25] M. Omran, A. Salman, and A. P. Engelbrecht, "Image classification using particle swarm optimization," in Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning, Singapore, 2002.
[26] X. J. Lei, Swarm Intelligent Optimization Algorithms and Their Applications, Science Press, 2012.
[27] A. K. Jain, "Data clustering: 50 years beyond K-means," Pattern Recognition Letters, vol. 31, no. 8, pp. 651-666, 2010.
[28] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.
[29] B. Zhang, M. Hsu, and U. Dayal, "K-harmonic means - a data clustering algorithm," Hewlett-Packard Labs Technical Report HPL, 1999.
[30] S. Z. Selim and K. Alsultan, "A simulated annealing algorithm for the clustering problem," Pattern Recognition, vol. 24, no. 10, pp. 1003-1008, 1991.
[31] K. S. Al-Sultan, "A Tabu search approach to the clustering problem," Pattern Recognition, vol. 28, no. 9, pp. 1443-1451, 1995.
[32] C. S. Sung and H. W. Jin, "A tabu-search-based heuristic for clustering," Pattern Recognition, vol. 33, no. 5, pp. 849-858, 2000.
[33] M. C. Cowgill, R. J. Harvey, and L. T. Watson, "A genetic algorithm approach to cluster analysis," Computers and Mathematics with Applications, vol. 37, no. 7, pp. 99-108, 1999.
[34] K. Krishna and M. N. Murty, "Genetic K-means algorithm," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 29, no. 3, pp. 433-439, 1999.
[35] U. Maulik and S. Bandyopadhyay, "Genetic algorithm-based clustering technique," Pattern Recognition, vol. 33, no. 9, pp. 1455-1465, 2000.
[36] C. A. Murthy and N. Chowdhury, "In search of optimal clusters using genetic algorithms," Pattern Recognition Letters, vol. 17, no. 8, pp. 825-832, 1996.
[37] M. Fathian, B. Amiri, and A. Maroosi, "Application of honey-bee mating optimization algorithm on clustering," Applied Mathematics and Computation, vol. 190, no. 2, pp. 1502-1513, 2007.
[38] D. W. Van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215-220, IEEE, Canberra, Australia, December 2003.
[39] A. Hatamlou, S. Abdullah, and M. Hatamlou, "Data clustering using big bang-big crunch algorithm," in Innovative Computing Technology, vol. 241 of Communications in Computer and Information Science, pp. 383-388, Springer, 2011.
[40] D. Karaboga and C. Ozturk, "A novel clustering approach: artificial Bee Colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652-657, 2011.
[41] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "Application of gravitational search algorithm on data clustering," in Rough Sets and Knowledge Technology, vol. 6954 of Lecture Notes in Computer Science, pp. 337-346, Springer, Berlin, Germany, 2011.
[42] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "A combined approach for clustering based on K-means and gravitational search algorithms," Swarm and Evolutionary Computation, vol. 6, pp. 47-52, 2012.
[43] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187-195, 2004.
[44] Y. Kao and K. Cheng, "An ACO-based clustering algorithm," in Ant Colony Optimization and Swarm Intelligence, vol. 4150 of Lecture Notes in Computer Science, pp. 340-347, Springer, Berlin, Germany, 2006.
[45] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754-1762, 2008.
[46] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512-519, 2009.
[47] W.-F. Gao, S.-Y. Liu, and L.-L. Huang, "A novel artificial bee colony algorithm with Powell's method," Applied Soft Computing, vol. 13, no. 9, pp. 3763-3775, 2013.
[48] H. R. Lourenco, O. Martin, and T. Stutzle, "A beginner's introduction to iterated local search," in Proceedings of the 4th Metaheuristics International Conference (MIC '01), vol. 2, pp. 1-6, Porto, Portugal, 2001.
[49] T. G. Stutzle, Local Search Algorithms for Combinatorial Problems: Analysis, Improvements, and New Applications, vol. 220 of DISKI Dissertations on Artificial Intelligence, Infix Publishers, Sankt Augustin, Germany, 1999.
[50] X. S. Yang, Ed., Test Problems in Optimization: An Introduction with Metaheuristic Applications, Wiley, London, UK, 2010.
[51] M. Molga and C. Smutnicki, "Test functions for optimization needs," 2005, http://www.zsd.ict.pwr.wroc.pl/files/docs/functions.pdf.
[52] J. Derrac, S. Garcia, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3-18, 2011.
[53] S. Garcia, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 special session on real parameter optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617-644, 2009.
[54] F. Wilcoxon, "Individual comparisons by ranking methods," Biometrics Bulletin, vol. 1, no. 6, pp. 80-83, 1945.
[55] J. W. Han, M. Kamber, and J. Pei, Data Mining: Concepts and Techniques, Morgan Kaufmann Publishers, 2011.
[56] C. Blake and C. J. Merz, "UCI Repository of Machine Learning Databases," 1998.
[57] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183-197, 2010.
[58] E. Anderson, "The irises of the Gaspe peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2-5, 1935.
[59] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, no. 2, pp. 179-188, 1936.

[35] U Maulik and S Bandyopadhyay ldquoGenetic algorithm-basedclustering techniquerdquo Pattern Recognition vol 33 no 9 pp1455ndash1465 2000

[36] C A Murthy and N Chowdhury ldquoIn search of optimal clustersusing genetic algorithmsrdquoPatternRecognition Letters vol 17 no8 pp 825ndash832 1996

[37] M Fathian B Amiri and A Maroosi ldquoApplication of honey-bee mating optimization algorithm on clusteringrdquo AppliedMathematics and Computation vol 190 no 2 pp 1502ndash15132007

[38] D W Van der Merwe and A P Engelbrecht ldquoData cluster-ing using particle swarm optimizationrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo03) vol 1 pp 215ndash220 IEEE Canberra Australia December 2003

[39] A Hatamlou S Abdullah and M Hatamlou ldquoData clusteringusing big bang-big crunch algorithmrdquo in Innovative ComputingTechnology vol 241 of Communications in Computer andInformation Science pp 383ndash388 Springer 2011

[40] D Karaboga and C Ozturk ldquoA novel clustering approachartificial Bee Colony (ABC) algorithmrdquoApplied Soft ComputingJournal vol 11 no 1 pp 652ndash657 2011

[41] A Hatamlou S Abdullah and H Nezamabadi-Pour ldquoAppli-cation of gravitational search algorithm on data clusteringrdquo inRough Sets andKnowledge Technology vol 6954 of LectureNotesin Computer Science pp 337ndash346 Springer Berlin Germany2011

[42] A Hatamlou S Abdullah and H Nezamabadi-Pour ldquoA com-bined approach for clustering based on K-means and gravita-tional search algorithmsrdquo Swarm and Evolutionary Computa-tion vol 6 pp 47ndash52 2012

[43] P S Shelokar V K Jayaraman and B D Kulkarni ldquoAn antcolony approach for clusteringrdquo Analytica Chimica Acta vol509 no 2 pp 187ndash195 2004

[44] Y Kao and K Cheng ldquoAn ACO-based clustering algorithmrdquoin Ant Colony Optimization and Swarm Intelligence vol 4150of Lecture Notes in Computer Science pp 340ndash347 SpringerBerlin Germany 2006

[45] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[46] T Niknam B Amiri J Olamaei and A Arefi ldquoAn efficienthybrid evolutionary optimization algorithm based on PSO andSA for clusteringrdquo Journal of Zhejiang University Science A vol10 no 4 pp 512ndash519 2009

[47] W-F Gao S-Y Liu and L-L Huang ldquoA novel artificial beecolony algorithm with Powellrsquos methodrdquo Applied Soft Comput-ing vol 13 no 9 pp 3763ndash3775 2013

[48] H R Lourenco O Martin and T Stutzle ldquoA beginnerrsquosintroduction to iterated local searchrdquo in Proceedings of the 4thMetaheuristics International Conference (MIC rsquo01) vol 2 pp 1ndash6 Porto Portugal 2001

[49] T G Stutzle Local Search Algorithms for Combinatorial Prob-lems Analysis Improvements and New Applications vol 220 ofDISKI Dissertations on Artificial Intelligence Infix PublishersSankt Augustin Germany 1999

[50] X S Yang Ed Test Problems in Optimization An Introductionwith Metaheuristic Applications Wiley London UK 2010

[51] M Molga and C Smutnicki ldquoTest functions for optimization-needsrdquo 2005 httpwwwzsdictpwrwrocplfilesdocsfunc-tionspdf

[52] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm intelli-gence algorithmsrdquo Swarm and Evolutionary Computation vol1 no 1 pp 3ndash18 2011

[53] S Garcıa D Molina M Lozano and F Herrera ldquoA study onthe use of non-parametric tests for analyzing the evolutionaryalgorithmsrsquo behaviour a case study on the CECrsquo2005 specialsession on real parameter optimizationrdquo Journal of Heuristicsvol 15 no 6 pp 617ndash644 2009

[54] F Wilcoxon ldquoIndividual comparisons by ranking methodsrdquoBiometrics Bulletin vol 1 no 6 pp 80ndash83 1945

[55] J W Han M Kamber and J Pei Data Mining Concepts andTechniques Morgan Kaufmann Publishers 2011

[56] C Blake and C J Merz ldquoUCI Repository of Machine LearningDatabasesrdquo 1998

[57] T Niknam and B Amiri ldquoAn efficient hybrid approach basedon PSO ACO and k-means for cluster analysisrdquo Applied SoftComputing Journal vol 10 no 1 pp 183ndash197 2010

[58] E Anderson ldquoThe irises of the Gaspe peninsulardquo Bulletin of theAmerican Iris Society vol 59 pp 2ndash5 1935

[59] R A Fisher ldquoThe use of multiple measurements in taxonomicproblemsrdquo Annals of Eugenics vol 7 no 2 pp 179ndash188 1936

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 3: Research Article Grey Wolf Optimizer Based on Powell Local ...downloads.hindawi.com/journals/ddns/2015/481360.pdf · Research Article Grey Wolf Optimizer Based on Powell Local Optimization


Initialize the grey wolf population X_i (i = 1, 2, ..., n) and parameters
Calculate the fitness of the population
Find the first three agents Xα, Xβ, Xδ
While (t < Max number of iterations)
    Update the position of the current search agent by (4)
    Calculate the fitness of the population
    Update Xα, Xβ, Xδ
    t = t + 1
End while
Return Xα

Algorithm 1: GWO algorithm.

Initialize the starting point X_1, the independent vectors d_i = e_i (i = 1, ..., D), and the tolerance for the stop criterion ε; set f(1) = f(X_1), X_c = X_1, k = 1
While (stopping criterion is not met, namely |Δf| > ε) do
    For i = 1 to D do
        If (k ≥ 2) then
            d_i = d_{i+1}
        End If
        X_{i+1} = X_i + λ_i d_i, where λ_i is determined by minimizing f(X_{i+1})
    End For
    d_{D+1} = \sum_{i=1}^{D} λ_i d_i = X_{D+1} − X_1; X_c(k + 1) = X_{D+1} + λ_k d_{D+1}, f(k + 1) = f(X_c(k + 1))
    k = k + 1, X_1 = X_c(k), Δf = f(k) − f(k − 1)
End While

Algorithm 2: Powell's method.

In these formulas, it may also be observed that there are two vectors, A and C, obliging the GWO algorithm to explore and exploit the search space. With decreasing A, half of the iterations are devoted to exploration (|A| ≥ 1) and the other half are dedicated to exploitation (|A| < 1). The range of C is 0 ≤ C ≤ 2; the vector C also improves exploration when C > 1, and the exploitation is emphasized when C < 1. Note here that A is decreased linearly over the course of the iterations. In contrast, C is generated randomly, whose aim is to emphasize exploration/exploitation at any stage, avoiding local optima. The main steps of the grey wolf optimizer are given in Algorithm 1.
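To make this mechanism concrete, the following minimal Python sketch implements the standard GWO position update of Mirjalili et al. [7] (the update referred to as (4) above): each wolf moves to the mean of three candidate positions steered by the α, β, and δ leaders, with A = 2a·r1 − a and C = 2·r2 redrawn for every leader. The function name and array layout are illustrative assumptions, not the authors' code.

import numpy as np

def gwo_update(positions, x_alpha, x_beta, x_delta, a):
    # One generation of the standard GWO position update (cf. (4))
    n, dim = positions.shape
    new_positions = np.empty_like(positions)
    for i in range(n):
        candidates = []
        for leader in (x_alpha, x_beta, x_delta):
            r1, r2 = np.random.rand(dim), np.random.rand(dim)
            A = 2 * a * r1 - a   # |A| >= 1 drives exploration, |A| < 1 exploitation
            C = 2 * r2           # random in [0, 2], redrawn at every step
            D = np.abs(C * leader - positions[i])
            candidates.append(leader - A * D)
        new_positions[i] = np.mean(candidates, axis=0)  # average of X1, X2, X3
    return new_positions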

3. Grey Wolf Optimizer Based on Powell Local Optimization Method (PGWO)

3.1. Local Search Based on Powell's Method. Powell's method, strictly Powell's conjugate direction method, is an algorithm proposed by Powell [22] for finding a local minimum of a function. The function need not be differentiable, and no derivatives are taken. The method successively performs a bidirectional search along each search vector to pursue the minimum of the function. The new position is represented as a linear combination of the search vectors, and the new displacement vector is added to the search vector list as a new search vector. Correspondingly, the most successful vector, which contributed most to the new direction, is deleted from the search vector list. The algorithm iterates until no significant improvement is made. The detailed steps of Powell's method are presented in Algorithm 2 [47].
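Because Powell's method needs only function evaluations, an off-the-shelf implementation can stand in for Algorithm 2 when experimenting. Below is a minimal sketch using SciPy's scipy.optimize.minimize with method="Powell"; the sphere objective is a placeholder, not part of the paper.

import numpy as np
from scipy.optimize import minimize

def sphere(x):
    # Placeholder objective; in PGWO this is the fitness function being refined
    return float(np.sum(x ** 2))

x0 = np.array([3.0, -2.0, 5.0])   # starting point, e.g., the current X_alpha
result = minimize(sphere, x0, method="Powell", options={"xtol": 1e-6})
print(result.x, result.fun)       # refined solution and its objective value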

3.2. The Proposed Algorithm. Over the last years, many research results have been published in the field of evolutionary algorithms [7], and they show that hybridizations of metaheuristics and local search techniques are exceptionally successful. Successful hybridizations have been proposed especially in combinatorial and discrete solution spaces [48, 49]. Interestingly, for real-valued solution spaces few results have been introduced yet, an astonishing fact, as many direct search methods are fairly fast optimizers [21]. Therefore, in order to overcome the shortcomings of the GWO algorithm, including slow convergence speed, easily falling into local optima, low computation accuracy, and low success rate of convergence [19, 20], a GWO algorithm based on Powell local optimization method is proposed. It adopts the powerful local optimization ability of Powell's method and embeds it into GWO as a local search operator. In this way, the proposed method has the potential to provide superior results compared to other state-of-the-art evolutionary algorithms. The general steps of PGWO are presented in Algorithm 3, among which p indicates the performing probability of Powell's method.


Initialize the grey wolf population X_i (i = 1, 2, ..., n) and parameters
Calculate the fitness of the population
Find the first three agents Xα, Xβ, Xδ
While (t < Max number of iterations)
    Update the position of the current search agent by (4)
    If rand > p, choose the current best solution Xα as a starting point and generate a new solution X′α by Powell's method as illustrated in Algorithm 2. If f(X′α) < f(Xα), replace Xα with X′α; otherwise, go to Step 6
    Calculate the fitness of the population
    Update Xα, Xβ, Xδ
    t = t + 1
End While
Return Xα

Algorithm 3: PGWO algorithm.
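Putting the two components together, a compact sketch of Algorithm 3 might look as follows; it reuses the gwo_update sketch above and substitutes SciPy's Powell routine for Algorithm 2. The population handling and leader re-ranking are illustrative assumptions, not the authors' implementation.

import numpy as np
from scipy.optimize import minimize

def pgwo(fitness, dim, lo, hi, n=30, p=0.5, max_iter=500):
    # Sketch of Algorithm 3: GWO generations plus Powell refinement of X_alpha
    X = np.random.uniform(lo, hi, size=(n, dim))
    pf = np.array([fitness(x) for x in X])
    order = np.argsort(pf)
    alpha, beta, delta = (X[order[k]].copy() for k in range(3))
    for t in range(max_iter):
        a = 2.0 * (1.0 - t / max_iter)   # a decreased linearly from 2 to 0
        X = np.clip(gwo_update(X, alpha, beta, delta, a), lo, hi)
        if np.random.rand() > p:         # Powell local search on the best wolf
            res = minimize(fitness, alpha, method="Powell", options={"maxfev": 200})
            if res.fun < fitness(alpha): # accept X'_alpha only if it improves
                alpha = np.clip(res.x, lo, hi)
        pool = np.vstack([X, alpha])     # re-rank the leaders, keeping alpha
        pf = np.array([fitness(x) for x in pool])
        order = np.argsort(pf)
        alpha, beta, delta = (pool[order[k]].copy() for k in range(3))
    return alpha, fitness(alpha)

Calling pgwo with the sphere placeholder from the previous sketch and dim = 30, lo = -100, hi = 100 mirrors the F1 setting of Table 1, under these assumptions.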

Table 1: Benchmark functions.

Function | Name | Dim | Range | f_min
F_1 = \sum_{i=1}^{n} x_i^2 | Sphere | 30 | [-100, 100] | 0
F_2 = \sum_{i=1}^{n-1} [100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2] | Rosenbrock | 30 | [-30, 30] | 0
F_3 = -20 \exp(-0.2 \sqrt{(1/n) \sum_{i=1}^{n} x_i^2}) - \exp((1/n) \sum_{i=1}^{n} \cos(2\pi x_i)) + 20 + e | Ackley | 30 | [-32, 32] | 0
F_4 = (\pi/n) \{10 \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 [1 + 10 \sin^2(\pi y_{i+1})] + (y_n - 1)^2\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4), where y_i = 1 + (x_i + 1)/4 and u(x_i, a, k, m) = k(x_i - a)^m if x_i > a; 0 if -a < x_i < a; k(-x_i - a)^m if x_i < -a | Penalized | 30 | [-50, 50] | 0
F_5 = 0.1 \{\sin^2(3\pi x_1) + \sum_{i=1}^{n} (x_i - 1)^2 [1 + \sin^2(3\pi x_i + 1)] + (x_n - 1)^2 [1 + \sin^2(2\pi x_n)]\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4) | Penalized 2 | 30 | [-50, 50] | 0
F_6 = \sum_{i=1}^{11} [a_i - x_1(b_i^2 + b_i x_2)/(b_i^2 + b_i x_3 + x_4)]^2 | Kowalik | 4 | [-5, 5] | 0.0003075
F_7 = [1 + (x_1 + x_2 + 1)^2 (19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1 x_2 + 3x_2^2)] \times [30 + (2x_1 - 3x_2)^2 (18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1 x_2 + 27x_2^2)] | Goldstein-Price | 2 | [-2, 2] | 3

The performing probability p of Powell's method is set to 0.5. We have also tried varying the number of grey wolves (population size n) and the performing probability of Powell's method; from our simulations, we found that n = 30 and p = 0.5 are sufficient for most optimization problems, so we use fixed n = 30 and p = 0.5 in the rest of the simulations. In the following section, various benchmark functions are employed to investigate the efficiency of the PGWO algorithm (see Algorithm 3).

4. Experimental Results and Discussion

4.1. Simulation Platform. All algorithms are tested in Matlab R2012a (7.14), and experiments are executed on an AMD Athlon(tm) II X4 640 processor, 3.00 GHz PC with 3 GB RAM. The operating system is Windows 7.

4.2. Benchmark Functions. In order to evaluate the performance of the PGWO algorithm, seven standard benchmark functions are employed [50, 51]. These benchmark functions can be divided into two groups: unimodal and multimodal functions. Table 1 lists the functions, where Dim indicates the dimension of the function, Range is the boundary of the function's search space, and f_min is the minimum return value of the function. The first two functions (F1-F2) are unimodal, (F3–F5) are multimodal functions, and the last two functions (F6-F7) are fixed-dimension multimodal benchmark functions.


Table 2: Initial parameters of algorithms.

Algorithm | Parameter | Value
PGWO | Population size | 30
     | a | Linearly decreased from 2 to 0
     | Performing probability of Powell (p) | 0.5
     | Max iteration | 500 (benchmark functions test), 200 (data sets test)
     | Stopping criteria | Max iteration
GWO | Population size | 30
    | a | Linearly decreased from 2 to 0
    | Max iteration | 500 (benchmark functions test), 200 (data sets test)
    | Stopping criteria | Max iteration
PSO | Population size | 30
    | C1, C2 | 1.4962, 1.4962
    | W | 0.7298
    | Max iteration | 500 (benchmark functions test), 200 (data sets test)
    | Stopping criteria | Max iteration
ABC | Population size | 30
    | Limit | 10
    | Max iteration | 500 (benchmark functions test), 200 (data sets test)
    | Stopping criteria | Max iteration
CS | Population size | 30
   | p_a | 0.25
   | Max iteration | 500 (benchmark functions test), 200 (data sets test)
   | Stopping criteria | Max iteration
GGSA | Population size | 30
     | c'_1 | (-2t^3/T^3) + 2
     | c'_2 | 2t^3/T^3
     | G_0 | 1
     | α | 20
     | Max iteration | 500 (benchmark functions test), 200 (data sets test)
     | Stopping criteria | Max iteration

T indicates the maximum number of iterations; t is the current iteration.

There are some parameters that should be initialized before running; Table 2 lists the initial values for these algorithms.

The experimental results are presented in Table 3. The results are averaged over 20 independent runs; bold results mean that PGWO is better, while underlined results mean that the other algorithm is better. Best, Worst, Mean, and Std represent the optimal fitness value, worst fitness value, mean fitness value, and standard deviation, respectively. Note that the Matlab code of the GGSA algorithm is available at http://www.alimirjalili.com/Projects.html.

To improve the performance evaluation of evolutionary algorithms, statistical tests should be conducted [52]. In order to determine whether the results of PGWO differ from the best results of the other algorithms in a statistically meaningful way, a nonparametric test known as Wilcoxon's rank-sum test [53, 54] is performed at the 5% significance level. The p values calculated in Wilcoxon's rank-sum test comparing PGWO and the other algorithms over all the benchmark functions are given in Table 4. According to [52], p values < 0.05 can be considered as sufficient evidence against the null hypothesis.
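Such a test is easy to reproduce; below is a sketch using SciPy's ranksums, where the two samples stand in for the 20 per-run final fitness values of two algorithms (the numbers are placeholders, not the paper's data).

import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
# Placeholder samples standing in for 20 per-run final fitness values
pgwo_runs = rng.normal(loc=0.1, scale=0.02, size=20)
gwo_runs = rng.normal(loc=0.5, scale=0.10, size=20)

stat, p_value = ranksums(pgwo_runs, gwo_runs)  # two-sided rank-sum test
print(p_value < 0.05)   # True -> significant difference at the 5% level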

4.3. Comparison of Experiment Results. As shown in Table 3, PGWO has the best result for F1. For F2, PGWO provided better results than the other algorithms across Best, Worst, and Mean, while the Std of GWO is better than that of PGWO. The unimodal benchmark functions have only one global solution without any local optima, so they are very suitable to examine exploitation. Hence, these results indicate that the proposed method provides greatly improved exploitation compared to the other algorithms.

However, it should be noticed that multimodal benchmark functions have many local minima. The final results are more important because these functions can reflect the ability of the algorithm to escape from poor local optima and obtain the global optimum. We have run the experiments on F3–F5; as seen from Table 3, PGWO provides the best results.

F6-F7 are fixed-dimension multimodal benchmark functions with only a few local minima, and the dimensions of the functions are also small. In this case, it is hard to judge the performance of the algorithms.


Table 3: Results comparison of different optimal algorithms for 20 independent runs.

Function | Criteria | GGSA | CS | ABC | PSO | GWO | PGWO
F1 | Best | 1.0133E-17 | 3.7450E+00 | 2.1820E+01 | 1.7964E-06 | 4.7447E-29 | 2.5422E-54
   | Worst | 1.0181E-16 | 2.4776E+01 | 1.8094E+02 | 2.2907E-04 | 2.2227E-26 | 7.9888E-53
   | Mean | 5.1858E-17 | 8.8292E+00 | 8.1908E+01 | 6.5181E-05 | 1.8269E-27 | 1.6276E-53
   | Std | 2.6953E-17 | 4.7633E+00 | 4.3834E+01 | 6.9891E-05 | 4.8884E-27 | 1.7834E-53
F2 | Best | 2.3974E+01 | 3.0920E+02 | 4.4296E+03 | 2.4432E+01 | 2.6182E+01 | 4.9374E+00
   | Worst | 3.4411E+02 | 2.2219E+03 | 4.5866E+04 | 4.1217E+02 | 2.8769E+01 | 1.5466E+01
   | Mean | 7.7481E+01 | 5.5445E+02 | 1.5193E+04 | 1.2465E+02 | 2.7507E+01 | 1.1692E+01
   | Std | 7.4751E+01 | 4.1835E+02 | 9.7239E+03 | 1.4004E+02 | 6.7488E-01 | 2.0844E+00
F3 | Best | 2.3591E-09 | 4.0632E+00 | 9.2131E+00 | 6.3889E-03 | 7.9048E-14 | 4.4409E-15
   | Worst | 9.8704E-09 | 1.0603E+01 | 1.6954E+01 | 4.9535E+00 | 1.5010E-13 | 4.4409E-15
   | Mean | 5.0646E-09 | 6.0677E+00 | 1.3090E+01 | 2.0423E+00 | 1.0498E-13 | 4.4409E-15
   | Std | 1.9095E-09 | 1.5421E+00 | 2.2882E+00 | 1.1464E+00 | 1.7485E-14 | 0
F4 | Best | 5.4733E-01 | 2.0863E+00 | 7.5357E-01 | 2.4473E-03 | 1.8626E-02 | 1.6432E-32
   | Worst | 4.6556E+00 | 4.1099E+00 | 5.5219E+00 | 3.6426E+00 | 1.2149E-01 | 1.6432E-32
   | Mean | 1.9728E+00 | 3.2711E+00 | 2.0544E+00 | 8.5739E-01 | 5.4300E-02 | 1.6432E-32
   | Std | 1.0488E+00 | 5.9739E-01 | 1.3358E+00 | 1.0326E+00 | 2.9934E-02 | 2.8080E-48
F5 | Best | 2.1024E-02 | 2.7015E+00 | 2.2507E+00 | 6.3080E-06 | 3.9444E-01 | 1.3498E-32
   | Worst | 2.4387E+01 | 1.2516E+01 | 1.1090E+02 | 3.6461E+00 | 8.5465E-01 | 1.9625E-01
   | Mean | 7.7881E+00 | 6.2644E+00 | 1.1124E+01 | 3.7285E-01 | 6.7323E-01 | 4.5918E-02
   | Std | 6.6733E+00 | 2.5624E+00 | 2.3792E+01 | 9.3480E-01 | 1.4330E-01 | 6.0835E-02
F6 | Best | 1.2358E-03 | 3.1550E-04 | 8.3768E-04 | 3.0749E-04 | 3.0750E-04 | 3.0752E-04
   | Worst | 1.2219E-02 | 6.0087E-04 | 3.4810E-03 | 1.9361E-03 | 2.0363E-02 | 1.5943E-03
   | Mean | 4.3558E-03 | 4.3266E-04 | 1.6421E-03 | 9.3309E-04 | 6.4067E-03 | 4.5499E-04
   | Std | 2.6046E-03 | 8.5397E-05 | 7.2141E-04 | 4.1733E-04 | 9.3762E-03 | 3.4265E-04
F7 | Best | 3.0000E+00 | 3.0000E+00 | 3.0000E+00 | 3.0000E+00 | 3.0000E+00 | 3.0000E+00
   | Worst | 3.0000E+00 | 3.0000E+00 | 3.0620E+00 | 3.0000E+00 | 8.4000E+01 | 3.0000E+00
   | Mean | 3.0000E+00 | 3.0000E+00 | 3.0136E+00 | 3.0000E+00 | 1.1100E+01 | 3.0000E+00
   | Std | 2.5794E-15 | 1.5717E-15 | 1.9505E-02 | 1.2310E-15 | 2.4931E+01 | 9.2458E-07

Table 4: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, and GGSA over benchmark functions (p ≥ 0.05 have been underlined).

PGWO vs. | GWO | PSO | ABC | CS | GGSA
F1 | 6.80E-08 | 6.80E-08 | 6.80E-08 | 6.80E-08 | 6.80E-08
F2 | 6.80E-08 | 6.80E-08 | 6.80E-08 | 6.80E-08 | 6.80E-08
F3 | 7.66E-09 | 8.01E-09 | 8.01E-09 | 8.01E-09 | 8.01E-09
F4 | 8.01E-09 | 8.01E-09 | 8.01E-09 | 8.01E-09 | 8.01E-09
F5 | 5.97E-08 | 0.295606 | 5.97E-08 | 5.97E-08 | 4.67E-07
F6 | 0.003057 | 0.000375 | 1.38E-06 | 0.001782 | 1.06E-07
F7 | 0.0002 | 4.93E-08 | 6.80E-08 | 6.19E-08 | 6.49E-08

The major difference compared with functions F3–F5 is that functions F6-F7 appear to be simpler due to their low dimension and a smaller number of local minima. For F6, CS provided better results than the other algorithms across Mean, Std, and Worst. For F7, the majority of the algorithms can find the optimal solution, but PSO is more stable than the other algorithms in terms of Std.

The p values of Wilcoxon's rank-sum test in Table 4 show that the results of PGWO on F5 are not significantly better than those of PSO. However, PGWO achieves significant improvement on all remaining benchmark functions compared to the other algorithms. Therefore, this is evidence that the proposed algorithm has high performance in dealing with unimodal, multimodal, and fixed-dimension multimodal benchmark functions.

The convergence curves of the six algorithms are illustrated in Figure 1. As can be seen in these figures, PGWO has a much better convergence rate than the other algorithms on all benchmark functions.

According to this comprehensive comparative study and discussion, the results show that the proposed algorithm is able to significantly improve the performance of GWO and overcome its major shortcomings. For that reason, we apply the PGWO algorithm to solve clustering problems in the next section.

[Figure 1 panels F1–F7: average best-so-far versus iteration (log scale on most panels) for GGSA, CS, ABC, PSO, GWO, and PGWO.]

Figure 1: The convergence curves of the average fitness value over 20 independent runs.


5. Data Clustering

5.1. The Mathematical Description of Data Clustering. The clustering sample set is X = {X_i, i = 1, 2, ..., n}, where X_i is a p-dimensional vector. The clustering problem is to find a partition C = {C_1, C_2, ..., C_m} satisfying the following conditions [26]:

(1) X = \bigcup_{i=1}^{m} C_i;
(2) C_i ≠ φ, i = 1, 2, ..., m;
(3) C_i ∩ C_j = φ, i, j = 1, 2, ..., m, i ≠ j.

5.2. The Clustering Criterion. Clustering is the process of grouping a set of data objects into a number of clusters or groups. The aim of clustering is to make the data within a cluster have a high similarity while being very dissimilar to objects in other clusters. Dissimilarities and similarities are evaluated based on the attributes of the data sets, using a distance metric. The most popular distance metric is the Euclidean distance [55]. Let i = (x_{i1}, x_{i2}, ..., x_{ip}) and j = (x_{j1}, x_{j2}, ..., x_{jp}) be two objects described by p numeric attributes; the Euclidean distance between objects i and j is

d(i, j) = \sqrt{(x_{i1} - x_{j1})^2 + (x_{i2} - x_{j2})^2 + \cdots + (x_{ip} - x_{jp})^2}. (5)

For given N objects, the clustering problem is to minimize the sum of squared Euclidean distances between objects and their cluster centers, allocating each object to one of k cluster centers [55]. Clustering aims at finding the cluster centers by minimizing the objective function, which is defined as follows [26]:

J_c = \sum_{k=1}^{m} \sum_{X_i \in C_k} d(X_i, Z_k), (6)

where m indicates the number of clusters, C_k indicates the kth cluster, Z_k indicates the kth cluster center, and d(X_i, Z_k) indicates the distance of the sample to the corresponding cluster center, namely, d(X_i, Z_k) = ||X_i - Z_k||.

5.3. PGWO Algorithm on Clustering. In clustering analysis, each element in the data set is a p-dimensional vector. Moreover, the position of a grey wolf represents the k cluster centers, so each grey wolf encodes a k*p-dimensional vector. For each grey wolf i, its position is denoted as a vector X_i = (x_{i1}, x_{i2}, ..., x_{i,k*p}). In the initialization phase, we use the maximum and minimum values of each component of the data set (which is to be grouped) as the initialization search scope of the grey wolves, and the initial solutions are randomly generated in this range. We use (6) to calculate the fitness of the grey wolf individuals; the main steps of the fitness function are shown in Algorithm 4.

5.4. Data Clustering Experimental Results and Discussion. In order to verify the performance of the proposed PGWO approach for clustering, we compare the results of the K-means, GGSA, CS, ABC, PSO, GWO, and PGWO clustering algorithms using nine different data sets that are selected from the UCI machine learning repository [56].

Artificial data set (N = 600, d = 2, and k = 4) is a two-featured problem with four unique classes.


(1) For each data vector x_i
(2) Calculate the Euclidean distance by (5)
(3) Assign x_i to the closest cluster center
(4) Calculate the measure function by (6)
(5) End For
(6) Return the value of the fitness function

Algorithm 4: Main steps of the fitness function.
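A runnable rendering of Algorithm 4 under the encoding of Section 5.3, where one wolf's k*p-dimensional position is reshaped into k centers, might look as follows; the function and argument names are illustrative assumptions, not the authors' code.

import numpy as np

def clustering_fitness(position, data, k):
    # Algorithm 4: decode one wolf into k centers and evaluate J_c of (6)
    n, p = data.shape
    centers = position.reshape(k, p)   # k*p vector -> k cluster centers
    # Euclidean distance of every data vector to every center, as in (5)
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    nearest = dists.argmin(axis=1)     # assign each x_i to its closest center
    return dists[np.arange(n), nearest].sum()  # summed distances: J_c of (6)

This is exactly the kind of fitness function the pgwo sketch above would be handed, with dim = k*p and search bounds taken from the per-component minima and maxima of the data.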

A total of 600 patterns were drawn from four independent bivariate normal distributions, where classes were distributed according to

N_2\left(\mu = \begin{pmatrix} m_i \\ 0 \end{pmatrix}, \Sigma = \begin{bmatrix} 0.5 & 0.05 \\ 0.05 & 0.5 \end{bmatrix}\right), (7)

where i = 1, 2, 3, 4, m_1 = -3, m_2 = 0, m_3 = 3, and m_4 = 6. Here, μ and Σ are the mean vector and covariance matrix, respectively [45, 57].
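For reference, the artificial data set can be regenerated directly from (7) with NumPy; a sketch assuming 150 points per class (the paper states only the total of 600):

import numpy as np

rng = np.random.default_rng(42)
cov = np.array([[0.5, 0.05], [0.05, 0.5]])   # shared covariance matrix of (7)
means = [(-3, 0), (0, 0), (3, 0), (6, 0)]    # m_1, ..., m_4 on the first axis
art = np.vstack([rng.multivariate_normal(m, cov, size=150) for m in means])
labels = np.repeat(np.arange(4), 150)        # ground-truth class of each point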

Iris data (N = 150, d = 4, and K = 3) is a data set with 150 random samples of flowers from the Iris species setosa, versicolor, and virginica, collected by Anderson [58]. From each species there are 50 observations for sepal length, sepal width, petal length, and petal width, in cm. This data set was used by Fisher [59] in his initiation of the linear-discriminant-function technique [28, 56, 57].

Wisconsin breast cancer (N = 683, d = 9, and K = 2) consists of 683 objects characterized by nine features: clump thickness, cell size uniformity, cell shape uniformity, marginal adhesion, single epithelial cell size, bare nuclei, bland chromatin, normal nucleoli, and mitoses. There are two categories in the data: malignant (444 objects) and benign (239 objects) [28, 56, 57].

Contraceptive method choice (N = 1473, d = 10, K = 3), CMC for short, is a subset of the 1987 National Indonesia Contraceptive Prevalence Survey. The samples are married women who either were not pregnant or did not know if they were at the time of interview. The problem is to predict the current contraceptive method choice (no use, long-term methods, or short-term methods) of a woman based on her demographic and socioeconomic characteristics [28, 56, 57].

Seeds data (N = 210, d = 7, and K = 3) consists of 210 patterns belonging to three different varieties of wheat: Kama, Rosa, and Canadian. From each species there are 70 observations for area A, perimeter P, compactness C (C = 4πA/P^2), length of kernel, width of kernel, asymmetry coefficient, and length of kernel groove [56].

Statlog (Heart) data (N = 270, d = 13, K = 2) is a heart disease database similar to a database already present in the repository (heart disease databases) but in a slightly different form [56].

Wine data (N = 178, d = 13, K = 3) is also taken from the MCI laboratory. These data are the results of a chemical analysis of wines grown in the same region in Italy but derived from three different cultivars. The analysis determined the quantities of 13 constituents found in each of the three types of wines. There are 178 instances with 13 numeric attributes in the wine data set. All attributes are continuous, and there are no missing attribute values [28, 56, 57].

Balance scale data (N = 625, d = 4, and K = 3) was generated to model psychological experimental results. Each example is classified as having the balance scale tip to the right or left or being balanced. The attributes are the left weight, the left distance, the right weight, and the right distance. The correct way to find the class is the greater of (left-distance * left-weight) and (right-distance * right-weight); if they are equal, it is balanced [56].

Haberman's Survival (N = 306, d = 3, and K = 2) contains cases from a study that was conducted between 1958 and 1970 at the University of Chicago's Billings Hospital on the survival of patients who had undergone surgery for breast cancer. It records two survival statuses of patients, with the age of the patient at the time of operation, the patient's year of operation, and the number of positive axillary nodes detected [56].

The comparison of intracluster distances for the seven clustering algorithms over 20 independent runs is shown in Table 5. Table 6 reports the p values produced by Wilcoxon's rank-sum test comparing PGWO and the other algorithms over all the data sets.

For the Art data set, the optimum value, the worst value, and the average value of PGWO and GGSA are all 5.1390E+02, while the standard deviation of GGSA is better than that of PGWO. PSO only attains the optimum solution 5.1390E+02.

For the Iris, Cancer, and Seeds data sets, PGWO, PSO, and GGSA provide the optimum value in comparison to those obtained by the other methods. However, the worst value, the average value, and the standard deviation of PGWO are superior to those of the other methods.

For the Heart data set, PGWO, PSO, and CS all find the optimum solution 1.0623E+04; that means they all can find the global solution. However, PGWO is slightly better for the worst value, the average value, and the standard deviation.

For the CMC and Wine data sets, Table 5 shows that the average, best, worst, and standard deviation values of the fitness function for the PGWO algorithm are much smaller than those of the other six methods. The PGWO clustering algorithm is capable of providing the same partition of the data points in all runs.

For the balance scale data set, the optimum values of the fitness function for PGWO, PSO, and GGSA are 1.4238E+03; that means they all can find the global solution. The optimum values of the fitness function for GWO and K-means are 1.4239E+03, close to the result of PGWO, PSO, and GGSA. However, the standard deviation values of GWO, K-means, PGWO, PSO, GGSA, and CS are 6.8756E-01, 3.6565E+00, 8.6619E-01, 9.7396E-01, 1.6015E+00, and 7.3239E-01, respectively; from the standard deviation, we can see that GWO is more stable than the other methods.

For Haberman's Survival data set, the optimum value, the worst value, and the average value of the fitness function for CS and PSO are almost the same, and the results of the CS and PSO algorithms are better than those of the other methods.

The p values in Table 6 show that the results of PGWO are significantly better on the Art, Cancer, CMC, Seeds, Heart, and Wine data sets.


Table 5: Comparison of intracluster distances for the seven clustering algorithms over 20 independent runs.

Data set | Criteria | K-means | GGSA | CS | ABC | PSO | GWO | PGWO
Art | Best | 5.1454E+02 | 5.1390E+02 | 5.1391E+02 | 5.1752E+02 | 5.1390E+02 | 5.1455E+02 | 5.1390E+02
    | Worst | 9.0474E+02 | 5.1390E+02 | 5.3213E+02 | 5.5298E+02 | 8.9248E+02 | 5.2058E+02 | 5.1390E+02
    | Mean | 6.0974E+02 | 5.1390E+02 | 5.1597E+02 | 5.2620E+02 | 5.3283E+02 | 5.1687E+02 | 5.1390E+02
    | Std | 1.6903E+02 | 1.5430E-13 | 4.8536E+00 | 9.7988E+00 | 8.4652E+01 | 1.8768E+00 | 6.1243E-08
Iris | Best | 9.7190E+01 | 9.6655E+01 | 9.6659E+01 | 9.7616E+01 | 9.6655E+01 | 9.6835E+01 | 9.6655E+01
     | Worst | 1.2318E+02 | 1.1839E+02 | 9.7684E+01 | 1.0401E+02 | 1.2767E+02 | 1.2108E+02 | 9.6655E+01
     | Mean | 1.0226E+02 | 9.8865E+01 | 9.6862E+01 | 9.9710E+01 | 1.0596E+02 | 9.9903E+01 | 9.6655E+01
     | Std | 1.0290E+01 | 5.5454E+00 | 3.1103E-01 | 1.8460E+00 | 1.4578E+01 | 6.7482E+00 | 8.5869E-10
Cancer | Best | 2.9763E+03 | 2.9644E+03 | 2.9645E+03 | 2.9887E+03 | 2.9644E+03 | 2.9645E+03 | 2.9644E+03
       | Worst | 2.9884E+03 | 3.0611E+03 | 2.9677E+03 | 3.2650E+03 | 4.7288E+03 | 2.9650E+03 | 2.9644E+03
       | Mean | 2.9830E+03 | 2.9715E+03 | 2.9651E+03 | 3.0760E+03 | 3.4055E+03 | 2.9647E+03 | 2.9644E+03
       | Std | 4.8278E+00 | 2.1284E+01 | 7.3222E-01 | 6.5283E+01 | 7.8385E+02 | 1.1452E-01 | 1.6118E-08
CMC | Best | 5.7016E+03 | 5.6938E+03 | 5.7054E+03 | 5.8219E+03 | 5.6942E+03 | 5.8236E+03 | 5.6937E+03
    | Worst | 5.7053E+03 | 5.7111E+03 | 5.7762E+03 | 6.3743E+03 | 7.1580E+03 | 6.0911E+03 | 5.6937E+03
    | Mean | 5.7040E+03 | 5.6971E+03 | 5.7263E+03 | 6.0868E+03 | 5.7690E+03 | 5.9239E+03 | 5.6937E+03
    | Std | 1.1022E+00 | 3.7872E+00 | 2.1119E+01 | 1.7292E+02 | 3.2694E+02 | 9.0635E+01 | 9.8777E-04
Seeds | Best | 3.1322E+02 | 3.1180E+02 | 3.1187E+02 | 3.1573E+02 | 3.1180E+02 | 3.1323E+02 | 3.1180E+02
      | Worst | 3.1373E+02 | 3.1354E+02 | 3.1746E+02 | 3.4997E+02 | 4.2035E+02 | 3.1782E+02 | 3.1180E+02
      | Mean | 3.1337E+02 | 3.1189E+02 | 3.1347E+02 | 3.3161E+02 | 3.2813E+02 | 3.1475E+02 | 3.1180E+02
      | Std | 2.3686E-01 | 3.8862E-01 | 1.6383E+00 | 1.0943E+01 | 3.9697E+01 | 1.5716E+00 | 1.7815E-09
Heart | Best | 1.0682E+04 | 1.0749E+04 | 1.0623E+04 | 1.0683E+04 | 1.0623E+04 | 1.0637E+04 | 1.0623E+04
      | Worst | 1.0701E+04 | 1.2684E+04 | 1.0625E+04 | 1.1622E+04 | 1.0626E+04 | 1.0683E+04 | 1.0624E+04
      | Mean | 1.0693E+04 | 1.1542E+04 | 1.0624E+04 | 1.0920E+04 | 1.0624E+04 | 1.0657E+04 | 1.0623E+04
      | Std | 8.1672E+00 | 5.7044E+02 | 4.6239E-01 | 2.5921E+02 | 7.6866E-01 | 1.3817E+01 | 2.3235E-01
Wine | Best | 1.6385E+04 | 1.6493E+04 | 1.6296E+04 | 1.6566E+04 | 1.6294E+04 | 1.6316E+04 | 1.6292E+04
     | Worst | 1.8437E+04 | 2.0245E+04 | 1.6311E+04 | 1.7668E+04 | 1.6312E+04 | 1.6371E+04 | 1.6294E+04
     | Mean | 1.6974E+04 | 1.7999E+04 | 1.6301E+04 | 1.7010E+04 | 1.6297E+04 | 1.6345E+04 | 1.6293E+04
     | Std | 8.6757E+02 | 1.1562E+03 | 4.5677E+00 | 3.6824E+02 | 4.0149E+00 | 1.4836E+01 | 5.2338E-01
Balance scale | Best | 1.4239E+03 | 1.4238E+03 | 1.4256E+03 | 1.4265E+03 | 1.4238E+03 | 1.4239E+03 | 1.4238E+03
              | Worst | 1.4337E+03 | 1.4291E+03 | 1.4285E+03 | 1.4310E+03 | 1.4262E+03 | 1.4260E+03 | 1.4257E+03
              | Mean | 1.4275E+03 | 1.4259E+03 | 1.4268E+03 | 1.4282E+03 | 1.4248E+03 | 1.4243E+03 | 1.4245E+03
              | Std | 3.6565E+00 | 1.6015E+00 | 7.3239E-01 | 1.1831E+00 | 9.7396E-01 | 6.8756E-01 | 8.6619E-01
Haberman-Survival | Best | 2.6251E+03 | 2.5670E+03 | 2.5670E+03 | 2.5671E+03 | 2.5670E+03 | 2.5673E+03 | 2.5670E+03
                  | Worst | 3.1966E+03 | 2.6226E+03 | 2.5670E+03 | 2.5709E+03 | 2.5678E+03 | 2.6686E+03 | 2.5678E+03
                  | Mean | 2.6554E+03 | 2.5702E+03 | 2.5670E+03 | 2.5679E+03 | 2.5671E+03 | 2.5898E+03 | 2.5673E+03
                  | Std | 1.2740E+02 | 1.2354E+01 | 1.7590E-06 | 8.3107E-01 | 3.0624E-01 | 2.3346E+01 | 4.0711E-01

Table 6: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, GGSA, and K-means over data sets (p ≥ 0.05 have been underlined).

PGWO vs. | GWO | PSO | ABC | CS | GGSA | K-means
Art | 6.80E-08 | 1.60E-05 | 6.80E-08 | 6.80E-08 | 6.11E-08 | 4.49E-08
Iris | 6.75E-08 | 1.19E-06 | 6.75E-08 | 6.75E-08 | 0.107346 | 5.17E-08
Cancer | 6.76E-08 | 6.76E-08 | 6.76E-08 | 6.76E-08 | 1.2E-06 | 5.98E-08
CMC | 6.69E-08 | 6.69E-08 | 6.69E-08 | 6.69E-08 | 6.69E-08 | 3.39E-08
Seeds | 6.80E-08 | 6.80E-08 | 6.80E-08 | 6.80E-08 | 1.23E-03 | 4.81E-08
Heart | 6.61E-08 | 2.02E-06 | 6.61E-08 | 5.87E-07 | 6.61E-08 | 6.07E-08
Wine | 6.80E-08 | 1.23E-07 | 6.80E-08 | 6.80E-08 | 6.80E-08 | 6.63E-08
Balance scale | 0.067868 | 0.007712 | 6.8E-08 | 1.06E-07 | 0.003966 | 6.01E-07
Haberman-Survival | 6.90E-07 | 0.524948 | 1.78E-03 | 0.860418 | 0.090604 | 6.26E-08


[Figure 2 panels Art, Iris, Cancer, CMC, Seeds, Heart, Wine, Balance scale, and Haberman's Survival: average best-so-far versus iteration for GGSA, CS, ABC, PSO, GWO, and PGWO.]

Figure 2: Convergence curves of clustering on data sets over 20 independent runs.

From Table 6, we conclude that for the Iris data set PGWO performs superior to the other algorithms except for GGSA. Since the p value for the balance scale data set of PGWO versus GWO is more than 0.05, there is no statistical difference between the two. For Haberman's Survival data set, comparing PGWO with the other algorithms, we can conclude that PGWO performs significantly better in three out of six comparisons. So it can be claimed that PGWO provides better results than the other algorithms across the majority of the data sets.

Figure 2 shows the convergence curves of clustering on the data sets over 20 independent runs. As can be seen from the figure, the convergence speed of PGWO is the fastest.

Figure 3 shows ANOVA tests of clustering on the data sets over 20 independent runs. As seen from Figure 3, PGWO is very stable for the majority of the data sets. For the Art, Seeds, and CMC data sets, PGWO, PSO, and GGSA can obtain relatively stable optimal values. For the Heart and Wine data sets, the stability of PGWO, PSO, and CS is outstanding. For the Cancer data set, most of the algorithms can obtain the stable optimal value except for the ABC and PSO algorithms. For the Iris data set, we can clearly see that PGWO is better in terms of stability. For the balance scale data set, GWO obtains the relatively stable optimal value. For Haberman's Survival data set, the stability of CS and PSO is the best, but PGWO and GGSA follow them closely.

Clustering results of the Art, Iris, and Survival data sets by the PGWO algorithm are presented in Figure 4, which visualizes them clearly. It can be seen from Figure 4 that the PGWO algorithm performs well on the Art, Iris, and Survival data sets.

In summary, the results show that the proposed method successfully outperforms the other algorithms across the majority of benchmark functions. Furthermore, the test results on clustering problems show that PGWO is able to provide very competitive results. Therefore, it appears from this comparative study that the proposed method has merit in the field of evolutionary algorithms and optimization.


[Figure 3 panels Art, Iris, Cancer, CMC, Seeds, Heart, Wine, Balance scale, and Haberman's Survival: fitness value distributions for GGSA, CS, ABC, PSO, GWO, and PGWO.]

Figure 3: ANOVA tests of clustering on data sets over 20 independent runs.


6. Conclusion and Future Works

In order to apply the grey wolf optimizer to solve complex optimization problems efficiently, this paper proposed a novel grey wolf optimizer based on Powell local optimization method, namely, PGWO. In PGWO, at first, the original GWO algorithm is applied to shrink the search region to a more promising area. Thereafter, Powell's method is implemented as a critical complement to perform the local search and exploit the limited area intensively to get better solutions. PGWO makes an attempt at taking the merits of GWO and Powell's method in order to avoid all grey wolves getting trapped in inferior local optimal regions. PGWO enables the grey wolves to have more diverse exemplars to learn from, as the grey wolves are updated each generation and also form new grey wolves to search in a larger search space. With both techniques combined, PGWO can balance exploration and exploitation and effectively solve complex problems. The experimental results show the effectiveness of Powell's method in terms of solution quality and convergence speed. The proposed algorithm is benchmarked on seven well-known test functions, and the results are compared with GGSA, CS, ABC, PSO, and GWO. The results show that the PGWO algorithm is capable of providing very competitive results compared to these famous metaheuristics. Because of the superior performance of the PGWO algorithm, we use it to solve clustering problems. The algorithm has been tested on an artificial data set and eight real data sets. To justify the performance of the PGWO algorithm on clustering problems, we compare it with the original GWO, GGSA, CS, ABC, PSO, and K-means. The results prove that the PGWO algorithm is able to significantly outperform the others on the majority of the data sets in terms of average value and standard deviation of the fitness function.


[Figure 4 panels: Artificial data set distribution and clustering; Iris data distribution and clustering; Haberman's Survival data distribution and clustering.]

Figure 4: The original data distribution of the Art, Iris, and Survival data sets and the clustering results by the PGWO algorithm.


Moreover, the experimental results demonstrate that the proposed PGWO algorithm can be considered a feasible and efficient method for solving optimization problems.

Our future work will focus on two issues. On one hand, we will apply the proposed PGWO algorithm to higher dimensional problems and larger numbers of patterns. On the other hand, the PGWO clustering algorithm will be extended to dynamically determine the optimal number of clusters.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grants nos. 61165015, 61463007, and 61563008 and the Project of Guangxi University for Nationalities Science Foundation under Grant no. 2012MDZD037.

References

[1] J. H. Holland, "Genetic algorithms," Scientific American, vol. 267, no. 1, pp. 66–72, 1992.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, IEEE, Perth, Australia, November-December 1995.

[3] K. Price and R. Storn, "Differential evolution," Dr. Dobb's Journal, vol. 22, pp. 18–20, 1997.

[4] K. V. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, New York, NY, USA, 2005.

[5] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[6] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.

[7] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[8] S. Mirjalili, S. M. Mirjalili, and A. Hatamlou, "Multi-verse optimizer: a nature-inspired algorithm for global optimization," Neural Computing and Applications, 2015.

[9] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[10] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459–471, 2007.

[11] X.-S. Yang, "Firefly algorithm, Levy flights and global optimization," in Research and Development in Intelligent Systems XXVI, pp. 209–218, Springer, London, UK, 2010.

[12] X.-S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210–214, IEEE, Coimbatore, India, December 2009.

[13] R. Rajabioun, "Cuckoo optimization algorithm," Applied Soft Computing, vol. 11, no. 8, pp. 5508–5518, 2011.

[14] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, "GSA: a gravitational search algorithm," Information Sciences, vol. 179, no. 13, pp. 2232–2248, 2009.

[15] S. Mirjalili and A. Lewis, "Adaptive gbest-guided gravitational search algorithm," Neural Computing and Applications, vol. 25, no. 7, pp. 1569–1584, 2014.

[16] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.

[17] X. H. Song, L. Tang, S. T. Zhao et al., "Grey Wolf Optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.

[18] G. M. Komaki and V. Kayvanfar, "Grey Wolf Optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[19] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[20] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using Grey Wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.

[21] O. Kramer, "Iterated local search with Powell's method: a memetic algorithm for continuous global optimization," Memetic Computing, vol. 2, no. 1, pp. 69–83, 2010.

[22] M. J. D. Powell, "Restart procedures for the conjugate gradient method," Mathematical Programming, vol. 12, no. 1, pp. 241–254, 1977.

[23] I. E. Evangelou, D. G. Hadjimitsis, A. A. Lazakidou, and C. Clayton, "Data mining and knowledge discovery in complex image data using artificial neural networks," in Proceedings of the Workshop on Complex Reasoning on Geographical Data, Paphos, Cyprus, 2001.

[24] M. S. Kamel and S. Z. Selim, "New algorithms for solving the fuzzy clustering problem," Pattern Recognition, vol. 27, no. 3, pp. 421–428, 1994.

[25] M. Omran, A. Salman, and A. P. Engelbrecht, "Image classification using particle swarm optimization," in Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning, Singapore, 2002.

[26] X. J. Lei, Swarm Intelligent Optimization Algorithms and Their Applications, Science Press, 2012.

[27] A. K. Jain, "Data clustering: 50 years beyond K-means," Pattern Recognition Letters, vol. 31, no. 8, pp. 651–666, 2010.

[28] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.

[29] B. Zhang, M. Hsu, and U. Dayal, "K-Harmonic means: a data clustering algorithm," Hewlett-Packard Labs Technical Report HPL, 1999.

[30] S. Z. Selim and K. Alsultan, "A simulated annealing algorithm for the clustering problem," Pattern Recognition, vol. 24, no. 10, pp. 1003–1008, 1991.

[31] K. S. Al-Sultan, "A Tabu search approach to the clustering problem," Pattern Recognition, vol. 28, no. 9, pp. 1443–1451, 1995.

[32] C. S. Sung and H. W. Jin, "A tabu-search-based heuristic for clustering," Pattern Recognition, vol. 33, no. 5, pp. 849–858, 2000.

[33] M. C. Cowgill, R. J. Harvey, and L. T. Watson, "A genetic algorithm approach to cluster analysis," Computers and Mathematics with Applications, vol. 37, no. 7, pp. 99–108, 1999.

[34] K. Krishna and M. N. Murty, "Genetic K-means algorithm," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 29, no. 3, pp. 433–439, 1999.

[35] U. Maulik and S. Bandyopadhyay, "Genetic algorithm-based clustering technique," Pattern Recognition, vol. 33, no. 9, pp. 1455–1465, 2000.

[36] C. A. Murthy and N. Chowdhury, "In search of optimal clusters using genetic algorithms," Pattern Recognition Letters, vol. 17, no. 8, pp. 825–832, 1996.

[37] M. Fathian, B. Amiri, and A. Maroosi, "Application of honey-bee mating optimization algorithm on clustering," Applied Mathematics and Computation, vol. 190, no. 2, pp. 1502–1513, 2007.

[38] D. W. van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215–220, IEEE, Canberra, Australia, December 2003.

[39] A. Hatamlou, S. Abdullah, and M. Hatamlou, "Data clustering using big bang-big crunch algorithm," in Innovative Computing Technology, vol. 241 of Communications in Computer and Information Science, pp. 383–388, Springer, 2011.

[40] D. Karaboga and C. Ozturk, "A novel clustering approach: artificial Bee Colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652–657, 2011.

[41] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "Application of gravitational search algorithm on data clustering," in Rough Sets and Knowledge Technology, vol. 6954 of Lecture Notes in Computer Science, pp. 337–346, Springer, Berlin, Germany, 2011.

[42] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "A combined approach for clustering based on K-means and gravitational search algorithms," Swarm and Evolutionary Computation, vol. 6, pp. 47–52, 2012.

[43] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187–195, 2004.

[44] Y. Kao and K. Cheng, "An ACO-based clustering algorithm," in Ant Colony Optimization and Swarm Intelligence, vol. 4150 of Lecture Notes in Computer Science, pp. 340–347, Springer, Berlin, Germany, 2006.

[45] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754–1762, 2008.

[46] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512–519, 2009.

[47] W.-F. Gao, S.-Y. Liu, and L.-L. Huang, "A novel artificial bee colony algorithm with Powell's method," Applied Soft Computing, vol. 13, no. 9, pp. 3763–3775, 2013.

[48] H. R. Lourenco, O. Martin, and T. Stutzle, "A beginner's introduction to iterated local search," in Proceedings of the 4th Metaheuristics International Conference (MIC '01), vol. 2, pp. 1–6, Porto, Portugal, 2001.

[49] T. G. Stutzle, Local Search Algorithms for Combinatorial Problems: Analysis, Improvements, and New Applications, vol. 220 of DISKI Dissertations on Artificial Intelligence, Infix Publishers, Sankt Augustin, Germany, 1999.

[50] X. S. Yang, Ed., Test Problems in Optimization: An Introduction with Metaheuristic Applications, Wiley, London, UK, 2010.

[51] M. Molga and C. Smutnicki, "Test functions for optimization needs," 2005, http://www.zsd.ict.pwr.wroc.pl/files/docs/functions.pdf.

[52] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[53] S. García, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 special session on real parameter optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617–644, 2009.

[54] F. Wilcoxon, "Individual comparisons by ranking methods," Biometrics Bulletin, vol. 1, no. 6, pp. 80–83, 1945.

[55] J. W. Han, M. Kamber, and J. Pei, Data Mining: Concepts and Techniques, Morgan Kaufmann Publishers, 2011.

[56] C. Blake and C. J. Merz, "UCI Repository of Machine Learning Databases," 1998.

[57] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183–197, 2010.

[58] E. Anderson, "The irises of the Gaspe peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2–5, 1935.

[59] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, no. 2, pp. 179–188, 1936.

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 4: Research Article Grey Wolf Optimizer Based on Powell Local ...downloads.hindawi.com/journals/ddns/2015/481360.pdf · Research Article Grey Wolf Optimizer Based on Powell Local Optimization


(1) Initialize the grey wolf population $X_i$ ($i = 1, 2, \ldots, n$) and the parameters
(2) Calculate the fitness of the population
(3) Find the first three agents $X_\alpha$, $X_\beta$, $X_\delta$
(4) While ($t <$ max number of iterations)
(5)   Update the position of each search agent by (4)
      If rand $> p$, choose the current best solution $X_\alpha$ as a starting point and generate a new solution $X'_\alpha$ by Powell's method, as illustrated in Algorithm 2; if $X'_\alpha < X_\alpha$, replace $X_\alpha$ with $X'_\alpha$; otherwise, go to Step (6)
(6)   Calculate the fitness of the population
      Update $X_\alpha$, $X_\beta$, $X_\delta$
      $t = t + 1$
(7) End While
(8) Return $X_\alpha$

Algorithm 3: PGWO algorithm.
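For concreteness, the following is a minimal Python sketch of Algorithm 3, assuming a generic objective function f to be minimized; it uses the standard GWO position update for (4) and SciPy's implementation of Powell's method as the local-search step. The names (pgwo, n_wolves, p, and so on) are illustrative, not taken from the authors' code.

```python
import numpy as np
from scipy.optimize import minimize

def pgwo(f, dim, lb, ub, n_wolves=30, max_iter=500, p=0.5, seed=0):
    """Sketch of Algorithm 3: GWO global search plus Powell refinement of X_alpha."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n_wolves, dim))   # initialize the pack in [lb, ub]
    fit = np.array([f(x) for x in X])
    for t in range(max_iter):
        leaders = X[np.argsort(fit)[:3]].copy()     # alpha, beta, delta: three best wolves
        a = 2.0 * (1.0 - t / max_iter)              # 'a' decreases linearly from 2 to 0
        for i in range(n_wolves):                   # GWO position update, eq. (4)
            X_new = np.zeros(dim)
            for leader in leaders:
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                X_new += leader - A * np.abs(C * leader - X[i])
            X[i] = np.clip(X_new / 3.0, lb, ub)
            fit[i] = f(X[i])
        if rng.random() > p:                        # Powell local search from the best wolf
            best = int(np.argmin(fit))
            res = minimize(f, X[best], method='Powell')
            if res.fun < fit[best]:                 # accept X'_alpha only if it improves
                X[best] = np.clip(res.x, lb, ub)
                fit[best] = res.fun
    best = int(np.argmin(fit))
    return X[best], fit[best]
```

For example, pgwo(lambda x: float(np.sum(x**2)), dim=30, lb=-100, ub=100) would minimize the Sphere function F1 of Table 1.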

Table 1: Benchmark functions (Dim: dimension; Range: boundary of the search space; $f_{\min}$: minimum return value).

$F_1 = \sum_{i=1}^{n} x_i^2$ (Sphere); Dim 30; Range $[-100, 100]$; $f_{\min} = 0$.

$F_2 = \sum_{i=1}^{n-1} \left[100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2\right]$ (Rosenbrock); Dim 30; Range $[-30, 30]$; $f_{\min} = 0$.

$F_3 = -20\exp\left(-0.2\sqrt{\tfrac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\left(\tfrac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e$ (Ackley); Dim 30; Range $[-32, 32]$; $f_{\min} = 0$.

$F_4 = \tfrac{\pi}{n}\left\{10\sin(\pi y_1) + \sum_{i=1}^{n-1}(y_i - 1)^2\left[1 + 10\sin^2(\pi y_{i+1})\right] + (y_n - 1)^2\right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, where $y_i = 1 + \tfrac{x_i + 1}{4}$ and $u(x_i, a, k, m) = k(x_i - a)^m$ if $x_i > a$; $0$ if $-a \le x_i \le a$; $k(-x_i - a)^m$ if $x_i < -a$ (Penalized); Dim 30; Range $[-50, 50]$; $f_{\min} = 0$.

$F_5 = 0.1\left\{\sin^2(3\pi x_1) + \sum_{i=1}^{n}(x_i - 1)^2\left[1 + \sin^2(3\pi x_i + 1)\right] + (x_n - 1)^2\left[1 + \sin^2(2\pi x_n)\right]\right\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$ (Penalized 2); Dim 30; Range $[-50, 50]$; $f_{\min} = 0$.

$F_6 = \sum_{i=1}^{11} \left[a_i - \dfrac{x_1(b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4}\right]^2$ (Kowalik); Dim 4; Range $[-5, 5]$; $f_{\min} = 0.0003075$.

$F_7 = \left[1 + (x_1 + x_2 + 1)^2(19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1 x_2 + 3x_2^2)\right] \times \left[30 + (2x_1 - 3x_2)^2(18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1 x_2 + 27x_2^2)\right]$ (Goldstein-Price); Dim 2; Range $[-2, 2]$; $f_{\min} = 3$.

of Powell's method and is set to 0.5. We have also tried varying the number of grey wolves (population size n) and the performing probability of Powell's method. From our simulations, we found that n = 30 and p = 0.5 are sufficient for most optimization problems, so we use fixed n = 30 and p = 0.5 in the rest of the simulations. In the following section, various benchmark functions are employed to investigate the efficiency of the PGWO algorithm (see Algorithm 3).

4. Experimental Results and Discussion

4.1. Simulation Platform. All algorithms are tested in Matlab R2012a (7.14), and the experiments are executed on an AMD Athlon(tm) II X4 640 processor at 3.00 GHz with 3 GB RAM; the operating system is Windows 7.

4.2. Benchmark Functions. In order to evaluate the performance of the PGWO algorithm, seven standard benchmark functions are employed [50, 51]. These benchmark functions can be divided into two groups: unimodal and multimodal functions. Table 1 lists the functions, where Dim indicates the dimension of the function, Range is the boundary of the function's search space, and f_min is the minimum return value of the function. The first two functions (F1-F2) are unimodal, (F3-F5) are multimodal, and the last two functions (F6-F7) are fixed-dimension multimodal benchmark functions.
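As an illustration of how the entries of Table 1 translate into code, here is a sketch of one unimodal (Sphere, F1) and one multimodal (Ackley, F3) function in Python; this is illustrative, not the authors' implementation.

```python
import numpy as np

def sphere(x):
    """F1: unimodal Sphere function; global minimum 0 at x = 0."""
    return float(np.sum(x ** 2))

def ackley(x):
    """F3: multimodal Ackley function; global minimum 0 at x = 0."""
    n = x.size
    return float(-20 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
                 - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20 + np.e)
```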


Table 2: Initial parameters of the algorithms.

PGWO: population size 30; a linearly decreased from 2 to 0; performing probability of Powell's method (p) 0.5; max iteration 500 (benchmark functions test) / 200 (data sets test); stopping criterion: max iteration.
GWO: population size 30; a linearly decreased from 2 to 0; max iteration 500 (benchmark functions test) / 200 (data sets test); stopping criterion: max iteration.
PSO: population size 30; C1 = C2 = 1.4962; W = 0.7298; max iteration 500 / 200; stopping criterion: max iteration.
ABC: population size 30; limit 10; max iteration 500 / 200; stopping criterion: max iteration.
CS: population size 30; p_a = 0.25; max iteration 500 / 200; stopping criterion: max iteration.
GGSA: population size 30; c1' = (-2t^3/T^3) + 2; c2' = 2t^3/T^3; G0 = 1; alpha = 20; max iteration 500 / 200; stopping criterion: max iteration.

T indicates the maximum number of iterations; t is the current iteration.

Some parameters should be initialized before running; Table 2 gives the initial values used for these algorithms.

The experimental results are presented in Table 3. The results are averaged over 20 independent runs; bold results mean that PGWO is better, while underlined results mean that the other algorithm is better. Best, Worst, Mean, and Std represent the optimal fitness value, worst fitness value, mean fitness value, and standard deviation, respectively. Note that the Matlab code of the GGSA algorithm is available at http://www.alimirjalili.com/Projects.html.

To improve the performance evaluation of evolutionary algorithms, statistical tests should be conducted [52]. In order to determine whether the results of PGWO differ from the best results of the other algorithms in a statistical sense, a nonparametric test known as Wilcoxon's rank-sum test [53, 54] is performed at the 5% significance level. The p values calculated by Wilcoxon's rank-sum test, comparing PGWO with the other algorithms over all the benchmark functions, are given in Table 4. According to [52], p values < 0.05 can be considered as sufficient evidence against the null hypothesis.
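The test reported in Table 4 can be reproduced with standard tooling; the sketch below assumes two arrays holding the 20 final fitness values of two algorithms and uses SciPy's rank-sum implementation (the function name compare_runs is illustrative).

```python
from scipy.stats import ranksums

def compare_runs(runs_a, runs_b, alpha=0.05):
    """Wilcoxon rank-sum test between two samples of final fitness values."""
    statistic, p_value = ranksums(runs_a, runs_b)
    # p < 0.05 is taken as sufficient evidence against the null hypothesis
    # that both samples come from the same distribution [52]
    return p_value, p_value < alpha
```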

4.3. Comparison of Experiment Results. As shown in Table 3, PGWO has the best result for F1. For F2, PGWO provided better results than the other algorithms across Best, Worst, and Mean, while the Std of GWO is better than that of PGWO. The unimodal benchmark functions have only one global solution without any local optima, so they are very suitable for examining exploitation. Hence, these results indicate that the proposed method provides greatly improved exploitation compared to the other algorithms.

However, it should be noted that multimodal benchmark functions have many local minima. For them the final results are more important, because these functions reflect the ability of the algorithm to escape from poor local optima and obtain the global optimum. We have run the experiments on F3-F5; as seen from Table 3, PGWO provides the best results.

F6-F7 are fixed-dimension multimodal benchmark functions with only a few local minima, and the dimensions of the functions are also small. In this case, it is hard to judge the performance of the algorithms. The major difference compared with functions F3-F5 is that functions F6-F7 appear to be simpler, due to their low dimension and smaller number of local minima.


Table 3: Results comparison of the different optimization algorithms over 20 independent runs.

Function  Criteria  GGSA        CS          ABC         PSO         GWO         PGWO
F1        Best      1.0133E-17  3.7450E+00  2.1820E+01  1.7964E-06  4.7447E-29  2.5422E-54
          Worst     1.0181E-16  2.4776E+01  1.8094E+02  2.2907E-04  2.2227E-26  7.9888E-53
          Mean      5.1858E-17  8.8292E+00  8.1908E+01  6.5181E-05  1.8269E-27  1.6276E-53
          Std       2.6953E-17  4.7633E+00  4.3834E+01  6.9891E-05  4.8884E-27  1.7834E-53
F2        Best      2.3974E+01  3.0920E+02  4.4296E+03  2.4432E+01  2.6182E+01  4.9374E+00
          Worst     3.4411E+02  2.2219E+03  4.5866E+04  4.1217E+02  2.8769E+01  1.5466E+01
          Mean      7.7481E+01  5.5445E+02  1.5193E+04  1.2465E+02  2.7507E+01  1.1692E+01
          Std       7.4751E+01  4.1835E+02  9.7239E+03  1.4004E+02  6.7488E-01  2.0844E+00
F3        Best      2.3591E-09  4.0632E+00  9.2131E+00  6.3889E-03  7.9048E-14  4.4409E-15
          Worst     9.8704E-09  1.0603E+01  1.6954E+01  4.9535E+00  1.5010E-13  4.4409E-15
          Mean      5.0646E-09  6.0677E+00  1.3090E+01  2.0423E+00  1.0498E-13  4.4409E-15
          Std       1.9095E-09  1.5421E+00  2.2882E+00  1.1464E+00  1.7485E-14  0
F4        Best      5.4733E-01  2.0863E+00  7.5357E-01  2.4473E-03  1.8626E-02  1.6432E-32
          Worst     4.6556E+00  4.1099E+00  5.5219E+00  3.6426E+00  1.2149E-01  1.6432E-32
          Mean      1.9728E+00  3.2711E+00  2.0544E+00  8.5739E-01  5.4300E-02  1.6432E-32
          Std       1.0488E+00  5.9739E-01  1.3358E+00  1.0326E+00  2.9934E-02  2.8080E-48
F5        Best      2.1024E-02  2.7015E+00  2.2507E+00  6.3080E-06  3.9444E-01  1.3498E-32
          Worst     2.4387E+01  1.2516E+01  1.1090E+02  3.6461E+00  8.5465E-01  1.9625E-01
          Mean      7.7881E+00  6.2644E+00  1.1124E+01  3.7285E-01  6.7323E-01  4.5918E-02
          Std       6.6733E+00  2.5624E+00  2.3792E+01  9.3480E-01  1.4330E-01  6.0835E-02
F6        Best      1.2358E-03  3.1550E-04  8.3768E-04  3.0749E-04  3.0750E-04  3.0752E-04
          Worst     1.2219E-02  6.0087E-04  3.4810E-03  1.9361E-03  2.0363E-02  1.5943E-03
          Mean      4.3558E-03  4.3266E-04  1.6421E-03  9.3309E-04  6.4067E-03  4.5499E-04
          Std       2.6046E-03  8.5397E-05  7.2141E-04  4.1733E-04  9.3762E-03  3.4265E-04
F7        Best      3.0000E+00  3.0000E+00  3.0000E+00  3.0000E+00  3.0000E+00  3.0000E+00
          Worst     3.0000E+00  3.0000E+00  3.0620E+00  3.0000E+00  8.4000E+01  3.0000E+00
          Mean      3.0000E+00  3.0000E+00  3.0136E+00  3.0000E+00  1.1100E+01  3.0000E+00
          Std       2.5794E-15  1.5717E-15  1.9505E-02  1.2310E-15  2.4931E+01  9.2458E-07

Table 4: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, and GGSA over the benchmark functions (p >= 0.05 indicates no statistically significant difference).

PGWO vs.  GWO       PSO       ABC       CS        GGSA
F1        6.80E-08  6.80E-08  6.80E-08  6.80E-08  6.80E-08
F2        6.80E-08  6.80E-08  6.80E-08  6.80E-08  6.80E-08
F3        7.66E-09  8.01E-09  8.01E-09  8.01E-09  8.01E-09
F4        8.01E-09  8.01E-09  8.01E-09  8.01E-09  8.01E-09
F5        5.97E-08  0.295606  5.97E-08  5.97E-08  4.67E-07
F6        0.003057  0.000375  1.38E-06  0.001782  1.06E-07
F7        0.0002    4.93E-08  6.80E-08  6.19E-08  6.49E-08

For F6, CS provided better results than the other algorithms across Mean, Std, and Worst. For F7, the majority of the algorithms can find the optimal solution, but PSO is more stable than the other algorithms in terms of Std.

The p values of Wilcoxon's rank-sum test in Table 4 show that the results of PGWO on F5 are not significantly better than those of PSO. However, PGWO achieves significant improvement on all remaining benchmark functions compared to the other algorithms. This is evidence that the proposed algorithm performs well on unimodal, multimodal, and fixed-dimension multimodal benchmark functions.

The convergence curves of the six algorithms are illustrated in Figure 1. As can be seen in these figures, PGWO has a much better convergence rate than the other algorithms on all benchmark functions.

According to this comprehensive comparative study and discussion, these results show that the proposed algorithm is able to significantly improve the performance of GWO and overcome its major shortcomings.



Figure 1: The convergence curves of the average fitness value over 20 independent runs on benchmark functions F1-F7 (average best-so-far versus iteration for GGSA, CS, ABC, PSO, GWO, and PGWO).

For that reason, in the next section we apply the PGWO algorithm to solve clustering problems.

5. Data Clustering

5.1. The Mathematical Description of Data Clustering. The clustering sample set is $X = \{X_i\}$, $i = 1, 2, \ldots, n$, where each $X_i$ is a p-dimensional vector. The clustering problem is to find a partition $C = \{C_1, C_2, \ldots, C_m\}$ satisfying the following conditions [26]:

(1) $X = \bigcup_{i=1}^{m} C_i$;
(2) $C_i \neq \emptyset$, $i = 1, 2, \ldots, m$;
(3) $C_i \cap C_j = \emptyset$, $i, j = 1, 2, \ldots, m$, $i \neq j$.

5.2. The Clustering Criterion. Clustering is the process of grouping a set of data objects into a number of clusters or groups. The aim of clustering is to make the data within a cluster highly similar, while being very dissimilar to objects in other clusters. Dissimilarities and similarities are evaluated based on the attributes of the data sets using a distance metric. The most popular distance metric is the Euclidean distance [55]. Let $i = (x_{i1}, x_{i2}, \ldots, x_{ip})$ and $j = (x_{j1}, x_{j2}, \ldots, x_{jp})$ be two objects described by p numeric attributes; the Euclidean distance between objects i and j is

$$d(i, j) = \sqrt{(x_{i1} - x_{j1})^2 + (x_{i2} - x_{j2})^2 + \cdots + (x_{ip} - x_{jp})^2}. \quad (5)$$

For given N objects, the clustering problem is to minimize the sum of Euclidean distances between the objects and their allocated cluster centers, assigning each object to one of k cluster centers [55]. Clustering thus aims at finding the cluster centers by minimizing the objective function, defined as follows [26]:

$$J_c = \sum_{k=1}^{m} \sum_{X_i \in C_k} d(X_i, Z_k), \quad (6)$$

where m indicates the number of clusters, $C_k$ indicates the kth cluster, $Z_k$ indicates the kth cluster center, and $d(X_i, Z_k)$ indicates the distance of the sample to the corresponding cluster center, namely, $d(X_i, Z_k) = \|X_i - Z_k\|$.

5.3. PGWO Algorithm on Clustering. In clustering analysis, each element in the data set is a p-dimensional vector. Moreover, the actual position of a grey wolf represents the k cluster centers, so each grey wolf encodes a (k x p)-dimensional vector. For each grey wolf i, its position is denoted as a vector $X_i = (x_{i1}, x_{i2}, \ldots, x_{i,k \times p})$. In the initialization phase, we utilize the maximum and minimum values of each component of the data set (which is to be grouped) as the initialization search scope of the grey wolves, and the initial solutions are randomly generated in this range. We use (6) to calculate the fitness of the grey wolf individuals; the main steps of the fitness function are shown in Algorithm 4.

5.4. Data Clustering Experimental Results and Discussion. In order to verify the performance of the proposed PGWO approach for clustering, we compare the results of the K-means, GGSA, CS, ABC, PSO, GWO, and PGWO clustering algorithms on nine different data sets, an artificial one and eight selected from the UCI machine learning repository [56].

Artificial data set (N = 600, d = 2, and k = 4) is a two-featured problem with four unique classes.


(1) For each data vector $x_i$
(2) Calculate the Euclidean distance by (5)
(3) Assign $x_i$ to the closest cluster center
(4) Calculate the measure function by (6)
(5) End For
(6) Return the value of the fitness function

Algorithm 4: Main steps of the fitness function.
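In code, decoding a wolf's position into k cluster centers and evaluating (5)-(6) might look as follows; this is a sketch with illustrative names (clustering_fitness), assuming the data set is an N x p NumPy array.

```python
import numpy as np

def clustering_fitness(position, data, k):
    """Decode a (k*p)-dimensional position vector into k centers and evaluate J_c, eq. (6)."""
    n, p = data.shape
    centers = position.reshape(k, p)          # each grey wolf encodes k cluster centers
    # Euclidean distances, eq. (5): shape (n, k) = every object versus every center
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    return float(np.sum(np.min(dists, axis=1)))   # each object joins its closest center
```

Passing lambda pos: clustering_fitness(pos, data, k) as the objective of the pgwo sketch above would reproduce the overall clustering procedure of Section 5.3.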

A total of 600 patterns were drawn from four independent bivariate normal distributions, where the classes were distributed according to

$$N_2\left(\mu = \binom{m_i}{0},\ \Sigma = \begin{bmatrix} 0.5 & 0.05 \\ 0.05 & 0.5 \end{bmatrix}\right), \quad (7)$$

where $i = 1, 2, 3, 4$, $m_1 = -3$, $m_2 = 0$, $m_3 = 3$, and $m_4 = 6$; $\mu$ and $\Sigma$ are the mean vector and covariance matrix, respectively [45, 57].
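A data set of this shape could be generated with NumPy as in the sketch below, under the means and covariance stated in (7); the variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
cov = np.array([[0.5, 0.05],
                [0.05, 0.5]])                 # shared covariance matrix Sigma
means = [-3.0, 0.0, 3.0, 6.0]                 # m_1, ..., m_4 from eq. (7)
# 150 points per class yields the 600-pattern, 4-class artificial data set
art = np.vstack([rng.multivariate_normal([m, 0.0], cov, size=150) for m in means])
```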

Iris data (N = 150, d = 4, and K = 3) is a data set with 150 random samples of flowers from the iris species setosa, versicolor, and virginica, collected by Anderson [58]. From each species there are 50 observations of sepal length, sepal width, petal length, and petal width in cm. This data set was used by Fisher [59] in his initiation of the linear-discriminant-function technique [28, 56, 57].

Wisconsin breast cancer (N = 683, d = 9, and K = 2) consists of 683 objects characterized by nine features: clump thickness, cell size uniformity, cell shape uniformity, marginal adhesion, single epithelial cell size, bare nuclei, bland chromatin, normal nucleoli, and mitoses. There are two categories in the data: malignant (444 objects) and benign (239 objects) [28, 56, 57].

Contraceptive method choice (N = 1473, d = 10, and K = 3), CMC for short, is a data set that is a subset of the 1987 National Indonesia Contraceptive Prevalence Survey. The samples are married women who either were not pregnant or did not know if they were at the time of interview. The problem is to predict the current contraceptive method choice (no use, long-term methods, or short-term methods) of a woman based on her demographic and socioeconomic characteristics [28, 56, 57].

Seeds data (N = 210, d = 7, and K = 3) is a data set that consists of 210 patterns belonging to three different varieties of wheat: Kama, Rosa, and Canadian. From each species there are 70 observations of area A, perimeter P, compactness C (C = 4*pi*A/P^2), length of kernel, width of kernel, asymmetry coefficient, and length of kernel groove [56].

Statlog (Heart) data (N = 270, d = 13, and K = 2) is a heart disease database similar to a database already present in the repository (heart disease databases), but in a slightly different form [56].

Wine data (N = 178, d = 13, and K = 3) is also taken from the UCI laboratory. These data are the results of a chemical analysis of wines grown in the same region in Italy but derived from three different cultivars. The analysis determined the quantities of 13 constituents found in each of the three types of wine. There are 178 instances with 13 numeric attributes in the wine data set. All attributes are continuous, and there are no missing attribute values [28, 56, 57].

Balance scale data (N = 625, d = 4, and K = 3) is a data set that was generated to model psychological experimental results. Each example is classified as having the balance scale tip to the right, tip to the left, or be balanced. The attributes are the left weight, the left distance, the right weight, and the right distance. The correct way to find the class is the greater of (left-distance * left-weight) and (right-distance * right-weight); if they are equal, it is balanced [56].

Haberman's Survival (N = 306, d = 3, and K = 2) is a data set that contains cases from a study conducted between 1958 and 1970 at the University of Chicago's Billings Hospital on the survival of patients who had undergone surgery for breast cancer. It records two survival statuses of patients, with the age of the patient at the time of operation, the patient's year of operation, and the number of positive axillary nodes detected [56].

The comparison of intracluster distances for the seven clustering algorithms over 20 independent runs is shown in Table 5. Table 6 reports the p values produced by Wilcoxon's rank-sum test comparing PGWO with the other algorithms over all the data sets.

For the Art data set, the optimum value, the worst value, and the average value of PGWO and GGSA are all 5.1390E+02, while the standard deviation of GGSA is better than that of PGWO. PSO only attains the optimum solution 5.1390E+02.

For the Iris, Cancer, and Seeds data sets, PGWO, PSO, and GGSA provide the optimum value in comparison to those obtained by the other methods. However, the worst value, the average value, and the standard deviation of PGWO are superior to those of the other methods.

For the Heart data set, PGWO, PSO, and CS all find the optimum solution 1.0623E+04, which means they can all find the global solution. However, PGWO is slightly better in terms of the worst value, the average value, and the standard deviation.

For the CMC and Wine data sets, Table 5 shows that the best, worst, average, and standard deviation values of the fitness function for the PGWO algorithm are much smaller than those of the other six methods. The PGWO clustering algorithm is capable of providing the same partition of the data points in all runs.

For the balance scale data set, the optimum values of the fitness function for PGWO, PSO, and GGSA are 1.4238E+03, which means they can all find the global solution. The optimum values of the fitness function for GWO and K-means are 1.4239E+03, close to those of PGWO, PSO, and GGSA. However, the standard deviation values of GWO, K-means, PGWO, PSO, GGSA, and CS are 6.8756E-01, 3.6565E+00, 8.6619E-01, 9.7396E-01, 1.6015E+00, and 7.3239E-01, respectively; from the standard deviation, we can see that GWO is more stable than the other methods.

For Haberman's Survival data set, the optimum value, the worst value, and the average value of the fitness function for CS and PSO are almost the same, and the results of the CS and PSO algorithms are better than those of the other methods.

The p values in Table 6 show that the results of PGWO are significantly better on the Art, Cancer, CMC, Seeds, Heart, and Wine data sets.


Table 5: Comparison of intracluster distances for the seven clustering algorithms over 20 independent runs.

Data set   Criteria  K-means     GGSA        CS          ABC         PSO         GWO         PGWO
Art        Best      5.1454E+02  5.1390E+02  5.1391E+02  5.1752E+02  5.1390E+02  5.1455E+02  5.1390E+02
           Worst     9.0474E+02  5.1390E+02  5.3213E+02  5.5298E+02  8.9248E+02  5.2058E+02  5.1390E+02
           Mean      6.0974E+02  5.1390E+02  5.1597E+02  5.2620E+02  5.3283E+02  5.1687E+02  5.1390E+02
           Std       1.6903E+02  1.5430E-13  4.8536E+00  9.7988E+00  8.4652E+01  1.8768E+00  6.1243E-08
Iris       Best      9.7190E+01  9.6655E+01  9.6659E+01  9.7616E+01  9.6655E+01  9.6835E+01  9.6655E+01
           Worst     1.2318E+02  1.1839E+02  9.7684E+01  1.0401E+02  1.2767E+02  1.2108E+02  9.6655E+01
           Mean      1.0226E+02  9.8865E+01  9.6862E+01  9.9710E+01  1.0596E+02  9.9903E+01  9.6655E+01
           Std       1.0290E+01  5.5454E+00  3.1103E-01  1.8460E+00  1.4578E+01  6.7482E+00  8.5869E-10
Cancer     Best      2.9763E+03  2.9644E+03  2.9645E+03  2.9887E+03  2.9644E+03  2.9645E+03  2.9644E+03
           Worst     2.9884E+03  3.0611E+03  2.9677E+03  3.2650E+03  4.7288E+03  2.9650E+03  2.9644E+03
           Mean      2.9830E+03  2.9715E+03  2.9651E+03  3.0760E+03  3.4055E+03  2.9647E+03  2.9644E+03
           Std       4.8278E+00  2.1284E+01  7.3222E-01  6.5283E+01  7.8385E+02  1.1452E-01  1.6118E-08
CMC        Best      5.7016E+03  5.6938E+03  5.7054E+03  5.8219E+03  5.6942E+03  5.8236E+03  5.6937E+03
           Worst     5.7053E+03  5.7111E+03  5.7762E+03  6.3743E+03  7.1580E+03  6.0911E+03  5.6937E+03
           Mean      5.7040E+03  5.6971E+03  5.7263E+03  6.0868E+03  5.7690E+03  5.9239E+03  5.6937E+03
           Std       1.1022E+00  3.7872E+00  2.1119E+01  1.7292E+02  3.2694E+02  9.0635E+01  9.8777E-04
Seeds      Best      3.1322E+02  3.1180E+02  3.1187E+02  3.1573E+02  3.1180E+02  3.1323E+02  3.1180E+02
           Worst     3.1373E+02  3.1354E+02  3.1746E+02  3.4997E+02  4.2035E+02  3.1782E+02  3.1180E+02
           Mean      3.1337E+02  3.1189E+02  3.1347E+02  3.3161E+02  3.2813E+02  3.1475E+02  3.1180E+02
           Std       2.3686E-01  3.8862E-01  1.6383E+00  1.0943E+01  3.9697E+01  1.5716E+00  1.7815E-09
Heart      Best      1.0682E+04  1.0749E+04  1.0623E+04  1.0683E+04  1.0623E+04  1.0637E+04  1.0623E+04
           Worst     1.0701E+04  1.2684E+04  1.0625E+04  1.1622E+04  1.0626E+04  1.0683E+04  1.0624E+04
           Mean      1.0693E+04  1.1542E+04  1.0624E+04  1.0920E+04  1.0624E+04  1.0657E+04  1.0623E+04
           Std       8.1672E+00  5.7044E+02  4.6239E-01  2.5921E+02  7.6866E-01  1.3817E+01  2.3235E-01
Wine       Best      1.6385E+04  1.6493E+04  1.6296E+04  1.6566E+04  1.6294E+04  1.6316E+04  1.6292E+04
           Worst     1.8437E+04  2.0245E+04  1.6311E+04  1.7668E+04  1.6312E+04  1.6371E+04  1.6294E+04
           Mean      1.6974E+04  1.7999E+04  1.6301E+04  1.7010E+04  1.6297E+04  1.6345E+04  1.6293E+04
           Std       8.6757E+02  1.1562E+03  4.5677E+00  3.6824E+02  4.0149E+00  1.4836E+01  5.2338E-01
Balance    Best      1.4239E+03  1.4238E+03  1.4256E+03  1.4265E+03  1.4238E+03  1.4239E+03  1.4238E+03
scale      Worst     1.4337E+03  1.4291E+03  1.4285E+03  1.4310E+03  1.4262E+03  1.4260E+03  1.4257E+03
           Mean      1.4275E+03  1.4259E+03  1.4268E+03  1.4282E+03  1.4248E+03  1.4243E+03  1.4245E+03
           Std       3.6565E+00  1.6015E+00  7.3239E-01  1.1831E+00  9.7396E-01  6.8756E-01  8.6619E-01
Haberman-  Best      2.6251E+03  2.5670E+03  2.5670E+03  2.5671E+03  2.5670E+03  2.5673E+03  2.5670E+03
Survival   Worst     3.1966E+03  2.6226E+03  2.5670E+03  2.5709E+03  2.5678E+03  2.6686E+03  2.5678E+03
           Mean      2.6554E+03  2.5702E+03  2.5670E+03  2.5679E+03  2.5671E+03  2.5898E+03  2.5673E+03
           Std       1.2740E+02  1.2354E+01  1.7590E-06  8.3107E-01  3.0624E-01  2.3346E+01  4.0711E-01

Table 6: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, GGSA, and K-means over the data sets (p >= 0.05 indicates no statistically significant difference).

PGWO vs.           GWO       PSO       ABC       CS        GGSA      K-means
Art                6.80E-08  1.60E-05  6.80E-08  6.80E-08  6.11E-08  4.49E-08
Iris               6.75E-08  1.19E-06  6.75E-08  6.75E-08  0.107346  5.17E-08
Cancer             6.76E-08  6.76E-08  6.76E-08  6.76E-08  1.20E-06  5.98E-08
CMC                6.69E-08  6.69E-08  6.69E-08  6.69E-08  6.69E-08  3.39E-08
Seeds              6.80E-08  6.80E-08  6.80E-08  6.80E-08  1.23E-03  4.81E-08
Heart              6.61E-08  2.02E-06  6.61E-08  5.87E-07  6.61E-08  6.07E-08
Wine               6.80E-08  1.23E-07  6.80E-08  6.80E-08  6.80E-08  6.63E-08
Balance scale      0.067868  0.007712  6.80E-08  1.06E-07  0.003966  6.01E-07
Haberman-Survival  6.90E-07  0.524948  1.78E-03  0.860418  0.090604  6.26E-08



Figure 2: Convergence curves of clustering on the data sets (Art, Iris, Cancer, CMC, Seeds, Heart, Wine, Balance scale, and Haberman's Survival) over 20 independent runs.

From Table 6 we further conclude that, for the Iris data set, PGWO performs superior to the other algorithms except for GGSA. Since the p value for the balance scale data set of PGWO versus GWO is more than 0.05, there is no statistical difference between the two. For Haberman's Survival data set, comparing PGWO with the other algorithms, PGWO performs significantly better in three out of the six pairwise comparisons. So it can be claimed that PGWO provides better results than the other algorithms across the majority of the data sets.

Figure 2 shows the convergence curves of clustering on the data sets over 20 independent runs. As can be seen from the figure, the convergence speed of PGWO is the fastest.

Figure 3 shows ANOVA tests of clustering on the data sets over 20 independent runs. As seen from Figure 3, PGWO is very stable for the majority of the data sets. For the Art, Seeds, and CMC data sets, PGWO, PSO, and GGSA obtain relatively stable optimal values. For the Heart and Wine data sets, the stability of PGWO, PSO, and CS is outstanding. For the Cancer data set, most of the algorithms obtain the stable optimal value, except for the ABC and PSO algorithms. For the Iris data set, we can clearly see that PGWO is better in terms of stability. For the balance scale data set, GWO obtains a relatively stable optimal value. For Haberman's Survival data set, the stability of CS and PSO is the best, but PGWO and GGSA follow them closely.

Clustering results of the Art, Iris, and Survival data sets obtained by the PGWO algorithm are presented in Figure 4, which visualizes them clearly. It can be seen from Figure 4 that the PGWO algorithm performs very well on the Art, Iris, and Survival data sets.

In summary, the results show that the proposed method successfully outperforms the other algorithms across the majority of the benchmark functions. Furthermore, the test results on clustering problems show that PGWO is able to provide very competitive results. Therefore, it appears from this comparative study that the proposed method has merit in the fields of evolutionary algorithms and optimization.



Figure 3: ANOVA tests of clustering on the data sets over 20 independent runs.


6. Conclusion and Future Works

In order to apply the grey wolf optimizer to solve complex optimization problems efficiently, this paper proposed a novel grey wolf optimizer based on Powell local optimization method, namely, PGWO. In PGWO, the original GWO algorithm is first applied to shrink the search region to a more promising area. Thereafter, Powell's method is implemented as a critical complement, performing the local search to exploit the limited area intensively and obtain better solutions. PGWO attempts to take advantage of the merits of both GWO and Powell's method in order to avoid all grey wolves getting trapped in inferior local optimal regions. PGWO enables the grey wolves to have more diverse exemplars to learn from, as the grey wolves are updated each generation and also form new grey wolves to search in a larger search space. With both techniques combined, PGWO can balance exploration and exploitation and effectively solve complex problems. The experimental results show the effectiveness of Powell's method in terms of solution quality and convergence speed. The proposed algorithm was benchmarked on seven well-known test functions, and the results were compared with GGSA, CS, ABC, PSO, and GWO; they show that the PGWO algorithm is capable of providing very competitive results compared to these famous metaheuristics. Because of the superior performance of the PGWO algorithm, we used it to solve clustering problems. The algorithm has been tested on an artificial data set and eight real data sets. To justify the performance of the PGWO algorithm on clustering problems, we compared it with the original GWO, GGSA, CS, ABC, PSO, and K-means. The results prove that the PGWO algorithm is able to significantly outperform the others on the majority of the data sets in terms of average value and standard deviation of the fitness function.



Figure 4: The original data distributions of the Art, Iris, and Survival data sets and the corresponding clustering results obtained by the PGWO algorithm.


Moreover, the experimental results demonstrate that the proposed PGWO algorithm can be considered a feasible and efficient method for solving optimization problems.

Our future work will focus on two issues. On one hand, we will apply the proposed PGWO algorithm to higher-dimensional problems and larger numbers of patterns. On the other hand, the PGWO clustering algorithm will be extended to dynamically determine the optimal number of clusters.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grants nos. 61165015, 61463007, and 61563008, and the Project of Guangxi University for Nationalities Science Foundation under Grant no. 2012MDZD037.

References

[1] J. H. Holland, "Genetic algorithms," Scientific American, vol. 267, no. 1, pp. 66–72, 1992.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, IEEE, Perth, Australia, November-December 1995.
[3] K. Price and R. Storn, "Differential evolution," Dr. Dobb's Journal, vol. 22, pp. 18–20, 1997.
[4] K. V. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, New York, NY, USA, 2005.
[5] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[6] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.
[7] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
[8] S. Mirjalili, S. M. Mirjalili, and A. Hatamlou, "Multi-verse optimizer: a nature-inspired algorithm for global optimization," Neural Computing and Applications, 2015.
[9] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.
[10] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459–471, 2007.
[11] X.-S. Yang, "Firefly algorithm, Levy flights and global optimization," in Research and Development in Intelligent Systems XXVI, pp. 209–218, Springer, London, UK, 2010.
[12] X.-S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210–214, IEEE, Coimbatore, India, December 2009.
[13] R. Rajabioun, "Cuckoo optimization algorithm," Applied Soft Computing, vol. 11, no. 8, pp. 5508–5518, 2011.
[14] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, "GSA: a gravitational search algorithm," Information Sciences, vol. 179, no. 13, pp. 2232–2248, 2009.
[15] S. Mirjalili and A. Lewis, "Adaptive gbest-guided gravitational search algorithm," Neural Computing and Applications, vol. 25, no. 7, pp. 1569–1584, 2014.
[16] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.
[17] X. H. Song, L. Tang, S. T. Zhao et al., "Grey Wolf Optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.
[18] G. M. Komaki and V. Kayvanfar, "Grey Wolf Optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.
[19] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.
[20] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using Grey Wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.
[21] O. Kramer, "Iterated local search with Powell's method: a memetic algorithm for continuous global optimization," Memetic Computing, vol. 2, no. 1, pp. 69–83, 2010.
[22] M. J. D. Powell, "Restart procedures for the conjugate gradient method," Mathematical Programming, vol. 12, no. 1, pp. 241–254, 1977.
[23] I. E. Evangelou, D. G. Hadjimitsis, A. A. Lazakidou, and C. Clayton, "Data mining and knowledge discovery in complex image data using artificial neural networks," in Proceedings of the Workshop on Complex Reasoning on Geographical Data, Paphos, Cyprus, 2001.
[24] M. S. Kamel and S. Z. Selim, "New algorithms for solving the fuzzy clustering problem," Pattern Recognition, vol. 27, no. 3, pp. 421–428, 1994.
[25] M. Omran, A. Salman, and A. P. Engelbrecht, "Image classification using particle swarm optimization," in Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning, Singapore, 2002.
[26] X. J. Lei, Swarm Intelligent Optimization Algorithms and Their Applications, Science Press, 2012.
[27] A. K. Jain, "Data clustering: 50 years beyond K-means," Pattern Recognition Letters, vol. 31, no. 8, pp. 651–666, 2010.
[28] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.
[29] B. Zhang, M. Hsu, and U. Dayal, "K-harmonic means - a data clustering algorithm," Hewlett-Packard Labs Technical Report HPL, 1999.
[30] S. Z. Selim and K. Alsultan, "A simulated annealing algorithm for the clustering problem," Pattern Recognition, vol. 24, no. 10, pp. 1003–1008, 1991.
[31] K. S. Al-Sultan, "A tabu search approach to the clustering problem," Pattern Recognition, vol. 28, no. 9, pp. 1443–1451, 1995.
[32] C. S. Sung and H. W. Jin, "A tabu-search-based heuristic for clustering," Pattern Recognition, vol. 33, no. 5, pp. 849–858, 2000.
[33] M. C. Cowgill, R. J. Harvey, and L. T. Watson, "A genetic algorithm approach to cluster analysis," Computers and Mathematics with Applications, vol. 37, no. 7, pp. 99–108, 1999.
[34] K. Krishna and M. N. Murty, "Genetic K-means algorithm," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 29, no. 3, pp. 433–439, 1999.
[35] U. Maulik and S. Bandyopadhyay, "Genetic algorithm-based clustering technique," Pattern Recognition, vol. 33, no. 9, pp. 1455–1465, 2000.
[36] C. A. Murthy and N. Chowdhury, "In search of optimal clusters using genetic algorithms," Pattern Recognition Letters, vol. 17, no. 8, pp. 825–832, 1996.
[37] M. Fathian, B. Amiri, and A. Maroosi, "Application of honey-bee mating optimization algorithm on clustering," Applied Mathematics and Computation, vol. 190, no. 2, pp. 1502–1513, 2007.
[38] D. W. van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215–220, IEEE, Canberra, Australia, December 2003.
[39] A. Hatamlou, S. Abdullah, and M. Hatamlou, "Data clustering using big bang-big crunch algorithm," in Innovative Computing Technology, vol. 241 of Communications in Computer and Information Science, pp. 383–388, Springer, 2011.
[40] D. Karaboga and C. Ozturk, "A novel clustering approach: artificial Bee Colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652–657, 2011.
[41] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "Application of gravitational search algorithm on data clustering," in Rough Sets and Knowledge Technology, vol. 6954 of Lecture Notes in Computer Science, pp. 337–346, Springer, Berlin, Germany, 2011.
[42] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "A combined approach for clustering based on K-means and gravitational search algorithms," Swarm and Evolutionary Computation, vol. 6, pp. 47–52, 2012.
[43] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187–195, 2004.
[44] Y. Kao and K. Cheng, "An ACO-based clustering algorithm," in Ant Colony Optimization and Swarm Intelligence, vol. 4150 of Lecture Notes in Computer Science, pp. 340–347, Springer, Berlin, Germany, 2006.
[45] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754–1762, 2008.
[46] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512–519, 2009.
[47] W.-F. Gao, S.-Y. Liu, and L.-L. Huang, "A novel artificial bee colony algorithm with Powell's method," Applied Soft Computing, vol. 13, no. 9, pp. 3763–3775, 2013.
[48] H. R. Lourenco, O. Martin, and T. Stutzle, "A beginner's introduction to iterated local search," in Proceedings of the 4th Metaheuristics International Conference (MIC '01), vol. 2, pp. 1–6, Porto, Portugal, 2001.
[49] T. G. Stutzle, Local Search Algorithms for Combinatorial Problems: Analysis, Improvements, and New Applications, vol. 220 of DISKI Dissertations on Artificial Intelligence, Infix Publishers, Sankt Augustin, Germany, 1999.
[50] X. S. Yang, Ed., Test Problems in Optimization: An Introduction with Metaheuristic Applications, Wiley, London, UK, 2010.
[51] M. Molga and C. Smutnicki, "Test functions for optimization needs," 2005, http://www.zsd.ict.pwr.wroc.pl/files/docs/functions.pdf.
[52] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.
[53] S. García, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 special session on real parameter optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617–644, 2009.
[54] F. Wilcoxon, "Individual comparisons by ranking methods," Biometrics Bulletin, vol. 1, no. 6, pp. 80–83, 1945.
[55] J. W. Han, M. Kamber, and J. Pei, Data Mining: Concepts and Techniques, Morgan Kaufmann Publishers, 2011.
[56] C. Blake and C. J. Merz, "UCI Repository of Machine Learning Databases," 1998.
[57] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183–197, 2010.
[58] E. Anderson, "The irises of the Gaspe Peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2–5, 1935.
[59] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, no. 2, pp. 179–188, 1936.

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 5: Research Article Grey Wolf Optimizer Based on Powell Local ...downloads.hindawi.com/journals/ddns/2015/481360.pdf · Research Article Grey Wolf Optimizer Based on Powell Local Optimization

Discrete Dynamics in Nature and Society 5

Table 2 Initial parameters of algorithms

Algorithm Parameter Value

PGWO

Population size 30 Linearly decreased from 2 to 0

Performing probability of Powell (119875) 05Max iteration 500 (benchmark functions test) 200 (data sets test)

Stopping criteria Max iteration

GWO

Population size 30 Linearly decreased from 2 to 0

Max iteration 500 (functions test) 200 (data sets test)Stopping criteria Max iteration

PSO

Population size 301198621 1198622 14962 14962119882 07298

Max iteration 500 (benchmark functions test) 200 (data sets test)Stopping criteria Max iteration

ABC

Population size 30Limit 10

Max iteration 500 (benchmark functions test) 200 (data sets test)Stopping criteria Max iteration

CS

Population size 30119901119886 025

Max iteration 500 (benchmark functions test) 200 (data sets test)Stopping criteria Max iteration

GGSA

Population size 301198881015840

1(minus211990531198793) + 2

1198881015840

2(211990531198793)

1198660 1120572 20

Max iteration 500 (benchmark functions test) 200 (data sets test)Stopping criteria Max iteration

119879 indicates the maximum number of interactions 119905 is the current iteration

benchmark functionsThere are some parameters that shouldbe initialized before running Table 2 is the initial value ofthese algorithms

The experimental results are presented in Table 3 Theresults are averaged over 20 independent runs and boldresults mean that PGWO is better while underlined resultsmean that the other algorithm is better The Best WorstMean and Std represent the optimal fitness value worstfitness value mean fitness value and standard deviationrespectively Note that the Matlab code of the GGSA algo-rithm is given in httpwwwalimirjalilicomProjectshtml

To improve the performance evaluation of evolutionaryalgorithms statistical tests should be conducted [52] In orderto determine whether the results of PGWO differ from thebest results of the other algorithms in a statistical method anonparametric test which is known as Wilcoxonrsquos rank-sumtest [53 54] is performed at 5 significance levelThe119901 valuescalculated in Wilcoxonrsquos rank-sum test comparing PGWOand the other algorithms over all the benchmark functionsare given in Table 4 According to [52] 119901 values lt 005 can beconsidered as sufficient evidence against the null hypothesis

43 Comparison of Experiment Results As shown in Table 3PGWO has the best result for 1198651 For 1198652 PGWO providedbetter results than the other algorithms across Best Worstand Mean while the Std of GWO is better than PGWOTheunimodal benchmark functions have only one global solutionwithout any local optima so they are very suitable to examineexploitation Hence these results indicate that the proposedmethod provides greatly improved exploitation compared tothe other algorithms

However it should be noticed that multimodal bench-mark functions have many local minima The final resultsare more important because these functions can reflect theability of the algorithm escaping from poor local optima andobtaining the global optimum We have tested the experi-ments on 1198653ndash1198655 Seen from Table 3 the PGWO provides thebest results

For 1198656-1198657 these are fixed-dimension multimodal bench-mark functions with only a few local minima the dimensionsof the functions are also small In this case it is hard tojudge the property of the algorithms The major differencecomparedwith functions1198653ndash1198655 is that functions1198656-1198657 appear

6 Discrete Dynamics in Nature and Society

Table 3: Results comparison of different optimization algorithms for 20 independent runs.

Function  Criteria  GGSA        CS          ABC         PSO         GWO         PGWO
F1        Best      1.0133E-17  3.7450E+00  2.1820E+01  1.7964E-06  4.7447E-29  2.5422E-54
          Worst     1.0181E-16  2.4776E+01  1.8094E+02  2.2907E-04  2.2227E-26  7.9888E-53
          Mean      5.1858E-17  8.8292E+00  8.1908E+01  6.5181E-05  1.8269E-27  1.6276E-53
          Std       2.6953E-17  4.7633E+00  4.3834E+01  6.9891E-05  4.8884E-27  1.7834E-53
F2        Best      2.3974E+01  3.0920E+02  4.4296E+03  2.4432E+01  2.6182E+01  4.9374E+00
          Worst     3.4411E+02  2.2219E+03  4.5866E+04  4.1217E+02  2.8769E+01  1.5466E+01
          Mean      7.7481E+01  5.5445E+02  1.5193E+04  1.2465E+02  2.7507E+01  1.1692E+01
          Std       7.4751E+01  4.1835E+02  9.7239E+03  1.4004E+02  6.7488E-01  2.0844E+00
F3        Best      2.3591E-09  4.0632E+00  9.2131E+00  6.3889E-03  7.9048E-14  4.4409E-15
          Worst     9.8704E-09  1.0603E+01  1.6954E+01  4.9535E+00  1.5010E-13  4.4409E-15
          Mean      5.0646E-09  6.0677E+00  1.3090E+01  2.0423E+00  1.0498E-13  4.4409E-15
          Std       1.9095E-09  1.5421E+00  2.2882E+00  1.1464E+00  1.7485E-14  0
F4        Best      5.4733E-01  2.0863E+00  7.5357E-01  2.4473E-03  1.8626E-02  1.6432E-32
          Worst     4.6556E+00  4.1099E+00  5.5219E+00  3.6426E+00  1.2149E-01  1.6432E-32
          Mean      1.9728E+00  3.2711E+00  2.0544E+00  8.5739E-01  5.4300E-02  1.6432E-32
          Std       1.0488E+00  5.9739E-01  1.3358E+00  1.0326E+00  2.9934E-02  2.8080E-48
F5        Best      2.1024E-02  2.7015E+00  2.2507E+00  6.3080E-06  3.9444E-01  1.3498E-32
          Worst     2.4387E+01  1.2516E+01  1.1090E+02  3.6461E+00  8.5465E-01  1.9625E-01
          Mean      7.7881E+00  6.2644E+00  1.1124E+01  3.7285E-01  6.7323E-01  4.5918E-02
          Std       6.6733E+00  2.5624E+00  2.3792E+01  9.3480E-01  1.4330E-01  6.0835E-02
F6        Best      1.2358E-03  3.1550E-04  8.3768E-04  3.0749E-04  3.0750E-04  3.0752E-04
          Worst     1.2219E-02  6.0087E-04  3.4810E-03  1.9361E-03  2.0363E-02  1.5943E-03
          Mean      4.3558E-03  4.3266E-04  1.6421E-03  9.3309E-04  6.4067E-03  4.5499E-04
          Std       2.6046E-03  8.5397E-05  7.2141E-04  4.1733E-04  9.3762E-03  3.4265E-04
F7        Best      3.0000E+00  3.0000E+00  3.0000E+00  3.0000E+00  3.0000E+00  3.0000E+00
          Worst     3.0000E+00  3.0000E+00  3.0620E+00  3.0000E+00  8.4000E+01  3.0000E+00
          Mean      3.0000E+00  3.0000E+00  3.0136E+00  3.0000E+00  1.1100E+01  3.0000E+00
          Std       2.5794E-15  1.5717E-15  1.9505E-02  1.2310E-15  2.4931E+01  9.2458E-07

Table 4: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, and GGSA over the benchmark functions (p >= 0.05 have been underlined).

PGWO vs.  GWO       PSO       ABC       CS        GGSA
F1        6.80E-08  6.80E-08  6.80E-08  6.80E-08  6.80E-08
F2        6.80E-08  6.80E-08  6.80E-08  6.80E-08  6.80E-08
F3        7.66E-09  8.01E-09  8.01E-09  8.01E-09  8.01E-09
F4        8.01E-09  8.01E-09  8.01E-09  8.01E-09  8.01E-09
F5        5.97E-08  0.295606  5.97E-08  5.97E-08  4.67E-07
F6        0.003057  0.000375  1.38E-06  0.001782  1.06E-07
F7        0.0002    4.93E-08  6.80E-08  6.19E-08  6.49E-08

For F6, CS provided better results than the other algorithms in terms of Mean, Std, and Worst. For F7, the majority of the algorithms can find the optimal solution, but PSO is more stable than the other algorithms in terms of Std.

The p values of Wilcoxon's rank-sum test in Table 4 show that the results of PGWO on F5 are not significantly better than those of PSO. However, PGWO achieves a significant improvement over the other algorithms on all the remaining benchmark functions. This is evidence that the proposed algorithm performs well in dealing with unimodal, multimodal, and fixed-dimension multimodal benchmark functions.

The convergence curves of the six algorithms are illustrated in Figure 1. As can be seen in these figures, PGWO has a much better convergence rate than the other algorithms on all benchmark functions.

According to this comprehensive comparative study and discussion, the results show that the proposed algorithm is able to significantly improve the performance of GWO and overcome its major shortcomings. For that reason, in the next section we apply the PGWO algorithm to solve clustering problems.

[Figure 1: The convergence curves of the average fitness value over 20 independent runs on F1–F7. Each panel plots the average best score so far (log-scaled in most panels) over iterations 0–500 for GGSA, CS, ABC, PSO, GWO, and PGWO.]


5. Data Clustering

5.1. The Mathematical Description of Data Clustering. The clustering sample set is X = {X_i, i = 1, 2, ..., n}, where each X_i is a p-dimensional vector. The clustering problem is to find a partition C = {C_1, C_2, ..., C_m} satisfying the following conditions [26]:

(1) $X = \bigcup_{i=1}^{m} C_i$;
(2) $C_i \neq \emptyset$, $i = 1, 2, \ldots, m$;
(3) $C_i \cap C_j = \emptyset$, $i, j = 1, 2, \ldots, m$, $i \neq j$.

5.2. The Clustering Criterion. Clustering is the process of grouping a set of data objects into a number of clusters or groups. The aim of clustering is to make the data within a cluster highly similar while being very dissimilar to objects in other clusters. Dissimilarities and similarities are evaluated based on the attributes of the data set using a distance metric, the most popular of which is the Euclidean distance [55]. Let i = (x_{i1}, x_{i2}, ..., x_{ip}) and j = (x_{j1}, x_{j2}, ..., x_{jp}) be two objects described by p numeric attributes; the Euclidean distance between objects i and j is

$$ d(i, j) = \sqrt{(x_{i1} - x_{j1})^2 + (x_{i2} - x_{j2})^2 + \cdots + (x_{ip} - x_{jp})^2 }. \quad (5) $$

For N given objects, the clustering problem is to minimize the sum of squared Euclidean distances between the objects and their cluster centers, allocating each object to one of k cluster centers [55]. Clustering thus aims at finding the cluster centers by minimizing the objective function, defined as follows [26]:

$$ J_c = \sum_{k=1}^{m} \sum_{X_i \in C_k} d(X_i, Z_k), \quad (6) $$

where m indicates the number of clusters, C_k indicates the kth cluster, Z_k indicates the kth cluster center, and d(X_i, Z_k) indicates the distance of a sample to its corresponding cluster center, namely d(X_i, Z_k) = ||X_i - Z_k||.

5.3. PGWO Algorithm on Clustering. In clustering analysis, each element in the data set is a p-dimensional vector. Moreover, the position of a grey wolf represents the k cluster centers, so each grey wolf encodes a (k*p)-dimensional vector. For each grey wolf i, its position is denoted as a vector X_i = (x_{i1}, x_{i2}, ..., x_{i,k*p}). In the initialization phase, we utilize the maximum and minimum values of each component of the data set (which is to be grouped) as the initialization search scope of the grey wolves in the PGWO algorithm, and the initial solutions are randomly generated within this range. We use (6) to calculate the fitness of the grey wolf individuals; the main steps of the fitness function are shown in Algorithm 4.

5.4. Data Clustering Experimental Results and Discussion. In order to verify the performance of the proposed PGWO approach for clustering, we compare the results of the K-means, GGSA, CS, ABC, PSO, GWO, and PGWO clustering algorithms using nine different data sets selected from the UCI machine learning repository [56].

Artificial data set (N = 600, d = 2, and k = 4) is a two-featured problem with four unique classes. A total of 600 patterns were drawn from four independent bivariate normal distributions, with classes distributed according to (7) below.


(1) For each data vector x_i
(2)   Calculate the Euclidean distance to each cluster center by (5)
(3)   Assign x_i to the closest cluster center
(4)   Calculate the measure function by (6)
(5) End For
(6) Return the value of the fitness function

Algorithm 4: Main steps of the fitness function.
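The following is a minimal NumPy sketch of this fitness evaluation; it is illustrative rather than the authors' implementation, and the function name clustering_fitness and its arguments are ours:

    import numpy as np

    def clustering_fitness(wolf_position, data, k):
        # Objective J_c of (6): the sum of Euclidean distances from each
        # data vector to its nearest cluster center.
        n, p = data.shape
        centers = wolf_position.reshape(k, p)  # decode the flat k*p vector into k centers
        # Pairwise point-to-center Euclidean distances, as in (5).
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        return dists.min(axis=1).sum()         # assign each point to its closest center

Under the initialization described in Section 5.3, a wolf could, for instance, be drawn as rng.uniform(data.min(axis=0), data.max(axis=0), size=(k, p)).ravel() for a NumPy random generator rng.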

$$ N_2\!\left( \mu = \begin{pmatrix} m_i \\ 0 \end{pmatrix},\ \Sigma = \begin{bmatrix} 0.5 & 0.05 \\ 0.05 & 0.5 \end{bmatrix} \right), \quad (7) $$

where i = 1, 2, 3, 4; m_1 = -3, m_2 = 0, m_3 = 3, and m_4 = 6; and μ and Σ are the mean vector and covariance matrix, respectively [45, 57].
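A minimal sketch of how such a data set can be generated, assuming NumPy (the seed and variable names are ours):

    import numpy as np

    rng = np.random.default_rng(seed=0)
    means = [-3, 0, 3, 6]                  # m_1, ..., m_4 from (7)
    cov = np.array([[0.50, 0.05],
                    [0.05, 0.50]])         # shared covariance matrix Sigma
    # 150 samples per class give the 600-pattern, four-class artificial set.
    art = np.vstack([rng.multivariate_normal([m, 0.0], cov, size=150)
                     for m in means])
    labels = np.repeat(np.arange(4), 150)  # ground-truth class labels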

Iris data (N = 150, d = 4, and K = 3) is a data set with 150 random samples of flowers from the Iris species setosa, versicolor, and virginica, collected by Anderson [58]. From each species there are 50 observations for sepal length, sepal width, petal length, and petal width in cm. This data set was used by Fisher [59] in his initiation of the linear-discriminant-function technique [28, 56, 57].

Wisconsin breast cancer (N = 683, d = 9, and K = 2) consists of 683 objects characterized by nine features: clump thickness, cell size uniformity, cell shape uniformity, marginal adhesion, single epithelial cell size, bare nuclei, bland chromatin, normal nucleoli, and mitoses. There are two categories in the data: malignant (444 objects) and benign (239 objects) [28, 56, 57].

Contraceptive method choice (N = 1473, d = 10, and K = 3), CMC for short, is a data set that is a subset of the 1987 National Indonesia Contraceptive Prevalence Survey. The samples are married women who either were not pregnant or did not know if they were at the time of the interview. The problem is to predict the current contraceptive method choice (no use, long-term methods, or short-term methods) of a woman based on her demographic and socioeconomic characteristics [28, 56, 57].

Seeds data (N = 210, d = 7, and K = 3) is a data set that consists of 210 patterns belonging to three different varieties of wheat: Kama, Rosa, and Canadian. From each variety there are 70 observations for area A, perimeter P, compactness C (C = 4πA/P²), length of kernel, width of kernel, asymmetry coefficient, and length of kernel groove [56].

Statlog (Heart) data (N = 270, d = 13, and K = 2) is a heart disease database similar to a database already present in the repository (heart disease databases) but in a slightly different form [56].

Wine data (N = 178, d = 13, and K = 3) is also taken from the UCI laboratory. These data are the results of a chemical analysis of wines grown in the same region of Italy but derived from three different cultivars. The analysis determined the quantities of 13 constituents found in each of the three types of wines. There are 178 instances with 13 numeric attributes in the wine data set. All attributes are continuous, and there are no missing attribute values [28, 56, 57].

Balance scale data (N = 625, d = 4, and K = 3) is a data set that was generated to model psychological experimental results. Each example is classified as having the balance scale tip to the right, tip to the left, or be balanced. The attributes are the left weight, the left distance, the right weight, and the right distance. The correct way to find the class is the greater of (left-distance * left-weight) and (right-distance * right-weight); if they are equal, it is balanced [56].

Haberman's Survival (N = 306, d = 3, and K = 2) is a data set that contains cases from a study that was conducted between 1958 and 1970 at the University of Chicago's Billings Hospital on the survival of patients who had undergone surgery for breast cancer. It records two survival statuses of patients, together with the age of the patient at the time of operation, the patient's year of operation, and the number of positive axillary nodes detected [56].

The comparison of intracluster distances for the seven clustering algorithms over 20 independent runs is shown in Table 5. Table 6 reports the p values produced by Wilcoxon's rank-sum test comparing PGWO with the other algorithms over all the data sets.

For the Art data set, the optimum value, the worst value, and the average value of PGWO and GGSA are all 5.1390E+02, while the standard deviation of GGSA is better than that of PGWO. PSO only attains the optimum solution 5.1390E+02.

For the Iris, Cancer, and Seeds data sets, PGWO, PSO, and GGSA provide the optimum value in comparison with the other methods. However, the worst value, the average value, and the standard deviation of PGWO are superior to those of the other methods.

For the Heart data set, PGWO, PSO, and CS all find the optimum solution 1.0623E+04, which means they all can find the global solution. However, PGWO is slightly better in terms of the worst value, the average value, and the standard deviation.

For the CMC and Wine data sets, Table 5 shows that the average, best, worst, and standard deviation values of the fitness function for the PGWO algorithm are much smaller than those of the other six methods. The PGWO clustering algorithm is capable of providing the same partition of the data points in all runs.

For the balance scale data set, the optimum value of the fitness function for PGWO, PSO, and GGSA is 1.4238E+03, which means they all can find the global solution. The optimum value of the fitness function for GWO and K-means is 1.4239E+03, a result close to that of PGWO, PSO, and GGSA. However, the standard deviation values of GWO, K-means, PGWO, PSO, GGSA, and CS are 6.8756E-01, 3.6565E+00, 8.6619E-01, 9.7396E-01, 1.6015E+00, and 7.3239E-01, respectively. From the standard deviations we can see that GWO is more stable than the other methods.

For the Haberman's Survival data set, the optimum value, the worst value, and the average value of the fitness function for CS and PSO are almost the same, and the results of the CS and PSO algorithms are better than those of the other methods.

The p values in Table 6 show that the results of PGWO are significantly better on the Art, Cancer, CMC, Seeds, Heart, and Wine data sets.


Table 5: Comparison of intracluster distances for the seven clustering algorithms over 20 independent runs.

Data set           Criteria  K-means     GGSA        CS          ABC         PSO         GWO         PGWO
Art                Best      5.1454E+02  5.1390E+02  5.1391E+02  5.1752E+02  5.1390E+02  5.1455E+02  5.1390E+02
                   Worst     9.0474E+02  5.1390E+02  5.3213E+02  5.5298E+02  8.9248E+02  5.2058E+02  5.1390E+02
                   Mean      6.0974E+02  5.1390E+02  5.1597E+02  5.2620E+02  5.3283E+02  5.1687E+02  5.1390E+02
                   Std       1.6903E+02  1.5430E-13  4.8536E+00  9.7988E+00  8.4652E+01  1.8768E+00  6.1243E-08
Iris               Best      9.7190E+01  9.6655E+01  9.6659E+01  9.7616E+01  9.6655E+01  9.6835E+01  9.6655E+01
                   Worst     1.2318E+02  1.1839E+02  9.7684E+01  1.0401E+02  1.2767E+02  1.2108E+02  9.6655E+01
                   Mean      1.0226E+02  9.8865E+01  9.6862E+01  9.9710E+01  1.0596E+02  9.9903E+01  9.6655E+01
                   Std       1.0290E+01  5.5454E+00  3.1103E-01  1.8460E+00  1.4578E+01  6.7482E+00  8.5869E-10
Cancer             Best      2.9763E+03  2.9644E+03  2.9645E+03  2.9887E+03  2.9644E+03  2.9645E+03  2.9644E+03
                   Worst     2.9884E+03  3.0611E+03  2.9677E+03  3.2650E+03  4.7288E+03  2.9650E+03  2.9644E+03
                   Mean      2.9830E+03  2.9715E+03  2.9651E+03  3.0760E+03  3.4055E+03  2.9647E+03  2.9644E+03
                   Std       4.8278E+00  2.1284E+01  7.3222E-01  6.5283E+01  7.8385E+02  1.1452E-01  1.6118E-08
CMC                Best      5.7016E+03  5.6938E+03  5.7054E+03  5.8219E+03  5.6942E+03  5.8236E+03  5.6937E+03
                   Worst     5.7053E+03  5.7111E+03  5.7762E+03  6.3743E+03  7.1580E+03  6.0911E+03  5.6937E+03
                   Mean      5.7040E+03  5.6971E+03  5.7263E+03  6.0868E+03  5.7690E+03  5.9239E+03  5.6937E+03
                   Std       1.1022E+00  3.7872E+00  2.1119E+01  1.7292E+02  3.2694E+02  9.0635E+01  9.8777E-04
Seeds              Best      3.1322E+02  3.1180E+02  3.1187E+02  3.1573E+02  3.1180E+02  3.1323E+02  3.1180E+02
                   Worst     3.1373E+02  3.1354E+02  3.1746E+02  3.4997E+02  4.2035E+02  3.1782E+02  3.1180E+02
                   Mean      3.1337E+02  3.1189E+02  3.1347E+02  3.3161E+02  3.2813E+02  3.1475E+02  3.1180E+02
                   Std       2.3686E-01  3.8862E-01  1.6383E+00  1.0943E+01  3.9697E+01  1.5716E+00  1.7815E-09
Heart              Best      1.0682E+04  1.0749E+04  1.0623E+04  1.0683E+04  1.0623E+04  1.0637E+04  1.0623E+04
                   Worst     1.0701E+04  1.2684E+04  1.0625E+04  1.1622E+04  1.0626E+04  1.0683E+04  1.0624E+04
                   Mean      1.0693E+04  1.1542E+04  1.0624E+04  1.0920E+04  1.0624E+04  1.0657E+04  1.0623E+04
                   Std       8.1672E+00  5.7044E+02  4.6239E-01  2.5921E+02  7.6866E-01  1.3817E+01  2.3235E-01
Wine               Best      1.6385E+04  1.6493E+04  1.6296E+04  1.6566E+04  1.6294E+04  1.6316E+04  1.6292E+04
                   Worst     1.8437E+04  2.0245E+04  1.6311E+04  1.7668E+04  1.6312E+04  1.6371E+04  1.6294E+04
                   Mean      1.6974E+04  1.7999E+04  1.6301E+04  1.7010E+04  1.6297E+04  1.6345E+04  1.6293E+04
                   Std       8.6757E+02  1.1562E+03  4.5677E+00  3.6824E+02  4.0149E+00  1.4836E+01  5.2338E-01
Balance scale      Best      1.4239E+03  1.4238E+03  1.4256E+03  1.4265E+03  1.4238E+03  1.4239E+03  1.4238E+03
                   Worst     1.4337E+03  1.4291E+03  1.4285E+03  1.4310E+03  1.4262E+03  1.4260E+03  1.4257E+03
                   Mean      1.4275E+03  1.4259E+03  1.4268E+03  1.4282E+03  1.4248E+03  1.4243E+03  1.4245E+03
                   Std       3.6565E+00  1.6015E+00  7.3239E-01  1.1831E+00  9.7396E-01  6.8756E-01  8.6619E-01
Haberman-Survival  Best      2.6251E+03  2.5670E+03  2.5670E+03  2.5671E+03  2.5670E+03  2.5673E+03  2.5670E+03
                   Worst     3.1966E+03  2.6226E+03  2.5670E+03  2.5709E+03  2.5678E+03  2.6686E+03  2.5678E+03
                   Mean      2.6554E+03  2.5702E+03  2.5670E+03  2.5679E+03  2.5671E+03  2.5898E+03  2.5673E+03
                   Std       1.2740E+02  1.2354E+01  1.7590E-06  8.3107E-01  3.0624E-01  2.3346E+01  4.0711E-01

Table 6: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, GGSA, and K-means over the data sets (p >= 0.05 have been underlined).

PGWO vs.           GWO       PSO       ABC       CS        GGSA      K-means
Art                6.80E-08  1.60E-05  6.80E-08  6.80E-08  6.11E-08  4.49E-08
Iris               6.75E-08  1.19E-06  6.75E-08  6.75E-08  0.107346  5.17E-08
Cancer             6.76E-08  6.76E-08  6.76E-08  6.76E-08  1.20E-06  5.98E-08
CMC                6.69E-08  6.69E-08  6.69E-08  6.69E-08  6.69E-08  3.39E-08
Seeds              6.80E-08  6.80E-08  6.80E-08  6.80E-08  1.23E-03  4.81E-08
Heart              6.61E-08  2.02E-06  6.61E-08  5.87E-07  6.61E-08  6.07E-08
Wine               6.80E-08  1.23E-07  6.80E-08  6.80E-08  6.80E-08  6.63E-08
Balance scale      0.067868  0.007712  6.80E-08  1.06E-07  0.003966  6.01E-07
Haberman-Survival  6.90E-07  0.524948  1.78E-03  0.860418  0.090604  6.26E-08

[Figure 2: Convergence curves of clustering on the data sets over 20 independent runs. Panels: Art, Iris, Cancer, CMC, Seeds, Heart, Wine, balance scale, and Haberman's Survival; each plots the average best score so far over iterations 0–200 for GGSA, CS, ABC, PSO, GWO, and PGWO.]

From Table 6 we also conclude that, for the Iris data set, PGWO performs superior to all the other algorithms except GGSA. Since the p value for the balance scale data set of PGWO versus GWO is greater than 0.05, there is no statistical difference between the two. For the Haberman's Survival data set, comparing PGWO with the other algorithms, we can conclude that PGWO performs significantly better in three of the six comparisons. So it can be claimed that PGWO provides better results than the other algorithms across the majority of the data sets.

Figure 2 shows the convergence curves of clustering on the data sets over 20 independent runs. As can be seen from the figure, the convergence speed of PGWO is the fastest.

Figure 3 shows ANOVA tests of clustering on the data sets over 20 independent runs. As seen from Figure 3, PGWO is very stable for the majority of the data sets. For the Art, Seeds, and CMC data sets, PGWO, PSO, and GGSA can obtain relatively stable optimal values. For the Heart and Wine data sets, the stability of PGWO, PSO, and CS is outstanding. For the Cancer data set, most of the algorithms can obtain the stable optimal value except for the ABC and PSO algorithms. For the Iris data set, we can clearly see that PGWO is better in terms of stability. For the balance scale data set, GWO obtains a relatively stable optimal value. For the Haberman's Survival data set, the stability of CS and PSO is the best, but PGWO and GGSA follow them closely.

The clustering results of the Art, Iris, and Survival data sets obtained by the PGWO algorithm are presented in Figure 4, which visualizes them clearly. It can be seen from Figure 4 that the PGWO algorithm has a superior effect on the Art, Iris, and Survival data sets.

In summary, the results show that the proposed method successfully outperforms the other algorithms across the majority of the benchmark functions. Furthermore, the test results on the clustering problems show that PGWO is able to provide very competitive results. It therefore appears from this comparative study that the proposed method has merit in the fields of evolutionary algorithms and optimization.

[Figure 3: ANOVA tests of clustering on the data sets over 20 independent runs. Panels: Art, Iris, Cancer, CMC, Seeds, Heart, Wine, balance scale, and Haberman's Survival; each shows the distribution of fitness values for GGSA, CS, ABC, PSO, GWO, and PGWO.]


6. Conclusion and Future Works

In order to apply the grey wolf optimizer to solve complex optimization problems efficiently, this paper proposed a novel grey wolf optimizer based on Powell's local optimization method, namely, PGWO. In PGWO, the original GWO algorithm is first applied to shrink the search region to a more promising area. Thereafter, Powell's method is implemented as a critical complement, performing a local search that intensively exploits this limited area to obtain better solutions. PGWO attempts to combine the merits of GWO and Powell's method in order to avoid all grey wolves getting trapped in inferior local optimal regions. PGWO enables the grey wolves to have more diverse exemplars to learn from, as the grey wolves are updated each generation and also form new grey wolves to search in a larger search space. With both techniques combined, PGWO can balance exploration and exploitation and effectively solve complex problems. The experimental results show the effectiveness of Powell's method in terms of solution quality and convergence speed. The proposed algorithm was benchmarked on seven well-known test functions, and the results were compared with those of GGSA, CS, ABC, PSO, and GWO. The results show that the PGWO algorithm is capable of providing very competitive results compared with these famous metaheuristics. Because of the superior performance of the PGWO algorithm, we used it to solve clustering problems. The algorithm has been tested on an artificial data set and eight real data sets. To assess the performance of the PGWO algorithm on clustering problems, we compared it with the original GWO, GGSA, CS, ABC, PSO, and K-means. The results prove that the PGWO algorithm is able to significantly outperform the others on the majority of the data sets in terms of the average value and standard deviation of the fitness function. Moreover, the experimental results demonstrate that the proposed PGWO algorithm can be considered a feasible and efficient method for solving optimization problems.

[Figure 4: The original data distributions of the Art, Iris, and Haberman's Survival data sets, alongside the corresponding clustering results obtained by the PGWO algorithm.]


Our future work will focus on two issues. On one hand, we will apply the proposed PGWO algorithm to higher-dimensional problems and larger numbers of patterns. On the other hand, the PGWO clustering algorithm will be extended to dynamically determine the optimal number of clusters.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grants nos. 61165015, 61463007, and 61563008, and by the Project of Guangxi University for Nationalities Science Foundation under Grant no. 2012MDZD037.

References

[1] J. H. Holland, "Genetic algorithms," Scientific American, vol. 267, no. 1, pp. 66–72, 1992.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, IEEE, Perth, Australia, November-December 1995.
[3] K. Price and R. Storn, "Differential evolution," Dr. Dobb's Journal, vol. 22, pp. 18–20, 1997.
[4] K. V. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, New York, NY, USA, 2005.
[5] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[6] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.
[7] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
[8] S. Mirjalili, S. M. Mirjalili, and A. Hatamlou, "Multi-verse optimizer: a nature-inspired algorithm for global optimization," Neural Computing and Applications, 2015.
[9] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.
[10] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459–471, 2007.
[11] X.-S. Yang, "Firefly algorithm, Lévy flights and global optimization," in Research and Development in Intelligent Systems XXVI, pp. 209–218, Springer, London, UK, 2010.
[12] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210–214, IEEE, Coimbatore, India, December 2009.
[13] R. Rajabioun, "Cuckoo optimization algorithm," Applied Soft Computing, vol. 11, no. 8, pp. 5508–5518, 2011.
[14] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, "GSA: a gravitational search algorithm," Information Sciences, vol. 179, no. 13, pp. 2232–2248, 2009.
[15] S. Mirjalili and A. Lewis, "Adaptive gbest-guided gravitational search algorithm," Neural Computing and Applications, vol. 25, no. 7, pp. 1569–1584, 2014.
[16] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.
[17] X. H. Song, L. Tang, S. T. Zhao et al., "Grey Wolf Optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.
[18] G. M. Komaki and V. Kayvanfar, "Grey Wolf Optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.
[19] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.
[20] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using Grey Wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.
[21] O. Kramer, "Iterated local search with Powell's method: a memetic algorithm for continuous global optimization," Memetic Computing, vol. 2, no. 1, pp. 69–83, 2010.
[22] M. J. D. Powell, "Restart procedures for the conjugate gradient method," Mathematical Programming, vol. 12, no. 1, pp. 241–254, 1977.
[23] I. E. Evangelou, D. G. Hadjimitsis, A. A. Lazakidou, and C. Clayton, "Data mining and knowledge discovery in complex image data using artificial neural networks," in Proceedings of the Workshop on Complex Reasoning on Geographical Data, Paphos, Cyprus, 2001.
[24] M. S. Kamel and S. Z. Selim, "New algorithms for solving the fuzzy clustering problem," Pattern Recognition, vol. 27, no. 3, pp. 421–428, 1994.
[25] M. Omran, A. Salman, and A. P. Engelbrecht, "Image classification using particle swarm optimization," in Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning, Singapore, 2002.
[26] X. J. Lei, Swarm Intelligent Optimization Algorithms and Their Applications, Science Press, 2012.
[27] A. K. Jain, "Data clustering: 50 years beyond K-means," Pattern Recognition Letters, vol. 31, no. 8, pp. 651–666, 2010.
[28] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.
[29] B. Zhang, M. Hsu, and U. Dayal, "K-harmonic means: a data clustering algorithm," Hewlett-Packard Labs Technical Report HPL, 1999.
[30] S. Z. Selim and K. Alsultan, "A simulated annealing algorithm for the clustering problem," Pattern Recognition, vol. 24, no. 10, pp. 1003–1008, 1991.
[31] K. S. Al-Sultan, "A Tabu search approach to the clustering problem," Pattern Recognition, vol. 28, no. 9, pp. 1443–1451, 1995.
[32] C. S. Sung and H. W. Jin, "A tabu-search-based heuristic for clustering," Pattern Recognition, vol. 33, no. 5, pp. 849–858, 2000.
[33] M. C. Cowgill, R. J. Harvey, and L. T. Watson, "A genetic algorithm approach to cluster analysis," Computers and Mathematics with Applications, vol. 37, no. 7, pp. 99–108, 1999.
[34] K. Krishna and M. N. Murty, "Genetic K-means algorithm," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 29, no. 3, pp. 433–439, 1999.
[35] U. Maulik and S. Bandyopadhyay, "Genetic algorithm-based clustering technique," Pattern Recognition, vol. 33, no. 9, pp. 1455–1465, 2000.
[36] C. A. Murthy and N. Chowdhury, "In search of optimal clusters using genetic algorithms," Pattern Recognition Letters, vol. 17, no. 8, pp. 825–832, 1996.
[37] M. Fathian, B. Amiri, and A. Maroosi, "Application of honey-bee mating optimization algorithm on clustering," Applied Mathematics and Computation, vol. 190, no. 2, pp. 1502–1513, 2007.
[38] D. W. van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215–220, IEEE, Canberra, Australia, December 2003.
[39] A. Hatamlou, S. Abdullah, and M. Hatamlou, "Data clustering using big bang-big crunch algorithm," in Innovative Computing Technology, vol. 241 of Communications in Computer and Information Science, pp. 383–388, Springer, 2011.
[40] D. Karaboga and C. Ozturk, "A novel clustering approach: artificial Bee Colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652–657, 2011.
[41] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "Application of gravitational search algorithm on data clustering," in Rough Sets and Knowledge Technology, vol. 6954 of Lecture Notes in Computer Science, pp. 337–346, Springer, Berlin, Germany, 2011.
[42] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "A combined approach for clustering based on K-means and gravitational search algorithms," Swarm and Evolutionary Computation, vol. 6, pp. 47–52, 2012.
[43] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187–195, 2004.
[44] Y. Kao and K. Cheng, "An ACO-based clustering algorithm," in Ant Colony Optimization and Swarm Intelligence, vol. 4150 of Lecture Notes in Computer Science, pp. 340–347, Springer, Berlin, Germany, 2006.
[45] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754–1762, 2008.
[46] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512–519, 2009.
[47] W.-F. Gao, S.-Y. Liu, and L.-L. Huang, "A novel artificial bee colony algorithm with Powell's method," Applied Soft Computing, vol. 13, no. 9, pp. 3763–3775, 2013.
[48] H. R. Lourenço, O. Martin, and T. Stützle, "A beginner's introduction to iterated local search," in Proceedings of the 4th Metaheuristics International Conference (MIC '01), vol. 2, pp. 1–6, Porto, Portugal, 2001.
[49] T. G. Stützle, Local Search Algorithms for Combinatorial Problems: Analysis, Improvements, and New Applications, vol. 220 of DISKI Dissertations on Artificial Intelligence, Infix Publishers, Sankt Augustin, Germany, 1999.
[50] X. S. Yang, Ed., Test Problems in Optimization: An Introduction with Metaheuristic Applications, Wiley, London, UK, 2010.
[51] M. Molga and C. Smutnicki, "Test functions for optimization needs," 2005, http://www.zsd.ict.pwr.wroc.pl/files/docs/functions.pdf.
[52] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.
[53] S. García, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 special session on real parameter optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617–644, 2009.
[54] F. Wilcoxon, "Individual comparisons by ranking methods," Biometrics Bulletin, vol. 1, no. 6, pp. 80–83, 1945.
[55] J. W. Han, M. Kamber, and J. Pei, Data Mining: Concepts and Techniques, Morgan Kaufmann Publishers, 2011.
[56] C. Blake and C. J. Merz, "UCI Repository of Machine Learning Databases," 1998.
[57] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183–197, 2010.
[58] E. Anderson, "The irises of the Gaspé peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2–5, 1935.
[59] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, no. 2, pp. 179–188, 1936.

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 6: Research Article Grey Wolf Optimizer Based on Powell Local ...downloads.hindawi.com/journals/ddns/2015/481360.pdf · Research Article Grey Wolf Optimizer Based on Powell Local Optimization

6 Discrete Dynamics in Nature and Society

Table 3 Results comparison of different optimal algorithms for 20 independent runs

Function Criteria GGSA CS ABC PSO GWO PGWO

1198651

Best 10133E minus 17 37450E + 00 21820E + 01 17964E minus 06 47447E minus 29 25422E minus 54Worst 10181E minus 16 24776E + 01 18094E + 02 22907E minus 04 22227E minus 26 79888E minus 53Mean 51858E minus 17 88292E + 00 81908E + 01 65181E minus 05 18269E minus 27 16276E minus 53Std 26953E minus 17 47633E + 00 43834E + 01 69891E minus 05 48884E minus 27 17834E minus 53

1198652

Best 23974E + 01 30920E + 02 44296E + 03 24432E + 01 26182E + 01 49374E + 00Worst 34411E + 02 22219E + 03 45866E + 04 41217E + 02 28769E + 01 15466E + 01Mean 77481E + 01 55445E + 02 15193E + 04 12465E + 02 27507E + 01 11692E + 01Std 74751E + 01 41835E + 02 97239E + 03 14004E + 02 67488119864 minus 01 20844119864 + 00

1198653

Best 23591E minus 09 40632E + 00 92131E + 00 63889E minus 03 79048E minus 14 44409E minus 15Worst 98704E minus 09 10603E + 01 16954E + 01 49535E + 00 15010E minus 13 44409E minus 15Mean 50646E minus 09 60677E + 00 13090E + 01 20423E + 00 10498E minus 13 44409E minus 15Std 19095E minus 09 15421E + 00 22882E + 00 11464E + 00 17485E minus 14 0

1198654

Best 54733E minus 01 20863E + 00 75357E minus 01 24473E minus 03 18626E minus 02 16432E minus 32Worst 46556E + 00 41099E + 00 55219E + 00 36426E + 00 12149E minus 01 16432E minus 32Mean 19728E + 00 32711E + 00 20544E + 00 85739E minus 01 54300E minus 02 16432E minus 32Std 10488E + 00 59739E minus 01 13358E + 00 10326E + 00 29934E minus 02 28080E minus 48

1198655

Best 21024E minus 02 27015E + 00 22507E + 00 63080E minus 06 39444E minus 01 13498E minus 32Worst 24387E + 01 12516E + 01 11090E + 02 36461E + 00 85465E minus 01 19625E minus 01Mean 77881E + 00 62644E + 00 11124E + 01 37285E minus 01 67323E minus 01 45918E minus 02Std 66733E + 00 25624E + 00 23792E + 01 93480E minus 01 14330E minus 01 60835E minus 02

1198656

Best 12358E minus 03 31550E minus 04 83768E minus 04 30749119864 minus 04 30750E minus 04 30752119864 minus 04

Worst 12219E minus 02 60087119864 minus 04 34810E minus 03 19361E minus 03 20363E minus 02 15943119864 minus 03

Mean 43558E minus 03 43266119864 minus 04 16421E minus 03 93309E minus 04 64067E minus 03 45499119864 minus 04

Std 26046E minus 03 85397119864 minus 05 72141E minus 04 41733E minus 04 93762E minus 03 34265119864 minus 04

1198657

Best 30000E + 00 30000E + 00 30000E + 00 30000E + 00 30000E + 00 30000E + 00Worst 30000E + 00 30000E + 00 30620E + 00 30000E + 00 84000E + 01 30000E + 00Mean 30000E + 00 30000E + 00 30136E + 00 30000E + 00 11100E + 01 30000E + 00Std 25794E minus 15 15717E minus 15 19505E minus 02 12310119864 minus 15 24931E + 01 92458119864 minus 07

Table 4 119901 values produced by Wilcoxonrsquos rank-sum test comparing PGWO versus GWO PSO ABC CS and GGSA over benchmarkfunctions (119901 ge 005 have been underlined)

PGWO vs GWO PSO ABC CS GGSA1198651 680E minus 08 680E minus 08 680E minus 08 680E minus 08 680E minus 081198652 680E minus 08 680E minus 08 680E minus 08 680E minus 08 680E minus 081198653 766E minus 09 801E minus 09 801E minus 09 801E minus 09 801E minus 091198654 801E minus 09 801E minus 09 801E minus 09 801E minus 09 801E minus 091198655 597E minus 08 0295606 597E minus 08 597E minus 08 467E minus 071198656 0003057 0000375 138E minus 06 0001782 106E minus 071198657 00002 493E minus 08 680E minus 08 619E minus 08 649E minus 08

to be simpler than 1198653ndash1198655 due to their low dimension and asmaller number of local minima For 1198656 CS provided betterresults than the other algorithms across Mean Std andWorst For 1198657 the majority of the algorithms can find theoptimal solution but the PSO is more stable than the otheralgorithms in terms of Std

The 119901 values ofWilcoxonrsquos rank-sum in Table 4 show thatthe results of PGWO in 1198655 are not significantly better thanPSO However the PGWO achieves significant improvementin all remaining benchmark functions compared to the otheralgorithms Therefore this is evidence that the proposed

algorithm has high performance in dealing with unimodalmultimodal and fixed-dimension multimodal benchmarkfunctions

The convergence curves of six algorithms are illustratedin Figure 1 As can be seen in these figures PGWO has amuch better convergence rate than the other algorithms onall benchmark functions

According to this comprehensive comparative study anddiscussions these results show that the proposed algorithmis able to significantly improve the performance of GWOand overcome its major shortcomings For that reason in the

Discrete Dynamics in Nature and Society 7

F6

times107

F4

F2F1times10

4

F3

F5

100 200 300 400 5000Iteration

0

05

1

15

2

25

Aver

age b

est s

o fa

r

101

102

103

104

105

106

107

108

109

Aver

age b

est s

o fa

r (lo

g)

GGSACSABC

PSOGWOPGWO

100 200 300 400 5000Iteration

GGSACSABC

PSOGWOPGWO

0

1

2

3

4

5

6

7

8

9

Aver

age b

est s

o fa

r (lo

g)

100 200 300 400 5000Iteration

GGSACSABC

PSOGWOPGWO

10minus15

10minus10

10minus5

100

105

Aver

age b

est s

o fa

r (lo

g)

100 200 300 400 5000Iteration

GGSACSABC

PSOGWOPGWO

10minus2

100

102

104

106

108

1010

Aver

age b

est s

o fa

r (lo

g)

100 200 300 400 5000Iteration

GGSACSABC

PSOGWOPGWO

10minus4

10minus3

10minus2

10minus1

100

Aver

age b

est s

o fa

r (lo

g)

100 200 300 400 5000Iteration

GGSACSABC

PSOGWOPGWO

Figure 1 Continued

8 Discrete Dynamics in Nature and Society

100 200 300 400 5000Iteration

2

4

6

8

10

12

14

16

18

20

Aver

age b

est s

o fa

r

F7

GGSACSABC

PSOGWOPGWO

Figure 1 The convergence curves of the average fitness value over 20 independent runs

next section we apply PGWO algorithm to solve clusteringproblem

5 Data Clustering

51 The Mathematical Description of Data Clustering Clus-tering sample is set to119883 = 119883119894 119894 = 1 2 119899 where119883119894 is a119901dimensional vector Clustering problem is to find a partition119862 = 1198621 1198622 119862119898 satisfying the following conditions[26]

(1) 119883 = ⋃119898

119894=1119862119894

(2) 119862119894 = 120601 119894 = 1 2 119898(3) 119862119894 cap 119862119895 = 120601 119894 119895 = 1 2 119898 119894 = 119895

52 The Clustering Criterion Clustering is the process ofgrouping a set of data objects into a number of clusters orgroups The aim of clustering is to make the data withina cluster have a high similarity while being very dissimilarto objects in other clusters Dissimilarities and similaritiesare evaluated based on the attribute of data sets contain-ing distance metric The most popular distance metric isEuclidean distance [55] Let 119894 = (1199091198941 1199091198942 119909119894119901) and 119895 =

(1199091198951 1199091198952 119909119895119901) be two objects described by 119901 numericattributes the Euclidean distance between object 119894 and 119895 ispresented as follows

119889 (119894 119895)

= radic(1199091198941 minus 1199091198951)2

+ (1199091198942 minus 1199091198952)2

+ sdot sdot sdot + (119909119894119901 minus 119909119895119901)2

(5)

For given119873 objects the clustering problem is tominimize thesum squared Euclidean distance between objects and allocateeach object to one of 119896 cluster centers [55]The clustering aims

at finding clusters center through minimizing the objectivefunction The objective function is defined as follows [26]

119869119888 =

119898

sum

119896=1

sum

119883119894isin119862119896

119889 (119883119894 119885119896) (6)

where 119898 indicates the number of clusters 119862119896 indicates the119896th cluster 119885119896 indicates the kth center of the clusteringand 119889(119883119894 119885119896) indicates the distance of the sample to thecorresponding center of clustering namely119889(119883119894 119885119896) = 119883119894minus119885119896

53 PGWO Algorithm on Clustering In clustering analysiseach element in the data set is a 119901 dimensional vectorMoreover the actual position of a grey wolf represents the 119896cluster centers so each greywolf indicates a 119896lowast119901 dimensionalvector For each grey wolf 119894 its position is denoted as a vector119883119894 = (1199091198941 1199091198942 119909119894119896lowast119901) In the initialization phase weutilize maximum and minimum values of each componentof the data set (which is to be grouped) as PGWO algorithmthe initialization search scope of the grey wolves and theinitialization solution is randomly generated in this rangeWe use (6) to calculate the fitness function of grey wolvesrsquoindividuals and the main steps of the fitness function areshown in Algorithm 4

54 Data Clustering Experimental Results and DiscussionIn order to verify performance of the proposed PGWOapproach for clustering we compare the results of the 119870-means GGSA CS ABC PSO GWO and PGWO clusteringalgorithms using nine different data sets that are selectedfrom the UCI machine learning repository [56]

Artificial data set (119873 = 600 119889 = 2 and 119896 = 4) is atwo-featured problemwith four unique classes A total of 600

Discrete Dynamics in Nature and Society 9

(1) For data vector 119909119894(2) Calculate the Euclidean distance by (5)(3) Assign 119909119894 to the closest cluster center(4) Calculate the measure function by (6)(5) End For(6) Return value of the fitness function

Algorithm 4 Main steps of the fitness function

patterns were drawn from four independent bivariate normaldistributions where classes were distributed according to

1198732 (120583 = (

119898119894

0) Σ = [[

05 005

005 05]]) (7)

where 119894 = 1 2 3 4 1198981 = minus3 1198982 = 0 1198983 = 3 and 119898 = 6120583 and Σ are mean vector and covariance matrix respectively[45 57]

Iris data (119873 = 150 119889 = 4 and 119870 = 3) is a data setwith 150 random samples of flowers from the Iris speciessetosa versicolor and virginica collected by Anderson [58]From each species there are 50 observations for sepal lengthsepal width petal length and petal width in cm This dataset was used by Fisher [59] in his initiation of the linear-discriminant-function technique [28 56 57]

Wisconsin breast cancer (119873 = 683 119889 = 9 and 119870 =

2) consists of 683 objects characterized by nine featuresclump thickness cell size uniformity cell shape uniformitymarginal adhesion single epithelial cell size bare nucleibland chromatin normal nucleoli andmitosesThere are twocategories in the data malignant (444 objects) and benign(239 objects) [28 56 57]

Contraceptive method choice (119873 = 1473 119889 = 10119870 = 3)CMC for short is a data set that is a subset of the 1987NationalIndonesia Contraceptive Prevalence Survey The samples aremarried women who either were not pregnant or did notknow if they were at the time of interview The problemis to predict the current contraceptive method choice (nouse long-termmethods or short-termmethods) of a womanbased onher demographic and socioeconomic characteristics[28 56 57]

Seeds data (119873 = 210 119889 = 7 and 119870 = 3) is a dataset that consists of 210 patterns belonging to three differentvarieties of wheat Kama Rosa and Canadian From eachspecies there are 70 observations for area 119860 perimeter 119875compactness 119862 (119862 = 4 lowast 119901119894 lowast119860119901

and2) length of kernel width

of kernel asymmetry coefficient and length of kernel groove[56]

Statlog (Heart) data (119873 = 270 119889 = 13119870 = 2) is a data setthat is a heart disease database similar to a database alreadypresent in the repository (heart disease databases) but in aslightly different form [56]

Wine data (119873 = 178 119889 = 13 119870 = 3) is also takenfromMCI laboratoryThese data are the results of a chemicalanalysis of wines grown in the same region in Italy but derivedfrom three different cultivars The analysis determined thequantities of 13 constituents found in each of the three types

of wines There are 178 instances with 13 numeric attributesin wine data set All attributes are continuous There is nomissing attribute value [28 56 57]

Balance scale data (119873 = 625 119889 = 4 and 119870 = 3) is a dataset that was generated to model psychological experimentalresults Each example is classified as having the balance scaletip to the right or left being balanced The attributes arethe left weight the left distance the right weight and theright distance The correct way to find the class is the greaterof (left-distance lowast left-weight) and (right-distance lowast right-weight) If they are equal it is balanced [56]

Habermanrsquos Survival (119873 = 306 119889 = 3 and 119870 = 2) is adata set that contains cases from a study that was conductedbetween 1958 and 1970 at the University of Chicagorsquos BillingsHospital on the survival of patients who had undergonesurgery for breast cancer It records two survival statuspatients with the age of patient at time of operation patientrsquosyear of operation and number of positive axillary nodesdetected [56]

The results of comparison of intracluster distances forthe seven clustering algorithms over 20 independent runs areshown in Table 5 Table 6 reports the 119901 values produced byWilcoxonrsquos rank-sum test comparing PGWO and the otheralgorithms over all the data sets

For Art data set the optimum value the worst value andthe average value of PGWO and GGSA are all 51390119864 + 02while the standard deviation of GGSA is better than PGWOFor PSO it only gets the optimum solution 51390119864 + 02

For Iris data set Cancer data set and Seeds data setPGWO PSO and GGSA provide the optimum value incomparison to those obtained by other methods Howeverthe worst value the average value and the standard deviationvalue of PGWO are superior to those of the other methods

For Heart data set PGWO PSO and CS all find theoptimum solution 10623119864 + 04 That means they all can findthe global solution However PGWO is slightly better for theworst value the average value and the standard deviation

For CMC data set and Wine data set Table 5 shows thatthe average best worst and standard deviation values ofthe fitness function for PGWO algorithm are much smallerthan those of the other six methods The PGWO clusteringalgorithm is capable of providing the same partition of thedata points in all runs

For balance scale data set the optimum values of thefitness function for PGWO PSO andGGSA are 14238119864+03That means they all can find the global solution And theoptimum values of the fitness function for GWO and 119870-means are 14239119864 + 03 the result is close to PGWO PSOand GGSA However the standard deviation values of GWO119870-means PGWO PSO GGSA and CS are 68756119864 minus 0136565119864 + 00 86619119864 minus 01 97396119864 minus 01 16015119864 + 00 and73239119864 minus 01 respectively From the standard deviation wecan see that the GWO is more stable than the other methods

For Habermanrsquos Survival data set the optimum value theworst value and the average value of the fitness function forCS and PSO are almost the same The results of CS and PSOalgorithms are better than those of the other methods

The 119901 values in Table 6 show that the results of PGWOare significantly better in Art data set Cancer data set

10 Discrete Dynamics in Nature and Society

Table 5: Comparison of intracluster distances for the seven clustering algorithms over 20 independent runs.

Data set           Criteria  K-means     GGSA        CS          ABC         PSO         GWO         PGWO
Art                Best      5.1454E+02  5.1390E+02  5.1391E+02  5.1752E+02  5.1390E+02  5.1455E+02  5.1390E+02
                   Worst     9.0474E+02  5.1390E+02  5.3213E+02  5.5298E+02  8.9248E+02  5.2058E+02  5.1390E+02
                   Mean      6.0974E+02  5.1390E+02  5.1597E+02  5.2620E+02  5.3283E+02  5.1687E+02  5.1390E+02
                   Std       1.6903E+02  1.5430E-13  4.8536E+00  9.7988E+00  8.4652E+01  1.8768E+00  6.1243E-08
Iris               Best      9.7190E+01  9.6655E+01  9.6659E+01  9.7616E+01  9.6655E+01  9.6835E+01  9.6655E+01
                   Worst     1.2318E+02  1.1839E+02  9.7684E+01  1.0401E+02  1.2767E+02  1.2108E+02  9.6655E+01
                   Mean      1.0226E+02  9.8865E+01  9.6862E+01  9.9710E+01  1.0596E+02  9.9903E+01  9.6655E+01
                   Std       1.0290E+01  5.5454E+00  3.1103E-01  1.8460E+00  1.4578E+01  6.7482E+00  8.5869E-10
Cancer             Best      2.9763E+03  2.9644E+03  2.9645E+03  2.9887E+03  2.9644E+03  2.9645E+03  2.9644E+03
                   Worst     2.9884E+03  3.0611E+03  2.9677E+03  3.2650E+03  4.7288E+03  2.9650E+03  2.9644E+03
                   Mean      2.9830E+03  2.9715E+03  2.9651E+03  3.0760E+03  3.4055E+03  2.9647E+03  2.9644E+03
                   Std       4.8278E+00  2.1284E+01  7.3222E-01  6.5283E+01  7.8385E+02  1.1452E-01  1.6118E-08
CMC                Best      5.7016E+03  5.6938E+03  5.7054E+03  5.8219E+03  5.6942E+03  5.8236E+03  5.6937E+03
                   Worst     5.7053E+03  5.7111E+03  5.7762E+03  6.3743E+03  7.1580E+03  6.0911E+03  5.6937E+03
                   Mean      5.7040E+03  5.6971E+03  5.7263E+03  6.0868E+03  5.7690E+03  5.9239E+03  5.6937E+03
                   Std       1.1022E+00  3.7872E+00  2.1119E+01  1.7292E+02  3.2694E+02  9.0635E+01  9.8777E-04
Seeds              Best      3.1322E+02  3.1180E+02  3.1187E+02  3.1573E+02  3.1180E+02  3.1323E+02  3.1180E+02
                   Worst     3.1373E+02  3.1354E+02  3.1746E+02  3.4997E+02  4.2035E+02  3.1782E+02  3.1180E+02
                   Mean      3.1337E+02  3.1189E+02  3.1347E+02  3.3161E+02  3.2813E+02  3.1475E+02  3.1180E+02
                   Std       2.3686E-01  3.8862E-01  1.6383E+00  1.0943E+01  3.9697E+01  1.5716E+00  1.7815E-09
Heart              Best      1.0682E+04  1.0749E+04  1.0623E+04  1.0683E+04  1.0623E+04  1.0637E+04  1.0623E+04
                   Worst     1.0701E+04  1.2684E+04  1.0625E+04  1.1622E+04  1.0626E+04  1.0683E+04  1.0624E+04
                   Mean      1.0693E+04  1.1542E+04  1.0624E+04  1.0920E+04  1.0624E+04  1.0657E+04  1.0623E+04
                   Std       8.1672E+00  5.7044E+02  4.6239E-01  2.5921E+02  7.6866E-01  1.3817E+01  2.3235E-01
Wine               Best      1.6385E+04  1.6493E+04  1.6296E+04  1.6566E+04  1.6294E+04  1.6316E+04  1.6292E+04
                   Worst     1.8437E+04  2.0245E+04  1.6311E+04  1.7668E+04  1.6312E+04  1.6371E+04  1.6294E+04
                   Mean      1.6974E+04  1.7999E+04  1.6301E+04  1.7010E+04  1.6297E+04  1.6345E+04  1.6293E+04
                   Std       8.6757E+02  1.1562E+03  4.5677E+00  3.6824E+02  4.0149E+00  1.4836E+01  5.2338E-01
Balance scale      Best      1.4239E+03  1.4238E+03  1.4256E+03  1.4265E+03  1.4238E+03  1.4239E+03  1.4238E+03
                   Worst     1.4337E+03  1.4291E+03  1.4285E+03  1.4310E+03  1.4262E+03  1.4260E+03  1.4257E+03
                   Mean      1.4275E+03  1.4259E+03  1.4268E+03  1.4282E+03  1.4248E+03  1.4243E+03  1.4245E+03
                   Std       3.6565E+00  1.6015E+00  7.3239E-01  1.1831E+00  9.7396E-01  6.8756E-01  8.6619E-01
Haberman-Survival  Best      2.6251E+03  2.5670E+03  2.5670E+03  2.5671E+03  2.5670E+03  2.5673E+03  2.5670E+03
                   Worst     3.1966E+03  2.6226E+03  2.5670E+03  2.5709E+03  2.5678E+03  2.6686E+03  2.5678E+03
                   Mean      2.6554E+03  2.5702E+03  2.5670E+03  2.5679E+03  2.5671E+03  2.5898E+03  2.5673E+03
                   Std       1.2740E+02  1.2354E+01  1.7590E-06  8.3107E-01  3.0624E-01  2.3346E+01  4.0711E-01
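
The criterion behind the Best/Worst/Mean/Std entries in Table 5 is the intracluster distance of (6): the sum of Euclidean distances from every object to its nearest cluster center. The following is a minimal sketch of how such a value can be computed for one candidate set of centers and how the per-run statistics can be aggregated; the run_pgwo driver is a hypothetical stand-in for one clustering run, not code from the paper.

```python
import numpy as np

def intracluster_distance(data, centers):
    """Sum of Euclidean distances from each object to its nearest
    cluster center: the fitness J_c minimized by every algorithm
    compared in Table 5 (objects are assigned to the closest center)."""
    # Pairwise distances, shape (n_objects, n_centers).
    d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    return d.min(axis=1).sum()

# Aggregating one row of Table 5 from 20 independent runs
# (run_pgwo is a hypothetical driver returning the best fitness of a run):
# fits = [run_pgwo(data, k=3) for _ in range(20)]
# best, worst = np.min(fits), np.max(fits)
# mean, std = np.mean(fits), np.std(fits)
```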

Table 6: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, GGSA, and K-means over the data sets (values with p ≥ 0.05, marked with *, indicate no significant difference).

PGWO vs.           GWO        PSO        ABC       CS         GGSA       K-means
Art                6.80E-08   1.60E-05   6.80E-08  6.80E-08   6.11E-08   4.49E-08
Iris               6.75E-08   1.19E-06   6.75E-08  6.75E-08   0.107346*  5.17E-08
Cancer             6.76E-08   6.76E-08   6.76E-08  6.76E-08   1.20E-06   5.98E-08
CMC                6.69E-08   6.69E-08   6.69E-08  6.69E-08   6.69E-08   3.39E-08
Seeds              6.80E-08   6.80E-08   6.80E-08  6.80E-08   1.23E-03   4.81E-08
Heart              6.61E-08   2.02E-06   6.61E-08  5.87E-07   6.61E-08   6.07E-08
Wine               6.80E-08   1.23E-07   6.80E-08  6.80E-08   6.80E-08   6.63E-08
Balance scale      0.067868*  0.007712   6.80E-08  1.06E-07   0.003966   6.01E-07
Haberman-Survival  6.90E-07   0.524948*  1.78E-03  0.860418*  0.090604*  6.26E-08
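
Table 6 can be read as follows: for each data set, the 20 final fitness values of PGWO are compared with those of a rival algorithm via Wilcoxon's rank-sum test [54], and p < 0.05 indicates a statistically significant difference. A minimal sketch of one such comparison, with placeholder samples rather than the paper's raw run data:

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(42)
# Placeholder samples standing in for the 20 final intracluster
# distances of two algorithms on one data set (illustrative only).
pgwo_runs = 513.90 + rng.normal(0.0, 1e-7, size=20)
gwo_runs = 516.87 + rng.normal(0.0, 1.9, size=20)

stat, p = ranksums(pgwo_runs, gwo_runs)  # Wilcoxon rank-sum test
print(f"p = {p:.3g}:",
      "significant" if p < 0.05 else "no significant difference")
```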

[Figure 2: Convergence curves of clustering on data sets over 20 independent runs. One panel per data set (Art, Iris, Cancer, CMC, Seeds, Heart, Wine, Balance scale, and Haberman's Survival), each plotting the average best-so-far fitness value against iteration (0-200) for GGSA, CS, ABC, PSO, GWO, and PGWO.]

From Table 6 we conclude that, for the Iris data set, PGWO performs better than all of the other algorithms except GGSA. Since the p value for the balance scale data set of PGWO versus GWO is greater than 0.05, there is no statistical difference between the two. For Haberman's Survival data set, comparing PGWO with the other algorithms, PGWO performs significantly better in three of the six pairwise comparisons. So it can be claimed that PGWO provides better results than the other algorithms across the majority of the data sets.

Figure 2 shows the convergence curves of clustering on the data sets over 20 independent runs. As can be seen from the figure, the convergence speed of PGWO is the fastest.
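
The curves in Figure 2 plot, at each iteration, the best fitness found so far averaged over the 20 runs. A short sketch of that aggregation, where histories is a placeholder array standing in for the recorded per-run fitness traces:

```python
import numpy as np

rng = np.random.default_rng(0)
# histories: per-run fitness traces; here a placeholder of shape
# (20 runs, 200 iterations) standing in for the recorded run data.
histories = rng.random((20, 200)) * 100 + 500
best_so_far = np.minimum.accumulate(histories, axis=1)  # monotone per run
avg_curve = best_so_far.mean(axis=0)                    # curve plotted in Figure 2
```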

Figure 3 indicates ANOVA tests of clustering on the data sets over 20 independent runs. As seen from Figure 3, PGWO is very stable for the majority of the data sets. For the Art, Seeds, and CMC data sets, PGWO, PSO, and GGSA can obtain relatively stable optimal values. For the Heart and Wine data sets, the stability of PGWO, PSO, and CS is outstanding. For the Cancer data set, most of the algorithms can obtain the stable optimal value, except for the ABC and PSO algorithms. For the Iris data set, we can clearly see that PGWO is better in terms of stability. For the balance scale data set, GWO obtains the relatively stable optimal value. For Haberman's Survival data set, the stability of CS and PSO is the best, but PGWO and GGSA follow them closely.
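
Figure 3 presents the spread of final fitness values per algorithm; if a formal one-way ANOVA across algorithms is desired for one data set, it can be computed as below. The samples here are illustrative placeholders, not the paper's run data:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(3)
# Placeholder samples for the 20 final fitness values of three
# algorithms on one data set (illustrative numbers only).
pgwo = 96.655 + rng.normal(0.0, 1e-9, 20)
ggsa = 98.865 + rng.normal(0.0, 5.5, 20)
pso = 105.96 + rng.normal(0.0, 14.6, 20)

F, p = f_oneway(pgwo, ggsa, pso)  # one-way ANOVA across the groups
print(f"F = {F:.2f}, p = {p:.3g}")
```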

Clustering results of the Art, Iris, and Survival data sets obtained by the PGWO algorithm are presented in Figure 4, which visualizes them clearly. It can be seen from Figure 4 that the PGWO algorithm produces a clearly superior clustering on the Art, Iris, and Survival data sets.

In summary, the results show that the proposed method successfully outperforms the other algorithms across the majority of the benchmark functions. Furthermore, the test results on clustering problems show that PGWO is able to provide very competitive results on this class of problems as well.

[Figure 3: ANOVA tests of clustering on data sets over 20 independent runs. One panel per data set (Art, Iris, Cancer, CMC, Seeds, Heart, Wine, Balance scale, and Haberman's Survival), each showing the distribution of fitness values obtained by GGSA, CS, ABC, PSO, GWO, and PGWO.]

Therefore, it appears from this comparative study that the proposed method has merit in the field of evolutionary algorithms and optimization.

6. Conclusion and Future Works

In order to apply the grey wolf optimizer to solve complex optimization problems efficiently, this paper proposed a novel grey wolf optimizer based on Powell's local optimization method, namely, PGWO. In PGWO, the original GWO algorithm is first applied to shrink the search region to a more promising area. Thereafter, Powell's method is implemented as a critical complement, performing a local search that exploits the limited area intensively to obtain better solutions. PGWO attempts to combine the merits of GWO and Powell's method in order to avoid all grey wolves getting trapped in inferior local optimal regions. PGWO enables the grey wolves to have more diverse exemplars to learn from, as the grey wolves are updated each generation and also form new grey wolves to search in a larger search space. With both techniques combined, PGWO can balance exploration and exploitation and effectively solve complex problems. The experimental results show the effectiveness of Powell's method in terms of solution quality and convergence speed. The proposed algorithm is benchmarked on seven well-known test functions, and the results are compared with those of GGSA, CS, ABC, PSO, and GWO. The results show that the PGWO algorithm is capable of providing very competitive results compared to these famous metaheuristics. Because of the superior performance of the PGWO algorithm, we use it to solve clustering problems. The algorithm has been tested on an artificial data set and eight real data sets. To justify the performance of the PGWO algorithm on clustering problems, we compare it with the original GWO, GGSA, CS, ABC, PSO, and K-means. The results prove that the PGWO algorithm is able to significantly outperform the others on the majority of the data sets in terms of average value and standard deviation of the fitness function. Moreover, the experimental results demonstrate that the proposed PGWO algorithm can be considered a feasible and efficient method for solving optimization problems.
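
For concreteness, a highly condensed sketch of the PGWO loop described above follows: positions are updated with the standard GWO equations from [7], and Powell's method then refines the current best wolf each generation. This is one reading of the hybrid under stated assumptions; the population size, iteration budget, and the choice to refine only the best wolf are illustrative assumptions, and SciPy's Powell routine stands in for Powell's method [22].

```python
import numpy as np
from scipy.optimize import minimize

def pgwo(fitness, dim, lb, ub, n_wolves=30, max_iter=200, seed=0):
    """Sketch of PGWO: standard GWO position updates plus a Powell
    refinement of the best wolf each generation (illustrative only)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n_wolves, dim))  # initial pack
    fit = np.apply_along_axis(fitness, 1, X)
    for t in range(max_iter):
        order = np.argsort(fit)
        # Copy leaders so in-place updates of X do not corrupt them.
        alpha, beta, delta = (X[order[j]].copy() for j in range(3))
        a = 2.0 - 2.0 * t / max_iter  # control parameter a: 2 -> 0
        for i in range(n_wolves):
            parts = []
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2.0 * a * r1 - a        # coefficient vectors of GWO [7]
                C = 2.0 * r2
                D = np.abs(C * leader - X[i])
                parts.append(leader - A * D)
            X[i] = np.clip(np.mean(parts, axis=0), lb, ub)
        fit = np.apply_along_axis(fitness, 1, X)
        # Powell local search as the exploitation complement.
        b = int(np.argmin(fit))
        res = minimize(fitness, X[b], method="Powell")
        cand = np.clip(res.x, lb, ub)
        f_cand = fitness(cand)
        if f_cand < fit[b]:  # accept the refinement only if it improves
            X[b], fit[b] = cand, f_cand
    b = int(np.argmin(fit))
    return X[b], fit[b]

# Example: minimize the sphere function in 10 dimensions.
# x_best, f_best = pgwo(lambda x: float(np.sum(x * x)), dim=10, lb=-100, ub=100)
```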

[Figure 4: The original data distributions of the Art, Iris, and Haberman's Survival data sets and the corresponding clustering results obtained by the PGWO algorithm.]


Our future work will focus on two issues. On one hand, we will apply the proposed PGWO algorithm to higher dimensional problems and larger numbers of patterns. On the other hand, the PGWO clustering algorithm will be extended to dynamically determine the optimal number of clusters.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grants nos. 61165015, 61463007, and 61563008, and by the Project of Guangxi University for Nationalities Science Foundation under Grant no. 2012MDZD037.

References

[1] J. H. Holland, "Genetic algorithms," Scientific American, vol. 267, no. 1, pp. 66-72, 1992.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942-1948, IEEE, Perth, Australia, November-December 1995.
[3] K. Price and R. Storn, "Differential evolution," Dr. Dobb's Journal, vol. 22, pp. 18-20, 1997.
[4] K. V. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, New York, NY, USA, 2005.
[5] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.
[6] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67-82, 1997.
[7] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.
[8] S. Mirjalili, S. M. Mirjalili, and A. Hatamlou, "Multi-verse optimizer: a nature-inspired algorithm for global optimization," Neural Computing and Applications, 2015.
[9] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80-98, 2015.
[10] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459-471, 2007.
[11] X.-S. Yang, "Firefly algorithm, Levy flights and global optimization," in Research and Development in Intelligent Systems XXVI, pp. 209-218, Springer, London, UK, 2010.
[12] X.-S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210-214, IEEE, Coimbatore, India, December 2009.
[13] R. Rajabioun, "Cuckoo optimization algorithm," Applied Soft Computing, vol. 11, no. 8, pp. 5508-5518, 2011.
[14] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, "GSA: a gravitational search algorithm," Information Sciences, vol. 179, no. 13, pp. 2232-2248, 2009.
[15] S. Mirjalili and A. Lewis, "Adaptive gbest-guided gravitational search algorithm," Neural Computing and Applications, vol. 25, no. 7, pp. 1569-1584, 2014.
[16] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286-292, 2015.
[17] X. H. Song, L. Tang, S. T. Zhao et al., "Grey Wolf Optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147-157, 2015.
[18] G. M. Komaki and V. Kayvanfar, "Grey Wolf Optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109-120, 2015.
[19] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257-1263, 2015.
[20] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using Grey Wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411-429, 2015.
[21] O. Kramer, "Iterated local search with Powell's method: a memetic algorithm for continuous global optimization," Memetic Computing, vol. 2, no. 1, pp. 69-83, 2010.
[22] M. J. D. Powell, "Restart procedures for the conjugate gradient method," Mathematical Programming, vol. 12, no. 1, pp. 241-254, 1977.
[23] I. E. Evangelou, D. G. Hadjimitsis, A. A. Lazakidou, and C. Clayton, "Data mining and knowledge discovery in complex image data using artificial neural networks," in Proceedings of the Workshop on Complex Reasoning on Geographical Data, Paphos, Cyprus, 2001.
[24] M. S. Kamel and S. Z. Selim, "New algorithms for solving the fuzzy clustering problem," Pattern Recognition, vol. 27, no. 3, pp. 421-428, 1994.
[25] M. Omran, A. Salman, and A. P. Engelbrecht, "Image classification using particle swarm optimization," in Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning, Singapore, 2002.
[26] X. J. Lei, Swarm Intelligent Optimization Algorithms and Their Applications, Science Press, 2012.
[27] A. K. Jain, "Data clustering: 50 years beyond K-means," Pattern Recognition Letters, vol. 31, no. 8, pp. 651-666, 2010.
[28] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.
[29] B. Zhang, M. Hsu, and U. Dayal, "K-Harmonic means: a data clustering algorithm," Hewlett-Packard Labs Technical Report HPL, 1999.
[30] S. Z. Selim and K. Alsultan, "A simulated annealing algorithm for the clustering problem," Pattern Recognition, vol. 24, no. 10, pp. 1003-1008, 1991.
[31] K. S. Al-Sultan, "A Tabu search approach to the clustering problem," Pattern Recognition, vol. 28, no. 9, pp. 1443-1451, 1995.
[32] C. S. Sung and H. W. Jin, "A tabu-search-based heuristic for clustering," Pattern Recognition, vol. 33, no. 5, pp. 849-858, 2000.
[33] M. C. Cowgill, R. J. Harvey, and L. T. Watson, "A genetic algorithm approach to cluster analysis," Computers and Mathematics with Applications, vol. 37, no. 7, pp. 99-108, 1999.
[34] K. Krishna and M. N. Murty, "Genetic K-means algorithm," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 29, no. 3, pp. 433-439, 1999.
[35] U. Maulik and S. Bandyopadhyay, "Genetic algorithm-based clustering technique," Pattern Recognition, vol. 33, no. 9, pp. 1455-1465, 2000.
[36] C. A. Murthy and N. Chowdhury, "In search of optimal clusters using genetic algorithms," Pattern Recognition Letters, vol. 17, no. 8, pp. 825-832, 1996.
[37] M. Fathian, B. Amiri, and A. Maroosi, "Application of honey-bee mating optimization algorithm on clustering," Applied Mathematics and Computation, vol. 190, no. 2, pp. 1502-1513, 2007.
[38] D. W. Van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215-220, IEEE, Canberra, Australia, December 2003.
[39] A. Hatamlou, S. Abdullah, and M. Hatamlou, "Data clustering using big bang-big crunch algorithm," in Innovative Computing Technology, vol. 241 of Communications in Computer and Information Science, pp. 383-388, Springer, 2011.
[40] D. Karaboga and C. Ozturk, "A novel clustering approach: artificial Bee Colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652-657, 2011.
[41] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "Application of gravitational search algorithm on data clustering," in Rough Sets and Knowledge Technology, vol. 6954 of Lecture Notes in Computer Science, pp. 337-346, Springer, Berlin, Germany, 2011.
[42] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "A combined approach for clustering based on K-means and gravitational search algorithms," Swarm and Evolutionary Computation, vol. 6, pp. 47-52, 2012.
[43] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187-195, 2004.
[44] Y. Kao and K. Cheng, "An ACO-based clustering algorithm," in Ant Colony Optimization and Swarm Intelligence, vol. 4150 of Lecture Notes in Computer Science, pp. 340-347, Springer, Berlin, Germany, 2006.
[45] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754-1762, 2008.
[46] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512-519, 2009.
[47] W.-F. Gao, S.-Y. Liu, and L.-L. Huang, "A novel artificial bee colony algorithm with Powell's method," Applied Soft Computing, vol. 13, no. 9, pp. 3763-3775, 2013.
[48] H. R. Lourenço, O. Martin, and T. Stützle, "A beginner's introduction to iterated local search," in Proceedings of the 4th Metaheuristics International Conference (MIC '01), vol. 2, pp. 1-6, Porto, Portugal, 2001.
[49] T. G. Stützle, Local Search Algorithms for Combinatorial Problems: Analysis, Improvements, and New Applications, vol. 220 of DISKI Dissertations on Artificial Intelligence, Infix Publishers, Sankt Augustin, Germany, 1999.
[50] X. S. Yang, Ed., Test Problems in Optimization: An Introduction with Metaheuristic Applications, Wiley, London, UK, 2010.
[51] M. Molga and C. Smutnicki, "Test functions for optimization needs," 2005, http://www.zsd.ict.pwr.wroc.pl/files/docs/functions.pdf.
[52] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3-18, 2011.
[53] S. García, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 special session on real parameter optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617-644, 2009.
[54] F. Wilcoxon, "Individual comparisons by ranking methods," Biometrics Bulletin, vol. 1, no. 6, pp. 80-83, 1945.
[55] J. W. Han, M. Kamber, and J. Pei, Data Mining: Concepts and Techniques, Morgan Kaufmann Publishers, 2011.
[56] C. Blake and C. J. Merz, "UCI Repository of Machine Learning Databases," 1998.
[57] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183-197, 2010.
[58] E. Anderson, "The irises of the Gaspé Peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2-5, 1935.
[59] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, no. 2, pp. 179-188, 1936.


Discrete Dynamics in Nature and Society 7

F6

times107

F4

F2F1times10

4

F3

F5

100 200 300 400 5000Iteration

0

05

1

15

2

25

Aver

age b

est s

o fa

r

101

102

103

104

105

106

107

108

109

Aver

age b

est s

o fa

r (lo

g)

GGSACSABC

PSOGWOPGWO

100 200 300 400 5000Iteration

GGSACSABC

PSOGWOPGWO

0

1

2

3

4

5

6

7

8

9

Aver

age b

est s

o fa

r (lo

g)

100 200 300 400 5000Iteration

GGSACSABC

PSOGWOPGWO

10minus15

10minus10

10minus5

100

105

Aver

age b

est s

o fa

r (lo

g)

100 200 300 400 5000Iteration

GGSACSABC

PSOGWOPGWO

10minus2

100

102

104

106

108

1010

Aver

age b

est s

o fa

r (lo

g)

100 200 300 400 5000Iteration

GGSACSABC

PSOGWOPGWO

10minus4

10minus3

10minus2

10minus1

100

Aver

age b

est s

o fa

r (lo

g)

100 200 300 400 5000Iteration

GGSACSABC

PSOGWOPGWO

Figure 1 Continued

8 Discrete Dynamics in Nature and Society

100 200 300 400 5000Iteration

2

4

6

8

10

12

14

16

18

20

Aver

age b

est s

o fa

r

F7

GGSACSABC

PSOGWOPGWO

Figure 1 The convergence curves of the average fitness value over 20 independent runs

next section we apply PGWO algorithm to solve clusteringproblem

5 Data Clustering

51 The Mathematical Description of Data Clustering Clus-tering sample is set to119883 = 119883119894 119894 = 1 2 119899 where119883119894 is a119901dimensional vector Clustering problem is to find a partition119862 = 1198621 1198622 119862119898 satisfying the following conditions[26]

(1) 119883 = ⋃119898

119894=1119862119894

(2) 119862119894 = 120601 119894 = 1 2 119898(3) 119862119894 cap 119862119895 = 120601 119894 119895 = 1 2 119898 119894 = 119895

52 The Clustering Criterion Clustering is the process ofgrouping a set of data objects into a number of clusters orgroups The aim of clustering is to make the data withina cluster have a high similarity while being very dissimilarto objects in other clusters Dissimilarities and similaritiesare evaluated based on the attribute of data sets contain-ing distance metric The most popular distance metric isEuclidean distance [55] Let 119894 = (1199091198941 1199091198942 119909119894119901) and 119895 =

(1199091198951 1199091198952 119909119895119901) be two objects described by 119901 numericattributes the Euclidean distance between object 119894 and 119895 ispresented as follows

119889 (119894 119895)

= radic(1199091198941 minus 1199091198951)2

+ (1199091198942 minus 1199091198952)2

+ sdot sdot sdot + (119909119894119901 minus 119909119895119901)2

(5)

For given119873 objects the clustering problem is tominimize thesum squared Euclidean distance between objects and allocateeach object to one of 119896 cluster centers [55]The clustering aims

at finding clusters center through minimizing the objectivefunction The objective function is defined as follows [26]

119869119888 =

119898

sum

119896=1

sum

119883119894isin119862119896

119889 (119883119894 119885119896) (6)

where 119898 indicates the number of clusters 119862119896 indicates the119896th cluster 119885119896 indicates the kth center of the clusteringand 119889(119883119894 119885119896) indicates the distance of the sample to thecorresponding center of clustering namely119889(119883119894 119885119896) = 119883119894minus119885119896

53 PGWO Algorithm on Clustering In clustering analysiseach element in the data set is a 119901 dimensional vectorMoreover the actual position of a grey wolf represents the 119896cluster centers so each greywolf indicates a 119896lowast119901 dimensionalvector For each grey wolf 119894 its position is denoted as a vector119883119894 = (1199091198941 1199091198942 119909119894119896lowast119901) In the initialization phase weutilize maximum and minimum values of each componentof the data set (which is to be grouped) as PGWO algorithmthe initialization search scope of the grey wolves and theinitialization solution is randomly generated in this rangeWe use (6) to calculate the fitness function of grey wolvesrsquoindividuals and the main steps of the fitness function areshown in Algorithm 4

54 Data Clustering Experimental Results and DiscussionIn order to verify performance of the proposed PGWOapproach for clustering we compare the results of the 119870-means GGSA CS ABC PSO GWO and PGWO clusteringalgorithms using nine different data sets that are selectedfrom the UCI machine learning repository [56]

Artificial data set (119873 = 600 119889 = 2 and 119896 = 4) is atwo-featured problemwith four unique classes A total of 600

Discrete Dynamics in Nature and Society 9

(1) For data vector 119909119894(2) Calculate the Euclidean distance by (5)(3) Assign 119909119894 to the closest cluster center(4) Calculate the measure function by (6)(5) End For(6) Return value of the fitness function

Algorithm 4 Main steps of the fitness function

patterns were drawn from four independent bivariate normaldistributions where classes were distributed according to

1198732 (120583 = (

119898119894

0) Σ = [[

05 005

005 05]]) (7)

where 119894 = 1 2 3 4 1198981 = minus3 1198982 = 0 1198983 = 3 and 119898 = 6120583 and Σ are mean vector and covariance matrix respectively[45 57]

Iris data (119873 = 150 119889 = 4 and 119870 = 3) is a data setwith 150 random samples of flowers from the Iris speciessetosa versicolor and virginica collected by Anderson [58]From each species there are 50 observations for sepal lengthsepal width petal length and petal width in cm This dataset was used by Fisher [59] in his initiation of the linear-discriminant-function technique [28 56 57]

Wisconsin breast cancer (119873 = 683 119889 = 9 and 119870 =

2) consists of 683 objects characterized by nine featuresclump thickness cell size uniformity cell shape uniformitymarginal adhesion single epithelial cell size bare nucleibland chromatin normal nucleoli andmitosesThere are twocategories in the data malignant (444 objects) and benign(239 objects) [28 56 57]

Contraceptive method choice (119873 = 1473 119889 = 10119870 = 3)CMC for short is a data set that is a subset of the 1987NationalIndonesia Contraceptive Prevalence Survey The samples aremarried women who either were not pregnant or did notknow if they were at the time of interview The problemis to predict the current contraceptive method choice (nouse long-termmethods or short-termmethods) of a womanbased onher demographic and socioeconomic characteristics[28 56 57]

Seeds data (119873 = 210 119889 = 7 and 119870 = 3) is a dataset that consists of 210 patterns belonging to three differentvarieties of wheat Kama Rosa and Canadian From eachspecies there are 70 observations for area 119860 perimeter 119875compactness 119862 (119862 = 4 lowast 119901119894 lowast119860119901

and2) length of kernel width

of kernel asymmetry coefficient and length of kernel groove[56]

Statlog (Heart) data (119873 = 270 119889 = 13119870 = 2) is a data setthat is a heart disease database similar to a database alreadypresent in the repository (heart disease databases) but in aslightly different form [56]

Wine data (119873 = 178 119889 = 13 119870 = 3) is also takenfromMCI laboratoryThese data are the results of a chemicalanalysis of wines grown in the same region in Italy but derivedfrom three different cultivars The analysis determined thequantities of 13 constituents found in each of the three types

of wines There are 178 instances with 13 numeric attributesin wine data set All attributes are continuous There is nomissing attribute value [28 56 57]

Balance scale data (119873 = 625 119889 = 4 and 119870 = 3) is a dataset that was generated to model psychological experimentalresults Each example is classified as having the balance scaletip to the right or left being balanced The attributes arethe left weight the left distance the right weight and theright distance The correct way to find the class is the greaterof (left-distance lowast left-weight) and (right-distance lowast right-weight) If they are equal it is balanced [56]

Habermanrsquos Survival (119873 = 306 119889 = 3 and 119870 = 2) is adata set that contains cases from a study that was conductedbetween 1958 and 1970 at the University of Chicagorsquos BillingsHospital on the survival of patients who had undergonesurgery for breast cancer It records two survival statuspatients with the age of patient at time of operation patientrsquosyear of operation and number of positive axillary nodesdetected [56]

The results of comparison of intracluster distances forthe seven clustering algorithms over 20 independent runs areshown in Table 5 Table 6 reports the 119901 values produced byWilcoxonrsquos rank-sum test comparing PGWO and the otheralgorithms over all the data sets

For Art data set the optimum value the worst value andthe average value of PGWO and GGSA are all 51390119864 + 02while the standard deviation of GGSA is better than PGWOFor PSO it only gets the optimum solution 51390119864 + 02

For Iris data set Cancer data set and Seeds data setPGWO PSO and GGSA provide the optimum value incomparison to those obtained by other methods Howeverthe worst value the average value and the standard deviationvalue of PGWO are superior to those of the other methods

For Heart data set PGWO PSO and CS all find theoptimum solution 10623119864 + 04 That means they all can findthe global solution However PGWO is slightly better for theworst value the average value and the standard deviation

For CMC data set and Wine data set Table 5 shows thatthe average best worst and standard deviation values ofthe fitness function for PGWO algorithm are much smallerthan those of the other six methods The PGWO clusteringalgorithm is capable of providing the same partition of thedata points in all runs

For balance scale data set the optimum values of thefitness function for PGWO PSO andGGSA are 14238119864+03That means they all can find the global solution And theoptimum values of the fitness function for GWO and 119870-means are 14239119864 + 03 the result is close to PGWO PSOand GGSA However the standard deviation values of GWO119870-means PGWO PSO GGSA and CS are 68756119864 minus 0136565119864 + 00 86619119864 minus 01 97396119864 minus 01 16015119864 + 00 and73239119864 minus 01 respectively From the standard deviation wecan see that the GWO is more stable than the other methods

For Habermanrsquos Survival data set the optimum value theworst value and the average value of the fitness function forCS and PSO are almost the same The results of CS and PSOalgorithms are better than those of the other methods

The 119901 values in Table 6 show that the results of PGWOare significantly better in Art data set Cancer data set

10 Discrete Dynamics in Nature and Society

Table 5 Comparison of intracluster distances for the seven clustering algorithms over 20 independent runs

Data set Criteria 119870-means GGSA CS ABC PSO GWO PGWO

Art

Best 51454E + 02 51390E + 02 51391E + 02 51752E + 02 51390E + 02 51455E + 02 51390E + 02Worst 90474E + 02 51390E + 02 53213E + 02 55298E + 02 89248E + 02 52058E + 02 51390E + 02Mean 60974E + 02 51390E + 02 51597E + 02 52620E + 02 53283E + 02 51687E + 02 51390E + 02Std 16903E + 02 15430119864 minus 13 48536E + 00 97988E + 00 84652E + 01 18768E + 00 61243119864 minus 08

Iris

Best 97190E + 01 96655E + 01 96659E + 01 97616E + 01 96655E + 01 96835E + 01 96655E + 01Worst 12318E + 02 11839E + 02 97684E + 01 10401E + 02 12767E + 02 12108E + 02 96655E + 01Mean 10226E + 02 98865E + 01 96862E + 01 99710E + 01 10596E + 02 99903E + 01 96655E + 01Std 10290E + 01 55454E + 00 31103E minus 01 18460E + 00 14578E + 01 67482E + 00 85869E minus 10

Cancer

Best 29763E + 03 29644E + 03 29645E + 03 29887E + 03 29644E + 03 29645E + 03 29644E + 03Worst 29884E + 03 30611E + 03 29677E + 03 32650E + 03 47288E + 03 29650E + 03 29644E + 03Mean 29830E + 03 29715E + 03 29651E + 03 30760E + 03 34055E + 03 29647E + 03 29644E + 03Std 48278E + 00 21284E + 01 73222E minus 01 65283E + 01 78385E + 02 11452E minus 01 16118E minus 08

CMC

Best 57016E + 03 56938E + 03 57054E + 03 58219E + 03 56942E + 03 58236E + 03 56937E + 03Worst 57053E + 03 57111E + 03 57762E + 03 63743E + 03 71580E + 03 60911E + 03 56937E + 03Mean 57040E + 03 56971E + 03 57263E + 03 60868E + 03 57690E + 03 59239E + 03 56937E + 03Std 11022E + 00 37872E + 00 21119E + 01 17292E + 02 32694E + 02 90635E + 01 98777E minus 04

Seeds

Best 31322E + 02 31180E + 02 31187E + 02 31573E + 02 31180E + 02 31323E + 02 31180E + 02Worst 31373E + 02 31354E + 02 31746E + 02 34997E + 02 42035E + 02 31782E + 02 31180E + 02Mean 31337E + 02 31189E + 02 31347E + 02 33161E + 02 32813E + 02 31475E + 02 31180E + 02Std 23686E minus 01 38862E minus 01 16383E + 00 10943E + 01 39697E + 01 15716E + 00 17815E minus 09

Heart

Best 10682E + 04 10749E + 04 10623E + 04 10683E + 04 10623E + 04 10637E + 04 10623E + 04Worst 10701E + 04 12684E + 04 10625E + 04 11622E + 04 10626E + 04 10683E + 04 10624E + 04Mean 10693E + 04 11542E + 04 10624E + 04 10920E + 04 10624E + 04 10657E + 04 10623E + 04Std 81672E + 00 57044E + 02 46239E minus 01 25921E + 02 76866E minus 01 13817E + 01 23235E minus 01

Wine

Best 16385E + 04 16493E + 04 16296E + 04 16566E + 04 16294E + 04 16316E + 04 16292E + 04Worst 18437E + 04 20245E + 04 16311E + 04 17668E + 04 16312E + 04 16371E + 04 16294E + 04Mean 16974E + 04 17999E + 04 16301E + 04 17010E + 04 16297E + 04 16345E + 04 16293E + 04Std 86757E + 02 11562E + 03 45677E + 00 36824E + 02 40149E + 00 14836E + 01 52338E minus 01

Balance scale

Best 14239E + 03 14238E + 03 14256E + 03 14265E + 03 14238E + 03 14239E + 03 14238E + 03Worst 14337E + 03 14291E + 03 14285E + 03 14310E + 03 14262E + 03 14260E + 03 14257E + 03Mean 14275E + 03 14259E + 03 14268E + 03 14282E + 03 14248E + 03 14243119864 + 03 14245119864 + 03

Std 36565E + 00 16015E + 00 73239E minus 01 11831E + 00 97396E minus 01 68756119864 minus 01 86619119864 minus 01

Haberman-Survival

Best 26251E + 03 25670E + 03 25670E + 03 25671E + 03 25670E + 03 25673E + 03 25670E + 03Worst 31966E + 03 26226E + 03 25670119864 + 03 25709E + 03 25678E + 03 26686E + 03 25678119864 + 03

Mean 26554E + 03 25702E + 03 25670119864 + 03 25679E + 03 25671E + 03 25898E + 03 25673119864 + 03

Std 12740E + 02 12354E + 01 17590119864 minus 06 83107E minus 01 30624E minus 01 23346E + 01 40711119864 minus 01

Table 6 119901 values produced by Wilcoxonrsquos rank-sum test comparing PGWO versus GWO PSO ABC CS GGSA and 119870-means over datasets (119901 ge 005 have been underlined)

PGWO vs GWO PSO ABC CS GGSA 119870-meansArt 680E minus 08 160E minus 05 680E minus 08 680E minus 08 611E minus 08 449E minus 08Iris 675E minus 08 119E minus 06 675E minus 08 675E minus 08 0107346 517E minus 08Cancer 676E minus 08 676E minus 08 676E minus 08 676E minus 08 12E minus 06 598E minus 08CMC 669E minus 08 669E minus 08 669E minus 08 669E minus 08 669E minus 08 339E minus 08Seeds 680E minus 08 680E minus 08 680E minus 08 680E minus 08 123E minus 03 481E minus 08Heart 661E minus 08 202E minus 06 661E minus 08 587E minus 07 661E minus 08 607E minus 08Wine 680E minus 08 123E minus 07 680E minus 08 680E minus 08 680E minus 08 663E minus 08Balance scale 0067868 0007712 68E minus 08 106E minus 07 0003966 601E minus 07Haberman-Survival 690E minus 07 0524948 178E minus 03 0860418 0090604 626E minus 08

Discrete Dynamics in Nature and Society 11

Art Iris

Cancer CMC

Seeds Hearttimes104

500

550

600

650

700

750

800

850

900

950

1000

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

80

90

100

110

120

130

140

150

160

170

180

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

2500

3000

3500

4000

4500

5000

5500

6000

6500

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

5500

6000

6500

7000

7500

8000

8500Av

erag

e bes

t so

far

50 100 150 2000Iteration

105

11

115

12

125

13

135

14

145

15

155

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

300

350

400

450

500

550

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

GGSACSABC

PSOGWOPGWO

GGSACSABC

PSOGWOPGWO

GGSACSABC

PSOGWOPGWO

GGSACSABC

PSOGWOPGWO

GGSACSABC

PSOGWOPGWO

GGSACSABC

PSOGWOPGWO

Figure 2 Continued

12 Discrete Dynamics in Nature and Society

Wine Balance scale

50 100 150 2000Iteration

16

165

17

175

18

185

19

195

2

Aver

age b

est s

o fa

r

GGSACSABC

PSOGWOPGWO

50 100 150 2000Iteration

GGSACSABC

PSOGWOPGWO

1420

1430

1440

1450

1460

1470

1480

Aver

age b

est s

o fa

r

2400

2500

2600

2700

2800

2900

3000

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

GGSACSABC

PSOGWOPGWO

Habermanrsquos Survival

times104

Figure 2 Convergence curves of clustering on data sets over 20 independent runs

CMC data set Seeds data set Heart data set and Winedata set From Table 6 we conclude that for Iris data setPGWO is performing superior to the other algorithms exceptfor GGSA Since the 119901 value for balance scale data set ofPGWO versus GWO is more than 005 there is no statisticaldifference between them both For Haberman Survival dataset while comparing PGWO and the other algorithms wecan conclude that PGWO is significantly performing betterin three out of six groups compared to algorithms So it canbe claimed that PGWO provides better results than the otheralgorithms across the majority of the data sets

Figure 2 shows convergence curves of clustering on datasets over 20 independent runs As can be seen from the figurethe convergence speed of PGWO is the fastest

Figure 3 indicates ANOVA tests of clustering on data setsover 20 independent runs Seen from Figure 3 PGWO isvery stable for the majority of the data sets For Art Seedsand CMC data sets PGWO PSO and GGSA can obtain therelatively stable optimal values For Heart and Wine data

sets the stability of PGWO PSO and CS is outstanding ForCancer data set most of the algorithms can obtain the stableoptimal value except for ABC and PSO algorithms For Irisdata set we can clearly see that PGWO is better in termsof the stability For balance scale data set GWO obtains therelatively stable optimal value For Habermanrsquos Survival dataset the stability of CS and PSO is the best but PGWO andGGSA follow them closely

Clustering results of Art Iris and Survival data sets byPGWOalgorithmare presented in Figure 4which canmake itvisualized clearly It can be seen fromFigure 4 that the PGWOalgorithm possesses superior effect on Art Iris and Survivaldata sets

In summary the results show that the proposed methodsuccessfully outperforms the other algorithms across themajority of benchmark functions Furthermore the testresults of clustering problems show that PGWO is able toprovide very competitive results including the property ofthis algorithm Therefore it appears from this comparative

Discrete Dynamics in Nature and Society 13

Art Iris

CMC Cancer

Seeds Hearttimes104

500

550

600

650

700

750

800

850

900Fi

tnes

s val

ue

CS ABC PSO GWO PGWOGGSAAlgorithms

85

90

95

100

105

110

115

120

125

Fitn

ess v

alue

CS ABC PSO GWO PGWOGGSAAlgorithms

3000

3500

4000

4500

Fitn

ess v

alue

CS ABC PSO GWO PGWOGGSAAlgorithms

5800

6000

6200

6400

6600

6800

7000

7200Fi

tnes

s val

ue

CS ABC PSO GWO PGWOGGSAAlgorithms

106

108

11

112

114

116

118

12

122

124

126

Fitn

ess v

alue

CS ABC PSO GWO PGWOGGSAAlgorithms

CS ABC PSO GWO PGWOGGSAAlgorithms

320

340

360

380

400

420

Fitn

ess v

alue

Figure 3 Continued

14 Discrete Dynamics in Nature and Society

Habermanrsquos Survival

Wine Balance scaletimes104

1423

1424

1425

1426

1427

1428

1429

1430

1431

Fitn

ess v

alue

165

17

175

18

185

19

195

2

Fitn

ess v

alue

CS ABC PSO GWO PGWOGGSAAlgorithms

2570

2580

2590

2600

2610

2620

2630

2640

2650

2660

2670

Fitn

ess v

alue

CS ABC PSO GWO PGWOGGSAAlgorithms

CS ABC PSO GWO PGWOGGSAAlgorithms

Figure 3 ANOVA tests of clustering on data sets over 20 independent runs

study that the proposed method has merit in the field ofevolutionary algorithm and optimization

6 Conclusion and Future Works

In order to apply the grey wolf optimizer to solve complexoptimization problems efficiently this paper proposed a novelgrey wolf optimizer based on Powell local optimizationmethod namely PGWO In PGWO at first original GWOalgorithm is applied to shrink the search region to a morepromising area Thereafter Powellrsquos method is implementedas a critical complement to perform the local search toexploit the limited area intensively to get better solutionsThe PGWO makes an attempt at taking merits of the GWOand Powellrsquos method in order to avoid all grey wolves gettingtrapped in inferior local optimal regionsThe PGWO enablesthe grey wolves to have more diverse exemplars to learnfrom as the grey wolves are updated each generation and

also form new grey wolves to search in a larger searchspace With both techniques combined PGWO can balanceexploration and exploitation and effectively solve complexproblems The experimental results show the effectivenessof Powellrsquos method in terms of solution quality and con-vergence speed The proposed algorithm is benchmarkedon seven well-known test functions and the results arecomparative study with GGSA CS ABC PSO and GWOThe results show that the PGWO algorithm is capable ofproviding very competitive results compared to these famousmetaheuristics Because of the superior performance of thePGWO algorithm we use it to solve clustering problemsThe algorithm has been tested on an artificial data set andeight real data sets To justify the performance of the PGWOalgorithm on clustering problems we compare it with theoriginal GWO GGSA CS ABC PSO and 119870-means Theresults prove that the PGWOalgorithm is able to significantlyoutperform others on the majority of the data sets in terms

Discrete Dynamics in Nature and Society 15

minus6 minus4 minus2 0 2 4 6 8 10

Artificial data set distribution

minus6

minus4

minus2

0

2

4

6

8

10

minus6 minus4 minus2 0 2 4 6 8 10

Artificial data set clustering

minus6

minus4

minus2

0

2

4

6

8

10

4 45 5 55 6 65 7 75 8

Iris data distribution

1

2

3

4

5

6

7

4 45 5 55 6 65 7 75 8

Iris data clustering

1

2

3

4

5

6

7

30 40 50 60 70 80 90

Habermanrsquos Survival data distribution

58

60

62

64

66

68

70

30 40 50 60 70 80 90

Habermanrsquos Survival data clustering

58

60

62

64

66

68

70

Figure 4 The original data distribution of Art Iris and Survival data sets and the clustering results by PGWO algorithm

16 Discrete Dynamics in Nature and Society

of average value and standard deviations of fitness functionMoreover the experimental results demonstrate that theproposed PGWO algorithm can be considered as a feasibleand efficient method to solve optimization problems

Our future work will focus on the two issues On onehand we would apply our proposed PGWO algorithm to testhigher dimensional problems and large number of patternsOn the other hand the PGWO clustering algorithm will alsobe extended to dynamically determine the optimal numberof clusters

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

Thiswork is supported by theNational Natural Science Foun-dation of China under Grants nos 61165015 61463007 and61563008 and the Project of Guangxi University for Nation-alities Science Foundation under Grant no 2012MDZD037

References

[1] J H Holland ldquoGenetic algorithmsrdquo Scientific American vol267 no 1 pp 66ndash72 1992

[2] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquo inProceedings of the IEEE International Conference on Neural Net-works vol 4 pp 1942ndash1948 IEEE Perth Australia November-December 1995

[3] K Price and R Storn ldquoDifferential evolutionrdquo Dr DobbrsquosJournal vol 22 pp 18ndash20 1997

[4] K V Price R M Storn and J A Lampinen Differential Evo-lution A Practical Approach to Global Optimization SpringerNew York NY USA 2005

[5] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999

[6] D HWolpert andWGMacready ldquoNo free lunch theorems foroptimizationrdquo IEEE Transactions on Evolutionary Computationvol 1 no 1 pp 67ndash82 1997

[7] S Mirjalili S M Mirjalili and A Lewis ldquoGrey wolf optimizerrdquoAdvances in Engineering Software vol 69 pp 46ndash61 2014

[8] S Mirjalili S M Mirjalili and A Hatamlou ldquoMulti-verseoptimizer a nature-inspired algorithm for global optimizationrdquoNeural Computing and Applications 2015

[9] S Mirjalili ldquoThe ant lion optimizerrdquo Advances in EngineeringSoftware vol 83 pp 80ndash98 2015

[10] D Karaboga and B Basturk ldquoA powerful and efficient algo-rithm for numerical function optimization artificial bee colony(ABC) algorithmrdquo Journal of Global Optimization vol 39 no 3pp 459ndash471 2007

[11] X-S Yang ldquoFirefly algorithm levy flights and global optimiza-tionrdquo in Research and Development in Intelligent Systems XXVIpp 209ndash218 Springer London UK 2010

[12] X-S Yang and S Deb ldquoCuckoo Search via Levy flightsrdquo in Pro-ceedings of the World Congress on Nature amp Biologically InspiredComputing (NaBIC rsquo09) pp 210ndash214 IEEE Coimbatore IndiaDecember 2009

[13] R Rajabioun ldquoCuckoo optimization algorithmrdquo Applied SoftComputing vol 11 no 8 pp 5508ndash5518 2011

[14] E Rashedi H Nezamabadi-Pour and S Saryazdi ldquoGSA agravitational search algorithmrdquo Information Sciences vol 179no 13 pp 2232ndash2248 2009

[15] S Mirjalili and A Lewis ldquoAdaptive gbest-guided gravitationalsearch algorithmrdquo Neural Computing and Applications vol 25no 7 pp 1569ndash1584 2014

[16] M H Sulaiman Z Mustaffa M R Mohamed and O AlimanldquoUsing the gray wolf optimizer for solving optimal reactivepower dispatch problemrdquo Applied Soft Computing vol 32 pp286ndash292 2015

[17] X H Song L Tang S T Zhao et al ldquoGrey Wolf Optimizerfor parameter estimation in surface wavesrdquo Soil Dynamics andEarthquake Engineering vol 75 pp 147ndash157 2015

[18] G M Komaki and V Kayvanfar ldquoGrey Wolf Optimizer algo-rithm for the two-stage assembly flow shop scheduling problemwith release timerdquo Journal of Computational Science vol 8 pp109ndash120 2015

[19] S Saremi S Z Mirjalili and S MMirjalili ldquoEvolutionary pop-ulation dynamics and grey wolf optimizerrdquo Neural Computingand Applications vol 26 no 5 pp 1257ndash1263 2015

[20] B Mahdad and K Srairi ldquoBlackout risk prevention in a smartgrid based flexible optimal strategy using Grey Wolf-patternsearch algorithmsrdquo Energy Conversion and Management vol98 pp 411ndash429 2015

[21] O Kramer ldquoIterated local search with Powellrsquos methoda memetic algorithm for continuous global optimizationrdquoMemetic Computing vol 2 no 1 pp 69ndash83 2010

[22] M J D Powell ldquoRestart procedures for the conjugate gradientmethodrdquoMathematical Programming vol 12 no 1 pp 241ndash2541977

[23] I E Evangelou D G Hadjimitsis A A Lazakidou and CClayton ldquoData mining and knowledge discovery in compleximage data using artificial neural networksrdquo in Proceedings ofthe Workshop on Complex Reasoning an Geographical DatalPaphos Cyprus 2001

[24] M S Kamel and S Z Selim ldquoNew algorithms for solving thefuzzy clustering problemrdquo Pattern Recognition vol 27 no 3 pp421ndash428 1994

[25] M Omran A Salman and A P Engelbrecht ldquoImage clas-sification using particle swarm optimizationrdquo in Proceedingsof the 4th Asia-Pacific Conference on Simulated Evolution andLearning Singapore 2002

[26] X J Lei Swarm Intelligent Optimization Algorithms and TheirApplications Science Press 2012

[27] A K Jain ldquoData clustering 50 years beyond K-meansrdquo PatternRecognition Letters vol 31 no 8 pp 651ndash666 2010

[28] W Zou Y Zhu H Chen and X Sui ldquoA clustering approachusing cooperative artificial bee colony algorithmrdquo DiscreteDynamics in Nature and Society vol 2010 Article ID 45979616 pages 2010

[29] B Zhang M Hsu and U Dayal ldquoK-Harmonic meansmdasha dataclustering algorithmrdquo Hewlett-Packard Labs Technical ReportHPL 1999

[30] S Z Selim and K Alsultan ldquoA simulated annealing algorithmfor the clustering problemrdquo Pattern Recognition vol 24 no 10pp 1003ndash1008 1991

[31] K S Al-Sultan ldquoA Tabu search approach to the clusteringproblemrdquo Pattern Recognition vol 28 no 9 pp 1443ndash1451 1995

Discrete Dynamics in Nature and Society 17

[32] C S Sung and H W Jin ldquoA tabu-search-based heuristic forclusteringrdquo Pattern Recognition vol 33 no 5 pp 849ndash8582000

[33] M C Cowgill R J Harvey and L T Watson ldquoA genetic algo-rithmapproach to cluster analysisrdquoComputers andMathematicswith Applications vol 37 no 7 pp 99ndash108 1999

[34] K Krishna and M N Murty ldquoGenetic K-means algorithmrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 29 no 3 pp 433ndash439 1999

[35] U Maulik and S Bandyopadhyay ldquoGenetic algorithm-basedclustering techniquerdquo Pattern Recognition vol 33 no 9 pp1455ndash1465 2000

[36] C A Murthy and N Chowdhury ldquoIn search of optimal clustersusing genetic algorithmsrdquoPatternRecognition Letters vol 17 no8 pp 825ndash832 1996

[37] M Fathian B Amiri and A Maroosi ldquoApplication of honey-bee mating optimization algorithm on clusteringrdquo AppliedMathematics and Computation vol 190 no 2 pp 1502ndash15132007

[38] D W Van der Merwe and A P Engelbrecht ldquoData cluster-ing using particle swarm optimizationrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo03) vol 1 pp 215ndash220 IEEE Canberra Australia December 2003

[39] A Hatamlou S Abdullah and M Hatamlou ldquoData clusteringusing big bang-big crunch algorithmrdquo in Innovative ComputingTechnology vol 241 of Communications in Computer andInformation Science pp 383ndash388 Springer 2011

[40] D Karaboga and C Ozturk ldquoA novel clustering approachartificial Bee Colony (ABC) algorithmrdquoApplied Soft ComputingJournal vol 11 no 1 pp 652ndash657 2011

[41] A Hatamlou S Abdullah and H Nezamabadi-Pour ldquoAppli-cation of gravitational search algorithm on data clusteringrdquo inRough Sets andKnowledge Technology vol 6954 of LectureNotesin Computer Science pp 337ndash346 Springer Berlin Germany2011

[42] A Hatamlou S Abdullah and H Nezamabadi-Pour ldquoA com-bined approach for clustering based on K-means and gravita-tional search algorithmsrdquo Swarm and Evolutionary Computa-tion vol 6 pp 47ndash52 2012

[43] P S Shelokar V K Jayaraman and B D Kulkarni ldquoAn antcolony approach for clusteringrdquo Analytica Chimica Acta vol509 no 2 pp 187ndash195 2004

[44] Y Kao and K Cheng ldquoAn ACO-based clustering algorithmrdquoin Ant Colony Optimization and Swarm Intelligence vol 4150of Lecture Notes in Computer Science pp 340ndash347 SpringerBerlin Germany 2006

[45] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[46] T Niknam B Amiri J Olamaei and A Arefi ldquoAn efficienthybrid evolutionary optimization algorithm based on PSO andSA for clusteringrdquo Journal of Zhejiang University Science A vol10 no 4 pp 512ndash519 2009

[47] W-F Gao S-Y Liu and L-L Huang ldquoA novel artificial beecolony algorithm with Powellrsquos methodrdquo Applied Soft Comput-ing vol 13 no 9 pp 3763ndash3775 2013

[48] H R Lourenco O Martin and T Stutzle ldquoA beginnerrsquosintroduction to iterated local searchrdquo in Proceedings of the 4thMetaheuristics International Conference (MIC rsquo01) vol 2 pp 1ndash6 Porto Portugal 2001

[49] T G Stutzle Local Search Algorithms for Combinatorial Prob-lems Analysis Improvements and New Applications vol 220 ofDISKI Dissertations on Artificial Intelligence Infix PublishersSankt Augustin Germany 1999

[50] X S Yang Ed Test Problems in Optimization An Introductionwith Metaheuristic Applications Wiley London UK 2010

[51] M Molga and C Smutnicki ldquoTest functions for optimization-needsrdquo 2005 httpwwwzsdictpwrwrocplfilesdocsfunc-tionspdf

[52] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm intelli-gence algorithmsrdquo Swarm and Evolutionary Computation vol1 no 1 pp 3ndash18 2011

[53] S Garcıa D Molina M Lozano and F Herrera ldquoA study onthe use of non-parametric tests for analyzing the evolutionaryalgorithmsrsquo behaviour a case study on the CECrsquo2005 specialsession on real parameter optimizationrdquo Journal of Heuristicsvol 15 no 6 pp 617ndash644 2009

[54] F Wilcoxon ldquoIndividual comparisons by ranking methodsrdquoBiometrics Bulletin vol 1 no 6 pp 80ndash83 1945

[55] J W Han M Kamber and J Pei Data Mining Concepts andTechniques Morgan Kaufmann Publishers 2011

[56] C Blake and C J Merz ldquoUCI Repository of Machine LearningDatabasesrdquo 1998

[57] T Niknam and B Amiri ldquoAn efficient hybrid approach basedon PSO ACO and k-means for cluster analysisrdquo Applied SoftComputing Journal vol 10 no 1 pp 183ndash197 2010

[58] E Anderson ldquoThe irises of the Gaspe peninsulardquo Bulletin of theAmerican Iris Society vol 59 pp 2ndash5 1935

[59] R A Fisher ldquoThe use of multiple measurements in taxonomicproblemsrdquo Annals of Eugenics vol 7 no 2 pp 179ndash188 1936

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 8: Research Article Grey Wolf Optimizer Based on Powell Local ...downloads.hindawi.com/journals/ddns/2015/481360.pdf · Research Article Grey Wolf Optimizer Based on Powell Local Optimization

[Figure 1: The convergence curves of the average fitness value over 20 independent runs; average best so far versus iteration (0 to 500) on F7, for GGSA, CS, ABC, PSO, GWO, and PGWO.]

In the next section, we apply the PGWO algorithm to solve the clustering problem.

5. Data Clustering

5.1. The Mathematical Description of Data Clustering. The clustering sample set is $X = \{X_i,\ i = 1, 2, \ldots, n\}$, where $X_i$ is a $p$-dimensional vector. The clustering problem is to find a partition $C = \{C_1, C_2, \ldots, C_m\}$ satisfying the following conditions [26]:

(1) $X = \bigcup_{i=1}^{m} C_i$;
(2) $C_i \neq \emptyset$, $i = 1, 2, \ldots, m$;
(3) $C_i \cap C_j = \emptyset$, $i, j = 1, 2, \ldots, m$, $i \neq j$.

5.2. The Clustering Criterion. Clustering is the process of grouping a set of data objects into a number of clusters or groups. The aim of clustering is to make the data within a cluster highly similar while keeping them very dissimilar to objects in other clusters. Dissimilarities and similarities are evaluated based on the attributes of the data set, using a distance metric. The most popular distance metric is the Euclidean distance [55]. Let $i = (x_{i1}, x_{i2}, \ldots, x_{ip})$ and $j = (x_{j1}, x_{j2}, \ldots, x_{jp})$ be two objects described by $p$ numeric attributes; the Euclidean distance between objects $i$ and $j$ is

$$d(i, j) = \sqrt{(x_{i1} - x_{j1})^2 + (x_{i2} - x_{j2})^2 + \cdots + (x_{ip} - x_{jp})^2}. \tag{5}$$

For given $N$ objects, the clustering problem is to minimize the sum of squared Euclidean distances between the objects and their cluster centers, allocating each object to one of $k$ cluster centers [55]. Clustering thus aims at finding the cluster centers by minimizing the objective function, defined as follows [26]:

$$J_c = \sum_{k=1}^{m} \sum_{X_i \in C_k} d(X_i, Z_k), \tag{6}$$

where $m$ indicates the number of clusters, $C_k$ indicates the $k$th cluster, $Z_k$ indicates the center of the $k$th cluster, and $d(X_i, Z_k)$ indicates the distance of the sample to the corresponding cluster center, namely, $d(X_i, Z_k) = \|X_i - Z_k\|$.

5.3. PGWO Algorithm on Clustering. In clustering analysis, each element in the data set is a $p$-dimensional vector. Moreover, the actual position of a grey wolf represents the $k$ cluster centers, so each grey wolf encodes a $k \times p$-dimensional vector. For each grey wolf $i$, its position is denoted as a vector $X_i = (x_{i1}, x_{i2}, \ldots, x_{i,k \times p})$. In the initialization phase, we use the maximum and minimum values of each component of the data set (which is to be grouped) as the initialization search scope of the grey wolves in the PGWO algorithm, and the initial solutions are randomly generated within this range. We use (6) to calculate the fitness of the grey wolf individuals; the main steps of the fitness function are shown in Algorithm 4.

(1) For each data vector $x_i$
(2) Calculate the Euclidean distance by (5)
(3) Assign $x_i$ to the closest cluster center
(4) Calculate the measure function by (6)
(5) End For
(6) Return the value of the fitness function

Algorithm 4: Main steps of the fitness function.
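To make Algorithm 4 concrete, the following sketch implements the fitness evaluation in Python under the position encoding described above; the function name, the use of NumPy, and the vectorized layout are our illustrative choices rather than code from the paper.

```python
import numpy as np

def clustering_fitness(position, data, k):
    """Algorithm 4: decode a wolf position into k cluster centers and
    return the objective J_c of (6), i.e., the sum of Euclidean
    distances from each data vector to its closest center."""
    n, p = data.shape
    centers = position.reshape(k, p)      # each wolf encodes a k*p vector
    # Euclidean distance (5) from every data vector to every center
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    # assign each data vector to the closest cluster center and sum up
    return dists.min(axis=1).sum()
```

Under the initialization described above, a wolf can then be drawn uniformly between the tiled per-feature minima and maxima of the data, for example `np.random.uniform(np.tile(data.min(0), k), np.tile(data.max(0), k))`.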

5.4. Data Clustering Experimental Results and Discussion. In order to verify the performance of the proposed PGWO approach for clustering, we compare the results of the K-means, GGSA, CS, ABC, PSO, GWO, and PGWO clustering algorithms on nine different data sets: an artificial data set and eight data sets selected from the UCI machine learning repository [56].

Artificial data set ($N = 600$, $d = 2$, and $k = 4$) is a two-featured problem with four unique classes. A total of 600 patterns were drawn from four independent bivariate normal distributions, where the classes were distributed according to

$$N_2\left(\mu = \begin{pmatrix} m_i \\ 0 \end{pmatrix},\ \Sigma = \begin{bmatrix} 0.5 & 0.05 \\ 0.05 & 0.5 \end{bmatrix}\right), \tag{7}$$

where $i = 1, 2, 3, 4$, $m_1 = -3$, $m_2 = 0$, $m_3 = 3$, and $m_4 = 6$; $\mu$ and $\Sigma$ are the mean vector and covariance matrix, respectively [45, 57].
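For reproducibility, the artificial data set can be generated directly from (7). The sketch below is our reading of that specification; it assumes an equal split of 150 patterns per class (the paper states only $N = 600$ in total), and the seed and variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)                # fixed seed, for illustration only
means = [-3.0, 0.0, 3.0, 6.0]                 # m_1, ..., m_4 from (7)
cov = np.array([[0.50, 0.05],
                [0.05, 0.50]])                # shared covariance matrix Sigma

# four independent bivariate normal classes, 150 patterns each -> N = 600
X = np.vstack([rng.multivariate_normal([m, 0.0], cov, size=150) for m in means])
y = np.repeat(np.arange(4), 150)              # class labels, used only for plotting
```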

Iris data ($N = 150$, $d = 4$, and $K = 3$) is a data set with 150 random samples of flowers from the Iris species setosa, versicolor, and virginica, collected by Anderson [58]. From each species there are 50 observations for sepal length, sepal width, petal length, and petal width in cm. This data set was used by Fisher [59] in his initiation of the linear-discriminant-function technique [28, 56, 57].

Wisconsin breast cancer ($N = 683$, $d = 9$, and $K = 2$) consists of 683 objects characterized by nine features: clump thickness, cell size uniformity, cell shape uniformity, marginal adhesion, single epithelial cell size, bare nuclei, bland chromatin, normal nucleoli, and mitoses. There are two categories in the data: malignant (444 objects) and benign (239 objects) [28, 56, 57].

Contraceptive method choice ($N = 1473$, $d = 10$, and $K = 3$), CMC for short, is a data set that is a subset of the 1987 National Indonesia Contraceptive Prevalence Survey. The samples are married women who either were not pregnant or did not know if they were at the time of interview. The problem is to predict the current contraceptive method choice (no use, long-term methods, or short-term methods) of a woman based on her demographic and socioeconomic characteristics [28, 56, 57].

Seeds data ($N = 210$, $d = 7$, and $K = 3$) is a data set that consists of 210 patterns belonging to three different varieties of wheat: Kama, Rosa, and Canadian. From each species there are 70 observations for area $A$, perimeter $P$, compactness $C$ ($C = 4\pi A / P^2$), length of kernel, width of kernel, asymmetry coefficient, and length of kernel groove [56].

Statlog (Heart) data ($N = 270$, $d = 13$, and $K = 2$) is a heart disease database similar to a database already present in the repository (heart disease databases) but in a slightly different form [56].

Wine data ($N = 178$, $d = 13$, and $K = 3$) is also taken from the UCI laboratory. These data are the results of a chemical analysis of wines grown in the same region in Italy but derived from three different cultivars. The analysis determined the quantities of 13 constituents found in each of the three types of wines. There are 178 instances with 13 numeric attributes in the wine data set. All attributes are continuous, and there are no missing attribute values [28, 56, 57].

Balance scale data ($N = 625$, $d = 4$, and $K = 3$) is a data set that was generated to model psychological experimental results. Each example is classified as having the balance scale tip to the right, tip to the left, or be balanced. The attributes are the left weight, the left distance, the right weight, and the right distance. The correct way to find the class is to compare (left-distance * left-weight) and (right-distance * right-weight); if they are equal, it is balanced [56], as the sketch below illustrates.
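Since this rule is purely arithmetic, it can be stated in a few lines; the helper below is hypothetical and only illustrates the labeling logic of the data set.

```python
def balance_scale_class(lw, ld, rw, rd):
    """Classify one balance scale example from its four attributes:
    left weight, left distance, right weight, right distance."""
    left, right = ld * lw, rd * rw   # moment on each side of the scale
    if left > right:
        return "L"                   # tips to the left
    if right > left:
        return "R"                   # tips to the right
    return "B"                       # balanced
```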

Haberman's Survival ($N = 306$, $d = 3$, and $K = 2$) is a data set that contains cases from a study conducted between 1958 and 1970 at the University of Chicago's Billings Hospital on the survival of patients who had undergone surgery for breast cancer. It records the survival status of each patient together with the age of the patient at the time of operation, the patient's year of operation, and the number of positive axillary nodes detected [56].

The results of the comparison of intracluster distances for the seven clustering algorithms over 20 independent runs are shown in Table 5. Table 6 reports the p values produced by Wilcoxon's rank-sum test comparing PGWO and the other algorithms over all the data sets.
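The p values in Table 6 follow from applying the rank-sum test [54] to the 20 per-run best fitness values of PGWO and of each competitor. A minimal sketch, assuming such samples of run results are available and using SciPy's implementation (our tooling choice, not the paper's), is:

```python
from scipy.stats import ranksums

def compare_runs(pgwo_runs, other_runs, alpha=0.05):
    """Wilcoxon rank-sum test on two samples of best fitness values;
    p >= alpha means no significant difference (underlined in Table 6)."""
    _, p = ranksums(pgwo_runs, other_runs)
    return p, p < alpha
```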

For the Art data set, the optimum value, the worst value, and the average value of PGWO and GGSA are all 5.1390E+02, while the standard deviation of GGSA is better than that of PGWO. PSO only attains the optimum value 5.1390E+02.

For the Iris, Cancer, and Seeds data sets, PGWO, PSO, and GGSA provide the optimum value in comparison to those obtained by the other methods. Moreover, the worst value, the average value, and the standard deviation of PGWO are superior to those of the other methods.

For the Heart data set, PGWO, PSO, and CS all find the optimum solution 1.0623E+04, which means they all can find the global solution. However, PGWO is slightly better in terms of the worst value, the average value, and the standard deviation.

For the CMC and Wine data sets, Table 5 shows that the mean, best, worst, and standard deviation values of the fitness function for the PGWO algorithm are much smaller than those of the other six methods. The PGWO clustering algorithm is capable of providing the same partition of the data points in all runs.

For the balance scale data set, the optimum values of the fitness function for PGWO, PSO, and GGSA are 1.4238E+03, which means they all can find the global solution. The optimum values of the fitness function for GWO and K-means are 1.4239E+03, a result close to that of PGWO, PSO, and GGSA. However, the standard deviation values of GWO, K-means, PGWO, PSO, GGSA, and CS are 6.8756E-01, 3.6565E+00, 8.6619E-01, 9.7396E-01, 1.6015E+00, and 7.3239E-01, respectively. From the standard deviations we can see that GWO is more stable than the other methods.

For Haberman's Survival data set, the optimum value, the worst value, and the average value of the fitness function for CS and PSO are almost the same. The results of the CS and PSO algorithms are better than those of the other methods.


Table 5: Comparison of intracluster distances for the seven clustering algorithms over 20 independent runs.

Data set           Criteria  K-means     GGSA        CS          ABC         PSO         GWO         PGWO
Art                Best      5.1454E+02  5.1390E+02  5.1391E+02  5.1752E+02  5.1390E+02  5.1455E+02  5.1390E+02
                   Worst     9.0474E+02  5.1390E+02  5.3213E+02  5.5298E+02  8.9248E+02  5.2058E+02  5.1390E+02
                   Mean      6.0974E+02  5.1390E+02  5.1597E+02  5.2620E+02  5.3283E+02  5.1687E+02  5.1390E+02
                   Std       1.6903E+02  1.5430E-13  4.8536E+00  9.7988E+00  8.4652E+01  1.8768E+00  6.1243E-08
Iris               Best      9.7190E+01  9.6655E+01  9.6659E+01  9.7616E+01  9.6655E+01  9.6835E+01  9.6655E+01
                   Worst     1.2318E+02  1.1839E+02  9.7684E+01  1.0401E+02  1.2767E+02  1.2108E+02  9.6655E+01
                   Mean      1.0226E+02  9.8865E+01  9.6862E+01  9.9710E+01  1.0596E+02  9.9903E+01  9.6655E+01
                   Std       1.0290E+01  5.5454E+00  3.1103E-01  1.8460E+00  1.4578E+01  6.7482E+00  8.5869E-10
Cancer             Best      2.9763E+03  2.9644E+03  2.9645E+03  2.9887E+03  2.9644E+03  2.9645E+03  2.9644E+03
                   Worst     2.9884E+03  3.0611E+03  2.9677E+03  3.2650E+03  4.7288E+03  2.9650E+03  2.9644E+03
                   Mean      2.9830E+03  2.9715E+03  2.9651E+03  3.0760E+03  3.4055E+03  2.9647E+03  2.9644E+03
                   Std       4.8278E+00  2.1284E+01  7.3222E-01  6.5283E+01  7.8385E+02  1.1452E-01  1.6118E-08
CMC                Best      5.7016E+03  5.6938E+03  5.7054E+03  5.8219E+03  5.6942E+03  5.8236E+03  5.6937E+03
                   Worst     5.7053E+03  5.7111E+03  5.7762E+03  6.3743E+03  7.1580E+03  6.0911E+03  5.6937E+03
                   Mean      5.7040E+03  5.6971E+03  5.7263E+03  6.0868E+03  5.7690E+03  5.9239E+03  5.6937E+03
                   Std       1.1022E+00  3.7872E+00  2.1119E+01  1.7292E+02  3.2694E+02  9.0635E+01  9.8777E-04
Seeds              Best      3.1322E+02  3.1180E+02  3.1187E+02  3.1573E+02  3.1180E+02  3.1323E+02  3.1180E+02
                   Worst     3.1373E+02  3.1354E+02  3.1746E+02  3.4997E+02  4.2035E+02  3.1782E+02  3.1180E+02
                   Mean      3.1337E+02  3.1189E+02  3.1347E+02  3.3161E+02  3.2813E+02  3.1475E+02  3.1180E+02
                   Std       2.3686E-01  3.8862E-01  1.6383E+00  1.0943E+01  3.9697E+01  1.5716E+00  1.7815E-09
Heart              Best      1.0682E+04  1.0749E+04  1.0623E+04  1.0683E+04  1.0623E+04  1.0637E+04  1.0623E+04
                   Worst     1.0701E+04  1.2684E+04  1.0625E+04  1.1622E+04  1.0626E+04  1.0683E+04  1.0624E+04
                   Mean      1.0693E+04  1.1542E+04  1.0624E+04  1.0920E+04  1.0624E+04  1.0657E+04  1.0623E+04
                   Std       8.1672E+00  5.7044E+02  4.6239E-01  2.5921E+02  7.6866E-01  1.3817E+01  2.3235E-01
Wine               Best      1.6385E+04  1.6493E+04  1.6296E+04  1.6566E+04  1.6294E+04  1.6316E+04  1.6292E+04
                   Worst     1.8437E+04  2.0245E+04  1.6311E+04  1.7668E+04  1.6312E+04  1.6371E+04  1.6294E+04
                   Mean      1.6974E+04  1.7999E+04  1.6301E+04  1.7010E+04  1.6297E+04  1.6345E+04  1.6293E+04
                   Std       8.6757E+02  1.1562E+03  4.5677E+00  3.6824E+02  4.0149E+00  1.4836E+01  5.2338E-01
Balance scale      Best      1.4239E+03  1.4238E+03  1.4256E+03  1.4265E+03  1.4238E+03  1.4239E+03  1.4238E+03
                   Worst     1.4337E+03  1.4291E+03  1.4285E+03  1.4310E+03  1.4262E+03  1.4260E+03  1.4257E+03
                   Mean      1.4275E+03  1.4259E+03  1.4268E+03  1.4282E+03  1.4248E+03  1.4243E+03  1.4245E+03
                   Std       3.6565E+00  1.6015E+00  7.3239E-01  1.1831E+00  9.7396E-01  6.8756E-01  8.6619E-01
Haberman-Survival  Best      2.6251E+03  2.5670E+03  2.5670E+03  2.5671E+03  2.5670E+03  2.5673E+03  2.5670E+03
                   Worst     3.1966E+03  2.6226E+03  2.5670E+03  2.5709E+03  2.5678E+03  2.6686E+03  2.5678E+03
                   Mean      2.6554E+03  2.5702E+03  2.5670E+03  2.5679E+03  2.5671E+03  2.5898E+03  2.5673E+03
                   Std       1.2740E+02  1.2354E+01  1.7590E-06  8.3107E-01  3.0624E-01  2.3346E+01  4.0711E-01

Table 6: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, GGSA, and K-means over the data sets (entries with p >= 0.05, marked *, were underlined in the original).

PGWO vs.           GWO        PSO        ABC        CS         GGSA       K-means
Art                6.80E-08   1.60E-05   6.80E-08   6.80E-08   6.11E-08   4.49E-08
Iris               6.75E-08   1.19E-06   6.75E-08   6.75E-08   0.107346*  5.17E-08
Cancer             6.76E-08   6.76E-08   6.76E-08   6.76E-08   1.2E-06    5.98E-08
CMC                6.69E-08   6.69E-08   6.69E-08   6.69E-08   6.69E-08   3.39E-08
Seeds              6.80E-08   6.80E-08   6.80E-08   6.80E-08   1.23E-03   4.81E-08
Heart              6.61E-08   2.02E-06   6.61E-08   5.87E-07   6.61E-08   6.07E-08
Wine               6.80E-08   1.23E-07   6.80E-08   6.80E-08   6.80E-08   6.63E-08
Balance scale      0.067868*  0.007712   6.8E-08    1.06E-07   0.003966   6.01E-07
Haberman-Survival  6.90E-07   0.524948*  1.78E-03   0.860418*  0.090604*  6.26E-08

[Figure 2: Convergence curves of clustering on data sets over 20 independent runs; average best so far versus iteration (0 to 200) on the Art, Iris, Cancer, CMC, Seeds, Heart, Wine, Balance scale, and Haberman's Survival data sets, for GGSA, CS, ABC, PSO, GWO, and PGWO.]

The p values in Table 6 show that the results of PGWO are significantly better on the Art, Cancer, CMC, Seeds, Heart, and Wine data sets. From Table 6, we also conclude that for the Iris data set PGWO performs superior to the other algorithms except for GGSA. Since the p value for the balance scale data set of PGWO versus GWO is more than 0.05, there is no statistical difference between the two. For Haberman's Survival data set, comparing PGWO against the other algorithms, PGWO performs significantly better in three out of six pairwise comparisons. So it can be claimed that PGWO provides better results than the other algorithms across the majority of the data sets.

Figure 2 shows the convergence curves of clustering on the data sets over 20 independent runs. As can be seen from the figure, the convergence speed of PGWO is the fastest.

Figure 3 shows ANOVA tests of clustering on the data sets over 20 independent runs. As seen from Figure 3, PGWO is very stable on the majority of the data sets. For the Art, Seeds, and CMC data sets, PGWO, PSO, and GGSA obtain relatively stable optimal values. For the Heart and Wine data sets, the stability of PGWO, PSO, and CS is outstanding. For the Cancer data set, most of the algorithms obtain a stable optimal value, except for the ABC and PSO algorithms. For the Iris data set, we can clearly see that PGWO is better in terms of stability. For the balance scale data set, GWO obtains a relatively stable optimal value. For Haberman's Survival data set, the stability of CS and PSO is the best, but PGWO and GGSA follow them closely.

Clustering results of the Art, Iris, and Survival data sets obtained by the PGWO algorithm are presented in Figure 4, which makes them easy to visualize. It can be seen from Figure 4 that the PGWO algorithm produces good partitions on the Art, Iris, and Survival data sets.

In summary, the results show that the proposed method successfully outperforms the other algorithms across the majority of benchmark functions. Furthermore, the test results on clustering problems show that PGWO is able to provide very competitive results, confirming the properties of this algorithm. Therefore, it appears from this comparative

[Figure 3: ANOVA tests of clustering on data sets over 20 independent runs; fitness value distributions per algorithm (GGSA, CS, ABC, PSO, GWO, and PGWO) on the Art, Iris, Cancer, CMC, Seeds, Heart, Wine, Balance scale, and Haberman's Survival data sets.]

study that the proposed method has merit in the field of evolutionary algorithms and optimization.

6. Conclusion and Future Works

In order to apply the grey wolf optimizer to solve complex optimization problems efficiently, this paper proposed a novel grey wolf optimizer based on Powell local optimization method, namely, PGWO. In PGWO, the original GWO algorithm is first applied to shrink the search region to a more promising area. Thereafter, Powell's method is implemented as a critical complement, performing a local search that exploits the limited area intensively to obtain better solutions. PGWO attempts to combine the merits of GWO and Powell's method in order to avoid all grey wolves getting trapped in inferior local optimal regions. PGWO enables the grey wolves to have more diverse exemplars to learn from, as the grey wolves are updated each generation and also form new grey wolves to search in a larger search space. With both techniques combined, PGWO can balance exploration and exploitation and effectively solve complex problems. The experimental results show the effectiveness of Powell's method in terms of solution quality and convergence speed. The proposed algorithm was benchmarked on seven well-known test functions, and the results were compared with GGSA, CS, ABC, PSO, and GWO. The results show that the PGWO algorithm is capable of providing very competitive results compared to these famous metaheuristics. Because of the superior performance of the PGWO algorithm, we used it to solve clustering problems. The algorithm has been tested on an artificial data set and eight real data sets. To justify the performance of the PGWO algorithm on clustering problems, we compared it with the original GWO, GGSA, CS, ABC, PSO, and K-means. The results prove that the PGWO algorithm is able to significantly outperform the others on the majority of the data sets in terms of average value and standard deviation of the fitness function. Moreover, the experimental results demonstrate that the proposed PGWO algorithm can be considered a feasible and efficient method for solving optimization problems.
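To make this division of labour concrete, the sketch below pairs a generic GWO position update with SciPy's Powell routine as the local refiner. The helper names, and the choice to refine only the current best (alpha) wolf in each generation, are our assumptions for illustration, not a verbatim transcription of the paper's pseudocode.

```python
import numpy as np
from scipy.optimize import minimize

def pgwo_generation(wolves, fitness, gwo_update):
    """One PGWO generation: a global GWO step followed by Powell exploitation."""
    wolves = gwo_update(wolves)                      # standard GWO update (exploration)
    scores = np.apply_along_axis(fitness, 1, wolves)
    best = int(np.argmin(scores))
    # Powell's method intensively exploits the region around the alpha wolf
    result = minimize(fitness, wolves[best], method="Powell")
    if result.fun < scores[best]:
        wolves[best] = result.x                      # keep the refined solution
    return wolves
```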

[Figure 4: The original data distribution of the Art, Iris, and Survival data sets and the clustering results by the PGWO algorithm; one panel pair (distribution, clustering) per data set.]



Our future work will focus on two issues. On the one hand, we will apply the proposed PGWO algorithm to higher dimensional problems and to data sets with a large number of patterns. On the other hand, the PGWO clustering algorithm will be extended to dynamically determine the optimal number of clusters.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grants nos. 61165015, 61463007, and 61563008 and the Project of Guangxi University for Nationalities Science Foundation under Grant no. 2012MDZD037.

References

[1] J. H. Holland, "Genetic algorithms," Scientific American, vol. 267, no. 1, pp. 66-72, 1992.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942-1948, IEEE, Perth, Australia, November-December 1995.
[3] K. Price and R. Storn, "Differential evolution," Dr. Dobb's Journal, vol. 22, pp. 18-20, 1997.
[4] K. V. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, New York, NY, USA, 2005.
[5] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.
[6] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67-82, 1997.
[7] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.
[8] S. Mirjalili, S. M. Mirjalili, and A. Hatamlou, "Multi-verse optimizer: a nature-inspired algorithm for global optimization," Neural Computing and Applications, 2015.
[9] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80-98, 2015.
[10] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459-471, 2007.
[11] X.-S. Yang, "Firefly algorithm, Levy flights and global optimization," in Research and Development in Intelligent Systems XXVI, pp. 209-218, Springer, London, UK, 2010.
[12] X.-S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210-214, IEEE, Coimbatore, India, December 2009.
[13] R. Rajabioun, "Cuckoo optimization algorithm," Applied Soft Computing, vol. 11, no. 8, pp. 5508-5518, 2011.
[14] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, "GSA: a gravitational search algorithm," Information Sciences, vol. 179, no. 13, pp. 2232-2248, 2009.
[15] S. Mirjalili and A. Lewis, "Adaptive gbest-guided gravitational search algorithm," Neural Computing and Applications, vol. 25, no. 7, pp. 1569-1584, 2014.
[16] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286-292, 2015.
[17] X. H. Song, L. Tang, S. T. Zhao et al., "Grey Wolf Optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147-157, 2015.
[18] G. M. Komaki and V. Kayvanfar, "Grey Wolf Optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109-120, 2015.
[19] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257-1263, 2015.
[20] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using Grey Wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411-429, 2015.
[21] O. Kramer, "Iterated local search with Powell's method: a memetic algorithm for continuous global optimization," Memetic Computing, vol. 2, no. 1, pp. 69-83, 2010.
[22] M. J. D. Powell, "Restart procedures for the conjugate gradient method," Mathematical Programming, vol. 12, no. 1, pp. 241-254, 1977.
[23] I. E. Evangelou, D. G. Hadjimitsis, A. A. Lazakidou, and C. Clayton, "Data mining and knowledge discovery in complex image data using artificial neural networks," in Proceedings of the Workshop on Complex Reasoning on Geographical Data, Paphos, Cyprus, 2001.
[24] M. S. Kamel and S. Z. Selim, "New algorithms for solving the fuzzy clustering problem," Pattern Recognition, vol. 27, no. 3, pp. 421-428, 1994.
[25] M. Omran, A. Salman, and A. P. Engelbrecht, "Image classification using particle swarm optimization," in Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning, Singapore, 2002.
[26] X. J. Lei, Swarm Intelligent Optimization Algorithms and Their Applications, Science Press, 2012.
[27] A. K. Jain, "Data clustering: 50 years beyond K-means," Pattern Recognition Letters, vol. 31, no. 8, pp. 651-666, 2010.
[28] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.
[29] B. Zhang, M. Hsu, and U. Dayal, "K-Harmonic means: a data clustering algorithm," Hewlett-Packard Labs Technical Report HPL, 1999.
[30] S. Z. Selim and K. Alsultan, "A simulated annealing algorithm for the clustering problem," Pattern Recognition, vol. 24, no. 10, pp. 1003-1008, 1991.
[31] K. S. Al-Sultan, "A Tabu search approach to the clustering problem," Pattern Recognition, vol. 28, no. 9, pp. 1443-1451, 1995.
[32] C. S. Sung and H. W. Jin, "A tabu-search-based heuristic for clustering," Pattern Recognition, vol. 33, no. 5, pp. 849-858, 2000.
[33] M. C. Cowgill, R. J. Harvey, and L. T. Watson, "A genetic algorithm approach to cluster analysis," Computers and Mathematics with Applications, vol. 37, no. 7, pp. 99-108, 1999.
[34] K. Krishna and M. N. Murty, "Genetic K-means algorithm," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 29, no. 3, pp. 433-439, 1999.
[35] U. Maulik and S. Bandyopadhyay, "Genetic algorithm-based clustering technique," Pattern Recognition, vol. 33, no. 9, pp. 1455-1465, 2000.
[36] C. A. Murthy and N. Chowdhury, "In search of optimal clusters using genetic algorithms," Pattern Recognition Letters, vol. 17, no. 8, pp. 825-832, 1996.
[37] M. Fathian, B. Amiri, and A. Maroosi, "Application of honey-bee mating optimization algorithm on clustering," Applied Mathematics and Computation, vol. 190, no. 2, pp. 1502-1513, 2007.
[38] D. W. Van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215-220, IEEE, Canberra, Australia, December 2003.
[39] A. Hatamlou, S. Abdullah, and M. Hatamlou, "Data clustering using big bang-big crunch algorithm," in Innovative Computing Technology, vol. 241 of Communications in Computer and Information Science, pp. 383-388, Springer, 2011.
[40] D. Karaboga and C. Ozturk, "A novel clustering approach: artificial Bee Colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652-657, 2011.
[41] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "Application of gravitational search algorithm on data clustering," in Rough Sets and Knowledge Technology, vol. 6954 of Lecture Notes in Computer Science, pp. 337-346, Springer, Berlin, Germany, 2011.
[42] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "A combined approach for clustering based on K-means and gravitational search algorithms," Swarm and Evolutionary Computation, vol. 6, pp. 47-52, 2012.
[43] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187-195, 2004.
[44] Y. Kao and K. Cheng, "An ACO-based clustering algorithm," in Ant Colony Optimization and Swarm Intelligence, vol. 4150 of Lecture Notes in Computer Science, pp. 340-347, Springer, Berlin, Germany, 2006.
[45] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754-1762, 2008.
[46] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512-519, 2009.
[47] W.-F. Gao, S.-Y. Liu, and L.-L. Huang, "A novel artificial bee colony algorithm with Powell's method," Applied Soft Computing, vol. 13, no. 9, pp. 3763-3775, 2013.
[48] H. R. Lourenço, O. Martin, and T. Stützle, "A beginner's introduction to iterated local search," in Proceedings of the 4th Metaheuristics International Conference (MIC '01), vol. 2, pp. 1-6, Porto, Portugal, 2001.
[49] T. G. Stützle, Local Search Algorithms for Combinatorial Problems: Analysis, Improvements, and New Applications, vol. 220 of DISKI Dissertations on Artificial Intelligence, Infix Publishers, Sankt Augustin, Germany, 1999.
[50] X. S. Yang, Ed., Test Problems in Optimization: An Introduction with Metaheuristic Applications, Wiley, London, UK, 2010.
[51] M. Molga and C. Smutnicki, "Test functions for optimization needs," 2005, http://www.zsd.ict.pwr.wroc.pl/files/docs/functions.pdf.
[52] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3-18, 2011.
[53] S. García, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 special session on real parameter optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617-644, 2009.
[54] F. Wilcoxon, "Individual comparisons by ranking methods," Biometrics Bulletin, vol. 1, no. 6, pp. 80-83, 1945.
[55] J. W. Han, M. Kamber, and J. Pei, Data Mining: Concepts and Techniques, Morgan Kaufmann Publishers, 2011.
[56] C. Blake and C. J. Merz, "UCI Repository of Machine Learning Databases," 1998.
[57] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183-197, 2010.
[58] E. Anderson, "The irises of the Gaspé peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2-5, 1935.
[59] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, no. 2, pp. 179-188, 1936.


[44] Y Kao and K Cheng ldquoAn ACO-based clustering algorithmrdquoin Ant Colony Optimization and Swarm Intelligence vol 4150of Lecture Notes in Computer Science pp 340ndash347 SpringerBerlin Germany 2006

[45] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[46] T Niknam B Amiri J Olamaei and A Arefi ldquoAn efficienthybrid evolutionary optimization algorithm based on PSO andSA for clusteringrdquo Journal of Zhejiang University Science A vol10 no 4 pp 512ndash519 2009

[47] W-F Gao S-Y Liu and L-L Huang ldquoA novel artificial beecolony algorithm with Powellrsquos methodrdquo Applied Soft Comput-ing vol 13 no 9 pp 3763ndash3775 2013

[48] H R Lourenco O Martin and T Stutzle ldquoA beginnerrsquosintroduction to iterated local searchrdquo in Proceedings of the 4thMetaheuristics International Conference (MIC rsquo01) vol 2 pp 1ndash6 Porto Portugal 2001

[49] T G Stutzle Local Search Algorithms for Combinatorial Prob-lems Analysis Improvements and New Applications vol 220 ofDISKI Dissertations on Artificial Intelligence Infix PublishersSankt Augustin Germany 1999

[50] X S Yang Ed Test Problems in Optimization An Introductionwith Metaheuristic Applications Wiley London UK 2010

[51] M Molga and C Smutnicki ldquoTest functions for optimization-needsrdquo 2005 httpwwwzsdictpwrwrocplfilesdocsfunc-tionspdf

[52] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm intelli-gence algorithmsrdquo Swarm and Evolutionary Computation vol1 no 1 pp 3ndash18 2011

[53] S Garcıa D Molina M Lozano and F Herrera ldquoA study onthe use of non-parametric tests for analyzing the evolutionaryalgorithmsrsquo behaviour a case study on the CECrsquo2005 specialsession on real parameter optimizationrdquo Journal of Heuristicsvol 15 no 6 pp 617ndash644 2009

[54] F Wilcoxon ldquoIndividual comparisons by ranking methodsrdquoBiometrics Bulletin vol 1 no 6 pp 80ndash83 1945

[55] J W Han M Kamber and J Pei Data Mining Concepts andTechniques Morgan Kaufmann Publishers 2011

[56] C Blake and C J Merz ldquoUCI Repository of Machine LearningDatabasesrdquo 1998

[57] T Niknam and B Amiri ldquoAn efficient hybrid approach basedon PSO ACO and k-means for cluster analysisrdquo Applied SoftComputing Journal vol 10 no 1 pp 183ndash197 2010

[58] E Anderson ldquoThe irises of the Gaspe peninsulardquo Bulletin of theAmerican Iris Society vol 59 pp 2ndash5 1935

[59] R A Fisher ldquoThe use of multiple measurements in taxonomicproblemsrdquo Annals of Eugenics vol 7 no 2 pp 179ndash188 1936


Table 5: Comparison of intracluster distances for the seven clustering algorithms over 20 independent runs.

Data set           Criteria  K-means     GGSA        CS          ABC         PSO         GWO         PGWO
Art                Best      5.1454E+02  5.1390E+02  5.1391E+02  5.1752E+02  5.1390E+02  5.1455E+02  5.1390E+02
                   Worst     9.0474E+02  5.1390E+02  5.3213E+02  5.5298E+02  8.9248E+02  5.2058E+02  5.1390E+02
                   Mean      6.0974E+02  5.1390E+02  5.1597E+02  5.2620E+02  5.3283E+02  5.1687E+02  5.1390E+02
                   Std       1.6903E+02  1.5430E-13  4.8536E+00  9.7988E+00  8.4652E+01  1.8768E+00  6.1243E-08
Iris               Best      9.7190E+01  9.6655E+01  9.6659E+01  9.7616E+01  9.6655E+01  9.6835E+01  9.6655E+01
                   Worst     1.2318E+02  1.1839E+02  9.7684E+01  1.0401E+02  1.2767E+02  1.2108E+02  9.6655E+01
                   Mean      1.0226E+02  9.8865E+01  9.6862E+01  9.9710E+01  1.0596E+02  9.9903E+01  9.6655E+01
                   Std       1.0290E+01  5.5454E+00  3.1103E-01  1.8460E+00  1.4578E+01  6.7482E+00  8.5869E-10
Cancer             Best      2.9763E+03  2.9644E+03  2.9645E+03  2.9887E+03  2.9644E+03  2.9645E+03  2.9644E+03
                   Worst     2.9884E+03  3.0611E+03  2.9677E+03  3.2650E+03  4.7288E+03  2.9650E+03  2.9644E+03
                   Mean      2.9830E+03  2.9715E+03  2.9651E+03  3.0760E+03  3.4055E+03  2.9647E+03  2.9644E+03
                   Std       4.8278E+00  2.1284E+01  7.3222E-01  6.5283E+01  7.8385E+02  1.1452E-01  1.6118E-08
CMC                Best      5.7016E+03  5.6938E+03  5.7054E+03  5.8219E+03  5.6942E+03  5.8236E+03  5.6937E+03
                   Worst     5.7053E+03  5.7111E+03  5.7762E+03  6.3743E+03  7.1580E+03  6.0911E+03  5.6937E+03
                   Mean      5.7040E+03  5.6971E+03  5.7263E+03  6.0868E+03  5.7690E+03  5.9239E+03  5.6937E+03
                   Std       1.1022E+00  3.7872E+00  2.1119E+01  1.7292E+02  3.2694E+02  9.0635E+01  9.8777E-04
Seeds              Best      3.1322E+02  3.1180E+02  3.1187E+02  3.1573E+02  3.1180E+02  3.1323E+02  3.1180E+02
                   Worst     3.1373E+02  3.1354E+02  3.1746E+02  3.4997E+02  4.2035E+02  3.1782E+02  3.1180E+02
                   Mean      3.1337E+02  3.1189E+02  3.1347E+02  3.3161E+02  3.2813E+02  3.1475E+02  3.1180E+02
                   Std       2.3686E-01  3.8862E-01  1.6383E+00  1.0943E+01  3.9697E+01  1.5716E+00  1.7815E-09
Heart              Best      1.0682E+04  1.0749E+04  1.0623E+04  1.0683E+04  1.0623E+04  1.0637E+04  1.0623E+04
                   Worst     1.0701E+04  1.2684E+04  1.0625E+04  1.1622E+04  1.0626E+04  1.0683E+04  1.0624E+04
                   Mean      1.0693E+04  1.1542E+04  1.0624E+04  1.0920E+04  1.0624E+04  1.0657E+04  1.0623E+04
                   Std       8.1672E+00  5.7044E+02  4.6239E-01  2.5921E+02  7.6866E-01  1.3817E+01  2.3235E-01
Wine               Best      1.6385E+04  1.6493E+04  1.6296E+04  1.6566E+04  1.6294E+04  1.6316E+04  1.6292E+04
                   Worst     1.8437E+04  2.0245E+04  1.6311E+04  1.7668E+04  1.6312E+04  1.6371E+04  1.6294E+04
                   Mean      1.6974E+04  1.7999E+04  1.6301E+04  1.7010E+04  1.6297E+04  1.6345E+04  1.6293E+04
                   Std       8.6757E+02  1.1562E+03  4.5677E+00  3.6824E+02  4.0149E+00  1.4836E+01  5.2338E-01
Balance scale      Best      1.4239E+03  1.4238E+03  1.4256E+03  1.4265E+03  1.4238E+03  1.4239E+03  1.4238E+03
                   Worst     1.4337E+03  1.4291E+03  1.4285E+03  1.4310E+03  1.4262E+03  1.4260E+03  1.4257E+03
                   Mean      1.4275E+03  1.4259E+03  1.4268E+03  1.4282E+03  1.4248E+03  1.4243E+03  1.4245E+03
                   Std       3.6565E+00  1.6015E+00  7.3239E-01  1.1831E+00  9.7396E-01  6.8756E-01  8.6619E-01
Haberman-Survival  Best      2.6251E+03  2.5670E+03  2.5670E+03  2.5671E+03  2.5670E+03  2.5673E+03  2.5670E+03
                   Worst     3.1966E+03  2.6226E+03  2.5670E+03  2.5709E+03  2.5678E+03  2.6686E+03  2.5678E+03
                   Mean      2.6554E+03  2.5702E+03  2.5670E+03  2.5679E+03  2.5671E+03  2.5898E+03  2.5673E+03
                   Std       1.2740E+02  1.2354E+01  1.7590E-06  8.3107E-01  3.0624E-01  2.3346E+01  4.0711E-01
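For reference, the intracluster distance criterion reported in Table 5 sums, over all samples, the Euclidean distance from each sample to the centroid of the cluster it is assigned to; lower values indicate more compact clusters. A minimal sketch of this criterion follows (the function name and the NumPy-based layout are our own illustration, not code from the paper):

```python
import numpy as np

def intracluster_distance(data, centroids):
    """Sum of Euclidean distances from each sample to its nearest centroid.

    data: (n, d) array of samples; centroids: (k, d) array of cluster centers.
    """
    # Pairwise distances between every sample and every centroid: shape (n, k).
    dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    # Each sample is assigned to its nearest centroid; sum those distances.
    return dists.min(axis=1).sum()
```

On the Iris data set (150 samples, four features, k = 3 centroids), this criterion yields values of the order of those shown in the Iris rows of Table 5.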

Table 6: p values produced by Wilcoxon's rank-sum test comparing PGWO versus GWO, PSO, ABC, CS, GGSA, and K-means over the data sets (values with p >= 0.05 are marked with an asterisk).

PGWO vs.           GWO        PSO        ABC       CS         GGSA       K-means
Art                6.80E-08   1.60E-05   6.80E-08  6.80E-08   6.11E-08   4.49E-08
Iris               6.75E-08   1.19E-06   6.75E-08  6.75E-08   0.107346*  5.17E-08
Cancer             6.76E-08   6.76E-08   6.76E-08  6.76E-08   1.2E-06    5.98E-08
CMC                6.69E-08   6.69E-08   6.69E-08  6.69E-08   6.69E-08   3.39E-08
Seeds              6.80E-08   6.80E-08   6.80E-08  6.80E-08   1.23E-03   4.81E-08
Heart              6.61E-08   2.02E-06   6.61E-08  5.87E-07   6.61E-08   6.07E-08
Wine               6.80E-08   1.23E-07   6.80E-08  6.80E-08   6.80E-08   6.63E-08
Balance scale      0.067868*  0.007712   6.8E-08   1.06E-07   0.003966   6.01E-07
Haberman-Survival  6.90E-07   0.524948*  1.78E-03  0.860418*  0.090604*  6.26E-08

Figure 2: Convergence curves of clustering on the data sets over 20 independent runs. One panel per data set (Art, Iris, Cancer, CMC, Seeds, Heart, Wine, Balance scale, and Haberman's Survival); each panel plots the average best fitness so far against the iteration number (0 to 200) for GGSA, CS, ABC, PSO, GWO, and PGWO.

CMC data set, Seeds data set, Heart data set, and Wine data set. From Table 6 we conclude that, for the Iris data set, PGWO performs significantly better than all of the other algorithms except GGSA. Since the p value for the balance scale data set of PGWO versus GWO is greater than 0.05, there is no statistically significant difference between the two. For the Haberman's Survival data set, comparing PGWO with the other algorithms shows that PGWO performs significantly better in three of the six pairwise comparisons. So it can be claimed that PGWO provides better results than the other algorithms across the majority of the data sets.
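To illustrate how entries such as those in Table 6 can be computed, the following sketch applies Wilcoxon's rank-sum test [54] to two samples of 20 per-run best fitness values; the data below are placeholders, and SciPy is assumed to be available:

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
# Placeholder samples standing in for the 20 per-run best fitness values
# of PGWO and GWO on one data set (the real runs underlie Table 5).
pgwo_runs = rng.normal(loc=96.655, scale=1e-6, size=20)
gwo_runs = rng.normal(loc=99.903, scale=6.748, size=20)

stat, p = ranksums(pgwo_runs, gwo_runs)
print(f"rank-sum statistic = {stat:.3f}, p value = {p:.3g}")
# p >= 0.05 means no statistically significant difference at the 5% level
# (the asterisked entries in Table 6); a smaller p indicates a real difference.
```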

Figure 2 shows the convergence curves of clustering on the data sets over 20 independent runs. As can be seen from the figure, PGWO has the fastest convergence speed.

Figure 3 shows ANOVA tests of clustering on the data sets over 20 independent runs. As seen from Figure 3, PGWO is very stable on the majority of the data sets. For the Art, Seeds, and CMC data sets, PGWO, PSO, and GGSA obtain relatively stable optimal values. For the Heart and Wine data sets, the stability of PGWO, PSO, and CS is outstanding. For the Cancer data set, most of the algorithms obtain the stable optimal value, except for the ABC and PSO algorithms. For the Iris data set, we can clearly see that PGWO is better in terms of stability. For the balance scale data set, GWO obtains a relatively stable optimal value. For the Haberman's Survival data set, the stability of CS and PSO is the best, but PGWO and GGSA follow them closely.
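As a complement to these stability comparisons, the per-run fitness values can also be compared with a one-way ANOVA; the sketch below uses placeholder samples and SciPy's f_oneway, which is our own illustration rather than the paper's test procedure:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
# Placeholder per-run fitness samples for three algorithms, 20 runs each,
# standing in for the values summarized by the panels of Figure 3.
pgwo = rng.normal(96.66, 0.001, 20)
pso = rng.normal(105.96, 14.58, 20)
cs = rng.normal(96.86, 0.31, 20)

f_stat, p = f_oneway(pgwo, pso, cs)
print(f"F = {f_stat:.2f}, p = {p:.3g}")
# A small p value indicates that mean fitness differs across the algorithms;
# the spread of each sample reflects the stability discussed above.
```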

The clustering results obtained by the PGWO algorithm on the Art, Iris, and Survival data sets are presented in Figure 4, which visualizes them clearly. It can be seen from Figure 4 that the PGWO algorithm performs very well on the Art, Iris, and Survival data sets.

In summary, the results show that the proposed method successfully outperforms the other algorithms across the majority of the benchmark functions. Furthermore, the test results on the clustering problems show that PGWO is able to provide very competitive results, consistent with the properties of the algorithm. Therefore, it appears from this comparative study that the proposed method has merit in the fields of evolutionary algorithms and optimization.

Figure 3: ANOVA tests of clustering on the data sets over 20 independent runs. One panel per data set (Art, Iris, Cancer, CMC, Seeds, Heart, Wine, Balance scale, and Haberman's Survival); each panel shows the distribution of fitness values obtained by CS, ABC, PSO, GWO, PGWO, and GGSA.

6. Conclusion and Future Works

In order to apply the grey wolf optimizer to solve complex optimization problems efficiently, this paper proposed a novel grey wolf optimizer based on Powell's local optimization method, namely, PGWO. In PGWO, the original GWO algorithm is first applied to shrink the search region to a more promising area. Thereafter, Powell's method is implemented as a critical complement, performing a local search that exploits the limited area intensively to obtain better solutions. PGWO attempts to combine the merits of GWO and Powell's method in order to avoid all grey wolves getting trapped in inferior local optimal regions. PGWO enables the grey wolves to have more diverse exemplars to learn from, as the grey wolves are updated each generation and also form new grey wolves that search in a larger search space. With both techniques combined, PGWO can balance exploration and exploitation and effectively solve complex problems. The experimental results show the effectiveness of Powell's method in terms of solution quality and convergence speed. The proposed algorithm is benchmarked on seven well-known test functions, and the results are compared with GGSA, CS, ABC, PSO, and GWO. The results show that the PGWO algorithm is capable of providing very competitive results compared to these famous metaheuristics. Because of the superior performance of the PGWO algorithm, we use it to solve clustering problems. The algorithm has been tested on an artificial data set and eight real data sets. To justify the performance of the PGWO algorithm on clustering problems, we compare it with the original GWO, GGSA, CS, ABC, PSO, and K-means. The results prove that the PGWO algorithm is able to significantly outperform the others on the majority of the data sets in terms of average value and standard deviation of the fitness function. Moreover, the experimental results demonstrate that the proposed PGWO algorithm can be considered a feasible and efficient method for solving optimization problems.
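As a concrete illustration of this two-stage structure, the following sketch couples the standard GWO position update with a Powell refinement of the best wolf in each generation; it is a simplified reconstruction that uses SciPy's Powell routine, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import minimize

def pgwo_sketch(f, dim, n_wolves=30, max_iter=200, lb=-100.0, ub=100.0, seed=0):
    rng = np.random.default_rng(seed)
    wolves = rng.uniform(lb, ub, size=(n_wolves, dim))
    fitness = np.array([f(w) for w in wolves])
    for t in range(max_iter):
        # Fancy indexing copies the three leaders of the current generation.
        alpha, beta, delta = wolves[np.argsort(fitness)[:3]]
        a = 2.0 - 2.0 * t / max_iter                 # decreases linearly from 2 to 0
        for i in range(n_wolves):
            x = np.zeros(dim)
            for leader in (alpha, beta, delta):      # encircling driven by the leaders
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2.0 * a * r1 - a, 2.0 * r2
                x += leader - A * np.abs(C * leader - wolves[i])
            wolves[i] = np.clip(x / 3.0, lb, ub)     # standard GWO position update
            fitness[i] = f(wolves[i])
        # Powell's derivative-free local search polishes the best wolf;
        # bounds are not enforced on this step in the sketch.
        best = np.argmin(fitness)
        res = minimize(f, wolves[best], method="Powell")
        if res.fun < fitness[best]:
            wolves[best], fitness[best] = res.x, res.fun
    best = np.argmin(fitness)
    return wolves[best], fitness[best]
```

For example, pgwo_sketch(lambda x: float(np.sum(x**2)), dim=30) minimizes the 30-dimensional sphere function, with the Powell step supplying the intensive local exploitation described above.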

Figure 4: The original data distributions of the Art, Iris, and Haberman's Survival data sets and the corresponding clustering results obtained by the PGWO algorithm.

Our future work will focus on two issues. On one hand, we will apply the proposed PGWO algorithm to higher-dimensional problems and to data sets with larger numbers of patterns. On the other hand, the PGWO clustering algorithm will be extended to dynamically determine the optimal number of clusters.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grants nos. 61165015, 61463007, and 61563008 and the Project of Guangxi University for Nationalities Science Foundation under Grant no. 2012MDZD037.

References

[1] J H Holland ldquoGenetic algorithmsrdquo Scientific American vol267 no 1 pp 66ndash72 1992

[2] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquo inProceedings of the IEEE International Conference on Neural Net-works vol 4 pp 1942ndash1948 IEEE Perth Australia November-December 1995

[3] K Price and R Storn ldquoDifferential evolutionrdquo Dr DobbrsquosJournal vol 22 pp 18ndash20 1997

[4] K V Price R M Storn and J A Lampinen Differential Evo-lution A Practical Approach to Global Optimization SpringerNew York NY USA 2005

[5] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999

[6] D HWolpert andWGMacready ldquoNo free lunch theorems foroptimizationrdquo IEEE Transactions on Evolutionary Computationvol 1 no 1 pp 67ndash82 1997

[7] S Mirjalili S M Mirjalili and A Lewis ldquoGrey wolf optimizerrdquoAdvances in Engineering Software vol 69 pp 46ndash61 2014

[8] S Mirjalili S M Mirjalili and A Hatamlou ldquoMulti-verseoptimizer a nature-inspired algorithm for global optimizationrdquoNeural Computing and Applications 2015

[9] S Mirjalili ldquoThe ant lion optimizerrdquo Advances in EngineeringSoftware vol 83 pp 80ndash98 2015

[10] D Karaboga and B Basturk ldquoA powerful and efficient algo-rithm for numerical function optimization artificial bee colony(ABC) algorithmrdquo Journal of Global Optimization vol 39 no 3pp 459ndash471 2007

[11] X-S Yang ldquoFirefly algorithm levy flights and global optimiza-tionrdquo in Research and Development in Intelligent Systems XXVIpp 209ndash218 Springer London UK 2010

[12] X-S Yang and S Deb ldquoCuckoo Search via Levy flightsrdquo in Pro-ceedings of the World Congress on Nature amp Biologically InspiredComputing (NaBIC rsquo09) pp 210ndash214 IEEE Coimbatore IndiaDecember 2009

[13] R Rajabioun ldquoCuckoo optimization algorithmrdquo Applied SoftComputing vol 11 no 8 pp 5508ndash5518 2011

[14] E Rashedi H Nezamabadi-Pour and S Saryazdi ldquoGSA agravitational search algorithmrdquo Information Sciences vol 179no 13 pp 2232ndash2248 2009

[15] S Mirjalili and A Lewis ldquoAdaptive gbest-guided gravitationalsearch algorithmrdquo Neural Computing and Applications vol 25no 7 pp 1569ndash1584 2014

[16] M H Sulaiman Z Mustaffa M R Mohamed and O AlimanldquoUsing the gray wolf optimizer for solving optimal reactivepower dispatch problemrdquo Applied Soft Computing vol 32 pp286ndash292 2015

[17] X H Song L Tang S T Zhao et al ldquoGrey Wolf Optimizerfor parameter estimation in surface wavesrdquo Soil Dynamics andEarthquake Engineering vol 75 pp 147ndash157 2015

[18] G M Komaki and V Kayvanfar ldquoGrey Wolf Optimizer algo-rithm for the two-stage assembly flow shop scheduling problemwith release timerdquo Journal of Computational Science vol 8 pp109ndash120 2015

[19] S Saremi S Z Mirjalili and S MMirjalili ldquoEvolutionary pop-ulation dynamics and grey wolf optimizerrdquo Neural Computingand Applications vol 26 no 5 pp 1257ndash1263 2015

[20] B Mahdad and K Srairi ldquoBlackout risk prevention in a smartgrid based flexible optimal strategy using Grey Wolf-patternsearch algorithmsrdquo Energy Conversion and Management vol98 pp 411ndash429 2015

[21] O Kramer ldquoIterated local search with Powellrsquos methoda memetic algorithm for continuous global optimizationrdquoMemetic Computing vol 2 no 1 pp 69ndash83 2010

[22] M J D Powell ldquoRestart procedures for the conjugate gradientmethodrdquoMathematical Programming vol 12 no 1 pp 241ndash2541977

[23] I E Evangelou D G Hadjimitsis A A Lazakidou and CClayton ldquoData mining and knowledge discovery in compleximage data using artificial neural networksrdquo in Proceedings ofthe Workshop on Complex Reasoning an Geographical DatalPaphos Cyprus 2001

[24] M S Kamel and S Z Selim ldquoNew algorithms for solving thefuzzy clustering problemrdquo Pattern Recognition vol 27 no 3 pp421ndash428 1994

[25] M Omran A Salman and A P Engelbrecht ldquoImage clas-sification using particle swarm optimizationrdquo in Proceedingsof the 4th Asia-Pacific Conference on Simulated Evolution andLearning Singapore 2002

[26] X J Lei Swarm Intelligent Optimization Algorithms and TheirApplications Science Press 2012

[27] A K Jain ldquoData clustering 50 years beyond K-meansrdquo PatternRecognition Letters vol 31 no 8 pp 651ndash666 2010

[28] W Zou Y Zhu H Chen and X Sui ldquoA clustering approachusing cooperative artificial bee colony algorithmrdquo DiscreteDynamics in Nature and Society vol 2010 Article ID 45979616 pages 2010

[29] B Zhang M Hsu and U Dayal ldquoK-Harmonic meansmdasha dataclustering algorithmrdquo Hewlett-Packard Labs Technical ReportHPL 1999

[30] S Z Selim and K Alsultan ldquoA simulated annealing algorithmfor the clustering problemrdquo Pattern Recognition vol 24 no 10pp 1003ndash1008 1991

[31] K S Al-Sultan ldquoA Tabu search approach to the clusteringproblemrdquo Pattern Recognition vol 28 no 9 pp 1443ndash1451 1995

Discrete Dynamics in Nature and Society 17

[32] C S Sung and H W Jin ldquoA tabu-search-based heuristic forclusteringrdquo Pattern Recognition vol 33 no 5 pp 849ndash8582000

[33] M C Cowgill R J Harvey and L T Watson ldquoA genetic algo-rithmapproach to cluster analysisrdquoComputers andMathematicswith Applications vol 37 no 7 pp 99ndash108 1999

[34] K Krishna and M N Murty ldquoGenetic K-means algorithmrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 29 no 3 pp 433ndash439 1999

[35] U Maulik and S Bandyopadhyay ldquoGenetic algorithm-basedclustering techniquerdquo Pattern Recognition vol 33 no 9 pp1455ndash1465 2000

[36] C A Murthy and N Chowdhury ldquoIn search of optimal clustersusing genetic algorithmsrdquoPatternRecognition Letters vol 17 no8 pp 825ndash832 1996

[37] M Fathian B Amiri and A Maroosi ldquoApplication of honey-bee mating optimization algorithm on clusteringrdquo AppliedMathematics and Computation vol 190 no 2 pp 1502ndash15132007

[38] D W Van der Merwe and A P Engelbrecht ldquoData cluster-ing using particle swarm optimizationrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo03) vol 1 pp 215ndash220 IEEE Canberra Australia December 2003

[39] A Hatamlou S Abdullah and M Hatamlou ldquoData clusteringusing big bang-big crunch algorithmrdquo in Innovative ComputingTechnology vol 241 of Communications in Computer andInformation Science pp 383ndash388 Springer 2011

[40] D Karaboga and C Ozturk ldquoA novel clustering approachartificial Bee Colony (ABC) algorithmrdquoApplied Soft ComputingJournal vol 11 no 1 pp 652ndash657 2011

[41] A Hatamlou S Abdullah and H Nezamabadi-Pour ldquoAppli-cation of gravitational search algorithm on data clusteringrdquo inRough Sets andKnowledge Technology vol 6954 of LectureNotesin Computer Science pp 337ndash346 Springer Berlin Germany2011

[42] A Hatamlou S Abdullah and H Nezamabadi-Pour ldquoA com-bined approach for clustering based on K-means and gravita-tional search algorithmsrdquo Swarm and Evolutionary Computa-tion vol 6 pp 47ndash52 2012

[43] P S Shelokar V K Jayaraman and B D Kulkarni ldquoAn antcolony approach for clusteringrdquo Analytica Chimica Acta vol509 no 2 pp 187ndash195 2004

[44] Y Kao and K Cheng ldquoAn ACO-based clustering algorithmrdquoin Ant Colony Optimization and Swarm Intelligence vol 4150of Lecture Notes in Computer Science pp 340ndash347 SpringerBerlin Germany 2006

[45] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[46] T Niknam B Amiri J Olamaei and A Arefi ldquoAn efficienthybrid evolutionary optimization algorithm based on PSO andSA for clusteringrdquo Journal of Zhejiang University Science A vol10 no 4 pp 512ndash519 2009

[47] W-F Gao S-Y Liu and L-L Huang ldquoA novel artificial beecolony algorithm with Powellrsquos methodrdquo Applied Soft Comput-ing vol 13 no 9 pp 3763ndash3775 2013

[48] H R Lourenco O Martin and T Stutzle ldquoA beginnerrsquosintroduction to iterated local searchrdquo in Proceedings of the 4thMetaheuristics International Conference (MIC rsquo01) vol 2 pp 1ndash6 Porto Portugal 2001

[49] T G Stutzle Local Search Algorithms for Combinatorial Prob-lems Analysis Improvements and New Applications vol 220 ofDISKI Dissertations on Artificial Intelligence Infix PublishersSankt Augustin Germany 1999

[50] X S Yang Ed Test Problems in Optimization An Introductionwith Metaheuristic Applications Wiley London UK 2010

[51] M Molga and C Smutnicki ldquoTest functions for optimization-needsrdquo 2005 httpwwwzsdictpwrwrocplfilesdocsfunc-tionspdf

[52] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm intelli-gence algorithmsrdquo Swarm and Evolutionary Computation vol1 no 1 pp 3ndash18 2011

[53] S Garcıa D Molina M Lozano and F Herrera ldquoA study onthe use of non-parametric tests for analyzing the evolutionaryalgorithmsrsquo behaviour a case study on the CECrsquo2005 specialsession on real parameter optimizationrdquo Journal of Heuristicsvol 15 no 6 pp 617ndash644 2009

[54] F Wilcoxon ldquoIndividual comparisons by ranking methodsrdquoBiometrics Bulletin vol 1 no 6 pp 80ndash83 1945

[55] J W Han M Kamber and J Pei Data Mining Concepts andTechniques Morgan Kaufmann Publishers 2011

[56] C Blake and C J Merz ldquoUCI Repository of Machine LearningDatabasesrdquo 1998

[57] T Niknam and B Amiri ldquoAn efficient hybrid approach basedon PSO ACO and k-means for cluster analysisrdquo Applied SoftComputing Journal vol 10 no 1 pp 183ndash197 2010

[58] E Anderson ldquoThe irises of the Gaspe peninsulardquo Bulletin of theAmerican Iris Society vol 59 pp 2ndash5 1935

[59] R A Fisher ldquoThe use of multiple measurements in taxonomicproblemsrdquo Annals of Eugenics vol 7 no 2 pp 179ndash188 1936

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 11: Research Article Grey Wolf Optimizer Based on Powell Local ...downloads.hindawi.com/journals/ddns/2015/481360.pdf · Research Article Grey Wolf Optimizer Based on Powell Local Optimization

Discrete Dynamics in Nature and Society 11

Art Iris

Cancer CMC

Seeds Hearttimes104

500

550

600

650

700

750

800

850

900

950

1000

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

80

90

100

110

120

130

140

150

160

170

180

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

2500

3000

3500

4000

4500

5000

5500

6000

6500

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

5500

6000

6500

7000

7500

8000

8500Av

erag

e bes

t so

far

50 100 150 2000Iteration

105

11

115

12

125

13

135

14

145

15

155

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

300

350

400

450

500

550

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

GGSACSABC

PSOGWOPGWO

GGSACSABC

PSOGWOPGWO

GGSACSABC

PSOGWOPGWO

GGSACSABC

PSOGWOPGWO

GGSACSABC

PSOGWOPGWO

GGSACSABC

PSOGWOPGWO

Figure 2 Continued

12 Discrete Dynamics in Nature and Society

Wine Balance scale

50 100 150 2000Iteration

16

165

17

175

18

185

19

195

2

Aver

age b

est s

o fa

r

GGSACSABC

PSOGWOPGWO

50 100 150 2000Iteration

GGSACSABC

PSOGWOPGWO

1420

1430

1440

1450

1460

1470

1480

Aver

age b

est s

o fa

r

2400

2500

2600

2700

2800

2900

3000

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

GGSACSABC

PSOGWOPGWO

Habermanrsquos Survival

times104

Figure 2 Convergence curves of clustering on data sets over 20 independent runs

CMC data set Seeds data set Heart data set and Winedata set From Table 6 we conclude that for Iris data setPGWO is performing superior to the other algorithms exceptfor GGSA Since the 119901 value for balance scale data set ofPGWO versus GWO is more than 005 there is no statisticaldifference between them both For Haberman Survival dataset while comparing PGWO and the other algorithms wecan conclude that PGWO is significantly performing betterin three out of six groups compared to algorithms So it canbe claimed that PGWO provides better results than the otheralgorithms across the majority of the data sets

Figure 2 shows convergence curves of clustering on datasets over 20 independent runs As can be seen from the figurethe convergence speed of PGWO is the fastest

Figure 3 indicates ANOVA tests of clustering on data setsover 20 independent runs Seen from Figure 3 PGWO isvery stable for the majority of the data sets For Art Seedsand CMC data sets PGWO PSO and GGSA can obtain therelatively stable optimal values For Heart and Wine data

sets the stability of PGWO PSO and CS is outstanding ForCancer data set most of the algorithms can obtain the stableoptimal value except for ABC and PSO algorithms For Irisdata set we can clearly see that PGWO is better in termsof the stability For balance scale data set GWO obtains therelatively stable optimal value For Habermanrsquos Survival dataset the stability of CS and PSO is the best but PGWO andGGSA follow them closely

Clustering results of Art Iris and Survival data sets byPGWOalgorithmare presented in Figure 4which canmake itvisualized clearly It can be seen fromFigure 4 that the PGWOalgorithm possesses superior effect on Art Iris and Survivaldata sets

In summary the results show that the proposed methodsuccessfully outperforms the other algorithms across themajority of benchmark functions Furthermore the testresults of clustering problems show that PGWO is able toprovide very competitive results including the property ofthis algorithm Therefore it appears from this comparative

Discrete Dynamics in Nature and Society 13

Art Iris

CMC Cancer

Seeds Hearttimes104

500

550

600

650

700

750

800

850

900Fi

tnes

s val

ue

CS ABC PSO GWO PGWOGGSAAlgorithms

85

90

95

100

105

110

115

120

125

Fitn

ess v

alue

CS ABC PSO GWO PGWOGGSAAlgorithms

3000

3500

4000

4500

Fitn

ess v

alue

CS ABC PSO GWO PGWOGGSAAlgorithms

5800

6000

6200

6400

6600

6800

7000

7200Fi

tnes

s val

ue

CS ABC PSO GWO PGWOGGSAAlgorithms

106

108

11

112

114

116

118

12

122

124

126

Fitn

ess v

alue

CS ABC PSO GWO PGWOGGSAAlgorithms

CS ABC PSO GWO PGWOGGSAAlgorithms

320

340

360

380

400

420

Fitn

ess v

alue

Figure 3 Continued

14 Discrete Dynamics in Nature and Society

Habermanrsquos Survival

Wine Balance scaletimes104

1423

1424

1425

1426

1427

1428

1429

1430

1431

Fitn

ess v

alue

165

17

175

18

185

19

195

2

Fitn

ess v

alue

CS ABC PSO GWO PGWOGGSAAlgorithms

2570

2580

2590

2600

2610

2620

2630

2640

2650

2660

2670

Fitn

ess v

alue

CS ABC PSO GWO PGWOGGSAAlgorithms

CS ABC PSO GWO PGWOGGSAAlgorithms

Figure 3 ANOVA tests of clustering on data sets over 20 independent runs

study that the proposed method has merit in the field ofevolutionary algorithm and optimization

6 Conclusion and Future Works

In order to apply the grey wolf optimizer to solve complexoptimization problems efficiently this paper proposed a novelgrey wolf optimizer based on Powell local optimizationmethod namely PGWO In PGWO at first original GWOalgorithm is applied to shrink the search region to a morepromising area Thereafter Powellrsquos method is implementedas a critical complement to perform the local search toexploit the limited area intensively to get better solutionsThe PGWO makes an attempt at taking merits of the GWOand Powellrsquos method in order to avoid all grey wolves gettingtrapped in inferior local optimal regionsThe PGWO enablesthe grey wolves to have more diverse exemplars to learnfrom as the grey wolves are updated each generation and

also form new grey wolves to search in a larger searchspace With both techniques combined PGWO can balanceexploration and exploitation and effectively solve complexproblems The experimental results show the effectivenessof Powellrsquos method in terms of solution quality and con-vergence speed The proposed algorithm is benchmarkedon seven well-known test functions and the results arecomparative study with GGSA CS ABC PSO and GWOThe results show that the PGWO algorithm is capable ofproviding very competitive results compared to these famousmetaheuristics Because of the superior performance of thePGWO algorithm we use it to solve clustering problemsThe algorithm has been tested on an artificial data set andeight real data sets To justify the performance of the PGWOalgorithm on clustering problems we compare it with theoriginal GWO GGSA CS ABC PSO and 119870-means Theresults prove that the PGWOalgorithm is able to significantlyoutperform others on the majority of the data sets in terms

Discrete Dynamics in Nature and Society 15

minus6 minus4 minus2 0 2 4 6 8 10

Artificial data set distribution

minus6

minus4

minus2

0

2

4

6

8

10

minus6 minus4 minus2 0 2 4 6 8 10

Artificial data set clustering

minus6

minus4

minus2

0

2

4

6

8

10

4 45 5 55 6 65 7 75 8

Iris data distribution

1

2

3

4

5

6

7

4 45 5 55 6 65 7 75 8

Iris data clustering

1

2

3

4

5

6

7

30 40 50 60 70 80 90

Habermanrsquos Survival data distribution

58

60

62

64

66

68

70

30 40 50 60 70 80 90

Habermanrsquos Survival data clustering

58

60

62

64

66

68

70

Figure 4 The original data distribution of Art Iris and Survival data sets and the clustering results by PGWO algorithm

16 Discrete Dynamics in Nature and Society

of average value and standard deviations of fitness functionMoreover the experimental results demonstrate that theproposed PGWO algorithm can be considered as a feasibleand efficient method to solve optimization problems

Our future work will focus on the two issues On onehand we would apply our proposed PGWO algorithm to testhigher dimensional problems and large number of patternsOn the other hand the PGWO clustering algorithm will alsobe extended to dynamically determine the optimal numberof clusters

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

Thiswork is supported by theNational Natural Science Foun-dation of China under Grants nos 61165015 61463007 and61563008 and the Project of Guangxi University for Nation-alities Science Foundation under Grant no 2012MDZD037

References

[1] J H Holland ldquoGenetic algorithmsrdquo Scientific American vol267 no 1 pp 66ndash72 1992

[2] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquo inProceedings of the IEEE International Conference on Neural Net-works vol 4 pp 1942ndash1948 IEEE Perth Australia November-December 1995

[3] K Price and R Storn ldquoDifferential evolutionrdquo Dr DobbrsquosJournal vol 22 pp 18ndash20 1997

[4] K V Price R M Storn and J A Lampinen Differential Evo-lution A Practical Approach to Global Optimization SpringerNew York NY USA 2005

[5] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999

[6] D HWolpert andWGMacready ldquoNo free lunch theorems foroptimizationrdquo IEEE Transactions on Evolutionary Computationvol 1 no 1 pp 67ndash82 1997

[7] S Mirjalili S M Mirjalili and A Lewis ldquoGrey wolf optimizerrdquoAdvances in Engineering Software vol 69 pp 46ndash61 2014

[8] S Mirjalili S M Mirjalili and A Hatamlou ldquoMulti-verseoptimizer a nature-inspired algorithm for global optimizationrdquoNeural Computing and Applications 2015

[9] S Mirjalili ldquoThe ant lion optimizerrdquo Advances in EngineeringSoftware vol 83 pp 80ndash98 2015

[10] D Karaboga and B Basturk ldquoA powerful and efficient algo-rithm for numerical function optimization artificial bee colony(ABC) algorithmrdquo Journal of Global Optimization vol 39 no 3pp 459ndash471 2007

[11] X-S Yang ldquoFirefly algorithm levy flights and global optimiza-tionrdquo in Research and Development in Intelligent Systems XXVIpp 209ndash218 Springer London UK 2010

[12] X-S Yang and S Deb ldquoCuckoo Search via Levy flightsrdquo in Pro-ceedings of the World Congress on Nature amp Biologically InspiredComputing (NaBIC rsquo09) pp 210ndash214 IEEE Coimbatore IndiaDecember 2009

[13] R Rajabioun ldquoCuckoo optimization algorithmrdquo Applied SoftComputing vol 11 no 8 pp 5508ndash5518 2011

[14] E Rashedi H Nezamabadi-Pour and S Saryazdi ldquoGSA agravitational search algorithmrdquo Information Sciences vol 179no 13 pp 2232ndash2248 2009

[15] S Mirjalili and A Lewis ldquoAdaptive gbest-guided gravitationalsearch algorithmrdquo Neural Computing and Applications vol 25no 7 pp 1569ndash1584 2014

[16] M H Sulaiman Z Mustaffa M R Mohamed and O AlimanldquoUsing the gray wolf optimizer for solving optimal reactivepower dispatch problemrdquo Applied Soft Computing vol 32 pp286ndash292 2015

[17] X H Song L Tang S T Zhao et al ldquoGrey Wolf Optimizerfor parameter estimation in surface wavesrdquo Soil Dynamics andEarthquake Engineering vol 75 pp 147ndash157 2015

[18] G M Komaki and V Kayvanfar ldquoGrey Wolf Optimizer algo-rithm for the two-stage assembly flow shop scheduling problemwith release timerdquo Journal of Computational Science vol 8 pp109ndash120 2015

[19] S Saremi S Z Mirjalili and S MMirjalili ldquoEvolutionary pop-ulation dynamics and grey wolf optimizerrdquo Neural Computingand Applications vol 26 no 5 pp 1257ndash1263 2015

[20] B Mahdad and K Srairi ldquoBlackout risk prevention in a smartgrid based flexible optimal strategy using Grey Wolf-patternsearch algorithmsrdquo Energy Conversion and Management vol98 pp 411ndash429 2015

[21] O Kramer ldquoIterated local search with Powellrsquos methoda memetic algorithm for continuous global optimizationrdquoMemetic Computing vol 2 no 1 pp 69ndash83 2010

[22] M J D Powell ldquoRestart procedures for the conjugate gradientmethodrdquoMathematical Programming vol 12 no 1 pp 241ndash2541977

[23] I E Evangelou D G Hadjimitsis A A Lazakidou and CClayton ldquoData mining and knowledge discovery in compleximage data using artificial neural networksrdquo in Proceedings ofthe Workshop on Complex Reasoning an Geographical DatalPaphos Cyprus 2001

[24] M S Kamel and S Z Selim ldquoNew algorithms for solving thefuzzy clustering problemrdquo Pattern Recognition vol 27 no 3 pp421ndash428 1994

[25] M Omran A Salman and A P Engelbrecht ldquoImage clas-sification using particle swarm optimizationrdquo in Proceedingsof the 4th Asia-Pacific Conference on Simulated Evolution andLearning Singapore 2002

[26] X J Lei Swarm Intelligent Optimization Algorithms and TheirApplications Science Press 2012

[27] A K Jain ldquoData clustering 50 years beyond K-meansrdquo PatternRecognition Letters vol 31 no 8 pp 651ndash666 2010

[28] W Zou Y Zhu H Chen and X Sui ldquoA clustering approachusing cooperative artificial bee colony algorithmrdquo DiscreteDynamics in Nature and Society vol 2010 Article ID 45979616 pages 2010

[29] B Zhang M Hsu and U Dayal ldquoK-Harmonic meansmdasha dataclustering algorithmrdquo Hewlett-Packard Labs Technical ReportHPL 1999

[30] S Z Selim and K Alsultan ldquoA simulated annealing algorithmfor the clustering problemrdquo Pattern Recognition vol 24 no 10pp 1003ndash1008 1991

[31] K S Al-Sultan ldquoA Tabu search approach to the clusteringproblemrdquo Pattern Recognition vol 28 no 9 pp 1443ndash1451 1995

Discrete Dynamics in Nature and Society 17

[32] C S Sung and H W Jin ldquoA tabu-search-based heuristic forclusteringrdquo Pattern Recognition vol 33 no 5 pp 849ndash8582000

[33] M C Cowgill R J Harvey and L T Watson ldquoA genetic algo-rithmapproach to cluster analysisrdquoComputers andMathematicswith Applications vol 37 no 7 pp 99ndash108 1999

[34] K Krishna and M N Murty ldquoGenetic K-means algorithmrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 29 no 3 pp 433ndash439 1999

[35] U Maulik and S Bandyopadhyay ldquoGenetic algorithm-basedclustering techniquerdquo Pattern Recognition vol 33 no 9 pp1455ndash1465 2000

[36] C A Murthy and N Chowdhury ldquoIn search of optimal clustersusing genetic algorithmsrdquoPatternRecognition Letters vol 17 no8 pp 825ndash832 1996

[37] M Fathian B Amiri and A Maroosi ldquoApplication of honey-bee mating optimization algorithm on clusteringrdquo AppliedMathematics and Computation vol 190 no 2 pp 1502ndash15132007

[38] D W Van der Merwe and A P Engelbrecht ldquoData cluster-ing using particle swarm optimizationrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo03) vol 1 pp 215ndash220 IEEE Canberra Australia December 2003

[39] A Hatamlou S Abdullah and M Hatamlou ldquoData clusteringusing big bang-big crunch algorithmrdquo in Innovative ComputingTechnology vol 241 of Communications in Computer andInformation Science pp 383ndash388 Springer 2011

[40] D Karaboga and C Ozturk ldquoA novel clustering approachartificial Bee Colony (ABC) algorithmrdquoApplied Soft ComputingJournal vol 11 no 1 pp 652ndash657 2011

[41] A Hatamlou S Abdullah and H Nezamabadi-Pour ldquoAppli-cation of gravitational search algorithm on data clusteringrdquo inRough Sets andKnowledge Technology vol 6954 of LectureNotesin Computer Science pp 337ndash346 Springer Berlin Germany2011

[42] A Hatamlou S Abdullah and H Nezamabadi-Pour ldquoA com-bined approach for clustering based on K-means and gravita-tional search algorithmsrdquo Swarm and Evolutionary Computa-tion vol 6 pp 47ndash52 2012

[43] P S Shelokar V K Jayaraman and B D Kulkarni ldquoAn antcolony approach for clusteringrdquo Analytica Chimica Acta vol509 no 2 pp 187ndash195 2004

[44] Y Kao and K Cheng ldquoAn ACO-based clustering algorithmrdquoin Ant Colony Optimization and Swarm Intelligence vol 4150of Lecture Notes in Computer Science pp 340ndash347 SpringerBerlin Germany 2006

[45] Y-T Kao E Zahara and I-W Kao ldquoA hybridized approach todata clusteringrdquo Expert Systems with Applications vol 34 no 3pp 1754ndash1762 2008

[46] T Niknam B Amiri J Olamaei and A Arefi ldquoAn efficienthybrid evolutionary optimization algorithm based on PSO andSA for clusteringrdquo Journal of Zhejiang University Science A vol10 no 4 pp 512ndash519 2009

[47] W-F Gao S-Y Liu and L-L Huang ldquoA novel artificial beecolony algorithm with Powellrsquos methodrdquo Applied Soft Comput-ing vol 13 no 9 pp 3763ndash3775 2013

[48] H R Lourenco O Martin and T Stutzle ldquoA beginnerrsquosintroduction to iterated local searchrdquo in Proceedings of the 4thMetaheuristics International Conference (MIC rsquo01) vol 2 pp 1ndash6 Porto Portugal 2001

[49] T G Stutzle Local Search Algorithms for Combinatorial Prob-lems Analysis Improvements and New Applications vol 220 ofDISKI Dissertations on Artificial Intelligence Infix PublishersSankt Augustin Germany 1999

[50] X S Yang Ed Test Problems in Optimization An Introductionwith Metaheuristic Applications Wiley London UK 2010

[51] M Molga and C Smutnicki ldquoTest functions for optimization-needsrdquo 2005 httpwwwzsdictpwrwrocplfilesdocsfunc-tionspdf

[52] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm intelli-gence algorithmsrdquo Swarm and Evolutionary Computation vol1 no 1 pp 3ndash18 2011

[53] S Garcıa D Molina M Lozano and F Herrera ldquoA study onthe use of non-parametric tests for analyzing the evolutionaryalgorithmsrsquo behaviour a case study on the CECrsquo2005 specialsession on real parameter optimizationrdquo Journal of Heuristicsvol 15 no 6 pp 617ndash644 2009

[54] F Wilcoxon ldquoIndividual comparisons by ranking methodsrdquoBiometrics Bulletin vol 1 no 6 pp 80ndash83 1945

[55] J W Han M Kamber and J Pei Data Mining Concepts andTechniques Morgan Kaufmann Publishers 2011

[56] C Blake and C J Merz ldquoUCI Repository of Machine LearningDatabasesrdquo 1998

[57] T Niknam and B Amiri ldquoAn efficient hybrid approach basedon PSO ACO and k-means for cluster analysisrdquo Applied SoftComputing Journal vol 10 no 1 pp 183ndash197 2010

[58] E Anderson ldquoThe irises of the Gaspe peninsulardquo Bulletin of theAmerican Iris Society vol 59 pp 2ndash5 1935

[59] R A Fisher ldquoThe use of multiple measurements in taxonomicproblemsrdquo Annals of Eugenics vol 7 no 2 pp 179ndash188 1936

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 12: Research Article Grey Wolf Optimizer Based on Powell Local ...downloads.hindawi.com/journals/ddns/2015/481360.pdf · Research Article Grey Wolf Optimizer Based on Powell Local Optimization

12 Discrete Dynamics in Nature and Society

Wine Balance scale

50 100 150 2000Iteration

16

165

17

175

18

185

19

195

2

Aver

age b

est s

o fa

r

GGSACSABC

PSOGWOPGWO

50 100 150 2000Iteration

GGSACSABC

PSOGWOPGWO

1420

1430

1440

1450

1460

1470

1480

Aver

age b

est s

o fa

r

2400

2500

2600

2700

2800

2900

3000

Aver

age b

est s

o fa

r

50 100 150 2000Iteration

GGSACSABC

PSOGWOPGWO

Habermanrsquos Survival

times104

Figure 2 Convergence curves of clustering on data sets over 20 independent runs

CMC data set Seeds data set Heart data set and Winedata set From Table 6 we conclude that for Iris data setPGWO is performing superior to the other algorithms exceptfor GGSA Since the 119901 value for balance scale data set ofPGWO versus GWO is more than 005 there is no statisticaldifference between them both For Haberman Survival dataset while comparing PGWO and the other algorithms wecan conclude that PGWO is significantly performing betterin three out of six groups compared to algorithms So it canbe claimed that PGWO provides better results than the otheralgorithms across the majority of the data sets

Figure 2 shows convergence curves of clustering on datasets over 20 independent runs As can be seen from the figurethe convergence speed of PGWO is the fastest

Figure 3 indicates ANOVA tests of clustering on data setsover 20 independent runs Seen from Figure 3 PGWO isvery stable for the majority of the data sets For Art Seedsand CMC data sets PGWO PSO and GGSA can obtain therelatively stable optimal values For Heart and Wine data

sets the stability of PGWO PSO and CS is outstanding ForCancer data set most of the algorithms can obtain the stableoptimal value except for ABC and PSO algorithms For Irisdata set we can clearly see that PGWO is better in termsof the stability For balance scale data set GWO obtains therelatively stable optimal value For Habermanrsquos Survival dataset the stability of CS and PSO is the best but PGWO andGGSA follow them closely

Clustering results of Art Iris and Survival data sets byPGWOalgorithmare presented in Figure 4which canmake itvisualized clearly It can be seen fromFigure 4 that the PGWOalgorithm possesses superior effect on Art Iris and Survivaldata sets

In summary the results show that the proposed methodsuccessfully outperforms the other algorithms across themajority of benchmark functions Furthermore the testresults of clustering problems show that PGWO is able toprovide very competitive results including the property ofthis algorithm Therefore it appears from this comparative

[Figure 3: ANOVA tests of clustering on the data sets over 20 independent runs; box plots of the fitness values obtained by CS, ABC, PSO, GWO, GGSA, and PGWO on each data set.]

6. Conclusion and Future Works

In order to apply the grey wolf optimizer to complex optimization problems efficiently, this paper proposed a novel grey wolf optimizer based on Powell's local optimization method, named PGWO. In PGWO, the original GWO algorithm is first applied to shrink the search region to a more promising area. Thereafter, Powell's method is implemented as a critical complement, performing a local search that exploits the limited area intensively to obtain better solutions. PGWO attempts to combine the merits of GWO and Powell's method in order to avoid all grey wolves becoming trapped in inferior local optimal regions. PGWO gives the grey wolves more diverse exemplars to learn from, as the grey wolves are updated each generation and new grey wolves are formed to search a larger search space. With both techniques combined, PGWO can balance exploration and exploitation and effectively solve complex problems. The experimental results show the effectiveness of Powell's method in terms of solution quality and convergence speed. The proposed algorithm was benchmarked on seven well-known test functions, and the results were compared with those of GGSA, CS, ABC, PSO, and GWO. The results show that the PGWO algorithm is capable of providing very competitive results compared to these well-known metaheuristics. Because of the superior performance of the PGWO algorithm, we used it to solve clustering problems. The algorithm was tested on an artificial data set and eight real data sets. To assess the performance of the PGWO algorithm on clustering problems, we compared it with the original GWO, GGSA, CS, ABC, PSO, and K-means. The results prove that the PGWO algorithm significantly outperforms the others on the majority of the data sets in terms of the average values and standard deviations of the fitness function. Moreover, the experimental results demonstrate that the proposed PGWO algorithm can be considered a feasible and efficient method for solving optimization problems.
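The loop structure described above can be condensed into a short sketch: a standard GWO position update followed by a Powell refinement of the best wolf in each generation. The code below is a minimal illustration of that hybrid under our own simplifying assumptions (a generic objective, SciPy's Powell minimizer as the local-search step, and re-injection of the refined leader into the pack); it is not the reference implementation used in the experiments.

```python
import numpy as np
from scipy.optimize import minimize

def pgwo(objective, dim, n_wolves=30, max_iter=200, lb=-10.0, ub=10.0):
    """Minimal PGWO sketch: GWO global search plus Powell local refinement."""
    rng = np.random.default_rng()
    wolves = rng.uniform(lb, ub, (n_wolves, dim))

    def leaders():
        # Rank the pack by fitness; the three best are alpha, beta, delta.
        order = np.argsort([objective(w) for w in wolves])
        return (wolves[order[0]].copy(), wolves[order[1]].copy(),
                wolves[order[2]].copy())

    alpha, beta, delta = leaders()
    for t in range(max_iter):
        a = 2.0 - 2.0 * t / max_iter          # decreases linearly from 2 to 0
        for i in range(n_wolves):
            x = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                D = np.abs(C * leader - wolves[i])
                x += (leader - A * D) / 3.0   # average of the three guided moves
            wolves[i] = np.clip(x, lb, ub)

        alpha, beta, delta = leaders()
        # Powell's method intensively exploits the neighbourhood of the best wolf.
        res = minimize(objective, alpha, method="Powell")
        if res.fun < objective(alpha):
            alpha = np.clip(res.x, lb, ub)
            wolves[0] = alpha                 # re-inject the refined solution
    return alpha, objective(alpha)

# Example: minimize the sphere function in five dimensions.
best_x, best_f = pgwo(lambda x: float(np.sum(x**2)), dim=5)
print(best_f)
```

For the clustering experiments, the objective passed to such a routine would encode the cluster centroids as one flat vector and return the sum of distances from each data point to its nearest centroid, so that lower fitness means tighter clusters.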

[Figure 4: The original data distributions of the Art, Iris, and Haberman's Survival data sets and the clustering results obtained by the PGWO algorithm.]

Our future work will focus on two issues. On the one hand, we will apply the proposed PGWO algorithm to higher-dimensional problems and data sets with large numbers of patterns. On the other hand, the PGWO clustering algorithm will be extended to determine the optimal number of clusters dynamically.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grants nos. 61165015, 61463007, and 61563008 and by the Project of Guangxi University for Nationalities Science Foundation under Grant no. 2012MDZD037.

References

[1] J. H. Holland, "Genetic algorithms," Scientific American, vol. 267, no. 1, pp. 66–72, 1992.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, IEEE, Perth, Australia, November-December 1995.

[3] K. Price and R. Storn, "Differential evolution," Dr. Dobb's Journal, vol. 22, pp. 18–20, 1997.

[4] K. V. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, New York, NY, USA, 2005.

[5] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[6] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.

[7] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[8] S. Mirjalili, S. M. Mirjalili, and A. Hatamlou, "Multi-verse optimizer: a nature-inspired algorithm for global optimization," Neural Computing and Applications, 2015.

[9] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[10] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459–471, 2007.

[11] X.-S. Yang, "Firefly algorithm, Lévy flights and global optimization," in Research and Development in Intelligent Systems XXVI, pp. 209–218, Springer, London, UK, 2010.

[12] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210–214, IEEE, Coimbatore, India, December 2009.

[13] R. Rajabioun, "Cuckoo optimization algorithm," Applied Soft Computing, vol. 11, no. 8, pp. 5508–5518, 2011.

[14] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, "GSA: a gravitational search algorithm," Information Sciences, vol. 179, no. 13, pp. 2232–2248, 2009.

[15] S. Mirjalili and A. Lewis, "Adaptive gbest-guided gravitational search algorithm," Neural Computing and Applications, vol. 25, no. 7, pp. 1569–1584, 2014.

[16] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.

[17] X. H. Song, L. Tang, S. T. Zhao et al., "Grey Wolf Optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.

[18] G. M. Komaki and V. Kayvanfar, "Grey Wolf Optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[19] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[20] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using Grey Wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.

[21] O. Kramer, "Iterated local search with Powell's method: a memetic algorithm for continuous global optimization," Memetic Computing, vol. 2, no. 1, pp. 69–83, 2010.

[22] M. J. D. Powell, "Restart procedures for the conjugate gradient method," Mathematical Programming, vol. 12, no. 1, pp. 241–254, 1977.

[23] I. E. Evangelou, D. G. Hadjimitsis, A. A. Lazakidou, and C. Clayton, "Data mining and knowledge discovery in complex image data using artificial neural networks," in Proceedings of the Workshop on Complex Reasoning on Geographical Data, Paphos, Cyprus, 2001.

[24] M. S. Kamel and S. Z. Selim, "New algorithms for solving the fuzzy clustering problem," Pattern Recognition, vol. 27, no. 3, pp. 421–428, 1994.

[25] M. Omran, A. Salman, and A. P. Engelbrecht, "Image classification using particle swarm optimization," in Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning, Singapore, 2002.

[26] X. J. Lei, Swarm Intelligent Optimization Algorithms and Their Applications, Science Press, 2012.

[27] A. K. Jain, "Data clustering: 50 years beyond K-means," Pattern Recognition Letters, vol. 31, no. 8, pp. 651–666, 2010.

[28] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.

[29] B. Zhang, M. Hsu, and U. Dayal, "K-Harmonic means: a data clustering algorithm," Hewlett-Packard Labs Technical Report HPL, 1999.

[30] S. Z. Selim and K. Alsultan, "A simulated annealing algorithm for the clustering problem," Pattern Recognition, vol. 24, no. 10, pp. 1003–1008, 1991.

[31] K. S. Al-Sultan, "A Tabu search approach to the clustering problem," Pattern Recognition, vol. 28, no. 9, pp. 1443–1451, 1995.

[32] C. S. Sung and H. W. Jin, "A tabu-search-based heuristic for clustering," Pattern Recognition, vol. 33, no. 5, pp. 849–858, 2000.

[33] M. C. Cowgill, R. J. Harvey, and L. T. Watson, "A genetic algorithm approach to cluster analysis," Computers and Mathematics with Applications, vol. 37, no. 7, pp. 99–108, 1999.

[34] K. Krishna and M. N. Murty, "Genetic K-means algorithm," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 29, no. 3, pp. 433–439, 1999.

[35] U. Maulik and S. Bandyopadhyay, "Genetic algorithm-based clustering technique," Pattern Recognition, vol. 33, no. 9, pp. 1455–1465, 2000.

[36] C. A. Murthy and N. Chowdhury, "In search of optimal clusters using genetic algorithms," Pattern Recognition Letters, vol. 17, no. 8, pp. 825–832, 1996.

[37] M. Fathian, B. Amiri, and A. Maroosi, "Application of honey-bee mating optimization algorithm on clustering," Applied Mathematics and Computation, vol. 190, no. 2, pp. 1502–1513, 2007.

[38] D. W. van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215–220, IEEE, Canberra, Australia, December 2003.

[39] A. Hatamlou, S. Abdullah, and M. Hatamlou, "Data clustering using big bang-big crunch algorithm," in Innovative Computing Technology, vol. 241 of Communications in Computer and Information Science, pp. 383–388, Springer, 2011.

[40] D. Karaboga and C. Ozturk, "A novel clustering approach: artificial Bee Colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652–657, 2011.

[41] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "Application of gravitational search algorithm on data clustering," in Rough Sets and Knowledge Technology, vol. 6954 of Lecture Notes in Computer Science, pp. 337–346, Springer, Berlin, Germany, 2011.

[42] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "A combined approach for clustering based on K-means and gravitational search algorithms," Swarm and Evolutionary Computation, vol. 6, pp. 47–52, 2012.

[43] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187–195, 2004.

[44] Y. Kao and K. Cheng, "An ACO-based clustering algorithm," in Ant Colony Optimization and Swarm Intelligence, vol. 4150 of Lecture Notes in Computer Science, pp. 340–347, Springer, Berlin, Germany, 2006.

[45] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754–1762, 2008.

[46] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512–519, 2009.

[47] W.-F. Gao, S.-Y. Liu, and L.-L. Huang, "A novel artificial bee colony algorithm with Powell's method," Applied Soft Computing, vol. 13, no. 9, pp. 3763–3775, 2013.

[48] H. R. Lourenco, O. Martin, and T. Stutzle, "A beginner's introduction to iterated local search," in Proceedings of the 4th Metaheuristics International Conference (MIC '01), vol. 2, pp. 1–6, Porto, Portugal, 2001.

[49] T. G. Stutzle, Local Search Algorithms for Combinatorial Problems: Analysis, Improvements, and New Applications, vol. 220 of DISKI Dissertations on Artificial Intelligence, Infix Publishers, Sankt Augustin, Germany, 1999.

[50] X. S. Yang, Ed., Test Problems in Optimization: An Introduction with Metaheuristic Applications, Wiley, London, UK, 2010.

[51] M. Molga and C. Smutnicki, "Test functions for optimization needs," 2005.

[52] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[53] S. García, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 special session on real parameter optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617–644, 2009.

[54] F. Wilcoxon, "Individual comparisons by ranking methods," Biometrics Bulletin, vol. 1, no. 6, pp. 80–83, 1945.

[55] J. W. Han, M. Kamber, and J. Pei, Data Mining: Concepts and Techniques, Morgan Kaufmann Publishers, 2011.

[56] C. Blake and C. J. Merz, "UCI Repository of Machine Learning Databases," 1998.

[57] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183–197, 2010.

[58] E. Anderson, "The irises of the Gaspé peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2–5, 1935.

[59] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, no. 2, pp. 179–188, 1936.


International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 16: Research Article Grey Wolf Optimizer Based on Powell Local ...downloads.hindawi.com/journals/ddns/2015/481360.pdf · Research Article Grey Wolf Optimizer Based on Powell Local Optimization

16 Discrete Dynamics in Nature and Society

of average value and standard deviation of the fitness function. Moreover, the experimental results demonstrate that the proposed PGWO algorithm can be considered a feasible and efficient method for solving optimization problems.

Our future work will focus on two issues. On the one hand, we will apply the proposed PGWO algorithm to higher-dimensional problems and data sets with a large number of patterns. On the other hand, the PGWO clustering algorithm will be extended to determine the optimal number of clusters dynamically.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grants nos. 61165015, 61463007, and 61563008, and by the Project of Guangxi University for Nationalities Science Foundation under Grant no. 2012MDZD037.

References

[1] J. H. Holland, "Genetic algorithms," Scientific American, vol. 267, no. 1, pp. 66–72, 1992.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, IEEE, Perth, Australia, November–December 1995.
[3] K. Price and R. Storn, "Differential evolution," Dr. Dobb's Journal, vol. 22, pp. 18–20, 1997.
[4] K. V. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, New York, NY, USA, 2005.
[5] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[6] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.
[7] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
[8] S. Mirjalili, S. M. Mirjalili, and A. Hatamlou, "Multi-verse optimizer: a nature-inspired algorithm for global optimization," Neural Computing and Applications, 2015.
[9] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.
[10] D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459–471, 2007.
[11] X.-S. Yang, "Firefly algorithm, Lévy flights and global optimization," in Research and Development in Intelligent Systems XXVI, pp. 209–218, Springer, London, UK, 2010.
[12] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210–214, IEEE, Coimbatore, India, December 2009.
[13] R. Rajabioun, "Cuckoo optimization algorithm," Applied Soft Computing, vol. 11, no. 8, pp. 5508–5518, 2011.
[14] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, "GSA: a gravitational search algorithm," Information Sciences, vol. 179, no. 13, pp. 2232–2248, 2009.
[15] S. Mirjalili and A. Lewis, "Adaptive gbest-guided gravitational search algorithm," Neural Computing and Applications, vol. 25, no. 7, pp. 1569–1584, 2014.
[16] M. H. Sulaiman, Z. Mustaffa, M. R. Mohamed, and O. Aliman, "Using the gray wolf optimizer for solving optimal reactive power dispatch problem," Applied Soft Computing, vol. 32, pp. 286–292, 2015.
[17] X. H. Song, L. Tang, S. T. Zhao et al., "Grey Wolf Optimizer for parameter estimation in surface waves," Soil Dynamics and Earthquake Engineering, vol. 75, pp. 147–157, 2015.
[18] G. M. Komaki and V. Kayvanfar, "Grey Wolf Optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.
[19] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.
[20] B. Mahdad and K. Srairi, "Blackout risk prevention in a smart grid based flexible optimal strategy using Grey Wolf-pattern search algorithms," Energy Conversion and Management, vol. 98, pp. 411–429, 2015.
[21] O. Kramer, "Iterated local search with Powell's method: a memetic algorithm for continuous global optimization," Memetic Computing, vol. 2, no. 1, pp. 69–83, 2010.
[22] M. J. D. Powell, "Restart procedures for the conjugate gradient method," Mathematical Programming, vol. 12, no. 1, pp. 241–254, 1977.
[23] I. E. Evangelou, D. G. Hadjimitsis, A. A. Lazakidou, and C. Clayton, "Data mining and knowledge discovery in complex image data using artificial neural networks," in Proceedings of the Workshop on Complex Reasoning on Geographical Data, Paphos, Cyprus, 2001.
[24] M. S. Kamel and S. Z. Selim, "New algorithms for solving the fuzzy clustering problem," Pattern Recognition, vol. 27, no. 3, pp. 421–428, 1994.
[25] M. Omran, A. Salman, and A. P. Engelbrecht, "Image classification using particle swarm optimization," in Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning, Singapore, 2002.
[26] X. J. Lei, Swarm Intelligent Optimization Algorithms and Their Applications, Science Press, 2012.
[27] A. K. Jain, "Data clustering: 50 years beyond K-means," Pattern Recognition Letters, vol. 31, no. 8, pp. 651–666, 2010.
[28] W. Zou, Y. Zhu, H. Chen, and X. Sui, "A clustering approach using cooperative artificial bee colony algorithm," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 459796, 16 pages, 2010.
[29] B. Zhang, M. Hsu, and U. Dayal, "K-harmonic means – a data clustering algorithm," Hewlett-Packard Labs Technical Report HPL, 1999.
[30] S. Z. Selim and K. Alsultan, "A simulated annealing algorithm for the clustering problem," Pattern Recognition, vol. 24, no. 10, pp. 1003–1008, 1991.
[31] K. S. Al-Sultan, "A tabu search approach to the clustering problem," Pattern Recognition, vol. 28, no. 9, pp. 1443–1451, 1995.
[32] C. S. Sung and H. W. Jin, "A tabu-search-based heuristic for clustering," Pattern Recognition, vol. 33, no. 5, pp. 849–858, 2000.
[33] M. C. Cowgill, R. J. Harvey, and L. T. Watson, "A genetic algorithm approach to cluster analysis," Computers and Mathematics with Applications, vol. 37, no. 7, pp. 99–108, 1999.
[34] K. Krishna and M. N. Murty, "Genetic K-means algorithm," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 29, no. 3, pp. 433–439, 1999.
[35] U. Maulik and S. Bandyopadhyay, "Genetic algorithm-based clustering technique," Pattern Recognition, vol. 33, no. 9, pp. 1455–1465, 2000.
[36] C. A. Murthy and N. Chowdhury, "In search of optimal clusters using genetic algorithms," Pattern Recognition Letters, vol. 17, no. 8, pp. 825–832, 1996.
[37] M. Fathian, B. Amiri, and A. Maroosi, "Application of honey-bee mating optimization algorithm on clustering," Applied Mathematics and Computation, vol. 190, no. 2, pp. 1502–1513, 2007.
[38] D. W. Van der Merwe and A. P. Engelbrecht, "Data clustering using particle swarm optimization," in Proceedings of the Congress on Evolutionary Computation (CEC '03), vol. 1, pp. 215–220, IEEE, Canberra, Australia, December 2003.
[39] A. Hatamlou, S. Abdullah, and M. Hatamlou, "Data clustering using big bang-big crunch algorithm," in Innovative Computing Technology, vol. 241 of Communications in Computer and Information Science, pp. 383–388, Springer, 2011.
[40] D. Karaboga and C. Ozturk, "A novel clustering approach: artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 11, no. 1, pp. 652–657, 2011.
[41] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "Application of gravitational search algorithm on data clustering," in Rough Sets and Knowledge Technology, vol. 6954 of Lecture Notes in Computer Science, pp. 337–346, Springer, Berlin, Germany, 2011.
[42] A. Hatamlou, S. Abdullah, and H. Nezamabadi-Pour, "A combined approach for clustering based on K-means and gravitational search algorithms," Swarm and Evolutionary Computation, vol. 6, pp. 47–52, 2012.
[43] P. S. Shelokar, V. K. Jayaraman, and B. D. Kulkarni, "An ant colony approach for clustering," Analytica Chimica Acta, vol. 509, no. 2, pp. 187–195, 2004.
[44] Y. Kao and K. Cheng, "An ACO-based clustering algorithm," in Ant Colony Optimization and Swarm Intelligence, vol. 4150 of Lecture Notes in Computer Science, pp. 340–347, Springer, Berlin, Germany, 2006.
[45] Y.-T. Kao, E. Zahara, and I.-W. Kao, "A hybridized approach to data clustering," Expert Systems with Applications, vol. 34, no. 3, pp. 1754–1762, 2008.
[46] T. Niknam, B. Amiri, J. Olamaei, and A. Arefi, "An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering," Journal of Zhejiang University Science A, vol. 10, no. 4, pp. 512–519, 2009.
[47] W.-F. Gao, S.-Y. Liu, and L.-L. Huang, "A novel artificial bee colony algorithm with Powell's method," Applied Soft Computing, vol. 13, no. 9, pp. 3763–3775, 2013.
[48] H. R. Lourenço, O. Martin, and T. Stützle, "A beginner's introduction to iterated local search," in Proceedings of the 4th Metaheuristics International Conference (MIC '01), vol. 2, pp. 1–6, Porto, Portugal, 2001.
[49] T. G. Stützle, Local Search Algorithms for Combinatorial Problems: Analysis, Improvements, and New Applications, vol. 220 of DISKI Dissertations on Artificial Intelligence, Infix Publishers, Sankt Augustin, Germany, 1999.
[50] X. S. Yang, Ed., Test Problems in Optimization: An Introduction with Metaheuristic Applications, Wiley, London, UK, 2010.
[51] M. Molga and C. Smutnicki, "Test functions for optimization needs," 2005, http://www.zsd.ict.pwr.wroc.pl/files/docs/functions.pdf.
[52] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.
[53] S. García, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 special session on real parameter optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617–644, 2009.
[54] F. Wilcoxon, "Individual comparisons by ranking methods," Biometrics Bulletin, vol. 1, no. 6, pp. 80–83, 1945.
[55] J. W. Han, M. Kamber, and J. Pei, Data Mining: Concepts and Techniques, Morgan Kaufmann Publishers, 2011.
[56] C. Blake and C. J. Merz, "UCI Repository of Machine Learning Databases," 1998.
[57] T. Niknam and B. Amiri, "An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis," Applied Soft Computing Journal, vol. 10, no. 1, pp. 183–197, 2010.
[58] E. Anderson, "The irises of the Gaspé peninsula," Bulletin of the American Iris Society, vol. 59, pp. 2–5, 1935.
[59] R. A. Fisher, "The use of multiple measurements in taxonomic problems," Annals of Eugenics, vol. 7, no. 2, pp. 179–188, 1936.
