Research Article
Parameter Determination of Milling Process Using a Novel Teaching-Learning-Based Optimization Algorithm
Zhibo Zhai, Shujuan Li, and Yong Liu
School of Mechanical and Instrument Engineering, Xi'an University of Technology, 5 South Jinhua Road, Xi'an, Shaanxi 710048, China
Correspondence should be addressed to Shujuan Li; [email protected]
Received 24 July 2015; Revised 29 September 2015; Accepted 7 October 2015
Academic Editor: Anna Vila
Copyright © 2015 Zhibo Zhai et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cutting parameter optimization dramatically affects the production time, cost, profit rate, and the quality of the final products in milling operations. Aiming to select the optimum machining parameters in multitool milling operations such as corner milling, face milling, pocket milling, and slot milling, this paper presents a novel version of TLBO, TLBO with dynamic assignment learning strategy (DATLBO), in which all the learners are divided into three categories based on their results in the "Learner Phase": good learners, moderate learners, and poor ones. Good learners are self-motivated and try to learn by themselves; each moderate learner uses a probabilistic approach to select one of the good learners to learn from; each poor learner likewise uses a probabilistic approach to select several moderate learners to learn from. The CEC2005 contest benchmark problems are first used to illustrate the effectiveness of the proposed algorithm. Finally, the DATLBO algorithm is applied to a multitool milling process based on the maximum profit rate criterion with five practical technological constraints. The unit time, unit cost, and profit rate from the Handbook (HB), Feasible Direction (FD) method, Genetic Algorithm (GA) method, five other TLBO variants, and DATLBO are compared, illustrating that the proposed approach is more effective than HB, FD, GA, and the five other TLBO variants.
1. Introduction
In modern manufacturing, determining optimal cutting parameters is of great importance to improve the quality of products, to reduce the machining costs, and to maximize the profit rate. The main cutting parameters in multitool milling operations include the feed per tooth, the cutting velocity, and the radial and axial depths of cut. The conventional methods of selecting cutting parameters depend mainly either on operator experience or on machining data from handbooks. It is well known, however, that the cutting parameters obtained from these sources are in most cases extremely conservative and consequently may not deliver high productivity. It is therefore necessary to develop new techniques for the cutting parameter optimization problem.
Many mathematical programming techniques have been used extensively for the optimization of cutting parameters over the past few decades. In earlier studies, Gupta et al. [1] developed an integer programming approach for determining the optimal subdivision of depth of cut in constrained multipass turning. Subsequently, Wang et al. [2] used deterministic graphical programming to optimize machining parameters of cutting conditions for single-pass turning operations. Shin and Joo [3] used dynamic programming to determine optimum machining conditions under practical constraints. Petropoulos [4] developed a geometric programming model for the optimal selection of machining rate variables.
Although these mathematical programming techniques have been applied to the cutting parameter optimization problem, the studies above did not consider some important cutting constraints. With constraints such as surface roughness, cutting force, cutting velocity, machining power, and tool life, the cutting parameter optimization problem becomes very complicated, and the additional variables introduced by the number of passes complicate the solution procedure further. Moreover, these mathematical programming techniques tend to converge to local optima and may only be useful for a specific problem.
Recently, nontraditional optimization approaches have been developed to solve the cutting parameter optimization problem. Shunmugam et al. [5] used a Genetic
Hindawi Publishing Corporation, Mathematical Problems in Engineering, Volume 2015, Article ID 425689, 14 pages. http://dx.doi.org/10.1155/2015/425689
Algorithm (GA) to optimize cutting parameters in multipass milling operations, using total production cost as the objective function. Li et al. [6] developed a two-phase GA to optimize the spindle speed and feed and to select the tools for drilling blind holes in parallel drilling operations so as to obtain the minimum completion time. Krimpenis and Vosniakos [7] used a GA to optimize rough milling of parts with sculptured surfaces, selecting process parameters such as feed rate, cutting speed, width of cut, raster pattern angle, spindle speed, and the number of machining slices of variable thickness. Although GA has some advantages over traditional techniques, its successful application depends on the population size and the diversity of individual solutions in the search space. If this diversity cannot be maintained before the global optimum is reached, the GA may prematurely converge to a local optimum. Liu and Wang [8] proposed a modified GA to optimize milling parameter selection, in which the operating domain is defined and shifted around the optimal point during the evolutionary process so that the convergence speed and accuracy are improved. Wang et al. [9] presented a parallel genetic simulated annealing approach to select optimal machining parameters for multipass milling operations: the Taguchi method was first used to predict cutting parameter performance measures, and the GA was then utilized to optimize the cutting conditions. Subsequently, Oktem [10] discussed the use of an Artificial Neural Network (ANN) and a GA for predicting the combinations of cutting parameters that give the best surface roughness. Li et al. [11] suggested combining an ANN and a GA to minimize the makespan in production scheduling problems. Antonio et al. [12] used a GA based on an elitist strategy to minimize the manufacturing costs of multipass cutting parameters in face milling operations. Zhou et al. [13] applied a fuzzy particle swarm optimization (PSO) algorithm to select the machining parameters for milling operations.
Zarei et al. [14] proposed a Harmony Search (HS) algorithm to define the optimum cutting parameters for a multipass face milling operation. Mahdavinejad et al. [15] developed a new hybrid optimization approach that combines an immune algorithm with an ANN to predict the effect of milling parameters on the final surface roughness of Ti-6Al-4V work pieces. Briceno et al. [16] selected an ANN for modeling and simulating the milling process; orthogonal design, and specifically equally spaced dimensioning, showed that an ANN is a good method for defining process parameters. Venkata Rao and Pawar [17] used an Artificial Bee Colony (ABC) algorithm to minimize the production time of a multipass milling process by determining optimal process parameters such as the number of passes, the depth of cut for each pass, the cutting velocity, and the feed. Onwubolu [18] used a new optimization technique based on tribes to select the optimum machining parameters in multipass milling operations such as plain milling and face milling, simultaneously considering multipass rough machining and finish machining.
Although some improvements in optimizing machining parameters in milling operations have been made, these nontraditional optimization approaches require many algorithm-specific controlling parameters in addition to the common parameters such as the number of generations and the population size. For instance, the GA involves crossover and mutation probabilities. Similarly, the HS algorithm includes a harmony memory considering rate, a bandwidth rate, and a random selection rate. These specific controlling parameters significantly affect the performance of the above-mentioned algorithms: improper settings either increase the overall computation time or cause the search to fall into a local optimum. There thus remains a need for efficient and effective optimization algorithms for cutting parameter determination.
Very recently, Rao et al. proposed the Teaching-Learning-Based Optimization (TLBO) [19] algorithm. As a stochastic swarm-intelligence-based search strategy, it features rapid convergence, simple computation, and no specific controlling parameters beyond the common ones such as the number of generations and the population size. However, it has some undesirable dynamical properties that degrade its searching ability [20]. One of the most important issues is that the population tends to be trapped in a local optimum because of diversity loss. To improve the performance of the original TLBO, several modified or improved algorithms have been proposed in recent years, such as teaching-learning-based optimization with dynamic group strategy (DGSTLBO) [20], teaching-learning-based optimization with neighborhood search (NSTLBO) [21], an elitist teaching-learning-based optimization algorithm (ETLBO) [22], and a variant of teaching-learning-based optimization with differential learning (DLTLBO) [23]. These modified TLBOs perform better than the original TLBO on classical benchmark functions. Although the above-mentioned TLBO variants bring some improvements, none of them addresses the correct-assignment problem; that is, each learner should be guaranteed a correct assignment of learning objects in the "Learner Phase." To this end, we present a novel version of TLBO, TLBO with dynamic assignment learning strategy (DATLBO), in which all the learners are divided into three categories in the "Learner Phase": good learners, moderate learners, and poor ones. Good learners are self-motivated and try to learn by themselves; each moderate learner uses a probabilistic approach to select one of the good learners to learn from; each poor learner likewise uses a probabilistic approach to select several moderate learners to learn from.
This modification aims both to preserve the diversity of the population, discouraging premature convergence, and to achieve a balance between the explorative and exploitative tendencies of the search. A case study in multitool milling operations is used to verify DATLBO. The results are compared with those from a GA [24], the feasible direction method [25], handbook recommendations [26], and five other TLBO variants.
The paper is organized as follows. Section 2 gives a short introduction to the modeling of milling operations. The original TLBO algorithm and the proposed algorithm, DATLBO, are described in Section 3. The case study of multitool milling parameter optimization is presented in Section 4, and the summary and conclusions are given in Section 5. The last section provides the nomenclature.
Figure 1: Milling operations: (a) end milling; (b) face milling. (Figure omitted; its labels show the axial depth of cut a_a, radial depth of cut a_r, feed velocity V_f, feed per revolution f_r, cutter diameter D, spindle speed N, the end mill or face milling cutter, and the work piece.)
2. Modeling of Milling Operations
Milling is a machining process that uses rotary multiple-tooth cutters to remove material from a work piece. Figure 1 displays two kinds of milling operations: end milling and face milling. As the cutter rotates, each tooth removes a small amount of material from the advancing work piece during each spindle revolution.
In this study, face milling and end milling operations are considered. For multitool milling operations, it is very important to select optimum process parameters such as the depth of cut, the feed per tooth, and the cutting velocity. The depth of cut is usually predetermined by the work piece geometry and the operation sequence. Hence, determining the machining process parameters can be simplified to determining the proper cutting velocity and feed per tooth.
2.1. Modeling of Unit Time, Unit Cost, and Profit Rate. The mathematical model of multitool milling operations formulated in this study is based on the research of Tolouei-Rad and Bidhendi [25]. The decision variables considered for this model are the cutting velocity v and the feed per tooth f_z.
The objective function is to maximize the profit rate. The unit production time for a single part in multitool milling operations is the sum of the setup time, the machining time, and the tool changing time:

$$T_u = t_s + \sum_{i=1}^{m} t_{mi} + \sum_{i=1}^{m} t_{cti}, \quad (1)$$

where the machining time of operation i is

$$t_{mi} = \frac{\pi D_i L_i K}{1000\, v_i\, n_{zi}\, f_{zi}}, \quad i = 1, 2, \ldots, m. \quad (2)$$
The unit cost for producing the part in multitool milling operations is the sum of the material cost, setup cost, machining cost, tool cost, and tool changing cost:

$$C_u = C_{\mathrm{mat}} + (C_l + C_o)\, t_s + \sum_{i=1}^{m} (C_l + C_o)\, t_{mi} + \sum_{i=1}^{m} \left[ (C_l + C_o)\, t_{cti} + C_{ti} \right] \frac{t_{mi}}{T_i}, \quad (3)$$
where the tool life is

$$T_i = \frac{60}{v_i} \left\{ \frac{C_i \left[ (a_{di}/f_{zi})/5 \right]^{e_i}}{(a_{di}\, f_{zi})^{w_i}\, \left( \pi D_i N_i / 1000 \right)} \right\}^{1/n_i}. \quad (4)$$
For multitool milling operations the profit rate is

$$P_r = \frac{S_p - \left\{ C_{\mathrm{mat}} + (C_l + C_o)\, t_s + \sum_{i=1}^{m} (C_l + C_o)\, t_{mi} + \sum_{i=1}^{m} \left[ (C_l + C_o)\, t_{cti} + C_{ti} \right] (t_{mi}/T_i) \right\}}{t_s + \sum_{i=1}^{m} t_{mi} + \sum_{i=1}^{m} t_{cti}}, \quad (5)$$

that is, P_r = (S_p - C_u)/T_u.
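As a check on equations (1), (3), and (5), the unit time, unit cost, and profit rate can be evaluated directly once the per-operation quantities are known. The following sketch uses hypothetical operation data (the `ops` values are illustrative, not from the case study); the symbol names t_s, C_mat, C_l, C_o, and S_p follow the nomenclature:

```python
def unit_time_cost_profit(ops, t_s, C_mat, C_l, C_o, S_p):
    """Evaluate T_u (eq. 1), C_u (eq. 3), and P_r (eq. 5).

    Each op carries its machining time t_m, tool changing time t_ct,
    tool cost C_t, and tool life T for one milling operation.
    """
    T_u = t_s + sum(op["t_m"] for op in ops) + sum(op["t_ct"] for op in ops)
    C_u = (C_mat + (C_l + C_o) * t_s
           + sum((C_l + C_o) * op["t_m"] for op in ops)
           + sum(((C_l + C_o) * op["t_ct"] + op["C_t"]) * op["t_m"] / op["T"]
                 for op in ops))
    P_r = (S_p - C_u) / T_u
    return T_u, C_u, P_r

# Two hypothetical operations (times in min, costs in $, tool life in min)
ops = [dict(t_m=1.2, t_ct=0.5, C_t=2.0, T=45.0),
       dict(t_m=0.8, t_ct=0.5, C_t=3.0, T=60.0)]
T_u, C_u, P_r = unit_time_cost_profit(ops, t_s=2.0, C_mat=0.55,
                                      C_l=1.45, C_o=0.45, S_p=25.0)
```

Note that the tool-cost term charges a fraction t_mi/T_i of each tool change and tool price to every part, which is what couples the profit rate to the tool life of equation (4).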
2.2. Milling Process Constraints. The feed per tooth must lie in the range determined by the minimum and maximum feed per tooth of the machine tool:

$$f_{z\min} \le f_z \le f_{z\max}. \quad (6)$$

The cutting velocity must lie in the range determined by the minimum and maximum cutting velocity of the machine tool:

$$v_{\min} \le v \le v_{\max}. \quad (7)$$
The total cutting force constraint is

$$\frac{C_F\, a_a^{x_F}\, f_z^{y_F}\, a_r^{u_F}\, n_z}{D^{q_F}\, N^{w_F}}\, K_{FC} - F_c(\mathrm{per}) \le 0. \quad (8)$$
The machining power cannot exceed the effective maximum machining power. Therefore, the power constraint is

$$\frac{F_c\, v}{1000} - \eta\, P_m \le 0. \quad (9)$$
The required surface roughness cannot exceed the maximum allowable surface roughness. Therefore, the surface roughness constraint for end milling operations is [25]

$$\frac{318\, f_z^2}{4D} - R_a(\mathrm{at}) \le 0 \quad (10)$$

and for face milling it is [25]

$$\frac{318\, f_z}{\tan(l_a) + \cot(c_a)} - R_a(\mathrm{at}) \le 0. \quad (11)$$
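Constraints (6)-(11) can be collected into a single feasibility test per operation. The sketch below mirrors the nomenclature (C_F, x_F, y_F, u_F, q_F, w_F, K_FC, and so on), derives the spindle speed from the cutting velocity as N = 1000v/(πD), and uses purely illustrative numeric values in the example dictionary, not the case-study data:

```python
import math

def feasible(v, f_z, p, end_mill=True):
    """Check constraints (6)-(11) for a single milling operation.

    `p` holds the operation data; key names mirror the paper's
    nomenclature, but all numbers used below are illustrative.
    """
    if not (p["fz_min"] <= f_z <= p["fz_max"]):        # eq. (6): feed range
        return False
    if not (p["v_min"] <= v <= p["v_max"]):            # eq. (7): velocity range
        return False
    N = 1000.0 * v / (math.pi * p["D"])                # spindle speed from v
    F_c = (p["C_F"] * p["a_a"] ** p["x_F"] * f_z ** p["y_F"]   # eq. (8)
           * p["a_r"] ** p["u_F"] * p["n_z"]
           / (p["D"] ** p["q_F"] * N ** p["w_F"])) * p["K_FC"]
    if F_c > p["F_per"]:
        return False
    if F_c * v / 1000.0 > p["eta"] * p["P_m"]:         # eq. (9): power
        return False
    if end_mill:                                       # eq. (10)
        R_a = 318.0 * f_z ** 2 / (4.0 * p["D"])
    else:                                              # eq. (11)
        R_a = 318.0 * f_z / (math.tan(p["l_a"]) + 1.0 / math.tan(p["c_a"]))
    return R_a <= p["R_at"]

# Illustrative operation data (NOT the case-study values)
p = dict(fz_min=0.05, fz_max=0.4, v_min=30.0, v_max=150.0, D=50.0,
         a_a=2.0, a_r=5.0, n_z=4, C_F=7900.0, x_F=1.0, y_F=0.75,
         u_F=1.1, q_F=1.3, w_F=0.2, K_FC=0.25, F_per=3000.0,
         eta=0.95, P_m=8.5, R_at=4.0, l_a=0.7, c_a=0.7)
```

A candidate (v, f_z) pair is kept by the optimizer only if `feasible` returns True, which is exactly the role these five constraints play in Section 4.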
3. Teaching-Learning-Based Optimization
3.1. TLBO Algorithm. Inspired by the philosophy of teaching and learning, Rao et al. presented teaching-learning-based optimization (TLBO) [19]. It was developed based on the simulation of a classical learning process in a class. Like other population-based methods such as GA, DE, and PSO, TLBO uses a population of candidate solutions, called learners, whose positions are initialized randomly in the search space. The teacher is generally a highly learned person who shares his or her knowledge with the learners in a class. The quality of the teacher affects the outcome (i.e., the grades or marks) of the learners. Furthermore, the learners also learn from interaction among themselves.
In the original TLBO algorithm, the different design variables are analogous to the different subjects offered to learners, and a learner's result is analogous to the "fitness," as in other population-based optimization techniques. The learning process of TLBO is divided into two phases: the "Teacher Phase" and the "Learner Phase."
Teacher Phase. During the teacher phase, the best learner, that is, the teacher, tries to raise the mean result of the class in the subject taught, depending on his or her capability. In practice, however, a teacher can only move the mean of a class toward his or her own level to some extent. Suppose that T denotes the teacher and M the mean of the class at any iteration. T tries to move M toward its own level, and the new mean is designated M_new. The difference between the existing mean and the new mean is

$$\text{Difference Mean} = r\,(M_{\mathrm{new}} - T_F\, M), \quad (12)$$

where r is a random vector in which each element is a random number in the range [0, 1] and T_F is a teaching factor that decides how much of the mean is to be changed; the value of T_F can be either 1 or 2. Based on (12), the update of a learner X_i in the teacher phase is

$$X_{\mathrm{new},i} = X_i + \text{Difference Mean}. \quad (13)$$
Learner Phase. During the learner phase, each learner interacts with other learners to improve his or her knowledge. The random interaction of learners takes place through formal communications, presentations, group discussions, and so forth. If another learner has more knowledge, the learner learns something new from him or her. For a learner X_i interacting with a randomly chosen learner X_j (j ≠ i), the learner mode is based on the following expression:

$$X_{\mathrm{new},i} = \begin{cases} X_i + r\,(X_j - X_i), & \text{if } X_j \text{ is better than } X_i, \\ X_i + r\,(X_i - X_j), & \text{otherwise.} \end{cases} \quad (14)$$
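To make the two phases concrete, the following is a minimal generic TLBO sketch with greedy acceptance and scalar bound clipping; it illustrates equations (12)-(14) and is not the exact implementation used in the paper:

```python
import random

def tlbo_step(pop, f, lb, ub):
    """One TLBO generation (teacher phase then learner phase), minimizing f."""
    D = len(pop[0])
    mean = [sum(x[j] for x in pop) / len(pop) for j in range(D)]
    teacher = min(pop, key=f)                       # best learner acts as teacher
    new_pop = []
    for X in pop:
        # Teacher phase, eqs. (12)-(13): move toward teacher minus T_F * mean
        T_F = random.randint(1, 2)
        cand = [X[j] + random.random() * (teacher[j] - T_F * mean[j])
                for j in range(D)]
        cand = [min(max(c, lb), ub) for c in cand]
        X = cand if f(cand) < f(X) else X           # greedy acceptance
        # Learner phase, eq. (14): interact with a random partner
        # (a full implementation enforces that the partner differs from X)
        P = random.choice(pop)
        sign = 1.0 if f(P) < f(X) else -1.0
        cand = [X[j] + random.random() * sign * (P[j] - X[j]) for j in range(D)]
        cand = [min(max(c, lb), ub) for c in cand]
        new_pop.append(cand if f(cand) < f(X) else X)
    return new_pop

# Usage: minimize the sphere function in 5 dimensions
random.seed(0)
sphere = lambda x: sum(v * v for v in x)
pop = [[random.uniform(-5, 5) for _ in range(5)] for _ in range(20)]
for _ in range(50):
    pop = tlbo_step(pop, sphere, -5.0, 5.0)
best = min(sphere(x) for x in pop)
```

Note that the only settings are the population size and the number of generations, which is exactly the parameter-free property the paper attributes to TLBO.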
3.2. DATLBO Algorithm. In the original TLBO, each learner interacts randomly with other learners in the "Learner Phase," which involves a certain blindness and does not ensure a correct assignment of learning objects to each learner. During the course of optimization, this situation results in a slower convergence rate. Motivated by this fact, we propose a novel version of TLBO, TLBO with dynamic assignment learning strategy (DATLBO).
3.2.1. Dynamic Assignment Learning Strategy. Studies reveal that birds employ different strategies to conduct the mating process in their societies. The ultimate success of a bird in raising a brood with superior features depends on the appropriateness of the assignment strategy it uses [27]. As noted above, in the original TLBO each learner interacts randomly with other learners in the "Learner Phase," which involves a certain degree of blindness and does not ensure a correct assignment of learning objects to each learner. To be more specific, an appropriate assignment strategy can play an important role in improving the results of the whole class. Inspired by the bird mating process above, all the learners are divided into three categories based on their results: good learners, moderate learners, and poor ones. The good learners are those with the best results, the moderate learners are those with intermediate results, and the poor ones are those with bad results. Each category has its own learning pattern. By means of this assignment, the entire class is split into different categories of learners according to their level, and each learner is assigned an appropriate learning object. The way in which each category produces a candidate solution is explained in detail below.
(1) For j = 1 to D do
(2)   If r1 > sf
(3)     X_new,i(j) = X_i(j) + μ × (r2 − r3)
(4)   Else
(5)     X_new,i(j) = X_i(j)
(6)   End If
(7) End For

Pseudocode 1
(1) X_new,i = X_i + w × r .× (X_g − X_i)
(2) j = randi(D)
(3) If r1 > sf
(4)   X_new,i(j) = l(j) − r2 × (l(j) − u(j))
(5) End If

Pseudocode 2
3.2.2. Each Category's Learning Pattern in the Dynamic Assignment Learning Strategy. Each good learner is able to self-learn without the help of others in the "Learner Phase"; that is to say, good learners are self-motivated and try to learn by themselves. Therefore, each good learner tries to increase his or her own knowledge of a certain subject by probabilistically making a small change in that subject. From the optimization point of view, exploitation of the best solutions found so far is performed by the good learners. The self-learning pattern of each good learner is implemented in Pseudocode 1, where X_new,i is a newly generated learner derived from X_i, D is the problem dimension, r1, r2, and r3 are uniformly distributed random numbers in the range [0, 1], sf is the self-motivated factor of each good learner, and μ denotes the step size.
Each moderate learner selects one of the good learners with a probabilistic approach and learns from the good learner of his or her own interest; a good learner with more knowledge has a better chance of being selected. The learning pattern of each moderate learner is implemented in Pseudocode 2, where X_new,i is a newly generated learner derived from X_i, w is a time-varying weight adjusting the importance of the selected good learner, r is a vector whose elements are distributed randomly in the range [0, 1], X_g is the learner selected from among the good learners, sf is the self-motivated factor of each moderate learner, and u and l are the upper and lower bounds of the elements, respectively.
Each poor learner tends to learn from two or more moderate learners that are selected with a probabilistic approach. The learning pattern of each poor learner is implemented in Pseudocode 3, where X_new,i is a newly generated learner derived from X_i, w is a time-varying weight adjusting the importance of the selected moderate learners, n_m is the number of selected moderate learners, r_k is a vector whose elements are distributed randomly in [0, 1], X_mk is the k-th learner selected from among the moderate learners, r2 is a random number in the range [0, 1], sf is the
(1) X_new,i = X_i + w × Σ_{k=1}^{n_m} r_k .× (X_mk − X_i)
(2) j = randi(D)
(3) If r1 > sf
(4)   X_new,i(j) = l(j) − r2 × (l(j) − u(j))
(5) End If

Pseudocode 3
self-motivated factor of each poor learner, and u and l are the upper and lower bounds of the elements, respectively.
As explained above, the pseudocode of the dynamic assignment learning strategy is given in Pseudocode 4.
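In outline, the category-based learner phase of Pseudocodes 1-3 can be sketched as follows. The 20/50/30 split and the roulette-wheel selection follow Section 3.2, while the helper names and the fitness-to-weight mapping (which assumes nonnegative fitness values, as in an error minimization) are illustrative simplifications:

```python
import random

def dynamic_assignment_phase(pop, fitness, sf, w, mu=1e-3, n_sel=2):
    """One dynamic-assignment learner phase (sketch, minimization).

    Learners are sorted by fitness and split 20/50/30 into good,
    moderate, and poor categories, each with its own learning pattern.
    """
    order = sorted(range(len(pop)), key=lambda i: fitness[i])
    pop = [pop[i] for i in order]
    fit = [fitness[i] for i in order]
    N = len(pop)
    n1, n2 = int(0.2 * N), int(0.5 * N)
    good, moderate, poor = pop[:n1], pop[n1:n1 + n2], pop[n1 + n2:]
    # Roulette weights: lower (better) fitness gets a heavier weight;
    # assumes fitness >= 0, purely for illustration.
    weights = [1.0 / (1.0 + f) for f in fit]
    new = []
    for X in good:                       # Pseudocode 1: probabilistic self-learning
        new.append([x + mu * (random.random() - random.random())
                    if random.random() > sf else x for x in X])
    for X in moderate:                   # Pseudocode 2: learn from one good learner
        G = random.choices(good, weights=weights[:n1])[0]
        new.append([x + w * random.random() * (g - x) for x, g in zip(X, G)])
    for X in poor:                       # Pseudocode 3: learn from n_sel moderate learners
        Ms = random.choices(moderate, weights=weights[n1:n1 + n2], k=n_sel)
        new.append([x + w * sum(random.random() * (m[j] - x) for m in Ms)
                    for j, x in enumerate(X)])
    return new

# Usage on a small random class of 10 learners in 4 dimensions
random.seed(1)
pop = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(10)]
fitness = [sum(v * v for v in x) for x in pop]
new_pop = dynamic_assignment_phase(pop, fitness, sf=0.5, w=1.0)
```

The per-dimension reinitialization step of Pseudocodes 2 and 3 (resetting one random coordinate inside the bounds when r1 > sf) is omitted here for brevity.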
3.2.3. Parameter Adjustment for Each Category in the Dynamic Assignment Learning Strategy. To apply the dynamic assignment learning strategy in the DATLBO algorithm, the appropriate parameters have to be tuned. The most important parameter appears to be the proportion of each category of learners in the class. It is suggested that the percentages of good, moderate, and poor learners be set to 20%, 50%, and 30% of the class, respectively. One assigned good learner and two or three assigned moderate learners are enough. The self-motivated factor sf lies between 0 and 1 and can be set between 0.9 and 1; small values of this parameter may hurt the performance of the assignment learning strategy. It is better to select sf as an increasing linear function that changes from a small value near zero (e.g., 0.1) to a large one near 1 (e.g., 0.9). This allows learners to change their subject abilities with high probability at the beginning of the assignment learning strategy; the probability then decreases over the generations and helps the learners converge to the global solution. The step size μ can be selected on the order of 10^-2 or 10^-3. To provide a good balance between local and global search, w decreases linearly from a value near 2 to a small one near 0.
3.3. The Pseudocode of the DATLBO Algorithm. By incorporating the dynamic assignment learning strategy in the "Learner Phase" into the original TLBO framework, the DATLBO algorithm is developed. The pseudocode of the DATLBO algorithm is presented in Pseudocode 5.
3.4. Experiments and Comparisons
3.4.1. Benchmark Functions Used in Experiments. To analyze and compare the performance and accuracy of the DATLBO algorithm, a large set of CEC2005 benchmark functions is used in the experiments. Based on their shape characteristics, the benchmark functions are grouped into unimodal functions (F1 to F5) and basic multimodal functions (F6 to F12). Brief descriptions of these benchmark functions are listed in Table 1. For more details about their definitions, refer to [28].
3.4.2. Experimental Platform, Termination Criterion, and Parameters. All experiments were run on the same machine with
Begin
  Set: sf = sf_min − (sf_min − sf_max) × (t/t_max); [pp, mm] = sort(fitness); fitness = pp; X = X(mm, :)
  // Good learners (i = 1, ..., N1)
  (1) For i = 1 : N1
  (2)   For j = 1 to D do
  (3)     If r1 > sf
  (4)       X_new,i(j) = X_i(j) + μ × (r2 − r3)
  (5)     Else
  (6)       X_new,i(j) = X_i(j)
  (7)     End If
  (8)   End For
  (9) End For
  // Moderate learners (i = 1, ..., N2)
  (10) index1 = roulette_wheel_1(fitness, N1, N2)
  (11) For i = 1 : N2
  (12)   For j = 1 to D do
  (13)     X_new,N1+i(j) = X_N1+i(j) + w × r × (X_index1(j) − X_N1+i(j))
  (14)   End For
  (15)   j = randi(D)
  (16)   If r1 > sf
  (17)     X_new,N1+i(j) = l(j) − r2 × (l(j) − u(j))
  (18)   End If
  (19) End For
  // Poor learners (i = 1, ..., N3)
  (20) index2 = roulette_wheel_2(fitness, N1, N3, nm1)
  (21) For i = 1 : N3
  (22)   For k = 1 : nm1
  (23)     r3_k = rand(1, D) .× (X_index2(k) − X_N1+N2+i)
  (24)   End For
  (25)   X_new,N1+N2+i = X_N1+N2+i + w × sum_k(r3_k)
  (26)   j = randi(D)
  (27)   If r1 > sf
  (28)     X_new,N1+N2+i(j) = l(j) − r2 × (l(j) − u(j))
  (29)   End If
  (30) End For
End

Pseudocode 4: The pseudocode of the dynamic assignment learning strategy.
a Celeron 2.26 GHz CPU, 2 GB of memory, and the Windows XP operating system with MATLAB 7.9. To reduce statistical errors, all experiments were independently run 25 times on the twelve test functions with 30 variables, with 300,000 function evaluations (FES) as the stopping criterion.
The parameter settings of the DATLBO algorithm are as follows: N = 50 and K = 3; the numbers of good, moderate, and poor learners are set to 10, 25, and 15, respectively; roulette wheel selection is used; the numbers of assigned good learners and moderate learners are set to 1 and 2, respectively; the self-motivated factor is sf = 0.9; and the step size is μ = 10^-3. The parameters of the other algorithms agree with the original papers.
3.4.3. Performance Metric. The mean value F_mean and standard deviation (SD) of the function error value F(x) − F(x*) are recorded to evaluate the performance of each algorithm, where F(x) and F(x*) denote the best fitness value found and the true global optimum of the test problem, respectively. To verify whether the overall optimization performance of the various algorithms differs significantly, a statistical analysis method is used to compare the results obtained by the algorithms on the various problems. Therefore, to statistically compare the DATLBO algorithm with the other five algorithms, the Wilcoxon rank-sum test [29] at a 0.05 significance level is used to evaluate whether the median fitness values (F_mean) of any two algorithms differ significantly.
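For reference, the rank-sum comparison can be sketched with the normal approximation. This stdlib-only version assumes no tied values (in practice one would use scipy.stats.ranksums, which also handles ties), and the two per-run error samples below are artificial:

```python
import math

def rank_sum_test(a, b):
    """Wilcoxon rank-sum z statistic with a normal approximation.

    Assumes no tied values; tie handling and exact small-sample
    p-values are left to a full implementation.
    """
    n1, n2 = len(a), len(b)
    rank = {v: i + 1 for i, v in enumerate(sorted(list(a) + list(b)))}
    W = sum(rank[v] for v in a)                  # rank sum of sample a
    mu = n1 * (n1 + n2 + 1) / 2.0                # mean of W under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (W - mu) / sigma
    # two-sided p-value from the standard normal CDF
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p

# Artificial per-run error samples: algorithm A is clearly better than B
A = [0.11, 0.12, 0.13, 0.14, 0.15]
B = [0.91, 0.92, 0.93, 0.94, 0.95]
z, p = rank_sum_test(A, B)
```

With 25 independent runs per algorithm, a p-value below 0.05 marks the difference between two algorithms as significant, which is how the "+"/"−" entries of the result tables are produced.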
3.4.4. Numerical Experiments and Results. In this section, the DATLBO algorithm is compared with five other TLBO variants. Each corresponding table presents the experimental results, and the last three rows of each table summarize the comparison results. The best results are shown in bold.
From the statistical results of Table 2, we can see that none of the algorithms perfectly solves all twelve CEC2005 standard benchmark functions. From the Wilcoxon rank-
Input: N, D, K, u, l, t, FES_MAX; N1 = 0.2 × N; N2 = 0.5 × N; N3 = 0.3 × N;
       w_max = 2.5; w_min = 0.25; μ = 0.001; sf_max = 0.9; sf_min = 0.1; nm1 = 2; K = 3
(1) t = 0
(2) Generate an initial population X = {x_1, x_2, ..., x_N}
(3) FES = N; t_max = floor((FES_MAX − N)/N)
(4) While FES <= FES_MAX
      // Teacher phase
(5)   Evaluate the objective function values f(X)
(6)   [fb, ib] = min(fitness); find the best learner X_best(t)
(7)   M_best = mean(X)
(8)   For i = 1 : N
(9)     T_F = round(1 + rand)
(10)    For j = 1 : D
(11)      X_new,i(j) = X_i(j) + rand × (X_best(j) − T_F × M_best(j))
(12)      If X_new,i(j) > bu2(j)
(13)        X_new,i(j) = max(bu1(j), 2 × bu2(j) − X_new,i(j))
(14)      End If
(15)      If X_new,i(j) < bu1(j)
(16)        X_new,i(j) = min(bu2(j), 2 × bu1(j) − X_new,i(j))
(17)      End If
(18)    End For
(19)  End For
      // Learner phase with dynamic assignment (Pseudocode 4)
(20)  sf = sf_min − (sf_min − sf_max) × (t/t_max); [pp, mm] = sort(fitness); fitness = pp; X = X(mm, :)
(21)  For i = 1 : N1                          // good learners
(22)    For j = 1 to D do
(23)      If r1 > sf
(24)        X_new,i(j) = X_i(j) + μ × (r2 − r3)
(25)      Else
(26)        X_new,i(j) = X_i(j)
(27)      End If
(28)    End For
(29)  End For
(30)  index1 = roulette_wheel_1(fitness, N1, N2)
(31)  For i = 1 : N2                          // moderate learners
(32)    For j = 1 to D do
(33)      X_new,N1+i(j) = X_N1+i(j) + w × r × (X_index1(j) − X_N1+i(j))
(34)    End For
(35)    j = randi(D)
(36)    If r1 > sf
(37)      X_new,N1+i(j) = l(j) − r2 × (l(j) − u(j))
(38)    End If
(39)  End For
(40)  index2 = roulette_wheel_2(fitness, N1, N3, nm1)
(41)  For i = 1 : N3                          // poor learners
(42)    For k = 1 : nm1
(43)      r3_k = rand(1, D) .× (X_index2(k) − X_N1+N2+i)
(44)    End For
(45)    X_new,N1+N2+i = X_N1+N2+i + w × sum_k(r3_k)
(46)    j = randi(D)
(47)    If r1 > sf
(48)      X_new,N1+N2+i(j) = l(j) − r2 × (l(j) − u(j))
(49)    End If
(50)  End For
(51)  FES = FES + N; t = t + 1
(52) End While
End

Pseudocode 5: The pseudocode of the DATLBO algorithm.
Table 1: Benchmark functions definition.

Name   Definition                                                         Range
F1     Shifted sphere function                                            [50, 100]
F2     Shifted Schwefel's problem 1.2                                     [50, 100]
F3     Shifted rotated high conditioned elliptic function                 [50, 100]
F4     Shifted Schwefel's problem 1.2 with noise in fitness               [50, 100]
F5     Schwefel's problem 2.6 with global optimum on bounds               [50, 100]
F6     Shifted Rosenbrock's function                                      [50, 100]
F7     Shifted rotated Griewank's function without bounds                 [300, 600]
F8     Shifted rotated Ackley's function with global optimum on bounds    [16, 32]
F9     Shifted Rastrigin's function                                       [2.5, 5]
F10    Shifted rotated Rastrigin's function                               [2.5, 5]
F11    Shifted rotated Weierstrass function                               [0.25, 0.5]
F12    Schwefel's problem 2.13                                            [50, 100]
sum test listed in the last three rows of Table 2, it is clear that the DATLBO algorithm outperforms the original TLBO algorithm on all test functions except F5, F6, and F7. Although the DLTLBO algorithm outperforms the DATLBO algorithm on test function F4, the DATLBO algorithm is significantly better than the DLTLBO algorithm on the other test functions F1, F3, F6, F8, F9, F10, F11, and F12. Through the ensemble of the assignment learning strategy, the DATLBO algorithm achieves promising results on both unimodal and multimodal functions. The DATLBO algorithm outperforms the TLBO, DGSTLBO, ETLBO, NSTLBO, and DLTLBO algorithms on nine, nine, seven, nine, and nine out of the twelve test functions, respectively. The convergence graphs on the twelve test functions for D = 30 obtained by the six relevant TLBO algorithms are shown in Figure 2. From the convergence graphs, we can see that our DATLBO algorithm has better convergence speed and solution quality in most cases than the other five relevant TLBO algorithms. It is therefore worth noting that the overall performance of the DATLBO algorithm is significantly better than that of the original TLBO, ETLBO, NSTLBO, DGSTLBO, and DLTLBO algorithms.
4. Case Study for Milling Operation
4.1. Problem Description. To compare the performance of the DATLBO algorithm with the other algorithms, a case study from [25] is used as a test. A work piece has four machined features, namely, a step, a pocket, and two slots. The milling operation schematic is shown in Figure 3: the three-dimensional view in Figure 3(a), the front view in Figure 3(b), and the top view in Figure 3(c). The objective is to find the optimum machining parameters giving the maximum profit rate. Table 4 shows the limits of the cutting velocity and feed per tooth for this case study. The five milling operations, face milling, corner milling, pocket milling, slot milling 1, and slot milling 2, are listed in Table 3. The data for the tools used in each operation are given in Table 5.
The machine tool is a vertical CNC milling machine with P_m = 8.5 kW and η = 95%. The work piece material is 10L50 leaded steel with a hardness of 225 BHN. Other data include S_p = $25, C_mat = $0.55, C_l = 1.45 $/min, C_o = 0.45 $/min, t_s = 2 min, t_ct = 0.5 min, C = 33.98 for HSS tools and C = 100.05 for carbide tools, w = 0.28, K_t = 2.24, g = 0.15 for HSS tools and g = 0.53 for carbide tools, n = 0.14, C_F = 7900, x_F = 1.0, y_F = 0.75, u_F = 1.1, q_F = 1.3, w_F = 0.2, and K_FC = 0.25; all the data are from [25].
4.2. Application of the DATLBO Algorithm to the Case Study. In this case study, Deb's heuristic constraint handling method is used to handle the constraints with DATLBO and the five other TLBO variants. A tournament selection operator is used in Deb's heuristic method [30] to select and compare two solutions. The following three heuristic rules are implemented:
Rule 1: if one solution is feasible and the other infeasible, the feasible solution is preferred.
Rule 2: if both solutions are feasible, the solution having the better objective function value is preferred.
Rule 3: if both solutions are infeasible, the solution having the smaller constraint violation is preferred.
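The three rules amount to a simple pairwise comparison. A minimal sketch, written for minimization (for the profit-rate maximization one would compare negated objectives), represents each solution as an (objective, total-violation) pair:

```python
def deb_better(sol_a, sol_b):
    """Deb's tournament comparison [30]: is sol_a preferred over sol_b?

    Each solution is an (objective_value, constraint_violation) pair,
    with violation == 0.0 meaning feasible; objectives are minimized.
    """
    f_a, v_a = sol_a
    f_b, v_b = sol_b
    if v_a == 0.0 and v_b > 0.0:    # Rule 1: feasible beats infeasible
        return True
    if v_a > 0.0 and v_b == 0.0:
        return False
    if v_a == 0.0 and v_b == 0.0:   # Rule 2: better objective wins
        return f_a < f_b
    return v_a < v_b                # Rule 3: smaller violation wins

# One example per rule
r1 = deb_better((5.0, 0.0), (1.0, 0.3))   # feasible vs infeasible
r2 = deb_better((2.0, 0.0), (3.0, 0.0))   # both feasible
r3 = deb_better((9.0, 0.1), (1.0, 0.2))   # both infeasible
```

A convenient property of this scheme is that it needs no penalty coefficients, which fits the parameter-light philosophy of TLBO-type algorithms.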
These rules are applied at the end of the teacher and learner phases, where Deb's constraint handling method selects the new solution. For this case study, DATLBO and the five other TLBO variants are each run 25 times, and the termination condition is a maximum of 300,000 function evaluations. The population size and the dimension are set to 30 and 10, respectively.
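Deb's three comparison rules can be sketched as a small tournament routine. This is a minimal illustration, not the authors' implementation; the objective here is taken as the negated profit rate so that smaller is better, and the constraints are assumed to be written in the standard form g_i(x) <= 0.

```python
def violation(g):
    """Total constraint violation: sum of positive parts of the g_i(x) <= 0 terms."""
    return sum(max(0.0, gi) for gi in g)

def deb_select(f1, g1, f2, g2):
    """Tournament comparison of two solutions under Deb's three rules.

    f1, f2: objective values to minimize (e.g. negated profit rate);
    g1, g2: lists of constraint values, feasible when every g_i <= 0.
    Returns 1 if solution 1 wins the tournament, 2 otherwise.
    """
    v1, v2 = violation(g1), violation(g2)
    if v1 == 0.0 and v2 > 0.0:            # Rule 1: feasible beats infeasible
        return 1
    if v2 == 0.0 and v1 > 0.0:
        return 2
    if v1 == 0.0 and v2 == 0.0:           # Rule 2: both feasible -> better objective
        return 1 if f1 <= f2 else 2
    return 1 if v1 <= v2 else 2           # Rule 3: both infeasible -> smaller violation

# Solution 1 is feasible, solution 2 violates its first constraint -> 1 wins
print(deb_select(-2.8, [-0.1, -0.5], -3.3, [0.2, -0.5]))  # prints 1
```

Note that Rule 1 makes feasibility dominate objective quality: the infeasible solution loses even though its (negated) profit rate is better.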
Tables 6-11 show the optimal feed per tooth and cutting velocity obtained by DATLBO and the five other TLBO variants for the five operations when the maximum profit rate is reached. Figures 4-6 and Table 12 compare the unit cost, unit time, and profit rate obtained by HB, FD, GA, DATLBO, and the five other TLBO variants. Compared with the HB solution, the unit cost decreases by 38%, 39%, 40%, 41%, 43%, 41%, 45%, and 46%; the unit time decreases by 41%, 44%, 46%, 48%, 48%, 47%, 47%, and 51%; and the profit rate increases by 2.51, 2.73, 2.94, 3.00, 3.18, 2.96, 3.25, and 3.66 times for FD, GA, TLBO, DGSTLBO, ETLBO, NSTLBO, DLTLBO, and DATLBO, respectively. The maximum profit rate given by the DATLBO algorithm is 3.31 $/min, which is
Mathematical Problems in Engineering 9
Table 2: Results of six algorithms over 25 independent runs on 12 test functions of 30 variables with 300,000 FES.

Function | Result | TLBO | DGSTLBO | ETLBO | NSTLBO | DLTLBO | DATLBO
F1 | mean | 2.63E-27 (-) | 1.69E-09 (-) | 1.49E-27 (-) | 9.31E-20 (-) | 6.32E-27 (-) | 1.86E-28
F1 | SD | 1.22E-27 | 2.34E-09 | 2.11E-27 | 1.24E-19 | 2.55E-26 | 4.02E-29
F2 | mean | 7.81E-09 (-) | 1.33E-08 (-) | 6.00E-10 (-) | 6.01E-09 (-) | 1.45E-14 (-) | 1.34E-16
F2 | SD | 1.06E-08 | 1.75E-09 | 8.48E-10 | 2.05E-00 | 2.94E-15 | 6.95E-17
F3 | mean | 1.30E-06 (-) | 2.12E+06 (-) | 2.73E+05 (+) | 1.81E+06 (-) | 1.09E+06 (-) | 3.66E+03
F3 | SD | 7.28E-05 | 8.88E+05 | 4.57E+05 | 4.57E+05 | 3.67E+04 | 2.68E+03
F4 | mean | 3.58E+02 (-) | 2.74E+02 (-) | 2.01E+03 (-) | 7.83E+03 (-) | 3.65E-01 (+) | 1.65E+02
F4 | SD | 9.37E+00 | 1.69E+02 | 2.84E+03 | 3.48E+03 | 1.38E-02 | 1.44E+02
F5 | mean | 2.46E+03 (+) | 4.26E+03 (-) | 8.42E+02 (-) | 4.24E+03 (+) | 3.27E+04 (-) | 5.81E+03
F5 | SD | 1.05E+03 | 8.32E+02 | 1.88E+03 | 1.22E+03 | 9.02E+02 | 1.35E+03
-/+/≈ (F1-F5) | | 4/1/0 | 4/1/0 | 4/1/0 | 4/1/0 | 4/1/0 |
F6 | mean | 1.84E+01 (+) | 6.74E+02 (-) | 1.39E+01 (+) | 9.67E+01 (-) | 4.52E+01 (+) | 5.52E+01
F6 | SD | 2.81E+01 | 7.14E+02 | 3.10E+01 | 6.92E+01 | 2.16E+01 | 3.62E+01
F7 | mean | 4.70E+03 (-) | 4.70E+03 (-) | 4.70E+03 (-) | 4.70E+03 (-) | 4.70E+03 (-) | 4.75E+03
F7 | SD | 1.45E-12 | 6.78E-13 | 1.49E+03 | 2.23E-12 | 3.35E-14 | 9.59E-13
F8 | mean | 2.09E+01 (-) | 2.09E+01 (-) | 2.08E-01 (+) | 2.09E+01 (-) | 2.08E+01 (-) | 2.00E+01
F8 | SD | 3.52E-02 | 4.71E-02 | 6.35E-02 | 4.17E-02 | 5.19E-02 | 5.60E-02
F9 | mean | 8.86E+01 (-) | 6.11E+01 (+) | 2.07E+01 (-) | 1.06E+02 (-) | 2.28E+01 (-) | 9.95E+00
F9 | SD | 1.38E+01 | 1.17E+01 | 4.63E+01 | 2.48E+01 | 2.36E+01 | 4.45E+00
F10 | mean | 1.39E+02 (≈) | 1.09E+02 (-) | 3.02E+01 (≈) | 1.68E+02 (≈) | 1.12E+02 (-) | 1.07E+01
F10 | SD | 4.17E+01 | 7.09E+01 | 6.76E+01 | 3.88E+01 | 3.35E+01 | 2.09E+01
F11 | mean | 3.25E+01 (-) | 1.87E+01 (-) | 6.10E+00 (-) | 3.32E+01 (-) | 6.15E+00 (-) | 6.04E+00
F11 | SD | 6.06E+00 | 1.90E+00 | 1.36E-01 | 4.29E+00 | 1.52E-01 | 4.45E+00
F12 | mean | 7.53E+03 (-) | 2.70E+04 (-) | 1.15E+03 (-) | 9.84E+03 (-) | 4.05E+06 (-) | 2.51E+03
F12 | SD | 7.36E+03 | 1.20E+04 | 2.58E+03 | 7.83E+03 | 2.22E+03 | 3.00E+03
-/+/≈ (F6-F12) | | 5/1/1 | 5/1/1 | 3/2/2 | 5/0/2 | 5/1/1 |
-/+/≈ (total) | | 9/2/1 | 9/2/1 | 7/3/2 | 9/1/2 | 9/2/1 |

"-", "+", and "≈" denote that the performance of the corresponding algorithm is significantly worse than, significantly better than, and similar to that of DATLBO, respectively.
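The markers in Table 2 summarize per-function statistical comparisons of the 25 runs of each algorithm against DATLBO. The section does not restate which statistical test produced them; the sketch below uses a Wilcoxon rank-sum test with a normal approximation, a common choice for this kind of pairwise marker, applied to hypothetical error samples (not the paper's raw data).

```python
import math
import random

def rank_sum_marker(a, b, z_crit=1.96):
    """Wilcoxon rank-sum test (normal approximation, no tie correction;
    adequate for continuous error values). Minimization is assumed: returns
    '+' if sample a is significantly better (lower) than b, '-' if worse,
    and '~' if the difference is not significant at the 5% level."""
    n, m = len(a), len(b)
    ranked = sorted([(v, 0) for v in a] + [(v, 1) for v in b])
    r_a = sum(i + 1 for i, (_, tag) in enumerate(ranked) if tag == 0)
    mu = n * (n + m + 1) / 2.0
    sigma = math.sqrt(n * m * (n + m + 1) / 12.0)
    z = (r_a - mu) / sigma
    if z < -z_crit:
        return '+'       # a's ranks are low: smaller errors, so better
    if z > z_crit:
        return '-'
    return '~'

# Hypothetical final error values over 25 runs for two algorithms
random.seed(1)
tlbo = [random.gauss(1.0, 0.05) for _ in range(25)]
datlbo = [random.gauss(0.2, 0.05) for _ in range(25)]
print(rank_sum_marker(tlbo, datlbo))  # prints '-': first sample significantly worse
```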
Table 3: Milling operations.

Operation | Operation type | Tool number | a_a (mm) | K (mm) | a_r (mm) | F_c(per) (N)
1 | Face milling | 1 | 10 | 450 | 2 | 156,449.4
2 | Corner milling | 2 | 5 | 90 | 6 | 17,117.74
3 | Pocket milling | 2 | 10 | 450 | 5 | 17,117.74
4 | Slot milling 1 | 3 | 10 | 32 | - | 14,264.78
5 | Slot milling 2 | 3 | 5 | 84 | 1 | 14,264.78
better than that of all the other optimization methods applied to the same model.
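The relative improvements quoted above follow directly from the Table 12 values. The short script below recomputes them against the Handbook baseline; the printed percentages can differ from the quoted ones by about one point due to rounding.

```python
# Relative improvements over the Handbook (HB) baseline, recomputed from Table 12.
hb_cost, hb_time, hb_rate = 18.36, 9.40, 0.71

table12 = {  # method: (unit cost $, unit time min, profit rate $/min)
    "FD":      (11.35, 5.48, 2.49),
    "GA":      (11.11, 5.22, 2.65),
    "TLBO":    (10.91, 5.04, 2.80),
    "DGSTLBO": (10.90, 4.92, 2.86),
    "ETLBO":   (10.54, 4.87, 2.97),
    "NSTLBO":  (10.89, 5.01, 2.81),
    "DLTLBO":  (9.97, 4.98, 3.02),
    "DATLBO":  (9.82, 4.61, 3.31),
}

for name, (cost, time_, rate) in table12.items():
    cost_drop = (hb_cost - cost) / hb_cost * 100   # % decrease in unit cost
    time_drop = (hb_time - time_) / hb_time * 100  # % decrease in unit time
    rate_gain = (rate - hb_rate) / hb_rate         # profit rate increase, in "times"
    print(f"{name:8s} cost -{cost_drop:4.1f}%  time -{time_drop:4.1f}%  rate +{rate_gain:.2f}x")
```

For DATLBO, for instance, the profit rate gain is (3.31 - 0.71) / 0.71 = 3.66 times, matching the text.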
5. Summary and Conclusions
A novel TLBO variant, the DATLBO algorithm, is proposed for cutting parameter selection in multitool milling operations. In the proposed DATLBO algorithm, all the learners are divided into three categories based on their results in the "Learner Phase": good learners, moderate learners, and poor learners. Good learners are self-motivated and try to learn by themselves; each moderate learner uses a probabilistic approach to select one of the good learners to learn from; each poor learner uses a probabilistic approach to select several moderate learners to learn from. DATLBO is applied to a case study of a multitool milling parameter selection problem based on the maximum profit rate criterion with five practical technological constraints. Significant improvements are obtained with the DATLBO algorithm in comparison with the results of the HB, FD, GA, TLBO, DGSTLBO, ETLBO, NSTLBO, and DLTLBO algorithms. These results show that
the DATLBO algorithm is an important alternative for the optimization of machining parameters in multitool milling operations.

[Figure 2: Convergence graphs of the mean function error value, log10(F(x) - F(x*)), versus the number of FES (x10^5) for the six algorithms (TLBO, DGSTLBO, ETLBO, NSTLBO, DLTLBO, and DATLBO) on the twelve test functions F1-F12.]
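The three-category learning strategy summarized in the conclusions can be sketched as follows. The even three-way split, the uniform random choices, and the returned assignment plan are illustrative assumptions for exposition; the paper's exact selection probabilities and position-update equations are not reproduced here.

```python
import random

def assign_learning(fitness, rng=random):
    """Sketch of DATLBO's dynamic assignment of learners (minimization).

    Splits learners into good / moderate / poor thirds by their Learner-Phase
    results, then records whom each learner should learn from.
    """
    order = sorted(range(len(fitness)), key=lambda i: fitness[i])
    k = max(1, len(order) // 3)
    good, moderate, poor = order[:k], order[k:2 * k], order[2 * k:]
    plan = {}
    for i in good:            # good learners: self-motivated learning
        plan[i] = ("self", [i])
    for i in moderate:        # moderate learners: learn from one good learner
        plan[i] = ("from_good", [rng.choice(good)])
    for i in poor:            # poor learners: learn from several moderate learners
        plan[i] = ("from_moderate", rng.sample(moderate, k=min(2, len(moderate))))
    return plan

random.seed(0)
fit = [3.1, 0.4, 2.2, 5.0, 1.7, 4.4]   # hypothetical Learner-Phase objective values
plan = assign_learning(fit)
print(plan[1][0], plan[2][0], plan[5][0])  # prints: self from_good from_moderate
```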
Nomenclature
a_c: Tool clearance angle (deg.)
a_a, a_r: Axial and radial depths of cut (mm)
a_l: Tool lead (corner) angle (deg.)
A: Chip cross-sectional area (mm^2)
C_l, C_o: Labor cost and overhead cost ($/min)
C_m, C_mat, C_t: Machining cost, cost of raw material per part, and cutting tool cost ($)
C_u: Unit cost ($)
C: Cutting force equation constant
C_F: Cutting force coefficient
[Figure 3: Milling operation schematic. (a) Three-dimensional view showing the step, pocket, slot 1, and slot 2; (b) front view (section A-A); (c) top view, with dimensions in mm.]
[Figure 4: Comparison of unit cost ($) obtained by various methods (HB, FD, GA, TLBO, DGSTLBO, ETLBO, NSTLBO, DLTLBO, and DATLBO).]
[Figure 5: Comparison of unit time (min) obtained by various methods.]
[Figure 6: Comparison of profit rate ($/min) obtained by various methods.]
Table 4: Cutting velocity and feed per tooth ranges.

Operation | Operation type | Cutting velocity limits (m/min) | Feed per tooth limits (mm/tooth)
1 | Face milling | 60-120 | 0.05-0.4
2 | Corner milling | 40-70 | 0.05-0.5
3 | Pocket milling | 40-70 | 0.05-0.5
4 | Slot milling 1 | 30-50 | 0.05-0.5
5 | Slot milling 2 | 30-50 | 0.05-0.5
Table 5: Tools data.

Tool | Tool type | Quality | D (mm) | z | Price ($) | a_l (deg.) | a_c (deg.)
1 | Face mill | Carbide | 50 | 6 | 49.50 | 45 | 5
2 | End mill | HSS | 10 | 4 | 7.55 | 0 | 5
3 | End mill | HSS | 12 | 4 | 7.55 | 0 | 5
Table 6: Optimal results obtained by the TLBO algorithm for the milling case study.

Operation | Cutting velocity (m/min) | Feed per tooth (mm/tooth)
1 | 90.2536 | 0.3706
2 | 60.8891 | 0.2196
3 | 50.3567 | 0.2712
4 | 34.5682 | 0.1509
5 | 39.3216 | 0.3991
Unit cost: $10.91; unit time: 5.04 min; profit rate: 2.80 $/min.

Table 7: Optimal results obtained by the DGSTLBO algorithm for the milling case study.

Operation | Cutting velocity (m/min) | Feed per tooth (mm/tooth)
1 | 89.3821 | 0.4872
2 | 42.3654 | 0.4035
3 | 41.2251 | 0.4902
4 | 42.8636 | 0.4662
5 | 44.2648 | 0.4628
Unit cost: $10.90; unit time: 4.92 min; profit rate: 2.86 $/min.

Table 8: Optimal results obtained by the ETLBO algorithm for the milling case study.

Operation | Cutting velocity (m/min) | Feed per tooth (mm/tooth)
1 | 91.7539 | 0.4638
2 | 71.3328 | 0.3898
3 | 60.1682 | 0.4026
4 | 38.2543 | 0.4502
5 | 35.3826 | 0.4026
Unit cost: $10.54; unit time: 4.87 min; profit rate: 2.97 $/min.

Table 9: Optimal results obtained by the NSTLBO algorithm for the milling case study.

Operation | Cutting velocity (m/min) | Feed per tooth (mm/tooth)
1 | 91.3682 | 0.3831
2 | 58.9875 | 0.2537
3 | 54.3568 | 0.3633
4 | 38.4828 | 0.2086
5 | 41.3682 | 0.4128
Unit cost: $10.89; unit time: 5.01 min; profit rate: 2.81 $/min.

Table 10: Optimal results obtained by the DLTLBO algorithm for the milling case study.

Operation | Cutting velocity (m/min) | Feed per tooth (mm/tooth)
1 | 90.4209 | 0.3905
2 | 60.3259 | 0.2686
3 | 52.6538 | 0.3562
4 | 35.3864 | 0.2482
5 | 40.6637 | 0.5129
Unit cost: $9.97; unit time: 4.98 min; profit rate: 3.02 $/min.

Table 11: Optimal results obtained by the DATLBO algorithm for the milling case study.

Operation | Cutting velocity (m/min) | Feed per tooth (mm/tooth)
1 | 95.5829 | 0.3931
2 | 43.6843 | 0.3902
3 | 43.8542 | 0.4879
4 | 40.2535 | 0.3085
5 | 41.9932 | 0.4873
Unit cost: $9.82; unit time: 4.61 min; profit rate: 3.31 $/min.
Table 12: Comparison of results for the milling case study by various methods.

Method | C_u, unit cost ($) | T_u, unit time (min) | P_r, profit rate ($/min)
Handbook [26] | 18.36 | 9.40 | 0.71
FD [25] | 11.35 | 5.48 | 2.49
GA [24] | 11.11 | 5.22 | 2.65
TLBO | 10.91 | 5.04 | 2.80
DGSTLBO | 10.90 | 4.92 | 2.86
ETLBO | 10.54 | 4.87 | 2.97
NSTLBO | 10.89 | 5.01 | 2.81
DLTLBO | 9.97 | 4.98 | 3.02
DATLBO | 9.82 | 4.61 | 3.31
D: Tool diameter (mm)
e: Machine tool efficiency factor
f_z: Feed per tooth (mm/tooth)
F_c, F_c(per): Cutting force and permitted cutting force (N)
G, g: Slenderness ratio and exponent of slenderness ratio
i: ith milling operation
K: Distance traveled by tool to perform operation (mm)
K_FC: Correction factor of cutting force under different experimental conditions
n: Number of machining operations
m, N: Tool life exponent and spindle speed (rpm)
P, P_m: Required power for the operation and motor power (kW)
P_r: Profit rate ($/min)
q_F: Influence exponent of the tool on cutting force
r: Contact proportion of cutting edge with work piece per revolution
rand: A vector distributed randomly in [0, 1]
R_a, R_a(at): Value of surface roughness and attainable surface roughness (um)
S_p: Sale price of the product ($)
t_m, t_s, t_ct: Machining time, setup time, and tool changing time (min)
Teacher: The best learner
T_F: Teaching factor
T, T_u: Tool life (min) and unit production time (min)
t_max: Maximum number of algorithm iterations
u_F: Influence exponent of axial depth of cut on cutting force
v, v_hb, v_opt: Cutting velocity, cutting velocity recommended by handbook, and optimum cutting velocity (m/min)
w: Tool wear exponent
w_F: Influence exponent of spindle speed on cutting force
X_new,i: ith learner
x_F: Influence exponent of radial depth of cut on cutting force
y_F: Influence exponent of feed per tooth on cutting force
z: Number of tool teeth
Omega: Feasible region
epsilon: Arbitrarily small real number
lambda: Constant between zero and one
xi_t: Random vector sequence in probability space
xi*: One convergence point with specific probability
xi_0: One convergence point with probability 1.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
The authors wish to acknowledge the financial support for this work from the National Natural Science Foundation of China (51575442 and 61402361) and the Shaanxi Province Education Department (11JS074).
References
[1] R. Gupta, J. L. Batra, and G. K. Lal, "Determination of optimal subdivision of depth of cut in multipass turning with constraints," International Journal of Production Research, vol. 33, no. 9, pp. 2555–2565, 1995.
[2] J. Wang, T. Kuriyagawa, X. P. Wei, and D. M. Guo, "Optimization of cutting conditions for single pass turning operations using a deterministic approach," International Journal of Machine Tools & Manufacture, vol. 42, no. 9, pp. 1023–1033, 2002.
[3] Y. C. Shin and Y. S. Joo, "Optimization of machining conditions with practical constraints," International Journal of Production Research, vol. 30, no. 12, pp. 2907–2919, 1992.
[4] P. G. Petropoulos, "Optimal selection of machining rate variables by geometric programming," International Journal of Production Research, vol. 11, no. 4, pp. 305–314, 1973.
[5] M. S. Shunmugam, S. V. Bhaskara Reddy, and T. T. Narendran, "Selection of optimal conditions in multi-pass face-milling using a genetic algorithm," International Journal of Machine Tools & Manufacture, vol. 40, no. 3, pp. 401–414, 2000.
[6] S. Li, Y. L. Liu, Y. Li, R. G. Landers, and L. Tang, "Process planning optimization for parallel drilling of blind holes using a two phase genetic algorithm," Journal of Intelligent Manufacturing, vol. 24, no. 4, pp. 791–804, 2013.
[7] A. Krimpenis and G.-C. Vosniakos, "Rough milling optimisation for parts with sculptured surfaces using genetic algorithms in a Stackelberg game," Journal of Intelligent Manufacturing, vol. 20, no. 4, pp. 447–461, 2009.
[8] Y. Liu and C. Wang, "A modified genetic algorithm based optimisation of milling parameters," International Journal of Advanced Manufacturing Technology, vol. 15, no. 11, pp. 796–799, 1999.
[9] Z. G. Wang, M. Rahman, Y. S. Wong, and J. Sun, "Optimization of multi-pass milling using parallel genetic algorithm and parallel genetic simulated annealing," International Journal of Machine Tools & Manufacture, vol. 45, no. 15, pp. 1726–1734, 2005.
[10] H. Oktem, "An integrated study of surface roughness for modelling and optimization of cutting parameters during end milling operation," International Journal of Advanced Manufacturing Technology, vol. 43, no. 9-10, pp. 852–861, 2009.
[11] S. Li, Y. Li, Y. Liu, and Y. Xu, "A GA-based NN approach for makespan estimation," Applied Mathematics and Computation, vol. 185, no. 2, pp. 1003–1014, 2007.
[12] C. A. C. Antonio, C. F. Castro, and J. P. Davim, "Optimisation of multi-pass cutting parameters in face-milling based on genetic search," International Journal of Advanced Manufacturing Technology, vol. 44, no. 11-12, pp. 1106–1115, 2009.
[13] A. Zhou, B.-Y. Qu, H. Li, S.-Z. Zhao, P. N. Suganthan, and Q. Zhang, "Multiobjective evolutionary algorithms: a survey of the state of the art," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 32–49, 2011.
[14] O. Zarei, M. Fesanghary, B. Farshi, R. J. Saffar, and M. R. Razfar, "Optimization of multi-pass face-milling via harmony search algorithm," Journal of Materials Processing Technology, vol. 209, no. 5, pp. 2386–2392, 2009.
[15] R. A. Mahdavinejad, N. Khani, and M. M. S. Fakhrabadi, "Optimization of milling parameters using artificial neural network and artificial immune system," Journal of Mechanical Science and Technology, vol. 26, no. 12, pp. 4097–4104, 2012.
[16] J. F. Briceno, H. El-Mounayri, and S. Mukhopadhyay, "Selecting an artificial neural network for efficient modeling and accurate simulation of the milling process," International Journal of Machine Tools & Manufacture, vol. 42, no. 6, pp. 663–674, 2002.
[17] R. Venkata Rao and P. J. Pawar, "Parameter optimization of a multi-pass milling process using non-traditional optimization algorithms," Applied Soft Computing Journal, vol. 10, no. 2, pp. 445–456, 2010.
[18] G. C. Onwubolu, "Performance-based optimization of multi-pass face milling operations using Tribes," International Journal of Machine Tools and Manufacture, vol. 46, no. 7-8, pp. 717–727, 2006.
[19] R. V. Rao, V. J. Savsani, and D. P. Vakharia, "Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems," Information Sciences, vol. 183, no. 1, pp. 1–15, 2012.
[20] F. Zou, L. Wang, X. Hei, D. Chen, and D. Yang, "Teaching-learning-based optimization with dynamic group strategy for global optimization," Information Sciences, vol. 273, pp. 112–131, 2014.
[21] L. Wang, F. Zou, X. Hei, D. Yang, D. Chen, and Q. Jiang, "An improved teaching-learning-based optimization with neighborhood search for applications of ANN," Neurocomputing, vol. 143, pp. 231–247, 2014.
[22] R. V. Rao and V. Patel, "An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems," International Journal of Industrial Engineering Computations, vol. 3, no. 4, pp. 535–560, 2012.
[23] F. Zou, L. Wang, D. Chen, and X. Hei, "An improved teaching-learning-based optimization with differential learning and its application," Mathematical Problems in Engineering, vol. 2015, Article ID 754562, 19 pages, 2015.
[24] A. Rıza Yildiz, "A novel hybrid immune algorithm for global optimization in design and manufacturing," Robotics and Computer-Integrated Manufacturing, vol. 25, no. 2, pp. 261–270, 2009.
[25] M. Tolouei-Rad and I. M. Bidhendi, "On the optimization of machining parameters for milling operations," International Journal of Machine Tools & Manufacture, vol. 37, no. 1, pp. 1–16, 1997.
[26] Machinability Data Center, Machining Data Handbook, vol. 1, Machinability Data Center, 3rd edition, 1980.
[27] A. Askarzadeh, "Bird mating optimizer: an optimization algorithm inspired by bird mating strategies," Communications in Nonlinear Science and Numerical Simulation, vol. 19, no. 4, pp. 1213–1228, 2014.
[28] D. Sarkar and J. M. Modak, "Pareto-optimal solutions for multi-objective optimization of fed-batch bioreactors using nondominated sorting genetic algorithm," Chemical Engineering Science, vol. 60, no. 2, pp. 481–492, 2005.
[29] Y. Wang, Z. Cai, and Q. Zhang, "Differential evolution with composite trial vector generation strategies and control parameters," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, pp. 55–66, 2011.
[30] K. Deb, "An efficient constraint handling method for genetic algorithms," Computer Methods in Applied Mechanics and Engineering, vol. 186, no. 2-4, pp. 311–338, 2000.