Research of Multi-objective Optimization Based on Hybrid Genetic Algorithm

JIANG Hua
Computer & Control College, Guilin University of Electronic Technology, Guilin, China
e-mail: jianghua@guet.edu.cn

XU GuiLin
Principal's and Executive Office, Guangxi Mech. & Elec. Industry School, Nanning, China
e-mail: xgl6476@126.com

DENG Zhenrong
Computer & Control College, Guilin University of Electronic Technology, Guilin, China
e-mail: zhrdeng@guet.edu.cn
Abstract: In the process of solving for the multi-objective Pareto solution set, the global search ability and the convergence characteristics can be reinforced by self-adaptive adjustment of the mutation probability during population evolution. Compared with the typical hybrid genetic algorithm, more effective convergence to the optimum can be obtained with the improved hybrid genetic algorithm. Numerical simulations based on some typical examples demonstrate the effectiveness of the proposed method.

Key words: hybrid genetic algorithm, Pareto solution, multi-objective optimization
I. INTRODUCTION
Multi-objective optimization problems often arise in practical engineering applications. The different objective functions cannot be compared directly and may even conflict with one another, so the optimization process must consider many objective functions simultaneously. The genetic algorithm is a type of global optimization algorithm developed in recent years; it is widely used in all kinds of complicated optimization problems because of its implicit parallelism, randomness, and high robustness. At present, genetic-algorithm methods applied to multi-objective optimization mainly divide into three types: converting the multiple objectives into a single objective, non-Pareto methods, and Pareto methods. The traditional method of converting multiple objectives into a single objective is simple in algorithm design and high in computational efficiency, but it can produce only one efficient solution. Non-Pareto methods can produce many efficient solutions, but these solutions usually concentrate at the ends of the efficient frontier. Optimization methods based on the Pareto approach map the objective values directly to the fitness function; by comparing the dominance relations of the objective values, the set of efficient solutions is found. This type of method can obtain many solutions in one computation, but the algorithm is complicated and its efficiency is low for larger population dimensions. In this paper, a class of multi-objective optimization methods based on a hybrid genetic algorithm is proposed to obtain Pareto solutions along the Pareto frontier.
II. THE DESCRIPTION OF THE OPTIMIZATION
PROBLEM

Consider the general multi-objective minimization problem:

Minimize f(X) = (f1(X), f2(X), …, fm(X))    (1)
s.t.  gi(X) ≤ 0,  i = 1, …, p

where
X = (x1, x2, …, xn)^T ∈ R^n    the n-dimensional decision vector
gi(X)    the constraint conditions
fi(X)    the sub-objective functions

The range of values that the constraint conditions allow for the decision vector is called the "feasible region" Ω.
In the study of multi-objective optimization, the Pareto optimal solution (or efficient solution) is the most basic concept. If X* ∈ Ω and there exists no X ∈ Ω such that
2009 Fifth International Joint Conference on INC, IMS and IDC
978-0-7695-3769-6/09 $26.00 © 2009 IEEE
DOI 10.1109/NCM.2009.136
f(X) ≤ f(X*) with at least one objective value in f(X) strictly smaller than the corresponding objective value in f(X*), then X* is a Pareto optimal solution of problem (1) (an efficient solution).
III. HYBRID GENETIC ALGORITHM AND ITS IMPROVEMENT

The simple genetic algorithm is prone to premature convergence in the early stages of evolution; moreover, the decline of population diversity makes convergence poor later in the evolution. Many improvements to these limitations already exist, such as the hierarchical genetic algorithm, the self-adaptive genetic algorithm, the genetic algorithm based on the niche technique, and the hybrid genetic algorithm, which merges other optimization methods into the evolution process. The hybrid genetic algorithm exploits the mutual complementarity of its components not only in algorithm construction but also in search mechanism and evolutionary strategy, and it provides a good approach to solving high-dimensional complicated optimization problems.
This paper adopts a hybrid strategy of the genetic algorithm and the simplex method (GASM). A simulated annealing mechanism is introduced to self-adaptively adjust the population's mutation probability as evolution proceeds. The genetic algorithm has global search ability, but its convergence near the optimum point is commonly poor, yielding only an approximate optimum. The simplex method has very strong local search ability and high search efficiency, but it easily falls into a local optimum because of its sensitivity to the initial value. Therefore the genetic algorithm first searches for an approximate optimum point over the whole range, and this point then acts as the initial value for a simplex search in the local range; combining the two methods can greatly enhance the convergence speed. The simulated annealing algorithm has the characteristic of probabilistic jumps: during evolution it accepts not only individuals with good fitness but also, with a definite probability, individuals with bad fitness. When the simulated annealing mechanism is introduced, falling into a local optimum can be avoided, enhancing the reliability of the global optimum solution.
The improvements to the GASM algorithm are as follows:

(1) Search mechanism
Each time the genetic algorithm finds an extremum point, a local search begins, taking that point as the initial value of the simplex method. If a better extremum point is found, a new propagation population is created taking the new extremum point as the reference point; otherwise the population is kept unchanged.
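The search mechanism above can be sketched as follows. This is a minimal illustration, not the authors' code: a coordinate-wise local search stands in for the simplex method, and the Gaussian spread used to rebuild the population is an assumed parameter.

```python
import random

def local_refine(f, x, step=0.1, iters=50):
    """Coordinate-wise local search, used here as a stand-in for the
    simplex method's local search (an assumption for illustration)."""
    best, fbest = list(x), f(x)
    for _ in range(iters):
        improved = False
        for i in range(len(best)):
            for d in (-step, step):
                cand = list(best)
                cand[i] += d
                fc = f(cand)
                if fc < fbest:
                    best, fbest, improved = cand, fc, True
        if not improved:
            step *= 0.5  # shrink the step, as a simplex contraction would

    return best

def regenerate(f, pop):
    """If local search improves on the population's best point, rebuild the
    population around the new extremum point; otherwise keep it unchanged."""
    ga_best = min(pop, key=f)
    refined = local_refine(f, ga_best)
    if f(refined) < f(ga_best):
        return [[xi + random.gauss(0, 0.2) for xi in refined] for _ in pop]
    return pop
```

On a simple convex function the refinement drives the best point toward the minimum and the population is resampled around it, which is the restart behavior the paper describes.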
(2) Crossover operation
The propagation population is divided into two parents, f1 and f2. Floating-point crossover creates child offspring equal in number to the parent offspring:

child1 = (1 − λ) f1 + λ f2
child2 = λ f1 + (1 − λ) f2

where λ is uniformly distributed in the range [0, 1].
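The floating-point crossover can be written directly from the two formulas above; the children are complementary convex combinations of the parents, so each coordinate pair sums to the corresponding parents' sum.

```python
import random

def float_crossover(p1, p2):
    """Floating-point crossover: child1 = (1-λ)p1 + λp2,
    child2 = λp1 + (1-λ)p2, with λ uniform on [0, 1]."""
    lam = random.random()
    child1 = [(1 - lam) * a + lam * b for a, b in zip(p1, p2)]
    child2 = [lam * a + (1 - lam) * b for a, b in zip(p1, p2)]
    return child1, child2
```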
(3) Mutation operation
The child offspring then undergo mutation, with the mutation probability decided by

Pm = min{1, exp(−(fm − favg) / fm)}

where fm is the fitness of the best individual in the child offspring and favg is the average fitness of the child offspring. The closer fm is to favg, the more mature the population is, and premature convergence is easily induced; it is avoided by enlarging the mutation probability Pm to produce new individuals. The mutation probability and the population maturity thus vary in opposite directions, which realizes the population's self-adaptive adjustment process.
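The adaptive formula for Pm is a one-liner; the sketch below shows the behavior the paragraph describes, with fm treated as the best (largest) fitness so that Pm rises toward 1 as the population matures.

```python
import math

def mutation_probability(f_best, f_avg):
    """Pm = min{1, exp(-(f_best - f_avg) / f_best)}: when the best fitness
    approaches the average fitness (a mature population), Pm approaches 1,
    enlarging the mutation rate to fight premature convergence."""
    return min(1.0, math.exp(-(f_best - f_avg) / f_best))
```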
The mutation operator is

X' = X + ηξ

where X is the gene to be mutated, ξ is a random perturbation obeying a Cauchy distribution, and η is used to adjust the mutation amplitude.
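The mutation operator can be sketched as follows. Reading the source's distribution as the Cauchy distribution is an assumption; the sample is drawn by the inverse-transform identity tan(π(u − 1/2)) for u uniform on (0, 1).

```python
import math
import random

def cauchy_mutate(x, eta=0.5):
    """X' = X + η·ξ, with ξ drawn from a standard Cauchy distribution
    (assumed interpretation of the source) and η the mutation amplitude."""
    return [xi + eta * math.tan(math.pi * (random.random() - 0.5))
            for xi in x]
```

The heavy tails of the Cauchy distribution occasionally produce large jumps, which helps mutated individuals escape local optima.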
(4) Selection operation
To enhance population diversity, the parent offspring, the crossover offspring, and the mutation offspring are merged into popnew, on which selection is then performed. To ensure that good individuals pass to the next generation, an individual whose fitness is better than the average is reproduced directly into the next generation. The other individuals, whose fitness is worse than the average, are reproduced into the next generation with probability Ps, defined by equation (4):

Ps = min{1, exp(−(fi − favg) / Tk)}    (4)

where fi is the fitness of the individual, favg is the average fitness, and Tk is the annealing temperature. The lower the annealing temperature, the smaller the probability of accepting such an individual. The non-convergence that derives from enlarging Pm can therefore be avoided.
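Equation (4) is a Metropolis-style acceptance rule. The sketch below assumes fitness is being minimized (as in the minimax fitness of Section IV), so a worse-than-average individual has fi > favg and survives with a probability that shrinks as Tk drops.

```python
import math

def accept_probability(f_i, f_avg, T_k):
    """Equation (4): Ps = min{1, exp(-(f_i - f_avg) / T_k)}.
    Better-than-average individuals (f_i <= f_avg, minimization) get
    Ps = 1; worse ones survive with a temperature-dependent probability."""
    return min(1.0, math.exp(-(f_i - f_avg) / T_k))
```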
After the selection operation, if the next population is larger than the former population, we discard the individuals with the lowest fitness values; otherwise we randomly generate new individuals and add them to the new population.
(5) Cyclic accelerated operation
For multi-dimensional complicated problems, the method above still needs many generations and large populations to converge to the optimum result. In order to reduce the number of cycles, two optimization processes are used, the result of the first serving as the initial value of the second, so that the generations required by the two processes can be reduced efficiently.
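The cyclic acceleration amounts to chaining optimization passes, each warm-started from the previous result; a minimal sketch (the `optimize` callable is a placeholder for one full GASM run):

```python
def cyclic_accelerate(optimize, x0, passes=2):
    """Run the optimizer repeatedly, feeding each result back in as the
    next pass's initial value, so each pass can use fewer generations."""
    x = x0
    for _ in range(passes):
        x = optimize(x)
    return x
```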
IV. EVOLUTIONARY MULTI-OBJECTIVE OPTIMIZATION ALGORITHM AND ITS REALIZATION

The method described above is an effective and rapid method for solving single-objective optimization problems. This section presents a method for obtaining Pareto optimal solutions of multi-objective problems by combining the hybrid algorithm above with the minimax method of multi-objective optimization. The detailed process is as follows.
(1) Set the initial temperature T0, the population size popsize, the maximum number of generations generation, k = 0, and m weights w1, w2, …, wm, where wi ∈ [0, 1] and Σ wi = 1. Taking

max_i { wi · f~i }

as the fitness function, the multi-objective minimization problem is transformed into the single-objective minimization problem

min max_i { wi · f~i }

where f~i is the value of the objective function fi after transformation by an empirical expression.
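The weighted minimax scalarization can be sketched directly; minimizing this fitness for different random weight vectors steers the search toward different parts of the Pareto frontier (the normalization f~i is omitted here and the raw objective values are used, an assumption for illustration).

```python
def minimax_fitness(weights, objectives, x):
    """Scalar fitness max_i{ w_i * f_i(X) } for a list of objective
    callables; minimizing it yields one Pareto candidate per weight
    vector."""
    return max(w * f(x) for w, f in zip(weights, objectives))
```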
(2) The minimum-fitness point of the propagation population is used as the initial point of the simplex local search. Let Xmin be that point and f~min its fitness, and let Xopt be the optimum point found by the simplex search, with fitness f~opt. If f~opt < f~min, then set Xmin = Xopt and f~min = f~opt. A new propagation population is produced using Xopt as the reference point: Xnew = η·Xopt, where η is uniformly distributed in [−1, 1].
(3) The crossover, mutation, and selection operations are carried out according to the improved hybrid genetic algorithm described above, with the propagation population as the object of the operations.
(4) Judge whether the convergence condition is satisfied: T < Tstop or k > generation. If it is, the program terminates and Xopt and each objective function value are output. Otherwise set k = k + 1, cool the temperature by T = T0 / log(1 + k), and return to step (2).
In the above algorithm, because the weight values in [0, 1] are random, a single run of repeated computations can obtain effective solutions on the Pareto frontier in different directions.
V. EXAMPLES

Two typical examples are presented to demonstrate the effectiveness of the proposed method. The software was implemented in Matlab 7.0.
min { f1 = 1/(x1^2 + x2^2 + 1),  f2 = x1^2 + 3·x2^2 + 1 }
s.t.  x1 ∈ [−3, 3],  x2 ∈ [−5, 5]
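The example's two objectives appear to be f1 = 1/(x1^2 + x2^2 + 1) and f2 = x1^2 + 3·x2^2 + 1 (a reconstruction from the garbled source text); a minimal evaluation sketch:

```python
def f1(x1, x2):
    # f1 = 1 / (x1^2 + x2^2 + 1); minimizing it pushes the point away
    # from the origin, in conflict with f2
    return 1.0 / (x1 ** 2 + x2 ** 2 + 1.0)

def f2(x1, x2):
    # f2 = x1^2 + 3*x2^2 + 1; minimizing it pulls the point to the origin
    return x1 ** 2 + 3.0 * x2 ** 2 + 1.0
```

At the origin both objectives equal 1, matching the first row of Table 1, and the efficient set has x2 = 0, consistent with the table's solutions.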
The population size is 10, the number of generations is 20, and the initial temperature is 200; the computation is run 100 times. The first 10 results are shown in Table 1.
TABLE 1. COMPUTED RESULTS

k      x1        x2        f1       f2
1     -0.0000   -0.0000   1.0000   1.0000
2     -0.3088   -0.0000   0.9655   1.0955
3      0.8976    0.0000   0.5566   1.7980
4     -1.1900    0.0000   0.4328   2.4152
5     -0.4533    0.0000   0.8370   1.2056
6      0.7456    0.0000   0.6443   1.5643
7      0.6242   -0.0000   0.7197   1.3890
8     -1.3408    0.0000   0.3577   2.7892
9      0.9477   -0.0000   0.5256   1.8945
10    -1.5555   -0.0000   0.2944   3.4171
Based on the above results, figure 2 and figure 3 indicate that the algorithm can obtain a group of effective solutions along the Pareto frontier in different directions. The computing time is related to the population size, the number of generations, and the initial temperature. Taking this problem as an example, each run takes no more than 0.1 second when the population size is 10, the number of generations is 20, and the initial temperature is 100.
VI. CONCLUSION
An improved hybrid genetic algorithm with quick convergence is presented in this paper. In the process of solving for the multi-objective Pareto solution set, the global search ability and the convergence characteristics are reinforced by self-adaptive adjustment of the mutation probability during population evolution. The design of the algorithm is simple and does not need to consider the distribution of the sub-objective values. Compared with the typical hybrid genetic algorithm, more effective convergence to the optimum can be obtained with the improved hybrid genetic algorithm. Numerical simulation results based on typical examples demonstrate the effectiveness of the proposed method and provide a decision-making basis for designers, so the new algorithm can also be used to solve other engineering optimization problems.
ACKNOWLEDGMENT
This work is supported by the Natural Science Foundation of Guangxi No: