Contents lists available at ScienceDirect

Computers & Industrial Engineering 56 (2009) 1566-1576

journal homepage: www.elsevier.com/locate/caie

Multi-objective genetic local search algorithm using Kohonen's neural map

Mehrdad Hakimi-Asiabar a, Seyyed Hassan Ghodsypour a,*, Reza Kerachian b

a Department of Industrial Engineering, Amirkabir University of Technology, Tehran, Iran
b Center of Excellence for Engineering and Management of Infrastructures, Faculty of Civil Engineering, University of Tehran, Tehran, Iran

ARTICLE INFO

Article history:
Received 24 September 2007
Received in revised form 19 April 2008
Accepted 9 October 2008
Available online 15 October 2008

Keywords:
Multi-objective genetic local search
Self-organizing maps
Variable Neighborhood Search (VNS)
Multi-objective evolutionary algorithm
Learning
Multi-reservoir operation management

ABSTRACT

Genetic Algorithms (GAs) are population-based global search methods that can escape from local optima traps and find the global optima regions. However, near the optimum set their intensification process is often inaccurate. This is because the search strategy of GAs is completely probabilistic. With a random search near the optimum sets, there is a small probability of improving the current solution. Another drawback of GAs is genetic drift. The GA search process is a black box process: no one knows which region is being searched by the algorithm, and it is possible that GAs search only a small region of the feasible space. On the other hand, GAs usually do not use the existing information about the optimality regions found in past iterations.

In this paper, a new method called SOM-Based Multi-Objective GA (SBMOGA) is proposed to improve genetic diversity. In SBMOGA, a grid of neurons uses the concept of the learning rule of the Self-Organizing Map (SOM), supported by Variable Neighborhood Search (VNS), to learn from the genetic algorithm, improving both local and global search. SOM is a neural network which is capable of learning and can improve the efficiency of data processing algorithms. The VNS algorithm is developed to enhance the local search efficiency in Evolutionary Algorithms (EAs). The SOM uses a multi-objective learning rule based on Pareto dominance to train its neurons. The neurons gradually move toward better fitness areas along trajectories in the feasible space. The knowledge of the optimum front in past generations is saved in the form of trajectories. The final state of the neurons determines a set of new solutions that can be regarded as the probability density distribution function of the high fitness areas in the multi-objective space. The new set of solutions can potentially improve the GA's overall efficiency. In the last section of this paper, the applicability of the proposed algorithm is examined in developing optimal policies for a real world multi-objective multi-reservoir system, which is a non-linear, non-convex, multi-objective optimization problem.

© 2008 Elsevier Ltd. All rights reserved.

1. Introduction

Evolutionary Algorithms (EAs) are probabilistic search optimization techniques, which have been developed based on Darwin's principles of natural selection and survival of the fittest individuals in a population. EAs use computational models of evolutionary processes as key elements in the design and implementation of computer-based problem solving systems (Cordón, Moya, & Zarco, 2002; Goldberg, 1989).

There are a variety of evolutionary computational models. There have been four well defined EAs, which have served as the basis for most of the activities in the field of evolutionary computation: Genetic Algorithms (GAs) (Holland, 1975; Michalewicz, 1996), Evolution Strategies (Schwefel, 1975, 1981, 1995), Genetic Programming (GP) (Koza, 1992) and Evolutionary Programming (EP) (Fogel, 1991). Genetic algorithms have been utilized in different fields of engineering much more than other forms of EAs.

Initial developments in evolutionary optimization models focused on single objective applications. In the past two decades, several multi-objective EAs such as the Vector Evaluated Genetic Algorithm (VEGA) (Schaffer, 1984) and Non-dominated Sorted Genetic Algorithms (NSGA) (Srinivas & Deb, 1994) have been proposed. These early EAs often performed poorly considering two key parameters: convergence rate and diversity. Recent algorithms like the Strength Pareto Evolutionary Algorithm (SPEA) (Zitzler & Thiele, 1999) and NSGA-II (Deb, Pratap, Agarwal, & Meyarivan, 2002) perform better; though, they still suffer from similar deficiencies.

Deb (2001) and Van Veldhuizen and Lamont (2000) presented comprehensive reviews and classification of the most important approaches to genetic algorithms for multi-objective optimization. Lately, Konak, Coit, and Smith (2006) presented an overview and tutorial describing GAs developed for problems with multiple objectives. They concluded that these methods differ primarily
M. Hakimi-Asiabar et al. / Computers & Industrial Engineering 56 (2009) 1566-1576

from traditional GA by using specialized fitness functions and
introducing methods to promote solution diversity.
Many real-world problems do not satisfy necessary conditions such as continuity, differentiability, and convexity. Therefore, they cannot be easily solved using traditional gradient-based optimization techniques. GAs have been considered a practical optimization tool for many problem classes such as discontinuous multi-modal objective functions, combinatorial problems (with discrete, continuous or integer design variables), dynamic problems, and severely nonlinear, non-differentiable, non-convex design spaces.
Another advantage of MOEAs is the definition of the Pareto front set within an acceptable computational time. Traditional multi-objective algorithms define one solution in each run. MOEAs usually attempt to generate (or closely approximate) the entire Pareto front in a single run and place emphasis on achieving solution diversity so as to avoid local optima (Rangarajan, Ravindran, & Reed, 2004). The advantages of GAs increasingly extend their applications. However, there are some drawbacks that limit their efficiency.
The traditional GAs' intensification process is not sufficiently accurate. GAs usually find the area of good fitness quite easily. However, finding the global optimal solution may be time-consuming and inaccurate. This is because the search strategy of GAs is probabilistic. In a probabilistic search process, when a chromosome is far from the local optima, there is a 50% chance that a random search direction will simultaneously improve all the objectives. However, when a point is close to the Pareto set, the size of the proper descent/ascent cone is extremely narrow and there is a small probability that a random update improves the objective functions (Brown & Smith, 2005). Thus, with a random search strategy, GAs generally require a great number of iterations and they converge slowly, especially in the neighborhood of the global optimum. With a randomized reproduction strategy in which the crossover points are determined randomly, the resulting children are created without regard to the existing information about high fitness regions. Therefore, the fitness of a child can deviate quite widely from the fitness of its parents.
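The narrowing of the improving cone near the Pareto set can be illustrated with a small Monte Carlo sketch (the two quadratic objectives and the test points below are illustrative assumptions, not the paper's problem):

```python
import math
import random

def improving_fraction(x, objectives, trials=20000, step=1e-3, seed=1):
    """Estimate the probability that a random search direction
    simultaneously improves all (maximized) objectives at point x."""
    rng = random.Random(seed)
    f0 = [f(x) for f in objectives]
    hits = 0
    for _ in range(trials):
        ang = rng.uniform(0.0, 2.0 * math.pi)        # random 2-D direction
        y = (x[0] + step * math.cos(ang), x[1] + step * math.sin(ang))
        if all(f(y) > f0i for f, f0i in zip(objectives, f0)):
            hits += 1
    return hits / trials

# two maximized objectives with optima at (0, 0) and (1, 0);
# their Pareto set is the segment between the two optima
objs = [lambda p: -(p[0] ** 2 + p[1] ** 2),
        lambda p: -((p[0] - 1.0) ** 2 + p[1] ** 2)]

p_far = improving_fraction((0.5, 5.0), objs)    # far from the Pareto set
p_near = improving_fraction((0.5, 0.01), objs)  # close to the Pareto set
# p_far comes out roughly 0.5 while p_near is close to zero
```

This mirrors the observation above: far from the optima nearly half of all random directions improve both objectives, while near the Pareto set the two gradients point almost oppositely and the improving cone collapses.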
Another drawback of the GAs is genetic drift. The GAs' exploration process is a black box, and the diversity information obtained from past generations is only implicitly and partially preserved in the current genome. This bears the risk of a regeneration of individuals that have already been seen in the search process. Even more problematic is the fact that the search can be negatively affected by genetic drift. As a consequence, big parts of the search space, potentially containing the global optimum, will never be explored. Thus, there is a need for consistent exploration techniques that do not repeat the same patterns in the mutation process and can also improve diversity over increasing genetic generations.
EAs produce vast amounts of data during an optimization run without sufficient usage of them. In each of the numerous generations, a large number of chromosomes is generated and evaluated (Drobics, Bodenhofer, & Winiwarter, 2001). This data can be used to produce valuable insight to enhance EAs' solution quality. GAs are completely probabilistic search-based optimization models because they do not use the knowledge aggregated about the optimality regions and searched areas in past iterations. Necessary requirements are, for example, processing incoming data such that it creates some useful information that incrementally improves the next generation population; new chromosomes should not be created by entirely probabilistic processes. It is possible to extract and use previously computed knowledge in next generations.

It can be concluded that there is room for improvement in the convergence rate and diversity of GAs. Thus, there is a need for new GAs whose exploitation is knowledge-oriented to accelerate the intensification process and which also have better diversification to avoid genetic drift.


In this paper, a background for developing a new GA-based algorithm is provided in the next section. In Section 3, the new algorithm is presented in detail. SBMOGA is developed based on some well known ideas such as the SOM learning rule and the VNS shaking process. The new algorithm can provide consistent diversity without repeated evaluations and a systematic local and variable neighborhood search. In Section 4, a complex real world problem, namely the multi-objective multi-reservoir operation management problem, is described and the optimization model formulation is presented. It is clear that this problem is non-convex and nonlinear. In Section 5, the results of applying the new algorithm to solve the multi-reservoir operation problem are shown. In the last section, conclusions and future research opportunities are presented.

2. Background
In the previous section, the main advantages and disadvantages of the traditional GA-based optimization models were described in detail. In this section, literature regarding models improving the traditional GAs is reviewed.

A variety of techniques for incorporating local search methods with EAs have been reported. These techniques include Genetic Local Search (Merz & Freisleben, 1999), Genetic Hybrids (Fleurent & Ferland, 1994), Random Multi-Start (Kernighan & Lin, 1970) and GRASP (Feo & Resende, 1989). Local search schemes such as gradient-based methods are efficient algorithms for refining arbitrary points in the search space into better solutions. Such algorithms are called local search algorithms because they define neighborhoods, typically based on initial "coarse" solutions. The term 'local search' generally is applied to methods that cannot escape these minima. Some hybridization schemes that will be used to develop the proposed algorithm are discussed below.
2.1. Hybrid local search GAs

Hybrid algorithms are a combination of two or more different techniques. Hybridization of local search and evolutionary algorithms has complementary advantages and combines the strengths of different approaches in order to overcome their weaknesses. Evolutionary algorithms have been successfully hybridized with other local search methods. Hybrid EAs have the local search power of traditional methods, so their accuracy is better than that of ordinary EAs. The methods also commonly take advantage of the good global search capabilities of evolutionary algorithms, so they are robust against getting stuck at local optima. The hybridization of genetic algorithms and local search methods, called genetic local search, has been applied to a variety of single-objective combinatorial problems. The role of the local search is to enhance the intensification process in the genetic search (Arroyo & Armentano, 2005).
Grosan and Abraham (2007) showed some possibilities for hybridization of an evolutionary algorithm and also presented some of the generic hybrid evolutionary architectures that have evolved during the last two decades. They also provided a review of some of the interesting hybrid frameworks.
The first Multi-Objective Genetic Local Search algorithm (MOGLS) was proposed by Ishibuchi and Murata (1998) and is called the IM-MOGLS algorithm (Arroyo & Armentano, 2005). An iteration of the IM-MOGLS algorithm starts with a population P with N solutions. Then the operators of selection, recombination and mutation are applied to the elements of P until reaching a population of N elite solutions. These solutions are recorded in the current nondominated set. Then N elite solutions are randomly selected from the current set of nondominated solutions (denoted by P0) and a restricted local search is applied to each solution in P0. In this

process, a limited number of random neighborhood searches around each solution x ∈ P0 is generated; if a neighbor's fitness is better than that of x, it replaces x, otherwise the local search initiated from x terminates. The number of neighborhoods examined from each x ∈ P0 is limited, and a new population P is formed to start a new generation. They show that their results are better than those obtained by the Vector Evaluated Genetic Algorithm (VEGA) proposed by Schaffer (1984).
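The restricted local search at the core of an IM-MOGLS iteration can be sketched as follows (a minimal single-solution illustration; the scalar fitness and the perturbation move are illustrative assumptions standing in for the algorithm's multi-objective fitness and neighborhood definition):

```python
import random

def restricted_local_search(x, fitness, neighbor, max_neighbors=30, rng=None):
    """Generate a limited number of random neighbors of x; replace x
    whenever a neighbor is better, terminate at the first failure."""
    rng = rng or random.Random(0)
    for _ in range(max_neighbors):
        cand = neighbor(x, rng)
        if fitness(cand) > fitness(x):   # maximization: accept improvement
            x = cand
        else:
            return x                     # first non-improving neighbor stops
    return x

# usage: maximize a concave fitness by small random moves
fit = lambda x: -(x - 3.0) ** 2
move = lambda x, rng: x + rng.uniform(-0.5, 0.5)
x_new = restricted_local_search(0.0, fit, move)
```

Because moves are accepted only when strictly better, the refined solution's fitness can never fall below that of the starting point, matching the replace-or-terminate rule described above.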

Jaszkiewicz (2002) proposed another MOGLS algorithm, which is called J-MOGLS. In this algorithm, each iteration starts by drawing random weights from a pre-specified set of weights to define a scalarizing function. Then, the best and distinct solution set (B) according to such a scalarizing function is selected from the current set (CS) to form a temporary population TP. Two randomly selected solutions of TP are recombined and generate an offspring which is submitted to a local search. If the solution resulting from the local search is better than the worst solution in TP, then it is included in CS and the set of the best nondominated solutions is updated. Recombination is the only component of genetic algorithms that J-MOGLS uses; it does not use the mutation operator. Jaszkiewicz applied his algorithm to several problems with two and three objectives and concluded that his method's quality of solutions is much better than that of the solutions generated by IM-MOGLS.
Arroyo and Armentano (2005) proposed a MOGLS algorithm with a structure identical to the IM-MOGLS algorithm. However, their components are quite different, i.e. in each iteration elite solutions from an archive are used to develop the offspring population. Then, a local search is conducted around some elite solutions. If there are some better solutions, they are inserted instead of previous dominated solutions. Then, based on the new results, both the population and the archived elite solutions will be revised. Their results were better than IM-MOGLS and very competitive with J-MOGLS.
Hakimi-Asiabar, Ghodsipour, Seifi, Kerachian, and O'Brien (2008) developed a hybrid multi-objective gradient-based search algorithm to solve non-differentiable multi-modal objective function problems. They showed their method's applicability using a real world multi-objective multi-reservoir operating policy definition.
2.2. Hybrid GA-SOM

One of the important techniques that can be found in the literature to improve GAs' efficiency is the use of Self-Organizing Maps (Kohonen, 1997). SOM is a learning algorithm that provides dimensionality reduction by compressing the data, via training, to a reasonable number of units (neurons) (Kohonen, 1997). SOM uses a network of neurons that manage to preserve the topology of the data space. The map consists of a grid of units that contain all significant information of the data set, while eliminating possible noise data, outliers or data faults. Adjacent units on the map structure correspond to similar data patterns, allowing regions of interest to be identified through various clustering techniques. SOM networks are considered to be capable of handling problems of large dimensionality. Applications of SOM include clustering, feature extraction or feature evaluation from the trained map (Rauber, 1999; Ultsch & Korus, 1995) and data mining (Drobics et al., 2001; Kohonen et al., 2000). SOM can achieve a detailed approximation of the probability density of the input (or output) distribution (Kubota, Yamakawa, & Horio, 2005). In addition, the probability density of "binary" input vectors can be approximated (Yamakawa, Horio, & Hiratsuka, 2002).
As mentioned before, GAs use a completely random search strategy in their processes. The search strategy can be made more intelligent if, in generating a new population, the information derived from past generations is considered. SOM is an appropriate tool to achieve this strategy. The neuron network lattice orients along the Pareto optimal set through learning, with neighboring neurons interpolating shared information along the Pareto optimal set. The SOM interpolation can adapt to straight or curved Pareto optimal sets.
Büche (2003) described a recombination operator that uses SOM to interpolate the parent population. The SOM renders itself easily as a recombination operator, which defines a lower dimensional interpolation of the parent population. Büche reported the SOM recombination advantages as follows: while most recombination operators recombine two parents, the neurons of the SOM interpolate a local subset of parents. Creating a simplex of neighboring neurons supports the transfer of information. This increases the possible amount of information to recombine.
Amor and Rettinger (2005) used the SOM to improve diversity and prevent premature convergence of genetic algorithms. They trained a SOM offline to learn the search space fitness, and then used it to keep diversity in the search space of a single objective problem. Kubota, Yamakawa, and Horio (2004, 2005) developed a strategy for reproduction of new seeds in single objective genetic algorithms using SOM to maintain genetic diversity. They proposed the reproduction strategy based on the SOM for both Bit-String GA and Real-Coded GA to maintain the genetic diversity of the population. In their method, the weight vectors after learning are employed as the chromosomes of the next generation. In other words, the population of the next generation is obtained using a learnt SOM.
2.3. Variable Neighborhood Search (VNS)

VNS (Hansen & Mladenović, 2001) is a local search algorithm that was developed to work with EAs. In VNS, by defining a neighborhood structure around elite solutions, a systematic search is performed to increase local search accuracy. In this algorithm, a set of K neighborhood structures with decreasing radius around elite solutions is defined (see Fig. 1); then, step by step, the neighborhood radius decreases according to the predefined structure and a search is performed for better solutions. This algorithm is an efficient method, especially for problems with non-smooth functions.
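The shrinking-neighborhood scheme can be sketched as follows (a minimal single-objective illustration; the radii, the non-smooth test objective, and the uniform neighborhood sampling are illustrative assumptions, not the paper's exact formulation):

```python
import random

def vns_refine(x, objective, radii, tries_per_radius=30, rng=None):
    """Variable Neighborhood Search sketch: refine an elite solution x
    by sampling K neighborhoods of decreasing radius around the
    current best point (minimization)."""
    rng = rng or random.Random(0)
    best, best_val = list(x), objective(x)
    for r in radii:                             # k = 1..K, shrinking radius
        for _ in range(tries_per_radius):
            cand = [xi + rng.uniform(-r, r) for xi in best]
            val = objective(cand)
            if val < best_val:                  # accept any improvement
                best, best_val = cand, val
    return best, best_val

# usage: a non-smooth objective with minimum 0 at (1, -2)
f = lambda p: abs(p[0] - 1.0) + abs(p[1] + 2.0)
sol, val = vns_refine([0.0, 0.0], f, radii=[1.0, 0.5, 0.1, 0.02])
```

The wide first neighborhood plays the role of shaking (coarse exploration around the elite point), while the later, smaller radii intensify the search near whatever the coarse phase found.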
In the next section, a new hybrid multi-objective genetic algorithm is developed by taking ideas from the SOM learning rule and its units' movement toward the high fitness regions. The VNS shaking search technique is utilized to improve intensification, to maintain diversity, and to implement intelligent random recombination toward elite solutions to locate the optimality regions.
3. The proposed algorithm

In this section, a new method, which is called SOM-Based Multi-Objective GA (SBMOGA), is developed to improve the efficiency of existing multi-objective genetic algorithms. The method can improve the convergence rate and diversity of GA solutions. The new
Fig. 1. A neighborhood structure around an elite solution.

algorithm improves the convergence rate such that it decreases the overall run time. The improved rate of convergence, and also the better diversity of solutions along the optimal front, increase the quality of solutions.

In the SBMOGA, first a set of weight vectors is randomly defined, such that they have uniform distributions in the feasible space, as SOM neuron centers. Then, in each generation of GA, the neurons will be trained using the best solutions in the current GA population (the first frontier; Deb & Goldberg, 1989) in terms of best multi-objective solutions. The multi-objective learning rule is adapted from the SOM learning rule introduced by Kohonen (1997) and Kubota et al. (2005). In the training process the neurons gradually move toward high fitness solutions and, with their stochastic movements, they can find new good solutions. If the current location of any neuron is a local or global optimum, it does not move to other points, because a new point's fitness will be dominated by the fitness of the current neuron centers. This process can refine the quality of the multi-objective genetic algorithm solutions by processing the generated data and extracting the probability density of the optimal front distribution.
SOM consists of an input and a competitive layer that include T and M units, respectively (Fig. 2), in which T is the input vector length and M is the number of SOM's neurons. The jth unit in the competitive layer is connected to all units in the input layer by the weight vector W_j = [w_j1, ..., w_jk, ..., w_jT], j = 1, 2, 3, ..., M. In the learning process, the weight vector is continuously updated toward the high fitness input vectors using the learning rule. R_i = [R_i1, ..., R_ik, ..., R_iT] is the ith non-dominated solution in the current population.
After training a neuron's weight vector using a new high fitness chromosome, new weight vectors for the neuron centers will be defined. Then the fitness of the new unit centers will be calculated. If the new vector's fitness value f(W_j^{n+1}) dominates f(W_j^n) based on Pareto dominance, then W_j^n is replaced by W_j^{n+1}; otherwise, the past weight vector W_j^n remains as the neuron's latest weight vector. When a neuron's center reaches a local/global optimum region during the next generations, it will not be changed, because the current weight vector's fitness values are not dominated by its neighborhood solutions (see Fig. 3a). The neuron centers' probabilistic movement toward high fitness chromosomes in the feasible space of the genetic algorithm helps to search new regions (exploration) by evaluating some points (see Fig. 4). Thus, those neurons that are located on a local optimum will remain at their positions, and those neurons that can move toward better fitness areas continue to move.

After definition of a local/global region, a local search in a neighborhood structure will be conducted with a gradually decreasing learning ratio. This process is the same as the variable neighborhood search method with a dynamic neighborhood structure, which provides better intensification in the local/global optima regions (see Fig. 3b). This process can continue until the neurons reach local/global optima.
3.1. The learning rule

Kohonen (1997) defined the training rule for the SOM as follows:

W_j^{n+1} = W_j^n + α(n) · (R − W_j^n)    (1)

where n represents a learning step and W_j^n and W_j^{n+1} are the weight vectors of the jth unit before and after updating, respectively. α(n) is the learning ratio, a monotonically decreasing function of the learning steps, and ||R − W_j^n|| is the distance between the input vector R and the weight vector W_j^n.

Kubota et al. (2005) defined the learning ratio in a single objective genetic algorithm as:

α(n) = f_R · h(f_R, d_j) · (1 − f_{W_j}(n))    (2)

where f_R and f_{W_j}(n) are the fitness values of the elite chromosome R and the neuron center's weight vector W_j^n, respectively, and d_j is the distance between the jth unit and the winner unit in the competitive layer. h(f_R, d_j) is a coefficient represented by:

h(f_R, d_j) = exp(−d_j / f_R)    (3)

Then they rewrote the learning rule as:

W_j^{n+1} = W_j^n + f_R · h(f_R, d_j) · (1 − f_{W_j}(n)) · (R − W_j^n)    (4)

This learning rule was developed to achieve a detailed approximation of the probability density of the input distribution by continuously updating the weight vectors for single-objective problems. They showed that this strategy can provide better results in a single objective genetic algorithm. Now, a revised learning rule is introduced for locating high fitness areas in multi-objective environments. The proposed learning rule for a multi-objective space is defined as:

W_j^{n+1}(t) = W_j^n(t) + y_j(t) · h(n) · (R_i(t) − W_j(t)),    t = 1, 2, ..., T    (5)

where:

y_j(t) = 1 if R(t) dominates W_j(t); 0 otherwise    (6)

and h(n) is the learning ratio in generation number n of the genetic algorithm. Domination in the multi-objective space is defined based on Pareto dominance.

In a maximization problem, when the GA's elite solution fitness value f_R is large and f_{W_j}(n) is small, then y_j(t) = 1 and the neuron is attracted toward the chromosome; conversely, when f_R is small and f_{W_j}(n) is large, then y_j(t) = 0. By using the y_j(t) factor, the weight vectors with low fitness values are attracted to the chromosomes with high fitness values.
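The update in Eqs. (5) and (6) amounts to moving a neuron toward an elite chromosome only when that chromosome Pareto-dominates it. A minimal sketch for a bi-objective maximization case (the toy objectives below are an illustrative assumption, not the paper's problem):

```python
def dominates(fa, fb):
    """Pareto dominance for maximization: fa dominates fb."""
    return all(a >= b for a, b in zip(fa, fb)) and any(a > b for a, b in zip(fa, fb))

def train_neuron(w, r, fitness, h):
    """Eq. (5)/(6): y_j = 1 and the neuron moves toward the elite
    chromosome r only if r Pareto-dominates the neuron's weights w;
    h is the learning ratio h(n) of the current generation."""
    y = 1.0 if dominates(fitness(r), fitness(w)) else 0.0
    return [wt + y * h * (rt - wt) for wt, rt in zip(w, r)]

# usage: two maximized objectives with ideal point (1, 2)
fit = lambda x: (-(x[0] - 1.0) ** 2, -(x[1] - 2.0) ** 2)
moved = train_neuron([0.0, 0.0], [1.0, 2.0], fit, h=0.5)   # -> [0.5, 1.0]
stays = train_neuron([1.0, 2.0], [0.0, 0.0], fit, h=0.5)   # not dominated: unchanged
```

The second call shows the stopping behavior described above: a neuron already sitting on a non-dominated point is not displaced, because y_j = 0 zeroes the update.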
3.2. The learning ratio

The learning ratio, or step size of learning, is defined based on both SOM and VNS. The SOM's learning ratio and the VNS neighborhood structure are monotonically decreasing. In SBMOGA, the learning step size is defined as a decreasing function of the GA's generation number. The learning ratio, or neighborhood distance in VNS, can be calculated from Eq. (7):

h(n) = k_1 − max(n − k_2, 0)/100    (7)

where k_1 is a constant for the initial learning ratio and k_2 is a threshold value to start decreasing the learning ratio.

Fig. 2. Input and output layers in SOM: the input layer contains new chromosomes from GA and the output layer contains the units (neurons) of SOM.
The proposed learning rule creates an intelligent probabilistic local search technique in multi-objective space. The movement trajectories of neurons are random, because the locations of the current population's first frontier solutions are not predefined and they are defined probabilistically (see Fig. 4). This method is also intelligent, because the searches move toward elite solutions. The step by step update using Eq. (5) can generate new weight vectors (chromosomes) which are different from the present chromosomes based on the distribution of the fitness values. This reproduction can preserve the genetic diversity and provide an effective search.

A grid of neurons with random starting weight vectors can introduce both diversity and local search accuracy by using a multiple parallel search schema and a shaking probabilistic trajectory-based search strategy (see Fig. 5). The learning neurons are parallel search trajectories that can define the Pareto optimal regions and their probability distribution functions.

Fig. 3. The shaking trajectory-based search concepts: (a) a neuron's center search along the elite solutions; dominated chromosomes are replaced with new ones in a trajectory. (b) Chromosome neighborhood search by monotonically decreasing learning ratio with non-dominated center in iterations n and n + 1. These conditions are the same as the VNS algorithm with dynamic neighborhood structure.

Fig. 4. Probabilistic search toward non-dominated (elite) solutions in each iteration with a dynamic decreasing learning (neighborhood) ratio.
3.2.1. The statement of SBMOGA

Step 0. Initialize the NSGA-II and SOM parameters. Set the crossover and mutation probabilities, the number of generations (Nog), the number of neurons (Nosl) and the learning rule parameters, and set k_2 = 1.
Step 1. Define the neurons' starting weight vectors (W_j^1, j = 1, 2, ..., Nosl) randomly, such that these weight vectors have a uniform distribution over the feasible search space. Then evaluate the weight vectors based on objective function values.
Step 2. Calculate the learning ratio using Eq. (7).
Step 3. Run the multi objective genetic algorithm and determine the population of chromosomes in the nth generation. Evaluate the current population and determine the first frontier chromosomes (elite solutions) of the current population (i = 1, ..., n_1).
Step 4. Train the neurons' last weight vectors (W_j^n, j = 1, 2, ..., Nosl) using the ith chromosome of the first frontier chromosomes (R_i) and the learning rule (Eq. (5)), and define the new weight vectors W_j^{n+1}.
Step 5. Calculate the fitness values of the new weight vectors W_j^{n+1}, j = 1, 2, ..., Nosl. If W_j^{n+1} dominates W_j^n based on Pareto dominance, then replace W_j^n with W_j^{n+1}.
Step 6. If i < n_1, then i = i + 1 and go to Step 4. If i = n_1 and k_2 < Nog, then k_2 = k_2 + 1 and go to Step 2. If i = n_1 and k_2 = Nog, go to Step 7.
Step 7. End.

Fig. 5. Parallel neurons that search high fitness areas in a multi-modal multi-objective search space.
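The control flow of Steps 0-7 can be condensed into a short runnable skeleton (a structural sketch only: the toy bi-objective fitness and the random stand-in for NSGA-II's first frontier are illustrative assumptions, not the paper's implementation):

```python
import random

def dominates(fa, fb):
    """Pareto dominance for maximization."""
    return all(a >= b for a, b in zip(fa, fb)) and any(a > b for a, b in zip(fa, fb))

def sbmoga_skeleton(n_generations=20, n_neurons=5, k1=0.9, seed=0):
    """Steps 0-7 of SBMOGA on a toy problem: each generation, every
    first-frontier elite trains every neuron via the Eq. (5) update,
    accepted only under Pareto dominance (Step 5)."""
    rng = random.Random(seed)
    fitness = lambda x: (-(x[0] - 1.0) ** 2, -(x[1] - 1.0) ** 2)
    # stand-in for one NSGA-II generation's elite (first-frontier) solutions
    first_frontier = lambda: [[rng.uniform(0.8, 1.2), rng.uniform(0.8, 1.2)]
                              for _ in range(3)]
    # Steps 0-1: uniformly random neuron weight vectors over the feasible space
    neurons = [[rng.uniform(0.0, 2.0), rng.uniform(0.0, 2.0)]
               for _ in range(n_neurons)]
    for k2 in range(1, n_generations + 1):
        h = max(k1 - max(k2 - 1, 0) / 100.0, 0.01)  # Step 2: Eq. (7)-style ratio
        for r in first_frontier():                   # Step 3
            for j, w in enumerate(neurons):          # Step 4: Eq. (5) update
                cand = [wt + h * (rt - wt) for wt, rt in zip(w, r)]
                if dominates(fitness(cand), fitness(w)):  # Step 5
                    neurons[j] = cand
    return neurons                                   # Step 7

neurons = sbmoga_skeleton()
```

Because every accepted move is a convex step toward an elite point and is taken only when it Pareto-dominates the old position, each neuron's fitness is non-decreasing over the run, which is the trajectory behavior the algorithm statement describes.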

Fig. 6 shows the relationship between the NSGA-II and SOM units' training algorithms. In this figure, P_r is the parent population, Q_r is the children population, F_j is the jth frontier population in the NSGA-II algorithm, and LP_r is the neuron centers population. In each iteration r, a subset of LP_r can be dominated by new neuron center weight values and is then replaced by them. On the other hand, the subset of LP_r which is not dominated by the new neurons' weight values remains. The content of the next neuron centers population, in iteration r + 1, is defined by these two subsets.

Fig. 6 shows the learning process of the SOM units population from the NSGA-II first frontier elite solutions: after each generation, some neurons move to better locations in terms of objective function quality and a new neuron population is created. The flowchart of the algorithm is shown in Fig. 7.

Fig. 6. A sketch of NSGA-II and its relation to SOM's neurons learning process.

4. Case study

The optimal operation of multi-purpose multi-reservoir systems is a complex real-world problem, and many advances in reservoir operation are cited in the literature. Labadie (2004) presented a state-of-the-art review of mathematical programming and heuristic methods for the optimal operation of multi-reservoir systems. He concluded that although few areas of application of optimization models have a richer or more diverse history than reservoir systems optimization, and opportunities for real-world applications are enormous, actual implementations remain limited or have not been sustained.

In this section, the applicability of the proposed algorithm is examined in developing operating policies for the Karoon-Dez multi-purpose multi-reservoir system. The Dez and Karoon reservoirs, with a total storage capacity of more than 6.4 billion cubic meters (BCM), form the most important reservoir system in southwestern Iran, close to the Persian Gulf (see Fig. 8). The system carries more than one-fifth of Iran's surface water supply (Karamouz & Mousavi, 2003). The reservoirs have been constructed on the Karoon and Dez Rivers, which join at a location called Band-e-Ghir, north of the city of Ahwaz, to form the Great Karoon River. The average annual inflows to the Dez and Karoon reservoirs are 8.5 and 13.1 BCM, respectively. The water downstream of the Karoon and Dez dams supplies domestic, industrial, agricultural, and agro-industrial demands. The total water demand downstream of the Dez and Karoon dams is estimated as 1.95 BCM, of which 42% is allocated downstream of the Dez dam (d_1t), 35% is allocated downstream of the Karoon dam between the Karoon reservoir and Band-e-Ghir (d_2t), and the rest goes downstream of Band-e-Ghir to the Persian Gulf (d_3t). There is also an environmental water demand of 0.62 BCM as in-stream flow in the Great Karoon River (d_4t) (see Fig. 9). The reservoirs have a hydropower generation capacity of 1.15 million megawatt hours (MWh) per month.
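The stated demand split can be checked with quick arithmetic. The 23% share for d_3t is the remainder implied by the text, not a figure given explicitly:

```python
total_demand = 1.95  # BCM, total demand downstream of the Dez and Karoon dams
shares = {
    "d1 (downstream of the Dez dam)": 0.42,
    "d2 (Karoon dam to Band-e-Ghir)": 0.35,
    "d3 (downstream of Band-e-Ghir)": 0.23,  # remainder implied by the text
}
# Print each site's implied demand volume in BCM.
for name, share in shares.items():
    print(f"{name}: {share * total_demand:.3f} BCM")
```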

4.1. Model formulation

In this study, a mathematical model for the monthly operation of the Karoon and Dez reservoirs is developed, considering the objectives of water supply to downstream demands and power generation. The decision variables of the optimization model are as follows:

R_it     Release from reservoir i in time period t
X_ijt    Percentage of the outflow from reservoir i allocated to water demand j in time period t (0 <= X_ijt <= 1)
lambda_jt  Satisfied portion of the jth demand location in period t

Other model variables are:

S_i^max    Maximum storage volume of the ith reservoir
S_i^min    Minimum storage volume of the ith reservoir
S_it       Storage volume of the ith reservoir in time period t
H_it       Water head elevation in the ith reservoir in time period t
I_it       Inflow to the ith river in time period t
r_1it      Release from the outlet of the hydropower plant of the ith reservoir in time period t
r_1i^max   Maximum outlet capacity of the hydropower plant of the ith reservoir
r_1i^min   Minimum water required to activate the hydropower plant of the ith reservoir
r_2it      Water release from the spillway of the ith reservoir
r_2i^max   Maximum spillway capacity of the ith reservoir
r_2i^min   Minimum spillway capacity of the ith reservoir
R_i^max    Maximum release for the ith reservoir
R_i^min    Minimum release for the ith reservoir
d_jt       jth water demand in time period t

4.2. Objective functions

• First objective function: minimizing unsatisfied water demand

\mathrm{Min}\, Z_1 = \sum_{j=1}^{3} \sum_{t=1}^{T} (d_{jt} - d_{jt}\lambda_{jt})^2 = \sum_{j=1}^{3} \sum_{t=1}^{T} d_{jt}^2 (1 - \lambda_{jt})^2    (8)

where

\lambda_{jt} \cdot d_{jt} - X_{ijt} \cdot R_{it} = 0    (9)

0 \le \lambda_{jt} \le 1    (10)

• Second objective function: maximizing power generation

\mathrm{Max}\, Z_2 = \sum_{t=1}^{T} \sum_{i=1}^{2} K_i \cdot e_i \cdot r_{1it} \cdot H_{it}(S_{it}, S_{i,t-1}, R_{it})    (11)

where

K_i    Energy transfer coefficient of the ith hydropower plant
e_i    Efficiency index of the ith hydropower plant
H_it   Mean value of the water head behind the ith reservoir whose storage is equal to S_it
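The two objective functions can be sketched in code. This is a minimal reading of Eqs. (8) and (11): the head term H_it is collapsed to a precomputed net head per reservoir and period, which is a simplification for illustration, and the instance data are hypothetical:

```python
def z1_unmet_demand(d, lam):
    """First objective (Eq. 8): squared unsatisfied demand, summed over
    demand sites j and time periods t: sum of (d_jt * (1 - lambda_jt))**2."""
    return sum((d[j][t] * (1.0 - lam[j][t])) ** 2
               for j in range(len(d)) for t in range(len(d[0])))

def z2_energy(K, e, r1, net_head):
    """Second objective (Eq. 11), with the head term collapsed to a
    precomputed net head per reservoir and period (a simplification)."""
    return sum(K[i] * e[i] * r1[i][t] * net_head[i][t]
               for i in range(len(K)) for t in range(len(r1[0])))

# Tiny hypothetical instance: 2 demand sites, 2 periods, 1 reservoir.
d = [[10.0, 12.0], [8.0, 9.0]]
lam = [[1.0, 0.5], [0.75, 1.0]]
print(z1_unmet_demand(d, lam))  # 0 + 36 + 4 + 0 = 40.0
print(z2_energy([1.0], [0.9], [[100.0]], [[40.0]]))  # 3600.0
```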
Fig. 7. The flowchart of SBMOGA.

Fig. 8. The Karoon and Dez river-reservoir system in southwestern Iran.

Fig. 9. A schematic diagram of the Karoon-Dez river-reservoir system.

Fig. 10. The reservoir parameters.

H_{it} is usually a non-linear function of the reservoir storage volume, which can be presented as:

H_{it} = a_1 \cdot S_{it}^2 + b_1 \cdot S_{it} + c_1    (12)

The tail-water height of the water released from reservoir i in time period t can be estimated as:

H_{it}^{tail} = a_2 \cdot (R_{it})^2 + b_2 \cdot R_{it} + c_2    (13)

The effective water head for producing hydropower energy can then be calculated as follows (see Fig. 10 for more details):

H_{it}^{e} = H_{it} - H_{it}^{tail}    (14)
Therefore, the second objective function can be written as follows:

\mathrm{Max}\, Z_2 = \sum_{t=1}^{T} \sum_{i=1}^{2} K_i \cdot e_i \cdot r_{1it} \cdot (H(S_{it}) - H(R_{it}))    (15)

The hydropower energy production is a nonlinear and non-convex function.

4.3. Optimization model constraints

1. Water storage capacity constraints

S_1^{min} \le S_{1t} \le S_1^{max},   t = 1, 2, 3, ..., T
S_2^{min} \le S_{2t} \le S_2^{max},   t = 1, 2, 3, ..., T

2. Water demand constraints

R_{1t} \cdot X_{11t} = \lambda_{1t} \cdot d_{1t},   0 \le X_{11t} \le 1
R_{2t} \cdot X_{22t} = \lambda_{2t} \cdot d_{2t},   0 \le X_{22t} \le 1
R_{1t} \cdot X_{13t} + R_{2t} \cdot X_{23t} = \lambda_{3t} \cdot d_{3t}
0 \le \lambda_{1t} \le 1,   0 \le \lambda_{2t} \le 1

In-stream flows: d_{4t} \ge 620

3. Continuity equations

S_{1,1} = S_{1,0},   S_{2,1} = S_{2,0}
S_{1,t+1} = S_{1t} - R_{1t} + I_{1t},   t = 1, 2, 3, ..., T
S_{2,t+1} = S_{2t} - R_{2t} + I_{2t},   t = 1, 2, 3, ..., T

4. Reservoirs' release constraints

0 \le R_{1t} \le R_1^{max},   0 \le R_{2t} \le R_2^{max}
R_{1t} = r_{11t} + r_{21t},   R_{2t} = r_{12t} + r_{22t},   t = 1, 2, 3, ..., T

5. Capacity of hydropower plant outlets

r_{11}^{min} \le r_{11t} \le r_{11}^{max}
r_{12}^{min} \le r_{12t} \le r_{12}^{max},   t = 1, 2, 3, ..., T

6. Spillway release capacity constraints

r_{21}^{min} \le r_{21t} \le r_{21}^{max}
r_{22}^{min} \le r_{22t} \le r_{22}^{max},   t = 1, 2, 3, ..., T

7. Water head and tail water definition constraints

H_{it} = a_1 \cdot S_{it}^2 + b_1 \cdot S_{it} + c_1
H_{it}^{tail} = a_2 \cdot (R_{it})^2 + b_2 \cdot R_{it} + c_2
H_{it}^{e} = H_{it} - H_{it}^{tail}

This problem is a non-linear, non-convex, multi-objective optimization problem.

4.4. Application of the new algorithm to solve the model

To solve the reservoir system problem, the algorithm's parameters and the constraint-handling method should first be defined appropriately. Constraint-handling schemes for GAs include penalty and repair methods (Chootinan & Chen, 2006). The repair method attempts to fix infeasible solutions by taking advantage of the problem's characteristics. It can be very effective if the relationship between decision variables and constraints can be easily characterized; however, developing a repair procedure is usually problem-dependent and time-consuming when the problem includes complex constraints. In this case study, the repair method is used for constraint handling in the solution process.

In the initialization of the GA, a set of probabilistic initial solutions is used. In the multi-objective view, the concept of Pareto dominance is used to assign fitness to the solutions. The learning-ratio parameters are defined as k_1 = 1.7 and k_2 = 10. Then, using Eq. (7), the learning ratio in generation n is defined as:

h(n) = 1 / (1.7 + max(10, n) / 100)

This formulation makes the learning ratio static during the first ten generations; it then decreases at a smooth, low rate, which provides a relatively high exploration rate.

The problem is formulated for a 30-year time horizon containing 360 monthly time steps. Therefore, each chromosome includes 720 variables for the release values R_it. The solution values of the variables X_ijt and lambda_jt are defined in an optimization process as dependent variables of R_it. The number of neurons is set to 42, with 42 stochastic vectors generated as the starting points of the neuron centers. Both objective functions must be minimized (the hydropower generation function is multiplied by -1).

5. Results and discussion

In this section, the results of the proposed model (NSGA-II with SOM-based learning) in developing operating policies for the Karoon-Dez river-reservoir system are presented. To evaluate the model's efficiency, its results are compared with those of the classical NSGA-II model. The problem was solved on a Pentium 4 personal computer with 512 MB of RAM and a 2.8 GHz CPU. Both algorithms were run with 20, 100, 500, and 1000 generations, and the results are presented in Tables 1-3. Table 1 shows the run time of the algorithms for the four defined generation levels. Table 2 shows the results of the 42 neurons trained on the best solutions of the NSGA-II algorithm for four different runs. The results show that the model's convergence is consistent with the number of generations, so that the solutions at higher generation counts are better than those at lower counts.

The data in Table 1 show that the run time of SBMOGA for the same number of generations is longer than that of NSGA-II. This is because adding the SOM training process to each generation of SBMOGA increases the computational complexity of a single generation.
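The learning-ratio schedule h(n) = 1/(1.7 + max(10, n)/100) of Section 4.4 can be sketched directly; the printed values simply illustrate the static-then-decaying behavior described in the text:

```python
def learning_ratio(n, k1=1.7, k2=10):
    """Learning ratio of Eq. (7) with k1 = 1.7 and k2 = 10:
    h(n) = 1 / (k1 + max(k2, n) / 100)."""
    return 1.0 / (k1 + max(k2, n) / 100.0)

# Static over the first ten generations (max(10, n) = 10), then a slow
# monotone decay as n grows.
for n in (1, 10, 100, 500, 1000):
    print(n, round(learning_ratio(n), 4))
```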

Table 1
The algorithms' run time for four levels of generations (minutes).

Number of generations    NSGA-II    SBMOGA
20                       0.38       0.65
100                      1.75       3.67
500                      9.35       19.3
1000                     18.50      35.9
Table 2
The objective function values (F1 and F2) obtained from the proposed model considering 42 neurons, for runs of 20, 100, 500, and 1000 generations.

However, this cost can be offset by the improved convergence rate of the new algorithm.

Fig. 11 gives a graphical comparison of the solutions derived by SBMOGA after 100 generations and 3.67 min of run time with the solutions derived by NSGA-II after 1000 generations and 18.50 min of run time (see Table 1). It can easily be seen that the results of SBMOGA after only 100 generations outperform the NSGA-II results after 1000 generations, while the run time of SBMOGA was only about 20% of that of NSGA-II (one of the fastest multi-objective GAs).

As shown in Table 3, the classical NSGA-II provides a Pareto front with 18, 18, 18, and 24 non-dominated solutions for the four defined categories. As can be seen in Table 3, some solutions of NSGA-II in each run are dominated by the corresponding SBMOGA solutions. In this table, the number of the SBMOGA solution that dominates an NSGA-II solution is written in the "Dominated by" column. For example, for 20 generations, neuron number 2 (see Table 2) dominates solutions number 2 and 3 of NSGA-II. It can therefore be concluded that, in this case study, the SBMOGA method improved the NSGA-II final results in all cases.

Some solutions derived by the SBMOGA method in 20 generations even dominate solutions of NSGA-II in 1000 generations.
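The "Dominated by" comparisons above reduce to the standard Pareto-dominance test for minimization. A minimal sketch follows; the sample (F1, F2) pairs are hypothetical values in the spirit of Tables 2 and 3, not entries copied from them:

```python
def dominates(a, b):
    """True if solution a Pareto-dominates solution b, with every objective
    minimized (the paper multiplies the power objective by -1)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Hypothetical (F1, F2) pairs: a trained neuron vs. an NSGA-II solution.
sbmoga_neuron = (352_873, 178_156)
nsga_solution = (416_132, 835_633)
print(dominates(sbmoga_neuron, nsga_solution))  # True
print(dominates(nsga_solution, sbmoga_neuron))  # False
```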

For example, neuron number 41 dominates solutions number 17, 18, 19, 20, 21, 22, 23, and 24 of NSGA-II in the case of 1000 generations. This also shows the higher convergence rate of SBMOGA in comparison with NSGA-II.

Another criterion for multi-objective solution quality is the diversity measure. The diversity measure is a number in the range (0, 1], where 1 corresponds to the best possible diversity and 0 to the worst. In this paper, a revised version of the diversity measure described by Khare (2002) is used. This new diversity measure does not need a reference set of solutions. To calculate the diversity measure, the following steps are required:

1. Define a grid of K^M cells in the objective space, in which M is the number of objectives and K is the number of grid cells in each dimension.

2. Calculate the following arrays:

\alpha(m, k) = 1 if the grid has a representative point in the range b(m, k-1) \le x < b(m, k), and 0 otherwise,

where b(m, k) is the grid edge at the kth step of dimension m.
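The formula of Eq. (16) is not legible in this copy, so the sketch below uses one plausible grid-occupancy reading: the fraction of the K^M cells that contain at least one solution. This is consistent with the (0, 1] range and the occupancy array above, but it is an assumption, not the paper's exact measure:

```python
def diversity_measure(points, K, bounds):
    """Grid-occupancy diversity (assumed reading of Eq. 16): the fraction of
    the K**M cells of a uniform grid over `bounds` that contain at least one
    solution."""
    M = len(bounds)
    occupied = set()
    for p in points:
        cell = []
        for m in range(M):
            lo, hi = bounds[m]
            k = int((p[m] - lo) / (hi - lo) * K)
            cell.append(min(k, K - 1))  # clamp points on the upper edge into the last cell
        occupied.add(tuple(cell))
    return len(occupied) / K ** M

# Three points on a 2x2 grid over the unit square occupy two distinct cells.
pts = [(0.1, 0.9), (0.2, 0.8), (0.9, 0.1)]
print(diversity_measure(pts, K=2, bounds=[(0.0, 1.0), (0.0, 1.0)]))  # 0.5
```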
Table 3
The Pareto front of the problem obtained using the NSGA-II model for 20, 100, 500, and 1000 generations. (For each run, the table lists the F1 and F2 values of the non-dominated solutions and, in the "Dominated by" column, the number of the SBMOGA neuron that dominates that solution.)

Fig. 11. Graphical comparison of the optimal front defined by SBMOGA after 100 generations and 3.67 min of run time and NSGA-II after 1000 generations and 18.5 min of run time.

for m = 1, 2, ..., M and k = 1, 2, ..., K.

3. Define the diversity measure as given by Eq. (16).

The diversity measure of SBMOGA calculated using Eq. (16) is equal to 0.4222, while the diversity index of NSGA-II is equal to 0.2444, on a 15 x 15 grid. In this analysis, the non-dominated solutions of SBMOGA at 100 generations and the non-dominated solutions of NSGA-II at 1000 generations are used. The results show more than a 42% improvement in the diversity measure.

6. Conclusions

In this paper, a new method called SOM-Based Multi-Objective GA (SBMOGA) is proposed to improve the effectiveness of a well-known GA-based multi-objective optimization model, namely NSGA-II. In the new algorithm, in each generation, a population of neurons is trained using elite solutions of the GA's current population. The training rule of the neurons is developed using the concepts of the SOM learning rule and the variable neighborhood search algorithm to improve both local and global search. Moving the neuron centers along stochastic shaking trajectories in the feasible search space provides better exploration and improves genetic diversity. The training ratio (or training step size) is a dynamic, monotonically decreasing function of the GA generation number. This provides a variable neighborhood search when a neuron reaches an optimal region. The neurons in the SBMOGA algorithm use the aggregated knowledge of optimum regions in past generations, and the new weight vectors of the neuron centers are defined as a function of the previous weight vectors. After the first few generations, the neurons are gradually attracted by local high-fitness regions, and the search is converted to an exploitation process based on the existing aggregated knowledge about the optimum regions. Random fluctuations of the current neuron weight vectors within the high-fitness areas can then produce better exploration.

The new method is not an independent algorithm. In this paper, it was linked with NSGA-II, and the model was applied to a real-world two-reservoir optimization problem. The results show that SBMOGA converges to the optimum front much more quickly than the classical NSGA-II and that it can reduce the complexity problems of EAs. The diversity of the non-dominated solutions of SBMOGA was also much better than that of NSGA-II.

The main advantages of the proposed algorithm can be summarized as follows:

(1) Development of a multi-objective learnable algorithm based on Kohonen's neural network.

(2) The movement of the SOM unit centers in the feasible search space toward the elite solutions of the GA supports an intelligent reproduction technique. As the locations of the GA's elite solutions are stochastic, this provides intelligent stochastic exploration.

(3) The knowledge of the optimum areas gathered in past generations of the algorithm is saved in the form of the neuron trajectories. This process minimizes the probability of re-evaluating solutions.

(4) The shaking process of VNS in local areas around the elite solutions provides better capability for diversity and exploitation and enhances the local search accuracy of GAs.

(5) A multi-objective learning rule for SOM is developed.

(6) For multi-modal objective functions, the algorithm is capable of finding local and global optima. This capability is improved by increasing the number of neurons.

(7) When a set of neurons concentrates on a region, it can indicate a local or global optimal area or a cluster of Pareto-optimal solutions.

(8) The final positions of the neurons can be considered an enhancement of the Pareto front provided by classical multi-objective genetic algorithms.

The area of future research

In this paper, the number of SOM neurons was considered fixed (42 in this case). It has been seen that the maximum number of solutions derived by the SOM is equal to the number of its neurons. A future area of research would be defining the optimal number of SOM units. The number of units could also be regarded as an adaptive parameter to improve the model's efficiency.

References

Amor, H. R., & Rettinger, A. (2005). Intelligent exploration for genetic algorithms. GECCO'05, June 25-29, 2005, Washington, DC, USA.
Arroyo, J. E. C., & Armentano, V. A. (2005). Genetic local search for multi-objective flowshop scheduling problems. European Journal of Operational Research, 167, 717-738.
Brown, M., & Smith, R. E. (2005). Directed multi-objective optimisation. International Journal of Computers, Systems and Signals, 6(1).
Büche, D. (2003). Multi-objective evolutionary optimization of gas turbine components. Ph.D. thesis, Swiss Federal Institute of Technology Zürich.
Chootinan, P., & Chen, A. (2006). Constraint handling in genetic algorithms using a gradient-based repair method. Computers & Operations Research, 33, 2263-2281.
Cordón, O., Moya, F., & Zarco, C. (2002). A new evolutionary algorithm combining simulated annealing and genetic programming for relevance feedback in fuzzy information retrieval systems. Soft Computing, 6, 308-319.
Deb, K. (2001). Multi-objective optimization using evolutionary algorithms. New York: Wiley.
Deb, K., Pratap, A., Agarwal, S., & Meyarivan, T. (2002). A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6, 182-197.
Drobics, M., Bodenhofer, U., & Winiwarter, W. (2001). Data mining using synergies between self-organizing maps and inductive learning of fuzzy rules. In Joint 9th IFSA world congress and 20th NAFIPS international conference.
Feo, T., & Resende, M. (1989). A probabilistic heuristic for a computationally difficult set covering problem. Operations Research Letters, 8, 67-71.
Fleurent, C., & Ferland, J. (1994). Genetic hybrids for the quadratic assignment problem. DIMACS Series in Discrete Mathematics and Theoretical Computer Science, 16.
Fogel, L. J. (1962). Autonomous automata. Industrial Research, 4, 14-19.
Fogel, D. B. (1991). System identification through simulated evolution: A machine learning approach. USA: Ginn Press.
Goldberg, D. E. (1989). Genetic algorithms in search, optimization & machine learning. Boston, MA: Addison-Wesley.
Grosan, C., & Abraham, A. (2007). Hybrid evolutionary algorithms: Methodologies, architectures and reviews. Studies in Computational Intelligence (SCI), 75, 1-17.
Hakimi-Asiabar, M., Ghodsypour, S. H., Seifi, A., Kerachian, R., & O'Brien, C. (2008). A multi-objective hybrid gradient-based genetic algorithm. In Fifteenth international working seminar on production economics, Innsbruck, Austria, March 3-7, 2008 (pp. 235-251).
Hansen, P., & Mladenović, N. (2001). Variable neighborhood search: Principles and applications. European Journal of Operational Research, 130, 449-467.
Holland, J. (1975). Adaptation in natural and artificial systems. Ann Arbor: The University of Michigan Press.
Ishibuchi, H., & Murata, T. (1998). A multi-objective genetic local search algorithm and its application to flowshop scheduling. IEEE Transactions on Systems, Man and Cybernetics, 28(3), 392-403.
Jaszkiewicz, A. (2002). Genetic local search for multi-objective combinatorial optimization. European Journal of Operational Research, 137, 50-71.
Karamouz, M., & Mousavi, S. J. (2003). Uncertainty based operation of large scale reservoir systems: Dez and Karoon experience. Journal of the American Water Resources Association, 39, 961-975.
Kernighan, B. W., & Lin, S. (1970). An efficient heuristic procedure for partitioning graphs. Bell System Technical Journal, 49, 291-307.
Khare, V. (2002). Performance scaling of multi-objective evolutionary algorithms. M.Sc. thesis in Natural Computation, University of Birmingham, Edgbaston, Birmingham B15 2TT, UK.
Kohonen, T. (1997). Self-organizing maps. Information Sciences (2nd ed.). Springer.
Kohonen, T., Kaski, S., Lagus, K., Salojärvi, J., Honkela, J., Paatero, V., et al. (2000). Self organization of a massive document collection. IEEE Transactions on Neural Networks, 11, 574-585.
Konak, A., Coit, D. W., & Smith, A. E. (2006). Multi-objective optimization using genetic algorithms: A tutorial. Reliability Engineering and System Safety, 91, 992-1007.
Koza, J. (1992). Genetic programming: On the programming of computers by means of natural selection. The MIT Press.
Kubota, R., Yamakawa, T., & Horio, K. (2004). Reproduction strategy based on self-organizing map for real-coded genetic algorithms. Neural Information Processing - Letters and Reviews, 5(2).
Kubota, R., Yamakawa, T., & Horio, K. (2005). Reproduction strategy based on self-organizing map for genetic algorithms. International Journal of Innovative Computing, Information and Control (ICIC International), 1(4), 595-607.
Labadie, J. W. (2004). Optimal operation of multireservoir systems: State-of-the-art review. Journal of Water Resources Planning and Management (March/April).
Merz, P., & Freisleben, B. (1999). A comparison of memetic algorithms, tabu search and ant colonies for the quadratic assignment problem. In International congress on evolutionary computation (CEC99) (pp. 2063-2070). IEEE Press.
Michalewicz, Z. (1996). Genetic algorithms + data structures = evolution programs. Springer-Verlag.
Rangarajan, A., Ravindran, A. R., & Reed, P. (2004). An interactive multi-objective evolutionary optimization algorithm. In Proceedings of the 34th international conference on computers & industrial engineering, 277-282.
Rauber, A. (1999). LabelSOM: On the labeling of self-organizing maps. In Proceedings of the international joint conference on neural networks, Washington, DC.
Schaffer, J. D. (1984). Some experiments in machine learning using vector evaluated genetic algorithms. Ph.D. thesis, Nashville, TN: Vanderbilt University.
Schwefel, H.-P. (1975). Evolutionsstrategie und numerische Optimierung. Ph.D. thesis.
Schwefel, H.-P. (1981). Numerical optimization of computer models. Chichester: Wiley.
Schwefel, H.-P. (1995). Evolution and optimum seeking. Sixth-generation computer technology series. John Wiley and Sons.
Srinivas, N., & Deb, K. (1994). Multi-objective function optimization using nondominated sorting genetic algorithms. Evolutionary Computation Journal, 2(3), 221-248.
Ultsch, A., & Korus, D. (1995). Integration of neural networks with knowledge-based systems. In IEEE international conference on neural networks, Perth.
Van Veldhuizen, D. A., & Lamont, G. B. (2000). Multiobjective evolutionary algorithms: Analyzing the state-of-the-art. Evolutionary Computation Journal, 8(2), 125-147.
Yamakawa, T., Horio, K., & Hiratsuka, T. (2002). Advanced self-organizing maps using binary weight vector and its digital hardware design. In Proceedings of the 9th international conference on neural information processing (Vol. 3, pp. 1330-1335).
Zitzler, E., & Thiele, L. (1999). Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach. IEEE Transactions on Evolutionary Computation, 3(4), 257-271.

Fractal cities low resolutionArchiLab 7
 
P.Corning 2002 the re emergence of emergence
P.Corning 2002 the re emergence of emergenceP.Corning 2002 the re emergence of emergence
P.Corning 2002 the re emergence of emergenceArchiLab 7
 
John michael greer: an old kind of science cellular automata
John michael greer:  an old kind of science cellular automataJohn michael greer:  an old kind of science cellular automata
John michael greer: an old kind of science cellular automataArchiLab 7
 
Coates p: the use of genetic programming for applications in the field of spa...
Coates p: the use of genetic programming for applications in the field of spa...Coates p: the use of genetic programming for applications in the field of spa...
Coates p: the use of genetic programming for applications in the field of spa...ArchiLab 7
 
Coates p: the use of genetic programing in exploring 3 d design worlds
Coates p: the use of genetic programing in exploring 3 d design worldsCoates p: the use of genetic programing in exploring 3 d design worlds
Coates p: the use of genetic programing in exploring 3 d design worldsArchiLab 7
 
Coates p: genetic programming and spatial morphogenesis
Coates p: genetic programming and spatial morphogenesisCoates p: genetic programming and spatial morphogenesis
Coates p: genetic programming and spatial morphogenesisArchiLab 7
 
Coates p 1999: exploring 3_d design worlds using lindenmeyer systems and gen...
Coates  p 1999: exploring 3_d design worlds using lindenmeyer systems and gen...Coates  p 1999: exploring 3_d design worlds using lindenmeyer systems and gen...
Coates p 1999: exploring 3_d design worlds using lindenmeyer systems and gen...ArchiLab 7
 
Ying hua, c. (2010): adopting co-evolution and constraint-satisfaction concep...
Ying hua, c. (2010): adopting co-evolution and constraint-satisfaction concep...Ying hua, c. (2010): adopting co-evolution and constraint-satisfaction concep...
Ying hua, c. (2010): adopting co-evolution and constraint-satisfaction concep...ArchiLab 7
 
Sakanashi, h.; kakazu, y. (1994): co evolving genetic algorithm with filtered...
Sakanashi, h.; kakazu, y. (1994): co evolving genetic algorithm with filtered...Sakanashi, h.; kakazu, y. (1994): co evolving genetic algorithm with filtered...
Sakanashi, h.; kakazu, y. (1994): co evolving genetic algorithm with filtered...ArchiLab 7
 
Maher.m.l. 2006: a model of co evolutionary design
Maher.m.l. 2006: a model of co evolutionary designMaher.m.l. 2006: a model of co evolutionary design
Maher.m.l. 2006: a model of co evolutionary designArchiLab 7
 
Dorst, kees and cross, nigel (2001): creativity in the design process co evol...
Dorst, kees and cross, nigel (2001): creativity in the design process co evol...Dorst, kees and cross, nigel (2001): creativity in the design process co evol...
Dorst, kees and cross, nigel (2001): creativity in the design process co evol...ArchiLab 7
 
Daniel hillis 1991: co-evolving parasites sfi artificial life ii
Daniel hillis 1991: co-evolving parasites sfi artificial life iiDaniel hillis 1991: co-evolving parasites sfi artificial life ii
Daniel hillis 1991: co-evolving parasites sfi artificial life iiArchiLab 7
 
P bentley skumar: three ways to grow designs a comparison of evolved embryoge...
P bentley skumar: three ways to grow designs a comparison of evolved embryoge...P bentley skumar: three ways to grow designs a comparison of evolved embryoge...
P bentley skumar: three ways to grow designs a comparison of evolved embryoge...ArchiLab 7
 
P bentley: aspects of evolutionary design by computers
P bentley: aspects of evolutionary design by computersP bentley: aspects of evolutionary design by computers
P bentley: aspects of evolutionary design by computersArchiLab 7
 
M de landa: deleuze and use of ga in architecture
M de landa: deleuze and use of ga in architectureM de landa: deleuze and use of ga in architecture
M de landa: deleuze and use of ga in architectureArchiLab 7
 
Kumar bentley: computational embryology_ past, present and future
Kumar bentley: computational embryology_ past, present and futureKumar bentley: computational embryology_ past, present and future
Kumar bentley: computational embryology_ past, present and futureArchiLab 7
 
Derix 2010: mediating spatial phenomena through computational heuristics
Derix 2010:  mediating spatial phenomena through computational heuristicsDerix 2010:  mediating spatial phenomena through computational heuristics
Derix 2010: mediating spatial phenomena through computational heuristicsArchiLab 7
 
De garis, h. (1999): artificial embryology and cellular differentiation. ch. ...
De garis, h. (1999): artificial embryology and cellular differentiation. ch. ...De garis, h. (1999): artificial embryology and cellular differentiation. ch. ...
De garis, h. (1999): artificial embryology and cellular differentiation. ch. ...ArchiLab 7
 
Yamashiro, d. 2006: efficiency of search performance through visualising sear...
Yamashiro, d. 2006: efficiency of search performance through visualising sear...Yamashiro, d. 2006: efficiency of search performance through visualising sear...
Yamashiro, d. 2006: efficiency of search performance through visualising sear...ArchiLab 7
 
Tanaka, m 1995: ga-based decision support system for multi-criteria optimisa...
Tanaka, m 1995: ga-based decision support system for multi-criteria  optimisa...Tanaka, m 1995: ga-based decision support system for multi-criteria  optimisa...
Tanaka, m 1995: ga-based decision support system for multi-criteria optimisa...ArchiLab 7
 

Mais de ArchiLab 7 (20)

Fractal cities low resolution
Fractal cities low resolutionFractal cities low resolution
Fractal cities low resolution
 
P.Corning 2002 the re emergence of emergence
P.Corning 2002 the re emergence of emergenceP.Corning 2002 the re emergence of emergence
P.Corning 2002 the re emergence of emergence
 
John michael greer: an old kind of science cellular automata
John michael greer:  an old kind of science cellular automataJohn michael greer:  an old kind of science cellular automata
John michael greer: an old kind of science cellular automata
 
Coates p: the use of genetic programming for applications in the field of spa...
Coates p: the use of genetic programming for applications in the field of spa...Coates p: the use of genetic programming for applications in the field of spa...
Coates p: the use of genetic programming for applications in the field of spa...
 
Coates p: the use of genetic programing in exploring 3 d design worlds
Coates p: the use of genetic programing in exploring 3 d design worldsCoates p: the use of genetic programing in exploring 3 d design worlds
Coates p: the use of genetic programing in exploring 3 d design worlds
 
Coates p: genetic programming and spatial morphogenesis
Coates p: genetic programming and spatial morphogenesisCoates p: genetic programming and spatial morphogenesis
Coates p: genetic programming and spatial morphogenesis
 
Coates p 1999: exploring 3_d design worlds using lindenmeyer systems and gen...
Coates  p 1999: exploring 3_d design worlds using lindenmeyer systems and gen...Coates  p 1999: exploring 3_d design worlds using lindenmeyer systems and gen...
Coates p 1999: exploring 3_d design worlds using lindenmeyer systems and gen...
 
Ying hua, c. (2010): adopting co-evolution and constraint-satisfaction concep...
Ying hua, c. (2010): adopting co-evolution and constraint-satisfaction concep...Ying hua, c. (2010): adopting co-evolution and constraint-satisfaction concep...
Ying hua, c. (2010): adopting co-evolution and constraint-satisfaction concep...
 
Sakanashi, h.; kakazu, y. (1994): co evolving genetic algorithm with filtered...
Sakanashi, h.; kakazu, y. (1994): co evolving genetic algorithm with filtered...Sakanashi, h.; kakazu, y. (1994): co evolving genetic algorithm with filtered...
Sakanashi, h.; kakazu, y. (1994): co evolving genetic algorithm with filtered...
 
Maher.m.l. 2006: a model of co evolutionary design
Maher.m.l. 2006: a model of co evolutionary designMaher.m.l. 2006: a model of co evolutionary design
Maher.m.l. 2006: a model of co evolutionary design
 
Dorst, kees and cross, nigel (2001): creativity in the design process co evol...
Dorst, kees and cross, nigel (2001): creativity in the design process co evol...Dorst, kees and cross, nigel (2001): creativity in the design process co evol...
Dorst, kees and cross, nigel (2001): creativity in the design process co evol...
 
Daniel hillis 1991: co-evolving parasites sfi artificial life ii
Daniel hillis 1991: co-evolving parasites sfi artificial life iiDaniel hillis 1991: co-evolving parasites sfi artificial life ii
Daniel hillis 1991: co-evolving parasites sfi artificial life ii
 
P bentley skumar: three ways to grow designs a comparison of evolved embryoge...
P bentley skumar: three ways to grow designs a comparison of evolved embryoge...P bentley skumar: three ways to grow designs a comparison of evolved embryoge...
P bentley skumar: three ways to grow designs a comparison of evolved embryoge...
 
P bentley: aspects of evolutionary design by computers
P bentley: aspects of evolutionary design by computersP bentley: aspects of evolutionary design by computers
P bentley: aspects of evolutionary design by computers
 
M de landa: deleuze and use of ga in architecture
M de landa: deleuze and use of ga in architectureM de landa: deleuze and use of ga in architecture
M de landa: deleuze and use of ga in architecture
 
Kumar bentley: computational embryology_ past, present and future
Kumar bentley: computational embryology_ past, present and futureKumar bentley: computational embryology_ past, present and future
Kumar bentley: computational embryology_ past, present and future
 
Derix 2010: mediating spatial phenomena through computational heuristics
Derix 2010:  mediating spatial phenomena through computational heuristicsDerix 2010:  mediating spatial phenomena through computational heuristics
Derix 2010: mediating spatial phenomena through computational heuristics
 
De garis, h. (1999): artificial embryology and cellular differentiation. ch. ...
De garis, h. (1999): artificial embryology and cellular differentiation. ch. ...De garis, h. (1999): artificial embryology and cellular differentiation. ch. ...
De garis, h. (1999): artificial embryology and cellular differentiation. ch. ...
 
Yamashiro, d. 2006: efficiency of search performance through visualising sear...
Yamashiro, d. 2006: efficiency of search performance through visualising sear...Yamashiro, d. 2006: efficiency of search performance through visualising sear...
Yamashiro, d. 2006: efficiency of search performance through visualising sear...
 
Tanaka, m 1995: ga-based decision support system for multi-criteria optimisa...
Tanaka, m 1995: ga-based decision support system for multi-criteria  optimisa...Tanaka, m 1995: ga-based decision support system for multi-criteria  optimisa...
Tanaka, m 1995: ga-based decision support system for multi-criteria optimisa...
 

Último

Design principles on typography in design
Design principles on typography in designDesign principles on typography in design
Design principles on typography in designnooreen17
 
Call Us ✡️97111⇛47426⇛Call In girls Vasant Vihar༒(Delhi)
Call Us ✡️97111⇛47426⇛Call In girls Vasant Vihar༒(Delhi)Call Us ✡️97111⇛47426⇛Call In girls Vasant Vihar༒(Delhi)
Call Us ✡️97111⇛47426⇛Call In girls Vasant Vihar༒(Delhi)jennyeacort
 
办理学位证(SFU证书)西蒙菲莎大学毕业证成绩单原版一比一
办理学位证(SFU证书)西蒙菲莎大学毕业证成绩单原版一比一办理学位证(SFU证书)西蒙菲莎大学毕业证成绩单原版一比一
办理学位证(SFU证书)西蒙菲莎大学毕业证成绩单原版一比一F dds
 
'CASE STUDY OF INDIRA PARYAVARAN BHAVAN DELHI ,
'CASE STUDY OF INDIRA PARYAVARAN BHAVAN DELHI ,'CASE STUDY OF INDIRA PARYAVARAN BHAVAN DELHI ,
'CASE STUDY OF INDIRA PARYAVARAN BHAVAN DELHI ,Aginakm1
 
FiveHypotheses_UIDMasterclass_18April2024.pdf
FiveHypotheses_UIDMasterclass_18April2024.pdfFiveHypotheses_UIDMasterclass_18April2024.pdf
FiveHypotheses_UIDMasterclass_18April2024.pdfShivakumar Viswanathan
 
(办理学位证)埃迪斯科文大学毕业证成绩单原版一比一
(办理学位证)埃迪斯科文大学毕业证成绩单原版一比一(办理学位证)埃迪斯科文大学毕业证成绩单原版一比一
(办理学位证)埃迪斯科文大学毕业证成绩单原版一比一Fi sss
 
Dubai Calls Girl Tapes O525547819 Real Tapes Escort Services Dubai
Dubai Calls Girl Tapes O525547819 Real Tapes Escort Services DubaiDubai Calls Girl Tapes O525547819 Real Tapes Escort Services Dubai
Dubai Calls Girl Tapes O525547819 Real Tapes Escort Services Dubaikojalkojal131
 
办理(UC毕业证书)查尔斯顿大学毕业证成绩单原版一比一
办理(UC毕业证书)查尔斯顿大学毕业证成绩单原版一比一办理(UC毕业证书)查尔斯顿大学毕业证成绩单原版一比一
办理(UC毕业证书)查尔斯顿大学毕业证成绩单原版一比一z xss
 
How to Empower the future of UX Design with Gen AI
How to Empower the future of UX Design with Gen AIHow to Empower the future of UX Design with Gen AI
How to Empower the future of UX Design with Gen AIyuj
 
PORTAFOLIO 2024_ ANASTASIYA KUDINOVA
PORTAFOLIO   2024_  ANASTASIYA  KUDINOVAPORTAFOLIO   2024_  ANASTASIYA  KUDINOVA
PORTAFOLIO 2024_ ANASTASIYA KUDINOVAAnastasiya Kudinova
 
Cosumer Willingness to Pay for Sustainable Bricks
Cosumer Willingness to Pay for Sustainable BricksCosumer Willingness to Pay for Sustainable Bricks
Cosumer Willingness to Pay for Sustainable Bricksabhishekparmar618
 
办理卡尔顿大学毕业证成绩单|购买加拿大文凭证书
办理卡尔顿大学毕业证成绩单|购买加拿大文凭证书办理卡尔顿大学毕业证成绩单|购买加拿大文凭证书
办理卡尔顿大学毕业证成绩单|购买加拿大文凭证书zdzoqco
 
办理学位证(TheAuckland证书)新西兰奥克兰大学毕业证成绩单原版一比一
办理学位证(TheAuckland证书)新西兰奥克兰大学毕业证成绩单原版一比一办理学位证(TheAuckland证书)新西兰奥克兰大学毕业证成绩单原版一比一
办理学位证(TheAuckland证书)新西兰奥克兰大学毕业证成绩单原版一比一Fi L
 
Architecture case study India Habitat Centre, Delhi.pdf
Architecture case study India Habitat Centre, Delhi.pdfArchitecture case study India Habitat Centre, Delhi.pdf
Architecture case study India Habitat Centre, Delhi.pdfSumit Lathwal
 
Pharmaceutical Packaging for the elderly.pdf
Pharmaceutical Packaging for the elderly.pdfPharmaceutical Packaging for the elderly.pdf
Pharmaceutical Packaging for the elderly.pdfAayushChavan5
 
原版美国亚利桑那州立大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
原版美国亚利桑那州立大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree原版美国亚利桑那州立大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
原版美国亚利桑那州立大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degreeyuu sss
 
专业一比一美国亚利桑那大学毕业证成绩单pdf电子版制作修改#真实工艺展示#真实防伪#diploma#degree
专业一比一美国亚利桑那大学毕业证成绩单pdf电子版制作修改#真实工艺展示#真实防伪#diploma#degree专业一比一美国亚利桑那大学毕业证成绩单pdf电子版制作修改#真实工艺展示#真实防伪#diploma#degree
专业一比一美国亚利桑那大学毕业证成绩单pdf电子版制作修改#真实工艺展示#真实防伪#diploma#degreeyuu sss
 
Untitled presedddddddddddddddddntation (1).pptx
Untitled presedddddddddddddddddntation (1).pptxUntitled presedddddddddddddddddntation (1).pptx
Untitled presedddddddddddddddddntation (1).pptxmapanig881
 
办理(USYD毕业证书)澳洲悉尼大学毕业证成绩单原版一比一
办理(USYD毕业证书)澳洲悉尼大学毕业证成绩单原版一比一办理(USYD毕业证书)澳洲悉尼大学毕业证成绩单原版一比一
办理(USYD毕业证书)澳洲悉尼大学毕业证成绩单原版一比一diploma 1
 
Call Girls Satellite 7397865700 Ridhima Hire Me Full Night
Call Girls Satellite 7397865700 Ridhima Hire Me Full NightCall Girls Satellite 7397865700 Ridhima Hire Me Full Night
Call Girls Satellite 7397865700 Ridhima Hire Me Full Nightssuser7cb4ff
 

Último (20)

Design principles on typography in design
Design principles on typography in designDesign principles on typography in design
Design principles on typography in design
 
Call Us ✡️97111⇛47426⇛Call In girls Vasant Vihar༒(Delhi)
Call Us ✡️97111⇛47426⇛Call In girls Vasant Vihar༒(Delhi)Call Us ✡️97111⇛47426⇛Call In girls Vasant Vihar༒(Delhi)
Call Us ✡️97111⇛47426⇛Call In girls Vasant Vihar༒(Delhi)
 
办理学位证(SFU证书)西蒙菲莎大学毕业证成绩单原版一比一
办理学位证(SFU证书)西蒙菲莎大学毕业证成绩单原版一比一办理学位证(SFU证书)西蒙菲莎大学毕业证成绩单原版一比一
办理学位证(SFU证书)西蒙菲莎大学毕业证成绩单原版一比一
 
'CASE STUDY OF INDIRA PARYAVARAN BHAVAN DELHI ,
'CASE STUDY OF INDIRA PARYAVARAN BHAVAN DELHI ,'CASE STUDY OF INDIRA PARYAVARAN BHAVAN DELHI ,
'CASE STUDY OF INDIRA PARYAVARAN BHAVAN DELHI ,
 
FiveHypotheses_UIDMasterclass_18April2024.pdf
FiveHypotheses_UIDMasterclass_18April2024.pdfFiveHypotheses_UIDMasterclass_18April2024.pdf
FiveHypotheses_UIDMasterclass_18April2024.pdf
 
(办理学位证)埃迪斯科文大学毕业证成绩单原版一比一
(办理学位证)埃迪斯科文大学毕业证成绩单原版一比一(办理学位证)埃迪斯科文大学毕业证成绩单原版一比一
(办理学位证)埃迪斯科文大学毕业证成绩单原版一比一
 
Dubai Calls Girl Tapes O525547819 Real Tapes Escort Services Dubai
Dubai Calls Girl Tapes O525547819 Real Tapes Escort Services DubaiDubai Calls Girl Tapes O525547819 Real Tapes Escort Services Dubai
Dubai Calls Girl Tapes O525547819 Real Tapes Escort Services Dubai
 
办理(UC毕业证书)查尔斯顿大学毕业证成绩单原版一比一
办理(UC毕业证书)查尔斯顿大学毕业证成绩单原版一比一办理(UC毕业证书)查尔斯顿大学毕业证成绩单原版一比一
办理(UC毕业证书)查尔斯顿大学毕业证成绩单原版一比一
 
How to Empower the future of UX Design with Gen AI
How to Empower the future of UX Design with Gen AIHow to Empower the future of UX Design with Gen AI
How to Empower the future of UX Design with Gen AI
 
PORTAFOLIO 2024_ ANASTASIYA KUDINOVA
PORTAFOLIO   2024_  ANASTASIYA  KUDINOVAPORTAFOLIO   2024_  ANASTASIYA  KUDINOVA
PORTAFOLIO 2024_ ANASTASIYA KUDINOVA
 
Cosumer Willingness to Pay for Sustainable Bricks
Cosumer Willingness to Pay for Sustainable BricksCosumer Willingness to Pay for Sustainable Bricks
Cosumer Willingness to Pay for Sustainable Bricks
 
办理卡尔顿大学毕业证成绩单|购买加拿大文凭证书
办理卡尔顿大学毕业证成绩单|购买加拿大文凭证书办理卡尔顿大学毕业证成绩单|购买加拿大文凭证书
办理卡尔顿大学毕业证成绩单|购买加拿大文凭证书
 
办理学位证(TheAuckland证书)新西兰奥克兰大学毕业证成绩单原版一比一
办理学位证(TheAuckland证书)新西兰奥克兰大学毕业证成绩单原版一比一办理学位证(TheAuckland证书)新西兰奥克兰大学毕业证成绩单原版一比一
办理学位证(TheAuckland证书)新西兰奥克兰大学毕业证成绩单原版一比一
 
Architecture case study India Habitat Centre, Delhi.pdf
Architecture case study India Habitat Centre, Delhi.pdfArchitecture case study India Habitat Centre, Delhi.pdf
Architecture case study India Habitat Centre, Delhi.pdf
 
Pharmaceutical Packaging for the elderly.pdf
Pharmaceutical Packaging for the elderly.pdfPharmaceutical Packaging for the elderly.pdf
Pharmaceutical Packaging for the elderly.pdf
 
原版美国亚利桑那州立大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
原版美国亚利桑那州立大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree原版美国亚利桑那州立大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
原版美国亚利桑那州立大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
 
专业一比一美国亚利桑那大学毕业证成绩单pdf电子版制作修改#真实工艺展示#真实防伪#diploma#degree
专业一比一美国亚利桑那大学毕业证成绩单pdf电子版制作修改#真实工艺展示#真实防伪#diploma#degree专业一比一美国亚利桑那大学毕业证成绩单pdf电子版制作修改#真实工艺展示#真实防伪#diploma#degree
专业一比一美国亚利桑那大学毕业证成绩单pdf电子版制作修改#真实工艺展示#真实防伪#diploma#degree
 
Untitled presedddddddddddddddddntation (1).pptx
Untitled presedddddddddddddddddntation (1).pptxUntitled presedddddddddddddddddntation (1).pptx
Untitled presedddddddddddddddddntation (1).pptx
 
办理(USYD毕业证书)澳洲悉尼大学毕业证成绩单原版一比一
办理(USYD毕业证书)澳洲悉尼大学毕业证成绩单原版一比一办理(USYD毕业证书)澳洲悉尼大学毕业证成绩单原版一比一
办理(USYD毕业证书)澳洲悉尼大学毕业证成绩单原版一比一
 
Call Girls Satellite 7397865700 Ridhima Hire Me Full Night
Call Girls Satellite 7397865700 Ridhima Hire Me Full NightCall Girls Satellite 7397865700 Ridhima Hire Me Full Night
Call Girls Satellite 7397865700 Ridhima Hire Me Full Night
 

Hakimi-Asiabar, M. 2009: Multi-objective genetic local search algorithm using Kohonen's neural map

  • 1. Contents lists available at ScienceDirect

Computers & Industrial Engineering
journal homepage: www.elsevier.com/locate/caie

Multi-objective genetic local search algorithm using Kohonen's neural map

Mehrdad Hakimi-Asiabar a, Seyyed Hassan Ghodsypour a,*, Reza Kerachian b

a Department of Industrial Engineering, Amirkabir University of Technology, Tehran, Iran
b Center of Excellence for Engineering and Management of Infrastructures, Faculty of Civil Engineering, University of Tehran, Tehran, Iran

ARTICLE INFO

Article history:
Received 24 September 2007
Received in revised form 19 April 2008
Accepted 9 October 2008
Available online 15 October 2008

Keywords:
Multi-objective genetic local search
Self-organizing maps
Variable Neighborhood Search (VNS)
Multi-objective evolutionary algorithm
Learning
Multi-reservoir operation management

ABSTRACT

Genetic Algorithms (GAs) are population-based global search methods that can escape from local optima traps and find the global optima regions. However, near the optimum set their intensification process is often inaccurate. This is because the search strategy of GAs is completely probabilistic. With a random search near the optimum sets, there is a small probability to improve the current solution. Another drawback of the GAs is genetic drift. The GAs search process is a black box process and no one knows which region is being searched by the algorithm; it is possible that GAs search only a small region in the feasible space. On the other hand, GAs usually do not use the existing information about the optimality regions in past iterations.

In this paper, a new method called SOM-Based Multi-Objective GA (SBMOGA) is proposed to improve the genetic diversity. In SBMOGA, a grid of neurons uses the concept of the learning rule of the Self-Organizing Map (SOM), supported by Variable Neighborhood Search (VNS), to learn from the genetic algorithm, improving both local and global search. SOM is a neural network which is capable of learning and can improve the efficiency of data processing algorithms. The VNS algorithm is developed to enhance the local search efficiency in the Evolutionary Algorithms (EAs). The SOM uses a multi-objective learning rule based on Pareto dominance to train its neurons. The neurons gradually move toward better fitness areas in some trajectories in the feasible space. The knowledge of the optimum front in past generations is saved in the form of trajectories. The final state of the neurons determines a set of new solutions that can be regarded as the probability density distribution function of the high-fitness areas in the multi-objective space. The new set of solutions can potentially improve the GAs' overall efficiency. In the last section of this paper, the applicability of the proposed algorithm is examined in developing optimal policies for a real-world multi-objective multi-reservoir system, which is a non-linear, non-convex, multi-objective optimization problem.

© 2008 Elsevier Ltd. All rights reserved.

1. Introduction

Evolutionary Algorithms (EAs) are probabilistic search optimization techniques, which have been developed based on Darwin's principles of natural selection and survival of the fittest individuals in a population. EAs use computational models of evolutionary processes as key elements in the design and implementation of computer-based problem solving systems (Cordon, Moya, & Larco, 2002; Goldberg, 1989). There are a variety of evolutionary computational models. Four well-defined EAs have served as the basis for most of the activities in the field of evolutionary computation: Genetic Algorithms (GAs) (Holland, 1975; Michalewicz, 1996), Evolution Strategies (Schwefel, 1975, 1981, 1995), Genetic Programming (GP) (Koza, 1992) and Evolutionary Programming (EP) (Fogel, 1962).

Genetic algorithms have been utilized in different fields of engineering much more than other forms of EAs. Initial developments in evolutionary optimization models focused on single-objective applications. In the past two decades, several multi-objective EAs such as the Vector Evaluated Genetic Algorithm (VEGA) (Schaffer, 1984) and Non-dominated Sorted Genetic Algorithms (NSGA) (Srinivas & Deb, 1994) have been proposed. These early EAs often performed poorly, considering two key parameters: convergence rate and diversity. Recent algorithms like the Strength Pareto Evolutionary Algorithm (SPEA) (Zitzler & Thiele, 1999) and NSGA-II (Deb, Pratap, Agarwal, & Meyarivan, 2002) perform better, though they still suffer from similar deficiencies. Deb (2001) and Van Veldhuizen and Lamont (2000) presented comprehensive reviews and classifications of the most important approaches to genetic algorithms for multi-objective optimization. Lately, Konak, Coit, and Smith (2006) presented an overview and tutorial describing GAs developed for problems with multiple objectives. They concluded that these methods differ primarily
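The abstract's central device, SOM neurons trained with a Pareto-dominance learning rule, can be sketched minimally. The code below is an illustrative reading, not the authors' exact SBMOGA update: a neuron moves toward a GA sample, weighted by a grid-neighborhood kernel centred on the best-matching unit, only when the sample Pareto-dominates the neuron's current objective vector. The toy objectives `f1`, `f2` and all parameter values are assumptions.

```python
import math

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def som_step(neurons, sample, f, lr=0.5, sigma=1.0):
    """One SOM-style update on a 1-D grid of neurons: move neurons toward
    `sample` with a Gaussian neighborhood kernel centred on the
    best-matching unit, but only those neurons whose objective vector is
    dominated by the sample's (an illustrative multi-objective rule, not
    the exact SBMOGA formulation)."""
    # best-matching unit: neuron closest to the sample in decision space
    bmu = min(range(len(neurons)),
              key=lambda j: sum((w - s) ** 2 for w, s in zip(neurons[j], sample)))
    fs = f(sample)
    for j, w in enumerate(neurons):
        if not dominates(fs, f(w)):
            continue  # ignore samples that do not improve this neuron
        h = math.exp(-((j - bmu) ** 2) / (2.0 * sigma ** 2))  # grid neighborhood
        neurons[j] = [wi + lr * h * (si - wi) for wi, si in zip(w, sample)]

# toy bi-objective problem (assumption): f1 = x^2 + y^2, f2 = (x-1)^2 + y^2
f = lambda p: (p[0] ** 2 + p[1] ** 2, (p[0] - 1.0) ** 2 + p[1] ** 2)
neurons = [[2.0, 2.0], [-2.0, 1.5], [1.5, -2.0]]
som_step(neurons, [0.5, 0.0], f)  # this sample dominates every neuron here
```

Repeating `som_step` over successive GA samples drags the neuron grid along trajectories toward the nondominated region, which is the mechanism the abstract describes.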
  • 2. M. Hakimi-Asiabar et al. / Computers & Industrial Engineering 56 (2009) 1566–1576

from traditional GA by using specialized fitness functions and introducing methods to promote solution diversity.

Many real-world problems do not satisfy necessary conditions such as continuity, differentiability, convexity, etc. Therefore, they cannot be easily solved using traditional gradient-based optimization techniques. GAs have been considered a practical optimization tool in many disciplines, such as problems with discontinuous multi-modal objective functions, combinatorial problems (together with discrete, continuous or integer design variables), and dynamic, severely nonlinear, non-differentiable and non-convex design spaces. Another advantage of MOEAs is the definition of the Pareto front set within an acceptable computational time. Traditional multi-objective algorithms define one solution in each run. The MOEAs usually attempt to generate (or closely approximate) the entire Pareto front in a single run and place emphasis on achieving solution diversity so as to avoid local optima (Rangarajan, Ravindran, & Reed, 2004). The advantages of GAs increasingly extend their applications. However, there are some drawbacks that limit their efficiency. The traditional GAs' intensification process is not sufficiently accurate. GAs usually find the area of good fitness quite easily. However, finding the global optimal solution may be time-consuming and inaccurate. This is because the search strategy of GAs is probabilistic. In a probabilistic search process, when a chromosome is far from the local optima, there is a 50% chance that a random search direction will simultaneously improve all the objectives. However, when a point is close to the Pareto set, the size of the proper descent/ascent cone is extremely narrow and there is a small probability that a random update improves the objective functions (Brown & Smith, 2005).
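The narrowing of the common descent cone near the Pareto set can be checked numerically. The sketch below uses a hypothetical bi-objective toy problem (f1 = x² + y², f2 = (x − 1)² + y², an assumption, not from the paper) whose Pareto set is the segment y = 0, 0 ≤ x ≤ 1, and estimates the fraction of random unit directions that improve both objectives at a far point and at a near point:

```python
import math
import random

# toy bi-objective minimization problem (illustrative assumption):
# Pareto set is the segment y = 0, 0 <= x <= 1
def f1(x, y): return x * x + y * y
def f2(x, y): return (x - 1.0) ** 2 + y * y

def fraction_improving(x, y, trials=20000, eps=1e-3):
    """Estimate the fraction of random unit directions that
    simultaneously decrease both objectives at (x, y)."""
    random.seed(0)
    hits = 0
    for _ in range(trials):
        t = random.uniform(0.0, 2.0 * math.pi)
        nx, ny = x + eps * math.cos(t), y + eps * math.sin(t)
        if f1(nx, ny) < f1(x, y) and f2(nx, ny) < f2(x, y):
            hits += 1
    return hits / trials

far = fraction_improving(5.0, 5.0)    # far from the Pareto set: near one half
near = fraction_improving(0.5, 0.01)  # close to it: a very narrow cone
```

Here `far` comes out close to one half while `near` is a small fraction of a percent, matching the qualitative argument of Brown and Smith (2005) quoted above.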
Thus, with a random search strategy, GAs generally require a great number of iterations and they converge slowly, especially in the neighborhood of the global optimum. With a randomized reproduction strategy in which the crossover points are determined randomly, the resulting children are created without regard to the existing information about high-fitness regions. Therefore, the fitness of a child can deviate quite widely from the fitness of its parents.

Another drawback of the GAs is genetic drift. The GAs' exploration process is a black box, and the diversity information obtained from past generations is only implicitly and partially preserved in the current genome. This bears the risk of a regeneration of individuals that have already been seen in the search process. Even more problematic is the fact that the search can be negatively affected by genetic drift. As a consequence, large parts of the search space, potentially containing the global optimum, will never be explored. Thus there is a need for consistent exploration techniques that do not repeat the same patterns in the mutation process and can also improve the diversity over successive genetic generations.

EAs produce vast amounts of data during an optimization run without sufficient usage of them. In each of the numerous generations, a large number of chromosomes is generated and evaluated (Drobics, Bodenhofer, & Winiwarter, 2001). This data can be used to produce valuable insight to enhance EAs' solution quality. GAs are completely probabilistic search-based optimization models because they do not use the knowledge aggregated about the optimality regions and searched areas in past iterations. Necessary requirements are, for example, processing incoming data such that it creates useful information that incrementally improves the next-generation population, and new chromosomes should not be created by entirely probabilistic processes.
It is possible to extract and use previously computed knowledge in the next generations. It can then be concluded that there is room for improvement in the convergence rate and diversity of GAs. Thus there is a need for new GAs whose exploitation is knowledge-oriented, to accelerate the intensification process, and which also have better diversification to avoid genetic drift.

In this paper, a background for developing a new GA-based algorithm is provided in the next section. In Section 3, the new algorithm is presented in detail. SBMOGA is developed based on some well-known ideas such as the SOM learning rule and the VNS shaking process. The new algorithm can provide consistent diversity without repeated evaluations and a systematic local and variable neighborhood search. In Section 4, a complex real-world problem, namely the multi-objective multi-reservoir operation management problem, is described and the optimization model formulation is presented. This problem is non-convex and nonlinear. In Section 5, the results of applying the new algorithm to solve the multi-reservoir operation problem are shown. In the last section, conclusions and future research opportunities are presented.

2. Background

In the previous section, the main advantages and disadvantages of the traditional GA-based optimization models were described in detail. In this section, the literature regarding models that improve the traditional GAs is reviewed. A variety of techniques for incorporating local search methods with EAs have been reported. These techniques include Genetic Local Search (Merz & Freisleben, 1999), Genetic Hybrids (Fleurent & Ferland, 1994), Random Multi-Start (Kernighan & Lin, 1970) and GRASP (Feo & Resende, 1989). Local search schemes such as gradient-based methods are efficient algorithms for refining arbitrary points in the search space into better solutions.
Such algorithms are called local search algorithms because they define neighborhoods, typically based on initial "coarse" solutions. The term 'local search' is generally applied to methods that cannot escape these minima. Some hybridization schemes that will be used to develop the proposed algorithm are discussed below.

2.1. Hybrid local search GAs

Hybrid algorithms are a combination of two or more different techniques. Hybridization of local search and evolutionary algorithms has complementary advantages and combines the strengths of different approaches in order to overcome their weaknesses. Evolutionary algorithms have been successfully hybridized with other local search methods. Hybrid EAs have the local search power of traditional methods, thus their accuracy is better than that of the ordinary EAs. These methods also commonly take advantage of the good global search capabilities of evolutionary algorithms, so they are robust against getting stuck at local optima. The hybridization of genetic algorithms and local search methods, called genetic local search, has been applied to a variety of single-objective combinatorial problems. The role of the local search is to enhance the intensification process in the genetic search (Arroyo & Armentano, 2005). Grosan and Abraham (2007) showed some possibilities for hybridization of an evolutionary algorithm and also presented some of the generic hybrid evolutionary architectures that have evolved during the last two decades. They also provided a review of some of the interesting hybrid frameworks.

The first Multi-Objective Genetic Local Search (MOGLS) algorithm was proposed by Ishibuchi and Murata (1998) and is called the IM-MOGLS algorithm (Arroyo & Armentano, 2005). An iteration of the IM-MOGLS algorithm starts with a population P of N solutions. Then the operators of selection, recombination and mutation are applied to the elements of P until reaching a population of N elite solutions.
These solutions are recorded in the current non-dominated set. Then N elite solutions are randomly selected from the current set of non-dominated solutions (denoted by P0) and a restricted local search is applied to each solution in P0. In this
process, a limited number of random neighborhood solutions around each solution x ∈ P0 is generated; if a neighbor's fitness is better than that of x, it replaces x; otherwise, the local search initiated from x terminates. The number of neighborhoods examined from each x ∈ P0 is limited, and a new population P is formed to start a new generation. They showed that their results are better than those obtained by the Vector Evaluated Genetic Algorithm (VEGA) proposed by Schaffer (1984). Jaszkiewicz (2002) proposed another MOGLS algorithm, called J-MOGLS. In this algorithm, each iteration starts by drawing random weights from a pre-specified set of weights to define a scalarizing function. Then, the best and distinct solution set (B) according to such a scalarizing function is selected from the current set (CS) to form a temporary population TP. Two randomly selected solutions of TP are recombined to generate an offspring, which is submitted to a local search. If the solution resulting from the local search is better than the worst solution in TP, it is included in CS and the set of the best non-dominated solutions is updated. Recombination is the only genetic-algorithm component that J-MOGLS uses; it does not use the mutation operator. Jaszkiewicz applied his algorithm to several problems with two and three objectives and concluded that the quality of his method's solutions is much better than that of the solutions generated by IM-MOGLS. Arroyo and Armentano (2005) proposed a MOGLS algorithm with a structure identical to the IM-MOGLS algorithm. However, their components are quite different: in each iteration, elite solutions from the archive are used to develop the offspring population. Then, a local search is conducted around some elite solutions. If better solutions are found, they are inserted in place of previously dominated solutions.
Then, based on the new results, both the population and the archived elite solutions are revised. Their results were better than IM-MOGLS and very competitive with J-MOGLS. Hakimi-Asiabar, Ghodsypour, Seifi, Kerachian, and O'Brien (2008) developed a hybrid multi-objective gradient-based search algorithm to solve problems with non-differentiable multi-modal objective functions. They showed their method's applicability on a real-world multi-objective multi-reservoir operating policy definition problem.

2.2. Hybrid GA-SOM

One of the important techniques found in the literature for improving GA efficiency is the use of Self-Organizing Maps (Kohonen, 1997). SOM is a learning algorithm that provides dimensionality reduction by compressing the data, via training, to a reasonable number of units (neurons) (Kohonen, 1997). SOM uses a network of neurons that preserves the topology of the data space. The map consists of a grid of units that contains all significant information of the data set, while eliminating possible noise, outliers or data faults. Adjacent units on the map structure correspond to similar data patterns, allowing regions of interest to be identified through various clustering techniques. SOM networks are considered capable of handling problems of large dimensionality. Applications of SOM include clustering, feature extraction or feature evaluation from the trained map (Rauber, 1999; Ultsch & Korus, 1995) and data mining (Drobics et al., 2001; Kohonen et al., 2000). SOM can achieve a detailed approximation of the probability density of the input (or output) distribution (Kubota, Yamakawa, & Horio, 2005). In addition, the probability density of binary input vectors can be approximated (Yamakawa, Horio, & Hiratsuka, 2002). As mentioned before, GAs use a completely random search strategy in their processes. The search can be done more intelligently if, in generating a new population, the information derived from past generations is considered.
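The topology-preserving training described above can be sketched as a minimal one-dimensional SOM (the map size, learning rate and neighborhood width below are illustrative choices of ours, not parameters from Kohonen (1997)):

```python
import math
import random

def train_som(data, n_units=5, epochs=50, lr0=0.5, sigma0=1.5):
    """Train a 1-D SOM: move the best-matching unit (and its grid
    neighbors) toward each input vector, preserving topology."""
    dim = len(data[0])
    w = [[random.uniform(0, 1) for _ in range(dim)] for _ in range(n_units)]
    for n in range(epochs):
        lr = lr0 * (1 - n / epochs)              # decreasing learning ratio
        sigma = sigma0 * (1 - n / epochs) + 1e-9  # shrinking neighborhood
        for x in data:
            # best-matching unit = unit with the closest weight vector
            bmu = min(range(n_units),
                      key=lambda j: sum((x[t] - w[j][t]) ** 2
                                        for t in range(dim)))
            for j in range(n_units):
                h = math.exp(-((j - bmu) ** 2) / (2 * sigma ** 2))
                for t in range(dim):
                    w[j][t] += lr * h * (x[t] - w[j][t])
    return w

# usage: two clusters of 2-D inputs; units drift toward the data
random.seed(0)
data = [[0.1, 0.1], [0.9, 0.9], [0.12, 0.08], [0.88, 0.91]]
weights = train_som(data)
```

After training, at least one unit lies near each cluster, which is the compression-with-topology property the section relies on.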
SOM is an appropriate tool to achieve this strategy. The lattice of the neuron network orients itself along the Pareto optimal set through learning, with neighboring neurons interpolating shared information along the Pareto optimal set. The SOM interpolation can adapt to straight or curved Pareto optimal sets. Büche (2003) described a recombination operator that interpolates the parent population using a SOM. The SOM renders itself easily as a recombination operator, which defines a lower-dimensional interpolation of the parent population. Büche reported the advantages of SOM recombination as follows: while most recombination operators recombine two parents, the neurons of the SOM interpolate a local subset of parents; creating a simplex of neighboring neurons supports the transfer of information, which increases the possible amount of information to recombine. Amor and Rettinger (2005) used the SOM to improve diversity and prevent premature convergence of genetic algorithms. They trained a SOM offline to learn the fitness of the search space, and then used it to keep diversity in the search space of a single-objective problem. Kubota, Yamakawa, and Horio (2004, 2005) developed a strategy for the reproduction of new seeds in single-objective genetic algorithms using SOM to maintain genetic diversity. They proposed the SOM-based reproduction strategy for both bit-string GA and real-coded GA to maintain the genetic diversity of the population. In their method, the weight vectors after learning are employed as the chromosomes of the next generation. In other words, the population of the next generation is obtained using a learnt SOM.

2.3. Variable Neighborhood Search (VNS)

VNS (Hansen & Mladenović, 2001) is a local search algorithm that was developed to work with EAs. In VNS, by defining a neighborhood structure around elite solutions, a systematic search is performed to increase local search accuracy. In this algorithm,
a set of K neighborhood structures with decreasing radius around elite solutions is defined (see Fig. 1); then, step by step, the neighborhood radius decreases according to the predefined structure and the search for better solutions is performed. This algorithm is an efficient method, especially for problems with non-smooth functions. In the next section, a new hybrid multi-objective genetic algorithm is developed by taking ideas from the SOM learning rule and the movement of its units toward high-fitness regions. The VNS shaking search technique is utilized to improve intensification, to maintain diversity, and to implement an intelligent random recombination toward elite solutions to locate the optimality regions.

3. The proposed algorithm

In this section, a new method, called SOM-Based Multi-Objective GA (SBMOGA), is developed to improve the efficiency of existing multi-objective genetic algorithms. The method can improve the convergence rate and diversity of GA solutions.

Fig. 1. A neighborhood structure around an elite solution.
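The shrinking-neighborhood scheme of Fig. 1 can be sketched as follows (an illustrative single-objective version; the radii, trial counts and function names are our assumptions, not values from the paper):

```python
import random

def vns_search(elite, fitness, radii=(1.0, 0.5, 0.25, 0.1), trials=30):
    """Variable Neighborhood Search: probe neighborhoods of decreasing
    radius around an elite solution, recentering on any improvement."""
    best, best_fit = list(elite), fitness(elite)
    for r in radii:                      # k = 1..K, decreasing radius
        for _ in range(trials):
            cand = [v + random.uniform(-r, r) for v in best]
            f = fitness(cand)
            if f > best_fit:             # improvement: recenter the search
                best, best_fit = cand, f
    return best, best_fit

# usage: shake toward the maximum of a smooth test function at (1, -2)
random.seed(2)
f = lambda v: -((v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2)
x, fx = vns_search([0.0, 0.0], f)
```

The wide early radii provide the diversification (shaking), while the small final radii provide the intensification the text attributes to VNS.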
The new algorithm improves the convergence rate in a way that decreases the overall run time. The improved convergence rate, together with better diversity of solutions along the optimal front, increases the quality of the solutions. In the SBMOGA, a set of weight vectors is first defined randomly, as the centers of the SOM neurons, such that they have a uniform distribution over the feasible space. Then, in each generation of the GA, the neurons are trained using the best solutions of the current GA population (the first frontier; Deb & Goldberg, 1989) in terms of the best multi-objective solutions. The multi-objective learning rule is adapted from the SOM learning rule introduced by Kohonen (1997) and Kubota et al. (2005). In the training process, the neurons gradually move toward high-fitness solutions and, through their stochastic movements, they can find new good solutions. If the current location of a neuron is a local or global optimum, it does not move to other points, because the fitness of any new point will be dominated by the fitness of the current neuron center. This process can refine the quality of the multi-objective genetic algorithm solutions by processing the generated data and extracting the probability density of the optimal front distribution. The SOM consists of an input layer and a competitive layer that include T and M units, respectively (Fig. 2), in which T is the length of the input vectors and M is the number of SOM neurons. The jth unit in the competitive layer is connected to all units in the input layer by the weight vector W_j = [w_j1, ..., w_js, ..., w_jT], j = 1, 2, 3, ..., M. In the learning process, the weight vector is continuously updated toward the high-fitness input vectors using the learning rule. R_i = [R_i1, ..., R_ik, ..., R_iT] is the ith non-dominated solution in the current population.
After training a neuron's weight vector using a new high-fitness chromosome, a new weight vector for the neuron's center is obtained. Then the fitness of the new unit center is calculated. If the new vector's fitness value f_{W_j^{n+1}} dominates f_{W_j^n} based on Pareto dominance, then W_j^n is replaced by W_j^{n+1}; otherwise, the past weight vector W_j^n is retained as the neuron's latest weight vector. When a neuron's center reaches a local/global optimum region during the next generations, it will not be changed, because the current weight vector's fitness values are not dominated by its neighborhood solutions (see Fig. 3a). The probabilistic movement of the neuron centers toward high-fitness chromosomes in the feasible space of the genetic algorithm helps to search new regions (exploration) by evaluating some points (see Fig. 4). Thus, neurons located at a local optimum remain at their positions, while neurons that can move toward better-fitness areas continue to move. After a local/global region is identified, a local search in the neighborhood structure is conducted with a gradually decreasing learning ratio. This process is the same as the variable neighborhood search method with a dynamic neighborhood structure, which provides better intensification in local/global optimum regions (see Fig. 3b). This process can continue until the neurons reach local/global optima.

3.1. The learning rule

Kohonen (1997) defined the training rule of the SOM as follows:

W_j^{n+1} = W_j^n + α(n) (R − W_j^n)    (1)

where n represents a learning step, and W_j^n and W_j^{n+1} are the weight vectors of the units before and after updating, respectively. α(n) is the learning ratio, a monotonically decreasing function of the learning steps, and ||R − W_j^n|| is the distance between the input vector R and the weight vector W_j^n. Kubota et al. (2005) defined the learning ratio in a single-objective genetic algorithm as:
α(n) = f_R · h(f_R, d_j) · (1 − f_{W_j}(n))    (2)

where f_R and f_{W_j}(n) are the fitness values of the elite chromosome R and the neuron center's weight vector W_j, respectively, and d_j is the distance between the jth unit and the winner unit in the competitive layer. h(f_R, d_j) is a coefficient represented by:

h(f_R, d_j) = exp(−d_j / f_R)    (3)

Then they rewrote the learning rule as:

W_j^{n+1} = W_j^n + f_R · h(f_R, d_j) · (1 − f_{W_j}(n)) · (R − W_j^n)    (4)

This learning rule was developed to achieve a detailed approximation of the probability density of the input distribution by continuously updating the weight vector for single-objective problems. They showed that this strategy can provide better results in a single-objective genetic algorithm. Now, a revised learning rule is introduced for locating high-fitness areas in multi-objective environments. The proposed learning rule for a multi-objective space is defined as:

W_j^{n+1}(t) = W_j^n(t) + γ_j(t) · h(n) · (R_i^n(t) − W_j^n(t)),  t = 1, 2, ..., T    (5)

where:

γ_j(t) = 1 if R(t) dominates W_j(t); 0 otherwise    (6)

and h(n) is the learning ratio in generation number n of the genetic algorithm. Domination in the multi-objective space is defined based on Pareto dominance. In a maximization problem, when the GA's elite solution fitness value f_R is large and f_{W_j}(n) is small, then γ_j(t) = 1 and the neuron is attracted toward the chromosome; conversely, when f_R is small and f_{W_j}(n) is large, then γ_j(t) = 0. By using the γ_j(t) factor, the weight vectors with low fitness values are attracted to the chromosomes with high fitness values.

3.2. The learning ratio

The learning ratio, or step size of learning, is defined based on both SOM and VNS. The SOM's learning ratio and the VNS neighborhood structure are monotonically decreasing. In SBMOGA, the learning step size is defined as a decreasing function of the GA's generation number. The learning ratio, or the neighborhood distance in VNS, can be calculated from Eq. (7):
Fig. 2. Input and output layers in SOM: the input layer contains new chromosomes from the GA and the output layer contains the units (neurons) of the SOM.

h(n) = 1 / (k1 + max(k2, n)/100)    (7)

where k1 is a constant for the initial learning ratio and k2 is a threshold value at which the learning ratio starts decreasing. The proposed learning rule creates an intelligent probabilistic local search technique in the multi-objective space. The movement trajectories of the neurons are random, because the locations of the current
population's first frontier solutions are not predefined; they are determined probabilistically (see Fig. 4). This method is also intelligent, because the searches move toward elite solutions. The step-by-step update using Eq. (5) can generate new weight vectors (chromosomes) that differ from the present chromosomes based on the distribution of the fitness values. This reproduction can preserve the genetic diversity and provide an effective search. A grid of neurons with random starting weight vectors can introduce both diversity and local search accuracy by using a multiple parallel search schema and a shaking probabilistic trajectory-based search strategy (see Fig. 5). The learning neurons are parallel search trajectories that can define the Pareto optimal regions and their probability distribution functions.

Fig. 3. The shaking trajectory-based search concepts: (a) a neuron's center searches along the elite solutions; dominated chromosomes are replaced with new ones in a trajectory. (b) Chromosome neighborhood search by a monotonically decreasing learning ratio with a non-dominated center in iterations n and n + 1. These conditions are the same as in the VNS algorithm with a dynamic neighborhood structure.

Fig. 4. Probabilistic search toward non-dominated (elite) solutions in each iteration with a dynamically decreasing learning (neighborhood) ratio.

3.2.1. The statement of SBMOGA

Step 0. Initialize the NSGA-II and SOM parameters. Set the crossover and mutation probabilities, the number of generations (Nog), the number of neurons (Nosl) and the learning rule parameters, and set k2 = 1.
Step 1. Define the neurons' starting weight vectors (W_j^1, j = 1, 2, ..., Nosl) randomly, such that these weight vectors have a uniform distribution over the feasible search space.
Then evaluate the weight vectors based on the objective function values.
Step 2. Calculate the learning ratio using Eq. (7).
Step 3. Run the multi-objective genetic algorithm and determine the population of chromosomes in the nth generation. Evaluate the current population and determine its first frontier chromosomes (elite solutions) (i = 1, ..., n_1).
Step 4. Train the neurons' last weight vectors (W_j^n, j = 1, 2, ..., Nosl) using the ith chromosome of the first frontier chromosomes (R_i^n) and the learning rule (Eq. (5)), and define the new weight vectors W_j^{n+1}.
Step 5. Calculate the fitness values of the new weight vectors W_j^{n+1}, j = 1, 2, ..., Nosl. If W_j^{n+1} dominates W_j^n based on Pareto dominance, then replace W_j^n with W_j^{n+1}.
Step 6. If i < n_1, then i = i + 1 and go to Step 4. If i = n_1 and k2 < Nog, then k2 = k2 + 1 and go to Step 2. If i = n_1 and k2 = Nog, go to Step 7.
Step 7. End.

Fig. 5. Parallel neurons that search high-fitness areas in a multi-modal multi-objective search space.

Fig. 6 shows the relationship between the NSGA-II and the SOM units' training algorithms. In this figure, P_t is the parent population, Q_t is the children population, F_j is the jth frontier population in the NSGA-II algorithm, and Lp_t is the neuron centers population. In each iteration t, a subset of Lp_t can be dominated by new neuron center weight values and is then replaced by them. On the other hand, the subset of Lp_t that is not dominated by the new neurons' weight values remains. The content of the next neuron centers population, in iteration t + 1, is defined by these two subsets. Fig. 6 shows the learning process of the SOM units population from the NSGA-II first frontier elite solutions: after each generation, some neurons move to better locations in terms of objective function quality and a new neuron population is created. The flowchart of the algorithm is shown in Fig. 7.
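One training pass of Steps 4-5, together with Eqs. (5)-(7), can be sketched as follows for a two-objective minimization case (the toy objective function and parameter values are ours, for illustration only):

```python
def h(n, k1=1.7, k2=10):
    """Learning ratio of Eq. (7): static for n <= k2, then slowly decreasing."""
    return 1.0 / (k1 + max(k2, n) / 100.0)

def dominates(fa, fb):
    """Pareto dominance for minimization: fa dominates fb."""
    return all(a <= b for a, b in zip(fa, fb)) and \
           any(a < b for a, b in zip(fa, fb))

def train_neuron(w, elite, objectives, n):
    """One application of the learning rule (Eq. (5)): move the neuron's
    weight vector toward an elite chromosome that dominates it (Eq. (6)),
    then keep the move only if the new fitness dominates the old (Step 5)."""
    gamma = 1.0 if dominates(objectives(elite), objectives(w)) else 0.0
    w_new = [wt + gamma * h(n) * (rt - wt) for wt, rt in zip(w, elite)]
    return w_new if dominates(objectives(w_new), objectives(w)) else w

# usage on a toy bi-objective problem: f1 = sum x^2, f2 = sum (x - 2)^2
obj = lambda v: (sum(x * x for x in v), sum((x - 2) ** 2 for x in v))
neuron = [5.0, 5.0]                # far from the Pareto set
elite = [1.0, 1.0]                 # a non-dominated GA solution
updated = train_neuron(neuron, elite, obj, n=1)
```

With n = 1 the learning ratio is 1/1.8, so the neuron takes a large step toward the elite chromosome; the conditional replacement guarantees a neuron sitting on a non-dominated point is never degraded, matching the text's claim that neurons at optima stop moving.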
Fig. 6. A sketch of NSGA-II and its relation to the SOM neurons' learning process.

4. Case study

The optimal operation of multi-purpose multi-reservoir systems is a complex real-world problem. There are many advances in the operation of reservoirs cited in the literature. Labadie (2004) presented a state-of-the-art review of mathematical programming and heuristic methods in the optimal operation of multi-reservoir systems. He concluded that although there are few areas of application of optimization models with a richer or more diverse history than reservoir systems optimization, and opportunities for real-world applications are enormous, actual implementations remain limited or have not been sustained. In this section, the applicability of the proposed algorithm is examined by developing operating policies for the Karoon-Dez multi-purpose multi-reservoir system. The Dez and Karoon reservoirs, with a total storage capacity of more than 6.4 billion cubic meters (BCM), form the most important reservoir system in southwestern Iran, close to the Persian Gulf (see Fig. 8). The system carries more than one-fifth of Iran's surface water supply (Karamouz & Mousavi, 2003). The reservoirs have been constructed on the Karoon and Dez Rivers. The two rivers join at a location called Band-e-Ghir, north of the City of Ahwaz, to form the Great Karoon River. The average annual inflows of the Dez and Karoon reservoirs are 8.5 and 13.1 BCM, respectively. The water downstream of the Karoon and Dez dams supplies domestic, industrial, agricultural and agro-industrial demands.
Total water demand downstream of the Dez and Karoon dams is estimated as 1.95 BCM, of which 42% is allocated downstream of the Dez dam (d_1t); 35% is allocated downstream of the Karoon dam, between the Karoon reservoir and Band-e-Ghir (d_2t); and the rest goes downstream of Band-e-Ghir to the Persian Gulf (d_3t). There is also an environmental water demand equal to 0.62 BCM as in-stream flow in the Great Karoon River (d_4t) (see Fig. 9). The reservoirs have a hydropower generation capacity of 1.15 million megawatt-hours (MWh) per month. Other model variables are:

S_i^max  Maximum storage volume of the ith reservoir
S_i^min  Minimum storage volume of the ith reservoir
S_it  Storage volume of the ith reservoir in time period t
H_it  Water head elevation in the ith reservoir in time period t
I_it  Inflow to the ith river in time period t
r_1it  Release from the outlet of the hydropower plant of the ith reservoir in time period t
r_1i^max  Maximum outlet capacity of the hydropower plant of the ith reservoir
r_1i^min  Minimum water required to activate the hydropower plant of the ith reservoir
r_2it  Water release from the spillway of the ith reservoir
r_2i^max  Maximum spillway capacity of the ith reservoir
r_2i^min  Minimum spillway capacity of the ith reservoir
R_i^max  Maximum release of the ith reservoir
R_i^min  Minimum release of the ith reservoir
d_jt  jth water demand in time period t

4.1. Model formulation

In this study, a mathematical model for the monthly operation of the Karoon and Dez reservoirs is developed, considering the objectives of water supply to downstream demands and power generation. The decision variables of the optimization model are as follows:

R_it  Release of reservoir i in time period t
X_ijt  Percentage of the outflow from reservoir i allocated to water demand j in time period t (0 ≤ X_ijt ≤ 1)
λ_jt  Satisfied portion of the jth demand location in period t

4.2. Objective functions

First objective function: minimizing unsatisfied water demand:

Min Z1 = Σ_{j=1}^{3} Σ_{t=1}^{T} (d_jt − d_jt · λ_jt)² = Σ_{j=1}^{3} Σ_{t=1}^{T} d_jt² · (1 − λ_jt)²    (8)

where:

λ_jt · d_jt − X_ijt · R_it = 0    (9)
0 ≤ λ_jt ≤ 1    (10)

Second objective function: maximizing power generation:

Max Z2 = Σ_{t=1}^{T} Σ_{i=1}^{2} K_i · e_i · r_1it · H_it(S_it, S_i,t−1, R_it)    (11)

where:

K_i  Energy transfer coefficient of the ith hydropower plant
e_i  Efficiency index of the ith hydropower plant
H_it  Mean value of the water head behind the ith reservoir whose storage is equal to S_it
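For concreteness, Eq. (8) and a simplified form of Eq. (11) (with the head term precomputed rather than evaluated from storages) can be coded as below; the toy demand and plant numbers are ours, not values from the case study:

```python
def z1(d, lam):
    """Eq. (8): total squared unsatisfied demand, sum of (d_jt * (1 - lambda_jt))^2."""
    return sum(d[j][t] ** 2 * (1.0 - lam[j][t]) ** 2
               for j in range(len(d)) for t in range(len(d[0])))

def z2(K, e, r1, head):
    """Eq. (11) with a precomputed head term: sum of K_i * e_i * r_1it * H_it."""
    return sum(K[i] * e[i] * r1[i][t] * head[i][t]
               for i in range(len(K)) for t in range(len(r1[0])))

# usage: two demand sites, three periods; site 0 is fully satisfied,
# site 1 is half and three-quarters satisfied in the first two periods
d = [[10.0, 12.0, 11.0], [8.0, 8.0, 9.0]]
lam = [[1.0, 1.0, 1.0], [0.5, 0.75, 1.0]]
unmet = z1(d, lam)
```

Only the partially satisfied entries contribute to Z1, so in this toy case `unmet` reduces to 8²·0.5² + 8²·0.25² = 20.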
Fig. 7. The flowchart of SBMOGA.

Fig. 8. The Karoon and Dez river-reservoir system in southwestern Iran.

Fig. 9. A schematic diagram of the Karoon-Dez river-reservoir system.

Fig. 10. The reservoir parameters.

The effective water level height for producing hydropower energy can be calculated as follows (see Fig. 10 for more details):

H_it^eff = H_it − H_it^tail    (13)

H_it is usually a non-linear function of the reservoir storage volume, which can be presented as:

H_it = a_1 · S_it² + b_1 · S_it + c_1    (12)

The tail water height of the released water from reservoir i in time period t can be estimated as:

H_it^tail = a_2 · R_it² + b_2 · R_it + c_2    (14)
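Eqs. (12)-(14) combine into an effective head as sketched below; the quadratic coefficients used in the example are placeholders, not the calibrated values for the Dez or Karoon reservoirs:

```python
def effective_head(S, R, a1, b1, c1, a2, b2, c2):
    """Effective head (Eq. (13)) = storage head (Eq. (12))
    minus tail-water head (Eq. (14)), both quadratic fits."""
    H = a1 * S ** 2 + b1 * S + c1          # Eq. (12): head vs. storage
    H_tail = a2 * R ** 2 + b2 * R + c2     # Eq. (14): tail water vs. release
    return H - H_tail                      # Eq. (13)

# usage with placeholder coefficients
h_eff = effective_head(S=3.0, R=1.0,
                       a1=-0.5, b1=20.0, c1=100.0,
                       a2=0.1, b2=2.0, c2=5.0)
```

Because both fits are quadratic and the head multiplies the release in Eq. (11), the resulting energy term is the non-convex nonlinearity the next paragraph refers to.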
Therefore, the second objective function can be written as follows:

Max Z2 = Σ_{t=1}^{T} Σ_{i=1}^{2} K_i · e_i · r_1it · (H(S_it) − H(R_it))    (15)

The hydropower energy production is a nonlinear and non-convex function.

4.3. Optimization model constraints

1. Water storage capacity constraints:
S_1^min ≤ S_1t ≤ S_1^max,  t = 1, 2, 3, ..., T
S_2^min ≤ S_2t ≤ S_2^max,  t = 1, 2, 3, ..., T

2. Water demand constraints:
R_1t · X_11t = λ_1t · d_1t,  0 ≤ X_11t ≤ 1
R_2t · X_22t = λ_2t · d_2t,  0 ≤ X_22t ≤ 1
R_1t · X_13t + R_2t · X_23t = λ_3t · d_3t,  0 ≤ λ_jt ≤ 1
In-stream flow: the flow in the Great Karoon River must satisfy the environmental demand d_4t.

3. Continuity equations:
S_1,1 = S_1,0,  S_2,1 = S_2,0
S_1,t+1 = S_1t − R_1t + I_1t,  t = 1, 2, 3, ..., T
S_2,t+1 = S_2t − R_2t + I_2t,  t = 1, 2, 3, ..., T
R_1t = r_11t + r_21t
R_2t = r_12t + r_22t

4. Reservoirs' release constraints:
R_1^min ≤ R_1t ≤ R_1^max
R_2^min ≤ R_2t ≤ R_2^max

5. Capacity of hydropower plant outlets:
r_11^min ≤ r_11t ≤ r_11^max,  t = 1, 2, 3, ..., T
r_12^min ≤ r_12t ≤ r_12^max,  t = 1, 2, 3, ..., T

6. Spillway release capacity constraints:
r_21^min ≤ r_21t ≤ r_21^max
r_22^min ≤ r_22t ≤ r_22^max

7. Water head and tail water definition constraints:
H_it = a_1 · S_it² + b_1 · S_it + c_1
H_it^tail = a_2 · (R_it)² + b_2 · (R_it) + c_2
H_it^eff = H_it − H_it^tail

This problem is a non-linear, non-convex multi-objective optimization problem.

4.4. Application of the new algorithm to solve the model

To solve the reservoir system problem, the algorithm's parameters and the constraint handling method should first be defined appropriately. Constraint handling schemes for GAs are the penalty and repair methods (Chootinan & Chen, 2006). The repair method attempts to fix infeasible solutions by taking advantage of the problem's characteristics. The repair method can be very effective if the relationship between decision variables and constraints can be easily characterized. However, developing a repair procedure is usually problem-dependent and time-consuming when the problem includes complex constraints. In this case, the repair method for constraint handling is used in the solution process. In the initialization process of the GA, a set of probabilistic initial solutions is used. In the multi-objective view, the concept of Pareto dominance is used to assign fitness to the solutions. The learning ratio parameters are defined as k1 = 1.7 and k2 = 10. Then, by using Eq. (7), the learning ratio in generation n is defined as:

h(n) = 1 / (1.7 + max(10, n)/100)

This formulation causes the learning ratio to be static in the first ten generations. Then, it decreases at a smooth, low rate, which provides a relatively high exploration rate. The problem is formulated for a 30-year time horizon containing 360 monthly time steps. Therefore, each chromosome includes 720 variables for the release values R_it. The solution values of the variables X_ijt and λ_jt are defined in an optimization process as dependent variables of R_it. The number of neurons is considered as 42, with 42 stochastic vectors generated as starting points for the neuron centers. Both objective functions must be minimized (the hydropower generation function is multiplied by −1).

5. Results and discussion

In this section, the results of the proposed model (NSGA-II with SOM-based learning) in developing operating policies for the Karoon-Dez river-reservoir system are presented. To evaluate the model's efficiency, its results are compared with those of the classical NSGA-II model. The specified problem was solved using a Pentium 4 personal computer with 512 MB of RAM and a 2.8 GHz CPU. Both algorithms were run with 20, 100, 500, and 1000 generations and the results are presented in Tables 1-3. Table 1 shows the run time of the algorithms for the four defined generation levels. Table 2 shows the results of the 42 neurons trained based on the best solutions of the NSGA-II algorithm for four different runs. The results show that the model convergence is consistent with the number of generations, so that the solutions at higher generations are better than those at lower generations. The data in Table 1 show that the run time of SBMOGA for a similar number of generations is more than that of NSGA-II. This is because the addition of the SOM training process to each generation of the SBMOGA increases the computational complexity of one generation.

Table 1
The algorithms' run time for four levels of generations (minutes).

Number of generations    NSGA-II    SBMOGA
20                        0.38       0.65
100                       1.75       3.67
500                       9.35      19.3
1000                     18.50      35.9
  • 9. M. HaHmi-Asiabartt ai./Computm & Industrial Engineering 56 (1009) 1566- 1576 1574 Tab~ 2 Tht ObJtCtlvt funct1ons v~lues Neuron No. Number of generations obtained from the propoSt-d model cons1denng 42 f'I('Urons. 100 20 " 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 359,529 355,612 359,118 352,873 351.568 356,080 355.591 355.913 351.766 355,930 352.316 352.771 359.080 355.520 359,099 356.055 352.148 358.098 359,709 359,867 356.340 359,704 355.275 359.324 359,439 362280 352.982 351,743 355.83 1 354.014 363,910 352.113 357,049 359.268 352,084 355.614 359,300 355.911 357.872 360,001 352,374 356,355 500 F2 F1 F2 661.445 743,7 18 788,238 178.156 675,785 468,484 472544 676,120 457.563 684,66 1 672,340 675,042 182.354 743.890 576,571 743,570 672.847 467268 672,954 686.592 472,669 683,634 463,262 496,185 675,799 744.054 476,643 675,455 744,713 473.601 675,076 675.355 473,645 804,086 673.962 744.253 803,725 496.474 680,485 467.284 674,411 476,052 349.400 354,549 354.176 352.873 346,531 349,918 345.282 352.013 35 1.766 155,599 352,087 352,661 353.445 354,405 358.622 354.802 352.081 353.824 352,690 359.382 356.279 355.712 350.260 352,936 353,841 359.416 346.991 346,972 354.663 349,336 358.904 347,101 355.678 356.211 346.984 354.465 356,199 355,973 346.712 351.222 352.244 348.651 680,498 804,760 858,983 178.156 697.816 686,304 546.473 754.730 457.563 763,654 752.315 753.674 616.803 805.007 676,092 804.492 752,095 918,306 920,990 757.706 598,301 763.064 682,743 792,392 846.955 911,052 653.230 697,429 805,304 547.467 834225 697.488 660,824 837,066 696271 805.135 836,793 191.993 701,134 854.818 753.262 570.679 ever. this can be discarded by improved rate of convergence of the new algorithm. Fig. 11 shows a graphical representation of the so lutions derived by SBMOGA after 100 generations and with 3.67 min run time and the solutions derived by NSGA-11 after 1000 generations. 
with an 18.50 min run time (see Table 1). It can easily be seen that the results of SBMOGA after only 100 generations outperformed the NSGA-II results after 1000 generations, while the run time of SBMOGA was about 20% of that of NSGA-II (one of the fastest multi-objective GAs). As shown in Table 3, the classical NSGA-II provides a Pareto front with 18, 18, 18, and 24 non-dominated solutions for the four defined categories. As can be seen in Table 3, some solutions of NSGA-II in each run are dominated by the corresponding SBMOGA solutions. In this table, the number of the SBMOGA solution that dominates an NSGA-II solution is written in the "Dominated by" column. For example, for a generation number equal to 20, neuron number 2 (see Table 2) dominates solutions number 2 and 3 of NSGA-II. It can therefore be concluded that, in this case study, the SBMOGA method improved the NSGA-II final results in all cases. Also, some solutions derived by the SBMOGA method in 20 generations can dominate some solutions of NSGA-II in 1000 generations.
For example, neuron number 41 dominates solutions number 17, 18, 19, 20, 21, 22, 23, and 24 of NSGA-II in the case of 1000 generations. This also shows the higher convergence rate of SBMOGA in comparison with NSGA-II. Another criterion for multi-objective solution quality is the diversity measure. The diversity measure is a number in the range (0, 1], where 1 corresponds to the best possible diversity and 0 corresponds to the worst possible diversity. In this paper, a revised version of the diversity measure described by Khare (2002) is used. In this new diversity measure, there is no need for a reference set of solutions.
To calculate the diversity measure, the following steps are required:

1. Define a grid of K^M cells in objective space, in which M is the number of objectives and K is the number of grid cells in each dimension.
2. Calculate the following arrays:

h(m, k) = 1 if the grid has a representative point in the range b(m, k − 1) ≤ x < b(m, k), and h(m, k) = 0 otherwise, where b(m, k) is the grid edge at the k-th step of dimension m.
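The grid construction above can be sketched in Python. Since the exact form of Eq. (16) is not reproduced here, the sketch assumes the diversity measure is the fraction of occupied grid cells, a common simplified form of grid-based diversity; the function and parameter names are ours, not the paper's:

```python
import numpy as np

def diversity_measure(points, K):
    """Sketch of a grid-based diversity measure: the fraction of the K^M cells
    in objective space that contain at least one non-dominated point.
    The occupied-cell fraction is an assumed stand-in for Eq. (16); 1 is best.

    points: (N, M) array-like of objective vectors; K: cells per dimension.
    """
    points = np.asarray(points, dtype=float)
    lo, hi = points.min(axis=0), points.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)   # avoid division by zero on flat dimensions
    # Cell index of each point along each objective dimension
    idx = np.clip(np.floor((points - lo) / span * K).astype(int), 0, K - 1)
    occupied = {tuple(row) for row in idx}   # distinct cells holding a representative
    return len(occupied) / K ** points.shape[1]
```

For the SBMOGA/NSGA-II comparison reported in the text, a 15 × 15 grid corresponds to K = 15 with M = 2 objectives.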
M. Hakimi-Asiabar et al. / Computers & Industrial Engineering 56 (2009) 1566–1576

Table 3
The Pareto front of the problem obtained using the NSGA-II model for 20, 100, 500, and 1000 generations.

[Numeric columns of Table 3 (solution numbers, F1 and F2 values, and "Dominated by" entries for each generation count) are not reproduced here.]

Fig. 11. Graphical comparison of the optimal front defined by SBMOGA after 100 generations and 3.67 min of run time and NSGA-II after 1000 generations and 18.5 min of run time.
for m = 1, 2, ..., M and k = 1, 2, ..., K.

3. Define the diversity measure using Eq. (16).

The diversity measure of SBMOGA calculated using Eq. (16) is equal to 0.4222, while the diversity index of NSGA-II is equal to 0.2444, on a grid of 15 × 15. In this analysis, the non-dominated solutions of SBMOGA at 100 generations and the non-dominated solutions of NSGA-II at 1000 generations are used. The results show more than a 42% improvement in the diversity measure.

6. Conclusions

In this paper, a new method called SOM-Based Multi-Objective GA (SBMOGA) is proposed to improve the effectiveness of a well-known GA-based multi-objective optimization model, namely NSGA-II. In the new algorithm, in each generation, a population of neurons is trained using elite solutions of the GA's current population. The training rule of the neurons is developed using the concepts of the learning rule of SOM and the variable neighborhood search algorithm to improve both local and global search. Moving the neuron centers along stochastic shaking trajectories in the feasible search space provides better exploration and improves genetic diversity. The training ratio (or training step size) is considered a dynamic, monotonically decreasing function of the GA generation number. This provides a variable neighborhood search when a neuron reaches an optimal region.
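The training rule summarized above (neuron centers pulled toward elite solutions with a monotonically decreasing training ratio, plus VNS-style shaking) can be sketched as follows. The decay schedule, the shaking term, and all parameter names are illustrative assumptions, not the paper's exact settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def update_neuron(w, elite, generation, eta0=0.5, decay=0.05, shake=0.1):
    """One SOM-style update of a neuron's weight vector toward an elite solution.

    w, elite : 1-D arrays in decision space.
    eta      : monotonically decreasing training ratio (illustrative schedule).
    shake    : VNS-style random perturbation radius (assumed form).
    """
    eta = eta0 / (1.0 + decay * generation)             # decreasing training step size
    w_new = w + eta * (elite - w)                       # pull toward the elite solution
    w_new += shake * eta * rng.uniform(-1, 1, w.shape)  # stochastic shaking around it
    return w_new
```

As `generation` grows, the training ratio shrinks, so the search contracts from broad exploration into a local, VNS-like refinement around the high-fitness regions the neurons have accumulated.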
The neurons in the SBMOGA algorithm use the aggregated knowledge of optimum regions in past generations, and new weight vectors for the neurons' centers are defined as a function of the previous weight vectors. After the first few generations, the neurons are gradually attracted by local high-fitness regions, and the searching process is converted into an exploitation process based on the existing aggregated knowledge about the optimum set regions. Then, random fluctuations of the current neuron weight vectors within the high-fitness areas can produce better exploration. The new method is not an independent algorithm. In this paper, it was linked with NSGA-II, and the model was applied to a real-world two-reservoir optimization problem. The results have shown that SBMOGA converges to the optimum front much more quickly than classical NSGA-II and that it can decrease the complexity problems of EAs. The diversity of the non-dominated solutions of SBMOGA was also much better than that of NSGA-II. The main advantages of the proposed algorithm can be summarized as follows: (1) Development of a multi-objective learnable algorithm based on Kohonen's neural network.
(2) In the algorithm, the movement of the SOM unit centers in the feasible search space toward elite solutions of the GA supports an intelligent reproduction technique. As the locations of the GA's elite solutions are stochastic, this provides an intelligent stochastic exploration. (3) The knowledge of optimum areas gathered in past generations of the algorithm is saved in the form of neuron trajectories. This process minimizes the probability of re-evaluating solutions. (4) The shaking process of VNS in local areas around the elite solutions provides better capability for diversity and exploitation and enhances the GA's local search accuracy. (5) Development of a multi-objective learning rule for SOM. (6) In multi-modal objective functions, the algorithm is capable of finding local and global optima. This capability is improved by increasing the number of neurons. (7) When a set of neurons concentrates on a region, it can indicate a local or global optimal area or a cluster of Pareto-optimal solutions. (8) The final position of the neurons can be considered an enhancement of the Pareto front presented by classical multi-objective genetic algorithms.

The area of future research

In this paper, the number of the SOM's neurons is considered fixed (42 in this case). It has been seen that the maximum number of solutions derived by the SOM is equal to the number of its neurons. A future area of research would be defining the optimal number of SOM units. The number of units can also be regarded as an adaptive parameter to improve the model's efficiency.

References

Amor, H. B., & Rettinger, A. (2005). Intelligent exploration for genetic algorithms. GECCO'05, June 25–29, 2005, Washington, DC, USA.
Arroyo, J. E. C., & Armentano, V. A. (2005). Genetic local search for multi-objective flowshop scheduling problems. European Journal of Operational Research, 167, 717–738.
Brown, M., & Smith, R. E. (2005). Directed multi-objective optimisation. International Journal of Computers, Systems and Signals, 6(1).
Büche, D. (2003). Multi-objective evolutionary optimization of gas turbine components. Ph.D. thesis, Swiss Federal Institute of Technology Zürich.
Chootinan, P., & Chen, A. (2006). Constraint handling in genetic algorithms using a gradient-based repair method. Computers & Operations Research, 33, 2263–2281.
Cordón, O., Moya, F., & Zarco, C. (2002). A new evolutionary algorithm combining simulated annealing and genetic programming for relevance feedback in fuzzy information retrieval systems. Soft Computing, 6, 308–319.
Deb, K. (2001). Multi-objective optimization using evolutionary algorithms. New York: Wiley.
Deb, K., Pratap, A., Agarwal, S., & Meyarivan, T. (2002). A fast and elitist multi-objective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2), 182–197.
Drobics, M., Bodenhofer, U., & Winiwarter, W. (2001). Data mining using synergies between self-organizing maps and inductive learning of fuzzy rules. In Joint 9th IFSA World Congress and 20th NAFIPS International Conference.
Feo, T., & Resende, M. (1989). A probabilistic heuristic for a computationally difficult set covering problem. Operations Research Letters, 8, 67–71.
Fleurent, C., & Ferland, J. (1994). Genetic hybrids for the quadratic assignment problem. DIMACS Series in Discrete Mathematics and Theoretical Computer Science, 16.
Fogel, L. J. (1962). Autonomous automata. Industrial Research, 4, 14–19.
Fogel, D. B. (1991). System identification through simulated evolution: A machine learning approach. USA: Ginn Press.
Goldberg, D. E. (1989). Genetic algorithms in search, optimization & machine learning. Boston, MA: Addison-Wesley.
Grosan, C., & Abraham, A. (2007). Hybrid evolutionary algorithms: Methodologies, architectures and reviews. Studies in Computational Intelligence (SCI), 75, 1–17.
Hakimi-Asiabar, M., Ghodsypour, S.
H., Seifi, A., Kerachian, R., & O'Brien, C. (2008). A multi-objective hybrid gradient-based genetic algorithm. In Fifteenth International Working Seminar on Production Economics, Innsbruck, Austria, March 3–7, 2008 (pp. 235–251).
Hansen, P., & Mladenović, N. (2001). Variable neighborhood search: Principles and applications. European Journal of Operational Research, 130, 449–467.
Holland, J. (1975). Adaptation in natural and artificial systems. Ann Arbor: The University of Michigan Press.
Ishibuchi, H., & Murata, T. (1998). A multi-objective genetic local search algorithm and its application to flowshop scheduling. IEEE Transactions on Systems, Man and Cybernetics, 28(3), 392–403.
Jaszkiewicz, A. (2002). Genetic local search for multi-objective combinatorial optimization. European Journal of Operational Research, 137, 50–71.
Karamouz, M., & Mousavi, S. J. (2003). Uncertainty based operation of large scale reservoir systems: Dez and Karoon experience. Journal of the American Water Resources Association, 961–975.
Kernighan, B. W., & Lin, S. (1970). An efficient heuristic procedure for partitioning graphs. Bell System Technical Journal, 49, 291–307.
Khare, V. (2002). Performance scaling of multi-objective evolutionary algorithms. M.Sc. thesis in Natural Computation. Edgbaston, Birmingham B15 2TT, UK: University of Birmingham.
Kohonen, T. (1997). Self-organizing maps. Information Sciences (2nd ed.). Springer.
Kohonen, T., Kaski, S., Lagus, K., Salojärvi, J., Honkela, J., Paatero, V., et al. (2000). Self organization of a massive document collection. IEEE Transactions on Neural Networks, 11, 574–585.
Konak, A., Coit, D. W., & Smith, A. E. (2006). Multi-objective optimization using genetic algorithms: A tutorial. Reliability Engineering and System Safety, 91, 992–1007.
Koza, J. (1992). Genetic programming: On the programming of computers by means of natural selection. The MIT Press.
Kubota, R., Yamakawa, T., & Horio, K. (2004).
Reproduction strategy based on self-organizing map for real-coded genetic algorithms. Neural Information Processing – Letters and Reviews, 5(2).
Kubota, R., Yamakawa, T., & Horio, K. (2005). Reproduction strategy based on self-organizing map for genetic algorithms. International Journal of Innovative Computing, Information and Control, 1(4), 595–607.
Labadie, J. W. (2004). Optimal operation of multireservoir systems: State-of-the-art review. Journal of Water Resources Planning and Management (March/April).
Merz, P., & Freisleben, B. (1999). A comparison of memetic algorithms, tabu search, and ant colonies for the quadratic assignment problem. In International Congress on Evolutionary Computation (CEC99) (pp. 2063–2070). IEEE Press.
Michalewicz, Z. (1996). Genetic algorithms + data structures = evolution programs. Springer-Verlag.
Rangarajan, A., Ravindran, A. R., & Reed, P. (2004). An interactive multi-objective evolutionary optimization algorithm. In Proceedings of the 34th International Conference on Computers & Industrial Engineering, 277–282.
Rauber, A. (1999). LabelSOM: On the labeling of self-organizing maps. In Proceedings of the International Joint Conference on Neural Networks, Washington, DC.
Schaffer, J. D. (1984). Some experiments in machine learning using vector evaluated genetic algorithms. Ph.D. thesis. Nashville, TN: Vanderbilt University.
Schwefel, H.-P. (1975). Evolutionsstrategie und numerische Optimierung. Ph.D. thesis.
Schwefel, H.-P. (1981). Numerical optimization of computer models. Chichester: Wiley.
Schwefel, H.-P. (1995). Evolution and optimum seeking. Sixth-generation computer technology series. John Wiley and Sons.
Srinivas, N., & Deb, K. (1994). Multi-objective function optimization using nondominated sorting genetic algorithms. Evolutionary Computation Journal, 2(3), 221–248.
Ultsch, A., & Korus, D. (1995).
Integration of neural networks with knowledge-based systems. In IEEE International Conference on Neural Networks, Perth.
Van Veldhuizen, D. A., & Lamont, G. B. (2000). Multiobjective evolutionary algorithms: Analyzing the state-of-the-art. Evolutionary Computation Journal, 8(2), 125–147.
Yamakawa, T., Horio, K., & Hiratsuka, T. (2002). Advanced self-organizing maps using binary weight vector and its digital hardware design. In Proceedings of the 9th International Conference on Neural Information Processing (Vol. 3, pp. 1330–1335).
Zitzler, E., & Thiele, L. (1999). Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach. IEEE Transactions on Evolutionary Computation, 3(4), 257–271.