ADM2006 - Paper No. 309
Proceedings of the International Conference on Advanced Design and Manufacture, 8-10 January 2006, Harbin, China

A NOVEL EVOLUTIONARY ALGORITHM BASED ON NUMBER THEORETIC NET FOR NONLINEAR OPTIMIZATION

Fei Gao
Dep. of Mathematics, Wuhan University of Technology, 430070, P. R. China
E-mail: e_e4@163.com

Abstract: How to detect the global optima of a complex function is an important problem in diverse scientific fields. Deterministic optimization strategies, such as the Newton-family algorithms, have been widely applied to the detection of global optima. However, for discontinuous, non-differentiable, or non-convex complex functions this approach is not valid. In such cases, stochastic optimization strategies simulating the evolution process have proved to be a valuable tool. In this paper, a novel evolutionary algorithm based on the number theoretic net is introduced for detecting the global optima of complex functions. It can be applied to any function with multiple local optima. With some established techniques, such as the ideas of genetic algorithms and sequential number theoretic optimization to improve convergence at large scale, and the deflection and stretching of the objective function to guarantee the detection of a different minimizer, it detects the optima of a function by adding genetic operations to the feasible points generated sequentially by the number theoretic net. The experiments done indicate that the proposed algorithm is robust and efficient.

Key words: Optimization, Evolutionary Algorithm, Number Theoretic Net, Mutation, Deflection

1 Introduction

Detecting the global optima of complex functions is an important problem in diverse scientific fields. Mathematically, complex functions are discontinuous, non-differentiable, or non-convex, and full of local optima. An extensive literature review [1,2] indicates that most existing deterministic optimization strategies for complex functions, such as the Newton-family algorithms, are limited to optimization within a local area; this has motivated extensive study of stochastic approaches to global optimization, such as genetic algorithms, evolutionary algorithms, simulated annealing, etc.

Stochastic methods have been proved capable of finding the global optimum through asymptotic convergence. These methods require a primary stochastic evolving form and converge to one optimum. The samples generated by evolving operators such as selection, mutation, and acceptance gradually concentrate around the global optimum rather than a local one, at the price of considerable computational effort. This cost may be reduced by improving the distribution of the sample points so as to mine as much information from them as possible.

This paper proposes a novel approach, the number theoretic evolutionary algorithm (NTEA), for the global optimization of complex functions, combining the idea of sequential number theoretic optimization (SNTO) [3~5] with concepts from evolutionary algorithms. The method adds updating rules, built from mutation and other genetic operations, to the feasible points generated by the number theoretic net. The experiments done show that, to a certain extent, NTEA has the advantages of high precision and high robustness on multimodal functions' optimization problems.

The rest of this paper is organized as follows. In Section 2 we first introduce the background of our research, i.e. the details of SNTO and the main idea of evolutionary algorithms (EA); then the previous work on extending SNTO and EA is discussed.
In Section 3 we propose the novel approach NTEA, with the established techniques of function deflection and added evolutionary operations, and examine the effectiveness of the method by applying it to classical benchmark functions. Section 4 concludes the paper.

2 Backgrounds

2.1 Number-theoretic methods' sequential realization

The sequential realization of the number-theoretic method, SNTO, is a global optimization approach based on number theory, due to Hua L. K. et al. In this method a number theoretic net is used to generate good points, in the form of good lattice points scattered over the feasible domain; then, by sequentially contracting the search region, it finds the global optimum in theory.

Definition 1 [4]: Suppose $(n; h_1, \dots, h_s)$ is an integer vector such that $s < n$, $1 \le h_i < n$, $h_i \ne h_j$ $(i \ne j)$, and the greatest common measure $\gcd(n, h_i) = 1$ for $i = 1, \dots, s$. Let

$q_{k,i} \equiv k h_i \pmod{n}$, with the common congruence method modified so that $1 \le q_{k,i} \le n$,
$x_{k,i} = \dfrac{2 q_{k,i} - 1}{2n}$, $\quad i = 1, \dots, s;\ k = 1, \dots, n$.

Then $P_n = \{ x_k = (x_{k,1}, \dots, x_{k,s}),\ k = 1, \dots, n \}$ is the lattice point set of the generating vector $(n; h_1, \dots, h_s)$.

Suppose $f(x)$ is a continuous function on a bounded and closed set $D$. Take an NT-net $P = \{x_k,\ k = 1, 2, \dots, n\} \subset D$ and let $M^{(n)} = f(x^{(n)}) = \max_{1 \le k \le n} f(x_k)$, where $x^{(n)} \in \{x_k\}$ is the point attaining the maximum. We expect $x^{(n)} \to x^*$ and $M^{(n)} \to M^*$ as $n \to \infty$.

The essence of the number theory method is to replace random points by uniformly scattered points on the $s$-dimensional unit cube. When SNTO is used to solve a multi-modal function optimization problem, it requires not $f(x)$'s derivative at a point P but only $f(x)$'s value at P, so it is independent of derivatives and of the choice of initial points. The main procedure of SNTO [4] is given below (see the code sketch later in this section):

Algorithm 1. SNTO
Step 0: Initialization. $t = 0$, $D^{(0)} = D$, $a^{(0)} = a$, $b^{(0)} = b$.
Step 1: Generate an NT-net $P^{(t)}$ uniformly scattered on $[a^{(t)}, b^{(t)}]$ by a number-theoretic method.
Step 2: Compute the new fitness. Note $P^{(t)} \subset [a^{(t)}, b^{(t)}]$ and set $\{x^{(-1)}\} = \emptyset$; take $x^{(t)} \in Q^{(t)} = P^{(t)} \cup \{x^{(t-1)}\}$ s.t. $f(x^{(t)}) \ge f(y)$ for all $y \in Q^{(t)}$; then $x^{(t)}$ and $M^{(t)} = f(x^{(t)})$ are the current best values of $x^*$, $M^*$.
Step 3: Termination check. Given sufficiently small $\delta > 0$, let $c^{(t)} = (b^{(t)} - a^{(t)})/2$; if $\max_i c_i^{(t)} < \delta$, accept $x^{(t)}$, $M^{(t)}$; otherwise go to Step 4.
Step 4: Region contraction. Define a new region $D^{(t+1)} = [a^{(t+1)}, b^{(t+1)}]$ by
$a_i^{(t+1)} = \max(x_i^{(t)} - \gamma c_i^{(t)},\ a_i^{(t)})$, $\quad b_i^{(t+1)} = \min(x_i^{(t)} + \gamma c_i^{(t)},\ b_i^{(t)})$, $\quad i = 1, 2, \dots, s$,
where $\gamma \in (0, 1)$ is a pre-given contraction ratio, generally $\gamma = 0.5$. Set $t = t + 1$ and go to Step 1.

2.2 Evolutionary algorithm

Evolution has created wonders like ourselves, and there is much to be learned from Nature. Evolutionary computation is the study of computational systems that draw ideas and inspiration from natural evolution. The evolutionary algorithm (EA) [6~9] has been widely used for solving optimization problems. In an EA, an initial population is generated from random points in the feasible region of the problem, and the fittest individuals among a generation, those proven to have greater fitness for the environment, survive with higher probability into the next generation. The next generation is formed through evolutionary operations such as crossover, mutation, and saving the best of the respective individuals. Solution searching is pursued by repeating a series of these operations.
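Both the EA loop just described and the SNTO procedure of Section 2.1 are multi-point searches. To make the latter concrete, here is a minimal Python sketch of the NT-net construction of Definition 1 and of Algorithm 1. It is our illustration under simplifying assumptions, not the authors' Matlab implementation: it minimizes rather than maximizes, reuses a single generating vector, and fixes the parameters shown, whereas the experiments in Section 3.3 switch generating vectors between cycles.

```python
import numpy as np

def glp_set(n, h):
    """Lattice point set of generating vector (n; h_1,...,h_s), Definition 1:
    q_{k,i} = k*h_i mod n (adjusted into 1..n), x_{k,i} = (2*q_{k,i}-1)/(2n)."""
    k = np.arange(1, n + 1).reshape(-1, 1)       # k = 1, ..., n
    q = (k * np.asarray(h)) % n
    q[q == 0] = n                                # modified congruence: 1 <= q <= n
    return (2.0 * q - 1.0) / (2.0 * n)           # n points in the unit cube (0,1)^s

def snto(f, a, b, n=987, h=(1, 610), delta=1e-6, gamma=0.5, max_iter=200):
    """Minimal SNTO loop (Algorithm 1), here minimizing f over the box [a, b]."""
    a = np.asarray(a, dtype=float).copy()
    b = np.asarray(b, dtype=float).copy()
    base = glp_set(n, h)                         # NT-net on the unit cube
    x_best, f_best = None, np.inf
    for _ in range(max_iter):
        pts = a + base * (b - a)                 # Step 1: net scaled onto [a, b]
        vals = np.array([f(p) for p in pts])     # Step 2: evaluate, keep the best
        i = int(vals.argmin())
        if vals[i] < f_best:
            x_best, f_best = pts[i].copy(), vals[i]
        c = (b - a) / 2.0                        # Step 3: termination check
        if c.max() < delta:
            break
        a = np.maximum(x_best - gamma * c, a)    # Step 4: region contraction
        b = np.minimum(x_best + gamma * c, b)
    return x_best, f_best

# Usage: minimize the Bohachevsky 1 function (eg.4 in Section 3.2) over [-100, 100]^2.
bohachevsky1 = lambda x: (x[0]**2 + 2*x[1]**2 - 0.3*np.cos(3*np.pi*x[0])
                          - 0.4*np.cos(4*np.pi*x[1]) + 0.7)
print(snto(bohachevsky1, [-100, -100], [100, 100]))
```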
Returning to the EA: we expect that the number of individuals with higher fitness (that is, those closer to optimal solutions) increases as the search progresses, so that an optimal solution can eventually be achieved. The above describes the basic concept of EA.

2.3 Previous studies

Like other global optimization methods, SNTO is exhaustive and deterministic, and it can fail to jump out of local optima. At the beginning of the method there is a large sample of points, but little information is mined from them; mining these points could generate more effective points globally. Evolutionary algorithms are self-adaptive heuristic methods simulating the evolution process, effective on complex functions' optimization problems, and similar to SNTO in their "multi-point" character; this suggests combining the idea of SNTO with concepts from evolutionary algorithms.

Previous studies combining evolutionary computation techniques with number-theoretic methods have mainly proceeded in two ways: first, using a number theoretic net to generate the initial feasible points for an EA [10]; second, parallel strategies [11] for SNTO that use a small initial feasible population and apply the evolutionary crossover operator to these points.

3 NTEA and Numerical Experiments

3.1 Proposed technique NTEA

Based on the above-mentioned studies, we propose a novel number-theoretic evolutionary algorithm (NTEA) that performs evolutionary operations (mainly mutation) on the equi-distributed set $P_n$ generated by the number theoretic net in SNTO, while keeping SNTO's advantages. Our technique modifies only Step 2 of SNTO; in detail, NTEA is given as below:

Algorithm 2. NTEA
Step 2: Compute the new fitness. Note $P_0^{(t)} = P^{(t)} \subset [a^{(t)}, b^{(t)}]$. Mutate each $x_i \in P_0^{(t)}$ with probability $p$: let

$x'_{i,k} = \begin{cases} x_{i,k} + \Delta(t, y_k), & \text{rand} = 0 \\ x_{i,k} - \Delta(t, y_k), & \text{rand} = 1 \end{cases}$, where $y_k = b_k^{(t)} - a_k^{(t)}$, $\quad \Delta(t, y) = y\,\bigl[1 - r^{(1 - t/T)^b}\bigr]$,

$T$ denotes the algorithm's largest iteration number, $t$ is the current iteration number, $b$ is a coefficient, and $r = \text{rand}[0, 1]$; this yields $P_1^{(t)}$. Then do crossover on $P_1^{(t)}$ as below: suppose $x_1, x_2 \in P_1^{(t)}$, $\beta$ is a random number on $[0, 1]$, and $k$ is a random integer on $[1, s]$; then $x'_{1,k} = \beta\, x_{1,k} + (1 - \beta)\, x_{2,k}$, and we get $P_2^{(t)}$. Set $\{x^{(-1)}\} = \emptyset$ and take $x^{(t)} \in Q^{(t)} = P_0^{(t)} \cup P_1^{(t)} \cup P_2^{(t)} \cup \{x^{(t-1)}\}$ s.t. $f(x^{(t)}) \ge f(y)$ for all $y \in Q^{(t)}$; then $x^{(t)}$, $M^{(t)}$ are the current best values of $x^*$, $M^*$.

If the objective function $f(x)$ is full of local optima and more than one minimizer is needed, we adopt other established techniques, deflection and stretching, to guarantee the detection of a different minimizer. Suppose the objective function is $f(x)$; we use the deflection technique below to generate the new objective function $F(x)$ [8]:

$F(x) = f(x) \prod_{i=1}^{k} \bigl[ \tanh\bigl( \lambda_i \| x - x_i^* \| \bigr) \bigr]^{-1}$,

where $x_i^*$ ($i = 1, 2, \dots, k$) are the $k$ minimizers found so far and $\lambda_i \in (0, 1)$. We also introduce the stretching technique [8] to generate $G(x)$ and $H(x)$ as new objective functions:

$G(x) = f(x) + \beta_1 \| x - x_i^* \| \bigl[ 1 + \operatorname{sgn}(f(x) - f(x_i^*)) \bigr]$,
$H(x) = G(x) + \beta_2 \dfrac{1 + \operatorname{sgn}(f(x) - f(x_i^*))}{\tanh\bigl(\delta\,(G(x) - G(x_i^*))\bigr)}$,

where $\beta_1, \beta_2, \delta > 0$.

Fig. 1 shows the deflection and stretching effects on $f(x) = \cos x$ at $x = \pi$.

[Fig. 1: Deflection and stretching effects on f(x)]

In this way the searching algorithm will no longer locate $x = \pi$, so we can combine this technique with NTEA to find the further minimizers needed more efficiently.
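As a small illustration of the deflection transform above, here is a hedged Python sketch (our code, not the authors'). Note that deflection lifts a found minimizer only when the objective is positive there, so shifting $\cos x$ up by 2 for the demonstration is our added assumption:

```python
import numpy as np

def deflect(f, minimizers, lam=0.5):
    """Deflection: F(x) = f(x) * prod_i [tanh(lam * ||x - x_i*||)]^(-1).
    For f > 0, F has no minimizer at any recorded x_i*, so a restarted
    search is pushed toward a different minimizer."""
    def F(x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        denom = np.prod([np.tanh(lam * np.linalg.norm(x - np.atleast_1d(m)))
                         for m in minimizers])
        return f(x) / denom
    return F

# Echoing Fig. 1: deflect cos(x) at its minimizer x* = pi.
# The +2 shift keeps the objective positive (our assumption for the demo).
f = lambda x: np.cos(x[0]) + 2.0
F = deflect(f, [np.array([np.pi])])
for x in (np.pi - 0.5, np.pi - 0.05):
    xa = np.array([x])
    print(f"f({x:.3f}) = {f(xa):.4f}   F({x:.3f}) = {F(xa):.4f}")
# F grows without bound as x -> pi, so a search will not settle there again.
```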
3.2 Benchmark functions

To check the effectiveness of these algorithms, we performed a series of experiments. First we define standard functions [1], introduced mainly by De Jong and often used as a quantitative evaluation means in benchmark tests of EA and other techniques (a short transcription of several of them into code is given after Section 3.3). They are defined as below:

eg.1. $f_1(x, y) = 21.5 + x \sin(4\pi x) + y \sin(20\pi y)$, $-3.0 \le x \le 12.1$, $4.1 \le y \le 5.8$; its global maximum value is known to be 38.827553 [1]. This is a classical multimodal function whose global optimum is surrounded by many local optima.

eg.2. $f_2(x, y) = -[20 + x \sin(9\pi y) + y \cos(25\pi x)]$, where $(x, y) \in D = \{(x, y) : x^2 + y^2 \le 9^2\}$; its minimum value is $-32.71788780688353$, and the corresponding global minimum point is $(-6.44002582194051, -6.27797204163553)$ [9]. Fig. 2 shows $f_2$ on $D$, where its value outside $D$ is set to $-40$.

[Fig. 2: f2 on D]

$f_2$ is a multimodal function that is difficult to optimize: its domain $D$ is large, $x \sin(9\pi y)$ and $y \cos(25\pi x)$ oscillate in different directions, and deep valleys cluster near four points.

eg.3. Easom function $f_3(x_1, x_2) = -\cos x_1 \cos x_2 \exp[-(x_1 - \pi)^2 - (x_2 - \pi)^2]$; its global minimum is known to be $-1$, with corresponding global minimum point $(\pi, \pi)$.

eg.4. Bohachevsky 1 function $f_4(x_1, x_2) = x_1^2 + 2 x_2^2 - 0.3 \cos(3\pi x_1) - 0.4 \cos(4\pi x_2) + 0.7$.

eg.5. Bohachevsky 2 function $f_5(x_1, x_2) = x_1^2 + 2 x_2^2 - 0.3 \cos(3\pi x_1) \cdot \cos(4\pi x_2) + 0.3$.

eg.6. Schaffer function $f_6(x_1, x_2) = 0.5 + \dfrac{\sin^2\sqrt{x_1^2 + x_2^2} - 0.5}{\bigl[ 1.0 + 0.001 (x_1^2 + x_2^2) \bigr]^2}$, also called the Multimodal Sine Envelope Sine Wave Function [2].

eg.7. $f_7(x_1, x_2) = (x_1^2 + x_2^2)^{0.25} \bigl[ 1.0 + \sin^2\bigl( 50 (x_1^2 + x_2^2)^{0.1} \bigr) \bigr]$.

eg.3~eg.7 are defined on $-100 \le x_1, x_2 \le 100$; their global minimum is known to be 0, with corresponding global minimum point $(0, 0)$, and they are classic benchmark functions for evaluating an EA's global convergence and local exploration performance.

3.3 Comparison of results among EA, SNTO and NTEA

For all the tested functions we take the same generating vectors for the number-theoretic net: in the first cycle we take $(n; h_1, h_2) = (987; 1, 610)$, and in later cycles $(233; 1, 144)$. To avoid non-convergence in long-running circulation we replace $\max c^{(t)} < \delta$ by $\min c^{(t)} < \delta$, generally take $\delta = 10^{-15}$, $r = 0.6$, and add the mutation operator in the contraction region.

In particular, for $f_1$ we choose mutation ratio 0.05 and require the new optimum to be no smaller than the old one. For $f_2$ we take $\delta = 10^{-8}$, $r = 0.86$, mutation ratio 0.85, set the function value to 3 at points outside the defined region, and adopt a new mutation method: generate a random number $\alpha$ on $[0, 1]$; if $\alpha > 0.95$ or $\alpha < 0.05$, mutate within the defined region, otherwise mutate within the corresponding contraction region. For $f_6$ the requirement that the new optimum improve on the old one is not applied strictly. For $f_7$ we take $\delta = 10^{-30}$ and likewise do not apply the improvement requirement strictly.

We program NTEA in Matlab 6.5 and contrast its performance with SNTO and with a simple evolutionary algorithm (SEA). The global search capability, convergence speed and robustness of the proposed algorithms are exhibited in Figs. 3~6: Fig. 3 contrasts the simulation progress of the best, average and worst function values over the iterations for $f_1$; Fig. 4 shows the local convergence of NTEA for $f_2$; Fig. 5 shows the progress of optimization on the contour map of Easom; and Fig. 6 contrasts the simulation progress of the three algorithms, SEA, SNTO and NTEA, for $f_6$.
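As promised above, several of the benchmark functions of Section 3.2 transcribe directly into code. The following is our Python transcription (not part of the paper's Matlab programs), with quick sanity checks against the stated optima:

```python
import numpy as np

def f1(x, y):
    """eg.1: multimodal; global maximum ~38.827553 on [-3, 12.1] x [4.1, 5.8]."""
    return 21.5 + x * np.sin(4 * np.pi * x) + y * np.sin(20 * np.pi * y)

def f3(x1, x2):
    """eg.3 (Easom): global minimum -1 at (pi, pi)."""
    return -np.cos(x1) * np.cos(x2) * np.exp(-(x1 - np.pi)**2 - (x2 - np.pi)**2)

def f6(x1, x2):
    """eg.6 (Schaffer): global minimum 0 at (0, 0)."""
    r2 = x1**2 + x2**2
    return 0.5 + (np.sin(np.sqrt(r2))**2 - 0.5) / (1.0 + 0.001 * r2)**2

# Sanity checks against the optima stated in Section 3.2:
print(f3(np.pi, np.pi))   # -1.0
print(f6(0.0, 0.0))       #  0.0
```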
[Fig. 3: Comparison of best, average and worst fitness for f1]
[Fig. 4: Local convergence of NTEA for f2 in the contour map]
[Fig. 5: Finding the optimum of Easom in the contour map]
[Fig. 6: Comparison of best fitness for f6 among SEA, SNTO and NTEA]

Table 1 and Table 2 give the contrasted optima found by NTEA, SNTO and SEA, together with the points, found by NTEA, corresponding to the optimal values in Table 1.

With the deflection technique, we can detect 6 optimum points of $f_1$ with the same value:
No.1 [11.6255447026864, 5.72504424431332]
No.2 [11.6255447027255, 5.7250442445237]
No.3 [11.6255447026949, 5.72504424490065]
No.4 [11.6255447027843, 5.72504424471521]
No.5 [11.6255447027431, 5.72504424413162]
No.6 [11.6255447027118, 5.72504424463833]

And 9 minimizers of $f_2$ with the same value are also found:
No.1 [-6.44002582207099, -6.2779720144943]
No.2 [-6.44002582235322, -6.27797201356924]
No.3 [-6.4400258222786, -6.2779720141167]
No.4 [-6.44002582226788, -6.27797201447922]
No.5 [-6.44002582235673, -6.27797201347218]
No.6 [-6.44002582213847, -6.27797201162225]
No.7 [-6.44002582211448, -6.27797201373397]
No.8 [-6.44002582217536, -6.27797201223602]
No.9 [-6.44002582208887, -6.27797201222687]

From Table 1 and Table 2 we can conclude that the performance of NTEA is much better than that of SNTO and SEA, especially for $f_1$, $f_2$, $f_6$ and $f_7$; for $f_4$ and $f_5$, NTEA does as well as SNTO and is increasingly superior to SEA.

4 Conclusions

Case studies illustrate that NTEA, with the established techniques of Section 3, can improve the global convergence speed of complex function optimization and has, to a certain extent, the advantages of high precision and robustness. In this paper we proposed a technique in which SNTO is combined with concepts from evolutionary computation, and verified its effectiveness using standard functions. The improved version, NTEA, inherits the features of both: simple SNTO is strong at raw exploration, while EA is robust on multimodal functions. In the end, through experiments, NTEA gave better performance than either SNTO or EA.

In future work we will study further the adjustment of the parameters of the genetic operations in NTEA and their relation to the algorithm's stability, so as to improve its robustness and precision still more, and we plan to apply this technique to more complicated and difficult problems. We would also like to develop other well-performing algorithms by combining EA concepts with the proposed algorithms.

ACKNOWLEDGEMENT

We thank the anonymous reviewers for their helpful remarks and comments. F. Gao was supported by the Foundation (Grant No. XJJ2004113), Project of educational research, and the UIRT Project (Grant No.
A156, A157) granted by Wuhan University of Technology, People's Republic of China.

References
[1] Michalewicz Z. Genetic Algorithms + Data Structures = Evolution Programs. Springer-Verlag.