[Discussion]
Request for translation of a paper
Abstract: By combining the population aspect of genetic algorithms (GAs) with the simulated annealing algorithm (SAA), a novel algorithm, called the fast annealing evolutionary algorithm (FAEA), is proposed. The algorithm is similar to the annealing evolutionary algorithm (AEA), and a very fast annealing technique is adopted for the annealing procedure. By applying the algorithm to the optimization of test functions and comparing it with other stochastic optimization methods, it is shown that the algorithm is a highly efficient optimization method. It was also applied to the optimization of Lennard–Jones clusters and compared with other methods in this study. The results indicate that the algorithm is a good tool for the energy minimization problem. © 2002 John Wiley & Sons, Inc. J Comput Chem 23: 427–435, 2002; DOI 10.1002/jcc.10029

Key words: annealing evolutionary algorithm; global optimization; Lennard–Jones clusters

Introduction

Determination of the global energetic minima of large molecules or atomic clusters is one of the challenging problems in computational chemistry because of the large number of local minima. Because the number of local minima tends to grow exponentially with the number of molecules or atoms, the task of minimizing the energy of complicated molecules or atomic clusters is notoriously difficult. To solve this problem, different optimization methods have been proposed, such as genetic algorithms,1–5 simulated annealing,6–8 quantum annealing,9 potential deformation,10 hierarchical searches,11 etc.

Genetic algorithms (GAs) are known as a new kind of optimization technique for tackling complicated optimization tasks.12,13 A population of random bit (or digital) strings is used as the starting solution trials. Then a cyclic process of evaluation, selection, recombination, and mutation is repeated to yield an optimized solution, based on the simulation of natural selection, genetics, and evolution according to Darwin's principles of the struggle for life and survival of the fittest. GAs are therefore stochastic search heuristics in which an explorative and an exploitative mechanism cooperate, which makes them suitable for the optimization of complex hypersurfaces with many local minima or maxima. Within the past decade, applications of GAs to alloy systems,14 atomic and molecular clusters,15,16 molecular modeling,17 protein design,18 molecular recognition,19 protein folding,20 and conformational analysis21 have been reported.

The simulated annealing algorithm (SAA) is a stochastic optimization method based on the Monte Carlo importance-sampling technique.22 It starts from an initial point and takes a single-point iterative strategy. The mechanism of accepting not only evolved but also degenerated solutions according to the Metropolis acceptance criterion during the annealing procedure gives the SAA the potential to find the global minimum instead of falling into local minima. As a global optimization algorithm, the SAA has been widely used to fit nonconvex cost functions arising in a variety of problems, such as curve fitting,23 conformational analysis,24 analysis of atom–atom interactions,6 and optimization of molecular clusters.8
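For readers less familiar with the acceptance rule mentioned above, here is a minimal Python sketch of the Metropolis criterion; the function name and the sign convention for the energy difference are illustrative choices of mine, not taken from the paper.

```python
import math
import random

def metropolis_accept(E_old, E_new, T):
    """Metropolis acceptance criterion used in simulated annealing.

    Downhill moves (lower energy) are always accepted; uphill moves are
    accepted with probability exp(-(E_new - E_old) / T), which is how the
    SAA can escape local minima at finite temperature.
    """
    dE = E_new - E_old
    if dE <= 0.0:
        return True
    return random.random() < math.exp(-dE / T)
```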
In this article, a novel algorithm combining the population aspect of GAs with the simulated annealing procedure is proposed. Because this algorithm is based on the annealing evolutionary algorithm (AEA)25 and the very fast annealing technique26,27 is adopted for the annealing procedure, it is called the fast annealing evolutionary algorithm (FAEA). To assess the algorithm, we applied the FAEA to a set of standard test functions and compared the results with those of some other stochastic methods, which is the usual procedure for any proposed optimization method. Furthermore, an application of the algorithm to the optimization of Lennard–Jones (LJ) clusters28–30 is also investigated. It is shown that the algorithm is a good tool for energy minimization problems. Both the structures and the energies are in good agreement with theoretical and calculated results in the literature.

Correspondence to: W. Cai; e-mail: wscai@ustc.edu.cn. Contract/grant sponsor: Natural Science Foundation of China; contract/grant number: 29975027. Contract/grant sponsor: Foundation of University of Science and Technology of China for Young Scientists. © 2002 John Wiley & Sons, Inc.

Method

Very Fast Annealing (VFA)

Simulated annealing was essentially introduced as a Monte Carlo importance-sampling technique for computing large-dimension path integrals arising in statistical physics problems. The method consists of three functions: (1) g(x), the probability density of the state space of the D parameters x = {x_i; i = 1, ..., D}; (2) h(Δx), the probability density for acceptance of a new cost-function value given the just-previous value; and (3) T(k), the schedule for annealing the temperature T in annealing-time steps k, i.e., for changing the volatility or fluctuation of the two previous probability densities. Generally, h(Δx) = 1/(1 + exp(ΔE/T)) ≈ exp(−ΔE/T), but different g(x) and T(k) are used in different algorithms.

To statistically assure that any point in x-space can be sampled infinitely often in annealing time (IOT), it suffices to prove that the product of the probabilities of not generating a state x IOT for all annealing times successive to k_0 yields zero,26,27

\prod_{k=k_0}^{\infty} (1 - g_k) = 0 \qquad (1)

which is equivalent to

\sum_{k=k_0}^{\infty} g_k = \infty \qquad (2)

VFA was proposed by Ingber,26,27 in which each parameter has its own annealing schedule. For an x-space of D parameters, a new solution x_i ∈ [A_i, B_i] is generated by

x_i^{k+1} = x_i^{k} + y_i (B_i - A_i) \qquad (3)

where k is the annealing-time index, and

y_i = \mathrm{sgn}(u_i - \tfrac{1}{2})\, T_i \left[ (1 + 1/T_i)^{|2u_i - 1|} - 1 \right] \qquad (4)

where u_i ∈ U[0, 1] is a random number with uniform distribution. The generating function corresponding to eq. (4) is

g(y) = \prod_{i=1}^{D} \frac{1}{2(|y_i| + T_i)\,\ln(1 + 1/T_i)} \qquad (5)

If the annealing schedule of the temperature T(k) is calculated by

T_i(k) = T_{0i} \exp(-c_i k^{1/D}) \qquad (6)

the following equation can be obtained,

\sum_{k=k_0}^{\infty} g_k \approx \sum_{k=k_0}^{\infty} \left[ \prod_{i=1}^{D} \frac{1}{2|y_i| c_i} \right] \frac{1}{k} = \infty \qquad (7)

where c_i = m_i exp(−n_i/D), and m_i, n_i are parameters that control the annealing schedule and can differ for each specific problem. To simplify the calculation, the same values of m_i and n_i are used for every parameter in this study.

To compare Boltzmann annealing (BA), fast annealing (FA), and VFA, the Boltzmann distribution used in BA, the Cauchy distribution used in FA, and the generating function g(y) of eq. (5) are illustrated in Figure 1. VFA clearly has the fattest tail of the three, permitting easier access to test local minima in the search for the desired global minimum. Owing to the character of the generating function g(y) and the annealing schedule T(k), VFA converges very fast.26,27 For complicated optimization problems, however, it inevitably converges to local minima.
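To make eqs. (3), (4), and (6) concrete, here is a minimal Python sketch of one VFA generation step, assuming a single m and n shared by all parameters as described above; the function name vfa_step, the NumPy types, and the clipping of out-of-range trials back into [A_i, B_i] are my own choices (the excerpt does not say how out-of-range moves are handled).

```python
import numpy as np

def vfa_step(x, k, A, B, T0, m, n, rng):
    """One very fast annealing (VFA) generation step, following eqs. (3), (4), (6).

    x    : current solution, shape (D,)
    k    : annealing-time index
    A, B : lower/upper bounds of each parameter, shape (D,)
    T0   : initial temperature(s) for the schedule
    m, n : schedule-control parameters, with c = m * exp(-n / D)
    rng  : numpy.random.Generator
    """
    D = x.size
    c = m * np.exp(-n / D)                    # c_i = m_i exp(-n_i / D)
    T = T0 * np.exp(-c * k ** (1.0 / D))      # eq. (6): T_i(k) = T_0i exp(-c_i k^(1/D))
    u = rng.uniform(size=D)
    # eq. (4): y_i = sgn(u_i - 1/2) * T_i * [(1 + 1/T_i)^|2u_i - 1| - 1], with y_i in [-1, 1]
    y = np.sign(u - 0.5) * T * ((1.0 + 1.0 / T) ** np.abs(2.0 * u - 1.0) - 1.0)
    x_new = x + y * (B - A)                   # eq. (3)
    return np.clip(x_new, A, B), T            # keep the trial inside [A_i, B_i]
```

A caller would create rng = np.random.default_rng(), loop over k = 1, 2, ..., and stop once T falls below the exit temperature T0*exp(-m) mentioned in the next section.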
Flowchart of the Fast Annealing Evolutionary Algorithm (FAEA)

Generally, an AEA combines the population and selection procedure of GAs with the annealing procedure of the SAA. The AEA improves a population of candidate solutions through the selection operation instead of the single-point iterative strategy used by the SAA, and therefore has a larger possibility of escaping from local minima than the SAA.25 In our experience, however, for complicated optimization problems such as the optimization of molecular or atomic clusters, the AEA often converges to a local minimum. By examining the AEA calculations, it was found that this early convergence may be caused by the selection operation: although selection improves the fitness of the population, it reduces the diversity of the population. Therefore, in this study, the selection operation is not used in the FAEA. Furthermore, to make the algorithm converge effectively and guarantee that it finds the global minimum, the VFA technique, similarity checks between individuals in the population, and a local search procedure are adopted. The whole procedure of the FAEA is shown in Figure 2 and includes the following steps (a rough code sketch of this loop is given after the list).

1. Prepare the parameters of the algorithm: m, n, ns (the length of the Markov chain), the initial temperature T0 for the annealing schedule, the population size npop for the evolution, the number of parameters or atoms in the cluster N, and the range of each parameter. The stop criterion of the algorithm, Texit, is calculated as T0 exp(−m).26,27
2. According to the atom number and the population size, the population is randomly initialized and evaluated. The value of every gene in each individual is generated as A + u(B − A), where u ∈ U[0, 1] is a random number with uniform distribution and A, B are the bounds of the corresponding parameter.
3. According to eq. (6), calculate the current temperature T.
4. As discussed above, VFA converges very fast, so it is important to keep diversity in the population in order to avoid converging to local minima. Therefore, in the FAEA, a procedure to check the similarity between individuals is developed, i.e., during …
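Since the excerpt breaks off in step 4, the following self-contained Python sketch of the FAEA main loop is only my reading of steps 1–4 plus the SAA/VFA background above: the per-individual Metropolis-style acceptance, the simple distance-based similarity check with threshold sim_tol, and the clipping of out-of-range trials are assumptions, and the local search step is omitted entirely.

```python
import numpy as np

def faea_minimize(cost, A, B, npop=20, T0=1.0, m=7.0, n=1.0, ns=10,
                  sim_tol=1e-3, seed=0):
    """Rough sketch of the FAEA loop described in steps 1-4 of the excerpt.

    cost : callable mapping a parameter vector of shape (D,) to a scalar energy
    A, B : per-parameter lower/upper bounds, arrays of shape (D,)
    """
    rng = np.random.default_rng(seed)
    D = A.size
    c = m * np.exp(-n / D)                  # same c for every parameter
    T_exit = T0 * np.exp(-m)                # step 1: stop criterion Texit = T0 exp(-m)

    # step 2: random initial population, gene = A + u (B - A) with u ~ U[0, 1]
    pop = A + rng.uniform(size=(npop, D)) * (B - A)
    energies = np.array([cost(x) for x in pop])

    k, T = 1, T0
    while T > T_exit:
        T = T0 * np.exp(-c * k ** (1.0 / D))        # step 3: eq. (6)
        for _ in range(ns):                         # Markov chain of length ns at this T
            for i in range(npop):
                u = rng.uniform(size=D)
                y = np.sign(u - 0.5) * T * ((1.0 + 1.0 / T) ** np.abs(2.0 * u - 1.0) - 1.0)
                trial = np.clip(pop[i] + y * (B - A), A, B)   # eq. (3), clipped (assumption)
                E_trial = cost(trial)
                # assumed Metropolis-style acceptance, inherited from the SAA
                if E_trial < energies[i] or rng.uniform() < np.exp(-(E_trial - energies[i]) / T):
                    pop[i], energies[i] = trial, E_trial
        # step 4 (stub): re-randomize individuals that have become too similar,
        # to keep diversity; the paper's exact similarity rule is cut off in the excerpt
        for i in range(npop):
            for j in range(i + 1, npop):
                if np.linalg.norm(pop[i] - pop[j]) < sim_tol:
                    pop[j] = A + rng.uniform(size=D) * (B - A)
                    energies[j] = cost(pop[j])
        k += 1

    best = int(np.argmin(energies))
    return pop[best], energies[best]
```

As a smoke test, faea_minimize(lambda x: float(np.sum(x**2)), A=np.full(3, -5.0), B=np.full(3, 5.0)) should push the best individual toward the origin; suitable m, n, ns, and npop are problem dependent.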