Achieving Highly Scalable Evolutionary Real-Valued Optimization by Exploiting Partial Evaluations

2020, pp. 1-27
Author(s): Anton Bouter, Tanja Alderliesten, Peter A.N. Bosman

It is known that to achieve efficient scalability of an Evolutionary Algorithm (EA), dependencies (also known as linkage) must be properly taken into account during variation. In a Gray-Box Optimization (GBO) setting, exploiting prior knowledge regarding these dependencies can greatly benefit optimization. We specifically consider the setting where partial evaluations are possible, meaning that the partial modification of a solution can be efficiently evaluated. Such problems are potentially very difficult, for example, non-separable, multimodal, and multiobjective. The Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA) can effectively exploit partial evaluations, leading to a substantial improvement in performance and scalability. GOMEA was recently shown to be extendable to real-valued optimization through a combination with the real-valued estimation of distribution algorithm AMaLGaM. In this article, we definitively introduce the Real-Valued GOMEA (RV-GOMEA), and introduce a new variant, constructed by combining GOMEA with what is arguably the best-known real-valued EA, the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Both variants of GOMEA are compared to L-BFGS and the Limited Memory CMA-ES (LM-CMA-ES). We show that both variants of RV-GOMEA achieve excellent performance and scalability in a GBO setting, which can be orders of magnitude better than that of EAs unable to efficiently exploit the GBO setting.
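The partial-evaluation idea underlying this GBO setting can be illustrated with an additively decomposable objective: when only a few variables of a solution change, only the subfunctions that depend on those variables need to be re-evaluated. The following minimal Python sketch uses a chained-Rosenbrock decomposition of our own choosing to show the bookkeeping involved; it is an illustration of the gray-box principle only, not the authors' RV-GOMEA implementation.

```python
# Minimal sketch of partial evaluations on an additively decomposable problem.
import numpy as np

class ChainedRosenbrock:
    """f(x) = sum_i 100*(x[i+1] - x[i]**2)**2 + (1 - x[i])**2.
    Subfunction i depends only on variables (i, i+1), so changing one
    variable requires re-evaluating at most two subfunctions."""

    def __init__(self, n):
        self.n = n

    def subfunction(self, x, i):
        return 100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2

    def full_evaluation(self, x):
        return sum(self.subfunction(x, i) for i in range(self.n - 1))

    def partial_evaluation(self, x, f_old, j, new_value):
        """Return the updated solution and fitness after setting x[j] = new_value,
        re-evaluating only the subfunctions that depend on variable j."""
        affected = [i for i in (j - 1, j) if 0 <= i < self.n - 1]
        f_new = f_old - sum(self.subfunction(x, i) for i in affected)
        x_new = x.copy()
        x_new[j] = new_value
        f_new += sum(self.subfunction(x_new, i) for i in affected)
        return x_new, f_new

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    problem = ChainedRosenbrock(n=1000)
    x = rng.standard_normal(problem.n)
    f = problem.full_evaluation(x)                                  # O(n), done once
    x, f = problem.partial_evaluation(x, f, j=500, new_value=1.0)   # O(1) per modification
    assert np.isclose(f, problem.full_evaluation(x))
```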

2013, Vol. 22 (04), pp. 1350022
Author(s): Yongyong Niu, Zixing Cai, Min Jin

In recent years, evolutionary algorithm ensembles have attracted increasing attention in the evolutionary computation community. This paper proposes a novel evolutionary algorithm ensemble for global numerical optimization, named NEALE. To balance exploration and exploitation, NEALE combines two constituent algorithms: composite differential evolution (CoDE) and the covariance matrix adaptation evolution strategy (CMA-ES). During evolution, CoDE probes promising regions and improves the overall quality of the population, while CMA-ES accelerates convergence and refines solution accuracy. NEALE also encourages interaction between the constituent algorithms: the interaction is triggered at a predefined generation interval, and different interaction strategies are designed according to the features of each constituent algorithm. NEALE was tested on the 25 benchmark functions developed for the special session on real-parameter optimization at the 2005 IEEE Congress on Evolutionary Computation (IEEE CEC2005), where it performed significantly better than both other state-of-the-art evolutionary algorithms and its individual constituent algorithms.
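A heavily simplified sketch of this kind of two-member ensemble is shown below: a basic DE/rand/1/bin population and CMA-ES (via the pycma package, assumed available) take turns for a fixed number of generations and exchange their best solution. The actual NEALE uses the full CoDE operators and interaction strategies tailored to each constituent algorithm, which are not reproduced here.

```python
# Simplified two-algorithm ensemble sketch in the spirit of NEALE (not its actual design).
import numpy as np
import cma  # pip install cma (assumed available)

def sphere(x):
    return float(np.sum(x ** 2))

def de_phase(f, pop, fit, gens, F=0.5, CR=0.9, rng=None):
    """Run a few generations of DE/rand/1/bin and return the updated population."""
    rng = rng or np.random.default_rng()
    n_pop, dim = pop.shape
    for _ in range(gens):
        for i in range(n_pop):
            a, b, c = pop[rng.choice(n_pop, 3, replace=False)]
            trial = np.where(rng.random(dim) < CR, a + F * (b - c), pop[i])
            f_trial = f(trial)
            if f_trial < fit[i]:
                pop[i], fit[i] = trial, f_trial
    return pop, fit

def ensemble(f, dim=10, n_pop=30, interaction_every=20, cycles=5, seed=1):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5, 5, (n_pop, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(cycles):
        # Exploration phase: DE refines the whole population.
        pop, fit = de_phase(f, pop, fit, interaction_every, rng=rng)
        best = pop[np.argmin(fit)]
        # Exploitation phase: CMA-ES starts from the current DE best.
        es = cma.CMAEvolutionStrategy(best, 0.5, {'verbose': -9})
        for _ in range(interaction_every):
            X = es.ask()
            es.tell(X, [f(x) for x in X])
        # Interaction: inject the CMA-ES best back into the DE population.
        worst = np.argmax(fit)
        pop[worst], fit[worst] = es.result.xbest, es.result.fbest
    return pop[np.argmin(fit)], float(np.min(fit))

if __name__ == "__main__":
    x_best, f_best = ensemble(sphere)
    print(f_best)
```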


2021, Vol. 2021 (5)
Author(s): Csaba Balázs, Melissa van Beekveld, Sascha Caron, Barry M. Dillon, et al.

Optimisation problems are ubiquitous in particle and astrophysics, and involve locating the optimum of a complicated function of many parameters that may be computationally expensive to evaluate. We describe a number of global optimisation algorithms that are not yet widely used in particle astrophysics, benchmark them against random sampling and existing techniques, and perform a detailed comparison of their performance on a range of test functions. These include four analytic test functions of varying dimensionality, and a realistic example derived from a recent global fit of weak-scale supersymmetry. Although the best algorithm to use depends on the function being investigated, we are able to present general conclusions about the relative merits of random sampling, Differential Evolution, Particle Swarm Optimisation, the Covariance Matrix Adaptation Evolution Strategy, Bayesian Optimisation, Grey Wolf Optimisation, and the PyGMO Artificial Bee Colony, Gaussian Particle Filter and Adaptive Memory Programming for Global Optimisation algorithms.
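A toy version of such a benchmark, comparing random sampling, Differential Evolution (SciPy), and CMA-ES (pycma) on the Rosenbrock function under a shared evaluation budget, might look as follows; it uses off-the-shelf implementations and default-style settings, not the paper's benchmark harness or test suite.

```python
# Toy optimiser comparison on the Rosenbrock function with a shared evaluation budget.
import numpy as np
import cma
from scipy.optimize import differential_evolution, rosen

DIM, BUDGET, BOUNDS = 10, 20000, [(-5.0, 5.0)] * 10

def random_sampling(rng):
    X = rng.uniform(-5, 5, (BUDGET, DIM))
    return min(rosen(x) for x in X)

def diff_evolution(seed):
    res = differential_evolution(rosen, BOUNDS, maxiter=BUDGET // (15 * DIM),
                                 popsize=15, tol=0, seed=seed, polish=False)
    return res.fun

def cma_es(rng):
    es = cma.CMAEvolutionStrategy(rng.uniform(-5, 5, DIM), 2.0,
                                  {'maxfevals': BUDGET, 'verbose': -9})
    es.optimize(rosen)
    return es.result.fbest

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    print("random sampling:       ", random_sampling(rng))
    print("differential evolution:", diff_evolution(seed=0))
    print("CMA-ES:                ", cma_es(rng))
```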


2012, Vol. 215-216, pp. 133-137
Author(s): Guo Shao Su, Yan Zhang, Zhen Xing Wu, Liu Bin Yan

The covariance matrix adaptation evolution strategy (CMA-ES) is a relatively new evolutionary algorithm that has become a powerful tool for solving highly nonlinear, multi-peak optimization problems. Many real-world optimization problems require locating multiple optima in a search space, and evaluating candidate solutions can involve thousands of fitness function evaluations, a time-consuming or expensive process. Conventional stochastic optimization methods therefore face a particular challenge when the number of function evaluations is very large. To mitigate this high computational cost, a truss optimization method based on CMA-ES is proposed and applied to the section and shape optimization of trusses. The results show that the method is feasible and offers high accuracy, high efficiency, and easy implementation.
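A minimal sketch of a CMA-ES-based sizing optimization with a stress-penalty formulation is given below; the member lengths, forces, and material constants are illustrative placeholders rather than the paper's truss models, and the penalty handling is a generic choice, not the authors' method.

```python
# Toy CMA-ES sizing optimization of a truss-like structure with a stress penalty.
import numpy as np
import cma

LENGTHS = np.array([3.0, 3.0, 4.0, 4.0, 5.0])       # member lengths [m] (assumed)
FORCES  = np.array([50e3, 40e3, 30e3, 60e3, 20e3])   # member axial forces [N] (assumed)
SIGMA_ALLOW = 150e6                                  # allowable stress [Pa] (assumed)
RHO = 7850.0                                         # steel density [kg/m^3]
A_MIN, A_MAX = 1e-4, 1e-2                            # cross-section bounds [m^2]

def weight_with_penalty(areas):
    """Structure mass plus a quadratic penalty on allowable-stress violations."""
    areas = np.clip(areas, A_MIN, A_MAX)
    mass = RHO * np.sum(areas * LENGTHS)
    stress = FORCES / areas
    violation = np.maximum(stress / SIGMA_ALLOW - 1.0, 0.0)
    return mass + 1e6 * np.sum(violation ** 2)

if __name__ == "__main__":
    x0 = np.full(len(LENGTHS), 5e-3)  # initial cross-section areas [m^2]
    best_areas, es = cma.fmin2(weight_with_penalty, x0, 1e-3,
                               {'bounds': [A_MIN, A_MAX], 'verbose': -9})
    best_areas = np.clip(best_areas, A_MIN, A_MAX)
    print("optimized areas [m^2]:", np.round(best_areas, 6))
    print("mass [kg]:", RHO * np.sum(best_areas * LENGTHS))
```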

