Integrating ε-dominance and RBF surrogate optimization for solving computationally expensive many-objective optimization problems

Author(s):  
Wenyu Wang
Taimoor Akhtar
Christine A. Shoemaker


2019
Vol 31 (4)
pp. 689-702
Author(s):
Juliane Müller
Marcus Day

We introduce the algorithm SHEBO (surrogate optimization of problems with hidden constraints and expensive black-box objectives), an efficient optimization algorithm that employs surrogate models to solve computationally expensive black-box simulation optimization problems that have hidden constraints. Hidden constraints are encountered when the objective function evaluation does not return a value for a parameter vector; they often arise in optimization problems in which the objective function is computed by a black-box simulation code. SHEBO uses a combination of local and global search strategies together with an evaluability prediction function and a dynamically adjusted evaluability threshold to iteratively select new sample points. We compare the performance of our algorithm with that of the mesh-based algorithms mesh adaptive direct search (MADS, NOMAD [nonlinear optimization by mesh adaptive direct search] implementation) and implicit filtering, and with SNOBFIT (stable noisy optimization by branch and fit), which assigns artificial function values to points that violate the hidden constraints. Our numerical experiments on a large set of test problems with 2–30 dimensions and a 31-dimensional real-world application problem arising in combustion simulation show that SHEBO is an efficient solver that outperforms the other methods on many test problems.
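The interplay between the objective surrogate and the evaluability prediction described above can be sketched in a few lines of Python. The fragment below is a minimal illustration, not the SHEBO implementation: it assumes a unit-box domain, uses a fixed evaluability threshold (SHEBO adjusts its threshold dynamically and combines local and global search strategies), and all function and parameter names are hypothetical.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def propose_point(X, y, evaluable, threshold=0.5, n_candidates=500, rng=None):
    """One illustrative sampling step for a problem with hidden constraints.

    X         : (n, d) parameter vectors evaluated so far
    y         : (n,) objective values, np.nan where the simulation failed
    evaluable : (n,) 1.0 if the simulation returned a value, else 0.0
    """
    rng = np.random.default_rng() if rng is None else rng
    d = X.shape[1]

    # Objective surrogate, fit only on points that evaluated successfully.
    ok = ~np.isnan(y)
    f_hat = RBFInterpolator(X[ok], y[ok])

    # Evaluability surrogate, fit on all points (1 = returned a value).
    e_hat = RBFInterpolator(X, evaluable, smoothing=1e-6)

    # Global candidate search over an assumed unit-box domain; keep only
    # candidates predicted to be evaluable.
    C = rng.uniform(0.0, 1.0, size=(n_candidates, d))
    keep = e_hat(C) >= threshold
    if not keep.any():                  # threshold too strict: relax it
        keep = np.argsort(e_hat(C))[-n_candidates // 10:]
    C = C[keep]

    # Propose the candidate with the best predicted objective value.
    return C[np.argmin(f_hat(C))]
```

The returned point would then be sent to the expensive simulation, and whether it evaluates successfully feeds back into `evaluable` for the next iteration.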


Author(s):  
Shufen Qin ◽  
Chan Li ◽  
Chaoli Sun ◽  
Guochen Zhang ◽  
Xiaobo Li

Surrogate-assisted evolutionary algorithms have received increasing attention for solving computationally expensive problems. However, model management still plays a significant role in searching for the optimal solution. In this paper, a new method is proposed to measure the approximation uncertainty, in which both the differences between a solution and its neighbor samples in the decision space and the ruggedness of the objective space in its neighborhood are considered. The proposed approximation uncertainty is utilized in the surrogate-assisted global search to select a solution for exact objective evaluation, improving the exploration capability of the global search. On the other hand, the approximated fitness value is adopted as the infill criterion for the surrogate-assisted local search, which is utilized to improve the exploitation capability and to find a solution as close to the real optimum as possible. The surrogate-assisted global and local searches are conducted in sequence at each generation to balance the exploration and exploitation capabilities of the method. The performance of the proposed method is evaluated on seven benchmark problems with 10, 20, 30, and 50 dimensions, and on one real-world application with 30 and 50 dimensions. The experimental results show that the proposed method is efficient for solving low- and medium-dimensional expensive optimization problems compared to six other state-of-the-art surrogate-assisted evolutionary algorithms.
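As a rough illustration of the uncertainty measure described above, one can combine the decision-space distances from a candidate to its nearest evaluated samples with the spread of their objective values (the local ruggedness). The specific combination below is an assumption made for illustration, not the formula from the paper:

```python
import numpy as np

def approximation_uncertainty(x, X, y, k=5):
    """Illustrative uncertainty measure for a candidate solution x.

    X : (n, d) archive of exactly evaluated solutions
    y : (n,) their objective values
    """
    dist = np.linalg.norm(X - x, axis=1)   # decision-space distances
    idx = np.argsort(dist)[:k]             # k nearest archived samples
    sparsity = dist[idx].mean()            # term 1: how isolated x is
    ruggedness = y[idx].std()              # term 2: local objective spread
    return sparsity * (1.0 + ruggedness)
```

In the global search, candidates scoring high on such a measure would be chosen for exact evaluation; the local search instead picks the candidate with the best approximated fitness.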


2018
Vol 51 (2)
pp. 265-285
Author(s):
Abdulbaset Saad
Zuomin Dong
Brad Buckham
Curran Crawford
Adel Younis
...  

1986
Vol 108 (3)
pp. 336-339
Author(s):
J. P. Karidis
S. R. Turns

An algorithm is presented for the efficient constrained or unconstrained minimization of computationally expensive objective functions. The method proceeds by creating and numerically optimizing a sequence of surrogate functions which are chosen to approximate the behavior of the unknown objective function in parameter space. The Recursive Surrogate Optimization (RSO) technique is intended for design applications where the computational cost required to evaluate the objective function greatly exceeds both the cost of evaluating any domain constraints present and the cost associated with one iteration of a typical optimization routine. Efficient optimization is achieved by reducing the number of times that the objective function must be evaluated, at the expense of additional complexity and computational cost associated with the optimization procedure itself. Comparisons of the RSO performance on eight widely used test problems to published performance data for other efficient techniques demonstrate the utility of the method.
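The fit–optimize–evaluate–refit loop underlying RSO-style methods can be sketched as below. This is a generic modern rendering using an RBF surrogate and L-BFGS-B for the (cheap) surrogate subproblem; the 1986 paper uses its own surrogate forms, so treat every choice here as an assumption.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def surrogate_minimize(f, bounds, n_init=10, n_iter=20, seed=0):
    """Minimize an expensive objective f via a sequence of cheap surrogates."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(n_init, len(lo)))   # initial design
    y = np.array([f(x) for x in X])

    for _ in range(n_iter):
        # Fit a surrogate to all expensive evaluations so far.
        surrogate = RBFInterpolator(X, y, smoothing=1e-9)
        # Numerically optimize the cheap surrogate, starting at the incumbent.
        res = minimize(lambda x: surrogate(x[None])[0],
                       X[np.argmin(y)], bounds=list(zip(lo, hi)),
                       method="L-BFGS-B")
        # Spend one expensive evaluation at the surrogate minimizer.
        X = np.vstack([X, res.x])
        y = np.append(y, f(res.x))
    return X[np.argmin(y)], y.min()

# Example: a 3-dimensional quadratic treated as "expensive".
x_best, f_best = surrogate_minimize(lambda x: ((x - 0.3) ** 2).sum(),
                                    bounds=[(0.0, 1.0)] * 3)
```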


2016
Vol 138 (9)
Author(s):
Kalyan Shankar Bhattacharjee
Hemant Kumar Singh
Tapabrata Ray

In engineering design optimization, evaluation of a single solution (design) often requires running one or more computationally expensive simulations. Surrogate-assisted optimization (SAO) approaches have long been used for solving such problems, in which approximations/surrogates are used in lieu of computationally expensive simulations during the course of search. Existing SAO approaches often use the same type of approximation model to represent all objectives and constraints in all regions of the search space. The selection of one type of surrogate model over another is nontrivial, and an a priori choice limits flexibility in representation. In this paper, we introduce a multi-objective evolutionary algorithm (EA) with multiple adaptive spatially distributed surrogates. Instead of a single global surrogate, local surrogates of multiple types are constructed in the neighborhood of each offspring solution, and a multi-objective search is conducted using the best surrogate for each objective and constraint function. The proposed approach offers flexibility of representation by capitalizing on the benefits offered by various types of surrogates in different regions of the search space. The approach is also immune to ill-validation since approximated and truly evaluated solutions are not ranked together. The performance of the proposed surrogate-assisted multi-objective algorithm (SAMO) is compared with baseline nondominated sorting genetic algorithm II (NSGA-II) and NSGA-II embedded with global and local surrogates of various types. The performance of the proposed approach is quantitatively assessed using several engineering design optimization problems. The numerical experiments demonstrate the competence and consistency of SAMO.
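The key step, picking the best surrogate type for each objective and constraint in the neighborhood of an offspring, might look like the sketch below. The candidate model set (Gaussian process, SVR, ridge regression) and the cross-validation criterion are illustrative assumptions; the paper defines its own surrogate types and validation scheme.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

def best_local_surrogate(x_off, X, y, k=15):
    """Fit several surrogate types to the k archived evaluations nearest
    to offspring x_off and return the most accurate one, selected by
    3-fold cross-validated mean squared error."""
    idx = np.argsort(np.linalg.norm(X - x_off, axis=1))[:k]
    X_loc, y_loc = X[idx], y[idx]
    candidates = [GaussianProcessRegressor(), SVR(), Ridge()]
    scores = [cross_val_score(m, X_loc, y_loc, cv=3,
                              scoring="neg_mean_squared_error").mean()
              for m in candidates]
    return candidates[int(np.argmax(scores))].fit(X_loc, y_loc)
```

One such model would be built per objective and per constraint function, so that different functions can be served by different surrogate types in different regions of the search space.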


Symmetry
2020
Vol 12 (10)
pp. 1631
Author(s):
Kittisak Chaiyotha
Tipaluck Krityakierne

Engineering optimization problems often involve computationally expensive black-box simulations of underlying physical phenomena. This paper compares the performance of four constrained optimization algorithms relying on a Gaussian process model and an infill sampling criterion under the framework of Bayesian optimization. The four infill sampling criteria are expected feasible improvement (EFI), constrained expected improvement (CEI), stepwise uncertainty reduction (SUR), and augmented Lagrangian (AL). Numerical tests were rigorously performed on a benchmark set consisting of nine constrained optimization problems with features commonly found in engineering, as well as on a constrained structural engineering design optimization problem. Based on several measures, including statistical analysis, our results suggest that, overall, the EFI and CEI algorithms are significantly more efficient and robust than the other two methods, both in providing the most improvement within a very limited number of objective and constraint function evaluations and in the number of trials for which a feasible solution could be located.
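Among the four criteria, CEI has a particularly compact closed form: the expected improvement of the objective multiplied by the probability that each constraint is satisfied, under independent Gaussian process posteriors. A minimal sketch, assuming the GP predictive means and standard deviations at the candidate points have already been computed:

```python
import numpy as np
from scipy.stats import norm

def constrained_ei(mu_f, sigma_f, f_best, mu_g, sigma_g):
    """Constrained expected improvement at candidate points.

    mu_f, sigma_f : (n,) GP posterior mean/std of the objective
    f_best        : best feasible objective value observed so far
    mu_g, sigma_g : (m, n) GP posterior mean/std of m constraints g_j(x) <= 0
    """
    sigma_f = np.maximum(sigma_f, 1e-12)
    z = (f_best - mu_f) / sigma_f
    ei = sigma_f * (z * norm.cdf(z) + norm.pdf(z))    # standard EI
    mu_g = np.atleast_2d(mu_g)
    sigma_g = np.maximum(np.atleast_2d(sigma_g), 1e-12)
    prob_feasible = norm.cdf(-mu_g / sigma_g)         # P(g_j(x) <= 0)
    return ei * np.prod(prob_feasible, axis=0)        # CEI = EI * P(feasible)
```

The next sample point is then the candidate that maximizes this quantity.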

