Implicitly and densely discrete black-box optimization problems

2009 ◽  
Vol 3 (3) ◽  
pp. 475-482 ◽  
Author(s):  
Luis Nunes Vicente


Author(s):
George H. Cheng ◽  
Adel Younis ◽  
Kambiz Haji Hajikolaei ◽  
G. Gary Wang

Mode Pursuing Sampling (MPS) was developed as a global optimization algorithm for problems involving expensive black-box functions. MPS has been found to be effective and efficient for problems of low dimensionality, i.e., fewer than ten design variables. A previous conference publication integrated the concept of trust regions into the MPS framework to create a new algorithm, TRMPS, which dramatically improved performance and efficiency for high-dimensional problems. However, although TRMPS performed better than MPS, it was unproven against other established algorithms such as the genetic algorithm (GA). This paper introduces an improved algorithm, TRMPS2, which incorporates guided sampling and a low-function-value criterion to further improve performance on high-dimensional problems. TRMPS2 is benchmarked against MPS and GA using a suite of test problems. The results show that TRMPS2 performs better than MPS and GA on average for high-dimensional, expensive, black-box (HEB) problems.
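To make the trust-region idea concrete, the following is a minimal Python sketch of a generic trust-region sampling loop for an expensive black-box function. It is not the authors' TRMPS2 (it omits mode-pursuing sampling, guided sampling, and the low-function-value criterion), and every function name and parameter value here is an illustrative assumption.

```python
import numpy as np

def trust_region_sampling(f, lb, ub, n_iter=50, n_samples=20, seed=0):
    """Generic trust-region random-sampling loop for an expensive black box.

    Illustrative only -- not the authors' TRMPS2: it omits mode-pursuing
    sampling, guided sampling, and the low-function-value criterion."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    best_x = rng.uniform(lb, ub)              # random starting point
    best_f = f(best_x)
    radius = 0.25 * (ub - lb)                 # initial trust-region half-width

    for _ in range(n_iter):
        # Sample candidates uniformly inside the trust region, clipped to the bounds.
        lo = np.maximum(lb, best_x - radius)
        hi = np.minimum(ub, best_x + radius)
        cand = rng.uniform(lo, hi, size=(n_samples, lb.size))
        vals = np.array([f(x) for x in cand])
        i = int(vals.argmin())
        if vals[i] < best_f:                  # improvement: move centre, expand region
            best_x, best_f = cand[i], vals[i]
            radius = np.minimum(radius * 1.5, ub - lb)
        else:                                 # no improvement: shrink region
            radius *= 0.5
    return best_x, best_f

# Example on a 10-dimensional sphere function (a cheap stand-in for an expensive simulation).
if __name__ == "__main__":
    x, fx = trust_region_sampling(lambda x: float(np.sum(x**2)), [-5]*10, [5]*10)
    print(fx)
```

The shrink and expand factors (0.5 and 1.5) are arbitrary choices for the sketch; TRMPS2 manages its trust region differently.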


Fresa implements a nature-inspired plant propagation algorithm for the solution of single- and multiple-objective optimization problems. The method is population-based and evolutionary. Treating the objective function as a black box, the implementation is able to solve problems exhibiting behaviour that is challenging for mathematical programming methods. Fresa is easily adapted to new problems that may benefit from bespoke representations of solutions, taking advantage of the dynamic typing and multiple dispatch capabilities of the Julia language. Further, the support for threads in Julia enables an efficient implementation on multi-core computers.
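For orientation, here is a generic single-objective plant propagation sketch: fitter plants send more, shorter runners and weaker plants send fewer, longer ones. It is written in Python only for consistency with the other sketches in this collection; Fresa itself is a Julia package and this is not its API. All names and parameter values are assumptions.

```python
import numpy as np

def plant_propagation(f, lb, ub, pop_size=20, n_gen=50, max_runners=5, seed=0):
    """Generic single-objective plant propagation sketch (illustrative only;
    not Fresa's API -- Fresa also handles multiple objectives, bespoke
    solution representations, and threading)."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    pop = rng.uniform(lb, ub, size=(pop_size, lb.size))
    for _ in range(n_gen):
        fit = np.array([f(x) for x in pop])
        # Normalise to [0, 1]: 1 = best (lowest objective), 0 = worst.
        norm = (fit.max() - fit) / (fit.max() - fit.min() + 1e-12)
        offspring = [pop[fit.argmin()]]                    # keep the best plant
        for x, g in zip(pop, norm):
            n_run = max(1, int(np.ceil(g * max_runners)))  # fitter -> more runners
            length = (1.0 - g) * (ub - lb)                 # fitter -> shorter runners
            for _ in range(n_run):
                step = rng.uniform(-0.5, 0.5, lb.size) * length
                offspring.append(np.clip(x + step, lb, ub))
        # Evaluate all offspring and keep the best pop_size plants.
        offspring = np.array(offspring)
        vals = np.array([f(x) for x in offspring])
        pop = offspring[np.argsort(vals)[:pop_size]]
    fit = np.array([f(x) for x in pop])
    return pop[fit.argmin()], float(fit.min())

# Example: minimise a shifted sphere function in five dimensions.
if __name__ == "__main__":
    best_x, best_f = plant_propagation(lambda x: float(np.sum((x - 2.0)**2)),
                                       [-5]*5, [5]*5)
    print(best_f)
```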


2019 ◽  
Vol 31 (4) ◽  
pp. 689-702 ◽  
Author(s):  
Juliane Müller ◽  
Marcus Day

We introduce the algorithm SHEBO (surrogate optimization of problems with hidden constraints and expensive black-box objectives), an efficient optimization algorithm that employs surrogate models to solve computationally expensive black-box simulation optimization problems that have hidden constraints. Hidden constraints are encountered when the objective function evaluation does not return a value for a parameter vector. These constraints are often encountered in optimization problems in which the objective function is computed by a black-box simulation code. SHEBO uses a combination of local and global search strategies together with an evaluability prediction function and a dynamically adjusted evaluability threshold to iteratively select new sample points. We compare the performance of our algorithm with that of the mesh-based algorithms mesh adaptive direct search (MADS; NOMAD, nonlinear optimization by mesh adaptive direct search, implementation) and implicit filtering, and with SNOBFIT (stable noisy optimization by branch and fit), which assigns artificial function values to points that violate the hidden constraints. Our numerical experiments for a large set of test problems with 2–30 dimensions and a 31-dimensional real-world application problem arising in combustion simulation show that SHEBO is an efficient solver that outperforms the other methods on many test problems.
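The Python sketch below shows one simple way a hidden constraint can be handled: the simulation is treated as returning NaN when it fails, past attempts are recorded, and candidate points predicted inevaluable by a k-nearest-neighbour vote are skipped. This is an illustrative stand-in for the idea of an evaluability prediction function with a threshold, not SHEBO's surrogate-based method; all names and constants are assumptions.

```python
import numpy as np

def evaluability(x, X_hist, ok_hist, k=5):
    """Fraction of the k nearest past attempts that evaluated successfully.
    A simple stand-in for an evaluability prediction function."""
    if not X_hist:
        return 1.0
    d = np.linalg.norm(np.asarray(X_hist) - x, axis=1)
    nearest = np.argsort(d)[:k]
    return float(np.mean(np.asarray(ok_hist)[nearest]))

def optimize_with_hidden_constraints(f, lb, ub, budget=100, threshold=0.5, seed=0):
    """Random-candidate search that skips points predicted inevaluable.
    The hidden constraint is modelled as f returning NaN."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    X_hist, ok_hist = [], []
    best_x, best_f = None, np.inf
    while budget > 0:
        x = rng.uniform(lb, ub)
        # Skip points predicted inevaluable, but evaluate ~10% of them anyway
        # so the predictor cannot lock the search out of the whole domain.
        if evaluability(x, X_hist, ok_hist) < threshold and rng.random() < 0.9:
            continue
        y = f(x)
        budget -= 1
        ok = bool(np.isfinite(y))
        X_hist.append(x)
        ok_hist.append(ok)
        if ok and y < best_f:
            best_x, best_f = x, y
    return best_x, best_f

# Toy problem: the objective is undefined (returns NaN) outside the unit ball.
if __name__ == "__main__":
    def f(x):
        return float(np.sum(x**2)) if np.linalg.norm(x) <= 1 else float("nan")
    print(optimize_with_hidden_constraints(f, [-2, -2], [2, 2]))
```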


Author(s):  
Laurens Bliek ◽  
Sicco Verwer ◽  
Mathijs de Weerdt

When a black-box optimization objective can only be evaluated with costly or noisy measurements, most standard optimization algorithms are unsuited to finding the optimal solution. Specialized algorithms that deal with exactly this situation make use of surrogate models. These models are usually continuous and smooth, which is beneficial for continuous optimization problems but not necessarily for combinatorial problems. However, we show that by choosing the basis functions of the surrogate model in a certain way, the optimal solution of the surrogate model is guaranteed to be integer. This approach outperforms random search, simulated annealing, and a Bayesian optimization algorithm on the problem of finding robust routes for a noise-perturbed traveling salesman benchmark, achieves performance similar to that of another Bayesian optimization algorithm, and outperforms all compared algorithms on a convex binary optimization problem with a large number of variables.
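As a one-dimensional illustration of the basis-function idea, the Python snippet below fits a piecewise-linear surrogate built from ReLU basis functions with breakpoints at the integers, so its minimum over the box is necessarily attained at an integer. This is a sketch under that assumption, not the paper's full multivariate method.

```python
import numpy as np

def fit_relu_surrogate(x, y, lo, hi):
    """Least-squares fit of a 1-D piecewise-linear surrogate
        m(z) = b + sum_k w_k * max(0, z - k),   k = lo, ..., hi - 1,
    whose only breakpoints are the integers. Because m is linear between
    consecutive integers, its minimum over [lo, hi] is attained at an integer."""
    knots = np.arange(lo, hi)
    A = np.column_stack([np.ones_like(x, dtype=float)]
                        + [np.maximum(0.0, x - k) for k in knots])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def m(z):
        feats = np.concatenate([[1.0], np.maximum(0.0, z - knots)])
        return float(feats @ coef)

    return m

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lo, hi = 0, 10
    # Noisy black-box observations at integer settings of the decision variable.
    xs = rng.integers(lo, hi + 1, size=40).astype(float)
    ys = (xs - 6.3) ** 2 + rng.normal(0.0, 0.5, size=xs.size)
    m = fit_relu_surrogate(xs, ys, lo, hi)
    # In 1-D the surrogate's integer minimiser can be found by scanning the grid.
    best = min(range(lo, hi + 1), key=lambda z: m(float(z)))
    print("integer minimiser of the surrogate:", best)
```

The integer-grid scan is only for illustration in one dimension; it is not how the paper searches its surrogate in higher-dimensional combinatorial settings.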


2018 ◽  
Vol 51 (2) ◽  
pp. 265-285 ◽  
Author(s):  
Abdulbaset Saad ◽  
Zuomin Dong ◽  
Brad Buckham ◽  
Curran Crawford ◽  
Adel Younis ◽  
...  

2010 ◽  
Vol 37 (11) ◽  
pp. 1977-1986 ◽  
Author(s):  
Francisco Gortázar ◽  
Abraham Duarte ◽  
Manuel Laguna ◽  
Rafael Martí

2021 ◽  
Author(s):  
Klaus Johannsen ◽  
Nadine Goris ◽  
Bjørnar Jensen ◽  
Jerry Tjiputra

Optimization problems can be found in many areas of science and technology. Often, not only the global optimum but also a (larger) number of near-optima are of interest. This gives rise to so-called multimodal optimization problems. In most cases, the number and quality of the optima are unknown, and assumptions about the objective function cannot be made. In this paper, we focus on continuous, unconstrained optimization in moderately high-dimensional continuous spaces (≤10). We present a scalable algorithm with virtually no parameters, which performs well for general objective functions (non-convex, discontinuous). It is based on two well-established algorithms (CMA-ES, deterministic crowding). Novel elements of the algorithm are the detection of seed points for local searches and collision avoidance, both based on nearest neighbors, and a strategy for semi-sequential optimization to realize scalability. The performance of the proposed algorithm is numerically evaluated on the CEC2013 niching benchmark suite for 1–20-dimensional functions and on a 9-dimensional real-world problem from constrained optimization in climate research. The algorithm shows good performance on the CEC2013 benchmarks and falls short only on higher-dimensional and strongly anisotropic problems. For the climate-related problem, the algorithm finds a large number (150) of optima that are of relevance to climate research. The proposed algorithm does not require special configuration for the optimization problems considered in this paper, i.e., it shows good black-box behavior.
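The nearest-neighbour seed-detection idea can be illustrated with a short Python sketch: a sampled point becomes a seed for a local search if none of its k nearest neighbours has a better objective value. This follows the spirit of the description above but is not the authors' implementation, which couples seed detection with CMA-ES local searches, deterministic crowding, and collision avoidance; names and parameters are illustrative.

```python
import numpy as np

def seed_points(X, f_vals, k=10):
    """Nearest-neighbour seed detection: a sample is a seed for a local search
    if none of its k nearest neighbours has a better (lower) objective value."""
    X = np.asarray(X, float)
    f_vals = np.asarray(f_vals, float)
    seeds = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]      # exclude the point itself
        if np.all(f_vals[i] <= f_vals[neighbours]):
            seeds.append(i)
    return seeds

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-5, 5, size=(500, 2))
    # A multimodal test surface with several separated minima.
    f_vals = np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.05 * np.sum(X**2, axis=1)
    idx = seed_points(X, f_vals, k=15)
    print(len(idx), "candidate seeds for local searches (e.g. CMA-ES starts)")
```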


Author(s):  
Liqun Wang ◽  
Songqing Shan ◽  
G. Gary Wang

The presence of black-box functions in engineering design, which are usually computation-intensive, demands efficient global optimization methods. This work proposes a new global optimization method for black-box functions. The method is based on a novel mode-pursuing sampling (MPS) technique that systematically generates more sample points in the neighborhood of the function mode while statistically covering the entire search space. Quadratic regression is performed to detect the region containing the global optimum. The sampling and detection process iterates until the global optimum is obtained. Through intensive testing, this method is found to be effective, efficient, robust, and applicable to both continuous and discontinuous functions. It supports simultaneous computation and applies to both unconstrained and constrained optimization problems. Because it does not call any existing global optimization tool, it can also be used as a standalone global optimization method for inexpensive problems. A limitation of the method is also identified and discussed.
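The sampling idea can be sketched briefly in Python: evaluated points provide a cheap approximation of the objective over a large uniform candidate pool, and new points are accepted with a probability that decreases with the approximated value, so sampling concentrates near the current mode while still covering the space. This is a simplified illustrative loop, not the published MPS algorithm, which uses a specific sampling density and quadratic regression with a statistical test to detect the optimum region; all names and constants are assumptions.

```python
import numpy as np

def mode_pursuing_sketch(f, lb, ub, n_init=20, n_iter=10, pool=2000, batch=10, seed=0):
    """Simplified mode-pursuing-style sampling loop (illustrative only; the
    published MPS method uses a specific sampling density plus quadratic
    regression with a statistical test to detect the optimum region)."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    X = rng.uniform(lb, ub, size=(n_init, lb.size))
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        cand = rng.uniform(lb, ub, size=(pool, lb.size))
        # Cheap approximation of f at each candidate: value of the nearest evaluated point.
        d = np.linalg.norm(cand[:, None, :] - X[None, :, :], axis=2)
        f_hat = y[d.argmin(axis=1)]
        # Acceptance weights decrease with the approximated value, so sampling
        # concentrates near the current mode while still covering the space.
        w = (f_hat.max() - f_hat) + 1e-12
        idx = rng.choice(pool, size=batch, replace=False, p=w / w.sum())
        X_new = cand[idx]
        y_new = np.array([f(x) for x in X_new])
        X, y = np.vstack([X, X_new]), np.concatenate([y, y_new])
    return X[y.argmin()], float(y.min())

# Example: minimise a shifted sphere function in three dimensions.
if __name__ == "__main__":
    x_best, f_best = mode_pursuing_sketch(lambda x: float(np.sum((x - 1.5)**2)),
                                          [-5]*3, [5]*3)
    print(x_best, f_best)
```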

