A Comparative Study on Bayesian Optimization

Author(s): Lam Gia Thuan, Doina Logofatu

2021
Author(s): Yuan Jin, Zheyi Yang, Shiran Dai, Yann Lebret, Olivier Jung

Abstract: Many engineering problems involve complex constraints which can be computationally costly to evaluate. To reduce the overall numerical cost, such constrained optimization problems are solved via surrogate models constructed on a Design of Experiment (DoE). Meanwhile, complex constraints may lead to an infeasible initial DoE, which can be problematic for the subsequent sequential optimization. In this study, we address constrained optimization problems in a Bayesian optimization framework. A comparative study is conducted to evaluate the performance of three approaches, namely Expected Feasible Improvement (EFI), the slack Augmented Lagrangian method (AL), and Expected Improvement with Probabilistic Support Vector Machine (EIPSVM), in constraint handling with feasible or infeasible initial DoE. AL is capable of starting the sequential optimization from an infeasible initial DoE, while EFI requires extra a priori enrichment to find at least one feasible sample. Empirical experiments are performed on both analytical functions and a low pressure turbine disc design problem. Through these benchmark problems, EFI and AL are shown to have overall similar performance on problems with inequality constraints. However, the performance of EIPSVM is strongly affected by the corresponding hyperparameter values. In addition, we show evidence that, with appropriate handling of an infeasible initial DoE, EFI does not necessarily underperform AL when solving optimization problems with mixed inequality and equality constraints.
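As a rough illustration of the EFI acquisition the abstract refers to, the sketch below multiplies expected improvement from a Gaussian process surrogate of the objective by the probability of feasibility from a Gaussian process surrogate of a single inequality constraint. This is a minimal sketch under assumed placeholders (toy DoE, objective, and constraint), not the authors' implementation.

```python
# Minimal EFI sketch: EFI(x) = EI(x) * P[g(x) <= 0] under independent GP surrogates.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def expected_feasible_improvement(X_cand, gp_obj, gp_con, f_best):
    """Expected improvement over the best feasible value, weighted by feasibility probability."""
    mu_f, sd_f = gp_obj.predict(X_cand, return_std=True)
    mu_g, sd_g = gp_con.predict(X_cand, return_std=True)
    sd_f = np.maximum(sd_f, 1e-12)
    sd_g = np.maximum(sd_g, 1e-12)
    # Expected improvement of the objective below the incumbent f_best (minimization).
    z = (f_best - mu_f) / sd_f
    ei = (f_best - mu_f) * norm.cdf(z) + sd_f * norm.pdf(z)
    # Probability that the constraint g(x) <= 0 is satisfied.
    pf = norm.cdf(-mu_g / sd_g)
    return ei * pf

# Usage: fit one GP to objective samples and one to constraint samples from the DoE,
# then pick the candidate maximizing EFI. Data below are illustrative assumptions.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(12, 2))   # initial DoE (assumed)
y = (X ** 2).sum(axis=1)                   # toy objective to minimize
g = 1.0 - X.sum(axis=1)                    # toy constraint, feasible when g(x) <= 0
feasible = g <= 0
# Simplification: fall back to the overall minimum if the DoE has no feasible point
# (the paper discusses dedicated handling of infeasible initial DoE).
f_best = y[feasible].min() if feasible.any() else y.min()

gp_obj = GaussianProcessRegressor(normalize_y=True).fit(X, y)
gp_con = GaussianProcessRegressor(normalize_y=True).fit(X, g)
X_cand = rng.uniform(-2.0, 2.0, size=(256, 2))
x_next = X_cand[np.argmax(expected_feasible_improvement(X_cand, gp_obj, gp_con, f_best))]
```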


2021
Author(s): Tomaz M. Suller, Eric O. Gomes, Henrique B. Oliveira, Lucas P. Cotrim, Amir M. Sa’ad, ...

This paper proposes a solution based on a Multi-Layer Perceptron (MLP) to predict the offset of the center of gravity of an offshore platform. It also presents a comparative study of three optimization algorithms, Random Search, Simulated Annealing, and Bayesian Optimization (BO), used to find the best MLP architecture. Although BO obtained the best architecture in the shortest time, ablation studies in this paper on the hyperparameters of the optimization process showed that the result is sensitive to them, so these hyperparameters deserve attention in the Neural Architecture Search process.
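As a hedged sketch of the kind of architecture search the abstract describes, the snippet below uses Bayesian Optimization (scikit-optimize's gp_minimize) over a small MLP search space. The search space, dataset, and scoring are illustrative assumptions, not the paper's configuration.

```python
# Sketch: Bayesian Optimization over MLP architecture hyperparameters (assumed setup).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor
from skopt import gp_minimize
from skopt.space import Integer, Real

# Synthetic regression data standing in for the center-of-gravity prediction task.
X, y = make_regression(n_samples=300, n_features=10, noise=5.0, random_state=0)

space = [
    Integer(1, 3, name="n_layers"),                        # number of hidden layers
    Integer(16, 128, name="n_units"),                      # units per hidden layer
    Real(1e-5, 1e-2, prior="log-uniform", name="alpha"),   # L2 regularization strength
]

def objective(params):
    n_layers, n_units, alpha = params
    model = MLPRegressor(hidden_layer_sizes=(int(n_units),) * int(n_layers),
                         alpha=float(alpha), max_iter=500, random_state=0)
    # gp_minimize minimizes, so return the cross-validated MSE directly.
    score = cross_val_score(model, X, y, cv=3,
                            scoring="neg_mean_squared_error").mean()
    return -score

result = gp_minimize(objective, space, n_calls=20, random_state=0)
print("best params:", result.x, "best CV MSE:", result.fun)
```

A Random Search baseline could reuse the same objective with skopt's dummy_minimize, and a Simulated Annealing baseline with a hand-rolled annealing loop over the same space; the abstract's sensitivity finding suggests comparing several settings of n_calls and the acquisition defaults as well.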


