Extrapolative Bayesian Optimization with Gaussian Process and Neural Network Ensemble Surrogate Models

2021 ◽  
Vol 3 (11) ◽  
pp. 2170077
Author(s):  
Yee-Fun Lim ◽  
Chee Koon Ng ◽  
U.S. Vaitesswar ◽  
Kedar Hippalgaonkar
2021 ◽  
Author(s):  
Mashall Aryan

<p>The solution to many science and engineering problems involves identifying the minimum or maximum of an unknown continuous function whose evaluation incurs non-negligible costs in resources such as money, time, human attention or computational processing. In such cases, the choice of new points to evaluate is critical. A successful approach has been to choose these points by considering a distribution over plausible surfaces, conditioned on all previous points and their evaluations. In this sequential two-step strategy, known as Bayesian Optimization, a prior is first defined over possible functions and updated to a posterior in light of the available observations. Using this posterior, namely the surrogate model, an infill criterion is then formed and used to find the next location to sample. By far the most common prior distribution and infill criterion are the Gaussian Process and Expected Improvement, respectively.</p>
<p>The popularity of Gaussian Processes in Bayesian Optimization is partially due to their ability to represent the posterior in closed form. Nevertheless, the Gaussian Process suffers from several shortcomings that directly affect its performance: inference scales poorly with the amount of data, numerical stability degrades with the number of data points, and strong assumptions about the observation model are required, which might not be consistent with reality. These drawbacks motivate the search for better alternatives. This thesis studies the application of Neural Networks to enhance Bayesian Optimization. It proposes several Bayesian Optimization methods that use neural networks either as surrogates or in the infill criterion.</p>
<p>First, this thesis introduces a novel Bayesian Optimization method in which Bayesian Neural Networks (BNNs) are used as the surrogate. This reduces the computational complexity of inference in the surrogate from cubic in the number of observations, as in the GP, to linear. Different variants of BNNs are put into practice and inferred using Monte Carlo sampling. The results show that the Monte Carlo BNN surrogate performs better than, or at least comparably to, Gaussian Process-based Bayesian Optimization methods on a set of benchmark problems.</p>
<p>Second, this work develops a fast Bayesian Optimization method with an efficient surrogate-building process. This algorithm uses Bayesian Random-Vector Functional Link Networks as surrogates. In this family of models, inference is performed on only a small subset of the model parameters, while the rest are randomly drawn from a prior. The proposed methods are tested on a set of benchmark continuous functions and hyperparameter optimization problems, and the results show that they are competitive with state-of-the-art Bayesian Optimization methods.</p>
<p>Third, this study proposes a novel neural network-based infill criterion. In this method, locations to sample are found by minimizing the joint conditional likelihood of the new point and the parameters of a neural network. The results show that, in Bayesian Optimization methods with BNN surrogates, this new infill criterion outperforms Expected Improvement.</p>
<p>Finally, this thesis presents order-preserving generative models and uses them in a variational Bayesian context to infer Implicit Variational Bayesian Neural Network (IVBNN) surrogates for a new Bayesian Optimization method. This inference mechanism is more efficient and scalable than Monte Carlo sampling. The results show that the IVBNN can outperform the Monte Carlo BNN in Bayesian optimization of the hyperparameters of machine learning models.</p>
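The GP-with-Expected-Improvement baseline that the abstract describes can be sketched in a few lines. This is a minimal, generic illustration of the BO loop (toy objective, grid-based acquisition maximization), not code from the thesis; the function `f` and all settings are assumptions for the example.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Toy stand-in for the expensive black-box objective (minimization).
def f(x):
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
X = rng.uniform(0, 2, size=(4, 1))   # small initial design
y = f(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

def expected_improvement(Xc, gp, y_best):
    # EI under the GP posterior, minimization convention.
    mu, sigma = gp.predict(Xc, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

for _ in range(10):
    gp.fit(X, y)                                  # update posterior (surrogate)
    Xc = np.linspace(0, 2, 200).reshape(-1, 1)    # candidate grid
    ei = expected_improvement(Xc, gp, y.min())
    x_next = Xc[np.argmax(ei)]                    # maximize the infill criterion
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next))

print(X[np.argmin(y)], y.min())
```

The cubic cost the abstract mentions comes from the matrix inversion inside `gp.fit`; the thesis's BNN surrogates replace exactly that step.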


Energies ◽  
2020 ◽  
Vol 13 (23) ◽  
pp. 6360
Author(s):  
Georgios Gasparis ◽  
Wai Hou Lio ◽  
Fanzhong Meng

Fatigue damage of turbine components is typically computed by running a rain-flow counting algorithm on the load signals of the components. This process is nonlinear and time-consuming, which makes it non-trivial to apply in wind farm control design and optimisation. To compensate for this limitation, this paper develops and compares different types of surrogate models that can predict the short-term damage equivalent loads and electrical power of wind turbines in a wind farm, with respect to various wind conditions and down-regulation set-points. More specifically, Linear Regression, Artificial Neural Networks and Gaussian Process Regression are the types of surrogate models developed in this work. The results show that Gaussian Process Regression outperforms the other surrogate models and can effectively estimate the aforementioned target variables.
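The surrogate idea above can be sketched as a Gaussian Process Regression that maps operating conditions to a load-like target. The training data below is synthetic and the feature set (mean wind speed, down-regulation set-point) and target formula are assumptions for illustration only, not the paper's data or model.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
# Hypothetical training set: inputs are (mean wind speed [m/s],
# down-regulation set-point [fraction of rated power]); the target is a
# synthetic stand-in for a damage equivalent load from rain-flow counting.
X = np.column_stack([rng.uniform(4, 25, 200), rng.uniform(0.5, 1.0, 200)])
y = 0.1 * X[:, 0] ** 1.5 * X[:, 1] + rng.normal(0, 0.05, 200)

# Anisotropic RBF: one length scale per input dimension; WhiteKernel
# absorbs observation noise.
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=[5.0, 0.2]) + WhiteKernel(),
    normalize_y=True,
).fit(X, y)

# Predict the load proxy at 12 m/s and an 0.8 set-point, with uncertainty.
mu, std = gp.predict([[12.0, 0.8]], return_std=True)
```

Once fitted, evaluating the surrogate is nearly instantaneous, which is what makes it usable inside a farm-level optimisation loop where rain-flow counting would be far too slow.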



2020 ◽  
Vol 27 (1) ◽  
pp. 70-82 ◽  
Author(s):  
Aleksandar Radonjić ◽  
Danijela Pjevčević ◽  
Vladislav Maraš

This paper investigates the use of neural networks (NNs) for the problem of assigning push boats to barge convoys in inland waterway transportation (IWT). Push boat–barge convoy assignments are part of the daily decision-making process performed by dispatchers in IWT companies, for which no decision support tool exists. The aim of this paper is to develop a Neural Network Ensemble (NNE) model able to assist in push boat–barge convoy assignments based on push boat power. The primary objective is to derive an NNE model for calculating push boat Shaft Powers (SHPs) using less than 100% of the available experimental data. The NNE model is applied to a real-world case of more than one shipping company from the Republic of Serbia operating on the Danube River. The solution obtained from the NNE model is compared to real-world full-scale speed/power measurements carried out on Serbian push boats, as well as with the results obtained from a previous NNE model. The model is found to be highly accurate, with scope for further improvement.
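The ensemble principle behind an NNE model, training several networks and averaging their outputs, can be sketched as follows. The features, data, and network sizes here are hypothetical placeholders, not the paper's push boat measurements or architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
# Hypothetical data: two normalized push boat features (e.g. engine power,
# displacement) versus shaft power (SHP); values are synthetic.
X = rng.uniform(0, 1, size=(150, 2))
y = 1000 * (0.6 * X[:, 0] + 0.4 * X[:, 1]) + rng.normal(0, 10, 150)

# Ensemble: train several networks from different random initializations
# and average their predictions, which typically reduces variance compared
# to any single network.
ensemble = [
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                 random_state=seed).fit(X, y)
    for seed in range(5)
]

x_new = np.array([[0.5, 0.5]])
shp_pred = np.mean([m.predict(x_new) for m in ensemble], axis=0)
```

Averaging over members also gives a crude spread estimate (the per-member disagreement), which is useful when the model supports operational decisions such as convoy assignment.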


IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Wirot Yotsawat ◽  
Pakaket Wattuya ◽  
Anongnart Srivihok

2021 ◽  
Vol 71 ◽  
pp. 102029
Author(s):  
Evan Hann ◽  
Iulia A. Popescu ◽  
Qiang Zhang ◽  
Ricardo A. Gonzales ◽  
Ahmet Barutçu ◽  
...  
