EVOLUTIONARY GAUSSIAN PROCESSES

2021 ◽  
pp. 1-17
Author(s):  
Robert Planas Casadevall ◽  
Nicholas Oune ◽  
Ramin Bostanabad

Abstract Emulation plays an important role in engineering design. However, most emulators, such as Gaussian processes (GPs), are developed exclusively for interpolation/regression, and their performance significantly deteriorates in extrapolation. To address this shortcoming, we introduce evolutionary Gaussian processes (EGPs) that aim to increase the extrapolation capabilities of GPs. An EGP differs from a GP in that its training involves the automatic discovery of free-form symbolic bases that explain the data reasonably well. In our approach, this automatic discovery is achieved via evolutionary programming (EP), which is integrated with GP modeling via maximum likelihood estimation, bootstrap sampling, and singular value decomposition. As we demonstrate via examples that include a host of analytical functions as well as an engineering problem on materials modeling, EGPs can improve the performance of ordinary GPs in terms of not only extrapolation, but also interpolation/regression and numerical stability.
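As a rough illustration of the idea of using discovered symbolic bases as a GP mean function, the following numpy sketch fits the mean coefficients by plain least squares and models the residual with an RBF-kernel GP. The basis set, kernel settings, and the least-squares step are simplifications for illustration, not the paper's MLE/bootstrap/SVD machinery:

```python
import numpy as np

def rbf_kernel(X1, X2, length=1.0, var=1.0):
    # Squared-exponential kernel for 1-D inputs
    d = X1[:, None] - X2[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def fit_gp_with_bases(x, y, bases, noise=1e-6):
    # Design matrix of symbolic bases (e.g. bases discovered by EP)
    H = np.column_stack([b(x) for b in bases])
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # mean coefficients
    r = y - H @ beta                               # residual handled by the GP
    K = rbf_kernel(x, x) + noise * np.eye(len(x))
    alpha = np.linalg.solve(K, r)
    return beta, alpha

def predict(x_new, x, beta, alpha, bases):
    # Far from the data the kernel term vanishes, so the symbolic
    # mean carries the extrapolation
    H_new = np.column_stack([b(x_new) for b in bases])
    k = rbf_kernel(x_new, x)
    return H_new @ beta + k @ alpha
```

With the right bases in hand (here, a sine and a constant), predictions well outside the training region revert to the symbolic mean rather than to zero, which is the mechanism the abstract describes.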

Author(s):  
Robert Planas ◽  
Nicholas Oune ◽  
Ramin Bostanabad

Abstract Emulation plays an indispensable role in engineering design. However, the majority of emulation methods are formulated for interpolation purposes, and their performance significantly deteriorates in extrapolation. In this paper, we develop a method for extrapolation by integrating Gaussian processes (GPs) and evolutionary programming (EP). Our underlying assumption is that there is a set of free-form parametric bases that can model the data source reasonably well. Consequently, if we can find these bases via some training data over a region, we can make predictions outside that region. To systematically and efficiently find these bases, we start by learning a GP without any parametric mean function. Then, a rich dataset is generated by this GP and subsequently used in EP to find some parametric bases. Afterwards, we retrain the GP using the bases found by EP. This retraining essentially allows us to validate and/or correct the discovered bases via maximum likelihood estimation. By iterating between GP and EP, we robustly and efficiently find the underlying bases that can be used for extrapolation. We validate our approach on a host of analytical problems in the absence or presence of noise. We also study an engineering example on finding the constitutive law of a composite microstructure.
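The basis-discovery step can be caricatured as follows. This sketch replaces EP's free-form symbolic search with a greedy search over a small fixed dictionary, which is a substantial simplification; the dictionary, the greedy criterion, and the candidate names are all assumptions for illustration only:

```python
import numpy as np

# Hypothetical basis library standing in for EP's free-form symbolic search;
# real EP evolves expressions, this sketch only scores a fixed dictionary.
LIBRARY = {"x": lambda t: t, "x^2": lambda t: t ** 2,
           "sin": np.sin, "cos": np.cos, "exp": np.exp}

def pick_bases(x, y, n_bases=2):
    """Greedily keep the candidates that most reduce the squared residual."""
    chosen = []
    for _ in range(n_bases):
        scores = {}
        for name in LIBRARY:
            if name in chosen:
                continue
            # Score the candidate jointly with the bases chosen so far
            H = np.column_stack([LIBRARY[c](x) for c in chosen + [name]])
            beta, *_ = np.linalg.lstsq(H, y, rcond=None)
            scores[name] = np.sum((y - H @ beta) ** 2)
        chosen.append(min(scores, key=scores.get))
    return chosen
```

In the full method, the selected bases would then be fed back into the GP mean function and validated via maximum likelihood, closing the GP-EP loop described above.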


2001 ◽  
Author(s):  
John B. Ferris

Abstract Road profiles are typically characterized by their spectral content. It has been noted by several researchers, however, that road profiles are generally nonstationary signals that contain significant irregularities such as potholes. Such signals are not well described in the spectral domain. The objective of this work is to describe road profiles in the spatial domain by developing a set of characteristic shapes onto which individual events can be cast. A set of analytical functions describing these shapes is also developed. In order to develop a set of characteristic shapes, more than a million events are investigated from a mixture of road types (from highways to gravel roads) and a variety of locations throughout the United States. A set of characteristic shapes is developed for each road type and location. Although the events were drawn from diverse samples, the resulting sets of characteristic shapes are nearly indistinguishable. This similarity allows a single set of characteristic shapes to describe events for a wide class of roads. Variations in these sets are discussed and used in deriving a set of orthogonal analytical functions that describe the characteristic shapes. Individual road events are then mapped onto this set of characteristic shapes. The implications of decomposing road events into these characteristic shapes are discussed.
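Mapping an event onto a set of orthogonal shapes is a linear projection. The sketch below uses half-sine profiles as hypothetical characteristic shapes; the paper's empirically derived shapes are not reproduced here, so the shapes, their number, and the sampling grid are all illustrative assumptions:

```python
import numpy as np

# Hypothetical characteristic shapes: half-sine profiles standing in for the
# paper's empirically derived, orthogonal analytical functions.
s = np.linspace(0.0, 1.0, 200)
shapes = np.column_stack([np.sin(np.pi * s),       # single bump (pothole-like)
                          np.sin(2 * np.pi * s),   # up-then-down swell
                          np.sin(3 * np.pi * s)])
shapes /= np.linalg.norm(shapes, axis=0)           # unit-norm columns

def decompose(event):
    """Map an individual road event onto the characteristic shapes."""
    coeffs, *_ = np.linalg.lstsq(shapes, event, rcond=None)
    return coeffs, shapes @ coeffs                 # coefficients, reconstruction
```

The few coefficients returned per event are what make the spatial-domain description compact: a pothole-like event is summarized by its loadings on the shared shape set rather than by its full sampled profile.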


Technometrics ◽  
2015 ◽  
Vol 57 (1) ◽  
pp. 87-99 ◽  
Author(s):  
Enrique del Castillo ◽  
Bianca M. Colosimo ◽  
Sam Davanloo Tajbakhsh

2021 ◽  
Author(s):  
Yuan Jin ◽  
Jin Chai ◽  
Olivier Jung

Abstract Thanks to their flexibility and robustness to overfitting, Gaussian processes (GPs) are widely used as black-box function approximators. Deep Gaussian processes (DGPs) are multilayer generalizations of GPs. The deep architecture alleviates the kernel dependence of GPs but complicates model inference. The so-called doubly stochastic variational approach, which does not force independence between layers, has shown its effectiveness on large-dataset classification and regression in the literature. Meanwhile, similar to deep neural networks, DGPs also require application-specific architectures. In addition, the doubly stochastic process introduces extra hyperparameters, which further increases the difficulty of model definition and training. In this study, we apply a doubly stochastic variational inference DGP as a surrogate model for high-dimensional structural data regression drawn from the turbomachinery area. A discrete optimizer, based on a classifier that discriminates good solutions from bad ones, is utilized to realize automatic DGP model design and tuning. Empirical experiments are first performed on analytical functions to demonstrate the capability of DGPs in handling high-dimensional and non-stationary data. Two industrial turbomachinery problems with 80 and 180 input dimensions, respectively, are then addressed. The first application consists of a turbine frame design problem. In the second application, a DGP is used to describe the correlation between the 3D blade profiles of a multi-stage low-pressure turbine and the corresponding turbine total-to-total efficiency. Through these two applications, we show the applicability of the proposed automatically designed DGPs in the turbomachinery area by highlighting their superior performance relative to classic GPs.
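Why composing GP layers helps with non-stationary data can be shown with a deterministic toy sketch. Real doubly stochastic DGP inference samples the hidden layers and optimizes variational inducing points; here the hidden-layer targets (a squared warp) are supplied by hand purely for illustration, so this is not the paper's method:

```python
import numpy as np

def gp_posterior_mean(x_tr, y_tr, x, length=0.5, jitter=1e-4):
    # Posterior mean of a single RBF-kernel GP layer (noiseless + jitter)
    d = lambda a, b: a[:, None] - b[None, :]
    K = np.exp(-0.5 * (d(x_tr, x_tr) / length) ** 2) + jitter * np.eye(len(x_tr))
    k = np.exp(-0.5 * (d(x, x_tr) / length) ** 2)
    return k @ np.linalg.solve(K, y_tr)

def two_layer_gp(x_tr, h_tr, y_tr, x_new):
    # Layer 1 warps the inputs; layer 2 regresses on the warped coordinates.
    # h_tr plays the role of the hidden-layer values that doubly stochastic
    # variational inference would infer; here they are given directly.
    h_hat_tr  = gp_posterior_mean(x_tr, h_tr, x_tr)
    h_hat_new = gp_posterior_mean(x_tr, h_tr, x_new)
    return gp_posterior_mean(h_hat_tr, y_tr, h_hat_new)
```

A stationary kernel struggles when the target oscillates faster in some regions than others; letting the first layer stretch the input axis restores an approximately stationary problem for the second layer, which is the intuition behind the depth of a DGP.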


2015 ◽  
Vol 0 (0) ◽  
Author(s):  
Natthasurang Yasungnoen ◽  
Pairote Sattayatham

Abstract In this paper, we model the mortality rate in Thailand using the Lee-Carter model. Three classical methods, i.e. Singular Value Decomposition (SVD), Weighted Least Squares (WLS), and Maximum Likelihood Estimation (MLE), are used to estimate the parameters of the Lee-Carter model. With these methods, we investigate the goodness of fit for the mortality rate spanning the period 2003 to 2012, and the fitted models are compared. An autoregressive integrated moving average (ARIMA) model is used to forecast the general index and mortality rate for the time period from 2013 to 2022. As a result, we also forecast Thai life expectancy at birth.
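The SVD route to the Lee-Carter parameters is standard and can be sketched in a few lines of numpy. The model is log m(x, t) = a_x + b_x k_t; the age pattern a_x is the row mean of the log rates, and the leading singular vectors of the centered matrix give b_x and k_t (the data below in the test are synthetic, not Thai mortality data):

```python
import numpy as np

def fit_lee_carter(log_m):
    """SVD estimation of the Lee-Carter model log m(x,t) = a_x + b_x * k_t.

    log_m is an (ages x years) matrix of log central mortality rates.
    """
    a = log_m.mean(axis=1)                         # age pattern a_x
    Z = log_m - a[:, None]                         # centered log rates
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    b, k = U[:, 0], S[0] * Vt[0]                   # leading rank-1 factors
    # Usual identifiability constraint sum(b_x) = 1; k_t already sums to 0
    # because each row of Z has zero mean. Rescaling also fixes the SVD sign.
    b_sum = b.sum()
    return a, b / b_sum, k * b_sum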


Sign in / Sign up

Export Citation Format

Share Document