An adaptive sampling method for variable-fidelity surrogate models using improved hierarchical kriging

2017 ◽  
Vol 50 (1) ◽  
pp. 145-163 ◽  
Author(s):  
Jiexiang Hu ◽  
Qi Zhou ◽  
Ping Jiang ◽  
Xinyu Shao ◽  
Tingli Xie

Author(s):
David A. Romero ◽  
Cristina H. Amon ◽  
Susan Finger

In order to reduce the time and resources devoted to design-space exploration during simulation-based design and optimization, the use of surrogate models, or metamodels, has been proposed in the literature. Key to the success of metamodeling efforts are the experimental design techniques used to generate the combinations of input variables at which the computer experiments are conducted. Several adaptive sampling techniques have been proposed to tailor the experimental designs to the specific application at hand, using the already-acquired data to guide further exploration of the input space instead of using a fixed sampling scheme defined a priori. Though mixed results have been reported, it has been argued that adaptive sampling techniques can be more efficient, yielding better surrogate models with fewer sampling points. In this paper, we address the problem of adaptive sampling for single- and multi-response metamodels, with a focus on Multi-stage Multi-response Bayesian Surrogate Models (MMBSM). We compare distance-optimal Latin hypercube sampling, an entropy-based criterion, and the maximum cross-validation variance criterion, originally proposed for one-dimensional output spaces and implemented in this paper for multi-dimensional output spaces. Our results indicate that, for both single- and multi-response surrogate models, the entropy-based adaptive sampling approach leads to models that are more robust to the initial experimental design and at least as accurate as (or better than) those obtained with other sampling techniques using the same number of sampling points.
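For a Gaussian-process (kriging) surrogate, the entropy-based criterion discussed above reduces to adding, at each stage, the candidate point with the largest posterior predictive variance. The sketch below illustrates one such adaptive sampling loop on a single-response test problem; the scikit-learn GP, the Branin test function, and the random candidate pool are illustrative assumptions, not the MMBSM implementation evaluated in the paper.

```python
# Minimal sketch of an entropy-based adaptive sampling step for a single-response
# Gaussian-process (kriging) surrogate. For a GP, maximizing the entropy gain of a
# new observation amounts to maximizing its predictive variance. The test function,
# bounds, and candidate pool below are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def branin(x):
    # Example black-box response (Branin test function), stands in for the simulation.
    x1, x2 = x[:, 0], x[:, 1]
    return (x2 - 5.1 / (4 * np.pi**2) * x1**2 + 5 / np.pi * x1 - 6) ** 2 \
        + 10 * (1 - 1 / (8 * np.pi)) * np.cos(x1) + 10

rng = np.random.default_rng(0)
lower, upper = np.array([-5.0, 0.0]), np.array([10.0, 15.0])

# Initial experimental design (random here; the paper starts from Latin hypercube designs).
X = rng.uniform(lower, upper, size=(10, 2))
y = branin(X)

for it in range(20):                          # adaptive sampling loop
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                  normalize_y=True).fit(X, y)
    candidates = rng.uniform(lower, upper, size=(2000, 2))
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)]        # max predictive variance = max entropy gain
    X = np.vstack([X, x_new])
    y = np.append(y, branin(x_new[None, :]))

print("final design size:", X.shape[0])
```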


2020 ◽  
Vol 62 (3) ◽  
pp. 1563-1578
Author(s):  
Tengfei Tang ◽  
Gang Yang ◽  
Dijia Zhang ◽  
Lei Lei ◽  
Baoren Li ◽  
...  

2019 ◽  
Vol 61 (4) ◽  
pp. 1515-1528 ◽  
Author(s):  
Kuo Tian ◽  
Zengcong Li ◽  
Xiangtao Ma ◽  
Haixin Zhao ◽  
Jiaxin Zhang ◽  
...  

Sensors ◽  
2020 ◽  
Vol 20 (18) ◽  
pp. 5332
Author(s):  
Carlos A. Duchanoy ◽  
Hiram Calvo ◽  
Marco A. Moreno-Armendáriz

Surrogate Modeling (SM) is often used to reduce the computational burden of time-consuming system simulations. However, continuous advances in Artificial Intelligence (AI) and the spread of embedded sensors have led to the creation of Digital Twins (DT), Design Mining (DM), and Soft Sensors (SS). These methodologies pose a new challenge for the generation of surrogate models, since they require elaborate artificial intelligence algorithms while minimizing the number of physical experiments measured. To reduce the number of evaluations of the physical system, several adaptive sequential sampling methodologies have been developed; however, they are for the most part limited to Kriging models and Kriging-model-based Monte Carlo simulation. In this paper, we integrate a distinct adaptive sampling methodology into an automated machine learning (AutoML) methodology to assist in model selection while minimizing the number of system evaluations and maximizing performance for surrogate models based on artificial intelligence algorithms. In each iteration, the framework uses a grid search algorithm to determine the best candidate models and performs leave-one-out cross-validation to calculate the performance at each sampled point. A Voronoi diagram is applied to partition the sampling region into local cells, and the Voronoi vertices are considered as new candidate points. The performance at the sampled points is used to estimate the model's accuracy at the candidate points, so as to select those that will most improve the model's accuracy. Then, the number of candidate models is reduced. Finally, the performance of the framework is tested on two examples to demonstrate the applicability of the proposed method.
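One plausible reading of the Voronoi-assisted sampling step described above is sketched below: leave-one-out errors at the current samples flag the worst-modelled region, the in-domain Voronoi vertices of the design serve as candidate points, and the vertex nearest the worst-modelled sample is evaluated next. The random-forest surrogate and the analytic test function stand in for the AutoML-selected model and the physical system; all names and choices are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a Voronoi-assisted adaptive sampling loop driven by leave-one-out errors.
# Assumptions: surrogate = RandomForestRegressor, expensive system = analytic function,
# new point = in-domain Voronoi vertex closest to the worst-modelled sample.
import numpy as np
from scipy.spatial import Voronoi, cKDTree
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import LeaveOneOut

def system(X):
    # Stand-in for the expensive simulation / physical experiment.
    return np.sin(X[:, 0]) * np.cos(X[:, 1])

rng = np.random.default_rng(1)
X = rng.uniform(0.0, np.pi, size=(12, 2))   # initial design in [0, pi]^2
y = system(X)

def loo_errors(X, y):
    # Absolute leave-one-out prediction error at every sampled point.
    errs = np.empty(len(y))
    for i, (tr, te) in enumerate(LeaveOneOut().split(X)):
        model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X[tr], y[tr])
        errs[i] = abs(model.predict(X[te])[0] - y[te][0])
    return errs

for it in range(10):
    errs = loo_errors(X, y)
    worst = np.argmax(errs)                  # sample whose neighbourhood is modelled worst
    vor = Voronoi(X)                         # partition the sampling region into cells
    verts = vor.vertices
    inside = np.all((verts >= 0.0) & (verts <= np.pi), axis=1)
    verts = verts[inside]                    # keep only vertices inside the domain
    if len(verts) == 0:
        break
    # Candidate = Voronoi vertex closest to the worst-modelled sample (on its cell boundary).
    x_new = verts[cKDTree(verts).query(X[worst])[1]]
    X = np.vstack([X, x_new])
    y = np.append(y, system(x_new[None, :]))

print("final design size:", len(X))
```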


2015 ◽  
Author(s):  
Shuji Miyamoto ◽  
Yi-Da Hsieh ◽  
Kohei Kotani ◽  
Sho Okubo ◽  
Hajime Inaba ◽  
...  
