High-Dimensional Uncertainty Quantification via Active and Rank-Adaptive Tensor Regression

Author(s):
Zichang He
Zheng Zhang


Author(s):
Zequn Wang
Mingyang Li

Conventional uncertainty quantification methods usually lack the capability to deal with high-dimensional problems due to the curse of dimensionality. This paper presents a semi-supervised learning framework for dimension reduction and reliability analysis. An autoencoder is first adopted to map the high-dimensional space into a low-dimensional latent space that contains a distinguishable failure surface. A deep feedforward neural network (DFN) is then utilized to learn the mapping relationship and reconstruct the latent space, while the Gaussian process (GP) modeling technique is used to build a surrogate model of the transformed limit state function. During training of the DFN, the discrepancy between the actual and reconstructed latent spaces is minimized through semi-supervised learning to ensure accuracy, and both labeled and unlabeled samples are utilized to define the loss function. An evolutionary algorithm is adopted to train the DFN, after which the Monte Carlo simulation method is used for uncertainty quantification and reliability analysis within the proposed framework. The effectiveness of the framework is demonstrated through a mathematical example.
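As a rough illustration of the pipeline described above, the following sketch substitutes PCA for the autoencoder and omits the DFN reconstruction step; the limit state function g, the latent dimension, and the sample sizes are all illustrative assumptions, not details from the paper.

```python
# A minimal sketch of the dimension-reduction + GP-surrogate + Monte Carlo
# pipeline, assuming PCA as a stand-in for the autoencoder and a toy limit
# state g(); all names and sizes are illustrative, not the paper's setup.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
d, d_latent, n_train = 40, 4, 300
W = rng.standard_normal((d_latent, d))

def sample_inputs(n):
    # High-dimensional inputs with an intrinsic low-dimensional structure.
    return rng.standard_normal((n, d_latent)) @ W + 0.01 * rng.standard_normal((n, d))

def g(x):
    # Hypothetical limit state function: failure when g(x) < 0.
    return 9.0 - np.sum(x**2, axis=1) / d

# Step 1: dimension reduction (the paper uses an autoencoder; PCA here).
X_train = sample_inputs(n_train)
encoder = PCA(n_components=d_latent).fit(X_train)

# Step 2: GP surrogate of the limit state function in the latent space.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0), normalize_y=True)
gp.fit(encoder.transform(X_train), g(X_train))

# Step 3: Monte Carlo simulation through the cheap surrogate.
X_mc = sample_inputs(50_000)
p_fail = np.mean(gp.predict(encoder.transform(X_mc)) < 0.0)
print("estimated failure probability:", p_fail)
```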


Author(s):  
Xi Cheng
Clément Henry
Francesco P. Andriulli
Christian Person
Joe Wiart

This paper focuses on quantifying the uncertainty in the specific absorption rate (SAR) values in the brain induced by the uncertain positions of the electroencephalography electrodes placed on a patient's scalp. To avoid running a large number of simulations, an artificial neural network architecture for uncertainty quantification involving high-dimensional data is proposed. The proposed method is demonstrated to be an attractive alternative to conventional uncertainty quantification methods because of its considerable advantages in computational cost and speed.
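A minimal sketch of the underlying surrogate idea: a neural network is trained on a modest number of solver runs and then used for cheap Monte Carlo propagation. The function expensive_simulation is a hypothetical placeholder for the electromagnetic solver, and the plain scikit-learn MLP stands in for the paper's architecture.

```python
# Sketch of neural-network-based uncertainty propagation, under assumed
# placeholders: expensive_simulation() is NOT the paper's solver, and the
# input dimension d stands in for the electrode-position perturbations.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

def expensive_simulation(p):
    # Placeholder for the costly full-wave solver; quadratic toy response.
    return 1.0 + 0.1 * np.sum(p**2, axis=1)

d = 64                                   # dimension of the uncertain inputs
P_train = rng.normal(0.0, 1.0, (500, d)) # a modest number of solver runs
y_train = expensive_simulation(P_train)

net = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=2000,
                   random_state=0).fit(P_train, y_train)

# Propagate the input uncertainty through the cheap surrogate.
P_mc = rng.normal(0.0, 1.0, (50_000, d))
y_mc = net.predict(P_mc)
print(f"mean={y_mc.mean():.3f}  std={y_mc.std():.3f}  "
      f"p95={np.quantile(y_mc, 0.95):.3f}")
```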


2020
Author(s):
Zhouji Liang
Florian Wellmann

Uncertainty quantification is an important aspect of geological modelling and model interpretation. Recent developments in geological modelling allow us to view inversion as a problem in Bayesian inference, incorporating the uncertainties in the observations, the forward models, and the prior knowledge from geologists. The Markov chain Monte Carlo (MCMC) sampling method is then often applied to solve this inference problem. However, this stochastic modelling approach scales poorly as the number of parameters increases. To ensure efficient sampling in high-dimensional problems, this work takes advantage of recent advances in Hessian-based MCMC methods. The Hessian of the negative log posterior with respect to the input parameters is evaluated at the maximum a posteriori (MAP) point, and a Laplace approximation of the posterior at the MAP is then constructed with covariance given by the inverse of this local Hessian. This sampling approach offers a potentially less computationally expensive and more efficient route to high-dimensional geological inverse modelling, especially when parameters are highly correlated, a situation that commonly arises in geological modelling.
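A minimal sketch of the Laplace-at-MAP construction, assuming a hypothetical quadratic negative log posterior in place of the geological forward model; the finite-difference Hessian and the dimensions are illustrative only.

```python
# Sketch: find the MAP, evaluate the Hessian of the negative log posterior
# there, and sample from the Laplace approximation N(m_map, H^{-1}).
# neg_log_post() is a fabricated toy posterior, not a geological model.
import numpy as np
from scipy.optimize import minimize

def neg_log_post(m):
    # Hypothetical correlated negative log posterior; the lower-triangular
    # coupling in A mimics highly correlated parameters.
    A = np.eye(m.size) + 0.5 * np.tri(m.size, k=-1)
    r = A @ m - 1.0
    return 0.5 * r @ r

d = 10
m_map = minimize(neg_log_post, np.zeros(d), method="BFGS").x

# Finite-difference Hessian of the negative log posterior at the MAP.
eps = 1e-4
I = np.eye(d)
H = np.zeros((d, d))
f0 = neg_log_post(m_map)
for i in range(d):
    for j in range(d):
        H[i, j] = (neg_log_post(m_map + eps * (I[i] + I[j]))
                   - neg_log_post(m_map + eps * I[i])
                   - neg_log_post(m_map + eps * I[j]) + f0) / eps**2

# Laplace approximation: posterior ~ N(m_map, inv(H)).
cov = np.linalg.inv(H)
cov = 0.5 * (cov + cov.T)                # enforce symmetry numerically
samples = np.random.default_rng(2).multivariate_normal(m_map, cov, size=5000)
print("posterior mean estimate (first 3 dims):", samples.mean(axis=0)[:3])
```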


Author(s):  
Hongyi Xu
Zhen Jiang
Daniel W. Apley
Wei Chen

Data-driven random process models have become increasingly important for uncertainty quantification (UQ) in science and engineering applications, owing to their ability to capture both the marginal distributions and the correlations of high-dimensional responses. However, the choice of a random process model is neither unique nor straightforward. To quantitatively validate the accuracy of random process UQ models, new metrics are needed to measure their capability in capturing the statistical information of high-dimensional data collected from simulations or experimental tests. In this work, two goodness-of-fit (GOF) metrics, namely a statistical moment-based metric (SMM) and an M-margin U-pooling metric (MUPM), are proposed for comparing different stochastic models, accounting for their capabilities in capturing the marginal distributions and the correlations in spatial/temporal domains. This work demonstrates the effectiveness of the two proposed metrics by comparing the accuracies of four random process models (Gaussian process (GP), Gaussian copula, Hermite polynomial chaos expansion (PCE), and Karhunen–Loève (K–L) expansion) in multiple numerical examples and an engineering example of stochastic analysis of microstructural material properties. In addition to the new metrics, this paper provides insights into the pros and cons of various data-driven random process models in UQ.
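A minimal sketch of a moment-based comparison in the spirit of the SMM; the exact metric definitions in the paper may differ, and the two candidate "models" below are fabricated Gaussian processes used only to show that such a metric separates a good fit from a bad one.

```python
# Sketch of a moment-based goodness-of-fit comparison: aggregate discrepancy
# in low-order marginal moments and in the spatial correlation matrix between
# observed response curves and samples from a candidate random process model.
# This is an illustrative stand-in for the SMM, not the paper's definition.
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(3)

def moment_metric(data, model_samples):
    # Mean absolute discrepancy in mean, std, skewness, and correlations.
    terms = [
        np.abs(data.mean(0) - model_samples.mean(0)).mean(),
        np.abs(data.std(0) - model_samples.std(0)).mean(),
        np.abs(skew(data, axis=0) - skew(model_samples, axis=0)).mean(),
        np.abs(np.corrcoef(data.T) - np.corrcoef(model_samples.T)).mean(),
    ]
    return sum(terms)

# Synthetic "observed" process: correlated Gaussian responses on 20 points.
t = np.linspace(0.0, 1.0, 20)
C = np.exp(-((t[:, None] - t[None, :]) ** 2) / 0.1)
data = rng.multivariate_normal(np.zeros(20), C, size=300)

# Candidate A matches the covariance; candidate B ignores the correlation.
model_a = rng.multivariate_normal(np.zeros(20), C, size=300)
model_b = rng.standard_normal((300, 20))

print("model A metric:", moment_metric(data, model_a))   # small
print("model B metric:", moment_metric(data, model_b))   # larger
```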

