A dimension-reduced neural network-assisted approximate Bayesian computation for inverse heat conduction problems

Author(s):  
Yang Zeng

Abstract Due to its flexibility and feasibility in addressing ill-posed problems, the Bayesian method has been widely used in inverse heat conduction problems (IHCPs). However, in real science and engineering IHCPs, the likelihood function of the Bayesian method is commonly computationally expensive or analytically unavailable. In this study, to circumvent this intractable likelihood function, approximate Bayesian computation (ABC) is extended to IHCPs. In ABC, the high-dimensional observations in the intractable likelihood function are replaced by their low-dimensional summary statistics, so the performance of ABC depends on the selection of summary statistics. In this study, a machine learning-based ABC (ML-ABC) is proposed to address the complicated selection of summary statistics. The Auto-Encoder (AE) is a powerful machine learning (ML) framework that can compress the observations into very low-dimensional summary statistics with little information loss. In addition, to accelerate the proposed framework, another neural network (NN) is used to construct the mapping between the unknowns and the summary statistics. With this mapping, the summary statistics for arbitrary unknowns can be obtained efficiently without solving the time-consuming forward problem with a numerical method. Furthermore, an adaptive nested sampling method (ANSM) is developed to further improve the sampling efficiency. The performance of the proposed method is demonstrated with two IHCP cases.
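Below is a minimal Python sketch of the ML-ABC idea under stated assumptions: the encoder stands in for a trained Auto-Encoder, the surrogate stands in for the trained NN mapping unknowns to summaries, and the prior, tolerance, and observations are illustrative rather than taken from the paper.

```python
# A minimal sketch of ML-ABC: rejection ABC with learned summaries and
# a surrogate that bypasses the numerical forward solve. All models
# below are stand-ins for illustration.
import numpy as np

rng = np.random.default_rng(0)

def encoder(obs):
    # Stand-in for a trained Auto-Encoder's encoder: compresses a
    # high-dimensional observation vector into a few summary statistics.
    return np.array([obs.mean(), obs.std(), obs[-1]])

def surrogate(theta):
    # Stand-in for the trained NN mapping unknowns -> summaries,
    # replacing an expensive forward heat-conduction solve.
    return np.array([theta[0], 0.1 * theta[1], theta[0] + theta[1]])

def abc_rejection(obs, prior_sampler, eps, n_draws=100_000):
    s_obs = encoder(obs)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler()
        # Surrogate gives summaries directly; no numerical forward solve.
        if np.linalg.norm(surrogate(theta) - s_obs) <= eps:
            accepted.append(theta)
    return np.array(accepted)

obs = rng.normal(1.0, 0.1, size=500)           # hypothetical observations
prior = lambda: rng.uniform(0.0, 2.0, size=2)  # hypothetical prior on unknowns
posterior_samples = abc_rejection(obs, prior, eps=0.3)
print(len(posterior_samples))
```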

2021 ◽  
Author(s):  
George Karabatsos

Abstract Approximate Bayesian Computation (ABC) can provide inferences from the (approximate) posterior distribution based on intractable likelihoods. The quality of ABC inferences relies on the choice of tolerance for the distance between the observed data summary statistics and the pseudo-data summary statistics simulated from the likelihood, used within an algorithm that samples from the approximate posterior. However, the ABC literature does not provide an automatic method to select the best tolerance level for the dataset at hand, and in ABC practice finding the best tolerance level can be time consuming. This note introduces a fast automatic estimator of the tolerance, based on the parametric bootstrap. Once calculated, the tolerance estimate can be input into any suitable importance sampling or MCMC algorithm to sample from the target approximate posterior distribution. The estimator is illustrated through ABC analyses of simulated and real datasets involving several intractable likelihood models, including the analysis of a real 23,000-node network dataset involving stochastic search model selection.
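One plausible reading of such an estimator, sketched in Python: simulate bootstrap pseudo-datasets at a point estimate of the parameter and take a quantile of the resulting summary-statistic distances as the tolerance. The model, point estimate, and quantile below are assumptions for illustration, not the note's exact recipe.

```python
# A hedged sketch of a parametric-bootstrap tolerance estimate.
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, n):
    # Stand-in for the intractable model's simulator.
    return rng.normal(theta, 1.0, size=n)

def summary(x):
    return np.array([x.mean(), np.median(x)])

def bootstrap_tolerance(x_obs, theta_hat, n_boot=1000, q=0.5):
    # Distances between bootstrap summaries and the observed summary;
    # the chosen quantile q is illustrative.
    s_obs = summary(x_obs)
    dists = [np.linalg.norm(summary(simulate(theta_hat, x_obs.size)) - s_obs)
             for _ in range(n_boot)]
    return np.quantile(dists, q)

x_obs = simulate(0.7, n=200)
theta_hat = x_obs.mean()                       # crude point estimate
eps = bootstrap_tolerance(x_obs, theta_hat)
print(f"estimated tolerance: {eps:.3f}")
```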


2016 ◽  
Author(s):  
Emma Saulnier ◽  
Olivier Gascuel ◽  
Samuel Alizon

Abstract Phylodynamics typically relies on likelihood-based methods to infer epidemiological parameters from dated phylogenies. These methods are essentially based on simple epidemiological models because of the difficulty of expressing the likelihood function analytically. Computing this function numerically raises additional challenges, especially for large phylogenies. Here, we use Approximate Bayesian Computation (ABC) to circumvent these problems. ABC is a likelihood-free method of parameter inference, based on simulation and on the comparison between target data and simulated data using summary statistics. We simulated target trees under several epidemiological scenarios to assess the accuracy of ABC methods for inferring epidemiological parameters such as the basic reproduction number (R0), the mean duration of infection, and the effective host population size. We designed many summary statistics to capture the information in a phylogeny and its corresponding lineage-through-time plot. We then used the simplest ABC method, called rejection, and its modern derivative in which the posterior distribution is adjusted by regression. The availability of machine learning techniques, including variable selection, motivated us to compute many summary statistics on the phylogeny. We found that ABC-based inference reaches an accuracy comparable to that of likelihood-based methods for birth-death models and can even outperform existing methods for more refined models and large trees. By re-analysing data from the early stages of the recent Ebola epidemic in Sierra Leone, we also found that ABC provides more realistic estimates than the likelihood-based methods for some parameters. This work shows that combining ABC-based inference using many summary statistics with sophisticated machine learning methods able to perform variable selection is a promising approach for analysing large phylogenies and non-trivial models.
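The rejection-plus-regression scheme described here is the standard one; a minimal Python sketch follows, with the tree simulator and phylogeny summaries replaced by stand-ins.

```python
# A minimal sketch of rejection ABC followed by linear regression
# adjustment of the accepted parameters (Beaumont-style).
import numpy as np

rng = np.random.default_rng(2)

def simulate_summaries(theta):
    # Stand-in for simulating a dated phylogeny under an epidemiological
    # model and computing many summary statistics on it.
    return theta + rng.normal(0, 0.2, size=theta.shape)

def abc_regression(s_obs, prior_sampler, n_sims=20_000, accept_frac=0.01):
    thetas = np.array([prior_sampler() for _ in range(n_sims)])
    summaries = np.array([simulate_summaries(t) for t in thetas])
    d = np.linalg.norm(summaries - s_obs, axis=1)
    keep = d <= np.quantile(d, accept_frac)    # rejection step
    S, T = summaries[keep], thetas[keep]
    # Regression adjustment: regress parameters on summaries, then
    # shift the accepted draws toward the observed summaries.
    X = np.hstack([np.ones((S.shape[0], 1)), S - s_obs])
    beta, *_ = np.linalg.lstsq(X, T, rcond=None)
    return T - (S - s_obs) @ beta[1:]

prior = lambda: rng.uniform(0.0, 3.0, size=2)  # e.g. (R0, infection duration)
s_obs = np.array([1.8, 0.9])                   # hypothetical observed summaries
adjusted = abc_regression(s_obs, prior)
print(adjusted.mean(axis=0))
```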


Entropy ◽  
2021 ◽  
Vol 23 (3) ◽  
pp. 312
Author(s):  
Ilze A. Auzina ◽  
Jakub M. Tomczak

Many real-life processes are black-box problems; i.e., the internal workings are inaccessible or a closed-form mathematical expression of the likelihood function cannot be defined. For continuous random variables, likelihood-free inference problems can be solved via Approximate Bayesian Computation (ABC). However, an optimal alternative for discrete random variables is yet to be formulated. Here, we aim to fill this research gap. We propose an adjusted population-based MCMC ABC method by re-defining the standard ABC parameters for discrete ones and by introducing a novel Markov kernel inspired by differential evolution. We first assess the proposed Markov kernel on a likelihood-based inference problem, namely discovering the underlying diseases based on a QMR-DT network, and subsequently assess the entire method on three likelihood-free inference problems: (i) the QMR-DT network with an unknown likelihood function, (ii) learning a binary neural network, and (iii) neural architecture search. The obtained results indicate the high potential of the proposed framework and the superiority of the new Markov kernel.
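A hedged Python sketch of one way such a differential-evolution-inspired kernel could act on binary parameters: flip the current member's bits where two randomly chosen population members disagree. This captures the spirit of the proposal, not the paper's exact kernel; the discrepancy function and tolerance are toy stand-ins.

```python
# A hedged sketch of a population-based ABC step for binary parameters
# with a differential-evolution-inspired proposal.
import numpy as np

rng = np.random.default_rng(3)

def de_binary_proposal(pop, i, flip_prob=0.9):
    # Pick two donors other than member i; flip member i's bits at a
    # random subset of the positions where the donors disagree.
    n = len(pop)
    r1, r2 = rng.choice([j for j in range(n) if j != i], size=2, replace=False)
    diff = pop[r1] ^ pop[r2]
    mask = diff & (rng.random(diff.shape) < flip_prob)
    return pop[i] ^ mask

def distance(x):
    # Stand-in ABC discrepancy between simulated and observed data;
    # here just a toy target favouring a fixed bit pattern.
    target = np.array([1, 0, 1, 1, 0, 1, 0, 0])
    return np.abs(x - target).sum()

pop = rng.integers(0, 2, size=(10, 8))
eps = 2
for step in range(500):
    i = step % len(pop)
    prop = de_binary_proposal(pop, i)
    if distance(prop) <= eps:                  # ABC accept within tolerance
        pop[i] = prop
print(pop)
```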


2008 ◽  
Vol 130 (3) ◽  
Author(s):  
Y. Hwang ◽  
S. Deng

The primary cause of gun barrel erosion is the heat generated by the shell as it travels along the barrel. Therefore, calculating the heat flux input to the gun bore is very important when investigating wear problems in the gun barrel and examining its thermomechanical properties. This paper employs the continuous-time analog Hopfield neural network (CHNN) to compute the temperature distribution in various forward heat conduction problems. An efficient technique is then proposed for the solution of inverse heat conduction problems using a three-layered backpropagation neural network (BPN). The weak generalization capacity of BPN networks when applied to the solution of nonlinear function approximations is improved by employing the Bayesian regularization algorithm. The CHNN scheme is used to calculate the temperature in a 155 mm gun barrel, and the trained BPN is then used to estimate the heat flux at the inner surface of the barrel. The results show that the proposed neural network analysis method successfully solves forward heat conduction problems and is capable of predicting the unknown parameters in inverse problems with an acceptable error.
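A minimal Python sketch of the inverse step under stated assumptions: a synthetic forward model stands in for the CHNN solver, and L2 weight decay (alpha) stands in for the Bayesian regularization algorithm used in the paper.

```python
# A minimal sketch: train a regularized feed-forward network to map
# simulated bore-temperature histories to the heat flux that produced
# them. Data and forward model are synthetic stand-ins.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)

def forward_model(q):
    # Stand-in for the forward conduction solver: maps a heat-flux
    # level q to a noisy temperature history at sensor locations.
    t = np.linspace(0.0, 1.0, 20)
    return q * (1.0 - np.exp(-3.0 * t)) + rng.normal(0, 0.05, size=t.size)

q_train = rng.uniform(0.5, 5.0, size=2000)
T_train = np.array([forward_model(q) for q in q_train])

# L2 weight decay (alpha) approximates the regularization role played
# by Bayesian regularization in the paper.
inverse_net = MLPRegressor(hidden_layer_sizes=(32,), alpha=1e-3,
                           max_iter=2000, random_state=0)
inverse_net.fit(T_train, q_train)              # temperatures -> heat flux

q_true = 2.5
q_est = inverse_net.predict(forward_model(q_true).reshape(1, -1))
print(f"true flux {q_true}, estimated {q_est[0]:.2f}")
```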


Sensors ◽  
2020 ◽  
Vol 20 (11) ◽  
pp. 3197 ◽  
Author(s):  
Zhouquan Feng ◽  
Yang Lin ◽  
Wenzan Wang ◽  
Xugang Hua ◽  
Zhengqing Chen

A novel probabilistic approach for model updating based on approximate Bayesian computation with subset simulation (ABC-SubSim) is proposed for damage assessment of structures using modal data. ABC-SubSim is a likelihood-free Bayesian approach in which an explicit expression of the likelihood function is avoided and the posterior samples of model parameters are obtained using the technique of subset simulation. The novel contributions of this paper are threefold: first, the introduction of new stopping criteria to find an appropriate tolerance level for the metric used in ABC-SubSim; second, the employment of a hybrid optimization scheme to find finer optimal values for the model parameters; and third, the adoption of an iterative approach to determine the optimal weighting factors related to the residuals of modal frequency and mode shape in the metric. The effectiveness of this approach is demonstrated using three illustrative examples.
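A hedged Python sketch of the ABC-SubSim mechanism: the tolerance shrinks level by level as a quantile of the current discrepancies, and the population is regrown from the retained seeds with short MCMC moves. The paper's modal-data metric, stopping criteria, and weighting scheme are not reproduced; the discrepancy and prior below are toy stand-ins.

```python
# A hedged sketch of ABC with subset simulation (ABC-SubSim style).
import numpy as np

rng = np.random.default_rng(5)

def metric(theta):
    # Stand-in discrepancy between model-predicted and measured modal
    # data (frequencies/mode shapes) for stiffness parameters theta.
    return np.linalg.norm(theta - np.array([0.8, 1.2]))

def abc_subsim(prior_sampler, n=1000, p0=0.1, n_levels=5, step=0.1):
    thetas = np.array([prior_sampler() for _ in range(n)])
    d = np.array([metric(t) for t in thetas])
    for _ in range(n_levels):
        eps = np.quantile(d, p0)               # adaptive tolerance level
        seeds = thetas[d <= eps]
        thetas, d = [], []
        for s in seeds:
            t = s.copy()
            for _ in range(int(1 / p0)):       # regrow population by MCMC
                prop = t + rng.normal(0, step, size=t.shape)
                if metric(prop) <= eps:        # accept only within tolerance
                    t = prop
                thetas.append(t.copy())
                d.append(metric(t))
        thetas, d = np.array(thetas), np.array(d)
    return thetas

prior = lambda: rng.uniform(0.0, 2.0, size=2)  # hypothetical stiffness ratios
samples = abc_subsim(prior)
print(samples.mean(axis=0))
```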


2019 ◽  
Vol 30 (3) ◽  
pp. 559-570
Author(s):  
Jukka Sirén ◽  
Samuel Kaski

Abstract Approximate Bayesian computation (ABC) and other likelihood-free inference methods have gained popularity in the last decade, as they allow rigorous statistical inference for complex models without analytically tractable likelihood functions. A key component for accurate inference with ABC is the choice of summary statistics, which should summarize the information in the data while remaining low-dimensional for efficiency. Several dimension reduction techniques have been introduced to automatically construct informative and low-dimensional summaries from a possibly large pool of candidate summaries. Projection-based methods, which learn simple functional relationships from the summaries to the parameters, are widely used and usually perform well, but can fail when the assumptions behind the transformation are not satisfied. We introduce a localization strategy for any projection-based dimension reduction method, in which the transformation is estimated in the neighborhood of the observed data instead of over the whole space. Localization strategies have been suggested before, but the performance of the transformed summaries outside the local neighborhood has not been guaranteed. In our localization approach, the transformation is validated and optimized over validation datasets, ensuring reliable performance. We demonstrate the improvement in estimation accuracy for localized versions of linear regression and partial least squares, for three models of varying complexity.
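A minimal Python sketch of localized linear-regression dimension reduction, assuming a toy simulator: the summaries-to-parameters map is fitted only on simulations near the observed data, and its predictions act as the new low-dimensional summaries. The validation-and-optimization loop over held-out datasets described here is omitted from the sketch.

```python
# A minimal sketch of localized projection-based dimension reduction.
import numpy as np

rng = np.random.default_rng(6)

def simulate(theta):
    # Stand-in simulator producing a pool of candidate summaries.
    return np.concatenate([theta, theta**2]) + rng.normal(0, 0.1, size=4)

thetas = rng.uniform(0, 2, size=(5000, 2))
pool = np.array([simulate(t) for t in thetas])
s_obs = simulate(np.array([1.0, 0.5]))

# Localization: keep only simulations in a neighbourhood of s_obs.
d = np.linalg.norm(pool - s_obs, axis=1)
local = d <= np.quantile(d, 0.05)

# Projection: linear regression from candidate summaries to parameters,
# estimated locally; its predictions serve as the new summaries.
X = np.hstack([np.ones((local.sum(), 1)), pool[local]])
beta, *_ = np.linalg.lstsq(X, thetas[local], rcond=None)
project = lambda s: np.concatenate([[1.0], s]) @ beta

print("low-dimensional summary of s_obs:", project(s_obs))
```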

