Surrogate-assisted Bayesian inversion for landscape and basin evolution models

2019 ◽  
Author(s):  
Rohitash Chandra ◽  
Danial Azam ◽  
Arpit Kapoor ◽  
R. Dietmar Müller

Abstract. The complex and computationally expensive features of forward landscape and sedimentary basin evolution models pose a major challenge for the development of efficient inference and optimization methods. Bayesian inference provides a methodology for the estimation and uncertainty quantification of free model parameters. In our previous work, parallel tempering Bayeslands was developed as a framework for parameter estimation and uncertainty quantification for the landscape and basin evolution modelling software Badlands. Parallel tempering Bayeslands features high-performance computing with dozens of processing cores running in parallel to enhance computational efficiency. Although parallel computing is used, the procedure remains computationally challenging since thousands of samples need to be drawn and evaluated. In large-scale landscape and basin evolution problems, a single model evaluation can take from several minutes to hours, and in certain cases even days. Surrogate-assisted optimization has been successfully applied to a number of engineering problems, which motivates its use in optimization and inference methods suited to complex models in geology and geophysics. Parallel tempering Bayeslands can be sped up by developing computationally inexpensive surrogates that mimic the expensive model. In this paper, we present an application of surrogate-assisted parallel tempering in which the surrogate mimics a landscape evolution model, including erosion, sediment transport and deposition, by estimating the likelihood function given by the model. We employ a machine learning model as a surrogate that learns from the samples generated by the parallel tempering algorithm and the likelihoods from the model. The entire framework is developed in a parallel computing infrastructure to take advantage of parallelization. The results show that the proposed methodology is effective in significantly lowering the overall computational cost while retaining the quality of solutions.

2020 ◽  
Vol 13 (7) ◽  
pp. 2959-2979
Author(s):  
Rohitash Chandra ◽  
Danial Azam ◽  
Arpit Kapoor ◽  
R. Dietmar Müller

Abstract. The complex and computationally expensive nature of landscape evolution models poses significant challenges to the inference and optimization of unknown model parameters. Bayesian inference provides a methodology for estimation and uncertainty quantification of unknown model parameters. In our previous work, we developed parallel tempering Bayeslands as a framework for parameter estimation and uncertainty quantification for the Badlands landscape evolution model. Parallel tempering Bayeslands features high-performance computing with dozens of processing cores running in parallel to enhance computational efficiency. Nevertheless, the procedure remains computationally challenging since thousands of samples need to be drawn and evaluated. In large-scale landscape evolution problems, a single model evaluation can take from several minutes to hours and, in some instances, even days or weeks. Surrogate-assisted optimization has been used for several computationally expensive engineering problems, which motivates its use in the optimization and inference of complex geoscientific models. The use of surrogate models can speed up parallel tempering Bayeslands by developing computationally inexpensive models to mimic expensive ones. In this paper, we apply surrogate-assisted parallel tempering, where the surrogate mimics a landscape evolution model by estimating the likelihood function from the model. We employ a neural-network-based surrogate model that learns from the history of samples generated. The entire framework is developed in a parallel computing infrastructure to take advantage of parallelism. The results show that the proposed methodology is effective in significantly lowering the computational cost while retaining the quality of model predictions.
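The surrogate idea above can be condensed into a few lines: a cheap regression model is trained online on past (parameter, log-likelihood) pairs and, after a warm-up phase, occasionally answers in place of the expensive model. The sketch below is a single-chain Metropolis illustration with a toy Gaussian log-likelihood standing in for a Badlands run, not the parallel-tempering implementation from the paper; all names, the quadratic surrogate, and the settings are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_loglik(theta):
    # Stand-in for an expensive landscape-model run: a simple Gaussian log-likelihood.
    return -0.5 * np.sum((theta - 1.0) ** 2)

class Surrogate:
    """Tiny ridge regression on quadratic features, trained on past (theta, loglik) pairs."""
    def __init__(self):
        self.X, self.y, self.w = [], [], None

    def _feats(self, theta):
        return np.concatenate([[1.0], theta, theta ** 2])

    def update(self, theta, ll):
        self.X.append(self._feats(theta)); self.y.append(ll)
        A, b = np.asarray(self.X), np.asarray(self.y)
        self.w = np.linalg.solve(A.T @ A + 1e-6 * np.eye(A.shape[1]), A.T @ b)

    def predict(self, theta):
        return float(self._feats(theta) @ self.w)

def surrogate_assisted_mcmc(n_samples=2000, surrogate_prob=0.5, warmup=200):
    theta = np.zeros(2)
    ll = true_loglik(theta)
    surr = Surrogate(); surr.update(theta, ll)
    chain, true_calls = [], 1
    for i in range(n_samples):
        prop = theta + 0.5 * rng.normal(size=2)
        if i > warmup and rng.random() < surrogate_prob:
            prop_ll = surr.predict(prop)          # cheap surrogate evaluation
        else:
            prop_ll = true_loglik(prop)           # expensive "model" evaluation
            true_calls += 1
            surr.update(prop, prop_ll)            # surrogate learns from sampling history
        if np.log(rng.random()) < prop_ll - ll:   # Metropolis accept/reject
            theta, ll = prop, prop_ll
        chain.append(theta.copy())
    return np.asarray(chain), true_calls

chain, true_calls = surrogate_assisted_mcmc()
```

The fraction of true-model calls drops roughly by `surrogate_prob` after warm-up, which is where the cost saving comes from.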


Author(s):  
Hui Huang ◽  
Jian Chen ◽  
Blair Carlson ◽  
Hui-Ping Wang ◽  
Paul Crooker ◽  
...  

Due to their enormous computational cost, current residual stress simulations of multipass girth welds are mostly performed using two-dimensional (2D) axisymmetric models. A 2D model can provide only a limited estimate of the residual stresses because it assumes an axisymmetric distribution. In this study, a highly efficient thermal-mechanical finite element code for three-dimensional (3D) models has been developed based on high-performance graphics processing unit (GPU) computers. Our code is further accelerated by exploiting the unique physics of welding processes, which are characterized by a steep temperature gradient and a moving arc heat source. It is capable of modeling large-scale welding problems that cannot be easily handled by existing commercial simulation tools. To demonstrate its accuracy and efficiency, our code was compared with commercial software by simulating a 3D multipass girth weld model with over 1 million elements. Our code achieved comparable solution accuracy but with an over 100-fold reduction in computational cost. Moreover, the three-dimensional analysis revealed a more realistic stress distribution that is not axisymmetric in the hoop direction.
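The arc-locality idea, confining the expensive update to the steep-gradient region around the moving heat source, can be illustrated with a toy explicit finite-difference heat solver. This is only a schematic of the speed-up strategy (the paper's code is a full 3D thermal-mechanical FE solver on GPUs); the grid size, window width, and heat input below are invented.

```python
import numpy as np

def step_heat(T, src_pos, alpha=0.1, window=10, q=5.0):
    """One explicit finite-difference step of 2D heat diffusion, updating only a
    square 'active' window around the moving arc position; the field outside the
    window, where gradients are shallow, is treated as quasi-static this step."""
    n = T.shape[0]
    i0, j0 = src_pos
    lo_i, hi_i = max(1, i0 - window), min(n - 1, i0 + window)
    lo_j, hi_j = max(1, j0 - window), min(n - 1, j0 + window)
    sub = T[lo_i - 1:hi_i + 1, lo_j - 1:hi_j + 1]
    # 5-point Laplacian restricted to the active window
    lap = (sub[:-2, 1:-1] + sub[2:, 1:-1] + sub[1:-1, :-2] + sub[1:-1, 2:]
           - 4.0 * sub[1:-1, 1:-1])
    Tn = T.copy()
    Tn[lo_i:hi_i, lo_j:hi_j] += alpha * lap
    Tn[i0, j0] += q  # concentrated arc heat input at the current torch position
    return Tn

# March the arc along the weld line
T = np.zeros((64, 64))
for step in range(40):
    T = step_heat(T, (32, 10 + step))
```

The per-step cost scales with the window area rather than the whole domain, which is the same locality argument that makes the 3D welding problem tractable.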


2018 ◽  
Vol 35 (3) ◽  
pp. 380-388 ◽  
Author(s):  
Wei Zheng ◽  
Qi Mao ◽  
Robert J Genco ◽  
Jean Wactawski-Wende ◽  
Michael Buck ◽  
...  

Abstract. Motivation: The rapid development of sequencing technology has led to an explosive accumulation of genomic data. Clustering is often the first step to be performed in sequence analysis. However, existing methods scale poorly with respect to the unprecedented growth of input data size. As high-performance computing systems become widely accessible, it is highly desirable for a clustering method to scale easily to large sequence datasets by leveraging the power of parallel computing. Results: In this paper, we introduce SLAD (Separation via Landmark-based Active Divisive clustering), a generic computational framework that can be used to parallelize various de novo operational taxonomic unit (OTU) picking methods and comes with theoretical guarantees on both accuracy and efficiency. The proposed framework was implemented on Apache Spark, which allows for easy and efficient utilization of parallel computing resources. Experiments performed on various datasets demonstrated that SLAD can significantly speed up a number of popular de novo OTU picking methods while maintaining the same level of accuracy. In particular, the experiment on the Earth Microbiome Project dataset (∼2.2B reads, 437 GB) demonstrated the excellent scalability of the proposed method. Availability and implementation: Open-source software for the proposed method is freely available at https://www.acsu.buffalo.edu/~yijunsun/lab/SLAD.html. Supplementary information: Supplementary data are available at Bioinformatics online.
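The landmark trick can be sketched without Spark: each bisection runs a small 2-means on a random landmark subset, and the full partition follows by nearest-centre assignment, so no split ever costs more than the landmark set plus one pass over the points. This is a serial toy illustration, not the SLAD algorithm itself; the function names and thresholds are invented, and SLAD's theoretical guarantees do not carry over to this simplification.

```python
import numpy as np

rng = np.random.default_rng(1)

def two_means(X, iters=20):
    # Plain Lloyd iterations with k=2 on a small landmark set.
    c = X[rng.choice(len(X), 2, replace=False)]
    for _ in range(iters):
        lab = np.linalg.norm(X[:, None] - c[None], axis=2).argmin(1)
        for k in (0, 1):
            if np.any(lab == k):
                c[k] = X[lab == k].mean(0)
    return c

def divisive_cluster(X, idx=None, min_size=50, landmarks=20, out=None):
    """Recursively bisect the data. Each split is computed on a random landmark
    subset only; the remaining points are assigned to the nearest landmark
    centre, keeping the per-split cost independent of the full data size."""
    if idx is None:
        idx, out = np.arange(len(X)), []
    if len(idx) <= min_size:
        out.append(idx)
        return out
    lm = X[rng.choice(idx, min(landmarks, len(idx)), replace=False)]
    c = two_means(lm)
    side = np.linalg.norm(X[idx][:, None] - c[None], axis=2).argmin(1)
    parts = [idx[side == k] for k in (0, 1)]
    if min(len(p) for p in parts) == 0:   # degenerate split: stop here
        out.append(idx)
        return out
    for p in parts:
        divisive_cluster(X, p, min_size, landmarks, out)
    return out
```

In a distributed setting the nearest-centre assignment is the embarrassingly parallel step, which is what the Spark implementation exploits.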


2020 ◽  
Author(s):  
Jonas Sukys ◽  
Marco Bacci

<div> <div>SPUX (Scalable Package for Uncertainty Quantification in "X") is a modular framework for Bayesian inference and uncertainty quantification. The SPUX framework aims at harnessing high-performance scientific computing to tackle complex aquatic dynamical systems rich in intrinsic uncertainties, such as ecological ecosystems, hydrological catchments, lake dynamics, subsurface flows, and urban floods. The challenging task of quantifying input, output and/or parameter uncertainties in such stochastic models is tackled using Bayesian inference techniques, in which numerical sampling and filtering algorithms assimilate prior expert knowledge and available experimental data. The SPUX framework greatly simplifies uncertainty quantification for realistic, computationally costly models and provides an accessible, modular, portable, scalable, interpretable and reproducible scientific workflow. To achieve this, SPUX can be coupled to any serial or parallel model written in any programming language (e.g. Python, R, C/C++, Fortran, Java), can be installed either on a laptop or on a parallel cluster, and has built-in support for automatic reports, including algorithmic and computational performance metrics.</div> <div>I will present key SPUX concepts using a simple random walk example and showcase recent realistic applications for catchment and lake models. In particular, uncertainties in model parameters, meteorological inputs, and data observation processes are inferred by assimilating available in situ and remotely sensed datasets.</div> </div>
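A random-walk example of the kind mentioned above can be condensed into a self-contained sketch: infer the drift of a Gaussian random walk from one observed trajectory with a plain Metropolis sampler. This is a minimal stand-in for illustration only; SPUX itself couples such samplers (and particle filters, for intrinsic stochasticity) to external models, and every name and number here is invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data: a Gaussian random walk with unknown drift mu (true value 0.3).
true_mu, sigma, n = 0.3, 1.0, 500
walk = np.cumsum(true_mu + sigma * rng.normal(size=n))

def log_lik(mu):
    # Increments of the walk are i.i.d. N(mu, sigma^2), so the likelihood factorizes.
    inc = np.diff(np.concatenate([[0.0], walk]))
    return -0.5 * np.sum((inc - mu) ** 2) / sigma**2

def metropolis(n_iter=3000, step=0.05):
    mu, ll = 0.0, log_lik(0.0)
    trace = []
    for _ in range(n_iter):
        prop = mu + step * rng.normal()
        pll = log_lik(prop)
        if np.log(rng.random()) < pll - ll:  # flat prior, so just the likelihood ratio
            mu, ll = prop, pll
        trace.append(mu)
    return np.asarray(trace)

trace = metropolis()
```

After burn-in the trace concentrates around the true drift; in SPUX the analogous step runs against an external, possibly parallel, model instead of an analytic likelihood.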


2014 ◽  
Vol 556-562 ◽  
pp. 4746-4749
Author(s):  
Bin Chu ◽  
Da Lin Jiang ◽  
Bo Cheng

This paper concerns large-scale mosaicking of remotely sensed images. Based on a high-performance computing system, we offer a method to decompose the problem into subtasks and integrate them according to their logical and physical relationships. The mosaicking of large-scale remotely sensed images is improved in both performance and effectiveness.
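The decompose-and-integrate strategy can be illustrated with a tile-based sketch: the scene is split into tiles, each tile is processed independently (here by a thread pool; a real HPC system would distribute tiles across nodes), and the results are reassembled at their original offsets. The tile size, the per-tile step, and all names are invented for illustration.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def normalize_tile(tile):
    # Stand-in per-tile step (e.g. radiometric normalization before blending).
    t = tile.astype(np.float64)
    span = t.max() - t.min()
    return (t - t.min()) / span if span else t

def mosaic(scene, tile=256, workers=4):
    """Decompose a large scene into tiles, process the tiles independently
    (the parallel step), then reassemble them at their original positions."""
    h, w = scene.shape
    boxes = [(i, j) for i in range(0, h, tile) for j in range(0, w, tile)]
    tiles = [scene[i:i + tile, j:j + tile] for i, j in boxes]
    with ThreadPoolExecutor(max_workers=workers) as ex:
        done = list(ex.map(normalize_tile, tiles))
    out = np.zeros(scene.shape)
    for (i, j), t in zip(boxes, done):
        out[i:i + t.shape[0], j:j + t.shape[1]] = t
    return out
```

The logical relationship (which tile goes where) is kept in `boxes`, so integration is a trivial scatter once the independent work finishes.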


Author(s):  
Gordon Bell ◽  
David H Bailey ◽  
Jack Dongarra ◽  
Alan H Karp ◽  
Kevin Walsh

The Gordon Bell Prize is awarded each year by the Association for Computing Machinery to recognize outstanding achievement in high-performance computing (HPC). The purpose of the award is to track the progress of parallel computing with particular emphasis on rewarding innovation in applying HPC to applications in science, engineering, and large-scale data analytics. Prizes may be awarded for peak performance or special achievements in scalability and time-to-solution on important science and engineering problems. Financial support for the US$10,000 award is provided through an endowment by Gordon Bell, a pioneer in high-performance and parallel computing. This article examines the evolution of the Gordon Bell Prize and the impact it has had on the field.


2014 ◽  
Vol 536-537 ◽  
pp. 892-899
Author(s):  
Jia Xin Hao ◽  
Zhi Qiang Zhao

High-performance parallel computing (HPPC) offers better overall performance and higher productivity for generic large-scale army equipment system-of-systems (AESoS) simulations, where runtime efficiency can be improved by several tens to several hundreds of times. A requirements analysis for an AESoS simulation framework based on HPPC is first presented. After the simulation framework and its key techniques are discussed, the framework itself is designed. This work offers useful references for engineering applications in the field of AESoS simulation based on HPPC.


2020 ◽  
Author(s):  
Lucie Pheulpin ◽  
Vito Bacchi

<p>Hydraulic models are increasingly used to assess flooding hazard. However, all numerical models are affected by uncertainties related to model parameters, which can be quantified through Uncertainty Quantification (UQ) and Global Sensitivity Analysis (GSA). In traditional UQ and GSA methods, the input parameters of the numerical models are assumed to be independent, which is actually rarely the case. The objective of this work is to carry out UQ and GSA with dependent inputs, comparing different methodologies. To our knowledge, there is no such application in the field of 2D hydraulic modelling.</p><p>First, the uncertain parameters of the hydraulic model are classified into groups of dependent parameters. To this end, it is necessary to define the copulas that best represent these groups. Finally, UQ and GSA based on copulas are performed. The proposed methodology is applied to a large-scale 2D hydraulic model of the Loire River. However, as the model is computationally expensive, we used a metamodel in place of the initial model. We compared the results of the traditional UQ and GSA methods (<em>i.e.</em> without taking the dependencies between inputs into account) with those of the new copula-based methods. The results show that the dependence between inputs should not always be neglected in UQ and GSA.</p>
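One standard way to draw dependent inputs while preserving their marginals, a possible reading of the copula step described above, is a Gaussian copula: correlate standard normals, map them to uniforms via the normal CDF, then push each column through its marginal's inverse CDF. This sketch assumes SciPy is available; the marginals, correlation, and parameter roles are invented, not those of the Loire model.

```python
import numpy as np
from math import sqrt
from scipy.special import erf

def gaussian_copula_sample(n, corr, inv_cdfs, rng):
    """Draw n dependent input vectors: correlated standard normals become
    uniforms through the normal CDF, then each marginal's inverse CDF is
    applied, so marginals are preserved while the copula encodes dependence."""
    L = np.linalg.cholesky(corr)
    z = rng.normal(size=(n, len(inv_cdfs))) @ L.T   # correlated N(0, 1) columns
    u = 0.5 * (1.0 + erf(z / sqrt(2.0)))            # standard normal CDF -> uniforms
    return np.column_stack([f(u[:, j]) for j, f in enumerate(inv_cdfs)])

rng = np.random.default_rng(7)
corr = np.array([[1.0, 0.7],
                 [0.7, 1.0]])
inv_cdfs = [
    lambda u: 20.0 + 20.0 * u,           # e.g. a friction coefficient ~ U(20, 40)
    lambda u: -np.log(1.0 - u) / 0.5,    # e.g. an inflow scale ~ Exp(rate 0.5)
]
samples = gaussian_copula_sample(5000, corr, inv_cdfs, rng)
```

Feeding such dependent samples to the metamodel (instead of independent draws) is what changes the UQ and GSA results when the dependence matters.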


Author(s):  
Kai Zhou ◽  
Pei Cao ◽  
Jiong Tang

Uncertainty quantification is an important aspect of structural dynamic analysis. Since practical structures are complex and oftentimes need to be characterized by large-scale finite element models, the component mode synthesis (CMS) method is widely adopted for order-reduced modeling. Even with model order reduction, the computational cost of uncertainty quantification can still be prohibitive. In this research, we utilize a two-level Gaussian process emulation to achieve rapid sampling and response prediction under uncertainty, in which the low- and high-fidelity data extracted from the CMS and full-scale finite element models are incorporated in an integral manner. The possible bias of the low-fidelity data is then corrected through the high-fidelity data. To reduce the number of emulation runs, we further employ a Bayesian inference approach to calibrate the order-reduced model in a probabilistic manner, conditioned on multiple predicted response distributions of concern. Case studies are carried out to validate the effectiveness of the proposed methodology.
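The two-level emulation idea, a cheap low-fidelity surface corrected by a discrepancy model trained on a few high-fidelity runs, can be sketched with two tiny GP regressors. Toy 1-D functions stand in for the CMS and full FE models; the kernel, length scales, and functions are all invented, and the paper's integral treatment of both data levels is simplified here to a sequential correction.

```python
import numpy as np

def rbf(A, B, ls):
    # Squared-exponential kernel between two 1-D point sets.
    return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / ls**2)

class GP:
    """Minimal 1-D Gaussian-process regressor (posterior mean only)."""
    def __init__(self, x, y, ls, noise=1e-6):
        self.x, self.ls = x, ls
        K = rbf(x, x, ls) + noise * np.eye(len(x))
        self.alpha = np.linalg.solve(K, y)

    def predict(self, xs):
        return rbf(xs, self.x, self.ls) @ self.alpha

# Toy stand-ins: low fidelity ~ CMS-like approximation, high fidelity ~ full model.
f_lo = lambda x: np.sin(8 * x)
f_hi = lambda x: np.sin(8 * x) + 0.3 * x   # low-fidelity trend plus a bias term

x_lo = np.linspace(0, 1, 25)               # many cheap runs
x_hi = np.linspace(0, 1, 8)                # few expensive runs
gp_lo = GP(x_lo, f_lo(x_lo), ls=0.1)
gp_delta = GP(x_hi, f_hi(x_hi) - gp_lo.predict(x_hi), ls=0.5)  # discrepancy GP

def predict_hi(xs):
    # Two-level emulator: low-fidelity surface plus the learned correction.
    return gp_lo.predict(xs) + gp_delta.predict(xs)
```

The discrepancy GP needs only a handful of expensive runs because it models the (smooth) bias of the cheap model, not the response itself.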

