Programming with equadratures: an open-source package for uncertainty quantification, dimension reduction, and optimisation

2022 ◽  
Author(s):  
Ashley D. Scillitoe ◽  
Chun Yui Wong ◽  
James C. Gross ◽  
Irene Virdis ◽  
Bryn N. Ubald ◽  
...  
2019 ◽  
Vol 142 (5) ◽  
Author(s):  
Lixiong Cao ◽  
Jie Liu ◽  
Chao Jiang ◽  
Zhantao Wu ◽  
Zheng Zhang

Abstract Evidence theory is a powerful tool for quantifying epistemic uncertainty. However, its huge computational cost has become the main obstacle to applying evidence theory in engineering. In this paper, an efficient uncertainty quantification (UQ) method based on dimension reduction decomposition is proposed to improve the applicability of evidence theory. In evidence-based UQ, an extremum analysis is required for each joint focal element, which is generally achieved by collocating a large number of nodes. Through dimension reduction decomposition, the response at any point can be predicted from the responses at the corresponding marginal collocation nodes. A marginal collocation node method is therefore proposed that avoids calling the original performance function at every joint collocation node during extremum analysis. Building on this, a marginal interval analysis method is developed that decomposes the multidimensional extremum searches over all joint focal elements into a combination of a few one-dimensional extremum searches. Because it overcomes the combinatorial explosion of computation with dimension, the proposed method significantly improves the computational efficiency of evidence-based UQ, especially for high-dimensional uncertainty problems. Since the response at each marginal collocation node in each one-dimensional extremum search is computed with the original performance function, the method yields relatively precise results from a modest number of marginal nodes, even for some nonlinear functions. The accuracy and efficiency of the proposed method are demonstrated on three numerical examples and two engineering applications.
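The abstract's core idea can be sketched in a few lines. Below is a minimal, hedged illustration of a univariate dimension-reduction decomposition of the kind described, f(x) ≈ Σᵢ f(μ₁, …, xᵢ, …, μₙ) − (n − 1) f(μ): the function is evaluated only at marginal collocation nodes (n·m calls instead of mⁿ joint calls), and responses at joint points are then predicted by interpolation. The function names, reference point μ, and toy performance function are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def marginal_collocation_surrogate(f, mu, nodes_1d):
    """Evaluate f only at marginal collocation nodes, then predict the
    response at any joint point from the marginal responses.
    Sketch of f(x) ~ sum_i f(mu_1,...,x_i,...,mu_n) - (n-1)*f(mu)."""
    n = len(mu)
    f_mu = f(mu)
    marginal = []
    for i in range(n):
        responses = []
        for t in nodes_1d[i]:
            x = np.array(mu, dtype=float)
            x[i] = t                      # vary one coordinate at a time
            responses.append(f(x))
        marginal.append((np.asarray(nodes_1d[i], dtype=float),
                         np.asarray(responses)))

    def predict(x):
        # combine interpolated marginal responses additively
        total = -(n - 1) * f_mu
        for i, (t, r) in enumerate(marginal):
            total += np.interp(x[i], t, r)
        return total

    return predict

# toy use: a 1-D extremum search per dimension can now run on `predict`
# instead of the original performance function
f = lambda x: x[0]**2 + 2.0 * x[1] + 0.5 * x[2]
predict = marginal_collocation_surrogate(
    f, [0.0, 0.0, 0.0], [np.linspace(-1.0, 1.0, 5)] * 3)
```

For separable functions such as the toy one above the decomposition is exact at the collocation nodes, which is why the paper can report good accuracy with few marginal nodes even for some nonlinear functions.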


2019 ◽  
Author(s):  
Leandro de Figueiredo ◽  
Dario Grana ◽  
Leonardo Azevedo ◽  
Mauro Roisenberg ◽  
Bruno Rodrigues

Author(s):  
Kevin Otto ◽  
Jiahui Wang ◽  
Tekin Uyan

Abstract The design of systems today often involves computer simulation to assess performance and design margins. Understanding how variability erodes design margin is important for assuring margin adequacy, especially in optimization efforts. In this paper, we develop a toolchain using open-source code libraries in Python and encapsulate it in Jupyter notebooks, providing an open-source, interactive uncertainty quantification and sensitivity analysis toolchain. It works generically with simulation tools: a reference folder is created containing a script that reads an input file of parameter values and runs the simulation. Once that folder exists, the toolchain executes the necessary uncertainty quantification steps on replicates of the reference folder. This approach fits within a broader workflow, outlined in the paper, that defines the variation modes to study, maps them to simulation inputs, and screens the variables for sensitivity before conducting the uncertainty quantification. An example is shown in the simulation analysis of a Stirling engine.
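The replicated-folder pattern the abstract describes is straightforward to sketch. The snippet below is a hedged illustration, not the authors' actual toolchain: the file names (`params.csv`, `run.py`) and folder layout are assumptions made for the example.

```python
import csv
import shutil
import subprocess
import sys
from pathlib import Path

def run_replicate(reference: Path, params: dict, workdir: Path, tag: str) -> str:
    """Copy the reference folder, write one set of parameter values into
    the copy, and invoke the simulation script inside it."""
    rep = workdir / f"rep_{tag}"
    shutil.copytree(reference, rep)
    # one-row input file: header of parameter names, then their values
    with open(rep / "params.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(params.keys())
        writer.writerow(params.values())
    # the reference folder's own script reads params.csv and runs the sim
    result = subprocess.run([sys.executable, "run.py"], cwd=rep,
                            capture_output=True, text=True, check=True)
    return result.stdout
```

A UQ driver would call this once per Monte Carlo sample (each with a fresh `tag`), collect the outputs, and post-process them for sensitivity analysis.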


2013 ◽  
Author(s):  
Robert Stewart ◽  
Devin White ◽  
Marie Urban ◽  
April Morton ◽  
Clayton Webster ◽  
...  

Author(s):  
YP Ju

A common strategy for simulation-based uncertainty quantification problems is to adopt a metamodel that replaces time-consuming calculations, such as computational fluid dynamics simulations or finite element analyses, within the Monte Carlo simulation process. However, most metamodel-assisted uncertainty quantification methods to date suffer from the 'curse of dimensionality': the required number of evaluations, which determines the computational cost, grows exponentially with the dimensionality of the input uncertainty, making high-dimensional problems computationally unaffordable. A further challenge arises when the output uncertainty is a spatially varying field comprising a huge number of spatial nodes. To address these issues, we propose a dimension-reduction metamodeling approach in which the active subspace method reduces the input dimensionality and the proper orthogonal decomposition method reduces the output dimensionality of the spatially varying field. The relationship between the two reduced spaces is established with a support vector regression model. Through uncertainty quantification of seven stochastic analytical functions and one stochastic convection-diffusion equation, the proposed approach was verified to be fairly accurate in propagating high-dimensional input uncertainties to either a scalar or a spatially varying output. Its accuracy and efficiency on more practical simulation-based problems were then validated by uncertainty quantification of a compressor cascade with stochastic protrusions/dents distributed on the blade surface. This work provides an effective and versatile approach for simulation-based high-dimensional uncertainty quantification problems.
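The three-stage pipeline in this abstract can be illustrated on a toy problem. The sketch below is a hedged, minimal example under stated assumptions: a synthetic "field" model whose output depends on the input only through one direction (an ideal case for a one-dimensional active subspace), POD computed via an SVD of snapshots, and scikit-learn's `SVR` standing in for the paper's support vector regression. Sizes, hyperparameters, and the toy model are all illustrative, not the authors' setup.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
d, n_nodes, n_snap = 20, 50, 200

# toy "spatial field" model: the output at each node depends on x only
# through one direction w, so the active subspace is one-dimensional
w = rng.normal(size=d)
w /= np.linalg.norm(w)
nodes = np.linspace(0.0, 1.0, n_nodes)
s = np.sin(2.0 * np.pi * nodes)
field = lambda x: s * (x @ w)                     # output shape (n_nodes,)

X = rng.uniform(-1.0, 1.0, size=(n_snap, d))
Y = np.array([field(x) for x in X])               # snapshots (n_snap, n_nodes)

# active subspace from the (here, constant) Jacobian J of the field:
# C = E[J^T J], leading eigenvector spans the active direction
J = np.outer(s, w)
C = J.T @ J
_, eigvecs = np.linalg.eigh(C)
W1 = eigvecs[:, -1:]                              # leading active direction
Z = X @ W1                                        # reduced inputs (n_snap, 1)

# POD: keep the leading spatial mode of the snapshot matrix
_, _, Vt = np.linalg.svd(Y, full_matrices=False)
modes = Vt[:1]                                    # (1, n_nodes)
coeffs = Y @ modes.T                              # modal coefficients (n_snap, 1)

# SVR links the reduced input to each retained POD coefficient
models = [SVR(kernel="rbf", C=10.0).fit(Z, coeffs[:, j])
          for j in range(modes.shape[0])]

def predict_field(x):
    """Reduced input -> SVR -> POD coefficients -> reconstructed field."""
    z = (x @ W1).reshape(1, -1)
    c = np.array([m.predict(z)[0] for m in models])
    return c @ modes
```

The key efficiency point is that the SVR is trained on a one-dimensional input and a handful of modal coefficients, rather than on the original 20-dimensional input and 50-node output.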

