Uncertainty Quantification of Time-Dependent Quantities in a System with Adjustable Level of Smoothness

Author(s):  
Marks Legkovskis ◽  
Peter J Thomas ◽  
Michael Auinger

Abstract: We summarise the results of a computational study of Uncertainty Quantification (UQ) in a benchmark turbulent burner flame simulation. UQ analysis of this simulation makes it possible to assess the convergence of one of the most widely used uncertainty-propagation techniques, Polynomial Chaos Expansion (PCE), at varying levels of system smoothness. This is possible because, in the burner flame simulations, the smoothness of the time-dependent temperature, the study's quantity of interest (QoI), is found to evolve with the flame development state. The analysis matters because PCE is known to surrogate non-smooth QoIs poorly and thus to yield non-convergent UQ. While this restriction is known and accounted for in practice, it has not been established whether there is a quantifiable scaling relationship between PCE's convergence metrics and the level of the QoI's smoothness. We find that the QoI's smoothness can be quantified by its standard deviation, allowing us to observe the effect of the QoI's smoothness on PCE's convergence performance. For our flow scenario, there exists a power-law relationship between a comparative parameter, defined to measure PCE's convergence performance relative to Monte Carlo sampling, and the QoI's standard deviation, which supports a more informed choice of uncertainty-propagation technique.
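A minimal sketch of the comparison at the heart of this abstract, using toy QoIs rather than the burner-flame temperature: a probabilists'-Hermite PCE of a smooth function of a Gaussian input converges very fast, while a non-smooth QoI (here the absolute value) erodes PCE's advantage over Monte Carlo at the same evaluation budget.

```python
# Illustrative sketch (toy QoIs, not the burner-flame temperature):
# estimate E[f(X)], X ~ N(0,1), with a probabilists'-Hermite PCE
# (Gauss-Hermite quadrature yields the zeroth PCE coefficient) and
# with plain Monte Carlo at the same evaluation budget.
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def pce_mean(f, n_pts):
    # nodes/weights for weight exp(-x^2/2); weights sum to sqrt(2*pi)
    x, w = hermegauss(n_pts)
    return np.sum(w * f(x)) / np.sqrt(2 * np.pi)

def mc_mean(f, n_pts, rng):
    return f(rng.standard_normal(n_pts)).mean()

rng = np.random.default_rng(0)
cases = [("smooth exp", np.exp, np.exp(0.5)),          # E[e^X] = e^{1/2}
         ("non-smooth |x|", np.abs, np.sqrt(2 / np.pi))]
for name, f, exact in cases:
    for n_pts in (3, 5, 9, 17):
        e_pce = abs(pce_mean(f, n_pts) - exact)
        e_mc = abs(mc_mean(f, n_pts, rng) - exact)
        print(f"{name:15s} n={n_pts:2d}  PCE err={e_pce:.1e}  MC err={e_mc:.1e}")
```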

Author(s):  
Djamalddine Boumezerane

Abstract: In this study, we use possibility distributions as a basis for quantifying parameter uncertainty in one-dimensional consolidation problems. A possibility distribution is the one-point coverage function of a random set and is viewed as encoding both partial ignorance and uncertainty. The vagueness and scarcity of the information needed to characterise the coefficient of consolidation in clay can be handled using possibility distributions, which can be constructed from existing data or by transforming probability distributions. We set out a systematic approach for estimating the propagation of uncertainty during the consolidation process, with the measure of uncertainty based on Klir's definition (1995). We compare our results with those obtained from other approaches (probabilistic, …) and discuss the importance of possibility distributions in this type of problem.
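A hedged sketch of possibility-based propagation via alpha-cuts: each alpha-cut of the input distribution is an interval, and for a function monotone in the input the interval maps by its endpoints. The triangular distribution, drainage path H, and bounds on the coefficient of consolidation c_v below are hypothetical placeholders, not values from the study.

```python
# Alpha-cut propagation sketch (hypothetical values, not the study's data).
# c_v carries a triangular possibility distribution; the consolidation
# time t = Tv * H**2 / c_v is monotone decreasing in c_v, so each
# alpha-cut interval maps by evaluating the two endpoints.
import numpy as np

def tri_alpha_cut(lo, mode, hi, alpha):
    # alpha-cut of a triangular possibility distribution
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

Tv, H = 0.848, 5.0                       # 90% consolidation factor, drainage path (m) -- assumed
c_lo, c_mode, c_hi = 1e-8, 3e-8, 8e-8    # m^2/s, hypothetical expert bounds

for alpha in (0.0, 0.5, 1.0):
    c_min, c_max = tri_alpha_cut(c_lo, c_mode, c_hi, alpha)
    # t decreases with c_v, so the interval endpoints swap
    t_min, t_max = Tv * H**2 / c_max, Tv * H**2 / c_min
    yrs = 365.25 * 24 * 3600
    print(f"alpha={alpha:.1f}: t in [{t_min/yrs:.1f}, {t_max/yrs:.1f}] years")
```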


2010 ◽  
Vol 14 (07) ◽  
pp. 592-604 ◽  
Author(s):  
Do Sung Huh ◽  
Sang Joon Choe

The recent interest in applications of density functional theory (DFT) prompted us to test several functionals on the molecular geometry of methyl pheophorbide-a (MPa), an important starting material in photodynamic therapy (PDT). In this study, we report tests of three popular DFT methods: M06-2X, B3LYP, and LSDA. Using the mean and standard deviation of the differences between optimised calculated and experimental geometries, we drew the following conclusions: M06-2X/6-311+G(d,p) attained the smallest standard deviation of the bond-length differences among the tested DFT methods, whereas LSDA/6-311+G(d,p) gave the smallest standard deviation for bond angles. In absolute terms, the mean deviation of the LSDA/6-311+G(d,p) calculation was larger than that of M06-2X/6-311+G(d,p). Overall, M06-2X/6-311+G(d,p) gave the best performance for the molecular geometry of MPa. The UV-visible spectrum was calculated with time-dependent density functional theory (TD-DFT); time-dependent M06-2X/6-311+G(d,p) gave the best performance for MPa in CH2Cl2 solution. In general, TD-DFT calculations in CH2Cl2 solution were more red-shifted than those in the solid state.
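The comparison metric described above reduces to the mean and standard deviation of calculated-minus-experimental geometric parameters per method; a small sketch with made-up bond lengths (the real data are in the paper) illustrates it.

```python
# Sketch of the comparison metric: mean and standard deviation of
# (calculated - experimental) bond lengths per DFT method.
# All numbers are hypothetical, for illustration only.
import numpy as np

exp_bond = np.array([1.36, 1.42, 1.39, 1.45, 1.51])   # Angstrom, hypothetical
calc = {
    "M06-2X/6-311+G(d,p)": np.array([1.35, 1.41, 1.39, 1.44, 1.50]),
    "B3LYP/6-311+G(d,p)":  np.array([1.37, 1.44, 1.40, 1.46, 1.53]),
    "LSDA/6-311+G(d,p)":   np.array([1.34, 1.40, 1.37, 1.43, 1.49]),
}
for method, vals in calc.items():
    diff = vals - exp_bond
    print(f"{method:22s} mean={diff.mean():+.3f}  std={diff.std(ddof=1):.3f}")
```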


2010 ◽  
Vol 28 (3) ◽  
pp. 443-450 ◽  
Author(s):  
Xinjing Cai ◽  
Xiaobin Zou ◽  
Xinxin Wang ◽  
Liming Wang ◽  
Zhicheng Guan ◽  
...  

Abstract: The characteristics of over-volted breakdown and gaseous recovery in short nitrogen gaps were studied experimentally. The breakdown voltage of a gap was found to change from shot to shot even under identical experimental conditions, following a Gaussian distribution. The over-volted factor decreases with increasing pressure: for a 2.7-mm gap, it is 4.53 at 0.1 MPa and 1.74 at 0.4 MPa. The over-volted breakdown voltage depends individually on the gap spacing d and the gas pressure p, rather than on the product pd, and an empirical formula for the breakdown voltage as a function of p and d was derived. The time-dependent recovery of the breakdown voltage, RVb, was obtained using a two-pulse technique. The second breakdown voltage also follows a Gaussian distribution, but normally with a smaller standard deviation, especially when the interpulse spacing Δt is relatively short. Overall, RVb rises as Δt increases; however, an intermediate plateau is always observed, starting when the second breakdown voltage is slightly higher than the static breakdown voltage of the gap. The first rising edge of the RVb curve corresponds to the recovery of electro-neutrality and density. The intermediate plateau and the subsequent rising edge take the spark gap much longer to recover; the processes governing these two later phases are not yet fully clear and are tentatively attributed to the delayed recombination of residual nitrogen atoms on the cathode, which produces the initial electrons for the second breakdown. An increase in pressure shifts the intermediate plateau upwards and shortens the recovery time of the gaps. The second spark generally does not follow the path of the first spark.
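The abstract does not reproduce the empirical formula, so the sketch below fits one plausible functional form, a power law V_b = A·p^a·d^b, by log-linear least squares; both the form and the data points are assumptions for illustration, not the paper's measurements.

```python
# Fit a hypothetical power-law breakdown-voltage formula V_b = A * p**a * d**b
# by log-linear least squares. Data are synthetic placeholders.
import numpy as np

p = np.array([0.1, 0.2, 0.3, 0.4, 0.1, 0.4])    # pressure, MPa
d = np.array([2.7, 2.7, 2.7, 2.7, 1.5, 1.5])    # gap spacing, mm
V = np.array([30., 45., 58., 68., 20., 47.])    # breakdown voltage, kV (synthetic)

# Solve log V = log A + a*log p + b*log d in the least-squares sense.
X = np.column_stack([np.ones_like(p), np.log(p), np.log(d)])
coef, *_ = np.linalg.lstsq(X, np.log(V), rcond=None)
A, a, b = np.exp(coef[0]), coef[1], coef[2]
print(f"V_b ~ {A:.1f} * p^{a:.2f} * d^{b:.2f}  (kV, p in MPa, d in mm)")
```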


2015 ◽  
Vol 24 (3) ◽  
pp. 307 ◽  
Author(s):  
Yaning Liu ◽  
Edwin Jimenez ◽  
M. Yousuff Hussaini ◽  
Giray Ökten ◽  
Scott Goodrick

Rothermel's wildland surface fire model is widely used in wildland fire management. The original model has a large number of parameters, which makes uncertainty quantification challenging. In this paper, we use variance-based global sensitivity analysis to reduce the number of model parameters, and we apply randomised quasi-Monte Carlo methods to quantify parametric uncertainties for the reduced model. The Monte Carlo estimator used in these calculations is based on a control-variate approach applied to sensitivity derivative enhanced sampling. The chaparral fuel model, selected from Rothermel's 11 original fuel models, is studied as an example. The numerical results improve on crude Monte Carlo sampling by factors of up to three orders of magnitude.
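A hedged sketch of the estimator's two ingredients, on a toy function rather than Rothermel's model: the first-order sensitivity-derivative term acts as a control variate with known mean, and the sample points come from a scrambled-Sobol (randomised quasi-Monte Carlo) sequence via scipy.stats.qmc.

```python
# Control variate from sensitivity derivatives + randomised QMC sampling.
# f is a toy stand-in for the fire-spread model; everything is assumed.
import numpy as np
from scipy.stats import qmc

def f(x):                                   # toy model, inputs ~ N(0, I)
    return np.exp(0.3 * x[:, 0]) * (1.0 + 0.5 * x[:, 1] ** 2)

def grad_f(mu, h=1e-5):                     # sensitivity derivatives at the mean
    g = np.zeros(mu.size)
    for i in range(mu.size):
        e = np.zeros(mu.size); e[i] = h
        g[i] = (f((mu + e)[None])[0] - f((mu - e)[None])[0]) / (2 * h)
    return g

mu, n = np.zeros(2), 4096
engine = qmc.Sobol(d=2, scramble=True, seed=1)           # randomised QMC
x = qmc.MultivariateNormalQMC(mean=mu, cov=np.eye(2), engine=engine).random(n)

lin = x @ grad_f(mu)                        # linear control variate; E[lin] = 0
est_cv = np.mean(f(x) - lin)                # variance-reduced estimate of E[f]
print(f"RQMC + control variate: {est_cv:.4f}   plain mean: {np.mean(f(x)):.4f}")
```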


Author(s):  
Yan Wang

Variability is the inherent randomness in systems, whereas uncertainty is due to lack of knowledge. In this paper, a generalized multiscale Markov (GMM) model is proposed to quantify variability and uncertainty simultaneously in multiscale system analysis. The GMM model is based on a new imprecise probability theory that takes the form of generalized intervals, a Kaucher or modal extension of classical set-based intervals for representing uncertainties. The properties of the new definitions of independence and Bayesian inference are studied. Based on a new Bayes' rule with generalized intervals, three cross-scale validation approaches that incorporate variability and uncertainty propagation are also developed.
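A minimal sketch of the generalized-interval arithmetic the model builds on: intervals may be improper (lower bound above upper), the dual operation swaps the bounds, and Kaucher addition and subtraction act bound-wise, so x - dual(x) cancels exactly, unlike classical interval subtraction.

```python
# Generalized (Kaucher/modal) intervals: a sketch, not the paper's code.
from dataclasses import dataclass

@dataclass
class GInterval:
    lo: float
    hi: float

    @property
    def proper(self):           # classical set-based interval iff lo <= hi
        return self.lo <= self.hi

    def dual(self):             # modal dual: swaps the roles of the bounds
        return GInterval(self.hi, self.lo)

    def __add__(self, other):   # Kaucher addition is bound-wise
        return GInterval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):   # [a,b] - [c,d] = [a-d, b-c]
        return GInterval(self.lo - other.hi, self.hi - other.lo)

a = GInterval(1.0, 3.0)
print(a - a)          # classical-style result: GInterval(lo=-2.0, hi=2.0)
print(a - a.dual())   # exact cancellation via the dual: GInterval(0.0, 0.0)
```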


Author(s):  
Dorin Drignei ◽  
Zissimos Mourelatos ◽  
Zhen Hu

This paper addresses the sensitivity analysis of time-dependent computer models. In practice, the inputs are often partitioned into a subset of inputs relevant to the application studied and a complementary subset of nuisance inputs that are not of interest. We propose sensitivity measures for the relevant inputs of such dynamic computer models. The subset of nuisance inputs is used to create replication-type information that helps quantify the uncertainty of the sensitivity measures (or indices) for the relevant inputs. The method is first demonstrated on an analytical example and then applied to the safety of restraint systems in light tactical vehicles, where it indicates that chest-deflection curves are more sensitive to the addition of pretensioners and load limiters than to the type of seatbelt.
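A sketch of the replication idea on a toy model: holding the relevant input fixed while redrawing the nuisance inputs yields paired outputs whose covariance estimates a first-order (Sobol-type) sensitivity index. The model and sample sizes are illustrative assumptions, not the restraint-system simulation.

```python
# Replicates over nuisance inputs -> pick-freeze sensitivity estimate.
import numpy as np

rng = np.random.default_rng(42)

def model(x, z):                      # toy time-aggregated output
    return np.sin(x) + 0.3 * x * z + 0.1 * z**2

n = 20000
x = rng.standard_normal(n)            # relevant input, held fixed per pair
z1 = rng.standard_normal(n)           # nuisance inputs, replicate 1
z2 = rng.standard_normal(n)           # nuisance inputs, replicate 2

y1, y2 = model(x, z1), model(x, z2)   # same x, independent z: replicates
# Cov(y1, y2) = Var_x(E_z[f | x]), the numerator of the first-order index
s_x = np.cov(y1, y2)[0, 1] / np.var(np.concatenate([y1, y2]))
print(f"estimated first-order sensitivity of x: {s_x:.3f}")
```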


Author(s):  
Gary A. Davis ◽  
Christopher Cheong

This paper describes a method for fitting predictive models that relate vehicle impact speeds to pedestrian injuries, in which results from a national sample are calibrated to reflect local injury statistics. Three methodological issues identified in the literature (outcome-based sampling, uncertainty regarding estimated impact speeds, and uncertainty quantification) are addressed by (i) implementing Bayesian inference using Markov chain Monte Carlo sampling and (ii) applying multiple imputation to conditional maximum likelihood estimation. The methods are illustrated using crash data from the NHTSA Pedestrian Crash Data Study coupled with an exogenous sample of pedestrian crashes from Minnesota's Twin Cities. The two approaches produced similar results and, given a reliable characterisation of impact-speed uncertainty, either can be applied in a jurisdiction that has an exogenous sample of pedestrian crash severities.
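A hedged sketch of route (ii), multiple imputation, on synthetic data: plausible impact speeds are drawn from each case's speed-uncertainty distribution, a logistic severity model is fitted to each completed data set, and the estimates are pooled with Rubin's rules. The data-generating model, the speed noise level, and the simple noise-based imputation are all assumptions for illustration.

```python
# Multiple imputation over uncertain impact speeds, pooled via Rubin's rules.
# Synthetic data; the imputation draw (estimate + noise) is a crude stand-in
# for sampling the conditional distribution of the true speed.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n, M = 400, 20
true_speed = rng.uniform(10, 70, n)                  # km/h, unobserved
severe = rng.random(n) < 1 / (1 + np.exp(-0.08 * (true_speed - 40)))
speed_est = true_speed + rng.normal(0, 8, n)         # noisy reconstruction
speed_sd = 8.0                                       # assumed known uncertainty

def fit_logistic(x, y):
    def nll(b):                                      # logistic negative log-likelihood
        eta = b[0] + b[1] * x
        return np.sum(np.logaddexp(0, eta)) - np.sum(y * eta)
    res = minimize(nll, np.zeros(2), method="BFGS")
    return res.x, res.hess_inv                       # estimate, approx. covariance

betas, variances = [], []
for _ in range(M):
    imputed = speed_est + rng.normal(0, speed_sd, n) # one plausible speed draw
    b, cov = fit_logistic(imputed, severe.astype(float))
    betas.append(b[1]); variances.append(cov[1, 1])

qbar = np.mean(betas)                                # Rubin's rules pooling
total_var = np.mean(variances) + (1 + 1 / M) * np.var(betas, ddof=1)
print(f"pooled speed coefficient: {qbar:.3f} +/- {np.sqrt(total_var):.3f}")
```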


Author(s):  
Zhen Hu ◽  
Sankaran Mahadevan ◽  
Xiaoping Du

Limited data on stochastic load processes and system random variables introduce uncertainty into the results of time-dependent reliability analysis. This paper develops an uncertainty quantification (UQ) framework for time-dependent reliability analysis in the presence of data uncertainty. A Bayesian approach is employed to model the epistemic uncertainty sources in the random variables and stochastic processes. A straightforward formulation of UQ in time-dependent reliability analysis results in a double-loop implementation procedure, which is computationally expensive. This paper proposes an efficient method that integrates the fast integration method and the surrogate model method with time-dependent reliability analysis. A surrogate model is first built for the time-instantaneous conditional reliability index as a function of the variables with imprecise parameters. For each realization of the epistemic uncertainty, the associated time-instantaneous most probable points (MPPs) are then identified using the fast integration method based on the conditional reliability index surrogate, without evaluating the original limit-state function. With the time-instantaneous MPPs obtained, uncertainty in the time-dependent reliability analysis is quantified. The effectiveness of the proposed method is demonstrated using a mathematical example and an engineering application example.
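A toy illustration of replacing the double loop with a surrogate: for a linear limit state the conditional reliability index is known in closed form, a cheap polynomial surrogate is fitted over the imprecise parameter, and epistemic samples are then pushed through the surrogate alone. All distributions below are assumed for illustration; the paper's method additionally locates time-instantaneous MPPs via fast integration.

```python
# Single-loop sketch: surrogate of the conditional reliability index.
import numpy as np

# Limit state g = R - S with R ~ N(mu_R, 1), S ~ N(5, 0.5); the epistemic
# (data) uncertainty is on mu_R itself: mu_R ~ N(8, 0.3). For this linear
# case, beta(mu_R) is exact, standing in for an expensive inner loop.
def beta_exact(mu_R):
    return (mu_R - 5.0) / np.sqrt(1.0**2 + 0.5**2)

# "Outer loop" replaced by a surrogate fitted at a few training points.
train = np.linspace(7.0, 9.0, 5)
coef = np.polyfit(train, beta_exact(train), deg=2)    # quadratic surrogate

rng = np.random.default_rng(3)
mu_R_samples = rng.normal(8.0, 0.3, 10000)            # epistemic realizations
beta_samples = np.polyval(coef, mu_R_samples)         # no limit-state calls
print(f"reliability index: mean={beta_samples.mean():.3f}, "
      f"std={beta_samples.std():.3f} (epistemic uncertainty)")
```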

