Monte-Carlo Simulation of the Theoretical Site Response Variability at Turkey Flat, California, Given the Uncertainty in the Geotechnically Derived Input Parameters

1993
Vol 9 (4)
pp. 669-701
Author(s):
Edward H. Field
Klaus H. Jacob

In the weak-motion phase of the Turkey Flat blind-prediction effort, it was found that, given a particular physical model of each sediment site, various theoretical techniques give similar estimates of the site response. However, it remained to be determined how uncertainties in the physical model parameters influence the theoretical predictions. We have studied this question by propagating the physical parameter uncertainties into the theoretical site-response predictions using Monte Carlo simulations. The input-parameter uncertainties were estimated directly from the results of several independent geotechnical studies performed at Turkey Flat. While the computed results generally agree with empirical site-response estimates (average spectral ratios of earthquake recordings), we found that the uncertainties lead to a high degree of variability in the theoretical predictions. Most of this variability comes from poor constraints on the shear-wave velocity and thickness of a thin (∼2 m) surface layer, and on the attenuation of the sediments. Our results suggest that in site-response studies which rely exclusively on geotechnically based theoretical predictions, it is important to recognize and account for the variability resulting from input-parameter uncertainties.
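As a minimal sketch of this kind of propagation (not the authors' code, and with purely illustrative parameter values rather than the Turkey Flat measurements), one could draw the surface-layer shear-wave velocity, its thickness, and the sediment damping from assumed distributions and evaluate a textbook single-layer-over-halfspace amplification estimate for each draw:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # number of Monte Carlo realizations

# Assumed input-parameter distributions (illustrative values only).
vs_layer  = rng.lognormal(mean=np.log(130.0), sigma=0.25, size=n)  # m/s
thickness = rng.normal(loc=2.0, scale=0.5, size=n).clip(0.5)       # m
damping   = rng.lognormal(mean=np.log(0.03), sigma=0.4, size=n)    # fraction of critical

vs_rock, rho_rock, rho_layer = 1300.0, 2400.0, 1900.0  # halfspace/layer, assumed fixed

# Crude 1D estimate: fundamental frequency and peak amplification of a single
# soft layer over a halfspace, with damping limiting the resonance peak.
f0 = vs_layer / (4.0 * thickness)
impedance = (rho_rock * vs_rock) / (rho_layer * vs_layer)
peak_amp = 1.0 / (1.0 / impedance + 0.5 * np.pi * damping)

for name, x in [("f0 [Hz]", f0), ("peak amplification", peak_amp)]:
    print(f"{name}: median = {np.median(x):.2f}, "
          f"16th-84th percentile = {np.percentile(x, 16):.2f}-{np.percentile(x, 84):.2f}")
```

The spread of the resulting distributions is the theoretical variability referred to above: with the thin surface layer and the damping poorly constrained, the percentile ranges of the resonance frequency and peak amplification widen accordingly.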

2002
Vol 6 (5)
pp. 883-898
Author(s):
K. Engeland
L. Gottschalk

Abstract. This study evaluates the applicability of the distributed, process-oriented Ecomag model for prediction of daily streamflow in ungauged basins. The Ecomag model is applied as a regional model to nine catchments in the NOPEX area, using Bayesian statistics to estimate the posterior distribution of the model parameters conditioned on the observed streamflow. The distribution is calculated by Markov Chain Monte Carlo (MCMC) analysis. The Bayesian method requires the formulation of a likelihood function for the parameters, and three alternative formulations are used. The first is a subjectively chosen objective function that describes the goodness of fit between the simulated and observed streamflow, as defined in the GLUE framework. The second and third formulations are statistically more rigorous likelihood models that describe the simulation errors. The full statistical likelihood model describes the simulation errors as an AR(1) process, whereas the simple model excludes the autoregressive part. The statistical parameters depend on the catchments and the hydrological processes, and the statistical and hydrological parameters are estimated simultaneously. The results show that the simple likelihood model gives the most robust parameter estimates. The simulation error may be explained to a large extent by the catchment characteristics and climatic conditions, so it is possible to transfer knowledge about them to ungauged catchments. The statistical models for the simulation errors indicate that structural errors in the model are more important than parameter uncertainties.
Keywords: regional hydrological model, model uncertainty, Bayesian analysis, Markov Chain Monte Carlo analysis
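A minimal sketch (assumed notation, not the authors' implementation) of the two statistical error models as log-likelihoods: `residuals` are observed minus simulated streamflow, and `sigma` and `rho` are the statistical parameters that would be sampled jointly with the hydrological ones in the MCMC run.

```python
import numpy as np

def loglik_simple(residuals, sigma):
    """'Simple' likelihood: independent Gaussian simulation errors."""
    n = residuals.size
    return (-0.5 * n * np.log(2 * np.pi * sigma**2)
            - 0.5 * np.sum(residuals**2) / sigma**2)

def loglik_ar1(residuals, sigma, rho):
    """'Full' likelihood: simulation errors follow an AR(1) process,
    e_t = rho * e_(t-1) + innovation, with innovation std sigma."""
    e = residuals
    innov = e[1:] - rho * e[:-1]
    # Stationary term for the first error, then the innovation terms.
    ll = (-0.5 * np.log(2 * np.pi * sigma**2 / (1 - rho**2))
          - 0.5 * e[0]**2 * (1 - rho**2) / sigma**2)
    ll += (-0.5 * (e.size - 1) * np.log(2 * np.pi * sigma**2)
           - 0.5 * np.sum(innov**2) / sigma**2)
    return ll
```

Either function, plus the parameter priors, would be evaluated inside the MCMC sampler for every proposed combination of hydrological and statistical parameters.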


Author(s):
Georg A. Mensah
Luca Magri
Jonas P. Moeck

Thermoacoustic instabilities are a major threat to modern gas turbines. Frequency-domain stability methods, such as network models and Helmholtz solvers, are common design tools because they are fast compared to compressible flow computations. They result in an eigenvalue problem that is nonlinear with respect to the eigenvalue, so the influence of the relevant parameters on mode stability is only given implicitly. Small changes in some model parameters may have a great impact on stability. The assessment of how parameter uncertainties propagate to system stability is therefore crucial for safe gas turbine operation. This question is addressed by uncertainty quantification. A common strategy for uncertainty quantification in thermoacoustics is risk factor analysis. One general challenge of uncertainty quantification is the sheer number of uncertain parameter combinations to be quantified: the uncertain parameters in an annular combustor might include the equivalence ratio, convection times, geometrical parameters, boundary impedances, flame response model parameters, and so on. A new and fast way to obtain algebraic parameter models, and thus to tackle the implicit nature of the problem, is adjoint perturbation theory. This paper aims to further utilize adjoint methods for the quantification of uncertainties. The analytical method avoids the usual random Monte Carlo (MC) simulations, making it particularly attractive for industrial purposes. Using network models and the open-source Helmholtz solver PyHoltz, it is also discussed how to apply the method with standard modeling techniques. The theory is exemplified on a simple ducted flame and on a combustor of the EM2C laboratory, for which experimental data are available.
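A generic numpy illustration (assumed notation, not the PyHoltz API) of the first-order adjoint perturbation formula for a nonlinear eigenvalue problem L(ω, p)q = 0, here specialized to the linear case L = A(p) − ωI so the adjoint prediction can be checked against a finite difference:

```python
import numpy as np

# First-order adjoint perturbation: delta_omega ~= -<q+, dL/dp q> / <q+, dL/domega q> * delta_p,
# where q+ solves the adjoint problem L(omega, p)^H q+ = 0.

def adjoint_sensitivity(dL_dp, dL_domega, q, q_adj):
    """Sensitivity d(omega)/dp of the eigenvalue to one parameter."""
    return -(q_adj.conj() @ dL_dp @ q) / (q_adj.conj() @ dL_domega @ q)

# Toy example: L(omega, p) = A(p) - omega * I, with A depending linearly on p.
rng = np.random.default_rng(1)
A0, dA_dp = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))

omega, Q = np.linalg.eig(A0)
k = 0                                           # follow the first mode
q = Q[:, k]
omega_adj, Q_adj = np.linalg.eig(A0.conj().T)   # adjoint (left) eigenvectors
q_adj = Q_adj[:, np.argmin(abs(omega_adj - omega[k].conj()))]

sens = adjoint_sensitivity(dA_dp, -np.eye(4), q, q_adj)

# Finite-difference check of the adjoint prediction.
eps = 1e-6
omega_pert = np.linalg.eig(A0 + eps * dA_dp)[0]
fd = (omega_pert[np.argmin(abs(omega_pert - omega[k]))] - omega[k]) / eps
print("adjoint:", sens, " finite difference:", fd)
```

Once such algebraic sensitivities are available for every uncertain parameter, the eigenvalue (and hence the growth rate) can be propagated through assumed parameter distributions without re-solving the eigenvalue problem, which is what makes the approach attractive compared with plain MC.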


1997
Vol 36 (5)
pp. 141-148
Author(s):
A. Mailhot
É. Gaume
J.-P. Villeneuve

The Storm Water Management Model's quality module is calibrated for a section of Québec City's sewer system using data collected during five rain events. It is shown that even for this simple model, calibration can fail: a similarly good fit between recorded data and simulation results can be obtained with quite different sets of model parameters, leading to great uncertainty in the calibrated parameter values. To further investigate the impacts of data scarcity and data uncertainty on calibration, we used a new methodology based on the Metropolis Monte Carlo algorithm. This analysis shows that even for a large amount of calibration data generated by the model itself, small data uncertainties are necessary to significantly decrease the calibrated parameter uncertainties. It also confirms the usefulness of the Metropolis algorithm as a tool for uncertainty analysis in the context of model calibration.
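A minimal random-walk Metropolis sketch of the kind of sampler this methodology relies on (the SWMM quality model and its posterior are not shown; `log_post` is a hypothetical stand-in that would combine the data misfit from the five rain events with the parameter priors):

```python
import numpy as np

def metropolis(log_post, theta0, n_iter=20_000, step=0.1, rng=None):
    """Random-walk Metropolis sampler over the calibrated parameters."""
    rng = rng or np.random.default_rng()
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        proposal = theta + step * rng.normal(size=theta.size)
        lp_new = log_post(proposal)
        if np.log(rng.uniform()) < lp_new - lp:   # Metropolis acceptance rule
            theta, lp = proposal, lp_new
        chain[i] = theta
    return chain
```

A wide or multi-modal chain over the quality-module parameters reproduces the equifinality described above, and rerunning the sampler with progressively smaller assumed data uncertainties shows how much those uncertainties must shrink before the posterior tightens.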


Author(s):  
Abhijit Bhattacharyya ◽  
John Schueller ◽  
Brian Mann ◽  
Tony Schmitz ◽  
Michael Gomez

Abstract Empirical mathematical models of cutting forces in machining processes use experimentally determined input parameters to make predictions. A general method for propagating input-parameter uncertainties through such predictive models is developed. Sources of uncertainty are identified and classified. First, a classical uncertainty procedure is employed to estimate the uncertainties associated with the data reduction equation using a first-order Taylor series expansion; the small values of the input-parameter uncertainties justify this local linearization. Coverage factors required to estimate confidence intervals are computed based on appropriate underlying statistical distributions. A root-sum-of-squares method yields the overall expanded uncertainty in force predictions. A popular model for predicting cutting forces in end milling is selected to demonstrate the procedure, but the approach is general. The analysis is applied to experimental data, and force predictions are quoted along with their associated confidence intervals. An alternative analysis based on Monte Carlo simulation is also presented. This procedure yields different insights than the classical uncertainty analysis and complements it: Monte Carlo simulation provides combined uncertainties directly, without sensitivity calculations, whereas classical uncertainty analysis reveals the impacts of random and systematic effects separately. This information can prompt the user to improve the experimental setup if the impact of systematic effects is observed to be comparatively large. The method presented here for quoting an estimate of the uncertainty in force predictions will permit users to assess the suitability of a given empirical force-prediction model in specific applications.
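To make the two propagation routes concrete, here is a hedged sketch for an illustrative linear-edge force relation F = Ktc·b·h + Kte·b, a common mechanistic form in end-milling force models; the coefficients and uncertainties below are invented for demonstration and are not the paper's measured values:

```python
import numpy as np

# Invented nominal values and standard uncertainties (demonstration only).
Ktc, u_Ktc = 600.0, 30.0     # cutting coefficient [N/mm^2]
Kte, u_Kte = 10.0, 1.5       # edge coefficient [N/mm]
b,   u_b   = 2.0, 0.05       # axial depth of cut [mm]
h,   u_h   = 0.10, 0.005     # mean chip thickness [mm]

# Classical route: first-order Taylor series sensitivities combined by
# root sum of squares (a coverage factor would then scale this to a CI).
F = Ktc * b * h + Kte * b
sens = np.array([b * h, b, Ktc * h + Kte, Ktc * b])   # dF/d(Ktc, Kte, b, h)
u_in = np.array([u_Ktc, u_Kte, u_b, u_h])
u_F = np.sqrt(np.sum((sens * u_in) ** 2))
print(f"Taylor/RSS : F = {F:.1f} N, combined standard uncertainty = {u_F:.1f} N")

# Monte Carlo route: sample the inputs directly, no sensitivities needed.
rng = np.random.default_rng(0)
n = 100_000
Ktc_s, Kte_s = rng.normal(Ktc, u_Ktc, n), rng.normal(Kte, u_Kte, n)
b_s, h_s = rng.normal(b, u_b, n), rng.normal(h, u_h, n)
F_s = Ktc_s * b_s * h_s + Kte_s * b_s
print(f"Monte Carlo: F = {F_s.mean():.1f} N, standard uncertainty = {F_s.std():.1f} N")
```

For this nearly linear model the two routes agree closely; the Monte Carlo histogram additionally exposes any skewness that the linearized combination cannot represent.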


2017
Vol 34 (5)
pp. 1700-1723
Author(s):
Saurabh Prabhu
Sez Atamturktur
Scott Cogan

Purpose: This paper aims to assess the ability of computer models with imperfect functional forms and uncertain input parameters to represent reality. Design/methodology/approach: The assessment evaluates both the agreement between a model's predictions and available experiments and the robustness of this agreement to uncertainty. The concept of satisfying boundaries, representing the input parameter sets that yield model predictions with acceptable fidelity to observed experiments, is introduced. Findings: Satisfying boundaries provide several useful indicators for model assessment and, when calculated for varying fidelity thresholds and input parameter uncertainties, reveal the trade-off between robustness to uncertainty in model parameters, the threshold for satisfactory fidelity and the probability of satisfying the given fidelity threshold. Using a controlled case-study example, important modeling decisions such as the acceptable level of uncertainty, fidelity requirements and resource allocation for additional experiments are illustrated. Originality/value: Traditional methods of model assessment are based solely on fidelity to experiments, leading to a single parameter set considered fidelity-optimal, which essentially represents the values that yield the optimal compensation between various sources of errors and uncertainties. Rather than maximizing fidelity, this study advocates basing model assessment on the model's ability to satisfy a required fidelity (or error tolerance). Evaluating the trade-off between error tolerance, parameter uncertainty and the probability of satisfying this predefined error threshold provides a powerful tool for model assessment and resource allocation.
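A minimal sketch of how the probability of satisfying a fidelity threshold could be estimated for a given level of parameter uncertainty (all names are hypothetical; `model` and `experiment` stand in for the simulator and the test data):

```python
import numpy as np

def prob_satisfying(model, theta_nominal, theta_std, experiment, threshold,
                    n=5_000, rng=None):
    """Fraction of parameter draws (within the assumed uncertainty) whose
    worst-case prediction error stays below the fidelity threshold."""
    rng = rng or np.random.default_rng()
    draws = rng.normal(theta_nominal, theta_std, size=(n, len(theta_nominal)))
    errors = np.array([np.abs(model(theta) - experiment).max() for theta in draws])
    return np.mean(errors <= threshold)
```

Sweeping `threshold` and `theta_std` over grids traces the trade-off surface between error tolerance, parameter uncertainty and the probability of satisfying the threshold described above.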


Processes
2021
Vol 9 (5)
pp. 745
Author(s):
Dimitrios Meimaroglou
Sandrine Hoppe
Baptiste Boit

The kinetics of the hydrolysis and polycondensation reactions of saccharides have been the subject of numerous studies, due to their importance in several industrial sectors. The present work presents a novel kinetic modeling framework that is specifically well suited to reacting systems under strict moisture control, which favor the polycondensation reactions towards the formation of high-degree polysaccharides. The proposed model is based on an extended and generalized kinetic scheme, which also includes the presence of polyols, and is formulated using two different numerical approaches, namely a deterministic one in terms of the method of moments and a stochastic kinetic Monte Carlo approach. The most significant advantages and drawbacks of each technique are demonstrated, and the more suitable one (i.e., the Monte Carlo method) is implemented for the modeling of the system under different conditions, for which experimental data were available. Through these comparisons it is shown that the model can successfully follow the evolution of the reactions up to the formation of polysaccharides of very high degrees of polymerization.
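A hedged sketch of the stochastic side of such a framework: a Gillespie-type kinetic Monte Carlo simulation of a generic step-growth polycondensation in which any two chains may condense (the rate constant, initial counts and the omission of hydrolysis and polyols are simplifications for illustration, not the paper's kinetic scheme):

```python
import numpy as np

rng = np.random.default_rng(0)

# Generic step-growth polycondensation: P_n + P_m -> P_(n+m) + water.
kc = 1e-3                      # condensation rate constant (per pair, per time) - invented
chains = [1] * 5000            # start from 5000 monomer units (degree of polymerization 1)
t, t_end = 0.0, 2000.0

while t < t_end and len(chains) > 1:
    n = len(chains)
    a_total = kc * n * (n - 1) / 2          # total propensity over all chain pairs
    t += rng.exponential(1.0 / a_total)     # time to the next condensation event
    i, j = rng.choice(n, size=2, replace=False)
    chains[i] += chains[j]                  # join the two chains
    chains.pop(j)

chains = np.array(chains)
dp_n = chains.mean()                        # number-average degree of polymerization
dp_w = (chains ** 2).sum() / chains.sum()   # weight-average degree of polymerization
print(f"chains left: {chains.size}, DPn = {dp_n:.1f}, DPw = {dp_w:.1f}")
```

The method of moments would instead track the leading moments of the chain-length distribution through a small set of ODEs; the kinetic Monte Carlo route keeps the full distribution at the cost of simulating individual reaction events, which is the trade-off the comparison above weighs.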


1969
Vol 24 (10)
pp. 1449-1457
Author(s):
H. Klingenberg
F. Sardei
W. Zimmermann

Abstract In continuation of the work on the interaction between shock waves and magnetic fields [1, 2], the experiments reported here measured the atomic and electron densities in the interaction region by means of an interferometric and a spectroscopic method. The transient atomic density was also calculated using a one-dimensional theory based on the work of Johnson [3], but modified to give an improved physical model. The experimental results were compared with the theoretical predictions.


2021
Vol 11 (11)
pp. 5234
Author(s):
Jin Hun Park
Pavel Pereslavtsev
Alexandre Konobeev
Christian Wegmann

For the stable and self-sufficient functioning of the DEMO fusion reactor, one of the most important parameters that must be demonstrated is the Tritium Breeding Ratio (TBR). The reliable assessment of the TBR with safety margins is a matter of fusion reactor viability. The uncertainty of the TBR in neutronic simulations includes many different aspects, such as the uncertainty due to the simplification of the geometry models used, the uncertainty of the reactor layout and the uncertainty introduced by the neutronic calculations. The last one can be reduced by applying high-fidelity Monte Carlo simulations for TBR estimations. Nevertheless, these calculations have inherent statistical errors, controlled by the number of neutron histories and straightforward to assess for a quantity such as the TBR, as well as underlying errors due to nuclear data uncertainties. In fact, every evaluated nuclear data file involved in the MCNP calculations can be replaced with a set of random data files representing particular deviations of the nuclear model parameters, each of them being correct and valid for applications. To account for the uncertainty of the nuclear model parameters introduced in the evaluated data file, a Total Monte Carlo (TMC) method can be used to analyze the uncertainty of the TBR owing to the nuclear data used for the calculations. To this end, two 3D fully heterogeneous geometry models of the helium cooled pebble bed (HCPB) and water cooled lithium lead (WCLL) European DEMOs were utilized for the calculations of the TBR. The TMC calculations were performed making use of the TENDL-2017 nuclear data library random files, with high enough statistics to provide a well-resolved Gaussian distribution of the TBR value. The assessment was done for the estimation of the TBR uncertainty due to the nuclear data for the entire material compositions and for separate materials: structural, breeder and neutron multipliers. The overall TBR uncertainty due to the nuclear data was estimated to be 3-4% for the HCPB and WCLL DEMOs, respectively.
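The transport runs themselves require MCNP and the TENDL-2017 random files, so the sketch below only shows one common TMC bookkeeping step, with invented numbers: separating the spread of TBR values over the random files into its statistical and nuclear-data components.

```python
import numpy as np

# Stand-ins for per-random-file results: each entry would be the TBR from one
# MCNP run using one TENDL-2017 random file, plus that run's statistical error.
rng = np.random.default_rng(0)
n_files = 300
tbr = rng.normal(1.15, 0.02, n_files)        # invented TBR values
stat_err = np.full(n_files, 0.005)           # invented per-run statistical errors

var_observed = tbr.var(ddof=1)               # total spread over the random files
var_statistical = np.mean(stat_err ** 2)     # average purely statistical variance
var_nuclear_data = max(var_observed - var_statistical, 0.0)

rel_unc = 100.0 * np.sqrt(var_nuclear_data) / tbr.mean()
print(f"TBR = {tbr.mean():.3f}, nuclear-data uncertainty ~ {rel_unc:.1f}%")
```

Keeping the per-run statistical errors well below the observed spread (as the abstract notes, via high enough statistics) is what makes the Gaussian over the random files a clean measure of the nuclear-data contribution.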


2008
Vol 10 (2)
pp. 153-162
Author(s):
B. G. Ruessink

When a numerical model is to be used as a practical tool, its parameters should preferably be stable and consistent, that is, possess a small uncertainty and be time-invariant. Using data and predictions of alongshore mean currents flowing on a beach as a case study, this paper illustrates how parameter stability and consistency can be assessed using Markov chain Monte Carlo. Within a single calibration run, Markov chain Monte Carlo estimates the parameter posterior probability density function, its mode being the best-fit parameter set. Parameter stability is investigated by stepwise adding new data to a calibration run, while consistency is examined by calibrating the model on different datasets of equal length. The results for the present case study indicate that various tidal cycles with strong (say, >0.5 m/s) currents are required to obtain stable parameter estimates, and that the best-fit model parameters and the underlying posterior distribution are strongly time-varying. This inconsistent parameter behavior may reflect unresolved variability of the processes represented by the parameters, or may represent compensational behavior for temporal violations in specific model assumptions.
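A hedged sketch of the two checks described above, assuming a hypothetical `calibrate(data_subset)` function that runs the MCMC calibration and returns a (samples × parameters) chain (the alongshore-current model itself is not shown):

```python
import numpy as np

def stability_and_consistency(calibrate, data, n_blocks=6):
    """Stability: stepwise extend the calibration window and watch the
    posterior settle (or not).  Consistency: calibrate on disjoint windows
    of equal length and compare the resulting parameter estimates."""
    blocks = np.array_split(data, n_blocks)

    stability = []
    for k in range(1, n_blocks + 1):
        chain = calibrate(np.concatenate(blocks[:k]))
        # Posterior means/stds are used here as a simple summary; the
        # best-fit set in the text is the posterior mode.
        stability.append((chain.mean(axis=0), chain.std(axis=0)))

    consistency = [calibrate(block).mean(axis=0) for block in blocks]
    return stability, consistency
```

Parameter estimates that keep drifting as blocks are added, or that differ strongly between disjoint blocks, would signal exactly the time-varying, inconsistent behavior reported here.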

