Uncertainty analysis of calibrated parameter values of an urban storm water quality model using the Metropolis Monte Carlo algorithm

1997 ◽  
Vol 36 (5) ◽  
pp. 141-148 ◽  
Author(s):  
A. Mailhot ◽  
É. Gaume ◽  
J.-P. Villeneuve

The Storm Water Management Model's quality module is calibrated for a section of Québec City's sewer system using data collected during five rain events. It is shown that even for this simple model, calibration can fail: similarly good fits between recorded data and simulation results can be obtained with quite different sets of model parameters, leading to great uncertainty in the calibrated parameter values. To further investigate the impacts of data scarcity and data uncertainty on calibration, we used a new methodology based on the Metropolis Monte Carlo algorithm. This analysis shows that even for a large amount of calibration data generated by the model itself, small data uncertainties are necessary to significantly decrease calibrated parameter uncertainties. This also confirms the usefulness of the Metropolis algorithm as a tool for uncertainty analysis in the context of model calibration.
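The sampling machinery behind such a calibration study can be sketched in a few lines. The two-parameter toy model below and all numerical values are invented stand-ins, not the SWMM quality module; the sketch only illustrates how the Metropolis algorithm explores the posterior of calibrated parameters given noisy calibration data generated by the model itself.

```python
import math
import random

random.seed(0)

# Hypothetical toy "model": simulated output as a function of two parameters.
# The SWMM quality module is far more complex; this only illustrates the
# Metropolis sampling machinery itself.
def model(a, b, x):
    return a * x + b * x**2

# Synthetic calibration data generated by the model itself (as in the study),
# perturbed by observational noise of standard deviation `sigma`.
true_a, true_b, sigma = 1.5, 0.3, 0.1
xs = [0.5 * i for i in range(1, 11)]
obs = [model(true_a, true_b, x) + random.gauss(0.0, sigma) for x in xs]

def log_likelihood(a, b):
    # Gaussian misfit between simulated and "recorded" data.
    return -0.5 * sum((model(a, b, x) - y) ** 2 for x, y in zip(xs, obs)) / sigma**2

def metropolis(n_steps, step=0.05):
    a, b = 1.0, 0.0                      # starting point away from the truth
    ll = log_likelihood(a, b)
    samples = []
    for _ in range(n_steps):
        a_new = a + random.gauss(0.0, step)
        b_new = b + random.gauss(0.0, step)
        ll_new = log_likelihood(a_new, b_new)
        # Accept with probability min(1, exp(ll_new - ll)).
        if ll_new >= ll or random.random() < math.exp(ll_new - ll):
            a, b, ll = a_new, b_new, ll_new
        samples.append((a, b))
    return samples

samples = metropolis(5000)
burn = samples[1000:]                    # discard burn-in
mean_a = sum(s[0] for s in burn) / len(burn)
mean_b = sum(s[1] for s in burn) / len(burn)
```

The spread of the retained samples, not just their mean, is what quantifies the calibrated-parameter uncertainty; with larger `sigma` the posterior widens accordingly.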

1996 ◽  
Vol 33 (2) ◽  
pp. 79-90 ◽  
Author(s):  
Jian Hua Lei ◽  
Wolfgang Schilling

Physically based urban rainfall-runoff models are mostly applied without parameter calibration. Given preliminary estimates of the uncertainty of the model parameters, the associated model output uncertainty can be calculated; Monte Carlo simulation followed by multi-linear regression is used for this analysis. The calculated model output uncertainty can then be compared to the uncertainty estimated by comparing model output with observed data. Based on this comparison, systematic or spurious errors can be detected in the observation data, the validity of the model structure can be confirmed, and the most sensitive parameters can be identified. If the calculated model output uncertainty is unacceptably large, the most sensitive parameters should be calibrated to reduce it. Observation data for which systematic and/or spurious errors have been detected should be discarded from the calibration data. This procedure, referred to as preliminary uncertainty analysis, is illustrated with an example: the HYSTEM program is applied to predict the runoff volume from an experimental catchment with a total area of 68 ha and an impervious area of 20 ha. Based on the preliminary uncertainty analysis, for 7 of 10 events the measured runoff volume is within the calculated uncertainty range, i.e. less than or equal to the calculated model predictive uncertainty. The remaining 3 events most likely include systematic or spurious errors in the observation data (either in the rainfall or the runoff measurements) and are discarded from further analysis. After calibrating the model, its predictive uncertainty is estimated.
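The core procedure, Monte Carlo simulation followed by multi-linear regression, can be illustrated with a deliberately simplified stand-in for the rainfall-runoff model. The function `runoff_volume`, its three parameters, and their uncertainty ranges below are hypothetical, not HYSTEM:

```python
import random

random.seed(1)

# Hypothetical runoff-volume model: output depends on three uncertain
# parameters (runoff coefficient c, depression storage s, impervious
# fraction f). A stand-in only, to illustrate the procedure.
def runoff_volume(c, s, f):
    rain = 20.0                          # mm, fixed event depth
    return c * f * max(rain - s, 0.0)

# Step 1: Monte Carlo sampling from preliminary parameter uncertainty ranges.
n = 2000
params = [(random.uniform(0.7, 0.9),     # c
           random.uniform(0.5, 2.0),     # s (mm)
           random.uniform(0.25, 0.35))   # f
          for _ in range(n)]
outputs = [runoff_volume(c, s, f) for c, s, f in params]

def mean(v):
    return sum(v) / len(v)

def std(v):
    m = mean(v)
    return (sum((x - m) ** 2 for x in v) / len(v)) ** 0.5

# Step 2: standardized regression coefficients to rank parameter sensitivity.
# Computing them one predictor at a time is only valid here because the
# sampled parameters are independent, so the regressors are uncorrelated.
cols = list(zip(*params))
y_m, y_s = mean(outputs), std(outputs)
src = []
for col in cols:
    m, s_ = mean(col), std(col)
    cov = sum((x - m) * (y - y_m) for x, y in zip(col, outputs)) / len(col)
    src.append(cov / (s_ * y_s))         # ~ standardized sensitivity of output
```

The parameter with the largest |src| is the prime candidate for calibration if the overall output spread `y_s` turns out unacceptably large.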


2002 ◽  
Vol 6 (5) ◽  
pp. 883-898 ◽  
Author(s):  
K. Engeland ◽  
L. Gottschalk

Abstract. This study evaluates the applicability of the distributed, process-oriented Ecomag model for prediction of daily streamflow in ungauged basins. The Ecomag model is applied as a regional model to nine catchments in the NOPEX area, using Bayesian statistics to estimate the posterior distribution of the model parameters conditioned on the observed streamflow. The distribution is calculated by Markov Chain Monte Carlo (MCMC) analysis. The Bayesian method requires formulation of a likelihood function for the parameters, and three alternative formulations are used. The first is a subjectively chosen objective function that describes the goodness of fit between the simulated and observed streamflow, as defined in the GLUE framework. The second and third formulations are more statistically rigorous likelihood models that describe the simulation errors: the full statistical likelihood model describes the simulation errors as an AR(1) process, whereas the simple model excludes the auto-regressive part. The statistical parameters depend on the catchments and the hydrological processes, and the statistical and hydrological parameters are estimated simultaneously. The results show that the simple likelihood model gives the most robust parameter estimates. The simulation error can be explained to a large extent by catchment characteristics and climatic conditions, so it is possible to transfer knowledge about them to ungauged catchments. The statistical models for the simulation errors indicate that structural errors in the model are more important than parameter uncertainties.

Keywords: regional hydrological model, model uncertainty, Bayesian analysis, Markov Chain Monte Carlo analysis
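The difference between the simple and the full likelihood formulation comes down to whether the simulation errors are treated as independent Gaussians or as an AR(1) process. A minimal sketch, with invented error values and statistical parameters:

```python
import math

# Simulation errors e_t = Q_obs(t) - Q_sim(t); the values are invented and
# deliberately autocorrelated to show the effect of the AR(1) term.
errors = [0.20, 0.15, 0.18, 0.05, -0.02, -0.10, -0.08, 0.01, 0.06, 0.12]

def loglik_simple(e, sigma):
    # "Simple" model: independent Gaussian errors.
    n = len(e)
    return (-0.5 * n * math.log(2 * math.pi * sigma**2)
            - 0.5 * sum(x * x for x in e) / sigma**2)

def loglik_ar1(e, sigma, rho):
    # "Full" model: AR(1) errors. First error from the stationary
    # distribution, remaining errors conditioned on their predecessor.
    var0 = sigma**2 / (1 - rho**2)
    ll = -0.5 * math.log(2 * math.pi * var0) - 0.5 * e[0] ** 2 / var0
    for t in range(1, len(e)):
        resid = e[t] - rho * e[t - 1]
        ll += (-0.5 * math.log(2 * math.pi * sigma**2)
               - 0.5 * resid**2 / sigma**2)
    return ll

ll_simple = loglik_simple(errors, sigma=0.12)
ll_full = loglik_ar1(errors, sigma=0.12, rho=0.6)
```

In the study both the statistical parameters (`sigma`, `rho`) and the hydrological parameters are estimated jointly within the MCMC analysis; here they are fixed for illustration.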


Author(s):  
Georg A. Mensah ◽  
Luca Magri ◽  
Jonas P. Moeck

Thermoacoustic instabilities are a major threat for modern gas turbines. Frequency-domain stability methods, such as network models and Helmholtz solvers, are common design tools because they are fast compared to compressible flow computations. They result in an eigenvalue problem that is nonlinear with respect to the eigenvalue, so the influence of the relevant parameters on mode stability is only given implicitly. Small changes in some model parameters may have a great impact on stability. The assessment of how parameter uncertainties propagate to system stability is therefore crucial for safe gas turbine operation; this question is addressed by uncertainty quantification. A common strategy for uncertainty quantification in thermoacoustics is risk-factor analysis. One general challenge is the sheer number of uncertain parameter combinations to be quantified: uncertain parameters in an annular combustor might include the equivalence ratio, convection times, geometrical parameters, boundary impedances, flame response model parameters, etc. A new and fast way to obtain algebraic parameter models that tackle the implicit nature of the problem is adjoint perturbation theory. This paper aims to further utilize adjoint methods for the quantification of uncertainties. This analytical method avoids the usual random Monte Carlo (MC) simulations, making it particularly attractive for industrial purposes. Using network models and the open-source Helmholtz solver PyHoltz, it is also discussed how to apply the method with standard modeling techniques. The theory is exemplified on a simple ducted flame and a combustor of the EM2C laboratory for which experimental data are available.
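The perturbation idea can be illustrated on a linear symmetric surrogate: to first order, the sensitivity of an eigenvalue λ to a parameter p is xᵀ(∂A/∂p)x / xᵀx, which needs no re-solution of the eigenvalue problem. The 2×2 matrix below is an invented example; real thermoacoustic problems are nonlinear in the eigenvalue and require left and right eigenvectors:

```python
import math

# First-order eigenvalue sensitivity for the symmetric surrogate
# A(p) = [[2 + p, 1], [1, 3]], checked against a finite difference.
def eig_max(p):
    # Largest eigenvalue and an (unnormalized) eigenvector, analytically.
    a, b, d = 2.0 + p, 1.0, 3.0
    lam = 0.5 * (a + d + math.sqrt((a - d) ** 2 + 4 * b * b))
    x = (b, lam - a)                     # satisfies (A - lam I) x = 0
    return lam, x

p0 = 0.1
lam0, (x1, x2) = eig_max(p0)

# dA/dp has a single nonzero entry (1 at position [0, 0]), so the
# adjoint-style formula collapses to x1^2 / |x|^2.
dlam_adjoint = x1 * x1 / (x1 * x1 + x2 * x2)

# Independent check: central finite difference of the exact eigenvalue.
h = 1e-6
dlam_fd = (eig_max(p0 + h)[0] - eig_max(p0 - h)[0]) / (2 * h)
```

Once such algebraic sensitivities are in hand for every uncertain parameter, uncertainty bands on the growth rate follow from simple algebra rather than from repeated eigenvalue solves, which is what makes the adjoint route attractive compared with Monte Carlo.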


2020 ◽  
Author(s):  
Saulė Simutė ◽  
Lion Krischer ◽  
Christian Boehm ◽  
Martin Vallée ◽  
Andreas Fichtner

We present a proof-of-concept catalogue of full-waveform seismic source solutions for the Japanese Islands area. Our method is based on Bayesian inference of source parameters and a tomographically derived heterogeneous Earth model, used to compute Green's strain tensors. We infer the full moment tensor, location and centroid time of the seismic events in the study area.

To compute spatial derivatives of Green's functions, we use a previously derived regional Earth model (Simutė et al., 2016). The model is radially anisotropic, visco-elastic, and fully heterogeneous. It was constructed using full waveforms in the period band of 15–80 s.

Green's strains are computed numerically with the spectral-element solver SES3D (Gokhberg & Fichtner, 2016). We exploit reciprocity: by treating seismic stations as virtual sources, we compute and store the wavefield across the domain. This gives us a strain database for all potential source-receiver pairs. We store the wavefield for more than 50 F-net broadband stations (www.fnet.bosai.go.jp). Assuming an impulse response as the source time function, displacements are then promptly obtained by a linear combination of the pre-computed strains scaled by the moment tensor elements.

With a feasible number of model parameters and the fast forward problem, we infer the unknowns in a Bayesian framework. The fully probabilistic approach allows us to obtain uncertainty information as well as inter-parameter trade-offs. The sampling is performed with a variant of the Hamiltonian Monte Carlo algorithm, which we developed previously (Fichtner and Simutė, 2017). We apply an L2 misfit on waveform data, and we work in the period band of 15–80 s.

We jointly infer three location parameters, the timing and the moment tensor components. We present two sets of source solutions: 1) full moment tensor solutions, where the trace is free to vary away from zero, and 2) moment tensor solutions with the isotropic part constrained to zero. In particular, we study events with a significant non-double-couple component. Preliminary results for ~Mw 5 shallow- to intermediate-depth events indicate that proper incorporation of 3-D Earth structure results in solutions becoming more double-couple-like. We also find that improving on the Global CMT solutions in terms of waveform fit requires a very good 3-D Earth model and is not trivial.
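The reciprocity-based forward problem reduces to a linear combination once the strains are stored. A schematic sketch with random stand-in arrays (not actual SES3D output):

```python
import random

random.seed(2)

# Hypothetical precomputed Green's-strain database entry for one
# station/source-location pair: one time series per strain component.
nt = 100                                  # samples in the time window
components = ["xx", "yy", "zz", "xy", "xz", "yz"]
strains = {c: [random.gauss(0.0, 1.0) for _ in range(nt)] for c in components}

def displacement(moment_tensor):
    # Synthetic displacement u(t) = M : E(t); off-diagonal moment-tensor
    # elements appear twice in the contraction because M is symmetric.
    u = [0.0] * nt
    for c in components:
        w = moment_tensor[c] * (1.0 if c in ("xx", "yy", "zz") else 2.0)
        for t in range(nt):
            u[t] += w * strains[c][t]
    return u

# A hypothetical deviatoric source (trace = 0).
M = {"xx": 1.0, "yy": -0.5, "zz": -0.5, "xy": 0.2, "xz": 0.0, "yz": 0.1}
u = displacement(M)
```

Because the forward map is linear in the moment-tensor elements, each sample proposed by the Hamiltonian Monte Carlo sampler costs only this cheap recombination rather than a new wavefield simulation, which is what makes the fully probabilistic inversion feasible.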


1995 ◽  
Vol 06 (01) ◽  
pp. 67-76 ◽  
Author(s):  
GEORGE C. JOHN ◽  
VIJAY A. SINGH

The electron propagator in the Aharonov-Bohm effect is investigated using the Feynman path integral formalism. The calculation of the propagator is effected using a variation of the Metropolis Monte Carlo algorithm. Unlike “exact” calculations, our approach permits us to include a nonvanishing solenoid radius. We investigate the dependence of the resulting interference pattern on the magnetic field as well as the solenoid radius. Our results agree with the exact case in the limit of an infinitesimally small solenoid radius.
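The path-sampling machinery can be illustrated on a much simpler system than the Aharonov-Bohm geometry: a 1-D harmonic oscillator in imaginary time, where the continuum value ⟨x²⟩ ≈ 0.5 (for ħ = m = ω = 1) provides a check. All discretization choices below are illustrative, and the real-time, complex-weight aspects of the propagator calculation are not represented:

```python
import math
import random

random.seed(5)

# Metropolis sampling of discretized Feynman paths for a 1-D harmonic
# oscillator in imaginary time (periodic boundary, beta = 10).
n_slices, beta = 50, 10.0
eps = beta / n_slices                    # imaginary-time step
path = [0.0] * n_slices                  # cold start

def action_local(i, x):
    # Terms of the discretized action that involve slice i at position x.
    xl = path[(i - 1) % n_slices]
    xr = path[(i + 1) % n_slices]
    kinetic = ((x - xl) ** 2 + (xr - x) ** 2) / (2.0 * eps)
    potential = eps * 0.5 * x * x
    return kinetic + potential

x2_sum, n_meas = 0.0, 0
for sweep in range(4000):
    for i in range(n_slices):
        x_old = path[i]
        x_new = x_old + random.uniform(-0.5, 0.5)
        dS = action_local(i, x_new) - action_local(i, x_old)
        if dS < 0 or random.random() < math.exp(-dS):
            path[i] = x_new
    if sweep >= 1000:                    # discard thermalization sweeps
        x2_sum += sum(x * x for x in path) / n_slices
        n_meas += 1

x2_est = x2_sum / n_meas                 # continuum value is ~0.5
```

The local-update structure is what lets such calculations accommodate complications like a nonvanishing solenoid radius, which closed-form "exact" treatments cannot.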


1993 ◽  
Vol 9 (4) ◽  
pp. 669-701 ◽  
Author(s):  
Edward H. Field ◽  
Klaus H. Jacob

In the weak-motion phase of the Turkey Flat blind-prediction effort, it was found that, given a particular physical model of each sediment site, various theoretical techniques give similar estimates of the site response. However, it remained to be determined how uncertainties in the physical model parameters influence the theoretical predictions. We have studied this question by propagating the physical parameter uncertainties into the theoretical site-response predictions using Monte Carlo simulations. The input-parameter uncertainties were estimated directly from the results of several independent geotechnical studies performed at Turkey Flat. While the computed results generally agree with empirical site-response estimates (average spectral ratios of earthquake recordings), we found that the uncertainties lead to a high degree of variability in the theoretical predictions. Most of this variability comes from poor constraints on the shear-wave velocity and thickness of a thin (∼2 m) surface layer, and on the attenuation of the sediments. Our results suggest that in site-response studies that rely exclusively on geotechnically based theoretical predictions, it is important that the variability resulting from input-parameter uncertainties be recognized and accounted for.
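The propagation step itself is straightforward: sample the geotechnical inputs from their estimated distributions and collect the spread of the predicted response. The sketch below uses the single-layer resonance f₀ = Vs/(4H) as a stand-in response quantity; the parameter distributions are invented, not the Turkey Flat geotechnical data:

```python
import random

random.seed(3)

# Monte Carlo propagation of input-parameter uncertainty into a simple
# site-response quantity: the fundamental resonance of one sediment layer.
n = 5000
f0 = []
for _ in range(n):
    vs = random.gauss(200.0, 40.0)       # shear-wave velocity (m/s), uncertain
    h = random.gauss(2.0, 0.5)           # thin-layer thickness (m), poorly constrained
    if vs > 50.0 and h > 0.5:            # reject unphysical draws
        f0.append(vs / (4.0 * h))

mean_f0 = sum(f0) / len(f0)
var_f0 = sum((x - mean_f0) ** 2 for x in f0) / len(f0)
cv = var_f0 ** 0.5 / mean_f0             # coefficient of variation of the prediction
```

Even with only two uncertain inputs, the coefficient of variation of the predicted response is substantial, which mirrors the paper's finding that input-parameter uncertainty alone produces a high degree of variability in theoretical site-response estimates.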


2004 ◽  
Vol 824 ◽  
Author(s):  
M.M. Askarieh ◽  
T.G. Heath ◽  
W.M. Tearle

Abstract. A Monte Carlo-based approach has been adopted for the development of a chemical thermodynamic model describing the goethite surface in contact with sodium nitrate solutions. The technique involves calculating the goethite surface properties for the chemical conditions corresponding to each experimental data point. The representation of the surface was based on a set of model parameters, each of which was either fixed or randomly sampled from a specified range of values. Thousands of such model representations were generated for different selected sets of parameter values using the standard geochemical speciation computer program HARPHRQ. The method allowed many combinations of parameter values to be sampled that might not be reached with a simple least-squares fitting approach. It also allowed the dependence of the quality of fit on each parameter to be analysed. The Monte Carlo approach is most appropriate for the development of complex models involving the fitting of several datasets with several fitting parameters.

The introduction of selenate surface complexes allowed the model to be extended to represent selenate ion sorption, selenium being an important radioelement in the evaluation of the long-term safety of ILW disposal. The sorption model gave good agreement with a wide range of experimental sorption datasets for selenate.
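The random-sampling strategy can be sketched as follows; the toy sorption model, its parameter names, ranges, and data points are all hypothetical stand-ins for the HARPHRQ surface-complexation calculations:

```python
import random

random.seed(4)

# Toy sorption-edge "model": fraction sorbed as a function of pH, governed
# by two parameters sampled from specified ranges. Not the HARPHRQ model.
def model(ph, log_k, site_density):
    return site_density / (1.0 + 10.0 ** (ph - log_k))

# Invented experimental points: (pH, fraction sorbed).
data = [(4.0, 0.95), (5.0, 0.90), (6.0, 0.60), (7.0, 0.20), (8.0, 0.05)]

# Generate thousands of model representations with randomly sampled
# parameters and score each against the data, as in the paper's strategy.
trials = []
for _ in range(5000):
    log_k = random.uniform(4.0, 8.0)         # sampled from its range
    site_density = random.uniform(0.5, 1.5)  # sampled from its range
    sse = sum((model(ph, log_k, site_density) - y) ** 2 for ph, y in data)
    trials.append((sse, log_k, site_density))

best_sse, best_log_k, best_density = min(trials)
```

Keeping the full list of `(score, parameters)` tuples, rather than only the best fit, is what allows the dependence of fit quality on each parameter to be analysed afterwards, e.g. by plotting `sse` against each sampled parameter.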

