Parameter Estimation and Uncertainty Analysis of ORYZA_V3 Model Using the GLUE Method

2019 ◽  
Vol 62 (4) ◽  
pp. 941-949
Author(s):  
Junwei Tan ◽  
Qingyun Duan

Abstract. The Generalized Likelihood Uncertainty Estimation (GLUE) method is one of the most popular methods for parameter estimation and uncertainty analysis, although numerous studies have criticized it for certain drawbacks. In this study, we performed an uncertainty analysis of the ORYZA_V3 model using the GLUE method integrated with Latin hypercube sampling (LHS). Different likelihood measures were examined to understand the differences in the derived posterior parameter distributions and in the uncertainty estimates of model predictions, based on a variety of observations from field experiments. The results indicated that the posterior parameter distributions and the 95% confidence intervals (95CI) of model outputs were very sensitive to the choice of likelihood measure, as well as to the weights assigned within a likelihood measure to observations at different dates and to different observation types. The likelihood measure built on a formal likelihood function assuming normally distributed model errors, with observation types combined by multiplication, performed best at reducing the uncertainties of parameter values and model predictions. Moreover, the means and standard deviations of observation replicates alone were enough to construct an effective likelihood function in the GLUE method. This study highlights the importance of using appropriate likelihood measures integrated with multiple observation types in the GLUE method. Keywords: GLUE, Likelihood measures, Model uncertainty, Crop model.
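As a rough illustration of the approach described above, the following sketch combines Latin hypercube sampling with a formal likelihood that assumes normally distributed model errors in a GLUE loop. The toy linear model, the parameter bounds, and the error standard deviation are all invented stand-ins for ORYZA_V3 and its parameters, not the authors' actual setup:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    """Stratified Latin hypercube sample within per-parameter (low, high) bounds."""
    samples = np.empty((n_samples, len(bounds)))
    for j, (lo, hi) in enumerate(bounds):
        # one draw per stratum, shuffled across samples
        strata = (rng.permutation(n_samples) + rng.random(n_samples)) / n_samples
        samples[:, j] = lo + strata * (hi - lo)
    return samples

def glue(model, bounds, obs, sigma, n_samples=10000, seed=0):
    """GLUE with a formal likelihood assuming normally distributed errors."""
    rng = np.random.default_rng(seed)
    params = latin_hypercube(n_samples, bounds, rng)
    loglik = np.array([-0.5 * np.sum(((model(p) - obs) / sigma) ** 2) for p in params])
    w = np.exp(loglik - loglik.max())   # subtract the max to avoid underflow
    w /= w.sum()                        # normalized posterior weights
    return params, w

# Toy stand-in for the crop model: linear response with slope/intercept parameters
def toy_model(p):
    x = np.linspace(0.0, 1.0, 20)
    return p[0] * x + p[1]

obs = toy_model([2.0, 0.5]) + 0.05 * np.random.default_rng(1).standard_normal(20)
params, w = glue(toy_model, bounds=[(0.0, 5.0), (-1.0, 1.0)], obs=obs, sigma=0.05)
post_mean = (params * w[:, None]).sum(axis=0)   # likelihood-weighted estimate
```

Subtracting the maximum log-likelihood before exponentiating keeps the weights numerically stable when the error variance is small; the normalized weights are then the posterior sample weights from which interval estimates of any model output can be read off.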

2020 ◽  
Author(s):  
Somayeh Shadkam ◽  
Mehedi Hasan ◽  
Christoph Niemann ◽  
Andreas Guenter ◽  
Petra Döll

In this research we evaluated parameter uncertainties and predictive intervals of the WaterGAP Global Hydrological Model (WGHM) for multiple variable types, including streamflow, total water storage anomaly (TWSA), and snow cover, using the Generalized Likelihood Uncertainty Estimation (GLUE) method, for a large river basin in North America, the Mississippi basin. The GLUE approach is built on the Monte Carlo concept: simulations are performed for all parameter sets, which are sampled from prior parameter ranges using Latin hypercube sampling. The Nash-Sutcliffe efficiency was used as the likelihood measure for all variables. Behavioral model sets were selected as those yielding likelihood measures above pre-specified thresholds for all three variables or for subsets of them. These behavioral parameter sets were then used to analyze parameter uncertainties, trade-offs among the variables, and the influence of each individual observation dataset on constraining the other variables.
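The behavioral-set selection described above can be sketched as follows, assuming a Nash-Sutcliffe efficiency threshold per variable; the sine and cosine series are purely illustrative stand-ins for the streamflow and TWSA observations, not WGHM output:

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, <= 0 is no better than the mean."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def behavioral_mask(sims_by_var, obs_by_var, thresholds):
    """Keep parameter sets whose NSE exceeds the threshold for every variable."""
    n_sets = next(iter(sims_by_var.values())).shape[0]
    mask = np.ones(n_sets, dtype=bool)
    for var, sims in sims_by_var.items():
        scores = np.array([nse(s, obs_by_var[var]) for s in sims])
        mask &= scores > thresholds[var]
    return mask

# Toy series standing in for two observed variables
t = np.linspace(0.0, 2.0 * np.pi, 50)
obs = {"streamflow": np.sin(t), "twsa": np.cos(t)}
rng = np.random.default_rng(0)
sims = {
    # parameter set 0 fits both variables, set 1 only streamflow, set 2 neither
    "streamflow": np.stack([np.sin(t) + 0.05 * rng.standard_normal(50),
                            np.sin(t),
                            np.zeros(50)]),
    "twsa": np.stack([np.cos(t) + 0.05 * rng.standard_normal(50),
                      -np.cos(t),
                      np.zeros(50)]),
}
mask = behavioral_mask(sims, obs, {"streamflow": 0.5, "twsa": 0.5})
```

Requiring the threshold on all variables simultaneously (the `&=` intersection) is what exposes the trade-offs among variables: a set behavioral for streamflow alone can be rejected by TWSA, which is how one observation type constrains the others.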


2007 ◽  
Vol 56 (6) ◽  
pp. 11-18 ◽  
Author(s):  
E. Lindblom ◽  
H. Madsen ◽  
P.S. Mikkelsen

In this paper two attempts are made to assess the uncertainty involved in model predictions of copper loads from stormwater systems. In the first attempt, the GLUE methodology is applied to derive model parameter sets whose outputs encompass a significant number of the measurements. In the second attempt, the conceptual model is reformulated as a grey-box model, followed by parameter estimation. Given data from an extensive measurement campaign, the two methods suggest that the output of the stormwater pollution model is associated with significant uncertainty. With the proposed model and input data, the GLUE analysis shows that the total sampled copper mass can be predicted within a range of ±50% of the median value (385 g), whereas the grey-box analysis shows a prediction uncertainty of less than ±30%. Future work will clarify the pros and cons of the two methods and further explore to what extent the estimation can be improved by modifying the underlying accumulation-washout model.
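Prediction ranges of the kind quoted above (a band around the median copper mass) are typically read off the likelihood-weighted cumulative distribution of the behavioral simulations. A minimal sketch, with invented prediction values and equal weights rather than the paper's actual GLUE output:

```python
import numpy as np

def weighted_quantile(values, weights, q):
    """Quantile of `values` under likelihood `weights` (GLUE-style prediction limits)."""
    order = np.argsort(values)
    v = np.asarray(values, dtype=float)[order]
    w = np.asarray(weights, dtype=float)[order]
    cum = np.cumsum(w) / np.sum(w)        # weighted empirical CDF
    return float(np.interp(q, cum, v))

# Hypothetical behavioral predictions of total copper mass (g), equal weights
preds = np.array([250.0, 300.0, 350.0, 385.0, 420.0, 500.0, 560.0])
w = np.ones_like(preds)
lower, median, upper = (weighted_quantile(preds, w, q) for q in (0.05, 0.5, 0.95))
```

With unequal likelihood weights the same function yields narrower or wider bands depending on how sharply the likelihood discriminates among behavioral sets, which is the mechanism behind the ±50% versus ±30% comparison in the abstract.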


2019 ◽  
Vol 151 ◽  
pp. 170-182 ◽  
Author(s):  
Long T. Ho ◽  
Andres Alvarado ◽  
Josue Larriva ◽  
Cassia Pompeu ◽  
Peter Goethals

2013 ◽  
Vol 10 (8) ◽  
pp. 13097-13128 ◽  
Author(s):  
F. Hartig ◽  
C. Dislich ◽  
T. Wiegand ◽  
A. Huth

Abstract. Inverse parameter estimation of process-based models is a long-standing problem in ecology and evolution. A key problem of inverse parameter estimation is to define a metric that quantifies how well model predictions fit to the data. Such a metric can be expressed by general cost or objective functions, but statistical inversion approaches are based on a particular metric, the probability of observing the data given the model, known as the likelihood. Deriving likelihoods for dynamic models requires making assumptions about the probability for observations to deviate from mean model predictions. For technical reasons, these assumptions are usually derived without explicit consideration of the processes in the simulation. Only in recent years have new methods become available that allow generating likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional MCMC, performs well in retrieving known parameter values from virtual field data generated by the forest model. We analyze the results of the parameter estimation, examine the sensitivity towards the choice and aggregation of model outputs and observed data (summary statistics), and show results from using this method to fit the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss differences of this approach to Approximate Bayesian Computation (ABC), another commonly used method to generate simulation-based likelihood approximations.
Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can successfully be applied to process-based models of high complexity. The methodology is particularly suited to heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models in ecology and evolution.
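The parametric likelihood approximation placed in a conventional MCMC, as described above, can be sketched as follows: simulate repeatedly at each proposed parameter, fit a normal distribution to the simulated summary statistics, evaluate the observed summary under it, and accept or reject with a Metropolis step. The one-dimensional stand-in simulator, the single summary statistic, and the flat prior are hypothetical simplifications, not the FORMIND workflow:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n_rep=100):
    # Stand-in stochastic simulator: a real application would run the
    # individual-based model and compute summary statistics of its output.
    return theta + rng.standard_normal(n_rep)

def synthetic_loglik(theta, obs_summary):
    # Parametric approximation: fit a normal to the simulated summaries,
    # then evaluate the observed summary under that normal (up to a constant).
    sims = simulate(theta)
    mu, sd = sims.mean(), sims.std(ddof=1)
    return -0.5 * ((obs_summary - mu) / sd) ** 2 - np.log(sd)

def mcmc(obs_summary, n_iter=3000, step=0.5, theta0=0.0):
    chain = np.empty(n_iter)
    theta, ll = theta0, synthetic_loglik(theta0, obs_summary)
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal()
        ll_prop = synthetic_loglik(prop, obs_summary)
        if np.log(rng.random()) < ll_prop - ll:    # Metropolis step, flat prior assumed
            theta, ll = prop, ll_prop
        chain[i] = theta
    return chain

chain = mcmc(obs_summary=2.0)
posterior_mean = chain[1000:].mean()   # discard burn-in
```

Unlike rejection-based ABC, no tolerance threshold is needed here: every proposal receives a likelihood value, so the approximation plugs directly into standard MCMC machinery, which is the conceptual difference the abstract discusses.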


Author(s):  
Tim Loossens ◽  
Kristof Meers ◽  
Niels Vanhasbroeck ◽  
Nil Anarat ◽  
Stijn Verdonck ◽  
...  

Abstract. Computational modeling plays an important role in a gamut of research fields. In affect research, continuous-time stochastic models are becoming increasingly popular. Recently, a non-linear, continuous-time, stochastic model of affect dynamics has been introduced, called the Affective Ising Model (AIM). The drawback of non-linear models like the AIM is that they generally pose serious computational challenges for parameter estimation and related statistical analyses. The likelihood function of the AIM does not have a closed-form expression, so simulation-based or numerical methods have to be used to evaluate it. Additionally, the likelihood function can have multiple local optima, so a global optimization heuristic is required, and such heuristics generally require a large number of likelihood function evaluations. In this paper, a Julia software package dedicated to fitting the AIM is introduced. The package includes an implementation of a numerical algorithm for fast computation of the likelihood function, which can be run both on graphics processing units (GPU) and on central processing units (CPU). The numerical method introduced in this paper is compared to the more traditional Euler-Maruyama method for solving stochastic differential equations. Furthermore, the estimation software is tested by means of a recovery study, and estimation times are reported for benchmarks run on several computing devices (two different GPUs and three different CPUs). According to these results, a single parameter estimation can be obtained in less than thirty seconds using a mainstream NVIDIA GPU.
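The Euler-Maruyama method mentioned above as the traditional baseline is straightforward to sketch. Here it is applied to an Ornstein-Uhlenbeck process as a linear stand-in; the AIM itself is non-linear, and the drift, diffusion, and parameter values below are purely illustrative:

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, t_span, dt, rng):
    """Simulate dX = drift(X) dt + diffusion(X) dW with the Euler-Maruyama scheme."""
    n_steps = int(t_span / dt)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        dw = rng.standard_normal() * np.sqrt(dt)   # Brownian increment
        x[i + 1] = x[i] + drift(x[i]) * dt + diffusion(x[i]) * dw
    return x

# Ornstein-Uhlenbeck process: mean-reverting toward mu at rate kappa
mu, kappa, sigma = 0.0, 2.0, 0.3
rng = np.random.default_rng(42)
path = euler_maruyama(lambda x: kappa * (mu - x), lambda x: sigma,
                      x0=1.0, t_span=10.0, dt=0.01, rng=rng)
```

Evaluating a likelihood this way requires many such simulated paths per parameter value, which is exactly the computational burden that motivates the faster dedicated numerical algorithm and the GPU implementation described in the abstract.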


2012 ◽  
Vol 4 (1) ◽  
pp. 185
Author(s):  
Irfan Wahyudi ◽  
Purhadi Purhadi ◽  
Sutikno Sutikno ◽  
Irhamah Irhamah

Multivariate Cox proportional hazards models have the proportionality property: the ratio of the hazard functions for two individuals with covariate vectors z1 and z2 is constant (time-independent). In this study we discuss parameter estimation for the multivariate Cox model using the Maximum Partial Likelihood Estimation (MPLE) method. To determine the estimators that maximize the log partial likelihood function, a score vector and a Hessian matrix are derived, and a numerical iteration method is then applied; in this case, we use the Newton-Raphson method. A numerical method is needed because setting the score vector equal to the zero vector yields an equation system with no closed-form solution. Studies of the multivariate Cox model, including its parameter estimation methods, are limited, yet such methods are urgently needed in related fields such as economics, engineering, and the medical sciences. For these reasons, this study aims to develop parameter estimation methods from the univariate to the multivariate case.
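A univariate illustration of the iteration described above: Newton-Raphson applied to the Cox log partial likelihood for a single covariate. The multivariate case replaces the scalar score and information with the score vector and Hessian matrix, but the update has the same form. The simulated data set and hazard ratio below are invented for the example:

```python
import numpy as np

def cox_newton_raphson(times, events, z, beta0=0.0, tol=1e-8, max_iter=50):
    """Newton-Raphson maximization of the Cox log partial likelihood
    (single covariate, no tied event times assumed)."""
    order = np.argsort(times)
    times, events, z = times[order], events[order], z[order]
    beta = beta0
    for _ in range(max_iter):
        score, info = 0.0, 0.0
        for i in range(len(times)):
            if not events[i]:
                continue
            risk = z[i:]                          # covariates of the risk set at t_i
            w = np.exp(beta * risk)
            zbar = np.sum(w * risk) / np.sum(w)   # risk-set weighted mean covariate
            score += z[i] - zbar                  # score (gradient) contribution
            info += np.sum(w * (risk - zbar) ** 2) / np.sum(w)  # observed information
        step = score / info                       # Newton step: I(beta)^{-1} U(beta)
        beta += step
        if abs(step) < tol:
            break
    return beta

# Simulated data: covariate z = 1 doubles the hazard, so beta_true = ln 2
rng = np.random.default_rng(7)
n = 500
z = rng.integers(0, 2, n).astype(float)
times = rng.exponential(1.0 / np.exp(np.log(2.0) * z))
events = np.ones(n, dtype=bool)
beta_hat = cox_newton_raphson(times, events, z)
```

Setting the score to zero has no closed-form solution because beta appears inside the risk-set weighted averages, which is why the iteration above is required.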


2013 ◽  
Vol 56 (12) ◽  
pp. 3151-3160 ◽  
Author(s):  
Fan Lu ◽  
Hao Wang ◽  
DengHua Yan ◽  
DongDong Zhang ◽  
WeiHua Xiao
