Assessment of the Uncertainties of a Conceptual Hydrologic Model by Using Artificially Generated Flows

2012 · Vol 20 (4) · pp. 35-43
Author(s):  
Peter Valent ◽  
Ján Szolgay ◽  
Carlo Riverso

Abstract. Most studies that assess the performance of various calibration techniques have to deal with a certain amount of uncertainty in the calibration data. In this study we tested HBV model calibration procedures under hypothetically ideal conditions, assuming no errors in the measured data. This was achieved by creating an artificial time series of flows generated by the HBV model using the parameters obtained from calibrating against the measured flows. The artificial flows then replaced the original flows in the calibration data, which was used to test how well the calibration procedures could reproduce the known model parameters. The results showed that in one hundred independent calibration runs of the HBV model, we did not manage to obtain parameters identical to those used to create the artificial flow data; a certain degree of uncertainty always remained. Although the calibration procedure of the model works properly from a practical point of view, this can be regarded as a demonstration of the equifinality principle, since several parameter sets were obtained that led to equally acceptable or behavioural representations of the observed flows. The study demonstrated that this concept for assessing how uncertain hydrological predictions are can be applied, using artificially generated data, in the further development of a model or the choice of a calibration method.
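The experiment can be illustrated with a deliberately simplified sketch: a toy one-bucket model stands in for HBV, artificial "observed" flows are generated with known parameters, and repeated independent calibrations (plain random search here) recover several distinct, near-equally-good parameter sets. All model equations, parameter ranges, and run counts below are illustrative, not those of the study.

```python
import random

def simulate(precip, k, fc):
    """Toy one-bucket model (a stand-in for HBV, not the real model):
    rain fills storage, anything above capacity fc spills immediately,
    and the remaining storage drains linearly at rate k (0 < k < 1)."""
    s, flows = 0.0, []
    for p in precip:
        s += p
        spill = max(0.0, s - fc)
        s = min(s, fc)
        q = k * s
        s -= q
        flows.append(spill + q)
    return flows

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1.0 means a perfect fit."""
    mean = sum(obs) / len(obs)
    err = sum((o - m) ** 2 for o, m in zip(obs, sim))
    var = sum((o - mean) ** 2 for o in obs)
    return 1.0 - err / var

# Artificial "observed" flows created with known true parameters,
# i.e. calibration data with no measurement error at all.
rng = random.Random(42)
precip = [rng.expovariate(0.5) for _ in range(200)]
true_k, true_fc = 0.3, 5.0
obs = simulate(precip, true_k, true_fc)

# Several independent calibration runs (random search, kept small for brevity).
best = []
for run in range(30):
    r = random.Random(run)
    cands = [(r.uniform(0.05, 0.9), r.uniform(1.0, 10.0)) for _ in range(150)]
    best.append(max(cands, key=lambda p: nse(obs, simulate(precip, *p))))

# Distinct parameter sets reproduce the flows almost equally well:
# equifinality, even with error-free calibration data.
scores = [nse(obs, simulate(precip, k, fc)) for k, fc in best]
```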

2014 · Vol 14 (1) · pp. 16-24
Author(s):  
K. Y. You ◽  
Z. Abbas ◽  
M. F. A. Malek ◽  
E. M. Cheng

Abstract. This paper focuses on the calibration of apertures for rectangular waveguides using open-short-load (OSL) standards and transmission-line (TL) approaches. The reflection coefficients that were measured using both calibration techniques were compared with the coefficients acquired using the thru-reflect-line (TRL) method. In this study, analogous relationships between the results of OSL calibration and TL calibration were identified. In the OSL calibration method, the theoretical open-standard values are calculated from quasi-static integral models. The proposed TL calibration procedure is a simple, rapid, broadband approach, and its results were validated by using the OSL calibration method and by comparing the results with the calculated integral admittance. The quasi-static integral models were used to convert the measured reflection coefficients to relative permittivities for the infinite samples and the thin, finite samples.
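The OSL approach rests on the standard one-port error model. A minimal sketch follows, assuming ideal standards (Γ = +1, −1, 0) rather than the quasi-static integral models the paper uses for the open standard; the error-term values are made up for the round-trip check.

```python
# Classic one-port error model: a measured reflection coefficient m relates
# to the true coefficient g through directivity Ed, source match Es and
# reflection tracking Er:  m = Ed + Er*g / (1 - Es*g).
# Measuring three known standards (open g=+1, short g=-1, load g=0)
# gives three equations for the three error terms.

def solve_error_terms(m_open, m_short, m_load):
    Ed = m_load                  # load (g = 0) measures directivity directly
    A = m_open - Ed              # = Er / (1 - Es)
    B = m_short - Ed             # = -Er / (1 + Es)
    Es = (A + B) / (A - B)
    Er = A * (1 - Es)
    return Ed, Es, Er

def correct(m, Ed, Es, Er):
    """Invert the error model to recover the true reflection coefficient."""
    return (m - Ed) / (Er + Es * (m - Ed))

# Round trip with illustrative (made-up) error terms:
Ed, Es, Er = 0.05 + 0.02j, 0.10 - 0.03j, 0.95 + 0.05j

def measure(g):  # forward error model
    return Ed + Er * g / (1 - Es * g)

terms = solve_error_terms(measure(1.0), measure(-1.0), measure(0.0))
g_true = 0.4 - 0.2j
g_recovered = correct(measure(g_true), *terms)
```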


2000 · Vol 54 (4) · pp. 608-623
Author(s):  
Vítézslav Centner ◽  
Jorge Verdú-Andrés ◽  
Beata Walczak ◽  
Delphine Jouan-Rimbaud ◽  
Frédéric Despagne ◽  
...  

The present study compares the performance of different multivariate calibration techniques applied to four near-infrared data sets when the test samples are well within the calibration domain. Three types of problems are discussed: nonlinear calibration, calibration using heterogeneous data sets, and calibration in the presence of irrelevant information in the set of predictors. Recommendations derived from the comparison should help guide a non-chemometrician through the selection of an appropriate calibration method for a particular type of calibration data, and a flexible methodology is proposed to support that selection for a given calibration problem.
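The idea of selecting a calibration technique by validating candidates on held-out samples can be sketched as a holdout comparison. The data, candidate models (straight line vs. log-log power law), and noise level below are illustrative stand-ins for the paper's NIR data sets and chemometric methods.

```python
import math, random

def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return lambda x: a + b * x

def fit_power(xs, ys):
    """Power law y = c*x^p, fitted as a line in log-log space."""
    f = fit_linear([math.log(x) for x in xs], [math.log(y) for y in ys])
    return lambda x: math.exp(f(math.log(x)))

def rmse(model, xs, ys):
    return math.sqrt(sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs))

# Synthetic nonlinear data: response ~ concentration^1.5 with small noise.
rng = random.Random(0)
x = [0.5 + 0.05 * i for i in range(60)]
y = [2.0 * xi ** 1.5 * math.exp(rng.gauss(0, 0.02)) for xi in x]

train_x, train_y = x[::2], y[::2]   # every other sample for calibration
test_x, test_y = x[1::2], y[1::2]   # held-out samples for validation

candidates = {"linear": fit_linear, "power": fit_power}
scores = {name: rmse(fit(train_x, train_y), test_x, test_y)
          for name, fit in candidates.items()}
chosen = min(scores, key=scores.get)  # pick the technique that validates best
```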


2021
Author(s):  
Jared Smith ◽  
Laurence Lin ◽  
Julianne Quinn ◽  
Lawrence Band

Urban land expansion is expected in our changing world and, left unmitigated, will result in increased flooding and nutrient exports that already wreak havoc on the wellbeing of coupled human-natural systems worldwide. Reforestation of urbanized catchments is one green infrastructure strategy to reduce stormwater volumes and nutrient exports. Reforestation designs must balance the benefits of flood flow reduction against the costs of implementation and the chance of exacerbating droughts via a reduction in the recharge that supplies low flows. Optimal locations and numbers of trees depend on the spatial distribution of runoff and streamflow in a catchment; however, calibration data are often only available at the catchment outlet. Equifinal model parameterizations for the outlet can result in uncertainty in the locations and magnitudes of streamflows across the catchment, which can lead to different optimal reforestation designs for different parameterizations.

Multi-objective robust optimization (MORO) has been proposed to discover reforestation designs that are robust to such parametric model uncertainty. However, it has not been shown that this actually results in better decisions than optimizing to a single, most likely parameter set, which would be less computationally expensive. In this work, the utility of MORO is assessed by comparing reforestation designs optimized using these two approaches with reforestation designs optimized to a synthetic true set of hydrologic model parameters. The spatially distributed RHESSys ecohydrological model is employed for this study of a suburban-forested catchment in Baltimore County, Maryland, USA. Calibration of the model's critical parameters is completed using a Bayesian framework to estimate the joint posterior distribution of the parameters. The Bayesian framework estimates the probability that different parameterizations generated the synthetic streamflow data, allowing the MORO process to evaluate reforestation portfolios across a probability-weighted sample of parameter sets in search of solutions that are robust to this uncertainty.

Reforestation portfolios are designed to minimize flooding, low-flow intensity, and construction costs (number of trees). Comparing the Pareto front obtained from MORO with the Pareto fronts obtained from optimizing to the estimated maximum a posteriori (MAP) parameter set and to the synthetic true parameter set, we find that MORO solutions are closer to the synthetic solutions than are MAP solutions. This illustrates the value of considering parametric uncertainty in designing robust water systems despite the additional computational cost.
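The contrast between optimizing against the MAP parameter set alone and optimizing a probability-weighted (MORO-style) objective can be shown with a toy example. The designs, parameter sets, posterior weights, and scores below are entirely made up for illustration; lower score is better (e.g. a scalarized flood-plus-cost objective).

```python
# Posterior probabilities of three parameter sets (made-up numbers).
posterior = {"theta_MAP": 0.5, "theta_2": 0.3, "theta_3": 0.2}

scores = {  # scores[design][parameter set]; lower is better
    "design_A": {"theta_MAP": 1.0, "theta_2": 9.0, "theta_3": 9.0},
    "design_B": {"theta_MAP": 2.0, "theta_2": 2.5, "theta_3": 2.5},
    "design_C": {"theta_MAP": 3.0, "theta_2": 3.0, "theta_3": 3.0},
}

def map_choice(scores):
    """Optimize against the single most probable parameter set only."""
    return min(scores, key=lambda d: scores[d]["theta_MAP"])

def robust_choice(scores, posterior):
    """Probability-weighted expected score across parameter sets."""
    expected = {d: sum(posterior[t] * s[t] for t in posterior)
                for d, s in scores.items()}
    return min(expected, key=expected.get)

# design_A looks best under the MAP parameters alone, but design_B is
# better once parametric uncertainty is weighted in.
```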


2016 · Vol 20 (5) · pp. 1925-1946
Author(s):  
Nikolaj Kruse Christensen ◽  
Steen Christensen ◽  
Ty Paul A. Ferre

Abstract. Although geophysics is used increasingly, it is often unclear how and when the integration of geophysical data and models can best improve the construction and predictive capability of groundwater models. This paper uses a newly developed HYdrogeophysical TEst-Bench (HYTEB), a collection of geological, groundwater and geophysical modeling and inversion software, to demonstrate alternative uses of electromagnetic (EM) data for groundwater modeling in a hydrogeological environment consisting of various types of glacial deposits with typical hydraulic conductivities and electrical resistivities covering impermeable bedrock with low resistivity (clay). The synthetic 3-D reference system is designed so that there is a perfect relationship between hydraulic conductivity and electrical resistivity. For this system it is investigated to what extent groundwater model calibration and, often more importantly, model predictions can be improved by including in the calibration process electrical resistivity estimates obtained from TEM data. In all calibration cases, the hydraulic conductivity field is highly parameterized and the estimation is stabilized by (in most cases) geophysics-based regularization. For the studied system and inversion approaches it is found that resistivities estimated by sequential hydrogeophysical inversion (SHI) or joint hydrogeophysical inversion (JHI) should be used with caution as estimators of hydraulic conductivity or as regularization means for subsequent hydrological inversion. The limited groundwater model improvement obtained by using the geophysical data probably mainly arises from the way these data are used here: the alternative inversion approaches propagate geophysical estimation errors into the hydrologic model parameters. It was expected that JHI would compensate for this, but the hydrologic data were apparently insufficient to secure such compensation.
With respect to reducing model prediction error, it depends on the type of prediction whether it has value to include geophysics in a joint or sequential hydrogeophysical model calibration. It is found that all calibrated models are good predictors of hydraulic head. When the stress situation is changed from that of the hydrologic calibration data, then all models make biased predictions of head change. All calibrated models turn out to be very poor predictors of the pumping well's recharge area and groundwater age. The reason for this is that distributed recharge is parameterized as depending on estimated hydraulic conductivity of the upper model layer, which tends to be underestimated. Another important insight from our analysis is thus that either recharge should be parameterized and estimated in a different way, or other types of data should be added to better constrain the recharge estimates.


2015 · Vol 12 (9) · pp. 9599-9653
Author(s):  
N. K. Christensen ◽  
S. Christensen ◽  
T. P. A. Ferre

Abstract. Although geophysics is being used increasingly, it is still unclear how and when the integration of geophysical data improves the construction and predictive capability of groundwater models. Therefore, this paper presents a newly developed HYdrogeophysical TEst-Bench (HYTEB), which is a collection of geological, groundwater and geophysical modeling and inversion software wrapped to make a platform for the generation and consideration of multi-modal data for objective hydrologic analysis. It is intentionally flexible to allow for simple or sophisticated treatments of geophysical responses, hydrologic processes, parameterization, and inversion approaches. It can also be used to discover potential errors that can be introduced through petrophysical models and approaches to correlating geophysical and hydrologic parameters. With HYTEB we study alternative uses of electromagnetic (EM) data for groundwater modeling in a hydrogeological environment consisting of various types of glacial deposits with typical hydraulic conductivities and electrical resistivities covering impermeable bedrock with low resistivity. It is investigated to what extent groundwater model calibration and, often more importantly, model predictions can be improved by including in the calibration process electrical resistivity estimates obtained from TEM data. In all calibration cases, the hydraulic conductivity field is highly parameterized and the estimation is stabilized by regularization. For purely hydrologic inversion (HI, using only hydrologic data) we used Tikhonov regularization combined with singular value decomposition. For joint hydrogeophysical inversion (JHI) and sequential hydrogeophysical inversion (SHI) the resistivity estimates from TEM are used together with a petrophysical relationship to formulate the regularization term. In all cases, the regularization stabilizes the inversion, but neither the HI nor the JHI objective function could be minimized uniquely.
SHI or JHI with regularization based on the use of TEM data produced estimated hydraulic conductivity fields that bear more resemblance to the reference fields than HI with Tikhonov regularization does. However, for the studied system the resistivities estimated by SHI or JHI must be used with caution as estimators of hydraulic conductivity or as regularization means for subsequent hydrological inversion. Much of the lack of value of the geophysical data arises from a mistaken faith in the power of the petrophysical model in combination with geophysical data of low sensitivity, thereby propagating geophysical estimation errors into the hydrologic model parameters. With respect to reducing model prediction error, whether it has value to include geophysical data in the model calibration depends on the type of prediction. It is found that all calibrated models are good predictors of hydraulic head. When the stress situation is changed from that of the hydrologic calibration data, all models make biased predictions of head change. All calibrated models turn out to be very poor predictors of the pumping well's recharge area and groundwater age. The reason for this is that distributed recharge is parameterized as depending on the estimated hydraulic conductivity of the upper model layer, which tends to be underestimated. Another important insight from the HYTEB analysis is thus that either recharge should be parameterized and estimated in a different way, or other types of data should be added to better constrain the recharge estimates.
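The stabilizing role of Tikhonov regularization in an ill-posed inversion can be illustrated on a deliberately tiny problem: a nearly singular 2×2 system stands in for the highly parameterized hydraulic conductivity estimation, and all numbers are made up.

```python
# Ill-posed 2x2 linear inverse problem A x = b (nearly collinear rows),
# solved with and without Tikhonov regularization:
#   x_reg = (A^T A + lam * I)^(-1) A^T b

def solve2(M, v):
    """Solve a 2x2 system M x = v by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(v[0] * M[1][1] - v[1] * M[0][1]) / det,
            (M[0][0] * v[1] - M[1][0] * v[0]) / det]

def tikhonov(A, b, lam):
    AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) + (lam if i == j else 0.0)
            for j in range(2)] for i in range(2)]
    Atb = [sum(A[k][i] * b[k] for k in range(2)) for i in range(2)]
    return solve2(AtA, Atb)

A = [[1.0, 1.0], [1.0, 1.001]]        # nearly singular: ill-posed
x_true = [1.0, 1.0]                    # "reference field"
b_noisy = [2.0 + 1e-3, 2.001 - 1e-3]   # data with a tiny error

x_ls = tikhonov(A, b_noisy, 0.0)      # unregularized estimate blows up
x_reg = tikhonov(A, b_noisy, 1e-3)    # regularization stabilizes the estimate

err_ls = abs(x_ls[0] - 1.0) + abs(x_ls[1] - 1.0)
err_reg = abs(x_reg[0] - 1.0) + abs(x_reg[1] - 1.0)
```

The tiny data error is amplified into a large parameter error without regularization, while the regularized estimate stays near the true solution; the price, as in the paper, is that the regularized objective no longer identifies the parameters uniquely from the data alone.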


2006 · Vol 3 (6) · pp. 3691-3726
Author(s):  
A. Bárdossy ◽  
T. Das

Abstract. The objective of this study is to investigate the influence of the spatial resolution of the rainfall input on model calibration and application. The analysis is carried out by varying the distribution of the raingauge network. The semi-distributed HBV model is calibrated with precipitation interpolated from the available observed rainfall of the different raingauge networks. An automatic calibration method based on the combinatorial optimization algorithm simulated annealing is applied. Aggregated Nash-Sutcliffe coefficients at different temporal scales are adopted as the objective function to estimate the model parameters. The performance of the hydrological model is analyzed as a function of the raingauge density. The calibrated model is validated using the same precipitation used for the calibration as well as interpolated precipitation based on networks of reduced and increased raingauge density. The effect of missing rainfall data is investigated by using a multiple linear regression approach for filling the missing values. The model, calibrated with the complete set of observed data, is then run in the validation period using the precipitation fields described above. The simulated hydrographs obtained in the three sets of experiments are analyzed through comparisons of the computed Nash-Sutcliffe coefficient and several goodness-of-fit indexes. The results show that a model using different raingauge networks might need recalibration of the model parameters: a model calibrated on sparse information might perform well on dense information, while a model calibrated on dense information fails on sparse information. Also, the model calibrated with the complete set of observed precipitation and run with incomplete observed data, supplemented at the locations treated as missing measurements with values estimated using multiple linear regression, performs well. A meso-scale catchment located in the south-west of Germany has been selected for this study.
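An aggregated Nash-Sutcliffe objective can be sketched directly: compute the coefficient at the native time step and on block sums at coarser scales, then combine. The aggregation widths and weights below are illustrative, not those used in the study.

```python
def nse(obs, sim):
    """Nash-Sutcliffe coefficient: 1.0 is a perfect fit."""
    mean = sum(obs) / len(obs)
    return 1.0 - sum((o - s) ** 2 for o, s in zip(obs, sim)) / \
                 sum((o - mean) ** 2 for o in obs)

def aggregate(series, width):
    """Sum a series over non-overlapping blocks of `width` steps."""
    return [sum(series[i:i + width])
            for i in range(0, len(series) - width + 1, width)]

def aggregated_objective(obs, sim, widths=(1, 7, 30), weights=(0.5, 0.3, 0.2)):
    """Weighted combination of NSE at several temporal scales, e.g. daily,
    weekly and monthly sums (widths and weights here are illustrative)."""
    return sum(w * nse(aggregate(obs, k), aggregate(sim, k))
               for k, w in zip(widths, weights))

# Demo on synthetic series: a perfect simulation scores 1, a biased one less.
obs = [float(i) for i in range(90)]
shifted = [o + 1.0 for o in obs]
perfect_score = aggregated_objective(obs, obs)
shifted_score = aggregated_objective(obs, shifted)
```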


2008 · Vol 10 (1) · pp. 97-111
Author(s):  
Mohamad I. Hejazi ◽  
Ximing Cai ◽  
Deva K. Borah

We calibrate a storm-event distributed hydrologic model to a watershed in which runoff is significantly affected by reservoir storage and release, using a multi-objective genetic algorithm (NSGA-II). This paper addresses the following questions: What forms of the objective (fitness) function used in the optimization model will result in a better calibration? How does the error in reservoir release caused by neglected human interference or an imprecise storage-release function affect the calibration? Reservoir release is studied as a specific (and common) form of human interference. Two procedures for handling reservoir releases are tested and compared: (1) treating reservoir releases as solely determined by the hydraulic structure (predefined storage- or stage-discharge relations), as if these were perfect, a procedure usually adopted in watershed model calibration; or (2) adding to the releases determined by the storage-discharge relation an error term. The error term encompasses time-variant human interference and discharge-function error, and is determined through an optimization-based calibration procedure. It is found that the calibration procedure that accounts for human interference results not only in a better match between modeled and observed hydrographs, but also in more reasonable model parameters in terms of their spatial distribution and the robustness of the parameter values.
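The two release-handling procedures can be contrasted with a toy sketch, assuming a linear storage-discharge rule (release = c·storage) and a made-up interference pattern; the paper's actual NSGA-II calibration is not reproduced here.

```python
def releases_rule_only(storages, c):
    """Procedure (1): trust the storage-discharge relation as if perfect."""
    return [c * s for s in storages]

def estimate_error_term(storages, observed, c):
    """Procedure (2): per-step residual capturing time-variant human
    interference and storage-discharge relation error."""
    return [q - c * s for s, q in zip(storages, observed)]

# Synthetic reservoir: the operator overrides the rule on some days.
storages = [10.0, 12.0, 9.0, 11.0, 10.0]
c = 0.2
interference = [0.0, 2.0, 0.0, 2.0, 0.0]   # made-up override pattern
observed = [c * s + e for s, e in zip(storages, interference)]

rule_only = releases_rule_only(storages, c)        # misses the overrides
error_term = estimate_error_term(storages, observed, c)  # recovers them
```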


Author(s):  
D. W. Beardsmore ◽  
H. Teng ◽  
Michael Martin

We present the detailed results of a series of Monte Carlo simulations of the Gao and Dodds calibration procedure, carried out to determine the likely size of the errors in the Beremin cleavage model parameter estimates that might be expected for fracture toughness data sets of various sizes. The calibration process was carried out a large number of times using different sample sizes, and the mean values and standard errors of the parameter estimates were determined. Modified boundary layer finite element models were used to represent high- and low-constraint conditions (as in the fracture tests) as well as the SSY condition. The "experimental" Jc values were obtained numerically by random sampling of a Beremin distribution function with known values of the true parameters. A number of cautionary remarks on the application of the calibration method are made.
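The structure of such a Monte Carlo study can be sketched with a one-parameter stand-in: sample "experimental" values from a Weibull distribution with known true parameters, repeat the calibration many times per sample size, and record the scatter of the estimates. The closed-form scale estimator with known shape replaces the actual Beremin/Gao-Dodds calibration, and all numbers are illustrative.

```python
import random

def estimate_scale(samples, m):
    """Closed-form MLE of the Weibull scale when the shape (modulus) m is
    known: sigma_hat = (mean of x^m) ** (1/m)."""
    return (sum(x ** m for x in samples) / len(samples)) ** (1.0 / m)

true_scale, m = 100.0, 4.0   # illustrative "true" parameters
rng = random.Random(7)

def one_calibration(n):
    """Draw n synthetic toughness values and re-estimate the scale."""
    data = [rng.weibullvariate(true_scale, m) for _ in range(n)]
    return estimate_scale(data, m)

def scatter(n, reps=200):
    """Mean and standard deviation of the estimate over many repeats."""
    ests = [one_calibration(n) for _ in range(reps)]
    mean = sum(ests) / reps
    sd = (sum((e - mean) ** 2 for e in ests) / reps) ** 0.5
    return mean, sd

mean_small, sd_small = scatter(10)    # small data set: large scatter
mean_large, sd_large = scatter(200)   # large data set: small scatter
```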


2012 · Vol 16 (2) · pp. 603-629
Author(s):  
T. Krauße ◽  
J. Cullmann

Abstract. The development of methods for estimating the parameters of hydrologic models while considering uncertainties has been of high interest in hydrologic research in recent years. In particular, methods that treat the estimation of hydrologic model parameters as a geometric search for a set of robustly performing parameter vectors, by application of the concept of data depth, have attracted growing research interest. Bárdossy and Singh (2008) presented a first Robust Parameter Estimation method (ROPE) and applied it to the calibration of a conceptual rainfall-runoff model with a daily time step. The basic idea of this algorithm is to identify a set of model parameter vectors with high model performance, called good parameters, and subsequently generate a set of parameter vectors with high data depth with respect to the first set. Both steps are repeated iteratively until a stopping criterion is met. The results of that case study show the high potential of the principle of data depth for the estimation of hydrologic model parameters. In this paper we present further developments that address the most important shortcomings of the original ROPE approach. We developed a stratified depth-based sampling approach that improves the sampling from non-elliptic and multi-modal distributions and provides higher efficiency for the sampling of deep points in parameter spaces of higher dimensionality. Another modification addresses the problem of too strong a shrinking of the estimated set of robust parameter vectors, which might lead to overfitting when a model is calibrated with a small amount of calibration data, contradicting the principle of robustness. We therefore suggest splitting the available calibration data into two sets and using one set to control the overfitting. All modifications were implemented in a further developed ROPE approach called Advanced Robust Parameter Estimation (AROPE).
However, in this approach the estimation of the good parameters is still based on an inefficient Monte Carlo approach. We therefore developed another approach, called ROPE with Particle Swarm Optimisation (ROPE-PSO), that substitutes the Monte Carlo step with a more effective and efficient approach based on Particle Swarm Optimisation. Two case studies demonstrate the improvements of the developed algorithms over the first ROPE approach and two other classical optimisation approaches when calibrating a process-oriented hydrologic model with an hourly time step. The focus of both case studies is on modelling flood events in a small catchment characterised by extreme process dynamics. The calibration problem was also repeated with higher dimensionality, considering the uncertainty in the soil hydraulic parameters and another conceptual parameter of the soil module. We discuss the estimated results and propose further possibilities for applying ROPE as a well-founded parameter estimation and uncertainty analysis tool.
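The two-step ROPE idea (find good parameters, then sample deep within them) can be sketched with a toy model. Random convex combinations stand in, crudely, for proper half-space data-depth sampling, and the model, parameter ranges, and fractions below are made up.

```python
import random

def performance(params, xs, ys):
    """Toy model q = a*x + b; higher (less negative SSE) is better."""
    a, b = params
    return -sum((a * x + b - y) ** 2 for x, y in zip(xs, ys))

def deep_points(good, n, rng):
    """Cheap surrogate for data-depth sampling: random convex combinations
    of the good set lie well inside its hull (real ROPE uses half-space
    depth, which is far more discriminating)."""
    pts = []
    for _ in range(n):
        w = [rng.random() for _ in good]
        t = sum(w)
        pts.append(tuple(sum(wi / t * g[i] for wi, g in zip(w, good))
                         for i in range(2)))
    return pts

rng = random.Random(3)
xs = [float(i) for i in range(20)]
ys = [2.0 * x + 1.0 for x in xs]          # truth: a=2, b=1 (noise-free)

# Step 1: sample widely, keep the best-performing 10% ("good" parameters).
sample = [(rng.uniform(0, 4), rng.uniform(-3, 5)) for _ in range(500)]
sample.sort(key=lambda p: performance(p, xs, ys), reverse=True)
good = sample[:50]

# Step 2: resample deep inside the good set; the full algorithm iterates
# both steps until a stopping criterion is met.
robust = deep_points(good, 50, rng)

a_lo, a_hi = min(g[0] for g in good), max(g[0] for g in good)
b_lo, b_hi = min(g[1] for g in good), max(g[1] for g in good)
```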


2008 · Vol 12 (1) · pp. 77-89
Author(s):  
A. Bárdossy ◽  
T. Das

Abstract. The objective of this study is to investigate the influence of the spatial resolution of the rainfall input on model calibration and application. The analysis is carried out by varying the distribution of the raingauge network. A meso-scale catchment located in southwest Germany has been selected for this study. First, the semi-distributed HBV model is calibrated with precipitation interpolated from the available observed rainfall of the different raingauge networks. An automatic calibration method based on the combinatorial optimization algorithm simulated annealing is applied. The performance of the hydrological model is analyzed as a function of the raingauge density. Secondly, the calibrated model is validated using interpolated precipitation from the same raingauge density used for the calibration as well as interpolated precipitation based on networks of reduced and increased raingauge density. Lastly, the effect of missing rainfall data is investigated by using a multiple linear regression approach for filling in the missing measurements. The model, calibrated with the complete set of observed data, is then run in the validation period using the precipitation fields described above. The simulated hydrographs obtained in the three sets of experiments are analyzed through comparisons of the computed Nash-Sutcliffe coefficient and several goodness-of-fit indexes. The results show that a model using different raingauge networks might need re-calibration of the model parameters; specifically, a model calibrated on relatively sparse precipitation information might perform well given dense precipitation information, while a model calibrated on dense precipitation information fails on sparse precipitation information. Also, the model calibrated with the complete set of observed precipitation and run with incomplete observed data, supplemented at the locations treated as missing measurements with values estimated using multiple linear regression, performs well.
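The gap-filling step can be sketched in a one-neighbour form (the study regresses on several gauges at once); the gauge values below are made up, with the target gauge exactly linear in its neighbour so the fill is easy to verify.

```python
def fit_line(xs, ys):
    """Least squares y = a + b*x; a one-predictor version of the paper's
    multiple linear regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def fill_missing(target, neighbour):
    """Fill None entries in `target` from a regression on `neighbour`,
    fitted over the jointly observed time steps."""
    pairs = [(x, y) for x, y in zip(neighbour, target) if y is not None]
    a, b = fit_line([p[0] for p in pairs], [p[1] for p in pairs])
    return [y if y is not None else a + b * x
            for x, y in zip(neighbour, target)]

neighbour = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
target = [2.5, 4.5, None, 8.5, None, 12.5]   # gauge with gaps; truth: 2x + 0.5
filled = fill_missing(target, neighbour)
```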

