Probabilistic interval estimation of design floods under non-stationary conditions by an integrated approach

2021 ◽  
Author(s):  
Yanlai Zhou ◽  
Shenglian Guo ◽  
Chong-Yu Xu ◽  
Lihua Xiong ◽  
Hua Chen ◽  
...  

Abstract Quantifying the uncertainty of non-stationary flood frequency analysis is crucial for the planning and design of water engineering projects, yet fundamentally challenging, especially in the presence of high climate variability and reservoir regulation. This study proposed an integrated approach that combines the Generalized Additive Models for Location, Scale and Shape (GAMLSS) method, the Copula function and the Bayesian Uncertainty Processor (BUP) technique to produce reliable probabilistic interval estimates of design floods. The reliability and applicability of the proposed approach were assessed using flood datasets collected from two hydrological monitoring stations on the Hanjiang River in China. Precipitation and the reservoir index were selected as the explanatory variables for modeling the time-varying parameters of the marginal and joint distributions, using long-term (1954–2018) observed datasets. First, the GAMLSS method was employed to model and fit the time-varying characteristics of the parameters of the marginal and joint distributions. Second, the Copula function was employed to make point estimates of non-stationary design floods. Finally, the BUP technique was employed to derive interval estimates of design floods from the point estimates obtained with the Copula function. The results demonstrated that the proposed approach can provide reliable probabilistic interval estimates of design floods while reducing the uncertainty of non-stationary flood frequency analysis. The integrated approach is therefore a promising way to indicate how design values can be estimated in a high-dimensional problem.
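The GAMLSS step above amounts to letting distribution parameters vary with covariates. As a minimal, hypothetical sketch (not the authors' implementation), the following fits a GEV distribution to synthetic annual floods whose location parameter depends linearly on a precipitation covariate, by maximum likelihood; all data and coefficients are invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(42)
n = 65                                          # e.g. one value per year, 1954-2018
precip = rng.normal(1000.0, 150.0, n)           # invented annual precipitation (mm)
true_loc = 200.0 + 0.3 * (precip - 1000.0)      # GEV location drifts with precipitation
floods = genextreme.rvs(c=-0.1, loc=true_loc, scale=40.0, random_state=rng)

def nll(theta):
    """Negative log-likelihood of a GEV whose location is linear in precipitation."""
    b0, b1, log_scale, shape = theta
    loc = b0 + b1 * (precip - 1000.0)
    return -genextreme.logpdf(floods, c=shape, loc=loc, scale=np.exp(log_scale)).sum()

res = minimize(nll, x0=[floods.mean(), 0.0, np.log(floods.std()), -0.1],
               method="Nelder-Mead", options={"maxiter": 5000, "fatol": 1e-8})
b0, b1, log_scale, shape = res.x                # b1 should recover the drift (~0.3 here)
```

A full GAMLSS replaces the linear term with smooth functions and lets the scale and shape vary as well; the copula and BUP steps described in the abstract would then operate on such fitted time-varying marginals.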

2021 ◽  
Author(s):  
Anne Fangmann ◽  
Uwe Haberlandt

Flood frequency analysis (FFA) has long been the standard procedure for obtaining design floods for all kinds of purposes. Ideally, the data at the basis of the statistical operations have a high temporal resolution, in order to facilitate a full account of the observed flood peaks and hence a precise model fitting and flood quantile estimation.

Unfortunately, high-resolution flows are rarely available. Often, average daily flows are the only available, or only sufficiently long, basis for flood frequency analysis. This averaging naturally causes a significant smoothing of the flood wave, such that the “instantaneous” peak can no longer be observed. As a possible consequence, design floods derived from these data may be severely underestimated.

How strongly the original peaks are flattened, and how this influences the design flood estimation, depends on a variety of factors and varies from gauge to gauge. In this study we examine a range of errors arising from the use of daily instead of instantaneous flow data. These include differences in the observed individual flood peaks and mean annual maximum floods, as well as in the estimated distribution parameters and flood quantiles. The aim is to identify catchment-specific factors that influence the magnitude of these errors, and ultimately to provide a means for error assessment on the mere basis of local hydrological conditions, specifically where no high-resolution data are available.

The analyses are carried out on an all-German dataset of discharge gauges for which high-resolution data are available for at least 30 years. The classical FFA approach of fitting distributions to annual maximum series is utilized for error assessment. For identification of influencing factors, both the discharge series themselves and a catalogue of climatic and physiographic catchment descriptors are screened.
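The smoothing effect described above can be illustrated with a toy experiment (not part of the study): a synthetic hourly flood wave is averaged to daily values, and the daily maximum is compared with the instantaneous (hourly) peak. The hydrograph shape and numbers are invented:

```python
import numpy as np

hours = np.arange(96)                  # four days of hourly flows
peak_hour, base = 30, 50.0
# invented fast-responding flood wave: Gaussian pulse over a baseflow
hydrograph = base + 400.0 * np.exp(-0.5 * ((hours - peak_hour) / 4.0) ** 2)

inst_peak = hydrograph.max()                                # "instantaneous" peak
daily_peak = hydrograph.reshape(4, 24).mean(axis=1).max()   # max of daily mean flows
peak_ratio = daily_peak / inst_peak                         # < 1: peak is underestimated
```

For this sharp synthetic wave, the maximum daily mean captures only about half of the instantaneous peak; for slow-responding catchments the ratio approaches one, which is why catchment-specific descriptors are candidate predictors of the error.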


Water ◽  
2019 ◽  
Vol 11 (3) ◽  
pp. 475 ◽  
Author(s):  
Ting Zhou ◽  
Zhiyong Liu ◽  
Juliang Jin ◽  
Hongxiang Hu

Flood frequency analysis plays a fundamental role in dam planning, reservoir operation, and risk assessment. However, conventional univariate flood frequency analysis, carried out on flood peak inflow or volume alone, does not account for the dependence between flood properties. In this paper, we proposed an integrated approach to estimating reservoir risk that combines copula-based bivariate flood frequency analysis (peak and volume) with reservoir routing. By investigating the chain of “flood frequency - reservoir operation - flood risk”, this paper demonstrated how to simulate flood hydrographs under different frequency definitions (the copula “Or” and “And” scenarios), and how these definitions affect flood risks. The approach was applied to the Meishan reservoir in central China. Sets of flood hydrographs with a 0.01 exceedance frequency under the copula “Or” and “And” definitions were constructed, respectively. Upstream and downstream flood risks incorporating reservoir operation were calculated for each scenario. Comparisons between the flood risks from univariate and bivariate flood frequency analysis showed that the bivariate analysis produced less variability in the results, making them more reliable for risk assessment. More importantly, the peak-volume combinations in the bivariate approach can be adjusted to a given prediction accuracy, providing a flexible estimate of real-time flood risk under different prediction accuracies and safety requirements.
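The copula “Or” and “And” definitions correspond to two different joint exceedance probabilities. A minimal sketch, assuming a Gumbel-Hougaard copula with an invented dependence parameter (the abstract does not name the copula family used), shows how the two return periods diverge for the same marginal quantiles:

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 sets the peak-volume dependence."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

theta = 2.5        # invented dependence between flood peak and volume
u = v = 0.99       # marginal non-exceedance probability (100-year peak and volume)
C = gumbel_copula(u, v, theta)

T_or = 1.0 / (1.0 - C)             # "Or": peak OR volume exceeds its 100-year value
T_and = 1.0 / (1.0 - u - v + C)    # "And": peak AND volume exceed their 100-year values
```

The “Or” event is necessarily more frequent than either univariate event, and the “And” event rarer, so hydrographs constructed under the two definitions load the reservoir differently.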


Water ◽  
2018 ◽  
Vol 10 (7) ◽  
pp. 819 ◽  
Author(s):  
Ting Zhang ◽  
Yixuan Wang ◽  
Bing Wang ◽  
Senming Tan ◽  
Ping Feng

2015 ◽  
Vol 16 (4) ◽  
pp. 1561-1574 ◽  
Author(s):  
Martin Durocher ◽  
Fateh Chebana ◽  
Taha B. M. J. Ouarda

Abstract This paper presents an approach for regional flood frequency analysis (RFFA) in the presence of nonlinearity and problematic stations, which require adapted methodologies. To this end, the projection pursuit regression (PPR) is proposed. The PPR is a family of regression models that applies smooth functions on intermediate predictors to fit complex patterns. The PPR approach can be seen as a hybrid method between the generalized additive model (GAM) and the artificial neural network (ANN), which combines the advantages of both methods. Indeed, the PPR approach has the structure of a GAM to describe nonlinear relations between hydrological variables and other basin characteristics. On the other hand, PPR can consider interactions between basin characteristics to improve the predictive capabilities in a similar way to ANN, but simpler. The methodology developed in the present study is applied to a case study represented by hydrometric stations from southern Québec, Canada. It is shown that flood quantiles are mostly associated with a dominant intermediate predictor, which provides a parsimonious representation of the nonlinearity in the flood-generating processes. The model performance is compared to eight other methods available in the literature for the same dataset, including GAM and ANN. When using the same basin characteristics, the results indicate that the simpler structure of PPR does not affect the global performance and that PPR is competitive with the best existing methods in RFFA. Particular attention is also given to the performance resulting from the choice of the basin characteristics and the presence of problematic stations.
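The single-index structure of a PPR term described above can be sketched in a few lines: project the predictors onto a direction, smooth the projection, and optimize the direction to minimize the residuals. This toy version (not the authors' implementation) uses a cubic polynomial in place of a proper smoother and invented standardized “basin descriptors”:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # invented standardized basin descriptors
true_w = np.array([0.8, 0.6, 0.0])       # one dominant direction drives the response
y = np.sin(X @ true_w) + 0.1 * rng.normal(size=200)

def ridge_loss(w):
    """Project X onto direction w, smooth the projection, return the residual SSE."""
    z = X @ (w / (np.linalg.norm(w) + 1e-12))
    B = np.vander(z, 4)                  # cubic polynomial stands in for a spline smoother
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return float(np.sum((y - B @ coef) ** 2))

res = minimize(ridge_loss, x0=np.array([1.0, 0.0, 0.0]), method="Nelder-Mead")
w_hat = res.x / np.linalg.norm(res.x)    # recovered intermediate-predictor direction
```

A full PPR adds ridge terms greedily and refits them; here a single term with a crude smoother already recovers the dominant direction, mirroring the paper's finding that flood quantiles are mostly associated with one dominant intermediate predictor.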


Water SA ◽  
2018 ◽  
Vol 44 (3 July) ◽  
Author(s):  
JJ Nathanael ◽  
JC Smithers ◽  
MJC Horan

In engineering and flood hydrology, the estimation of a design flood associates the magnitude of a flood with a level of exceedance, or return period, for a given site. The use of a regional flood frequency analysis (RFFA) approach improves the accuracy and reliability of design flood estimates. However, no RFFA method is currently in wide use in South Africa, despite a number of RFFA studies having been undertaken in Africa whose study areas include South Africa. Hence, the performance of current RFFA approaches needs to be assessed in order to determine the best approach to use and whether a new RFFA approach needs to be developed for South Africa. A review of the relevant literature found that the Meigh et al. (1997) method, the Mkhandi et al. (2000) method, the Görgens (2007) Joint Peak-Volume (JPV) method and the Haile (2011) method are available for application in a nationwide study. The results of the study show that the Haile method generally performs better than the other RFFA methods; however, it also consistently underestimates design floods. Due to the poor overall performance of the RFFA methods assessed, it is recommended that a new RFFA method be developed for application in design flood practice in South Africa.


2020 ◽  
Author(s):  
Lei Yan ◽  
Lihua Xiong ◽  
Lingqi Li ◽  
Gusong Ruan ◽  
Chong-Yu Xu ◽  
...  

In traditional flood frequency analysis, researchers typically assume that flood events arise from a homogeneous flood population. In reality, however, flood events are likely generated by distinct flood generation mechanisms (FGMs), such as snowmelt-induced floods and rainfall-induced floods. To address this problem, the currently most popular practice for mixture modeling of flood events is to use two-component mixture distributions (TCMD) without an a priori classification into distinct FGMs, which can result in component distributions without physical reality or lead to larger standard errors of the estimated quantiles. To improve mixture distribution modeling in Norway, we first classify the flood series of 34 watersheds into snowmelt-induced long-duration floods and rainfall-induced short-duration floods based on an index named the flood timescale (FT), defined as the ratio of flood volume to peak value. A total of ten types of mixture distributions are considered in applying the FT-based TCMD to model the flood events in Norway. The results indicate that the FT-based TCMD model can reduce the uncertainty in the estimation of design floods. The improved predictive ability of the FT-based TCMD model is largely due to its explicit recognition of distinct FGMs, which enables the weighting coefficient to be determined without optimization.
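The FT-based classification fixes the mixture weight from the data rather than by optimization. A hypothetical sketch with invented flood records, a Gumbel marginal for each component (one of many candidate distributions; the study compares ten), and an arbitrary FT threshold:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import gumbel_r

rng = np.random.default_rng(7)
# invented annual floods from two mechanisms: snowmelt (long FT) and rainfall (short FT)
peaks = np.r_[gumbel_r.rvs(loc=300, scale=40, size=40, random_state=rng),
              gumbel_r.rvs(loc=200, scale=60, size=25, random_state=rng)]
ft = np.r_[rng.uniform(30, 60, 40), rng.uniform(5, 15, 25)]  # flood timescale (h), volume/peak

long_dur = ft > 20.0                 # arbitrary FT threshold separating the two FGMs
w = long_dur.mean()                  # mixture weight fixed by classification, not optimized
loc1, sc1 = gumbel_r.fit(peaks[long_dur])
loc2, sc2 = gumbel_r.fit(peaks[~long_dur])

def mix_cdf(x):
    """Two-component mixture CDF with the a priori (FT-based) weight."""
    return w * gumbel_r.cdf(x, loc1, sc1) + (1.0 - w) * gumbel_r.cdf(x, loc2, sc2)

q100 = brentq(lambda x: mix_cdf(x) - 0.99, 100.0, 2000.0)   # 100-year design flood
```

Because `w` is the observed fraction of long-duration floods, each component is fitted only to floods of its own mechanism, which is the source of the reduced quantile uncertainty reported above.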

