A stochastic event-based approach for flood estimation in catchments with mixed rainfall/snowmelt flood regimes

2018
Author(s): Valeriya Filipova, Deborah Lawrence, Thomas Skaugen

Abstract. The estimation of extreme floods is associated with high uncertainty, in part due to the limited length of streamflow records. Traditionally, flood frequency analysis or an event-based model using a single design storm has been applied. We propose here an alternative, stochastic event-based modelling approach. The stochastic PQRUT method involves a Monte Carlo procedure to simulate different combinations of initial conditions, rainfall and snowmelt, from which a distribution of flood peaks can be constructed. The stochastic PQRUT was applied for 20 small and medium-sized catchments in Norway, and the results show a good fit to the observations. A sensitivity analysis of the method indicates that the soil saturation level is less important than the rainfall input and the parameters of the PQRUT model for flood peaks with return periods higher than 100 years, and that excluding the snow routine can change the seasonality of the flood peaks. Estimates for the 100- and 1000-year return level based on the stochastic PQRUT model are compared with results for (a) statistical frequency analysis and (b) a standard implementation of the event-based PQRUT method. The differences between the estimates can be up to 200 % for some catchments, which highlights the uncertainty in these methods.
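The Monte Carlo idea behind a stochastic event-based approach can be illustrated with a minimal sketch: sample initial conditions, rainfall and snowmelt, route them through a deliberately over-simplified response function, and read return levels off the simulated peak distribution. The distributions and the runoff coefficient below are illustrative assumptions, not the calibrated PQRUT model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_events = 10_000

# Hypothetical event distributions (illustrative only, not the
# calibrated inputs of the stochastic PQRUT):
rainfall = rng.gamma(shape=2.0, scale=15.0, size=n_events)   # event rainfall (mm)
snowmelt = rng.exponential(scale=5.0, size=n_events)         # event snowmelt (mm)
saturation = rng.uniform(0.3, 1.0, size=n_events)            # initial soil saturation (-)

# Deliberately over-simplified response: effective input scaled by
# saturation and a hypothetical runoff coefficient standing in for the
# PQRUT routing (units are nominal).
flood_peaks = 0.8 * saturation * (rainfall + snowmelt)

# With one simulated event per synthetic year, the T-year return level
# is the (1 - 1/T) empirical quantile of the simulated peaks.
q100 = np.quantile(flood_peaks, 1 - 1 / 100)
q1000 = np.quantile(flood_peaks, 1 - 1 / 1000)
print(f"Q100 ~ {q100:.1f}, Q1000 ~ {q1000:.1f} (nominal units)")
```

In the full method, the response function would be the calibrated PQRUT rainfall-runoff model and the input distributions would be conditioned on season and catchment state.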

2019 · Vol 19 (1) · pp. 1-18
Author(s): Valeriya Filipova, Deborah Lawrence, Thomas Skaugen

Abstract. The estimation of extreme floods is associated with high uncertainty, in part due to the limited length of streamflow records. Traditionally, statistical flood frequency analysis and an event-based model (PQRUT) using a single design storm have been applied in Norway. We here propose a stochastic PQRUT model, as an extension of the standard application of the event-based PQRUT model, by considering different combinations of initial conditions, rainfall and snowmelt, from which a distribution of flood peaks can be constructed. The stochastic PQRUT was applied for 20 small- and medium-sized catchments in Norway and the results give good fits to observed peak-over-threshold (POT) series. A sensitivity analysis of the method indicates (a) that the soil saturation level is less important than the rainfall input and the parameters of the PQRUT model for flood peaks with return periods higher than 100 years and (b) that excluding the snow routine can change the seasonality of the flood peaks. Estimates for the 100- and 1000-year return level based on the stochastic PQRUT model are compared with results for (a) statistical frequency analysis and (b) a standard implementation of the event-based PQRUT method. The differences in flood estimates between the stochastic PQRUT and the statistical flood frequency analysis are within 50 % in most catchments. However, the differences between the stochastic PQRUT and the standard implementation of the PQRUT model are much higher, especially in catchments with a snowmelt flood regime.


2014 · Vol 14 (5) · pp. 1283-1298
Author(s): D. Lawrence, E. Paquet, J. Gailhard, A. K. Fleig

Abstract. Simulation methods for extreme flood estimation represent an important complement to statistical flood frequency analysis because a spectrum of catchment conditions potentially leading to extreme flows can be assessed. In this paper, stochastic, semi-continuous simulation is used to estimate extreme floods in three catchments located in Norway, all of which are characterised by flood regimes in which snowmelt often has a significant role. The simulations are based on SCHADEX, which couples a precipitation probabilistic model with a hydrological simulation such that an exhaustive set of catchment conditions and responses is simulated. The precipitation probabilistic model is conditioned by regional weather patterns, and a bottom–up classification procedure was used to define a set of weather patterns producing extreme precipitation in Norway. SCHADEX estimates for the 1000-year (Q1000) discharge are compared with those of several standard methods, including event-based and long-term simulations which use a single extreme precipitation sequence as input to a hydrological model, statistical flood frequency analysis based on the annual maximum series, and the GRADEX method. The comparison suggests that the combination of a precipitation probabilistic model with a long-term simulation of catchment conditions, including snowmelt, produces estimates for given return periods which are more in line with those based on statistical flood frequency analysis, as compared with the standard simulation methods, in two of the catchments. In the third case, the SCHADEX method gives higher estimates than statistical flood frequency analysis and further suggests that the seasonality of the most likely Q1000 events differs from that of the annual maximum flows. 
The semi-continuous stochastic simulation method highlights the importance of considering the joint probability of extreme precipitation, snowmelt rates and catchment saturation states when assigning return periods to floods estimated by precipitation-runoff methods. The SCHADEX methodology, as applied here, is dependent on observed discharge data for calibration of a hydrological model, and further study to extend its application to ungauged catchments would significantly enhance its versatility.


2021
Author(s): Lei Yan, Lihua Xiong, Gusong Ruan, Chong-Yu Xu, Mengjie Zhang

Abstract In traditional flood frequency analysis, a minimum of 30 observations is required to guarantee the accuracy of design results within an allowable uncertainty; however, there has been no recommendation on the required record length in NFFA (nonstationary flood frequency analysis). Therefore, this study has been carried out with three aims: (i) to evaluate the predictive capabilities of nonstationary (NS) and stationary (ST) models with varying flood record lengths; (ii) to examine the impacts of flood record lengths on the NS and ST design floods and associated uncertainties; and (iii) to recommend the probable requirements of flood record length in NFFA. To achieve these objectives, 20 stations in Norway with record lengths longer than 100 years were selected and investigated by using both GEV (generalized extreme value)-ST and GEV-NS models with a linearly varying location parameter (denoted by GEV-NS0). The results indicate that the fitting quality and predictive capabilities of GEV-NS0 outperform those of the GEV-ST models when the record length is larger than approximately 60 years for most stations, and that the stability of the GEV-ST and GEV-NS0 models improves as the record length increases. Therefore, a minimum of 60 years of flood observations is recommended for NFFA for the selected basins in Norway.
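The GEV-ST versus GEV-NS0 comparison can be sketched numerically: fit a stationary GEV and a GEV whose location parameter varies linearly with time to the same annual maximum series by maximum likelihood. The synthetic data, parameter values and starting guesses below are assumptions for illustration; the paper's estimation details may differ.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(0)
n_years = 100
t = np.arange(n_years)

# Synthetic annual maxima with a linear trend in the GEV location
# parameter (hypothetical data; shape and scale values are assumptions).
x = genextreme.rvs(c=0.1, loc=100 + 0.5 * t, scale=20, size=n_years,
                   random_state=rng)

# GEV-ST: stationary fit via scipy's built-in MLE.
c_st, loc_st, scale_st = genextreme.fit(x)

# GEV-NS0: location mu(t) = mu0 + mu1 * t, fitted by minimising the
# negative log-likelihood directly.
def nll(params):
    mu0, mu1, log_scale, c = params
    return -genextreme.logpdf(x, c=c, loc=mu0 + mu1 * t,
                              scale=np.exp(log_scale)).sum()

res = minimize(nll, x0=[x.mean(), 0.0, np.log(x.std()), 0.1],
               method="Nelder-Mead", options={"maxiter": 5000})
mu0, mu1, log_scale, c_ns = res.x
print(f"estimated location trend: {mu1:.2f} per year (true value 0.5)")
```

Model comparison between the two fits would then proceed via likelihood-ratio tests or information criteria, as is common in NFFA studies.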


2013 · Vol 1 (6) · pp. 6785-6828
Author(s): D. Lawrence, E. Paquet, J. Gailhard, A. K. Fleig

Abstract. Simulation methods for extreme flood estimation represent an important complement to statistical flood frequency analysis because a spectrum of catchment conditions potentially leading to extreme flows can be assessed. In this paper, stochastic, semi-continuous simulation is used to estimate extreme floods in three catchments located in Norway, all of which are characterised by flood regimes in which snowmelt often has a significant role. The simulations are based on SCHADEX, which couples a precipitation probabilistic model with a hydrological simulation such that an exhaustive set of catchment conditions and responses is simulated. The precipitation probabilistic model is conditioned by regional weather patterns, and a "bottom-up" classification procedure was used to define a set of weather patterns producing extreme precipitation in Norway. SCHADEX estimates for the 1000 yr (Q1000) discharge are compared with those of several standard methods, including event-based and long-term simulations which use a single extreme precipitation sequence as input to a hydrological model, with statistical flood frequency analysis based on the annual maximum series, and with the GRADEX method. The comparison suggests that the combination of a precipitation probabilistic model with a long-term simulation of catchment conditions, including snowmelt, produces estimates for given return periods which are more in line with those based on statistical flood frequency analysis, as compared with the standard simulation methods, in two of the catchments. In the third case, the SCHADEX method gives higher estimates than statistical flood frequency analysis and further suggests that the seasonality of the most likely Q1000 events differs from that of the annual maximum flows.
The semi-continuous stochastic simulation method highlights the importance of considering the joint probability of extreme precipitation, snowmelt rates and catchment saturation states when assigning return periods to floods estimated by precipitation-runoff methods. The SCHADEX methodology, as applied here, is dependent on observed discharge data for calibration of a hydrological model, and further study to extend its application to ungauged catchments would significantly enhance its versatility.


2021
Author(s): Anne Bartens, Uwe Haberlandt

Abstract. In many cases, flood frequency analysis needs to be carried out on mean daily flow (MDF) series without any available information on the instantaneous peak flow (IPF). We analyze the error of using MDFs instead of IPFs for flood quantile estimation on a German dataset and assess spatial patterns and factors that influence the deviation of MDF floods from their IPF counterparts. The main dependence was found for catchment area, but gauge elevation also appeared to have some influence. Based on these findings we propose simple linear models to correct both the MDF flood peaks of individual flood events and the overall MDF flood statistics. The key predictor in the models is the event-based ratio of flood peak to flood volume, obtained directly from the daily flow records. This correction approach requires a minimum of data input, is easily applied, is valid for the entire study area, and successfully estimates IPF peaks and flood statistics. The models perform particularly well in smaller catchments, where other IPF estimation methods fall short. Still, the limit of the approach is reached for catchment sizes below 100 km², where the hydrograph information from the daily series is no longer capable of approximating the instantaneous flood dynamics.
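A hedged sketch of the kind of linear correction described above, with the event peak/volume ratio from the daily record as the predictor. The functional form and the coefficients `a` and `b` are hypothetical placeholders, not the fitted values from the study.

```python
# Illustrative sketch only: coefficients a and b are hypothetical,
# and p_over_v stands for the event-based peak/volume ratio (1/day)
# computed from the mean daily flow (MDF) record.

def estimate_ipf(mdf_peak, p_over_v, a=1.0, b=0.8):
    """Estimate the instantaneous peak flow (IPF) from an MDF peak via a
    linear model in the daily peak/volume ratio: flashier events
    (larger ratio) receive a larger upward correction."""
    correction = 1.0 + b * p_over_v
    return a * mdf_peak * correction

# Example: a daily peak of 120 m^3/s from a flashy event
print(estimate_ipf(120.0, p_over_v=0.5))   # larger than the MDF peak
```

In practice the coefficients would be fitted by regressing observed IPF/MDF peak ratios on the daily peak/volume ratio across the study gauges.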


2021
Author(s): Anne Fangmann, Uwe Haberlandt

Flood frequency analysis (FFA) has long been the standard procedure for obtaining design floods for all kinds of purposes. Ideally, the data underlying the statistical operations have a high temporal resolution, in order to give a full account of the observed flood peaks and hence a precise model fit and flood quantile estimation.

Unfortunately, high-resolution flows are rarely available. Often, average daily flows are the only available or sufficiently long basis for flood frequency analysis. This averaging naturally causes a significant smoothing of the flood wave, such that the "instantaneous" peak can no longer be observed. As a possible consequence, design floods derived from these data may be severely underestimated.

How strongly the original peaks are flattened, and how this influences design flood estimation, depends on a variety of factors and varies from gauge to gauge. In this study we examine a range of errors arising from the use of daily instead of instantaneous flow data. These include differences in the observed individual flood peaks and mean annual maximum floods, as well as in the estimated distribution parameters and flood quantiles. The aim is to identify catchment-specific factors that influence the magnitude of these errors, and ultimately to provide a means for error assessment on the mere basis of local hydrological conditions, specifically where no high-resolution data are available.

The analyses are carried out on an all-German dataset of discharge gauges for which high-resolution data are available for at least 30 years. The classical FFA approach of fitting distributions to annual maximum series is used for error assessment. To identify influencing factors, both the discharge series themselves and a catalogue of climatic and physiographic catchment descriptors are screened.


2019 · Vol 11 (4) · pp. 966-979
Author(s): Nur Amalina Mat Jan, Ani Shabri, Ruhaidah Samsudin

Abstract Non-stationary flood frequency analysis (NFFA) plays an important role in addressing the issue that the stationarity assumption (independent and identically distributed flood series) is no longer valid for infrastructure design methods. This confirms the necessity of developing new statistical models in order to identify the change of probability functions over time and obtain a consistent flood estimation method in NFFA. The method of trimmed L-moments (TL-moments) with a time covariate is compared with the L-moment method for the stationary and non-stationary generalized extreme value (GEV) models. The aims of the study are to investigate the behavior of the proposed TL-moments method under non-stationarity and to apply the method along with the GEV distribution. Comparisons of the methods are made by Monte Carlo simulations and a bootstrap-based method. The simulation study showed better performance for most levels of the TL-moments method, TL(η,0) (η = 2, 3, 4), than for the L-moment method for all models (GEV1, GEV2, and GEV3). The TL-moments method provides more efficient estimates than the other methods for flood quantiles at higher return periods. Thus, the TL-moments method can produce better estimation results, since it trims the lowest values and gives more weight to the larger observations, which carry the important flood information.
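TL-moments can be computed from expectations of order statistics; the sketch below implements the generic sample estimator λ_r^(t1,t2) = (1/r) Σ_k (−1)^k C(r−1,k) E[X_{r+t1−k : r+t1+t2}], so that trimming (0, 0) recovers ordinary L-moments and (η, 0) matches the TL(η,0) variants named in the abstract. The synthetic demo sample is an assumption for illustration.

```python
import numpy as np
from math import comb

def order_stat_mean(x_sorted, j, m):
    """Unbiased estimator of E[X_{j:m}] from a sorted sample of size n."""
    n = len(x_sorted)
    w = np.array([comb(i, j - 1) * comb(n - 1 - i, m - j) for i in range(n)],
                 dtype=float)
    return float((w * x_sorted).sum() / comb(n, m))

def tl_moment(x, r, t1=0, t2=0):
    """r-th TL-moment with trimming (t1, t2); (0, 0) gives ordinary L-moments."""
    xs = np.sort(np.asarray(x, dtype=float))
    m = r + t1 + t2
    return sum((-1) ** k * comb(r - 1, k) * order_stat_mean(xs, r + t1 - k, m)
               for k in range(r)) / r

# Demo on a skewed synthetic "flood" sample: trimming the two lowest
# observations (TL(2,0)) shifts weight toward the larger floods.
rng = np.random.default_rng(5)
q = rng.gamma(2.0, 300.0, size=500)
print(tl_moment(q, 1), tl_moment(q, 1, t1=2))   # second value is larger
```

Fitting the GEV by TL-moments would then equate these sample quantities to their distributional counterparts, which is beyond this sketch.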


2013 · Vol 663 · pp. 768-772
Author(s): Li Jie Zhang

The evaluation and reduction of uncertainty are central to the task of hydrological frequency analysis. In this paper a Bayesian Markov chain Monte Carlo (MCMC) method is employed to infer the parameter values of the probabilistic distribution model and to evaluate the uncertainties of the design flood. Compared with the estimated results of the three-parameter log-normal distribution (LN3) and the three-parameter generalized extreme value distribution (GEV), the Pearson type 3 distribution (PIII) provides a good approximation to the flood-flow data. The choice of an appropriate probabilistic model can reduce the uncertainty of design flood estimation. Uncertainty might also be greatly reduced when past extreme historical flood events are incorporated into the flood frequency analysis.
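The Bayesian MCMC idea can be sketched with a random-walk Metropolis sampler. As a stand-in for the three-parameter LN3/GEV/PIII candidates above, the sketch uses a two-parameter Gumbel likelihood; the synthetic data, flat prior and proposal scales are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic annual maximum series from a two-parameter Gumbel
# distribution (inverse-CDF sampling), a simple stand-in for the
# LN3/GEV/PIII candidates discussed in the abstract.
true_loc, true_scale = 500.0, 120.0
data = true_loc - true_scale * np.log(-np.log(rng.uniform(size=80)))

def log_post(loc, scale):
    """Gumbel log-likelihood with a flat (improper) prior."""
    if scale <= 0:
        return -np.inf
    z = (data - loc) / scale
    return float((-np.log(scale) - z - np.exp(-z)).sum())

# Random-walk Metropolis over (loc, scale); proposal scales are guesses.
samples = []
loc, scale = data.mean(), data.std()
lp = log_post(loc, scale)
for _ in range(20_000):
    loc_p = loc + rng.normal(0.0, 15.0)
    scale_p = scale + rng.normal(0.0, 10.0)
    lp_p = log_post(loc_p, scale_p)
    if np.log(rng.uniform()) < lp_p - lp:     # Metropolis acceptance
        loc, scale, lp = loc_p, scale_p, lp_p
    samples.append((loc, scale))
post = np.array(samples[5_000:])              # discard burn-in

# Posterior of the 100-year flood: Gumbel quantile at p = 0.99,
# x_T = loc - scale * ln(-ln(1 - 1/T)).
q100 = post[:, 0] - post[:, 1] * np.log(-np.log(0.99))
lo, hi = np.percentile(q100, [2.5, 97.5])
print(f"100-year flood: mean {q100.mean():.0f}, 95% interval ({lo:.0f}, {hi:.0f})")
```

The posterior spread of the design-flood quantile is exactly the uncertainty the paper evaluates; historical flood information would enter through additional likelihood terms.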


2019 · Vol 23 (11) · pp. 4453-4470
Author(s): Bin Xiong, Lihua Xiong, Jun Xia, Chong-Yu Xu, Cong Jiang, ...

Abstract. Many studies have shown that downstream flood regimes have been significantly altered by upstream reservoir operation. Reservoir effects on the downstream flow regime are normally assessed by comparing the pre-dam and post-dam frequencies of certain streamflow indicators, such as floods and droughts. In this study, a rainfall–reservoir composite index (RRCI) is developed to precisely quantify reservoir impacts on downstream flood frequency under a framework of covariate-based nonstationary flood frequency analysis using the Bayesian inference method. The RRCI is derived from a combination of a reservoir index (RI), which measures the effects of reservoir storage capacity, and a rainfall index. More precisely, the OR joint exceedance probability (OR-JEP, the probability of joint events based on the OR operator) of certain scheduling-related variables, selected out of five variables that describe the multiday antecedent rainfall input (MARI), is used to measure the effects of antecedent rainfall on reservoir operation. Then, RI-dependent or RRCI-dependent distribution parameters and five distributions, the gamma, Weibull, lognormal, Gumbel, and generalized extreme value, are used to analyze the annual maximum daily flow (AMDF) of the Ankang, Huangjiagang, and Huangzhuang gauging stations of the Han River, China. Although most flood peaks downstream of the reservoirs have been reduced in magnitude by upstream reservoirs, some relatively large flood events have still occurred, such as at the Huangzhuang station in 1983. The results of the nonstationary flood frequency analysis show that, in comparison to the RI, the RRCI, which combines both the RI and the OR-JEP, provides a much better explanation for such flood occurrences downstream of reservoirs.
A Bayesian inference of the 100-year return level of the AMDF shows that the optimal RRCI-dependent distribution, compared to the RI-dependent one, results in relatively smaller estimated values. However, exceptions exist due to some low OR-JEP values. In addition, it provides a smaller uncertainty range. This study highlights the necessity of including antecedent rainfall effects, in addition to the effects of reservoir storage capacity, on reservoir operation to assess the reservoir effects on downstream flood frequency. This analysis can provide a more comprehensive approach for downstream flood risk management under the impacts of reservoirs.
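The OR joint exceedance probability at the core of the RRCI can be estimated empirically by counting events in which at least one variable exceeds its threshold. The variables and thresholds below are hypothetical stand-ins for the MARI-based quantities in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
n_events = 60

# Hypothetical antecedent-rainfall variables for 60 flood events
# (stand-ins for the MARI variables; r7 includes r3 by construction).
r3 = rng.gamma(2.0, 20.0, size=n_events)        # 3-day antecedent rainfall (mm)
r7 = r3 + rng.gamma(2.0, 15.0, size=n_events)   # 7-day antecedent rainfall (mm)

def or_jep(x, y, x_thr, y_thr):
    """Empirical OR joint exceedance probability P(X > x_thr or Y > y_thr)."""
    return float(np.mean((x > x_thr) | (y > y_thr)))

# Probability that at least one index exceeds its median.
p = or_jep(r3, r7, np.median(r3), np.median(r7))
print(f"OR-JEP at the medians: {p:.2f}")
```

In the paper this probability is evaluated from a fitted joint (copula-type) model rather than by event counting, but the OR-operator logic is the same.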


2016
Author(s): Tomohiro Tanaka, Yasuto Tachikawa, Yutaka Ichikawa, Kazuaki Yorozu

Abstract. The design flood, the river discharge with a particular return period, is fundamental for determining the scale of flood-control facilities. In addition, under a changing climate, not only the frequencies of river discharge at the design level but also those of devastating floods are crucial. The characteristics of river discharge during extreme floods differ greatly from those during other floods because of upstream dam operation and/or river overflow; however, it is difficult for flood frequency analysis (FFA) based on past discharge data to represent such impacts, because river basins rarely experience floods above the design level after river improvement and dam construction. To account for these impacts on extreme flood frequencies, this study presents a rainfall-based flood frequency model (RFFM) that derives flood frequencies from a probabilistic rainfall model that empirically represents the probabilistic structure of rainfall intensity over a catchment by directly using observed spatio-temporal rainfall profiles. The RFFM was applied to the Yodo River basin, Japan, and it was demonstrated that flood frequency estimates from the RFFM represent past flood frequencies at the Hirakata gauging station well. Furthermore, the RFFM estimated extremely large return periods for large flood peaks, reflecting the decrease in discharge caused by inundation in an area upstream of the gauging station. In contrast, FFA from past discharge data does not represent this impact because no such huge flood peaks occurred during the observation period. This study demonstrates the importance of the RFFM for flood frequency estimation, including frequencies of floods exceeding the design level.
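Empirical return periods of observed annual maxima, against which a model such as the RFFM can be compared, are commonly assigned with Weibull plotting positions T = (n + 1)/rank. A small sketch with hypothetical discharge values:

```python
import numpy as np

# Hypothetical annual maximum series (m^3/s).
annual_max = np.array([850.0, 1200.0, 640.0, 980.0, 1530.0,
                       720.0, 1100.0, 900.0, 1310.0, 760.0])
n = len(annual_max)

# Rank 1 = largest flood; Weibull plotting position T = (n + 1) / rank.
order = np.argsort(annual_max)[::-1]
ranks = np.empty(n, dtype=int)
ranks[order] = np.arange(1, n + 1)
T = (n + 1) / ranks

i = order[0]
print(f"largest event {annual_max[i]:.0f} m^3/s -> empirical T = {T[i]:.1f} years")
```

The ceiling visible here (the largest observed event cannot receive an empirical return period above n + 1 years) is one reason discharge-based FFA struggles with floods beyond the design level.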

