Use of historical data in flood frequency analysis: a case study for four catchments in Norway

2017 ◽  
Vol 49 (2) ◽  
pp. 466-486 ◽  
Author(s):  
Kolbjørn Engeland ◽  
Donna Wilson ◽  
Péter Borsányi ◽  
Lars Roald ◽  
Erik Holmqvist

Abstract There is a need to estimate design floods for land-use planning and the design of important infrastructure. A major challenge is the mismatch between the length of the available flood records and the required return periods: a majority of flood time series are shorter than 50 years, whereas the required return periods might be 200, 500, or 1,000 years. Consequently, the estimation uncertainty is large. In this paper, we investigated how the use of historical information might improve design flood estimation. We used annual maximum data from four selected Norwegian catchments, together with historical flood information indicating the water levels of the largest floods in the last two to three hundred years. We assessed the added value of using historical information and demonstrated that both reliability and stability improve, especially for short record lengths and long return periods. Because the historical information in this study consists of water levels, the stability of river profiles proved to be a major challenge.
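
The abstract above does not spell out the estimation procedure, but a standard way to combine a systematic gauged record with historical flood information is a censored GEV likelihood: the gauged annual maxima and the known historical peaks contribute density terms, while the remaining historical years contribute the probability of staying below a perception threshold. The sketch below illustrates this under assumed data; the catchment values, threshold, and record lengths are hypothetical, not taken from the study.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

def neg_log_lik(params, sys_ams, hist_peaks, threshold, h_years):
    """Censored GEV likelihood combining a systematic record with
    historical floods known to have exceeded a perception threshold."""
    shape, loc, scale = params          # note: scipy's genextreme shape c = -xi
    if scale <= 0:
        return np.inf
    dist = genextreme(shape, loc=loc, scale=scale)
    # Systematic annual maxima: ordinary density terms
    ll = np.sum(dist.logpdf(sys_ams))
    # Historical floods above the perception threshold: density terms
    ll += np.sum(dist.logpdf(hist_peaks))
    # Remaining historical years: annual maxima stayed below the threshold
    k = len(hist_peaks)
    ll += (h_years - k) * dist.logcdf(threshold)
    return -ll if np.isfinite(ll) else np.inf

# Hypothetical data: 40 years of gauged annual maxima (m3/s), plus 3
# historical floods above a 900 m3/s threshold during a 250-year period
rng = np.random.default_rng(1)
sys_ams = genextreme.rvs(-0.1, loc=400, scale=120, size=40, random_state=rng)
hist_peaks = np.array([950.0, 1020.0, 1100.0])

res = minimize(neg_log_lik, x0=[-0.1, 400.0, 120.0],
               args=(sys_ams, hist_peaks, 900.0, 250),
               method="Nelder-Mead")
shape, loc, scale = res.x
q1000 = genextreme.ppf(1 - 1 / 1000, shape, loc=loc, scale=scale)
print(f"1000-year design flood estimate: {q1000:.0f} m3/s")
```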

2021 ◽  
Author(s):  
Lei Yan ◽  
Lihua Xiong ◽  
Gusong Ruan ◽  
Chong-Yu Xu ◽  
Mengjie Zhang

Abstract In traditional flood frequency analysis, a minimum of 30 observations is required to guarantee the accuracy of design results within an allowable uncertainty; however, there is no corresponding recommendation for the record length required in nonstationary flood frequency analysis (NFFA). Therefore, this study was carried out with three aims: (i) to evaluate the predictive capabilities of nonstationary (NS) and stationary (ST) models with varying flood record lengths; (ii) to examine the impacts of flood record length on the NS and ST design floods and their associated uncertainties; and (iii) to recommend the probable flood record length required for NFFA. To achieve these objectives, 20 stations in Norway with record lengths longer than 100 years were selected and investigated using both the stationary GEV (generalized extreme value) model (GEV-ST) and a GEV model with a linearly varying location parameter (GEV-NS0). The results indicate that the fitting quality and predictive capabilities of GEV-NS0 outperform those of GEV-ST when the record length exceeds approximately 60 years for most stations, and that the stability of both models improves as record lengths increase. Therefore, a minimum of 60 years of flood observations is recommended for NFFA for the selected basins in Norway.
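
A minimal sketch of fitting a GEV model with a linearly varying location parameter (the GEV-NS0 form referred to above) by maximum likelihood is given below. The synthetic series, trend, and starting values are illustrative assumptions, not the Norwegian station data.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

def neg_log_lik_ns(params, q, t):
    """Negative log-likelihood of a GEV whose location varies linearly in
    time, mu(t) = mu0 + mu1 * t, with constant scale and shape."""
    mu0, mu1, scale, shape = params
    if scale <= 0:
        return np.inf
    loc = mu0 + mu1 * t
    ll = genextreme.logpdf(q, shape, loc=loc, scale=scale).sum()
    return -ll if np.isfinite(ll) else np.inf

# Synthetic 100-year annual maximum series with an upward trend in location
rng = np.random.default_rng(0)
t = np.arange(100)
q = genextreme.rvs(-0.1, loc=300 + 0.8 * t, scale=80, size=100, random_state=rng)

res = minimize(neg_log_lik_ns, x0=[300.0, 0.0, 80.0, -0.1],
               args=(q, t), method="Nelder-Mead")
mu0, mu1, scale, shape = res.x
# Time-varying 100-year quantile evaluated in the final year of record
q100_last = genextreme.ppf(0.99, shape, loc=mu0 + mu1 * t[-1], scale=scale)
print(f"Estimated trend in location: {mu1:.2f} m3/s per year; "
      f"100-year quantile in year {t[-1]}: {q100_last:.0f} m3/s")
```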


Water ◽  
2018 ◽  
Vol 10 (8) ◽  
pp. 1016 ◽  
Author(s):  
Jianzhu Li ◽  
Yanchen Zheng ◽  
Yimin Wang ◽  
Ting Zhang ◽  
Ping Feng ◽  
...  

Historical extraordinary floods are an important factor in non-stationary flood frequency analysis, and they may occur at any time, regardless of whether the environment is changing. Based on mixed distribution (MD) modeling, this paper proposes an improved mixed distribution (IMD) model that considers the discontinuity and non-stationarity of flood samples simultaneously by adding historical extraordinary floods to both sub-series divided by a change point. As a case study, the annual maximum peak discharge and flood volume series of Ankang hydrological station, located in the upper Hanjiang River basin of China, were selected, and their non-stationarity was identified using a variation diagnosis system. MD and IMD were used to fit the flood characteristic series, and a genetic algorithm was employed to estimate the optimal parameters. Compared with the design flood values fitted by the stationary Pearson type III distribution, the values computed by IMD decreased at low return periods and increased at high return periods, with the difference varying from −6.67% to 7.19%. The results highlight that although the design flood values of IMD are slightly larger than those of MD across return periods, IMD provides a better fit than MD. IMD offers a new perspective for non-stationary flood frequency analysis.
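
The IMD model and its genetic-algorithm calibration are not reproduced here, but the underlying mixed-distribution idea, a weighted combination of distributions fitted to the sub-series on either side of a change point, can be sketched as follows. The Pearson III components, mixing weight, and synthetic sub-series are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import pearson3
from scipy.optimize import brentq

def mixed_cdf(x, w, params1, params2):
    """CDF of a two-component mixed distribution,
    F(x) = w * F1(x) + (1 - w) * F2(x)."""
    return w * pearson3.cdf(x, *params1) + (1 - w) * pearson3.cdf(x, *params2)

def design_flood(T, w, params1, params2, bracket=(1.0, 1e6)):
    """Design flood for return period T: solve F(x) = 1 - 1/T."""
    target = 1 - 1 / T
    return brentq(lambda x: mixed_cdf(x, w, params1, params2) - target, *bracket)

# Hypothetical sub-series split at a detected change point
rng = np.random.default_rng(2)
pre = pearson3.rvs(1.0, loc=800, scale=250, size=35, random_state=rng)
post = pearson3.rvs(1.0, loc=600, scale=200, size=30, random_state=rng)

params_pre = pearson3.fit(pre)               # (skew, loc, scale)
params_post = pearson3.fit(post)
w = len(pre) / (len(pre) + len(post))        # mixing weight = sub-series proportion

for T in (10, 50, 100, 1000):
    print(f"T = {T:>4} yr: design flood {design_flood(T, w, params_pre, params_post):.0f} m3/s")
```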


2013 ◽  
Vol 663 ◽  
pp. 768-772
Author(s):  
Li Jie Zhang

The evaluation and reduction of uncertainty is central to the task of hydrological frequency analysis. In this paper, a Bayesian Markov chain Monte Carlo (MCMC) method is employed to infer the parameter values of the probabilistic distribution model and to evaluate the uncertainties of the design flood. Compared with the estimates from the three-parameter log-normal (LN3) and three-parameter generalized extreme value (GEV) distributions, the Pearson Type 3 (PIII) distribution provides a good approximation to the flood-flow data. Choosing an appropriate probabilistic model can reduce the uncertainty of design flood estimation, and uncertainty might be greatly reduced when past extreme historical flood data are incorporated into the flood frequency analysis.
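
The abstract does not describe the sampler in detail; a minimal random-walk Metropolis sketch for inferring Pearson Type 3 parameters and propagating their uncertainty to a design flood quantile is shown below. The flat priors, proposal step sizes, and synthetic record are illustrative assumptions.

```python
import numpy as np
from scipy.stats import pearson3

def log_posterior(params, q):
    """Log-posterior for PIII parameters with flat (improper) priors."""
    skew, loc, scale = params
    if scale <= 0:
        return -np.inf
    ll = pearson3.logpdf(q, skew, loc=loc, scale=scale).sum()
    return ll if np.isfinite(ll) else -np.inf

def metropolis(q, start, step, n_iter=20000, seed=0):
    """Random-walk Metropolis sampler over (skew, loc, scale)."""
    rng = np.random.default_rng(seed)
    samples = np.empty((n_iter, 3))
    current = np.array(start, dtype=float)
    current_lp = log_posterior(current, q)
    for i in range(n_iter):
        proposal = current + rng.normal(0, step)
        lp = log_posterior(proposal, q)
        if np.log(rng.uniform()) < lp - current_lp:
            current, current_lp = proposal, lp
        samples[i] = current
    return samples[n_iter // 2:]             # discard the first half as burn-in

# Synthetic flood record and posterior distribution of the 100-year quantile
q_obs = pearson3.rvs(0.8, loc=1000, scale=300, size=60, random_state=1)
post = metropolis(q_obs, start=(0.8, 1000, 300), step=(0.05, 20, 10))
q100 = pearson3.ppf(0.99, post[:, 0], loc=post[:, 1], scale=post[:, 2])
print("100-year flood: median %.0f, 95%% CI (%.0f, %.0f)"
      % (np.median(q100), *np.percentile(q100, [2.5, 97.5])))
```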


2021 ◽  
Author(s):  
Daniel Hamill ◽  
Gabrielle David

Streamflow influences the distribution and organization of high water marks along rivers and streams in a landscape. The federal definition of the ordinary high water mark (OHWM) relies on physical and vegetative field indicators used to identify the inundation extent of ordinary high water levels, without any reference to the relationship between streamflow and the regulatory definition. Streamflow is the amount, or volume, of water that moves through a stream per unit time. This study explores regional characteristics and relationships between field-delineated OHWMs and frequency-magnitude streamflow metrics derived from a flood frequency analysis. The elevation of the OHWM is related to representative constant-level discharge return periods, with national average return periods of 6.9 years using partial duration series and 2.8 years using annual maximum flood frequency approaches. The OHWM return periods range from 0.5 to 9.08 years for the peaks-over-threshold method and from 1.05 to 11.01 years for the annual maximum method. This range is consistent with the range found in national studies of return periods associated with bankfull streamflow. Hydraulic models produced a statistically significant relationship between the OHWM and bankfull discharge, which reinforces the close relationship between the two concepts in most stream systems.
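
As a hedged illustration of how such return periods can be assigned to a discharge associated with a field-delineated OHWM, the sketch below computes an annual-maximum-based and a peaks-over-threshold-based return period for one gauge. The distributions, threshold, and discharge value are assumptions, not the study's data.

```python
import numpy as np
from scipy.stats import genextreme, genpareto

# Hypothetical data: annual maxima and peaks over a threshold for one gauge
rng = np.random.default_rng(3)
ams = genextreme.rvs(-0.05, loc=150, scale=60, size=50, random_state=rng)
threshold = 120.0
pot_peaks = threshold + genpareto.rvs(0.1, scale=45, size=110, random_state=rng)
lam = len(pot_peaks) / 50.0                  # average exceedances per year

q_ohwm = 220.0                               # discharge tied to the delineated OHWM

# Annual maximum approach: T = 1 / (1 - F(q))
c, loc, scale = genextreme.fit(ams)
T_ams = 1.0 / (1.0 - genextreme.cdf(q_ohwm, c, loc=loc, scale=scale))

# Partial duration (POT) approach: T = 1 / (lambda * (1 - G(q - threshold)))
xi, _, sigma = genpareto.fit(pot_peaks - threshold, floc=0)
T_pot = 1.0 / (lam * (1.0 - genpareto.cdf(q_ohwm - threshold, xi, scale=sigma)))

print(f"Return period of OHWM discharge: {T_ams:.1f} yr (AMS), {T_pot:.1f} yr (POT)")
```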


2020 ◽  
Author(s):  
Gang Zhao ◽  
Paul Bates ◽  
Jeff Neal ◽  
Bo Pang

Design flood estimation in data-poor regions is a fundamental task in hydrology. In this paper, we propose a regional flood frequency analysis (RFFA) approach to estimate design floods anywhere on the global river network. The approach involves two stages: (i) clustering global gauging stations into subareas with a K-means model based on twelve globally available catchment descriptors, and (ii) developing a regression model in each subarea for design flood estimation using the same descriptors. Nearly 12,000 discharge stations worldwide were selected for model development, and a benchmark global index-flood method was adopted for comparison. The results showed that: (1) the proposed approach achieved the highest accuracy for design flood estimation when using all catchment descriptors for clustering, and regression model accuracy improved as more descriptors were considered in model development; (2) a support vector machine regression showed the highest accuracy among all regression models tested, with relative root mean squared errors of 0.67 for the mean flood and 0.83 for 100-year return period flood estimates; (3) 100-year return period flood magnitudes in tropical, arid, temperate, continental and polar climate zones could be reliably estimated, with relative mean biases of -0.18, -0.23, -0.18, 0.17 and -0.11 respectively, under a 5-fold cross-validation procedure; and (4) the proposed approach outperformed the benchmark index-flood method for 10-, 50- and 100-year return period estimates. We conclude that the proposed RFFA approach is a valid way to generate design floods globally, improving our understanding of flood hazard, especially in ungauged areas.
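
A simplified sketch of the two-stage approach, clustering stations on catchment descriptors and then fitting a regression model within each cluster, is given below using scikit-learn. The synthetic descriptors and flood statistics are placeholders; the study's ~12,000-station dataset, descriptor set, and cross-validation design are not reproduced.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic stand-in: rows are gauging stations, columns are catchment descriptors
rng = np.random.default_rng(4)
descriptors = rng.normal(size=(500, 12))                  # 12 descriptors per station
log_q100 = 2.0 + descriptors[:, 0] + 0.5 * descriptors[:, 1] + rng.normal(0, 0.2, 500)

X = StandardScaler().fit_transform(descriptors)

# Stage 1: cluster stations into homogeneous sub-areas
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)

# Stage 2: fit one regression model per sub-area (here: support vector regression)
models = {}
for k in np.unique(labels):
    idx = labels == k
    models[k] = SVR(kernel="rbf", C=10.0).fit(X[idx], log_q100[idx])

# Estimate the 100-year flood (log space) for a "new" catchment
# (here simply reusing the first descriptor row as a stand-in)
x_new = X[:1]
cluster = kmeans.predict(x_new)[0]
print("Predicted log Q100:", models[cluster].predict(x_new)[0])
```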


Author(s):  
James E. Ball

Flood management remains a major problem in many urban environments. Commonly, catchment models are used to generate the data needed for estimation of flood risk; both event-based and continuous models have been used for this purpose. Use of catchment models requires calibration and validation, with a calibration metric used to assess the predicted catchment response against the recorded response. In this study, a continuous model based on SWMM is investigated, using the Powells Creek catchment as a case study. The model was calibrated using 25 selected events from the monitored data for the catchment, with calibration assessed using a normalised peak flow error. Alternative sets of parameter values were used to obtain peak flow estimates for each of the selected events, and the best parameter sets for each of several accuracy criteria were identified. These parameter sets were then used with SWMM in continuous simulation mode to predict flow sequences, from which annual maxima series were extracted for an at-site flood frequency analysis. From these at-site flood frequency analyses, it was concluded that the normalised peak flow error needs to be less than 10% if reliable design flood quantile estimates are to be obtained.
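
The calibration metric referred to above, a normalised peak flow error, can be computed per event and used to screen candidate parameter sets. A minimal sketch follows; the event peak flows are placeholders, not the Powells Creek data.

```python
import numpy as np

def normalised_peak_flow_error(q_obs_peak, q_sim_peak):
    """Normalised peak flow error for one event: (Qsim - Qobs) / Qobs."""
    return (q_sim_peak - q_obs_peak) / q_obs_peak

# Placeholder peak flows (m3/s) for calibration events under one parameter set
# (the study used 25 events; only a few placeholders are shown here)
q_obs = np.array([12.4, 35.1, 8.7, 50.2, 21.3])
q_sim = np.array([13.0, 33.8, 9.5, 46.9, 22.0])

errors = normalised_peak_flow_error(q_obs, q_sim)
# Screening rule suggested by the study's conclusion: reject parameter sets
# whose absolute normalised peak flow error exceeds 10% for calibration events
acceptable = np.all(np.abs(errors) <= 0.10)
print("Max |error|: %.1f%%, parameter set acceptable: %s"
      % (100 * np.abs(errors).max(), acceptable))
```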


2021 ◽  
Vol 5 (1) ◽  
pp. 1-11
Author(s):  
Vitthal Anwat ◽  
Pramodkumar Hire ◽  
Uttam Pawar ◽  
Rajendra Gunjal

The Flood Frequency Analysis (FFA) method was introduced by Fuller in 1914 to understand the magnitude and frequency of floods. The present study is carried out using two of the most widely accepted probability distributions for FFA, namely the Gumbel Extreme Value type I (GEVI) and Log Pearson type III (LP-III) distributions. The Kolmogorov-Smirnov (KS) and Anderson-Darling (AD) tests were used to select the most suitable probability distribution at sites in the Damanganga Basin, and discharges were estimated for various return periods using GEVI and LP-III. The recurrence interval of the largest peak flood on record (Qmax) is 107 years at Nanipalsan and 146 years at Ozarkhed according to LP-III. The flood frequency curves (FFC) indicate that LP-III is the best-fitted probability distribution for FFA of the Damanganga Basin. Therefore, discharges and return periods estimated with the LP-III distribution are more reliable and can be used for designing hydraulic structures.
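
A hedged sketch of this workflow, fitting Gumbel (EV I) and Log Pearson III to an annual peak series, checking the fits with the Kolmogorov-Smirnov test, and estimating discharges for selected return periods, is shown below. The synthetic peak series is illustrative, not the Damanganga records, and the Anderson-Darling step is omitted.

```python
import numpy as np
from scipy.stats import gumbel_r, pearson3, kstest

# Synthetic annual peak discharge series (m3/s)
rng = np.random.default_rng(5)
peaks = gumbel_r.rvs(loc=1500, scale=400, size=45, random_state=rng)

# Gumbel Extreme Value type I fit
gum_params = gumbel_r.fit(peaks)
ks_gum = kstest(peaks, "gumbel_r", args=gum_params)

# Log Pearson type III fit: Pearson III fitted to log-transformed peaks
log_peaks = np.log10(peaks)
lp3_params = pearson3.fit(log_peaks)
ks_lp3 = kstest(log_peaks, "pearson3", args=lp3_params)

print("KS p-values - Gumbel: %.3f, LP-III: %.3f" % (ks_gum.pvalue, ks_lp3.pvalue))

# Discharge estimates for selected return periods from both fits
for T in (10, 50, 100):
    p = 1 - 1 / T
    q_gum = gumbel_r.ppf(p, *gum_params)
    q_lp3 = 10 ** pearson3.ppf(p, *lp3_params)
    print(f"T = {T:>3} yr: Gumbel {q_gum:.0f} m3/s, LP-III {q_lp3:.0f} m3/s")
```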


2014 ◽  
Vol 14 (5) ◽  
pp. 1283-1298 ◽  
Author(s):  
D. Lawrence ◽  
E. Paquet ◽  
J. Gailhard ◽  
A. K. Fleig

Abstract. Simulation methods for extreme flood estimation represent an important complement to statistical flood frequency analysis because a spectrum of catchment conditions potentially leading to extreme flows can be assessed. In this paper, stochastic, semi-continuous simulation is used to estimate extreme floods in three catchments located in Norway, all of which are characterised by flood regimes in which snowmelt often has a significant role. The simulations are based on SCHADEX, which couples a precipitation probabilistic model with a hydrological simulation such that an exhaustive set of catchment conditions and responses is simulated. The precipitation probabilistic model is conditioned by regional weather patterns, and a bottom–up classification procedure was used to define a set of weather patterns producing extreme precipitation in Norway. SCHADEX estimates for the 1000-year (Q1000) discharge are compared with those of several standard methods, including event-based and long-term simulations which use a single extreme precipitation sequence as input to a hydrological model, statistical flood frequency analysis based on the annual maximum series, and the GRADEX method. The comparison suggests that the combination of a precipitation probabilistic model with a long-term simulation of catchment conditions, including snowmelt, produces estimates for given return periods which are more in line with those based on statistical flood frequency analysis, as compared with the standard simulation methods, in two of the catchments. In the third case, the SCHADEX method gives higher estimates than statistical flood frequency analysis and further suggests that the seasonality of the most likely Q1000 events differs from that of the annual maximum flows. The semi-continuous stochastic simulation method highlights the importance of considering the joint probability of extreme precipitation, snowmelt rates and catchment saturation states when assigning return periods to floods estimated by precipitation-runoff methods. The SCHADEX methodology, as applied here, is dependent on observed discharge data for calibration of a hydrological model, and further study to extend its application to ungauged catchments would significantly enhance its versatility.
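
SCHADEX itself is not reproducible from the abstract, but the semi-continuous idea, evaluating the joint probability of extreme precipitation, snowmelt and catchment saturation by simulation rather than from a single design event, can be illustrated with a deliberately toy model. Every distribution and the runoff relation below are invented placeholders, not SCHADEX components.

```python
import numpy as np

# Toy illustration: sample many combinations of event precipitation,
# coincident snowmelt and catchment saturation, route them through a
# trivial runoff relation, and read quantiles off the simulated peaks.
rng = np.random.default_rng(6)
n_events = 200_000

precip = rng.gumbel(40, 15, n_events)        # event precipitation (mm), placeholder
snowmelt = rng.gamma(2.0, 8.0, n_events)     # coincident snowmelt (mm), placeholder
saturation = rng.beta(4, 2, n_events)        # catchment saturation state (0-1)

# Placeholder runoff relation: effective water times a saturation-dependent coefficient
area_factor = 5.0                            # catchment scaling (m3/s per mm), assumed
peaks = area_factor * (precip + snowmelt) * (0.3 + 0.7 * saturation)

q1000 = np.quantile(peaks, 1 - 1 / 1000)
print(f"Simulated 1000-year peak (toy model): {q1000:.0f} m3/s")
```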


2021 ◽  
Author(s):  
Xiao Pan ◽  
Ataur Rahman

Abstract Flood frequency analysis (FFA) enables distribution functions to be fitted to observed flow data for the estimation of flood quantiles. Two main approaches, annual maximum (AM) and peaks-over-threshold (POT), are adopted for FFA. The POT approach is under-employed due to its complexity and the uncertainty associated with threshold selection and the independence criteria for selecting peak flows. This study evaluates the POT and AM approaches using data from 188 gauged stations in south-east Australia. The POT approach adopted in this study applies different average numbers of events per year, fitted with the Generalised Pareto (GP) distribution using an automated threshold detection method. The POT model is applied with both the Maximum Likelihood Estimator (MLE) and the Point Moment Weighted Unbiased (PMWU) method, while a Generalised Extreme Value (GEV) distribution with the L-moments estimator is used for the AM approach. It is found that there is a large difference in design flood estimates between the AM and POT approaches for smaller average recurrence intervals (ARIs), with a median difference of 25% for the 1.01-year ARI and 5% for the 50- and 100-year ARIs.
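
A minimal sketch of the two approaches being compared, extracting independent peaks over a threshold and fitting a Generalised Pareto distribution versus fitting a GEV to annual maxima, is given below. The fixed threshold rule, seven-day independence window, and synthetic daily record are assumptions; the study's automated threshold detection and PMWU estimator are not reproduced.

```python
import numpy as np
from scipy.stats import genpareto, genextreme

def extract_pot(daily_q, threshold, min_separation=7):
    """Peaks over threshold with a simple independence rule: exceedances
    closer together than `min_separation` days belong to the same event."""
    exceed_days = np.where(daily_q > threshold)[0]
    peaks, event = [], []
    for day in exceed_days:
        if event and day - event[-1] >= min_separation:
            peaks.append(daily_q[event].max())
            event = []
        event.append(day)
    if event:
        peaks.append(daily_q[event].max())
    return np.array(peaks)

# Synthetic 40-year daily flow record with a weak seasonal component
rng = np.random.default_rng(7)
n_days = 40 * 365
daily_q = rng.gamma(2.0, 15.0, n_days) + 5 * np.sin(np.arange(n_days) * 2 * np.pi / 365)

# POT: Generalised Pareto fitted to threshold excesses
threshold = np.quantile(daily_q, 0.995)
pot = extract_pot(daily_q, threshold)
lam = len(pot) / 40.0                        # average events per year
xi, _, sigma = genpareto.fit(pot - threshold, floc=0)

# AM: GEV fitted to annual maxima
ams = daily_q.reshape(40, 365).max(axis=1)
c, loc, scale = genextreme.fit(ams)

for T in (1.01, 50, 100):
    q_pot = threshold + genpareto.ppf(1 - 1 / (lam * T), xi, scale=sigma)
    q_am = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"ARI {T:>6} yr: POT {q_pot:.0f}, AM {q_am:.0f}")
```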


Water ◽  
2020 ◽  
Vol 12 (7) ◽  
pp. 1867
Author(s):  
Chunlai Qu ◽  
Jing Li ◽  
Lei Yan ◽  
Pengtao Yan ◽  
Fang Cheng ◽  
...  

Under changing environments, the most widely used non-stationary flood frequency analysis (NFFA) method is the generalized additive models for location, scale and shape (GAMLSS) framework. However, the model structure of GAMLSS is relatively complex due to the large number of statistical parameters, and the relationship between statistical parameters and covariates is assumed to remain unchanged in the future, which may be unreasonable. In recent years, nonparametric methods have received increasing attention in NFFA. Among them, the linear quantile regression (QR-L) model and the non-linear quantile regression model based on cubic B-splines (QR-CB) have been introduced into NFFA studies because they require neither the estimation of statistical parameters nor an assumed relationship between statistical parameters and covariates. However, these two quantile regression models have difficulty estimating non-stationary design floods, since the trend of the established model must be extrapolated indefinitely to estimate the design flood. Besides, the number of available observations becomes scarcer when estimating design values corresponding to higher return periods, leading to unreasonable and inaccurate design values. In this study, we propose a cubic B-spline-based GAMLSS model (GAMLSS-CB) for NFFA, in which the relationship between statistical parameters and covariates is fitted by cubic B-splines within the GAMLSS framework. We also compare the performance of different non-stationary models, namely the QR-L, QR-CB, and GAMLSS-CB models. Finally, based on the optimal non-stationary model, non-stationary design flood values are estimated using the average design life level (ADLL) method. The annual maximum flood series of four stations in the Weihe River basin and the Pearl River basin are taken as examples. The results show that the GAMLSS-CB model displays the best performance compared with the QR-L and QR-CB models. Moreover, it is feasible to estimate design flood values based on the GAMLSS-CB model using the ADLL method, while estimation of design floods based on the quantile regression models requires further study.
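
As a hedged sketch of the linear quantile regression (QR-L) idea mentioned above, the code below estimates a high conditional quantile of an annual maximum series as a linear function of a time covariate using statsmodels. The synthetic series, choice of covariate, and quantile level are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic annual maximum flood series with a mild upward trend
rng = np.random.default_rng(8)
years = np.arange(1960, 2020)
t = years - years[0]
ams = 500 + 2.0 * t + rng.gumbel(0, 120, len(t))

# QR-L: linear quantile regression of the annual maxima on the time covariate
X = sm.add_constant(t)
tau = 0.95          # high non-exceedance quantile; higher quantiles (longer
                    # return periods) are increasingly data-starved, as noted above
model = sm.QuantReg(ams, X).fit(q=tau)

# Estimated non-stationary quantile in the last observed year; extrapolating
# this linear trend far beyond the record is the weakness discussed above.
q_design = model.params @ np.array([1.0, t[-1]])
print(f"Estimated {tau:.2f} quantile in {years[-1]}: {q_design:.0f} m3/s")
print("Intercept and trend:", model.params)
```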

