Technical Note: Flood frequency study using partial duration series coupled with entropy principle

2021 ◽  
Author(s):  
Sonali Swetapadma ◽  
Chandra Shekhar Prasad Ojha

Abstract. Quality discharge measurements and frequency analysis are two major prerequisites for defining a design flood. Flood frequency analysis (FFA) utilizes a comprehensive understanding of the probabilistic behavior of extreme events but has certain limitations regarding the sampling method and the choice of distribution models. Entropy, as a modern-day tool, has found several applications in FFA, mainly in the derivation of probability distributions and their parameter estimation as per the principle of maximum entropy (POME) theory. The present study explores a new dimension to this area of research, where POME theory is applied in the partial duration series (PDS) modeling of FFA to locate the optimum threshold and the respective distribution models. The proposed methodology is applied to the Waimakariri River at the Old Highway Bridge site in New Zealand, as it has one of the best-quality discharge records; the catchment also has a history of significant flood events in the last few decades. The degree of fit of the models to the exceedances is compared with the standard statistical approach followed in the literature, and the threshold estimated in this study is compared with previous findings. Various return-period quantiles are calculated, and their predictive ability is tested by bootstrap sampling. An overall analysis of the results shows that entropy can also be used as an effective tool for threshold identification in PDS modeling of flood frequency studies.
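
The POME-based threshold search itself is specific to the paper, but the PDS/POT machinery it plugs into is standard. Below is a minimal sketch, on synthetic data, of fitting a generalized Pareto distribution to exceedances above a trial threshold and bootstrapping a return-period quantile, as in the predictive-ability test mentioned above; the threshold choice and the absence of declustering are simplifying assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
flows = rng.gumbel(loc=500.0, scale=150.0, size=40 * 365)  # synthetic "daily" flows
threshold = np.quantile(flows, 0.995)          # trial threshold (POME would optimize this)
exceed = flows[flows > threshold] - threshold  # exceedances; real PDS work would decluster
lam = exceed.size / 40.0                       # mean number of exceedances per year

def quantile_T(exc, T):
    """T-year quantile from a generalized Pareto fit to the exceedances."""
    xi, _, sigma = stats.genpareto.fit(exc, floc=0.0)
    return threshold + stats.genpareto.ppf(1.0 - 1.0 / (lam * T), xi, 0.0, sigma)

q100 = quantile_T(exceed, 100.0)
boot = [quantile_T(rng.choice(exceed, size=exceed.size, replace=True), 100.0)
        for _ in range(500)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"100-yr quantile: {q100:.0f} (95% bootstrap CI {lo:.0f}-{hi:.0f})")
```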

2012 ◽  
Vol 4 (1) ◽  
pp. 36-41 ◽  
Author(s):  
Abhijit Bhuyan ◽  
Munindra Borah

The annual maximum discharge data of six gauging sites have been considered for L-moment-based regional flood frequency analysis of Tripura, India. The homogeneity of the region has been tested with the heterogeneity measure (H) using the method of L-moments, and on this basis the region consisting of the six gauging sites has been found to be homogeneous. Different probability distributions, viz. generalized extreme value (GEV), generalized logistic (GLO), generalized Pareto (GPA), generalized normal (GNO), Pearson Type III (PE3), and Wakeby (WAK), have been considered for this investigation. PE3, GNO, and GEV have been identified as the candidate distributions based on the L-moment ratio diagram and the ZDIST statistic criteria. Regional growth curves for the three candidate distributions have been developed for gauged and ungauged catchments. The Monte Carlo simulation technique has also been used to estimate the accuracy of the estimated regional growth curves and quantiles. From the simulation study, the PE3 distribution has been found to be the most robust.
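
For readers unfamiliar with the L-moment machinery, the sketch below computes the sample L-moment ratios (L-CV, L-skewness, L-kurtosis) that feed the ratio diagram and heterogeneity measure; the six "sites" are synthetic stand-ins, not the Tripura records.

```python
import numpy as np

def sample_lmoments(x):
    """First four sample L-moments via unbiased probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((j - 1) * (j - 2) * (j - 3) /
                ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2 / l1, l3 / l2, l4 / l2  # mean, L-CV, L-skewness, L-kurtosis

rng = np.random.default_rng(1)
for site in range(6):  # six gauging sites, as in the study
    ams = rng.gumbel(800.0 + 50.0 * site, 200.0, size=30)
    l1, lcv, t3, t4 = sample_lmoments(ams)
    print(f"site {site + 1}: L-CV={lcv:.3f}, t3={t3:.3f}, t4={t4:.3f}")
```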


2021 ◽  
Author(s):  
Xiao Pan ◽  
Ataur Rahman

Abstract Flood frequency analysis (FFA) enables the fitting of distribution functions to observed flow data for the estimation of flood quantiles. Two main approaches, annual maximum (AM) and peaks-over-threshold (POT), are adopted for FFA. The POT approach is under-employed due to its complexity and the uncertainty associated with threshold selection and the independence criteria for selecting peak flows. This study evaluates the POT and AM approaches using data from 188 gauged stations in south-east Australia. The POT approach adopted in this study applies different average numbers of events per year, fitted with the generalised Pareto (GP) distribution using an automated threshold detection method; the POT model parameters are estimated with both the maximum likelihood estimator (MLE) and the probability weighted moments unbiased (PWMU) method. The generalised extreme value (GEV) distribution with the L-moments estimator is used for the AM approach. It has been found that there is a large difference in design flood estimates between the AM and POT approaches for smaller average recurrence intervals (ARI), with a median difference of 25% for the 1.01-year ARI and 5% for the 50- and 100-year ARIs.
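
A minimal illustration of the AM-versus-POT comparison on synthetic data follows: a GEV is fitted to annual maxima and a GP to peaks over a threshold, and quantiles are compared at small and large ARIs. The threshold rule and the use of annual exceedance probability 1 - 1/T are simplifying assumptions, not the paper's automated method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
daily = rng.gumbel(300.0, 90.0, size=(50, 365))   # 50 synthetic years of daily flows
ams = daily.max(axis=1)                           # annual maximum series

# AM approach: GEV fit (scipy's genextreme uses shape c = -xi)
c, loc, scale = stats.genextreme.fit(ams)
def q_am(T):
    return stats.genextreme.ppf(1.0 - 1.0 / T, c, loc, scale)

# POT approach: ~3 events per year on average above an empirical threshold
u = np.quantile(daily, 1.0 - 3.0 / 365.0)
exc = daily[daily > u] - u
lam = exc.size / 50.0
xi, _, sig = stats.genpareto.fit(exc, floc=0.0)
def q_pot(T):
    return u + stats.genpareto.ppf(1.0 - 1.0 / (lam * T), xi, 0.0, sig)

for T in (1.01, 2.0, 50.0, 100.0):
    am, pot = q_am(T), q_pot(T)
    print(f"ARI {T:>6}: AM={am:7.1f}  POT={pot:7.1f}  diff={100 * (am - pot) / pot:5.1f}%")
```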


Water ◽  
2020 ◽  
Vol 12 (7) ◽  
pp. 1867
Author(s):  
Chunlai Qu ◽  
Jing Li ◽  
Lei Yan ◽  
Pengtao Yan ◽  
Fang Cheng ◽  
...  

Under changing environments, the most widely used non-stationary flood frequency analysis (NFFA) method is the generalized additive models for location, scale and shape (GAMLSS) model. However, the structure of the GAMLSS model is relatively complex due to its large number of statistical parameters, and the relationship between statistical parameters and covariates is assumed to remain unchanged in the future, which may be unreasonable. In recent years, nonparametric methods have received increasing attention in the field of NFFA. Among them, the linear quantile regression (QR-L) model and the non-linear quantile regression model of cubic B-splines (QR-CB) have been introduced into NFFA studies because they do not need to determine statistical parameters or the relationship between statistical parameters and covariates. However, these two quantile regression models have difficulties in estimating non-stationary design floods, since the trend of the established model must be extrapolated infinitely to estimate design floods. In addition, the available observations become scarcer when estimating design values corresponding to higher return periods, leading to unreasonable and inaccurate design values. In this study, we propose a cubic B-spline-based GAMLSS model (GAMLSS-CB) for NFFA. In the GAMLSS-CB model, the relationship between statistical parameters and covariates is fitted by cubic B-splines under the GAMLSS model framework. We also compare the performance of the different non-stationary models, namely the QR-L, QR-CB, and GAMLSS-CB models. Finally, based on the optimal non-stationary model, the non-stationary design flood values are estimated using the average design life level (ADLL) method. The annual maximum flood series of four stations in the Weihe River basin and the Pearl River basin are taken as examples. The results show that the GAMLSS-CB model displays the best model performance compared with the QR-L and QR-CB models. Moreover, it is feasible to estimate design flood values based on the GAMLSS-CB model using the ADLL method, while the estimation of design floods based on the quantile regression models requires further study.
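
GAMLSS models are typically fitted with the R gamlss package; as a rough Python analogue of the QR-CB baseline discussed above, the sketch below fits linear quantile regression on a cubic B-spline of the covariate using statsmodels and patsy. The data and column names are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
years = np.arange(1960, 2020)
# synthetic annual maxima with an upward trend in both mean and spread
flood = rng.gumbel(500.0 + 4.0 * (years - 1960), 120.0 + 1.5 * (years - 1960))
df = pd.DataFrame({"year": years, "flood": flood})

# cubic B-spline basis of the covariate, built inside the formula by patsy's bs()
model = smf.quantreg("flood ~ bs(year, df=4, degree=3)", df)
for q in (0.5, 0.9, 0.99):
    fit = model.fit(q=q)
    print(f"tau={q}: fitted quantile in {years[-1]}: {fit.predict(df).iloc[-1]:.0f}")
```

Even on this toy series, the tau = 0.99 curve is determined by very few effective observations, which is exactly the weakness of quantile-regression NFFA for high return periods that motivates the GAMLSS-CB model.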


2018 ◽  
Vol 7 (4.35) ◽  
pp. 709 ◽  
Author(s):  
Munir Snu ◽  
Sidek L.M ◽  
Haron Sh ◽  
Noh Ns.M ◽  
Basri H ◽  
...  

The flood event that occurred in 2014 caused a disaster in Perak, and Sungai Perak, the main river of Perak, is the major natural drainage system within the state. The aim of this paper is to relate the expected discharge to the return period downstream in the Sg. Perak River Basin using annual maximum flow data. Flood frequency analysis is a technique for estimating the flow values corresponding to specific return periods or probabilities along the river at different sites. The method uses the observed annual maximum flow discharge data to calculate statistical quantities such as the standard deviation, mean, sum, skewness, and recurrence intervals. The flood frequency analysis for the Sg. Perak River Basin used the Log-Pearson Type III probability distribution method, with the annual maximum peak flow series spanning the period 1961 to 2016. The probability distribution function was applied for the return periods (T) of 2, 5, 10, 25, 50, and 100 years generally used in flood forecasting. Flood frequency curves were plotted after choosing the best-fit probability distribution for the annual maximum peak data. The results of the flood frequency analysis show that Sg. Perak at Jambatan Iskandar has a much higher discharge, 3714.45 m3/s at the 100-year return period, compared to Sg. Plus at Kg Lintang and Sg. Kinta at Weir G. On this basis, the 100-year peak flow at the Sg. Perak river mouth is estimated to be in the range of 4,000 m3/s. Overall, the analysis relates the expected flow discharge to the return period for all tributaries of the Sg. Perak River Basin.
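
A minimal sketch of the Log-Pearson Type III procedure follows: annual maxima are log-transformed, a Pearson III frequency factor is evaluated at probability 1 - 1/T, and quantiles are back-transformed. The series is synthetic, standing in for the 1961-2016 Sg. Perak record.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
ams = rng.lognormal(mean=7.0, sigma=0.5, size=56)  # stand-in for 1961-2016 annual peaks
logq = np.log10(ams)

mean, sd, skew = logq.mean(), logq.std(ddof=1), stats.skew(logq)
for T in (2, 5, 10, 25, 50, 100):
    # frequency factor K_T: standardized Pearson III deviate at probability 1 - 1/T
    K = stats.pearson3.ppf(1.0 - 1.0 / T, skew)
    print(f"T={T:>3} yr: Q = {10 ** (mean + K * sd):9.1f} m3/s")
```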


1990 ◽  
Vol 17 (4) ◽  
pp. 597-609 ◽  
Author(s):  
K. C. Ander Chow ◽  
W. E. Watt

Single-station flood frequency analysis is an important element in hydrotechnical planning and design. In Canada, no single statistical distribution has been specified for floods; hence, the conventional approach is to select a distribution based on its fit to the observed sample. This selection is not straightforward owing to typically short record lengths and attendant sampling error, magnified influence of apparent outliers, and limited evidence of two populations. Nevertheless, experienced analysts confidently select a distribution for a station based only on a few heuristics. A knowledge-based expert system has been developed to emulate these expert heuristics. It can perform data analyses, suggest an appropriate distribution, detect outliers, and provide means to justify a design flood on physical grounds. If the sample is too small to give reliable quantile estimates, the system performs a Bayesian analysis to combine regional information with station-specific data. The system was calibrated and tested for 52 stations across Canada. Its performance was evaluated by comparing the distributions selected by experts with those given by the developed system. The results indicated that the system can perform at an expert level in the task of selecting distributions. Key words: flood frequency, expert system, single-station, fuzzy logic, inductive reasoning, production system.
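
The published system encodes expert heuristics in a production-system/fuzzy-logic framework that cannot be reproduced here; as a toy stand-in, the sketch below ranks candidate distributions by the probability plot correlation coefficient (PPCC), one simple fit-based criterion. The distribution list and plotting positions are assumptions, not the Chow and Watt rule base.

```python
import numpy as np
from scipy import stats

def ppcc(sample, dist):
    """Correlation between the sorted sample and fitted-distribution quantiles."""
    x = np.sort(sample)
    n = x.size
    pp = (np.arange(1, n + 1) - 0.4) / (n + 0.2)   # Cunnane plotting positions
    params = dist.fit(x)
    return np.corrcoef(x, dist.ppf(pp, *params))[0, 1]

rng = np.random.default_rng(5)
ams = rng.gumbel(900.0, 250.0, size=35)            # a typically short record
candidates = {"GEV": stats.genextreme, "LN3": stats.lognorm,
              "P3": stats.pearson3, "Gumbel": stats.gumbel_r}
scores = {name: ppcc(ams, d) for name, d in candidates.items()}
print("suggested distribution:", max(scores, key=scores.get))
```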


2021 ◽  
Author(s):  
Lei Yan ◽  
Lihua Xiong ◽  
Gusong Ruan ◽  
Chong-Yu Xu ◽  
Mengjie Zhang

Abstract In traditional flood frequency analysis, a minimum of 30 observations is required to guarantee the accuracy of design results within an allowable uncertainty; however, no recommendation exists for the required record length in NFFA (nonstationary flood frequency analysis). Therefore, this study has been carried out with three aims: (i) to evaluate the predictive capabilities of nonstationary (NS) and stationary (ST) models with varying flood record lengths; (ii) to examine the impacts of flood record length on the NS and ST design floods and associated uncertainties; and (iii) to recommend probable requirements for flood record length in NFFA. To achieve these objectives, 20 stations in Norway with record lengths longer than 100 years were selected and investigated using both GEV (generalized extreme value)-ST models and GEV-NS models with a linearly varying location parameter (denoted GEV-NS0). The results indicate that the fitting quality and predictive capabilities of GEV-NS0 outperform those of the GEV-ST models when the record length exceeds approximately 60 years for most stations, and that the stability of both the GEV-ST and GEV-NS0 models improves as record lengths increase. Therefore, a minimum of 60 years of flood observations is recommended for NFFA for the selected basins in Norway.
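
A minimal sketch of the GEV-NS0 model named above: a GEV whose location parameter varies linearly with time, fitted by maximum likelihood on synthetic data. Starting values and optimizer settings are illustrative assumptions.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(9)
t = np.arange(100)                                  # 100 years, as in the study
ams = rng.gumbel(600.0 + 2.0 * t, 150.0)            # synthetic trend in location only

def nll(p):
    """Negative log-likelihood; p = (mu0, mu1, log_sigma, xi)."""
    mu0, mu1, logsig, xi = p
    mu = mu0 + mu1 * t                              # linearly varying location
    # scipy's genextreme uses shape c = -xi
    return -stats.genextreme.logpdf(ams, -xi, loc=mu,
                                    scale=np.exp(logsig)).sum()

res = optimize.minimize(nll, x0=[ams.mean(), 0.0, np.log(ams.std()), 0.1],
                        method="Nelder-Mead")
mu0, mu1, logsig, xi = res.x
print(f"trend in location: {mu1:.2f} m3/s per year, xi = {xi:.3f}")
```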


2021 ◽  
pp. 51-58
Author(s):  
Kajal Kumar Mandal ◽  
K. Dharanirajan ◽  
Sabyasachi Sarkar

Flood frequency analysis requires historical peak discharge data for at least 10 years. This study considers annual maximum peak discharge data for 72 years (1949 to 2020). The discharge data were collected from the Farakka Barrage gauging station (24°48'15.10" N, 87°55'52.70" E), situated in the upper part of the lower Ganga basin. The flood frequency analysis of the upper portion of the lower Ganga basin has been carried out using Gumbel's frequency distribution method, a statistical approach that predicts the flood magnitude XT for a given return period T. The discharge data were tabulated in descending order and ranks were assigned based on discharge volume. The return periods were calculated using Weibull's plotting-position formula (P). The flood frequency data were plotted on a graph with the return period on the X-axis and the discharge value on the Y-axis. The R2 value of this graph is 0.9998, which indicates that Gumbel's distribution fits the series well for flood frequency analysis. Flood frequency analysis is an essential step in assessing flood hazard.
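
The workflow described above is easy to reproduce in outline: rank the series, assign Weibull plotting positions P = m/(n + 1), and compute Gumbel quantiles X_T = mean + K_T * sd with the classical frequency factor. The sketch below uses a synthetic 72-year series in place of the Farakka record.

```python
import numpy as np

rng = np.random.default_rng(2)
q = rng.gumbel(40000.0, 8000.0, size=72)     # synthetic annual peaks, m3/s

# Weibull plotting position: rank m in descending order, P = m / (n + 1)
order = np.sort(q)[::-1]
m = np.arange(1, q.size + 1)
T_emp = (q.size + 1) / m                     # empirical return periods
print("three largest peaks:", order[:3].round(0), "empirical T:", T_emp[:3].round(1))

# Gumbel quantiles X_T = mean + K_T * sd with the classical frequency factor
mean, sd = q.mean(), q.std(ddof=1)
for T in (10, 50, 100):
    K = -np.sqrt(6.0) / np.pi * (0.5772 + np.log(np.log(T / (T - 1.0))))
    print(f"T={T:>3} yr: X_T = {mean + K * sd:9.0f} m3/s")
```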


2020 ◽  
Author(s):  
Kolbjørn Engeland ◽  
Eivind Støren ◽  
Anna Aano ◽  
Øyvind Paasche

The Glomma river is the largest in Norway, and repeated destructive floods continue to represent a major climate hazard. Area planning and dam safety assessment in Norway, including for the large catchment that feeds the Glomma, are based on estimates of design flood sizes for 200- to 1000-year return periods, despite the fact that most streamflow time series are ≤50 years long. Consequently, design flood estimates are subject to sampling uncertainty. Data other than streamflow measurements, such as historical records and lake sediment cores, can be employed not only to increase knowledge about floods but also to reduce the uncertainty in design flood estimates. By merging different data sources, it is possible to reduce the uncertainty associated with flood frequency analysis. The primary objective of this study is to combine systematic, historical, and paleo-information in a methodological effort to improve flood frequency analysis.

We approach this ambition by (i) compiling historical flood data from the existing literature, (ii) presenting high-resolution XRF, MS, and CT scanning data from a sediment core covering the last 10 000 years, and (iii) combining flood data from systematic streamflow measurements, historical sources, and lacustrine sediment cores to estimate design floods and assess non-stationarities in flood frequency.

Based on the lake sediments from Lake Flyginsjøen, which faithfully records flood events in the Glomma, we can estimate flood frequency in a moving window of 50 years. Whenever the discharge is sufficient, the floodwater crosses a local threshold and suspended sediments are deposited in the lake, providing information about how flood frequency has changed over the last 10 000 years.

The lake sediment data show that past flood frequency is non-stationary on different time scales. Periods of increased flood activity correspond broadly, on centennial time scales, to similar time series from eastern Norway and from the Alps. The flood frequency shows significant non-stationarities within periods of increased flood activity, as was the case in the 18th century. The lake data indicate that the major historical flood in 1789 is the largest on record for the last 10 000 years at this site.

The results show that the estimation of flood quantiles can benefit from the inclusion of historical and paleodata. The paleodata were particularly useful for evaluating how well the flood information in historical data represents flood frequency on longer time scales. Using the frequency of floods obtained from the paleo-flood record resulted in only minor changes in design flood estimates.

This study has shown the potential advantage of including paleoflood data, and we suggest that paleodata have a high potential for detecting links between climate and flood frequency. The data presented here can be used alone, or in combination with paleoflood data from other locations in Norway and Europe, to assess and better understand the potential links between changes in climate and the corresponding changes in flood frequency.
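
The abstract does not give the estimation details, but a common way to merge systematic and historical flood information, in the spirit of the combination described above, is a censored-data likelihood: documented historical floods enter as exact observations, and the remaining historical years contribute the probability of staying below a perception threshold. The sketch below applies this to a GEV with synthetic numbers; all values are illustrative assumptions, not the authors' data or model.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(13)
sys_ams = rng.gumbel(1500.0, 400.0, size=50)   # 50 years of gauged annual maxima
hist_floods = np.array([3900.0, 4300.0])       # documented historical peaks
h_years, X0 = 200, 3500.0                      # historical window, perception threshold

def nll(p):
    """Negative log-likelihood combining systematic, historical, and censored terms."""
    mu, logsig, xi = p
    sig = np.exp(logsig)
    c = -xi                                    # scipy's genextreme shape convention
    ll = stats.genextreme.logpdf(sys_ams, c, mu, sig).sum()
    ll += stats.genextreme.logpdf(hist_floods, c, mu, sig).sum()
    # years in the historical window with no flood above the threshold X0
    n_below = h_years - hist_floods.size
    ll += n_below * stats.genextreme.logcdf(X0, c, mu, sig)
    return -ll

res = optimize.minimize(nll, x0=[sys_ams.mean(), np.log(sys_ams.std()), 0.1],
                        method="Nelder-Mead")
mu, logsig, xi = res.x
print(f"mu={mu:.0f}, sigma={np.exp(logsig):.0f}, xi={xi:.3f}")
```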


Water ◽  
2018 ◽  
Vol 10 (8) ◽  
pp. 1016 ◽  
Author(s):  
Jianzhu Li ◽  
Yanchen Zheng ◽  
Yimin Wang ◽  
Ting Zhang ◽  
Ping Feng ◽  
...  

Historical extraordinary floods are an important factor in non-stationary flood frequency analysis, and they may occur at any time, regardless of whether the environment is changing. Based on mixed distribution (MD) modeling, this paper proposes an improved mixed distribution (IMD) model that considers the discontinuity and non-stationarity of flood samples simultaneously by incorporating historical extraordinary floods into both sub-series divided by a change point. As a case study, the annual maximum peak discharge and volume series of Ankang hydrological station, located in the upper Hanjiang River Basin of China, were selected, and their non-stationarity was identified using a variation diagnosis system. MD and IMD were used to fit the flood characteristic series, and a genetic algorithm was employed to estimate the optimal parameters. Compared with the design flood values fitted by the stationary Pearson Type III distribution, the values computed by IMD decreased at low return periods and increased at high return periods, with the difference varying from −6.67% to 7.19%. The results highlight that although the design flood values of IMD are slightly larger than those of MD at different return periods, IMD provides a better fit than MD. IMD offers a new perspective for non-stationary flood frequency analysis.
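
A minimal sketch of the mixed-distribution idea follows: a two-component Pearson III mixture F(x) = w*F1(x) + (1 - w)*F2(x) fitted with an evolutionary optimizer, with scipy's differential evolution standing in for the genetic algorithm used in the paper. The data, bounds, and the use of a plain likelihood (without IMD's historical-flood terms) are simplifying assumptions.

```python
import numpy as np
from scipy import stats
from scipy.optimize import differential_evolution

rng = np.random.default_rng(21)
# two synthetic flood populations, e.g. before/after a change point
ams = np.concatenate([rng.gamma(4.0, 300.0, 40), rng.gamma(6.0, 500.0, 30)])

def neg_loglik(p):
    """Negative log-likelihood of the two-component Pearson III mixture."""
    w, s1, m1, sc1, s2, m2, sc2 = p
    pdf = (w * stats.pearson3.pdf(ams, s1, m1, sc1) +
           (1 - w) * stats.pearson3.pdf(ams, s2, m2, sc2))
    return -np.sum(np.log(np.clip(pdf, 1e-300, None)))

bounds = [(0.05, 0.95),
          (0.1, 3), (500, 2500), (100, 1500),    # component 1: skew, loc, scale
          (0.1, 3), (1000, 5000), (100, 2500)]   # component 2: skew, loc, scale
res = differential_evolution(neg_loglik, bounds, seed=0, tol=1e-6)
print("mixing weight w =", round(res.x[0], 3))
```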


2013 ◽  
Vol 663 ◽  
pp. 768-772
Author(s):  
Li Jie Zhang

The evaluation and reduction of uncertainty is central to the task of hydrological frequency analysis. In this paper, a Bayesian Markov chain Monte Carlo (MCMC) method is employed to infer the parameter values of the probability distribution model and to evaluate the uncertainties of the design flood. Compared with the estimates from the three-parameter log-normal distribution (LN3) and the three-parameter generalized extreme value distribution (GEV), the Pearson Type III distribution (PIII) provides a good approximation to the flood-flow data. Choosing an appropriate probabilistic model can reduce the uncertainty of design flood estimation, and this uncertainty can be further reduced by incorporating past extreme historical flood data into the flood frequency analysis.
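
As an illustration of the Bayesian MCMC step, the sketch below runs a random-walk Metropolis sampler over Pearson III parameters for a synthetic flood series and summarizes the posterior 100-year quantile; the priors, step sizes, and chain length are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)
ams = rng.gamma(5.0, 400.0, size=60)           # synthetic annual peaks

def log_post(p):
    """Log-posterior: flat priors on a plausible box, Pearson III likelihood."""
    skew, loc, scale = p
    if not (0.01 < skew < 5.0 and scale > 1.0):
        return -np.inf
    return stats.pearson3.logpdf(ams, skew, loc, scale).sum()

chain = np.empty((5000, 3))
p = np.array([1.0, ams.mean(), ams.std()])
lp = log_post(p)
step = np.array([0.1, 50.0, 30.0])             # random-walk proposal scales
for i in range(len(chain)):
    prop = p + step * rng.standard_normal(3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        p, lp = prop, lp_prop
    chain[i] = p

post = chain[1000:]                            # drop burn-in
q100 = stats.pearson3.ppf(0.99, *post.mean(axis=0))
print(f"posterior-mean 100-yr quantile: {q100:.0f}")
```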

