Flood frequency analysis: assumptions and alternatives

2005 ◽  
Vol 29 (3) ◽  
pp. 392-410 ◽  
Author(s):  
R. Kidson ◽  
K. S. Richards

Flood frequency analysis (FFA) is a form of risk analysis, yet a risk analysis of the activity of FFA itself is rarely undertaken. The recent literature of FFA has been characterized by: (1) a proliferation of mathematical models, lacking theoretical hydrologic justification, but used to extrapolate the return periods of floods beyond the gauged record; (2) official mandating of particular models, which has resulted in (3) research focused on increasingly reductionist and statistically sophisticated procedures for parameter fitting to these models from the limited gauged data. These trends have evolved to such a refined state that FFA may be approaching the ‘limits of splitting’; at the very least, the emphasis was shifted early in the history of FFA from predicting and explaining extreme flood events to the more soluble issue of fitting distributions to the bulk of the data. However, recent evidence indicates that the very modelling basis itself may be ripe for revision. Self-similar (power law) models are not only analytically simpler than conventional models, but they also offer a plausible theoretical basis in complexity theory. Of most significance, however, is the empirical evidence for self-similarity in flood behaviour. Self-similarity is difficult to detect in gauged records of limited length; however, one positive aspect of the application of statistics to FFA has been the refinement of techniques for the incorporation of historical and palaeoflood data. It is these data types, even over modest timescales such as 100 years, which offer the best promise for testing alternative models of extreme flood behaviour across a wider range of basins. At stake is the accurate estimation of flood magnitude, used widely for design purposes: the power law model produces far more conservative estimates of return period of large floods compared to conventional models, and deserves closer study.
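The contrast between conventional and self-similar models can be made concrete with a short numerical sketch. The following Python fragment is illustrative only and is not from the paper; the synthetic series, the choice of Gumbel as the conventional model, and the log-log fitting procedure are all assumptions. It fits both a Gumbel distribution and a power-law exceedance tail to the same annual maximum series and compares the implied return periods of large floods.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
ams = rng.gumbel(loc=500.0, scale=150.0, size=60)        # synthetic annual maxima, m^3/s

# Conventional model: Gumbel fitted by maximum likelihood.
loc, scale = stats.gumbel_r.fit(ams)

# Self-similar model: exceedance probability P(Q > q) ~ c * q^(-alpha),
# fitted by least squares in log-log space to Weibull plotting positions.
q_sorted = np.sort(ams)[::-1]
p_exc = np.arange(1, len(ams) + 1) / (len(ams) + 1.0)
slope, intercept = np.polyfit(np.log(q_sorted), np.log(p_exc), 1)   # slope = -alpha

def return_period_gumbel(q):
    return 1.0 / stats.gumbel_r.sf(q, loc, scale)

def return_period_powerlaw(q):
    return 1.0 / np.exp(intercept + slope * np.log(q))

for q in (1000.0, 1500.0, 2000.0):
    print(f"Q = {q:6.0f} m^3/s   T_Gumbel = {return_period_gumbel(q):12.1f} yr   "
          f"T_power_law = {return_period_powerlaw(q):8.1f} yr")

Because the power-law tail decays more slowly, it typically assigns shorter return periods, and hence more conservative risk estimates, to the largest discharges, which is the behaviour highlighted in the abstract.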

2014 ◽  
Vol 14 (5) ◽  
pp. 1283-1298 ◽  
Author(s):  
D. Lawrence ◽  
E. Paquet ◽  
J. Gailhard ◽  
A. K. Fleig

Abstract. Simulation methods for extreme flood estimation represent an important complement to statistical flood frequency analysis because a spectrum of catchment conditions potentially leading to extreme flows can be assessed. In this paper, stochastic, semi-continuous simulation is used to estimate extreme floods in three catchments located in Norway, all of which are characterised by flood regimes in which snowmelt often has a significant role. The simulations are based on SCHADEX, which couples a precipitation probabilistic model with a hydrological simulation such that an exhaustive set of catchment conditions and responses is simulated. The precipitation probabilistic model is conditioned by regional weather patterns, and a bottom–up classification procedure was used to define a set of weather patterns producing extreme precipitation in Norway. SCHADEX estimates for the 1000-year (Q1000) discharge are compared with those of several standard methods, including event-based and long-term simulations which use a single extreme precipitation sequence as input to a hydrological model, statistical flood frequency analysis based on the annual maximum series, and the GRADEX method. The comparison suggests that the combination of a precipitation probabilistic model with a long-term simulation of catchment conditions, including snowmelt, produces estimates for given return periods which are more in line with those based on statistical flood frequency analysis, as compared with the standard simulation methods, in two of the catchments. In the third case, the SCHADEX method gives higher estimates than statistical flood frequency analysis and further suggests that the seasonality of the most likely Q1000 events differs from that of the annual maximum flows. The semi-continuous stochastic simulation method highlights the importance of considering the joint probability of extreme precipitation, snowmelt rates and catchment saturation states when assigning return periods to floods estimated by precipitation-runoff methods. The SCHADEX methodology, as applied here, is dependent on observed discharge data for calibration of a hydrological model, and further study to extend its application to ungauged catchments would significantly enhance its versatility.
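The core idea of the semi-continuous approach, combining a probabilistic description of extreme precipitation with independently sampled catchment states, can be illustrated with a toy Monte Carlo sketch. This is not the SCHADEX code; the weather-pattern probabilities, the distributions, and the runoff relation below are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)
n_events = 200_000

# Weather patterns with occurrence probabilities and pattern-specific
# (exponential) tails of daily precipitation [mm] -- illustrative values.
pattern_prob = np.array([0.7, 0.2, 0.1])          # "ordinary", "wet", "extreme-prone"
pattern_scale = np.array([10.0, 25.0, 45.0])
patterns = rng.choice(3, size=n_events, p=pattern_prob)
precip = rng.exponential(pattern_scale[patterns])

# Catchment states sampled as if from a long continuous simulation:
snowmelt = rng.gamma(shape=1.5, scale=4.0, size=n_events)   # mm/day
saturation = rng.beta(2.0, 2.0, size=n_events)              # 0 = dry, 1 = saturated

# Toy runoff relation: effective input scaled by saturation state.
runoff_coeff = 0.2 + 0.7 * saturation
peak = runoff_coeff * (precip + snowmelt) * 3.0             # pseudo m^3/s

# Empirical return period, assuming one simulated event per year
# (a simplification; SCHADEX weights events properly).
def return_period(q):
    p_exc = np.mean(peak > q)
    return np.inf if p_exc == 0 else 1.0 / p_exc

for q in (150.0, 250.0, 350.0):
    print(f"Q = {q:5.0f}   T ≈ {return_period(q):10.1f} yr")

The point of the construction is that the return period assigned to a flood peak reflects the joint probability of heavy precipitation, snowmelt and catchment saturation, rather than the probability of the precipitation alone.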


2020 ◽  
Vol 6 (12) ◽  
pp. 2425-2436
Author(s):  
Andy Obinna Ibeje ◽  
Ben N. Ekwueme

Hydrologic designs require accurate estimation of quantiles of extreme floods, but in many developing regions records of flood data are seldom available. A model framework using the dimensionless index flood for the transfer of the Flood Frequency Curve (FFC) among stream gauging sites in a hydrologically homogeneous region is proposed. Key elements of the model framework include: (1) confirmation of the homogeneity of the region; (2) estimation of the index flood-basin area relation; (3) derivation of the regional flood frequency curve (RFFC) and deduction of the FFC of an ungauged catchment as the product of the index flood and the dimensionless RFFC. As an application, annual extreme floods from 1983 to 2004 at six selected gauging sites in the Anambra-Imo River basin of southeast Nigeria were used to demonstrate that the developed index flood model overestimated flood quantiles at an ungauged site of the basin. It is recommended that, for wider application, the model results be improved by the availability and use of over 100 years of flood data spatially distributed at critical locations of the watershed. DOI: 10.28991/cej-2020-03091627
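The transfer logic of the framework, scaling by an index flood, pooling dimensionless data into a regional growth curve, and rescaling at an ungauged site, can be sketched as follows. The Python below uses synthetic data; the paper's fitted index flood-area relation is not reproduced, and the empirical growth curve stands in for a fitted regional distribution.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical gauged sites: basin area [km^2] and annual-maximum series [m^3/s].
areas = np.array([250.0, 600.0, 1200.0, 2100.0])
ams_by_site = [rng.gumbel(loc=0.12 * a, scale=0.04 * a, size=22) for a in areas]

# (1)-(3a) Index floods (mean annual flood), dimensionless pooled sample,
# and an empirical regional growth curve at chosen non-exceedance probabilities.
index_floods = np.array([ams.mean() for ams in ams_by_site])
pooled = np.concatenate([ams / qbar for ams, qbar in zip(ams_by_site, index_floods)])
probs = np.array([0.5, 0.8, 0.9, 0.95, 0.98, 0.99])
growth_curve = np.quantile(pooled, probs)

# (2) Index flood vs basin area: log-log regression Qbar = c * A^b (placeholder relation).
b, log_c = np.polyfit(np.log(areas), np.log(index_floods), 1)
area_ungauged = 800.0                                   # km^2, hypothetical ungauged site
index_ungauged = np.exp(log_c) * area_ungauged ** b

# (3b) Flood frequency curve at the ungauged site = index flood x growth curve.
ffc_ungauged = index_ungauged * growth_curve
for p, q in zip(probs, ffc_ungauged):
    print(f"T = {1/(1-p):5.0f} yr   Q ≈ {q:7.1f} m^3/s")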


2020 ◽  
Author(s):  
Alexandra Fedorova ◽  
Nataliia Nesterova ◽  
Olga Makarieva ◽  
Andrey Shikhov

In June 2019, an extreme flash flood formed on rivers of the Irkutsk region originating in the East Sayan mountains. This flood became the most hazardous in the region in the 80-year history of observations.

The greatest rise in water level was recorded on the Iya River in the town of Tulun (more than 9 m in three days). The recorded water level was more than 5 m above the danger mark of 850 cm and more than 2.5 m above the historical maximum water level observed in 1984.

The flood led to catastrophic inundation of the town of Tulun; 25 people died and 8 went missing. According to a preliminary assessment, economic damage from the 2019 flood amounted to about half a billion euros.

Among the reasons discussed for the extreme flood of June 2019 are heavy rains as a result of climate change, melting of snow and glaciers in the East Sayan mountains, and deforestation of river basins due to clear-cutting and fires.

The aim of the study was to analyze the factors that led to the formation of the catastrophic flood of June 2019, as well as to estimate the maximum discharge of the Iya River. For the calculations, the deterministic distributed hydrological model Hydrograph was applied, using observed data from meteorological stations and forecast values from the global weather forecast model ICON. The estimated discharge exceeded the previously observed maximum by about 50%.

The results of the study show that the recent flood damage was caused mainly by unprepared infrastructure. The protective dam built in the town of Tulun just ten years ago was 2 m lower than the maximum water level observed in 2019. This and many other cases in Russia suggest that flood frequency analysis of even long-term historical data may mislead design engineers into significantly underestimating the probability and magnitude of flash floods. There is evidence of observed transformations of the precipitation regime that directly contribute to the formation of dangerous hydrological phenomena. The details of the study for the Irkutsk region will be presented.


2013 ◽  
Vol 1 (6) ◽  
pp. 6785-6828 ◽  
Author(s):  
D. Lawrence ◽  
E. Paquet ◽  
J. Gailhard ◽  
A. K. Fleig

Abstract. Simulation methods for extreme flood estimation represent an important complement to statistical flood frequency analysis because a spectrum of catchment conditions potentially leading to extreme flows can be assessed. In this paper, stochastic, semi-continuous simulation is used to estimate extreme floods in three catchments located in Norway, all of which are characterised by flood regimes in which snowmelt often has a significant role. The simulations are based on SCHADEX, which couples a precipitation probabilistic model with a hydrological simulation such that an exhaustive set of catchment conditions and responses is simulated. The precipitation probabilistic model is conditioned by regional weather patterns, and a "bottom-up" classification procedure was used to define a set of weather patterns producing extreme precipitation in Norway. SCHADEX estimates for the 1000 yr (Q1000) discharge are compared with those of several standard methods, including event-based and long-term simulations which use a single extreme precipitation sequence as input to a hydrological model, with statistical flood frequency analysis based on the annual maximum series, and with the GRADEX method. The comparison suggests that the combination of a precipitation probabilistic model with a long-term simulation of catchment conditions, including snowmelt, produces estimates for given return periods which are more in line with those based on statistical flood frequency analysis, as compared with the standard simulation methods, in two of the catchments. In the third case, the SCHADEX method gives higher estimates than statistical flood frequency analysis and further suggests that the seasonality of the most likely Q1000 events differs from that of the annual maximum flows. The semi-continuous stochastic simulation method highlights the importance of considering the joint probability of extreme precipitation, snowmelt rates and catchment saturation states when assigning return periods to floods estimated by precipitation-runoff methods. The SCHADEX methodology, as applied here, is dependent on observed discharge data for calibration of a hydrological model, and further study to extend its application to ungauged catchments would significantly enhance its versatility.


2019 ◽  
Vol 23 (1) ◽  
pp. 107-124 ◽  
Author(s):  
Manuela I. Brunner ◽  
Reinhard Furrer ◽  
Anne-Catherine Favre

Abstract. Floods often affect not only a single location, but also a whole region. Flood frequency analysis should therefore be undertaken at a regional scale, which requires consideration of the dependence of events at different locations. This dependence is often neglected even though its consideration is essential to derive reliable flood estimates. A model used in regional multivariate frequency analysis should ideally consider the dependence of events at multiple sites, which might show dependence in the lower and/or upper tail of the distribution. We here seek to propose a simple model that on the one hand considers this dependence with respect to the network structure of the region and on the other hand allows for the simulation of stochastic event sets at both gauged and ungauged locations. The new Fisher copula model is used to represent the spatial dependence of flood events in the nested Thur catchment in Switzerland. Flood event samples generated for the gauged stations using the Fisher copula are compared to samples generated by other dependence models for multivariate data, including elliptical copulas, R-vine copulas, and max-stable models. The comparison of the dependence structures of the generated samples shows that the Fisher copula is a suitable model for capturing the spatial dependence in the data. We therefore apply the copula in an interpolation context to simulate event sets comprising both gauged and ungauged locations. The spatial event sets generated using the Fisher copula capture well the general dependence structure in the data and the upper tail dependence, which is of particular interest when looking at extreme flood events and when extrapolating to higher return periods. For a medium-sized catchment, the Fisher copula was found to be a suitable model for the stochastic simulation of flood event sets at multiple gauged and ungauged locations.
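The general simulation workflow, sampling from a copula and transforming through site-specific marginal distributions, can be sketched in a few lines. Since the Fisher copula is not available in standard Python libraries, the sketch below substitutes a Gaussian copula purely to show the mechanics; note that the Gaussian copula lacks the upper tail dependence that motivates the Fisher copula in the paper, and the correlation matrix and GEV margins are assumed values.

import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_events, n_sites = 10_000, 3

# Hypothetical spatial correlation (e.g. reflecting the nested network structure)
# and site-specific GEV margins for the flood peaks.
corr = np.array([[1.0, 0.7, 0.5],
                 [0.7, 1.0, 0.6],
                 [0.5, 0.6, 1.0]])
margins = [stats.genextreme(c=-0.10, loc=120, scale=40),
           stats.genextreme(c=-0.10, loc=300, scale=90),
           stats.genextreme(c=-0.05, loc=80, scale=25)]

# 1) correlated Gaussian sample, 2) transform to uniforms (the copula sample),
# 3) map each column through its marginal inverse CDF to obtain flood peaks.
z = rng.multivariate_normal(mean=np.zeros(n_sites), cov=corr, size=n_events)
u = stats.norm.cdf(z)
peaks = np.column_stack([margins[j].ppf(u[:, j]) for j in range(n_sites)])

# Empirical pairwise Spearman correlation of the simulated event set.
rho, _ = stats.spearmanr(peaks)
print(np.round(rho, 2))

Replacing the Gaussian copula with a dependence model that has upper tail dependence (such as the Fisher copula used in the paper) changes how often the largest peaks co-occur across sites, which is exactly the property of interest for regional flood risk.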


2021 ◽  
Author(s):  
Nur Amalina Mat Jan ◽  
Ani Shabri ◽  
Muhammad Fadhil Marsani ◽  
Basri Badyalina

Abstract. Non-stationarity in hydrological records is a significant concern within the field of flood risk management. Ignoring non-stationary behaviour in flood series results in substantially biased flood quantiles. Hence, non-stationary flood frequency analysis appears to be an appropriate option when the independent and identically distributed (IID) assumption is not satisfied by the sample observations. This paper utilizes the Generalized Extreme Value (GEV) distribution to analyze extreme flood series. The time-varying moment technique, using the L-moment and TL-moment methods, is employed to estimate the non-stationary models (GEV 1, GEV 2, and GEV 3) for the flood series. The ADF test, Mann-Kendall trend test, and Spearman's Rho test showed that two out of ten streamflow stations in Johor, Malaysia, exhibit non-stationary behaviour in the annual maximum streamflow. Results from the simulation study demonstrate consistent performance of the non-stationary models. Furthermore, the TL-moments method can efficiently estimate flood quantiles at the higher return periods.
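For reference, the stationary building block of such an analysis, a GEV fitted by the method of L-moments and evaluated at high return periods, can be sketched as follows (Hosking's approximation). The paper's time-varying models and TL-moment estimators are not reproduced here, and the data are synthetic.

import numpy as np
from scipy.special import gamma as gamma_fn

def sample_lmoments(x):
    """First two sample L-moments and L-skewness via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2

def gev_from_lmoments(l1, l2, t3):
    """GEV parameters (location xi, scale alpha, shape k) from L-moments (Hosking, 1990)."""
    c = 2.0 / (3.0 + t3) - np.log(2.0) / np.log(3.0)
    k = 7.8590 * c + 2.9554 * c ** 2
    alpha = l2 * k / ((1.0 - 2.0 ** (-k)) * gamma_fn(1.0 + k))
    xi = l1 - alpha * (1.0 - gamma_fn(1.0 + k)) / k
    return xi, alpha, k

def gev_quantile(F, xi, alpha, k):
    """Quantile x(F) = xi + (alpha / k) * (1 - (-ln F)^k)."""
    return xi + alpha / k * (1.0 - (-np.log(F)) ** k)

rng = np.random.default_rng(3)
ams = rng.gumbel(loc=200.0, scale=60.0, size=40)        # synthetic annual maxima

xi, alpha, k = gev_from_lmoments(*sample_lmoments(ams))
for T in (10, 50, 100, 1000):
    print(f"T = {T:5d} yr   Q_T ≈ {gev_quantile(1 - 1/T, xi, alpha, k):7.1f} m^3/s")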


2016 ◽  
Author(s):  
Tomohiro Tanaka ◽  
Yasuto Tachikawa ◽  
Yutaka Ichikawa ◽  
Kazuaki Yorozu

Abstract. The design flood, the river discharge with a particular return period, is fundamental to determining the scale of flood control facilities. In addition, under a changing climate, not only the frequencies of river discharge at the design level but also those of devastating flooding are crucial. Characteristics of river discharge during extreme floods differ greatly from those during other floods because of upstream dam operation and/or river overflow; however, it is difficult for flood frequency analysis (FFA) based on past discharge data to represent such impacts because river basins rarely experience floods above the design level after river improvement and dam construction. To account for these impacts on extreme flood frequencies, this study presents a rainfall-based flood frequency model (RFFM) that derives flood frequencies from probabilistic rainfall modelling, which empirically represents the probabilistic structure of rainfall intensity over a catchment by directly using observed spatio-temporal rainfall profiles. The RFFM was applied to the Yodo River basin, Japan, and flood frequency estimates from the RFFM were shown to represent past flood frequencies at the Hirakata gauging station well. Furthermore, the RFFM showed that the return periods of large flood peaks are estimated at extremely large values, reflecting the decrease in discharge caused by inundation in an area upstream of the gauging station. On the other hand, FFA based on past discharge data did not represent this impact because such huge flood peaks have not been experienced within the observation period. This study demonstrates the importance of the RFFM for flood frequency estimation, including floods exceeding the design level.
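The mechanism described, discharge that saturates during the largest events because of upstream inundation, and the resulting inflation of the return periods estimated for large peaks, can be illustrated with a toy rainfall-based Monte Carlo sketch. This is not the authors' RFFM; the rainfall distribution, the runoff relation and the overflow threshold below are invented for illustration.

import numpy as np

rng = np.random.default_rng(11)
n_years = 100_000

# Annual-maximum catchment rainfall [mm] from a hypothetical probabilistic model.
rain = rng.gumbel(loc=80.0, scale=30.0, size=n_years)

# Simple runoff relation with upstream retention: above a threshold, part of the
# flow spills onto the floodplain and the downstream peak grows only slowly.
raw_peak = 12.0 * rain                                  # pseudo m^3/s
threshold = 1500.0
peak = np.where(raw_peak <= threshold,
                raw_peak,
                threshold + 0.15 * (raw_peak - threshold))

def return_period(q):
    """Empirical return period from the simulated annual maxima."""
    exceed = np.sum(peak > q)
    return np.inf if exceed == 0 else n_years / exceed

for q in (1200.0, 1600.0, 1800.0):
    print(f"Q = {q:6.0f} m^3/s   T ≈ {return_period(q):10.0f} yr")

Because the simulated peaks rarely exceed the retention threshold, the return period jumps sharply for discharges above it, whereas an FFA fitted only to the observed (sub-threshold) record cannot reveal this behaviour.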


2018 ◽  
Author(s):  
Manuela I. Brunner ◽  
Reinhard Furrer ◽  
Anne-Catherine Favre

Abstract. Floods often affect not only a single location but a whole region. Flood frequency analysis should therefore be undertaken at a regional scale, which requires consideration of the dependence of events at different locations. This dependence is often neglected even though its consideration is essential to derive reliable flood estimates. A model used in regional multivariate frequency analysis should ideally consider the dependence of events at multiple sites, which might show dependence in the lower and/or upper tail of the distribution. We here seek to propose a simple model that on the one hand considers this dependence with respect to the network structure of the region and on the other hand allows for the simulation of stochastic event sets at both gauged and ungauged locations. The new Fisher copula model is used to represent the spatial dependence of flood events in the nested Thur catchment in Switzerland. Flood event samples generated for the gauged stations using the Fisher copula are compared to samples generated by other dependence models for multivariate data, including elliptical copulas, R-vine copulas, and max-stable models. The comparison of the dependence structures of the generated samples shows that the Fisher copula is a suitable model for capturing the spatial dependence in the data. We therefore apply the copula in an interpolation context to simulate event sets comprising both gauged and ungauged locations. The spatial event sets generated using the Fisher copula capture well the general dependence structure in the data and the upper tail dependence, which is of particular interest when looking at extreme flood events and when extrapolating to higher return periods. The Fisher copula is therefore a suitable model for the stochastic simulation of flood event sets at multiple gauged and ungauged locations.


2013 ◽  
Vol 61 (4) ◽  
pp. 326-333 ◽  
Author(s):  
Pavla Pekárová ◽  
Dana Halmová ◽  
Veronika Bačová Mitková ◽  
Pavol Miklánek ◽  
Ján Pekár ◽  
...  

Abstract. In this paper we focus on the history of floods and extreme flood frequency analysis of the upper Danube River at Bratislava. Firstly, we briefly describe the flood marks found on the Danube River in the region of Bratislava, Slovakia, and provide an account of the floods' consequences. Secondly, we analyze the annual maximum discharge series for the period 1876-2012, including the most recent flood of June 2013. Thirdly, we compare the values of the T-year design discharge computed with and without incorporating the historic floods (floods of the years 1501, 1682, and 1787) into the 138-year series of annual discharge peaks. Unfortunately, only a few historic flood marks are preserved in Bratislava, but there are very important old marks in neighbouring Hainburg and in other Austrian cities upstream as far as Passau. The T-year maximum discharges of the Danube at Bratislava calculated without and with the historic flood values were compared. Our analysis showed that, without incorporating the historic floods of 1501, 1682, and 1787, the 1000-year discharge calculated only with data from the instrumented period 1876-2013 is 14,188 m³ s⁻¹, which is lower than the 1000-year discharge of 14,803 m³ s⁻¹ obtained when the three historic floods are included. In general, the T-year discharge is higher throughout the whole spectrum of return periods (10-, 20-, 50-, 100-, 200-, and 500-year discharges) when the three historic floods are included. Incorporating historic floods into a time series of annual maximum discharge thus exerts a significant effect on the estimates of low-probability floods. This has important implications for flood management and the estimation of the flood design discharge.
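The direction of the effect reported here can be illustrated with a minimal sketch that fits a distribution to an instrumental annual maximum series alone and then to the series naively augmented with a few large historical peaks. The numbers below are synthetic (not the Danube record), a Gumbel distribution is used purely for simplicity, and rigorous incorporation of historical floods uses censored-data or adjusted plotting-position methods rather than simple pooling.

import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
instrumental = rng.gumbel(loc=6000.0, scale=1500.0, size=138)   # synthetic AMS, m^3/s
historical = np.array([13000.0, 12500.0, 11800.0])              # hypothetical historic peaks

def q_T(ams, T):
    """T-year discharge from a maximum-likelihood Gumbel fit."""
    loc, scale = stats.gumbel_r.fit(ams)
    return stats.gumbel_r.ppf(1.0 - 1.0 / T, loc, scale)

for T in (100, 500, 1000):
    q_without = q_T(instrumental, T)
    q_with = q_T(np.concatenate([instrumental, historical]), T)
    print(f"T = {T:4d} yr   without historic: {q_without:8.0f}   with historic: {q_with:8.0f} m^3/s")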

