TL-Moments Approach: Application of non-Stationary GEV Model in Flood Frequency Analysis

Author(s):  
Nur Amalina Mat Jan ◽  
Ani Shabri ◽  
Muhammad Fadhil Marsani ◽  
Basri Badyalina

Abstract Non-stationarity in hydrological records is an area of significant concern in flood risk management. Ignoring non-stationary behaviour in a flood series results in substantial bias in the estimated flood quantiles, so non-stationary flood frequency analysis is an appropriate way to retain the independent and identically distributed (IID) assumption for the sample observations. This paper uses the Generalized Extreme Value (GEV) distribution to analyze extreme flood series. The time-varying moment technique, namely the L-moment and TL-moment methods, is employed to estimate the non-stationary models (GEV 1, GEV 2, and GEV 3) for the flood series. The ADF test, Mann-Kendall trend test, and Spearman's Rho test showed that two out of ten streamflow stations in Johor, Malaysia exhibited non-stationary behaviour in the annual maximum streamflow. Results from the simulation study demonstrate consistent performance of the non-stationary models. Furthermore, the TL-moments method efficiently estimates flood quantiles at the higher return periods.
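
As a rough, self-contained illustration of the moment-based estimation step described in this abstract (not the authors' code, and with a purely hypothetical annual-maximum series `ams`), the sketch below computes sample L-moments and converts them to GEV parameters using Hosking's standard approximation; the TL-moment variant trims extreme sample values and uses correspondingly modified parameter relations, which are not shown here.

```python
import numpy as np
from math import gamma, log

def sample_l_moments(x):
    """First three sample L-moments via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2                      # mean, L-scale, L-skewness

def gev_from_l_moments(l1, l2, t3):
    """GEV (xi, alpha, k) from L-moments using Hosking's approximation."""
    c = 2.0 / (3.0 + t3) - log(2.0) / log(3.0)
    k = 7.8590 * c + 2.9554 * c ** 2
    alpha = l2 * k / ((1.0 - 2.0 ** (-k)) * gamma(1.0 + k))
    xi = l1 - alpha * (1.0 - gamma(1.0 + k)) / k
    return xi, alpha, k

def gev_quantile(T, xi, alpha, k):
    """Flood quantile for return period T (Hosking's sign convention for k)."""
    return xi + alpha / k * (1.0 - (-np.log(1.0 - 1.0 / T)) ** k)

# Hypothetical annual-maximum streamflow series (m^3/s), for illustration only.
ams = np.array([412., 560., 388., 731., 295., 640., 505., 820., 377., 458.,
                690., 350., 530., 610., 470., 720., 398., 585., 440., 665.])
xi, alpha, k = gev_from_l_moments(*sample_l_moments(ams))
print(f"100-year quantile: {gev_quantile(100, xi, alpha, k):.0f} m3/s")
```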

2014 ◽  
Vol 14 (5) ◽  
pp. 1283-1298 ◽  
Author(s):  
D. Lawrence ◽  
E. Paquet ◽  
J. Gailhard ◽  
A. K. Fleig

Abstract. Simulation methods for extreme flood estimation represent an important complement to statistical flood frequency analysis because a spectrum of catchment conditions potentially leading to extreme flows can be assessed. In this paper, stochastic, semi-continuous simulation is used to estimate extreme floods in three catchments located in Norway, all of which are characterised by flood regimes in which snowmelt often has a significant role. The simulations are based on SCHADEX, which couples a precipitation probabilistic model with a hydrological simulation such that an exhaustive set of catchment conditions and responses is simulated. The precipitation probabilistic model is conditioned by regional weather patterns, and a bottom–up classification procedure was used to define a set of weather patterns producing extreme precipitation in Norway. SCHADEX estimates for the 1000-year (Q1000) discharge are compared with those of several standard methods, including event-based and long-term simulations which use a single extreme precipitation sequence as input to a hydrological model, statistical flood frequency analysis based on the annual maximum series, and the GRADEX method. The comparison suggests that the combination of a precipitation probabilistic model with a long-term simulation of catchment conditions, including snowmelt, produces estimates for given return periods which are more in line with those based on statistical flood frequency analysis, as compared with the standard simulation methods, in two of the catchments. In the third case, the SCHADEX method gives higher estimates than statistical flood frequency analysis and further suggests that the seasonality of the most likely Q1000 events differs from that of the annual maximum flows. The semi-continuous stochastic simulation method highlights the importance of considering the joint probability of extreme precipitation, snowmelt rates and catchment saturation states when assigning return periods to floods estimated by precipitation-runoff methods. The SCHADEX methodology, as applied here, is dependent on observed discharge data for calibration of a hydrological model, and further study to extend its application to ungauged catchments would significantly enhance its versatility.
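
SCHADEX itself is not reproduced here; the following toy Monte Carlo sketch only illustrates the underlying idea of sampling extreme precipitation jointly with catchment states (snowmelt and saturation) and reading a 1000-year discharge off the simulated annual maxima. Every distribution, the runoff relation, and the catchment area are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(42)
n_years, days_per_year = 10_000, 365
n_events = n_years * days_per_year

# Toy probabilistic precipitation model: heavy-tailed daily rainfall (mm).
rain = rng.gumbel(loc=5.0, scale=8.0, size=n_events).clip(min=0.0)

# Toy catchment states standing in for the continuous hydrological simulation:
# snowmelt input (mm/day) and soil saturation (0..1), sampled per event.
snowmelt = rng.exponential(scale=3.0, size=n_events)
saturation = rng.beta(2.0, 2.0, size=n_events)

# Hypothetical runoff response: wetter catchments convert more of the combined
# rain + melt input into peak discharge over a 500 km^2 area (1 mm/day over
# 1 km^2 is roughly 0.0116 m^3/s, i.e. divide the depth-area product by 86.4).
area_km2 = 500.0
runoff_coeff = 0.2 + 0.6 * saturation
peaks = runoff_coeff * (rain + snowmelt) * area_km2 / 86.4     # m^3/s

# Read the 1000-year discharge off the simulated annual maxima.
annual_max = peaks.reshape(n_years, days_per_year).max(axis=1)
q1000 = np.quantile(annual_max, 1.0 - 1.0 / 1000.0)
print(f"toy Q1000 estimate: {q1000:.0f} m3/s")
```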


2019 ◽  
Vol 79 ◽  
pp. 03022
Author(s):  
Shangwen Jiang ◽  
Ling Kang

Under a changing environment, the streamflow series of the Yangtze River have undergone great changes, which has raised widespread concern. In this study, the annual maximum flow (AMF) series at the Yichang station were used for flood frequency analysis, in which a time-varying model was constructed to account for non-stationarity. The generalized extreme value (GEV) distribution was adopted to fit the AMF series, and the Generalized Additive Models for Location, Scale and Shape (GAMLSS) framework was applied for parameter estimation. The non-stationary return period and risk of failure were calculated and compared between the stationary and non-stationary models for flood risk assessment. The results demonstrated that the flow regime at the Yichang station has changed over time and that a decreasing trend was detected in the AMF series. The design flood peak for a given return period decreased under the non-stationary model, and the risk of failure for a given design life was also smaller, indicating a safer future flood condition compared with the stationary model. The conclusions of this study may contribute to long-term decision making in the Yangtze River basin under non-stationary conditions.
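
A minimal sketch of the risk-of-failure calculation under non-stationarity, assuming a hypothetical GEV whose location parameter declines linearly over time (illustrative numbers only, not the fitted Yichang model): the risk over a design life of n years is R = 1 − ∏(1 − p_t), with p_t the annual exceedance probability in year t.

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical time-varying GEV for the annual maxima: the location parameter
# decreases linearly with time, mimicking the downward trend in the AMF series.
def gev_params(year):
    mu = 52000.0 - 80.0 * year          # m^3/s, illustrative only
    sigma, c = 8000.0, -0.1             # scipy's shape c is minus Hosking's k
    return mu, sigma, c

design_flood = 70000.0                  # candidate design peak (m^3/s)
design_life = 50                        # years

# Risk of failure over the design life: R = 1 - prod_t(1 - p_t), where p_t is
# the annual exceedance probability of the design flood in year t.
p_t = []
for t in range(design_life):
    mu, sigma, c = gev_params(t)
    p_t.append(genextreme.sf(design_flood, c, loc=mu, scale=sigma))
risk = 1.0 - np.prod(1.0 - np.array(p_t))
print(f"risk of failure over {design_life} years: {risk:.3f}")
```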


2020 ◽  
Author(s):  
Alexandra Fedorova ◽  
Nataliia Nesterova ◽  
Olga Makarieva ◽  
Andrey Shikhov

In June 2019, an extreme flash flood formed on the rivers of the Irkutsk region originating from the East Sayan mountains. This flood became the most hazardous in the region in the 80-year history of observations.

The greatest rise in water level was recorded on the Iya River in the town of Tulun (more than 9 m in three days). The recorded water level was more than 5 m above the dangerous mark of 850 cm and more than 2.5 m above the historical maximum water level observed in 1984.

The flood led to the catastrophic inundation of the town of Tulun; 25 people died and 8 went missing. According to a preliminary assessment, economic damage from the flood in 2019 reached up to half a billion euros.

The reasons discussed for the extreme flood of June 2019 include heavy rains associated with climate change, melting of snow and glaciers in the East Sayan mountains, and deforestation of river basins due to clear-cutting and fires.

The aim of the study was to analyze the factors that led to the formation of the catastrophic flood of June 2019, as well as to estimate the maximum discharge of the Iya River. For the calculations, the deterministic distributed hydrological model Hydrograph was applied. We used observed data from meteorological stations and forecast values from the global weather forecast model ICON. The estimated discharge exceeded the previously observed maximum by about 50%.

The results of the study show that the recent flood damage was caused mainly by unprepared infrastructure. The protective dam built in the town of Tulun just ten years ago was 2 m lower than the maximum water level observed in 2019. This case, and many others in Russia, suggests that flood frequency analysis of even long-term historical data may mislead design engineers into significantly underestimating the probability and magnitude of flash floods. There is evidence of transformations in the observed precipitation regime that directly contribute to the formation of dangerous hydrological phenomena. The details of the study for the Irkutsk region will be presented.


2013 ◽  
Vol 1 (6) ◽  
pp. 6785-6828 ◽  
Author(s):  
D. Lawrence ◽  
E. Paquet ◽  
J. Gailhard ◽  
A. K. Fleig

Abstract. Simulation methods for extreme flood estimation represent an important complement to statistical flood frequency analysis because a spectrum of catchment conditions potentially leading to extreme flows can be assessed. In this paper, stochastic, semi-continuous simulation is used to estimate extreme floods in three catchments located in Norway, all of which are characterised by flood regimes in which snowmelt often has a significant role. The simulations are based on SCHADEX, which couples a precipitation probabilistic model with a hydrological simulation such that an exhaustive set of catchment conditions and responses is simulated. The precipitation probabilistic model is conditioned by regional weather patterns, and a "bottom-up" classification procedure was used to define a set of weather patterns producing extreme precipitation in Norway. SCHADEX estimates for the 1000 yr (Q1000) discharge are compared with those of several standard methods, including event-based and long-term simulations which use a single extreme precipitation sequence as input to a hydrological model, with statistical flood frequency analysis based on the annual maximum series, and with the GRADEX method. The comparison suggests that the combination of a precipitation probabilistic model with a long-term simulation of catchment conditions, including snowmelt, produces estimates for given return periods which are more in line with those based on statistical flood frequency analysis, as compared with the standard simulation methods, in two of the catchments. In the third case, the SCHADEX method gives higher estimates than statistical flood frequency analysis and further suggests that the seasonality of the most likely Q1000 events differs from that of the annual maximum flows. The semi-continuous stochastic simulation method highlights the importance of considering the joint probability of extreme precipitation, snowmelt rates and catchment saturation states when assigning return periods to floods estimated by precipitation-runoff methods. The SCHADEX methodology, as applied here, is dependent on observed discharge data for calibration of a hydrological model, and further study to extend its application to ungauged catchments would significantly enhance its versatility.


2019 ◽  
Vol 23 (1) ◽  
pp. 107-124 ◽  
Author(s):  
Manuela I. Brunner ◽  
Reinhard Furrer ◽  
Anne-Catherine Favre

Abstract. Floods often affect not only a single location, but a whole region. Flood frequency analysis should therefore be undertaken at a regional scale, which requires consideration of the dependence of events at different locations. This dependence is often neglected even though its consideration is essential to derive reliable flood estimates. A model used in regional multivariate frequency analysis should ideally consider the dependence of events at multiple sites, which might show dependence in the lower and/or upper tail of the distribution. We here seek to propose a simple model that on the one hand considers this dependence with respect to the network structure of the region and on the other hand allows for the simulation of stochastic event sets at both gauged and ungauged locations. The new Fisher copula model is used for representing the spatial dependence of flood events in the nested Thur catchment in Switzerland. Flood event samples generated for the gauged stations using the Fisher copula are compared to samples generated by other dependence models allowing for the modeling of multivariate data, including elliptical copulas, R-vine copulas, and max-stable models. The comparison of the dependence structures of the generated samples shows that the Fisher copula is a suitable model for capturing the spatial dependence in the data. We therefore use the copula in such a way that it can be used in an interpolation context to simulate event sets comprising gauged and ungauged locations. The spatial event sets generated using the Fisher copula capture well the general dependence structure in the data and the upper tail dependence, which is of particular interest when looking at extreme flood events and when extrapolating to higher return periods. For a medium-sized catchment, the Fisher copula was thus found to be a suitable model for the stochastic simulation of flood event sets at multiple gauged and ungauged locations.
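
The Fisher copula is not available in common Python libraries, so the sketch below only illustrates the kind of upper-tail-dependence diagnostic used to compare generated event samples; the two peak-flow series are hypothetical and share a common regional driver.

```python
import numpy as np

def empirical_upper_tail_dependence(x, y, u=0.95):
    """Empirical lambda_U = P(F_Y(Y) > u | F_X(X) > u) from rank pseudo-observations."""
    n = len(x)
    ux = (np.argsort(np.argsort(x)) + 1) / (n + 1.0)
    uy = (np.argsort(np.argsort(y)) + 1) / (n + 1.0)
    above = ux > u
    return np.mean(uy[above] > u) if above.any() else np.nan

# Hypothetical peak-flow samples at two gauges of a nested catchment, generated
# with a shared regional driver so that large events tend to co-occur.
rng = np.random.default_rng(1)
regional = rng.gumbel(size=5000)
site_a = np.exp(0.8 * regional + 0.6 * rng.normal(size=5000))
site_b = np.exp(0.7 * regional + 0.7 * rng.normal(size=5000))
print(f"empirical lambda_U: {empirical_upper_tail_dependence(site_a, site_b):.2f}")
```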


Water ◽  
2021 ◽  
Vol 13 (15) ◽  
pp. 2007
Author(s):  
Chaofei He ◽  
Fulong Chen ◽  
Aihua Long ◽  
Chengyan Luo ◽  
Changlu Qiao

With the acceleration of human economic activities and dramatic changes in climate, the validity of the stationarity assumption in flood time series frequency analysis has been questioned. In this study, a framework for flood frequency analysis is developed on the basis of the Generalized Additive Models for Location, Scale, and Shape (GAMLSS). We introduce this model to construct a non-stationary model with time and climate factors as covariates for the 50-year snowmelt flood time series in the Kenswat Reservoir control basin of the Manas River. The study shows that there are clear non-stationarities in the flood regime and that the snowmelt flood characteristic series shows an increasing trend over time. The parameters of the flood distributions are modelled as functions of climate indices (temperature and rainfall). The physical mechanism was incorporated into the analysis, and the simulated results are similar to the actual flood conditions, so the model better describes the dynamic behaviour of the snowmelt flood characteristic series. Compared with the design flood for the Kenswat Reservoir approved by the China Renewable Energy Engineering Institute in December 2008, the design value from the GAMLSS non-stationary model indicates that the impact of climate factors creates a design risk in dry years, where the risk is otherwise underestimated.
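
GAMLSS models are typically fitted with the R gamlss package; as a simplified stand-in with entirely hypothetical covariates, peaks, and coefficients, the sketch below fits a GEV whose location parameter is a linear function of temperature and rainfall covariates by direct maximum likelihood.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

# Hypothetical 50-year record: a temperature index with a warming trend, a
# seasonal rainfall total, and snowmelt-flood peaks that rise with both.
rng = np.random.default_rng(7)
n = 50
temp = rng.normal(0.02 * np.arange(n), 0.5)            # degC anomaly (made up)
rain = rng.gamma(4.0, 25.0, size=n)                    # mm (made up)
peaks = genextreme.rvs(-0.1, loc=300 + 60 * temp + 0.8 * rain, scale=80,
                       size=n, random_state=rng)       # m^3/s (made up)

def neg_log_lik(theta):
    """GEV with its location linear in the covariates: mu_t = b0 + b1*T_t + b2*P_t."""
    b0, b1, b2, log_sigma, c = theta
    mu = b0 + b1 * temp + b2 * rain
    return -np.sum(genextreme.logpdf(peaks, c, loc=mu, scale=np.exp(log_sigma)))

start = np.array([np.mean(peaks), 0.0, 0.0, np.log(np.std(peaks)), -0.1])
fit = minimize(neg_log_lik, start, method="Nelder-Mead",
               options={"maxiter": 20000, "fatol": 1e-8, "xatol": 1e-8})
print("b0, b1, b2, log_sigma, c =", np.round(fit.x, 3))
```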


2005 ◽  
Vol 29 (3) ◽  
pp. 392-410 ◽  
Author(s):  
R. Kidson ◽  
K. S. Richards

Flood frequency analysis (FFA) is a form of risk analysis, yet a risk analysis of the activity of FFA itself is rarely undertaken. The recent literature of FFA has been characterized by: (1) a proliferation of mathematical models, lacking theoretical hydrologic justification, but used to extrapolate the return periods of floods beyond the gauged record; (2) official mandating of particular models, which has resulted in (3) research focused on increasingly reductionist and statistically sophisticated procedures for parameter fitting to these models from the limited gauged data. These trends have evolved to such a refined state that FFA may be approaching the ‘limits of splitting’; at the very least, the emphasis was shifted early in the history of FFA from predicting and explaining extreme flood events to the more soluble issue of fitting distributions to the bulk of the data. However, recent evidence indicates that the very modelling basis itself may be ripe for revision. Self-similar (power law) models are not only analytically simpler than conventional models, but they also offer a plausible theoretical basis in complexity theory. Of most significance, however, is the empirical evidence for self-similarity in flood behaviour. Self-similarity is difficult to detect in gauged records of limited length; however, one positive aspect of the application of statistics to FFA has been the refinement of techniques for the incorporation of historical and palaeoflood data. It is these data types, even over modest timescales such as 100 years, which offer the best promise for testing alternative models of extreme flood behaviour across a wider range of basins. At stake is the accurate estimation of flood magnitude, used widely for design purposes: the power law model produces far more conservative estimates of return period of large floods compared to conventional models, and deserves closer study.
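
To make the closing comparison concrete, the sketch below fits both a conventional GEV and a simple self-similar power-law relation Q(T) = a·T^b to the same synthetic annual-maximum record (not the authors' data) and contrasts the extrapolated 1000-year estimates.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic 60-year gauged record of annual maxima (m^3/s); illustrative only.
rng = np.random.default_rng(3)
ams = genextreme.rvs(-0.15, loc=500, scale=150, size=60, random_state=rng)

# Empirical return periods from Weibull plotting positions (largest flood first).
ams_desc = np.sort(ams)[::-1]
n = len(ams_desc)
T_emp = (n + 1) / np.arange(1, n + 1)

# Conventional model: GEV fitted by maximum likelihood, extrapolated to T = 1000.
c, loc, scale = genextreme.fit(ams)
q1000_gev = genextreme.isf(1.0 / 1000.0, c, loc=loc, scale=scale)

# Self-similar model: Q(T) = a * T^b, fitted by log-log regression on the same
# plotting positions and extrapolated to the same return period.
b, log_a = np.polyfit(np.log(T_emp), np.log(ams_desc), 1)
q1000_pow = np.exp(log_a) * 1000.0 ** b

print(f"GEV Q1000:       {q1000_gev:.0f} m3/s")
print(f"power-law Q1000: {q1000_pow:.0f} m3/s")
```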


2016 ◽  
Author(s):  
Tomohiro Tanaka ◽  
Yasuto Tachikawa ◽  
Yutaka Ichikawa ◽  
Kazuaki Yorozu

Abstract. The design flood, i.e. the river discharge with a particular return period, is fundamental to determining the scale of flood control facilities. In addition, in a changing climate, not only the frequencies of river discharge at the design level but also those of devastating flooding are crucial. The characteristics of river discharge during extreme floods differ greatly from those during other floods because of upstream dam operation and/or river overflow; however, it is difficult for flood frequency analysis (FFA) based on past discharge data to represent such impacts because river basins rarely experience floods above the design level after river improvement and dam construction. To account for these impacts on extreme flood frequencies, this study presents a rainfall-based flood frequency model (RFFM) that derives flood frequencies from probabilistic rainfall modelling, which empirically represents the probabilistic structure of rainfall intensity over a catchment by directly using observed spatio-temporal rainfall profiles. The RFFM was applied to the Yodo River basin, Japan, and the flood frequency estimates it produced represent past flood frequencies at the Hirakata gauging station well. Furthermore, the RFFM showed that the return periods of large flood peaks are estimated at extremely large values, reflecting the decrease in discharge caused by inundation in an area upstream of the gauging station. On the other hand, FFA based on past discharge data did not represent this impact because such huge flood peaks have not occurred within the observation period. This study demonstrates the importance of the RFFM for flood frequency estimation, including for floods exceeding the design level.
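
As a toy illustration (all numbers hypothetical) of why peaks above the design level can acquire extremely long return periods in an observed record, the sketch below attenuates synthetic "natural" peaks above an assumed channel capacity, mimicking upstream inundation, and compares the empirical return period of the same discharge level in the two series.

```python
import numpy as np

rng = np.random.default_rng(11)
n_years = 100_000

# Hypothetical "natural" annual flood peaks at a gauging station (m^3/s).
natural = rng.gumbel(loc=3000.0, scale=900.0, size=n_years)

# Hypothetical upstream overflow: once a peak exceeds the channel capacity,
# water spills into an inundation area and the downstream peak is attenuated.
capacity = 6000.0
observed = np.where(natural > capacity,
                    capacity + 0.2 * (natural - capacity),
                    natural)

def empirical_return_period(peaks, q):
    """Return period of discharge level q estimated from simulated annual maxima."""
    p_exc = np.mean(peaks > q)
    return np.inf if p_exc == 0 else 1.0 / p_exc

level = 6500.0
print(f"T at {level:.0f} m3/s, natural peaks:    {empirical_return_period(natural, level):.0f} yr")
print(f"T at {level:.0f} m3/s, attenuated peaks: {empirical_return_period(observed, level):.0f} yr")
```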


2018 ◽  
Author(s):  
Manuela I. Brunner ◽  
Reinhard Furrer ◽  
Anne-Catherine Favre

Abstract. Floods often affect not only a single location but a whole region. Flood frequency analysis should therefore be undertaken at a regional scale, which requires consideration of the dependence of events at different locations. This dependence is often neglected even though its consideration is essential to derive reliable flood estimates. A model used in regional multivariate frequency analysis should ideally consider the dependence of events at multiple sites, which might show dependence in the lower and/or upper tail of the distribution. We here seek to propose a simple model that on the one hand considers this dependence with respect to the network structure of the region and on the other hand allows for the simulation of stochastic event sets at both gauged and ungauged locations. The new Fisher copula model is used for representing the spatial dependence of flood events in the nested Thur catchment in Switzerland. Flood event samples generated for the gauged stations using the Fisher copula are compared to samples generated by other dependence models allowing for the modeling of multivariate data, including elliptical copulas, R-vine copulas, and max-stable models. The comparison of the dependence structures of the generated samples shows that the Fisher copula is a suitable model for capturing the spatial dependence in the data. We therefore use the copula in such a way that it can be used in an interpolation context to simulate event sets comprising gauged and ungauged locations. The spatial event sets generated using the Fisher copula capture well the general dependence structure in the data and the upper tail dependence, which is of particular interest when looking at extreme flood events and when extrapolating to higher return periods. The Fisher copula is therefore a suitable model for the stochastic simulation of flood event sets at multiple gauged and ungauged locations.

