Flood Frequency Analysis
Recently Published Documents


TOTAL DOCUMENTS: 768 (FIVE YEARS: 194)

H-INDEX: 60 (FIVE YEARS: 7)

Water ◽  
2022 ◽  
Vol 14 (2) ◽  
pp. 138
Author(s):  
Lihua Xiong ◽  
Cong Jiang ◽  
Shenglian Guo ◽  
Shuai Li ◽  
Rongrong Li ◽  
...  

Under a changing environment, the current hydrological design values derived from historical flood data for the Three Gorges Reservoir (TGR) might no longer be applicable, owing to the newly built reservoirs upstream of the TGR and changes in climatic conditions. In this study, we perform a multivariate dam-site flood frequency analysis for the TGR that accounts for future reservoir regulation and summer precipitation. The Xinanjiang model and the Muskingum routing method are used to reconstruct the dam-site flood variables during the operation period of the TGR. The distributions of the dam-site flood peak and of the flood volumes with durations of 3, 7, 15, and 30 days are then built with the Pearson type III (PIII) distribution with time-varying parameters, which are expressed as functions of both a reservoir index (RI) and the summer precipitation anomaly (SPA). The multivariate joint distribution of the dam-site flood variables is constructed by a 5-D C-vine copula. Finally, using the criterion of annual average reliability (AAR) associated with the OR, AND, and Kendall exceedance probabilities, we derive the multivariate dam-site design floods for the TGR from the predicted flood distributions during the future operation period of the reservoir. The results indicate that the mean values of all flood variables are positively linked to SPA and negatively linked to RI. In the future, the flood mean values are predicted to decrease markedly because of the regulation of the reservoirs upstream of the TGR. As a result, the future dam-site design floods will be smaller than those derived from historical flood distributions, indicating that the TGR would face a smaller flood risk in the future.
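The OR and AND exceedance probabilities used in such AAR criteria can be illustrated with a single bivariate Gumbel copula. This is only a minimal sketch: the paper's actual model is a 5-D C-vine, and the marginal quantiles and dependence parameter below are assumed values, not fitted ones.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1, theta = 1 is independence."""
    if theta < 1.0:
        raise ValueError("theta must be >= 1")
    a = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-a ** (1.0 / theta))

def or_exceedance(u, v, theta):
    """P(U > u or V > v) = 1 - C(u, v): at least one variable exceeds."""
    return 1.0 - gumbel_copula(u, v, theta)

def and_exceedance(u, v, theta):
    """P(U > u and V > v) = 1 - u - v + C(u, v): both variables exceed."""
    return 1.0 - u - v + gumbel_copula(u, v, theta)

# Example: both margins at non-exceedance 0.99 (100-year marginal events),
# with an assumed dependence parameter theta = 2.
u = v = 0.99
p_or = or_exceedance(u, v, theta=2.0)
p_and = and_exceedance(u, v, theta=2.0)
# The OR event always contains the AND event, so p_or >= p_and.
assert p_and <= p_or
```

Under dependence, the joint AND exceedance is much larger than the product of the marginal exceedance probabilities, which is exactly why a copula rather than an independence assumption is needed for multivariate design floods.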


2021 ◽  
Author(s):  
Yanlai Zhou ◽  
Shenglian Guo ◽  
Chong-Yu Xu ◽  
Lihua Xiong ◽  
Hua Chen ◽  
...  

Abstract Quantifying the uncertainty of non-stationary flood frequency analysis is crucial for the planning and design of water engineering projects, but it is fundamentally challenging, especially in the presence of high climate variability and reservoir regulation. This study proposed an integrated approach that combines the Generalized Additive Model for Location, Scale and Shape parameters (GAMLSS) method, the Copula function, and the Bayesian Uncertainty Processor (BUP) technique to make reliable probabilistic interval estimations of design floods. The reliability and applicability of the proposed approach were assessed with flood datasets collected from two hydrological monitoring stations on the Hanjiang River of China. Precipitation and a reservoir index were selected as the explanatory variables for modeling the time-varying parameters of the marginal and joint distributions using long-term (1954–2018) observed datasets. First, the GAMLSS method was employed to model and fit the time-varying characteristics of the parameters of the marginal and joint distributions. Second, the Copula function was employed to produce point estimates of the non-stationary design floods. Finally, the BUP technique was employed to perform interval estimations of the design floods based on the point estimates obtained from the Copula function. The results demonstrated that the proposed approach can provide reliable probabilistic interval estimations of design floods while reducing the uncertainty of non-stationary flood frequency analysis. Consequently, the integrated approach is a promising way to indicate how design values can be estimated in a high-dimensional problem.
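The GAMLSS idea of time-varying distribution parameters can be sketched with a Gumbel marginal whose location parameter is a linear function of the two covariates. All coefficients, the scale parameter, and the covariate values below are purely illustrative placeholders, not fitted values from the paper.

```python
import math

def gumbel_quantile(p, loc, scale):
    """Inverse Gumbel CDF: the x with exp(-exp(-(x - loc)/scale)) = p."""
    return loc - scale * math.log(-math.log(p))

def time_varying_location(precip_anomaly, reservoir_index,
                          b0=30000.0, b_precip=8000.0, b_ri=-12000.0):
    """GAMLSS-style linear predictor for the Gumbel location parameter.
    The coefficients are hypothetical, chosen only to show the mechanics."""
    return b0 + b_precip * precip_anomaly + b_ri * reservoir_index

scale = 5000.0  # illustrative scale parameter (m^3/s)

# 100-year design flood (non-exceedance probability 0.99) in two states:
q_wet = gumbel_quantile(0.99, time_varying_location(1.0, 0.2), scale)
q_regulated = gumbel_quantile(0.99, time_varying_location(0.0, 0.8), scale)
# A higher reservoir index lowers the location, hence the design flood.
assert q_regulated < q_wet
```

Because the quantile is linear in the location parameter here, the difference between the two design floods equals the difference between the two linear predictors, which makes the covariate effect easy to read off.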


2021 ◽  
Author(s):  
Valeriya Filipova ◽  
Anthony Hammond ◽  
David Leedal ◽  
Rob Lamb

Abstract In this study, we utilise Artificial Neural Network (ANN) models to estimate the 100- and 1500-year return levels for around 900,000 ungauged catchments in the contiguous USA. The models were trained and validated using 4,079 gauges and several catchment descriptors selected from the 25 available. The study area was split into 15 regions, which represent major watersheds, and ANN models were developed for each region and evaluated with several performance metrics, such as the root-mean-squared error (RMSE), the coefficient of determination (R²), and the absolute percent error. The availability of a large dataset of gauges made it possible to test different model architectures and to assess the regional performance of the models. The results indicate that ANN models with only one hidden layer are sufficient to describe the relationship between flood quantiles and catchment descriptors. The regional performance depends on climate type, as the models perform worse in arid and humid continental climates. Overall, the study suggests that ANN models are particularly well suited to predicting flood quantiles for ungauged catchments across a large geographic area. The paper closes with recommendations for the future application of ANNs in regional flood frequency analysis.
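A one-hidden-layer network of the kind the study found sufficient can be sketched in pure Python on a synthetic one-descriptor example. The data, network size, and learning rate are assumptions for illustration; the actual models use many descriptors and would normally be built with a dedicated ML library.

```python
import math
import random

random.seed(0)

# Synthetic toy data: log flood quantile as a linear function of one
# log-transformed catchment descriptor (real relations involve many).
xs = [i / 10.0 - 1.0 for i in range(21)]   # log-descriptor in [-1, 1]
ys = [0.8 * x + 0.2 for x in xs]           # synthetic log-quantile

H = 4  # hidden units
w1 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    """One hidden tanh layer, linear output."""
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * h[j] for j in range(H)) + b2, h

def mse():
    return sum((forward(x)[0] - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

initial_loss = mse()
lr = 0.1
for _ in range(5000):                      # full-batch gradient descent
    gw1 = [0.0] * H; gb1 = [0.0] * H
    gw2 = [0.0] * H; gb2 = 0.0
    for x, y in zip(xs, ys):
        yhat, h = forward(x)
        g = 2.0 * (yhat - y) / len(xs)     # dLoss/dyhat
        for j in range(H):
            gw2[j] += g * h[j]
            gpre = g * w2[j] * (1.0 - h[j] ** 2)  # backprop through tanh
            gw1[j] += gpre * x
            gb1[j] += gpre
        gb2 += g
    for j in range(H):
        w1[j] -= lr * gw1[j]; b1[j] -= lr * gb1[j]
        w2[j] -= lr * gw2[j]
    b2 -= lr * gb2

final_loss = mse()
```

Even this tiny network recovers the synthetic descriptor-quantile relation, which is consistent with the finding that a single hidden layer suffices for this kind of regression.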


2021 ◽  
Author(s):  
Sonali Swetapadma ◽  
Chandra Shekhar Prasad Ojha

Abstract. Quality discharge measurements and frequency analysis are two major prerequisites for defining a design flood. Flood frequency analysis (FFA) draws on a comprehensive understanding of the probabilistic behavior of extreme events but has certain limitations regarding the sampling method and the choice of distribution models. Entropy, as a modern-day tool, has found several applications in FFA, mainly in the derivation of probability distributions and their parameter estimation under the principle of maximum entropy (POME). The present study explores a new dimension of this area of research, in which POME theory is applied to partial duration series (PDS) modeling in FFA to locate the optimum threshold and the corresponding distribution models. The proposed methodology is applied to the Waimakariri River at the Old Highway Bridge site in New Zealand, which has one of the best-quality discharge records available; the catchment also has a history of significant flood events over the last few decades. The goodness of fit of the models to the exceedances is compared with the standardized statistical approach followed in the literature, and the threshold estimated in this study is compared with previous findings. Various return-period quantiles are calculated, and their predictive ability is tested by bootstrap sampling. An overall analysis of the results shows that entropy can also serve as an effective tool for threshold identification in PDS modeling for flood frequency studies.
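One plausible schematic of entropy-based threshold screening (not the authors' exact procedure) is to scan candidate thresholds and keep the one whose exceedances are best described by the maximum-entropy exponential model, measured as agreement between the parametric entropy and a nonparametric m-spacing estimate. The synthetic flow series and candidate list below are illustrative.

```python
import math
import random

def exp_entropy(exceedances):
    """Differential entropy of the fitted exponential (max-entropy) model:
    H = 1 + ln(mean exceedance)."""
    return 1.0 + math.log(sum(exceedances) / len(exceedances))

def vasicek_entropy(sample, m=3):
    """Nonparametric m-spacing entropy estimate (Vasicek, 1976)."""
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(n):
        lo = x[max(i - m, 0)]
        hi = x[min(i + m, n - 1)]
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n

def pick_threshold(data, candidates, min_exceedances=30):
    """Schematic POME-style screening: choose the threshold where the
    parametric and nonparametric entropy estimates agree most closely."""
    best, best_gap = None, float("inf")
    for u in candidates:
        exc = [x - u for x in data if x > u]
        if len(exc) < min_exceedances:
            continue
        gap = abs(exp_entropy(exc) - vasicek_entropy(exc))
        if gap < best_gap:
            best, best_gap = u, gap
    return best

random.seed(1)
flows = [random.expovariate(0.5) for _ in range(400)]  # synthetic flows
threshold = pick_threshold(flows, [0.5, 1.0, 1.5, 2.0])
```

The `min_exceedances` guard matters in practice: a very high threshold leaves too few exceedances for either entropy estimate to be stable, which is the usual bias-variance trade-off in PDS threshold selection.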


Water Policy ◽  
2021 ◽  
Author(s):  
Richard M. Vogel ◽  
Charles N. Kroll

Abstract Extreme drought and the resulting low streamflows occur throughout the U.S., causing billions of dollars in annual losses and detrimentally impacting ecosystems as well as agricultural, hydropower, navigation, water supply, recreation, and myriad other water resource systems, thereby reducing both the effectiveness and the resiliency of our water resource infrastructure. Since 1966, with the introduction of Bulletin 13, titled 'Methods of Flow Frequency Analysis', the U.S. has had uniform guidelines for performing flood flow frequency analysis, which enable all federal agencies concerned with water resource design, planning, and management under flood conditions to obtain sensible, consistent, and reproducible estimators of flood flow statistics. Remarkably, over half a century later, no uniform national U.S. guidelines for hydrologic drought streamflow frequency analysis exist, and the assorted guidelines that do exist are not reliable because (1) they are based on methods developed for floods, which behave quite differently from low streamflows, and (2) the methods do not take advantage of the myriad advances in flood and low streamflow frequency analyses over the last 50 years. We provide a justification for developing national guidelines for streamflow drought frequency analysis as an analog to the existing national guidelines for flood frequency analysis. Such guidelines should lead to improved water resources design, planning, operations, and management under low streamflow conditions throughout the U.S. and could prove useful elsewhere.


Author(s):  
K A Johnson ◽  
J C Smithers ◽  
R E Schulze

Frequency analysis of extreme rainfall and flood events is used to determine the design rainfalls and design floods needed to size hydraulic structures such as dams, spillways, and culverts. Standard methods for the frequency analysis of extreme events are based on the assumption of a stationary climate. However, this assumption in rainfall and flood frequency analysis is being challenged by growing evidence of climate change. As a consequence of a changing climate, the frequency and magnitude of extreme rainfall events are reported to have increased in parts of South Africa, and these and other changes in extreme rainfall occurrences are expected to continue into the future. Possible non-stationarity in the climate, and the resulting changes in rainfall, may affect the accuracy of extreme rainfall quantile and design rainfall estimates. This may have significant consequences for the design of new hydraulic infrastructure, as well as for the rehabilitation of existing infrastructure. Hence, methods that account for non-stationary data, such as those caused by climate change, need to be developed. This may be achieved by using data from downscaled global circulation models to identify the non-stationary climate variables that affect rainfall, which can then be incorporated into an extreme value analysis of a non-stationary data series.
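A common first step in identifying such non-stationarity is a trend test on the annual-maximum series before any non-stationary model is fitted. Below is a stdlib sketch of the Mann-Kendall test (without tie correction) on a synthetic series; the series itself is invented for illustration.

```python
import math

def mann_kendall_z(series):
    """Mann-Kendall trend test statistic Z (no tie correction).
    |Z| > 1.96 indicates a significant monotonic trend at the 5% level."""
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / math.sqrt(var_s)
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0

# Synthetic annual maximum rainfall with an imposed upward trend (mm):
annual_max = [48 + 0.9 * t + 6 * math.sin(t) for t in range(40)]
z = mann_kendall_z(annual_max)
trending = abs(z) > 1.96  # two-sided 5% significance
```

A significant Z would motivate, for example, letting a distribution's location parameter vary with time or with a downscaled climate covariate in the subsequent extreme value analysis.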

