Addressing Spatial Heterogeneity in the Discrete Generalized Nash Model for Flood Routing

Water ◽  
2021 ◽  
Vol 13 (21) ◽  
pp. 3133
Author(s):  
Bao-Wei Yan ◽  
Yi-Xuan Zou ◽  
Yu Liu ◽  
Ran Mu ◽  
Hao Wang ◽  
...  

River flood routing is a key component of hydrologic modeling, and the topographic heterogeneity of rivers strongly affects it. Taking such spatial heterogeneity into account is beneficial, especially for hydrologic routing models. The discrete generalized Nash model (DGNM), based on the Nash cascade model, has the potential to address spatial heterogeneity by replacing the equal linear reservoirs with unequal ones. However, the resulting high-order differential equation cannot be solved directly. Instead, strict mathematical derivation is combined with a deeper conceptual interpretation of the DGNM to obtain the heterogeneous DGNM (HDGNM). In this work, the HDGNM is explicitly expressed as a linear combination of the inflows and outflows, whose weight coefficients are calculated from the heterogeneous S curve. The parameters of the HDGNM can be obtained in two ways: optimization by an intelligent algorithm, or estimation from physical characteristics; the model can therefore perform well in both gauged and ungauged basins. The HDGNM expands the application scope of the DGNM and is more widely applicable, especially in river reaches where the slopes and cross-sections change greatly. Moreover, most traditional routing models are lumped, whereas the HDGNM can be developed into a semidistributed model. The middle Hanjiang River in China is selected as a case study to test the model performance. The results show that the HDGNM outperforms the DGNM, with higher model efficiency and smaller relative errors, and can also be used for ungauged basins.
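The core idea of replacing equal linear reservoirs with unequal ones can be sketched numerically. The code below is not the authors' closed-form HDGNM; it is a minimal illustrative cascade of linear reservoirs with distinct storage coefficients (all numerical values hypothetical), stepped with an implicit Euler scheme:

```python
def route_cascade(inflow, ks, dt=1.0):
    """Route an inflow hydrograph through a chain of linear reservoirs
    with (possibly unequal) storage coefficients ks.

    Each reservoir obeys dS/dt = I - S/k; the implicit Euler update
    S_new = (S_old + dt * I) / (1 + dt / k) keeps the scheme stable.
    """
    storages = [0.0] * len(ks)
    outflows = []
    for q in inflow:
        for i, k in enumerate(ks):
            storages[i] = (storages[i] + dt * q) / (1.0 + dt / k)
            q = storages[i] / k  # outflow of reservoir i feeds reservoir i+1
        outflows.append(q)
    return outflows

# Hypothetical flood wave routed through three unequal reservoirs:
inflow = [0, 10, 40, 80, 60, 30, 10, 5, 2, 0, 0, 0, 0, 0, 0]
out = route_cascade(inflow, ks=[2.0, 3.5, 5.0])
```

Unequal `ks` values let different sub-reaches store and release water at different rates, which is the conceptual handle on spatial heterogeneity; the cascade attenuates and lags the inflow peak.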

2020 ◽  
Author(s):  
Baowei Yan ◽  
Huining Jiang ◽  
Zhengkun Li ◽  
Jun Zhang ◽  
Wenfa Yang

Abstract. The topographic heterogeneity of rivers has great effects on river flood routing. The discrete generalized Nash model (DGNM), developed on the basis of the Nash instantaneous unit hydrograph (IUH), is a lumped model that cannot reflect the spatial heterogeneity of the river topography. The heterogeneous DGNM (HDGNM), which accounts for such heterogeneity, has been developed through a conceptual interpretation of the DGNM. Two components of the downstream outflow, generated respectively by the recession of old water stored in the river channel and by the discharge of new water from the upstream inflow, were deduced with the help of the heterogeneous IUH and the corresponding heterogeneous S curve. The HDGNM is finally expressed as a linear combination of the inflows and outflows, whose weight coefficients are calculated from the heterogeneous S curve. The HDGNM expands the application scope of the DGNM and is more widely applicable, especially in river reaches where the slopes and cross-sections change greatly. The middle Hanjiang River was selected as a case study to test the model performance. The results suggest that the HDGNM performs better than the DGNM, with higher model efficiency and smaller relative errors in the simulated flood hydrographs.


Water ◽  
2021 ◽  
Vol 13 (11) ◽  
pp. 1456
Author(s):  
Kee-Won Seong ◽  
Jang Hyun Sung

An oscillatory S-curve causes unexpected fluctuations in a unit hydrograph (UH) of desired duration, or in an instantaneous UH (IUH), which may affect the constraints for hydrologic stability. The Savitzky–Golay smoothing and differentiation filter (SG filter), on the other hand, is a digital filter known to smooth data without distorting the signal tendency. The present study proposes a method based on the SG filter to cope with oscillatory S-curves. Compared to conventional methods, applying the SG filter to an S-curve was shown to drastically reduce the oscillation problems in the UH and IUH. In this method, the SG filter parameters are selected to have the minimum influence on smoothing and differentiation. Based on runoff reproduction results and performance criteria, the SG filter performed both smoothing and differentiation without remarkable variation in hydrograph properties such as peak or time to peak. The IUH, UH, and S-curve were estimated using storm data from two watersheds. The reproduced runoffs showed high levels of model performance criteria. In addition, the analyses of two other watersheds revealed that small watershed areas may experience scale problems. The proposed method is believed to be valuable when error-prone data are involved in analyzing the linear rainfall–runoff relationship.
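The SG filter itself is compact enough to reproduce. Below is a minimal sketch assuming a centered window and least-squares polynomial fitting; the window/order values are illustrative, not the parameters selected in the paper:

```python
import math
import numpy as np

def savgol_coeffs(window, order, deriv=0):
    """Least-squares Savitzky-Golay coefficients for a centered window."""
    m = window // 2
    x = np.arange(-m, m + 1)
    A = np.vander(x, order + 1, increasing=True)  # columns: x**0 .. x**order
    # Row `deriv` of the pseudo-inverse evaluates the deriv-th derivative
    # of the local fitting polynomial at the window center.
    return np.linalg.pinv(A)[deriv] * math.factorial(deriv)

def savgol(y, window=7, order=3, deriv=0, dt=1.0):
    """Smooth (deriv=0) or differentiate (deriv>=1) a sampled S-curve."""
    y = np.asarray(y, dtype=float)
    c = savgol_coeffs(window, order, deriv) / dt ** deriv
    m = window // 2
    ypad = np.pad(y, m, mode="edge")  # replicate end values at the edges
    return np.convolve(ypad, c[::-1], mode="valid")
```

Smoothing an S-curve with `deriv=0` and differentiating it with `deriv=1` yields the IUH in one pass, which is the appeal of the method. In practice `scipy.signal.savgol_filter` provides the same filter with more careful edge handling than the simple edge replication used here.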


2018 ◽  
Vol 22 (8) ◽  
pp. 4565-4581 ◽  
Author(s):  
Florian U. Jehn ◽  
Lutz Breuer ◽  
Tobias Houska ◽  
Konrad Bestian ◽  
Philipp Kraft

Abstract. The ambiguous representation of hydrological processes has led to the formulation of the multiple hypotheses approach in hydrological modeling, which requires new ways of model construction. However, most recent studies focus only on the comparison of predefined model structures or building a model step by step. This study tackles the problem the other way around: we start with one complex model structure, which includes all processes deemed to be important for the catchment. Next, we create 13 additional simplified models, where some of the processes from the starting structure are disabled. The performance of those models is evaluated using three objective functions (logarithmic Nash–Sutcliffe; percentage bias, PBIAS; and the ratio between the root mean square error and the standard deviation of the measured data). Through this incremental breakdown, we identify the most important processes and detect the restraining ones. This procedure allows constructing a more streamlined, subsequent 15th model with improved model performance, less uncertainty and higher model efficiency. We benchmark the original Model 1 and the final Model 15 with HBV Light. The final model is not able to outperform HBV Light, but we find that the incremental model breakdown leads to a structure with good model performance, fewer but more relevant processes and fewer model parameters.
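The three objective functions are standard and compact enough to state directly. A minimal sketch, assuming the common Moriasi-style definitions; the PBIAS sign convention and the epsilon guarding the log transform against zero flows are choices made here, not taken from the paper:

```python
import math

def log_nse(obs, sim, eps=0.01):
    """Nash-Sutcliffe efficiency on log-transformed flows (emphasizes low flows)."""
    lo = [math.log(o + eps) for o in obs]
    ls = [math.log(s + eps) for s in sim]
    mean_lo = sum(lo) / len(lo)
    err = sum((o - s) ** 2 for o, s in zip(lo, ls))
    var = sum((o - mean_lo) ** 2 for o in lo)
    return 1.0 - err / var

def pbias(obs, sim):
    """Percentage bias; positive means underestimation under this convention."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

def rsr(obs, sim):
    """Ratio of the RMSE to the standard deviation of the observations."""
    mean_o = sum(obs) / len(obs)
    rmse = math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))
    sd = math.sqrt(sum((o - mean_o) ** 2 for o in obs) / len(obs))
    return rmse / sd
```

A perfect simulation gives logNSE = 1, PBIAS = 0, and RSR = 0; evaluating each simplified model against all three catches different failure modes (low-flow timing, volume bias, overall scatter).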


2021 ◽  
Author(s):  
Abebe Tadesse Bulti

Abstract. Advances in flood routing techniques are important for good prediction and forecasting of flow discharge in river basins. Hydraulic and hydrologic routing techniques are widely applied separately in most simulation models. A combined hydrologic and hydraulic routing method is a recent approach used to improve the modeling effort in hydrological studies. The main drawback of hydrologic routing methods is inaccuracy in the downstream areas of a river basin, where the effects of hydraulic structures and river dynamics are dominant. Hydraulic routing approaches are comparatively accurate in the downstream reaches of a river. This research was carried out in the Awash River basin, upstream of the Koka dam. A combined hydrologic and hydraulic approach was used to assess the discharge and sediment flow in the river basin. The hydrologic routing method was applied to the upstream part of the basin through the SWAT model, and the HEC-RAS model was applied to the middle and downstream areas of the study basin based on the hydraulic routing principle. The combined routing method can improve the simulation results and increase the accuracy of peak-flow prediction. It can simulate flow discharges over both short- and long-term durations with good model performance indicators. In addition, sediment modeling was carried out by comparing a regression model, the SWAT model, and a combination of the HEC-RAS and SWAT models. The results indicate that the regression model and the combined model show good agreement in predicting the suspended sediment in the river basin. The integrated application of such different types of models can be one option for sediment modeling.
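The abstract does not name the specific hydrologic routing scheme applied upstream; the classic Muskingum method, one of the routing options available in SWAT, illustrates what hydrologic routing does and why it can struggle where hydraulic effects dominate. A minimal sketch with hypothetical K and X values:

```python
def muskingum(inflow, K=12.0, X=0.2, dt=6.0):
    """Route an inflow hydrograph through one Muskingum reach.

    K: storage time constant (same units as dt), X: weighting factor
    (0 to 0.5). Both values here are purely illustrative.
    """
    denom = K - K * X + 0.5 * dt
    c0 = (0.5 * dt - K * X) / denom
    c1 = (0.5 * dt + K * X) / denom
    c2 = (K - K * X - 0.5 * dt) / denom  # c0 + c1 + c2 == 1 (mass balance)
    outflow = [inflow[0]]                # assume an initial steady state
    for t in range(1, len(inflow)):
        outflow.append(c0 * inflow[t] + c1 * inflow[t - 1] + c2 * outflow[-1])
    return outflow
```

The scheme only needs the upstream hydrograph and two reach parameters, which makes it cheap for upstream sub-basins; it carries no water-surface information, which is why a hydraulic model such as HEC-RAS takes over where backwater from structures matters.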


2013 ◽  
Vol 302 ◽  
pp. 124-127
Author(s):  
Yin Bang Liu ◽  
Li Chun Jiang

Wood density samples were collected from Dahurian larch (Larix gmelinii Rupr.) trees grown in northeastern China. Six discs (about 5 cm thick) were cut from each tree (i.e., from the root stem, at breast height (1.3 m), and at 20%, 40%, 60%, and 80% of the total height). From each disc, a sliver with parallel sides, about 40 mm thick, was cut along the diameter, with the pith located in the middle. Eight small pieces were cut from the sliver at equal distances from pith to bark. The wood density of each small piece was obtained using the water displacement method. A second-order polynomial equation with linear mixed effects was used for modeling wood density. The LME procedure in S-Plus was used to fit the mixed-effects models to the wood density data. The results showed that the polynomial model with three random parameters significantly improved the model performance. The fitted mixed-effects model was also evaluated using a separate dataset. The mixed model was found to predict wood density better than the original model fitted using ordinary least squares, based on absolute and relative errors.
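The fixed-effects backbone of such a model is a plain second-order polynomial in the radial position. A minimal sketch with synthetic, purely illustrative density values (the trend coefficients below are invented for the example):

```python
import numpy as np

# Synthetic illustrative data: relative radial position from pith
# (0 = pith, 1 = bark) vs. wood density in g/cm^3.
rel_pos = np.linspace(0.0, 1.0, 8)  # eight pieces per sliver, as in the study
density = 0.45 + 0.10 * rel_pos - 0.05 * rel_pos ** 2

# Population-average (fixed-effects) part: second-order polynomial by OLS.
coeffs = np.polyfit(rel_pos, density, deg=2)  # returns [a2, a1, a0]
pred = np.polyval(coeffs, rel_pos)
```

A genuine mixed-effects fit, with tree-level random parameters as in the paper, would use the `lme` procedure in S-Plus/R or, in Python, something like `statsmodels`' `MixedLM` rather than plain OLS; the random effects are what let the curve shift per tree.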


2019 ◽  
Vol 12 (2) ◽  
pp. 849-878 ◽  
Author(s):  
Quazi Z. Rasool ◽  
Jesse O. Bash ◽  
Daniel S. Cohan

Abstract. Soils are important sources of emissions of nitrogen-containing (N-containing) gases such as nitric oxide (NO), nitrous acid (HONO), nitrous oxide (N2O), and ammonia (NH3). However, most contemporary air quality models lack a mechanistic representation of the biogeochemical processes that form these gases. They typically use heavily parameterized equations to simulate emissions of NO independently from NH3 and do not quantify emissions of HONO or N2O. This study introduces a mechanistic, process-oriented representation of soil emissions of N species (NO, HONO, N2O, and NH3) that we have recently implemented in the Community Multiscale Air Quality (CMAQ) model. The mechanistic scheme accounts for biogeochemical processes for soil N transformations such as mineralization, volatilization, nitrification, and denitrification. The rates of these processes are influenced by soil parameters, meteorology, land use, and mineral N availability. We account for spatial heterogeneity in soil conditions and biome types by using a global dataset for soil carbon (C) and N across terrestrial ecosystems to estimate daily mineral N availability in nonagricultural soils, which was not accounted for in earlier parameterizations for soil NO. Our mechanistic scheme also uses daily year-specific fertilizer use estimates from the Environmental Policy Integrated Climate (EPIC v0509) agricultural model. A soil map with sub-grid biome definitions was used to represent conditions over the continental United States. CMAQ modeling for May and July 2011 shows improvement in model performance in simulated NO2 columns compared to Ozone Monitoring Instrument (OMI) satellite retrievals for regions where soils are the dominant source of NO emissions. We also assess how the new scheme affects model performance for NOx (NO+NO2), fine nitrate (NO3) particulate matter, and ozone observed by various ground-based monitoring networks. 
Soil NO emissions in the new mechanistic scheme tend to fall between the magnitudes of the previous parametric schemes and display much more spatial heterogeneity. The new mechanistic scheme also accounts for soil HONO, which had been ignored by parametric schemes.


Hydrology ◽  
2019 ◽  
Vol 6 (2) ◽  
pp. 32 ◽  
Author(s):  
Nag ◽  
Biswal

Construction of flow duration curves (FDCs) is a challenge for hydrologists as most streams and rivers worldwide are ungauged. Regionalization methods are commonly followed to solve the problem of discharge data scarcity by transforming hydrological information from gauged basins to ungauged basins. As a consequence, regionalization-based FDC predictions are not very reliable where discharge data are scarce quantitatively and/or qualitatively. In such a scenario, it is perhaps more meaningful to use a calibration-free rainfall‒runoff model that can exploit easily available meteorological information to predict FDCs in ungauged basins. This hypothesis is tested in this study by comparing a well-known regionalization-based model, the inverse distance weighting (IDW) model, with the recently proposed calibration-free dynamic Budyko model (DB) in a region where discharge observations are not only insufficient quantitatively but also show apparent signs of observational errors. The DB model markedly outperformed the IDW model in the study region. Furthermore, the IDW model’s performance sharply declined when we randomly removed discharge gauging stations to test the model in a variety of data availability scenarios. The analysis here also throws some light on how errors in observational datasets and drainage area influence model performance and thus provides a better picture of the relative strengths of the two models. Overall, the results of this study support the notion that a calibration-free rainfall‒runoff model can be chosen to predict FDCs in discharge data-scarce regions. On a philosophical note, our study highlights the importance of process understanding for the development of meaningful hydrological models.
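The IDW regionalization idea is simple to sketch: a flow statistic at an ungauged site is estimated as a distance-weighted average of the same statistic at gauged donor basins. A minimal sketch (the power parameter, and the choice of what statistic to transfer, e.g. an FDC quantile per unit drainage area, are assumptions here):

```python
def idw_estimate(donor_values, distances, p=2.0):
    """Inverse-distance-weighted estimate at an ungauged site.

    donor_values: a flow statistic (e.g., an FDC quantile normalized by
                  drainage area) at each gauged donor basin.
    distances:    distance from each donor to the ungauged site.
    p:            power parameter; larger p makes near donors dominate.
    """
    weights = [1.0 / d ** p for d in distances]
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, donor_values)) / total

# Two donors at equal distance contribute equally:
est = idw_estimate([10.0, 20.0], [5.0, 5.0])  # plain average, 15.0
```

The sketch makes the model's fragility visible: every estimate is a blend of donor records, so removing gauges or feeding in error-ridden observations degrades it directly, whereas a calibration-free model like DB needs no donor discharge at all.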


Water ◽  
2019 ◽  
Vol 11 (4) ◽  
pp. 772 ◽  
Author(s):  
Yingbing Chen ◽  
Peng Shi ◽  
Simin Qu ◽  
Xiaomin Ji ◽  
Lanlan Zhao ◽  
...  

The geomorphologic instantaneous unit hydrograph (GIUH) is an applicable approach for simulating runoff in ungauged basins. The Nash model is an efficient tool to derive the unit hydrograph (UH), requiring only two parameters, the indices n and k. Theoretically, the GIUH method describes the travel of a droplet from the point where it falls to the basin outlet, covering only the flow concentration process. The traditional technique for flood estimation with the GIUH uses the effective rainfall, which is obtained empirically and lacks accuracy, and then computes the convolution of the effective rainfall and the GIUH. To improve the predictive capability of the GIUH model, the Xin'anjiang (XAJ) model, a conceptual model with clear physical meaning, is applied to simulate runoff yield and slope flow concentration; it is integrated with the GIUH derived from the Nash model to compute river network flow convergence, forming a modified GIUH model for flood simulation. The average flow velocity is the key to obtaining the index k, and two methods for calculating the flow velocity were compared in this study. Ten flood events in three catchments in Fujian, China were selected to calibrate the model, and six were used for validation. Four criteria, including the time-to-peak error, the relative peak flow error, the relative runoff depth error, and the Nash–Sutcliffe efficiency coefficient, were computed for the model performance evaluation. The observed and simulated runoff series in the validation stage are also presented in scatter plots to analyze the goodness of fit. The results show that the modified model is computationally convenient and fits well, and that it is reliable for flood estimation with potential for practical flood forecasting.
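The Nash UH underlying the GIUH has a closed form, u(t) = (t/k)^(n-1) e^(-t/k) / (k Γ(n)), which makes the role of n and k concrete. A minimal sketch with illustrative parameter values, plus a numerical check that the IUH carries unit volume:

```python
import math

def nash_iuh(t, n, k):
    """Nash instantaneous unit hydrograph.

    n: shape index (number of linear reservoirs, may be non-integer),
    k: storage coefficient of each reservoir (time units).
    """
    return (t / k) ** (n - 1) * math.exp(-t / k) / (k * math.gamma(n))

n, k = 3.0, 2.0  # illustrative values; in the GIUH, k comes from flow velocity
dt = 0.01
ts = [i * dt for i in range(1, 20000)]
u = [nash_iuh(t, n, k) for t in ts]

area = sum(q * dt for q in u)        # ~1: the IUH integrates to unit volume
t_peak = ts[u.index(max(u))]         # analytic peak at t = k * (n - 1)
```

This is why the average flow velocity matters so much: it fixes k, and with n supplied by the basin geomorphology, the whole river-network response is determined.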


2017 ◽  
Author(s):  
Florian U. Jehn ◽  
Lutz Breuer ◽  
Tobias Houska ◽  
Konrad Bestian ◽  
Philipp Kraft

Abstract. The ambiguous representation of hydrological processes has led to the formulation of the multiple hypotheses approach in hydrological modelling, which requires new ways of model construction. However, most recent studies focus only on the comparison of predefined model structures or building a model step by step. This study tackles the problem the other way around: we start with one complex model structure, which includes all processes deemed to be important for the catchment. Next, we create 13 additional simplified models, where some of the processes from the starting structure are disabled. The performance of those models is evaluated using three objective functions (logarithmic Nash–Sutcliffe, percentage bias and the ratio between the root mean square error and the standard deviation of the measured data). Through this incremental breakdown, we identify the most important processes and detect the restraining ones. This procedure allows constructing a more streamlined, subsequent 15th model with improved model performance, less uncertainty and higher model efficiency. We benchmark the original Model 1 against the final Model 15 and find that the incremental model breakdown leads to a structure with good model performance, fewer but more relevant processes and fewer model parameters.

