Simulating Surface and Subsurface Water Balance Changes Due to Burn Severity

2018 ◽  
Vol 17 (1) ◽  
pp. 180099 ◽  
Author(s):  
Adam L. Atchley ◽  
Alicia M. Kinoshita ◽  
Sonya R. Lopez ◽  
Laura Trader ◽  
Richard Middleton
2012 ◽  
Vol 16 (7) ◽  
pp. 1969-1990 ◽  
Author(s):  
G. Kraller ◽  
M. Warscher ◽  
H. Kunstmann ◽  
S. Vogl ◽  
T. Marke ◽  
...  

Abstract. The water balance in high Alpine regions is often characterized by large variation of meteorological variables in space and time, a complex hydrogeological situation and steep gradients. The system is even more complex when the rock composition is dominated by soluble limestone, because unknown underground flow conditions and flow directions lead to unknown storage quantities. Reliable distributed modeling cannot be implemented with traditional approaches because storage processes at the local and catchment scale are unknown. We present an artificial neural network extension of a distributed hydrological model (WaSiM-ETH) that accounts for subsurface water transfer in a karstic environment. The extension was developed for the Alpine catchment of the river "Berchtesgadener Ache" (Berchtesgaden Alps, Germany), which is characterized by extreme topography and calcareous rocks. The model assumes porous conditions and does not represent karstic environments, resulting in a systematic mismatch between modeled and measured runoff in the discharge curves at the outlet points of neighboring high Alpine subbasins. Various precipitation interpolation methods could not explain the systematic mismatches, so unknown subsurface hydrological processes were concluded to be the underlying reason. We introduce a new method to describe the unknown subsurface boundary fluxes and account for them in the hydrological model. This is achieved with an artificial neural network (ANN) approach in which four input variables are used to calculate the unknown subsurface storage conditions. It was first developed for the high Alpine subbasin Königsseer Ache to improve the monthly water balance. We explicitly derive the algebraic transfer function of the artificial neural net to calculate the missing boundary fluxes. The output of the ANN is then implemented as a boundary flux in the groundwater module of the hydrological model and considered during the subsequent model run. We tested several ANN setups at different time increments to investigate ANN performance and to examine the resulting runoff dynamics of the hydrological model. The ANN with a 5-day time increment reproduced the observed water storage data best (r2 = 0.6), while the influx of the 20-day ANN gave the best correction of the hydrological model. The boundary influx in the subbasin improved the hydrological model: performance increased from NSE = 0.48 to NSE = 0.57 for subbasin Königsseetal, from NSE = 0.22 to NSE = 0.49 for subbasin Berchtesgadener Ache, and from NSE = 0.56 to NSE = 0.66 for the whole catchment within the test period. This combined approach allows distributed quantification of water balance components, including subsurface water transfer.
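The explicit ANN transfer function described in the abstract can be pictured as a small feed-forward net whose output is injected into the groundwater module as a boundary flux at each time increment. The sketch below illustrates that idea only; the choice of input variables, the single tanh hidden layer, and all function and variable names are assumptions for illustration, not the authors' actual WaSiM-ETH implementation.

```python
import numpy as np

def ann_boundary_flux(x, W1, b1, W2, b2):
    """Explicit algebraic transfer function of a one-hidden-layer feed-forward net.

    x : vector of the four input variables (the abstract does not name them;
        precipitation, snowmelt, observed storage change and measured runoff
        are a hypothetical choice).
    W1, b1 : hidden-layer weights and biases; W2, b2 : output-layer weights and bias.
    Returns the estimated subsurface boundary flux for one time increment.
    """
    hidden = np.tanh(W1 @ x + b1)      # assumed tanh hidden activation
    return float(W2 @ hidden + b2)     # linear output: boundary flux

def route_boundary_flux(gw_storage, inputs, weights):
    """Add the ANN flux to the groundwater storage of the receiving subbasin
    at every time increment (5-day or 20-day in the abstract)."""
    W1, b1, W2, b2 = weights
    for x in inputs:                   # one input vector per increment
        gw_storage += ann_boundary_flux(x, W1, b1, W2, b2)
    return gw_storage
```

Writing the net out algebraically, rather than keeping it as a black box, is what allows the flux to be evaluated directly inside the groundwater module during the model run.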


2012 ◽  
Vol 9 (1) ◽  
pp. 215-259
Author(s):  
G. Kraller ◽  
M. Warscher ◽  
H. Kunstmann ◽  
S. Vogl ◽  
T. Marke ◽  
...  

Abstract. The water balance in high Alpine regions is often characterized by large variation of meteorological variables in space and time, a complex hydrogeological situation and steep gradients. The system is even more complex when the rock composition is dominated by soluble limestone, because unknown underground flow conditions and flow directions lead to unknown storage quantities. Reliable distributed modeling cannot be implemented with traditional approaches because storage processes at the local and catchment scale are unknown. We present an artificial neural network extension of a distributed hydrological model (WaSiM-ETH) that accounts for subsurface water transfer in a karstic environment. The extension was developed for the Alpine catchment of the river "Berchtesgadener Ache" (Berchtesgaden Alps, Germany), which is characterized by extreme topography and calcareous rocks. The model assumes porous conditions and does not represent karstic environments, resulting in a systematic mismatch between modeled and measured runoff in the discharge curves at the outlet points of neighboring high alpine sub-catchments. Various precipitation interpolation methods could not explain the systematic mismatches, so unknown subsurface hydrological processes were concluded to be the underlying reason. We introduce a new method to describe the unknown subsurface boundary fluxes and account for them in the distributed model. This is achieved with an artificial neural network (ANN) approach in which three input variables are used to calculate the unknown subsurface storage conditions. We explicitly derive the algebraic transfer function of the artificial neural net to calculate the missing boundary fluxes. The output of the ANN is then implemented as a boundary flux in the groundwater module of the distributed model and considered during the subsequent model run. The ANN reproduced the observed water storage data sufficiently well (r2 = 0.48). The boundary influx in the sub-catchment improved the distributed model, with performance increasing from NSE = 0.34 to NSE = 0.57. This combined approach allows distributed quantification of water balance components, including subsurface water transfer.


2011 ◽  
Vol 52 (No. 6) ◽  
pp. 239-244 ◽  
Author(s):  
P. Kovář

The paper focuses on the impact of land use changes on the water regime. First, emphasis was placed on the extent to which the main components of the water balance in the experimental catchment Všeminka (Vsetínské Hills region) were influenced. For this purpose, the WBCM-5 model was run for a period of 10 years in a daily step, with particular reference to simulating the components of direct runoff and subsurface water recharge. In selected years of the period 1990–2000, major changes in land use were made and significant fluctuations of the rainfall-runoff regime were also observed (e.g. the dry year 1992 and the flood year 1997). After calibration of the WBCM-5 parameters it was found that some water balance components, namely interception, direct runoff and subsurface water recharge, can change by up to tens of percent in a balance sense (daily, monthly, yearly or decadal values) in response to substantial land use changes. A different situation appears, however, when significant short-term rainfall-runoff processes are investigated. About seven real flood events from the same roughly 10-year period on the same catchment were analysed using the KINFIL-2 model (time step 0.5 h), and several positive and negative land use change scenarios were analysed as well. In contrast to the long-term water balance analyses, the differences in hydrograph peak or volume never exceeded 10%. In summary, it is always important to distinguish whether a possible land use change impact is being considered in terms of the long-term balance or of short-term runoff; otherwise misunderstandings can easily arise, as is often the case when the impact on floods is discussed in the mass media.
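To make the kind of daily bookkeeping concrete, the sketch below tracks interception, direct runoff and subsurface recharge with a toy bucket model. It is not WBCM-5; the structure and all parameter values are illustrative assumptions, and a land use change is represented simply by altering the interception capacity and the runoff coefficient.

```python
def daily_water_balance(precip, pet, interception_capacity=2.0, runoff_coef=0.3,
                        soil_capacity=150.0, soil0=75.0):
    """Toy daily bucket model (mm) illustrating the balance components discussed:
    interception, direct runoff and subsurface recharge. Hypothetical structure
    and parameters, not the WBCM-5 formulation."""
    soil = soil0
    totals = {"interception": 0.0, "direct_runoff": 0.0, "recharge": 0.0}
    for p, e in zip(precip, pet):
        intercepted = min(p, interception_capacity)   # canopy storage, later evaporated
        throughfall = p - intercepted
        direct = runoff_coef * throughfall            # fast response component
        infiltration = throughfall - direct
        soil = max(soil + infiltration - e, 0.0)      # soil moisture bookkeeping
        recharge = max(soil - soil_capacity, 0.0)     # excess percolates to groundwater
        soil -= recharge
        totals["interception"] += intercepted
        totals["direct_runoff"] += direct
        totals["recharge"] += recharge
    return totals

# Comparing two land use scenarios (e.g. forest vs. arable) then amounts to
# rerunning with different interception_capacity and runoff_coef and comparing
# the component totals, which is where the "tens of percent" differences appear.
```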


2004 ◽  
Vol 36 (4) ◽  
pp. 1982
Author(s):  
Γ. Δημόπουλος ◽  
Β. Μπαρούτη

In this paper the hydrogeological conditions of the Quaternary deposits of the Doirani lake basin are presented, and the surface and subsurface water balance for the years 1988-1990 is calculated. During the period 1985-1998 a decline of 3.77 m in the lake's level was observed. The fact that the lake shows no sign of recovery raises many questions regarding its existence, its water balance, the hydraulic conditions of the basin and the overexploitation of groundwater. Analysis of the available hydrological, geological and lithological data yields a water balance deficit of 101.23 × 10^6 m3/year for the period 1988-1990, resulting from groundwater overexploitation. At the same time, the loss of water due to the decline of the lake's level has reached 90.58 × 10^6 m3/year.
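The deficit figure is essentially a balance of outflows against inflows, and the lake-level loss a level-change-times-area conversion. A minimal sketch of that arithmetic follows, with hypothetical component names; the abstract reports only the resulting totals, so no component values are assumed here.

```python
def basin_deficit_Mm3_per_yr(inflows, outflows):
    """Annual water balance deficit (10^6 m3/yr): total outflows minus total
    inflows; a positive value means storage is being depleted."""
    return sum(outflows.values()) - sum(inflows.values())

def lake_volume_loss_Mm3_per_yr(level_decline_m, lake_area_km2, years):
    """Volume equivalent of an observed lake level decline, in 10^6 m3/yr.
    km2 * 1e6 = m2; m * m2 = m3; dividing by 1e6 converts back to 10^6 m3."""
    volume_m3 = level_decline_m * lake_area_km2 * 1e6
    return volume_m3 / 1e6 / years

# Hypothetical usage (component names illustrative only):
# deficit = basin_deficit_Mm3_per_yr(
#     {"precipitation": ..., "surface_inflow": ..., "subsurface_inflow": ...},
#     {"evaporation": ..., "groundwater_abstraction": ..., "outflow": ...})
```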


2021 ◽  
Vol 3 ◽  
Author(s):  
Joseph Rungee ◽  
Qin Ma ◽  
Michael L. Goulden ◽  
Roger Bales

Spatially resolved annual evapotranspiration was calculated across the 14 main river basins draining into California's Central Valley, USA, using a statistical model that combined satellite greenness, gridded precipitation, and flux-tower measurements. Annual evapotranspiration across the study area averaged 529 mm. Average basin-scale annual precipitation minus evapotranspiration was in good agreement with annual runoff, with deviations in wet and dry years suggesting withdrawal or recharge of subsurface water storage. Evapotranspiration peaked at lower elevations in the colder, northern basins, and at higher elevations in the southern high-Sierra basins, closely tracking the 12.3°C mean temperature isocline. Precipitation and evapotranspiration are closely balanced across much of the study region, and small shifts in either will cause disproportionate changes in water storage and runoff. The majority of runoff was generated below the rain-snow transition in the northern basins, and originated at snow-dominated elevations in the southern basins. Climate warming that increases growing season length will increase evapotranspiration and reduce runoff across all elevations in the north, but only at higher elevations in the south. Feedback mechanisms in these steep mountain basins, with their steep precipitation and temperature gradients, together with over-year subsurface storage, provide important buffering of the water balance against change. Leave-one-out cross validation revealed that the statistical model for annual evapotranspiration is sensitive to the number and distribution of measurement sites, implying that additional strategically located flux towers would improve evapotranspiration predictions. Leave-one-out cross validation over individual years was less sensitive, implying that longer records are less important. This statistical top-down modeling of evapotranspiration provides an important complement to constraining water-balance measurements with gridded precipitation and unimpaired runoff, with applications such as quantifying the water balance following forest die-off, management or wildfire.
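The sensitivity analysis described is a leave-one-out exercise, holding out either one flux-tower site or one year at a time. The sketch below shows the leave-one-site-out variant, assuming numpy arrays and a simple linear regression as a stand-in for the paper's statistical model (whose actual form is not specified in the abstract).

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def leave_one_site_out_error(X, y, site_ids):
    """Leave-one-site-out cross validation for a statistical ET model.

    X        : predictor matrix per site-year (e.g. satellite greenness,
               gridded precipitation), shape (n_samples, n_features)
    y        : flux-tower annual ET per site-year
    site_ids : tower identifier per row
    Returns the mean absolute error when each tower is held out in turn;
    the regression form is an assumption, not the authors' model.
    """
    errors = []
    for site in np.unique(site_ids):
        test = site_ids == site
        train = ~test
        model = LinearRegression().fit(X[train], y[train])
        errors.append(np.mean(np.abs(model.predict(X[test]) - y[test])))
    return float(np.mean(errors))
```

A large error here relative to the all-data fit is what would indicate, as in the abstract, that the model is sensitive to the number and placement of towers.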


2020 ◽  
Author(s):  
Merten Minke ◽  
Ann Christin Sieber ◽  
Arne Tegge ◽  
Heinrich Höper

About 30% (0.4 Mha) of German peatlands are located in Lower Saxony and about 65% of these peatlands are used for agriculture, mainly grassland. These peatlands are drained for agricultural use, which causes large GHG emissions; grasslands on carbon-rich soils are responsible for seven percent of the GHG budget of Lower Saxony. Raising the annual water level to 30 cm below the surface or higher should substantially reduce peat oxidation and GHG emissions from such sites, while still allowing grassland management or other forms of peatland use under wet conditions. Such water levels, however, may be difficult to achieve by high ditch water levels alone, because the low hydraulic conductivity of the degraded peat does not allow sufficient water movement to compensate for evapotranspiration in summer. We hypothesize that subsurface water regulation may allow constantly high peatland water levels, because the submerged drains form conduits from the ditches into the peat that should improve the water exchange.

We tested subsurface water regulation on 1 ha plots on a fen and a bog grassland in NW Germany. Both sites included three treatments: (1) blocked ditches with subsurface water regulation, (2) blocked ditches without subsurface water regulation, and (3) conventional drainage (control). Ditches in treatments (1) and (2) were filled with surface water up to 15 cm below the land surface during the growing season using a solar pump. Over a period of three years, we monitored ditch and peatland water levels along transects. We analyzed the effects of the treatments, ditch water levels, climatic water balance, and saturated hydraulic conductivity (kf) on peatland water levels and changes of surface elevation.

Our results show that subsurface water regulation allowed better control of peatland water levels than ditch blocking and conventional drainage. In winter, subsurface water regulation improved drainage, so that water levels within the site were not much higher than the ditch water levels. In summer, subsurface water regulation maintained peatland water levels of 30 to 40 cm below the surface, more than 20 cm higher than in both other treatments. Furthermore, subsurface water regulation reduced subsidence. However, despite a narrow drain spacing of four to five meters, it was difficult to maintain the target peatland water levels during very dry summer months, although the tested years were atypically dry and hot. The differences between ditch water levels and peatland water levels were closely related to the climatic water balance, and the slope of the linear function depended on the saturated hydraulic conductivity (kf) of the peat. Based on climatic water balances, weir adjustment can be optimized to achieve high and stable peatland water levels. The results help in understanding and analyzing the hydrology of degraded peatlands and will be useful for planning the water management measures that are necessary to reduce GHG emissions from drained peatlands.
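Since the ditch-to-peatland level difference is reported to vary linearly with the climatic water balance, with a slope that depends on kf, that relation can be fitted and then inverted to guide weir adjustment. The sketch below illustrates this under those assumptions; the variable names, units and the inversion step are illustrative, not the authors' procedure.

```python
import numpy as np

def fit_level_difference_vs_cwb(climatic_water_balance_mm, level_difference_cm):
    """Least-squares fit of the linear relation reported in the abstract:
    difference between ditch and peatland water levels as a function of the
    climatic water balance. The slope is expected to vary with peat kf."""
    slope, intercept = np.polyfit(climatic_water_balance_mm, level_difference_cm, 1)
    return slope, intercept

def required_ditch_level_cm(target_peat_level_cm, forecast_cwb_mm, slope, intercept):
    """Hypothetical weir-management use: predict the level difference for a
    forecast climatic water balance and raise the ditch level to compensate."""
    return target_peat_level_cm + slope * forecast_cwb_mm + intercept
```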


2020 ◽  
Author(s):  
Roman Juras ◽  
Yuliya Vystavna ◽  
Soňa Hnilicová

The hydrological response of disturbed forest catchments has been a focus of hydrologists in recent decades, mainly because of its connection with widespread droughts. In this study, we compare two mountain catchments in the Šumava Mts. (Czech Republic), both with small glacial lakes. The Plešné lake catchment is characterised by forest disturbed by a bark beetle calamity; in contrast, the Čertovo lake catchment features undisturbed forest. Both catchments have comparable geology, climate settings and original forest types. Stable isotopes of water were used to determine the hydrological pathways and the water residence time. The results show that the state of the forest significantly affects the water balance of the catchments, but the mean residence time seems to be independent of it. On the other hand, even small changes in water residence time are important for solute and nutrient transport in the catchments. The lakes are fed by surface and subsurface water originating from liquid precipitation in summer and mostly from snow in winter. The isotopic analysis helps to understand how much the snow cover affects the water balance during the hydrological year in the two catchments with different forest stands.
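One common way stable-isotope records are turned into a mean residence time is the seasonal sine-wave (amplitude damping) method; the abstract does not state which method the authors used, so the sketch below is only an illustration of that general technique, assuming an exponential transit time distribution.

```python
import numpy as np

def mean_transit_time_years(amplitude_in, amplitude_out, period_years=1.0):
    """Mean transit time from the damping of a seasonal stable-isotope cycle
    (e.g. delta-18O), assuming an exponential transit time distribution.
    amplitude_in  : seasonal amplitude in precipitation
    amplitude_out : seasonal amplitude in stream or lake inflow water
    """
    omega = 2.0 * np.pi / period_years
    f = amplitude_out / amplitude_in          # damping of the seasonal cycle
    return float(np.sqrt(f**-2 - 1.0) / omega)

# Example: if the stream's seasonal amplitude is 30% of that in precipitation,
# the implied mean transit time is about 0.5 years.
```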


2020 ◽  
Vol 24 (9) ◽  
pp. 4317-4337
Author(s):  
Francesco Avanzi ◽  
Joseph Rungee ◽  
Tessa Maurer ◽  
Roger Bales ◽  
Qin Ma ◽  
...  

Abstract. Multi-year droughts in Mediterranean climates may shift the water balance, that is, the partitioning rule of precipitation across runoff, evapotranspiration, and sub-surface storage. Mechanisms causing these shifts remain largely unknown and are not well represented in hydrologic models. Focusing on measurements from the headwaters of California's Feather River, we found that also in these mixed rain–snow Mediterranean basins a lower fraction of precipitation was partitioned to runoff during multi-year droughts compared to non-drought years. This shift in the precipitation–runoff relationship was larger in the surface-runoff-dominated than subsurface-flow-dominated headwaters (−39 % vs. −18 % decline of runoff, respectively, for a representative precipitation amount). The predictive skill of the Precipitation Runoff Modeling System (PRMS) hydrologic model in these basins decreased during droughts, with evapotranspiration (ET) being the only water-balance component besides runoff for which the drop in predictive skill during drought vs. non-drought years was statistically significant. In particular, the model underestimated the response time required by ET to adjust to interannual climate variability, which we define as climate elasticity of ET. Differences between simulated and data-driven estimates of ET were well correlated with accompanying data-driven estimates of changes in sub-surface storage (ΔS, r=0.78). This correlation points to shifts in precipitation–runoff relationships being evidence of a hysteretic response of the water budget to climate elasticity of ET during and after multi-year droughts. This hysteresis is caused by carryover storage offsetting precipitation deficit during the initial drought period, followed by vegetation mortality when storage is depleted and subsequent post-drought vegetation expansion. Our results point to a general improvement in hydrologic predictions across drought and recovery cycles by including the climate elasticity of ET and better accounting for actual subsurface water storage in not only soil, but also deeper regolith that stores water accessible to roots. This can be done by explicitly parametrizing carryover storage and feedback mechanisms capturing vegetation response to atmospheric demand for moisture.
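The diagnostic relating the model's ET bias to data-driven storage change can be expressed very compactly. The sketch below assumes annual (water-year) arrays and a simple water-balance closure for the data-driven ET; the abstract does not spell out the exact estimators, so both the array names and the closure are assumptions.

```python
import numpy as np

def et_bias_vs_storage_change(et_simulated, et_data_driven, delta_s):
    """Correlation between model ET bias (simulated minus data-driven ET) and
    data-driven changes in subsurface storage, per water year. A sketch of the
    diagnostic described in the abstract; sign convention is an assumption."""
    bias = np.asarray(et_simulated) - np.asarray(et_data_driven)
    r = np.corrcoef(bias, np.asarray(delta_s))[0, 1]
    return bias, r

def water_balance_et(precip, runoff, delta_s):
    """A common data-driven closure (assumed here, not necessarily the paper's
    method): annual ET = precipitation - runoff - change in storage."""
    return np.asarray(precip) - np.asarray(runoff) - np.asarray(delta_s)
```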

