Prognostic simulation and analysis of the impact of climate change on the hydrological dynamics in Thuringia, Germany

2007 ◽  
Vol 4 (6) ◽  
pp. 4037-4067
Author(s):  
P. Krause ◽  
S. Hanisch

Abstract. The impact of predicted climate change on the hydrological dynamics and long-term hydrological balance in the federal German state of Thuringia was investigated and analysed. For this study the prognostic climate data provided by the statistical regionalisation approach WETTREG, which is based on results of the global climate model ECHAM5/MPI-OM, were used. This approach provides synthetic climate time series for the existing precipitation and climate stations in Germany from 2000 to 2100. These data were processed with the hydrological model J2000g, which we used for the regionalisation of the climatological time series and for the computation of potential and actual evapotranspiration, runoff generation and groundwater recharge. In this study we analysed the two emission scenarios A2 and B1, defined by the Intergovernmental Panel on Climate Change (IPCC), and their impact on the temporal and spatial distribution of temperature, precipitation, evapotranspiration and runoff generation for the time frame 2071–2100 for the entire area of Thuringia. For this purpose we compared simulations driven by the scenario data with simulation results based on reference data from 1971–2000. The comparison showed an increase of the mean annual temperature of 1.8 °C (B1) to 2.2 °C (A2), which is much more distinct during winter. The mean annual precipitation decreases only slightly, but the seasonal spatio-temporal rainfall distribution, which has a major impact on the hydrological water balance, changes significantly. This pattern change results in more precipitation during winter and less in summer. Actual evapotranspiration was computed to be higher for both scenarios compared to the evapotranspiration of the reference period 1971–2000. As a consequence, a decrease in runoff generation was simulated, which was again highly variable in space and time. The overall trends identified in this study indicate that extremes such as winter flooding and summer dry spells are likely to occur more often in Thuringia because of the changing weather conditions due to climate change.
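The scenario-versus-reference comparison described above reduces to differencing aggregated simulation output for the two periods. The following is a minimal sketch under the assumption that daily, J2000g-style output is available as pandas DataFrames with a DatetimeIndex and 'temperature' and 'precipitation' columns; these names are illustrative, not taken from the model.

```python
# Hedged sketch: change signals between a scenario period and a reference period.
# Assumes daily DataFrames with a DatetimeIndex and 'temperature' (°C) and
# 'precipitation' (mm) columns; column names are illustrative assumptions.
import pandas as pd

def annual_change(reference: pd.DataFrame, scenario: pd.DataFrame) -> pd.Series:
    """Mean annual temperature change (K) and relative precipitation change (%)."""
    d_temp = scenario["temperature"].mean() - reference["temperature"].mean()
    ref_precip = reference.groupby(reference.index.year)["precipitation"].sum().mean()
    scen_precip = scenario.groupby(scenario.index.year)["precipitation"].sum().mean()
    d_precip = 100.0 * (scen_precip - ref_precip) / ref_precip
    return pd.Series({"delta_T_K": d_temp, "delta_P_percent": d_precip})
```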

2009 ◽  
Vol 21 ◽  
pp. 33-48 ◽  
Author(s):  
P. Krause ◽  
S. Hanisch

Abstract. The impact of projected climate change on the long-term hydrological balance and seasonal variability in the federal German state of Thuringia was assessed and analysed. For this study projected climate data for the scenarios A2 and B1 were used in conjunction with a conceptual hydrological model. The downscaled climate data are based on outputs of the general circulation model ECHAM5 and provide synthetic climate time series for a large number of precipitation and climate stations in Germany for the period 1971 to 2100. These data were used to compute the spatially distributed hydrological quantities, i.e. precipitation, actual evapotranspiration and runoff generation, with a conceptual hydrological model. This paper briefly discusses the statistical downscaling method and its validation in Thuringia and includes an overview of the hydrological model. The results show that the projected climate conditions in Thuringia follow the general European climate trends: increased temperature, wetter winters and drier summers. However, regional differences occur in the spatial distribution and interannual variability. The analysis showed that the general increase of winter precipitation is more distinct in the mid-mountain region and less pronounced in the lowland, whereas the decrease of summer precipitation is larger in the lowland and less distinct in the mid-mountains. Actual evapotranspiration showed a statewide increase due to higher temperatures, which is largest in the summer period. The resulting runoff generation in winter was found to increase in the mid-mountains and to decrease slightly in the lowland region. In summer and fall a decrease in runoff generation was estimated for the entire area due to lower precipitation and higher evapotranspiration rates. These spatially differentiated results emphasize the need for high-resolution climate input data and distributed modelling for regional impact analyses.
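The seasonal, regionally differentiated signals reported here can be summarised by grouping distributed model output by region and season. A hedged sketch follows, assuming a long-format table with a DatetimeIndex and 'region' and 'precip' columns; the column names and layout are assumptions, not the paper's.

```python
# Hedged sketch: mean winter (DJF) and summer (JJA) precipitation per region.
# Assumes a long-format DataFrame with a DatetimeIndex and 'region'/'precip' columns.
import pandas as pd

SEASONS = {12: "DJF", 1: "DJF", 2: "DJF", 6: "JJA", 7: "JJA", 8: "JJA"}

def seasonal_precip(df: pd.DataFrame) -> pd.DataFrame:
    """Mean seasonal precipitation sum per region for DJF and JJA."""
    df = df.copy()
    df["season"] = df.index.month.map(SEASONS)   # other months are dropped below
    df = df.dropna(subset=["season"])
    df["year"] = df.index.year
    # Sum per region, season and year, then average over the years.
    yearly = df.groupby(["region", "season", "year"])["precip"].sum()
    return yearly.groupby(["region", "season"]).mean().unstack("season")
```

Applying this to reference-period and scenario-period output and differencing the two tables gives the regionally resolved seasonal changes discussed in the abstract.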


2021 ◽  
Vol 164 (1-2) ◽  
Author(s):  
Bano Mehdi ◽  
Julie Dekens ◽  
Mathew Herrnegger

Abstract. The Ruhezamyenda catchment in Uganda includes a unique lake, Lake Bunyonyi, and is threatened by increasing social and environmental pressures. The COSERO hydrological model was used to assess the impact of climate change on future surface runoff and evapotranspiration in the Lake Bunyonyi catchment (381 km²). The model was forced with an ensemble of CMIP5 global climate model (GCM) simulations for the mid-term future (2041–2070) and the far future (2071–2100), each under RCP4.5 and RCP8.5. In the Ruhezamyenda catchment, compared to 1971–2000, the median of all GCMs (for both RCPs) showed the mean monthly air temperature to increase by approximately 1.5 to 3.0 °C in the mid-term future and by roughly 2.0 to 4.5 °C in the far future. The mean annual precipitation is generally projected to increase, with future changes between −25% and +75% (RCP8.5). Actual evapotranspiration (AET) in the Lake Bunyonyi catchment was simulated to increase in the future by approximately 8 mm/month in the median of all GCMs for RCP8.5 in the far future. The runoff for future periods showed much uncertainty, but with an overall increasing trend. A combination of no-regrets adaptation options in five categories (governance; communication and capacity development; water, soil, land management and livelihoods improvement; data management; and research) was identified and validated with stakeholders, who also identified additional adaptation actions based on the model results. This study contributes to improving scientific knowledge on the impacts of climate change on water resources in Uganda with the purpose of supporting adaptation.
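The multi-GCM medians quoted above are ensemble summary statistics. The small sketch below illustrates how such a median warming could be computed; the GCM names and values are placeholders, not data from the study.

```python
# Hedged sketch: median change across a GCM ensemble.
# 'ensemble' maps GCM names to (reference_mean, future_mean) air temperatures in °C;
# all names and numbers are placeholders for illustration only.
import numpy as np

ensemble = {
    "GCM-A": (16.1, 18.0),
    "GCM-B": (16.3, 18.9),
    "GCM-C": (15.9, 19.4),
}

deltas = np.array([future - ref for ref, future in ensemble.values()])
print(f"median warming: {np.median(deltas):.1f} °C "
      f"(range {deltas.min():.1f} to {deltas.max():.1f} °C)")
```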


Author(s):  
M. K. Patasaraiya ◽  
B. Sinha ◽  
J. Bisaria ◽  
S. Saran ◽  
R. K. Jaiswal

Abstract. Climate change poses a severe threat to forest ecosystems by impacting their productivity, species composition and biodiversity at global and regional levels. The scientific community all over the world is using remote sensing techniques to monitor and assess the impact of climate change on forest ecosystems. The consistent time series data provided by MODIS are widely used for developing different vegetation indices such as NDVI (Normalized Difference Vegetation Index) products at different spatial and temporal resolutions. These vegetation indices have significant potential to detect forest growth and health, vegetation seasonality and phenological events such as budding and flowering. The current study aims to understand the impact of climate change on the Teak and Sal forests of the Satpura Tiger Reserve (STR) in central India using Landsat and MODIS time series data. The rationale for taking STR as the study site was to attribute the changes exclusively to climate change, as there is no anthropogenic disturbance in STR. A change detection analysis was carried out to detect changes between 1990 and 2017 using Landsat data from October. To understand the inter-annual and seasonal variation of the Teak and Sal forests, the freely available MOD13Q1 product (250 m, 16-day interval) was used to extract NDVI values for each month and four seasons (DJF, JJAS, ON, MAM) for the period 2000 to 2015. The climatic data (rainfall and temperature) were sourced from the India Meteorological Department (IMD) at different resolutions (1, 0.5 and 0.25 degree) for the study period. A correlation analysis was carried out to establish a causal relationship between the climate variables (temperature and rainfall) and vegetation health (NDVI) at annual, seasonal and monthly temporal scales. The study found an increasing trend in annual mean temperature and no consistent trend in total annual rainfall over the period 2000 to 2015. The maximum percentage change was observed in minimum temperature over this period. The average annual NDVI of the Teak and Sal forests showed an increasing trend; however, no trend was observed in seasonal and monthly NDVI over the same period. The maximum and minimum NDVI were found in the post-monsoon months (ON) and summer months (MAM), respectively. As STR is a Teak- and Sal-dominated landscape, the findings of the current study can also be applied in developing silvicultural and adaptation strategies for other Teak- and Sal-dominated landscapes of central India.
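The correlation analysis between NDVI and the climate variables at monthly, seasonal and annual scales could be organised as sketched below. This is a hedged illustration under assumed column names ('ndvi', 'rain', 'tmax', 'tmin'), not the authors' actual workflow.

```python
# Hedged sketch: Pearson correlation of monthly NDVI with climate variables at
# monthly, seasonal and annual aggregation. Column names are assumptions.
import pandas as pd

SEASON = {12: "DJF", 1: "DJF", 2: "DJF", 3: "MAM", 4: "MAM", 5: "MAM",
          6: "JJAS", 7: "JJAS", 8: "JJAS", 9: "JJAS", 10: "ON", 11: "ON"}

def ndvi_climate_correlation(df: pd.DataFrame) -> pd.DataFrame:
    """df: monthly DatetimeIndex with columns 'ndvi', 'rain', 'tmax', 'tmin'."""
    out = {}
    out["monthly"] = df.corr()["ndvi"]
    seasonal = df.groupby([df.index.year, df.index.month.map(SEASON)]).mean()
    out["seasonal"] = seasonal.corr()["ndvi"]
    annual = df.groupby(df.index.year).mean()
    out["annual"] = annual.corr()["ndvi"]
    return pd.DataFrame(out)   # correlation of each variable with NDVI per scale
```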


2008 ◽  
Vol 18 (12) ◽  
pp. 3679-3687 ◽  
Author(s):  
AYDIN A. CECEN ◽  
CAHIT ERKAL

We present a critical remark on the pitfalls of calculating the correlation dimension and the largest Lyapunov exponent from time series data when trend and periodicity exist. We consider a special case where a time series Zi can be expressed as the sum of two subsystems so that Zi = Xi + Yi and at least one of the subsystems is deterministic. We show that if the trend and periodicity are not properly removed, correlation dimension and Lyapunov exponent estimations yield misleading results, which can severely compromise the results of diagnostic tests and model identification. We also establish an analytic relationship between the largest Lyapunov exponents of the subsystems and that of the whole system. In addition, the impact of a periodic parameter perturbation on the Lyapunov exponent for the logistic map and the Lorenz system is discussed.
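The periodic parameter perturbation mentioned for the logistic map can be explored numerically: for x_{n+1} = r_n x_n (1 - x_n), the largest Lyapunov exponent is estimated as the orbit average of ln|r_n (1 - 2 x_n)|. The sketch below uses illustrative parameter values, not those analysed in the paper.

```python
# Hedged sketch: largest Lyapunov exponent of a logistic map whose parameter is
# periodically perturbed, r_n = r0 + eps*sin(2*pi*n/period). Values are illustrative.
import numpy as np

def lyapunov_logistic(r0=3.9, eps=0.05, period=50, n_iter=100_000, x0=0.4):
    """lambda = (1/N) * sum over the orbit of ln|r_n * (1 - 2*x_n)|."""
    x, acc = x0, 0.0
    for n in range(n_iter):
        r = r0 + eps * np.sin(2.0 * np.pi * n / period)   # periodic perturbation
        acc += np.log(abs(r * (1.0 - 2.0 * x)))           # log of local stretching
        x = r * x * (1.0 - x)
    return acc / n_iter

print(f"estimated largest Lyapunov exponent: {lyapunov_logistic():.3f}")
```

Comparing the estimate with and without the perturbation (eps = 0) illustrates how a deterministic periodic component shifts the exponent of the combined system.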


Forecasting ◽  
2021 ◽  
Vol 3 (1) ◽  
pp. 39-55
Author(s):  
Rodgers Makwinja ◽  
Seyoum Mengistou ◽  
Emmanuel Kaunda ◽  
Tena Alemiew ◽  
Titus Bandulo Phiri ◽  
...  

Forecasting using time series data has become the most relevant and effective tool for fisheries stock assessment. Autoregressive integrated moving average (ARIMA) modeling has been commonly used to predict the general trend of fish landings with increased reliability and precision. In this paper, ARIMA models were applied to predict Lake Malombe annual fish landings and catch per unit effort (CPUE). The annual fish landings and CPUE trends were first inspected and both were non-stationary. First-order differencing was applied to transform the non-stationary data into stationary series. The autocorrelation function (AC), partial autocorrelation function (PAC), Akaike information criterion (AIC), Bayesian information criterion (BIC), root mean square error (RMSE), mean absolute error (MAE), percentage standard error of prediction (SEP), average relative variance (ARV), Gaussian maximum likelihood estimation (GMLE) algorithm, efficiency coefficient (E²), coefficient of determination (R²), and persistence index (PI) were estimated, which led to the identification and construction of ARIMA models suitable for explaining the time series and forecasting. According to the measures of forecasting accuracy, the best forecasting models for fish landings and CPUE were ARIMA (0,1,1) and ARIMA (0,1,0), respectively. These models had the lowest values of AIC, BIC, RMSE, MAE, SEP and ARV, and the highest values of GMLE, PI, R² and E². The auto.arima() function in R version 3.6.3 likewise identified ARIMA (0,1,1) and ARIMA (0,1,0) as the best models. The selected models satisfactorily forecast fish landings of 2725.243 metric tons and a CPUE of 0.097 kg/h by 2024.
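For reference, an ARIMA(0,1,1) specification of the kind selected above can be fitted and used for forecasting with standard tools. The sketch below uses Python's statsmodels rather than the R auto.arima() workflow of the paper, and the landings series is a placeholder, not Lake Malombe data.

```python
# Hedged sketch: fitting ARIMA(0,1,1) (first difference + MA(1)) and forecasting.
# The annual landings values below are placeholders for illustration only.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

landings = pd.Series(
    [8500, 8100, 7600, 7900, 7200, 6800, 6400, 6000, 5400, 5100, 4700, 4300],
    index=pd.period_range("2008", periods=12, freq="Y"),
)

fit = ARIMA(landings, order=(0, 1, 1)).fit()
print(fit.aic, fit.bic)          # information criteria of the kind used for model selection
print(fit.forecast(steps=5))     # forecast the next five years
```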


2021 ◽  
Vol 11 (8) ◽  
pp. 3561
Author(s):  
Diego Duarte ◽  
Chris Walshaw ◽  
Nadarajah Ramesh

Across the world, healthcare systems are under stress and this has been hugely exacerbated by the COVID pandemic. Key Performance Indicators (KPIs), usually in the form of time-series data, are used to help manage that stress. Making reliable predictions of these indicators, particularly for emergency departments (ED), can facilitate acute unit planning, enhance quality of care and optimise resources. This motivates models that can forecast relevant KPIs, and this paper addresses that need by comparing the Autoregressive Integrated Moving Average (ARIMA) method, a purely statistical model, to Prophet, a decomposable forecasting model based on trend, seasonality and holiday variables, and to the General Regression Neural Network (GRNN), a machine learning model. The dataset analysed is formed of four hourly valued indicators from a UK hospital: Patients in Department; Number of Attendances; Unallocated Patients with a DTA (Decision to Admit); and Medically Fit for Discharge. Typically, the data exhibit regular patterns and seasonal trends and can be impacted by external factors such as the weather or major incidents. The COVID pandemic is an extreme instance of the latter, and the behaviour of the sample data changed dramatically. The capacity to adapt quickly to these changes is crucial, and in this respect the GRNN showed better results in both accuracy and reliability.
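A GRNN is essentially Gaussian-kernel (Nadaraya-Watson) regression over stored training patterns. The following minimal implementation on a toy lagged series is an illustration of the idea, not the authors' model or data.

```python
# Hedged sketch: a minimal General Regression Neural Network (GRNN), i.e.
# Gaussian-kernel weighted averaging of training targets. Illustrative only.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=1.0):
    """Predict each query point as a kernel-weighted average of y_train."""
    X_train, X_query = np.atleast_2d(X_train), np.atleast_2d(X_query)
    preds = []
    for x in X_query:
        d2 = np.sum((X_train - x) ** 2, axis=1)      # squared distances to patterns
        w = np.exp(-d2 / (2.0 * sigma ** 2))         # Gaussian pattern weights
        preds.append(np.dot(w, y_train) / (w.sum() + 1e-12))
    return np.array(preds)

# Toy usage: predict the next hourly KPI value from three lagged values.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 200)) + 0.1 * rng.standard_normal(200)
X = np.column_stack([series[:-3], series[1:-2], series[2:-1]])
y = series[3:]
print(grnn_predict(X[:150], y[:150], X[150:155], sigma=0.3))
```

Because the GRNN stores patterns rather than fitting global coefficients, newly arriving post-shock data immediately influence predictions, one plausible reason for the adaptability noted in the abstract.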


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Virgílio A. Bento ◽  
Andreia F. S. Ribeiro ◽  
Ana Russo ◽  
Célia M. Gouveia ◽  
Rita M. Cardoso ◽  
...  

Abstract. The impact of climate change on wheat and barley yields in two regions of the Iberian Peninsula is examined here. Regression models are developed using EURO-CORDEX regional climate model (RCM) simulations, forced by ERA-Interim, with monthly maximum and minimum air temperatures and monthly accumulated precipitation as predictors. Additionally, RCM simulations forced by different global climate models for the historical period (1972–2000) and mid-century (2042–2070; under the two emission scenarios RCP4.5 and RCP8.5) are analysed. Results point to different regional responses of wheat and barley. In the southernmost regions, results indicate that the main yield driver is spring maximum temperature, while further north a larger dependence on spring precipitation and early-winter maximum temperature is observed. Climate change seems to induce severe yield losses in the southern region, mainly due to an increase in spring maximum temperature. On the contrary, a yield increase is projected in the northern regions, with the main driver being early-winter warming that stimulates earlier growth. These results warn of the need to implement sustainable agriculture policies and of the necessity of regional adaptation strategies.
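The regression models described here relate yield to monthly climate predictors. A hedged OLS sketch follows; the predictor set mirrors the drivers named in the abstract, but the data layout and column names are assumptions.

```python
# Hedged sketch: an OLS yield regression with monthly climate predictors.
# Column names ('yield', 'tmax_spring', 'tmax_early_winter', 'precip_spring')
# are assumptions for illustration, not the paper's variable names.
import pandas as pd
import statsmodels.api as sm

def fit_yield_model(df: pd.DataFrame):
    """Fit yield ~ spring Tmax + early-winter Tmax + spring precipitation."""
    X = sm.add_constant(df[["tmax_spring", "tmax_early_winter", "precip_spring"]])
    return sm.OLS(df["yield"], X).fit()   # .params holds the fitted sensitivities
```

Projected yield changes can then be estimated by evaluating the fitted model with RCM-projected predictors for the historical and mid-century periods and comparing the two.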


2014 ◽  
Vol 15 (4) ◽  
pp. 1517-1531 ◽  
Author(s):  
Gerhard Smiatek ◽  
Harald Kunstmann ◽  
Andreas Heckl

Abstract. The impact of climate change on the future water availability of the upper Jordan River (UJR) and its tributaries Dan, Snir, and Hermon, located in the eastern Mediterranean, is evaluated by a highly resolved distributed approach with the fifth-generation Pennsylvania State University–NCAR Mesoscale Model (MM5), run at 18.6- and 6.2-km resolution and coupled offline with the Water Flow and Balance Simulation Model (WaSiM). The MM5 was driven with NCEP reanalysis for 1971–2000 and with Hadley Centre Coupled Model, version 3 (HadCM3), GCM forcings for 1971–2099. Because only one regional–global climate model combination was applied, the results may not give the full range of possible future projections. To describe the Dan spring behavior, the hydrological model was extended by a bypass approach to allow the fast discharge components of the Snir to enter the Dan catchment. Simulation results for the period 1976–2000 reveal that the coupled system was able to reproduce the observed discharge rates in the partially karstic complex terrain to a reasonable extent, but only with the high-resolution 6.2-km meteorological input. The future climate simulations show steadily rising temperatures, 2.2 K above the 1976–2000 mean for the period 2031–60 and 3.5 K for the period 2070–99. Precipitation trends are insignificant until the middle of the century, although a decrease of approximately 12% is simulated. For the end of the century, a reduction in rainfall ranging between 10% and 35% can be expected. Discharge in the UJR is simulated to decrease by 12% until 2060 and by 26% until 2099, both relative to the 1976–2000 mean. The discharge decrease is associated with a lower number of high river flow years.
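The statement that mid-century precipitation trends are insignificant despite a simulated decrease is the kind of result a simple trend test makes concrete. Below is a sketch using ordinary linear regression on synthetic placeholder data, not MM5/WaSiM output.

```python
# Hedged sketch: testing whether an annual precipitation trend is significant
# with a simple linear regression. The synthetic series is a placeholder.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
years = np.arange(2001, 2061)
precip = 600 - 0.8 * (years - years[0]) + 60 * rng.standard_normal(years.size)

res = stats.linregress(years, precip)
print(f"trend: {res.slope:.2f} mm/yr, p-value: {res.pvalue:.3f} "
      f"({'not ' if res.pvalue >= 0.05 else ''}significant at the 5% level)")
```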


Water ◽  
2021 ◽  
Vol 13 (4) ◽  
pp. 416
Author(s):  
Bwalya Malama ◽  
Devin Pritchard-Peterson ◽  
John J. Jasbinsek ◽  
Christopher Surfleet

We report the results of field and laboratory investigations of stream-aquifer interactions in a watershed along the California coast to assess the impact of groundwater pumping for irrigation on stream flows. The methods used include subsurface sediment sampling using direct-push drilling, laboratory permeability and particle size analyses of sediment, piezometer installation and instrumentation, stream discharge and stage monitoring, pumping tests for aquifer characterization, resistivity surveys, and long-term passive monitoring of stream stage and groundwater levels. Spectral analysis of long-term water level data was used to assess correlation between the stream and groundwater level time series. The investigations revealed the presence of a thin, low-permeability silt-clay aquitard unit between the main aquifer and the stream. This suggested a three-layer conceptual model of the subsurface comprising unconfined and confined aquifers separated by an aquitard layer. This model was broadly confirmed by resistivity surveys and pumping tests, the latter of which indicated the occurrence of leakage across the aquitard. The aquitard was determined to be 2–3 orders of magnitude less permeable than the aquifer, indicating weak stream-aquifer connectivity; this was confirmed by spectral analysis of the stream-aquifer water level time series. The results illustrate the importance of site-specific investigations and suggest that even in systems where the stream is not in direct hydraulic contact with the producing aquifer, long-term stream depletion can occur due to leakage across low-permeability units. This has implications for management of stream flows, groundwater abstraction, and water resources management during prolonged periods of drought.
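The spectral analysis of stream and groundwater level series can be illustrated with magnitude-squared coherence, one standard way to quantify how strongly two series co-vary as a function of frequency. The sketch below runs on synthetic placeholder series; the hourly logging interval and signal shapes are assumptions.

```python
# Hedged sketch: magnitude-squared coherence between stream stage and
# groundwater level as a measure of stream-aquifer connectivity. Synthetic data.
import numpy as np
from scipy import signal

fs = 24.0                              # samples per day (hourly logging assumed)
t = np.arange(0, int(365 * fs)) / fs   # one year of hourly data, in days
rng = np.random.default_rng(2)
stream = np.sin(2 * np.pi * t) + 0.5 * rng.standard_normal(t.size)
gw = 0.2 * np.sin(2 * np.pi * t + 0.8) + 0.5 * rng.standard_normal(t.size)

f, coh = signal.coherence(stream, gw, fs=fs, nperseg=24 * 30)   # ~monthly windows
diurnal = coh[np.argmin(np.abs(f - 1.0))]                       # 1 cycle/day
print(f"coherence at the diurnal frequency: {diurnal:.2f}")     # low values suggest weak connectivity
```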

