Long term variations of the hydrochemical composition of deep thermal ground water in the Lower Bavarian Molasse Basin – Causes and Perspectives

Author(s):  
Annette Dietmaier ◽  
Thomas Baumann

The European Water Framework Directive (WFD) commits EU member states to achieve a good qualitative and quantitative status of all their water bodies. The WFD provides a list of actions to be taken to achieve the goal of good status. However, this list disregards the specific conditions under which deep (> 400 m b.g.l.) groundwater aquifers form and exist. In particular, the fluid composition of deep groundwater is shaped by interaction with the rock matrix and other geofluids, and may exhibit a poor status even without anthropogenic influence. A new concept for monitoring and modelling this specific type of aquifer is therefore needed, and its status evaluation must be based on the effects induced by exploitation. Here, we analyze long-term real-life production data series to detect changes in the hydrochemical characteristics of deep groundwater which might be triggered by balneological and geothermal exploitation. We aim to use these insights to design a set of criteria with which the status of deep groundwater aquifers can be determined quantitatively and qualitatively. Our analysis is based on a unique long-term hydrochemical data set from 8 balneological and geothermal sites in the Molasse Basin of Lower Bavaria, Germany, and Upper Austria. It is focused on a predefined set of annual hydrochemical concentration values, with records reaching back to 1937. Our methods include the development of threshold corridors, within which a good status can be assumed, as well as cluster, correlation, and Piper diagram analyses. We observed strong fluctuations in the hydrochemical characteristics of the Molasse Basin deep groundwater during the last decades. Special interest is put on fluctuations that have a clear start and end date and appear to be correlated with other exploitation activities in the region. For example, between 1990 and 2020, bicarbonate and sodium values at site F displayed a clear increase, followed by a distinct dip to below-average values and a subsequent return to average values; during the same period, these values showed striking irregularities at site B. Furthermore, we observed fluctuations at several locations which come close to the disqualifying quality thresholds commonly used in German balneology. Our preliminary results demonstrate the importance of long-term (multiple decades) time series analysis for quality and quantity assessments of deep groundwater bodies: most fluctuations would remain undetected within a time series window of less than 5 years, but become distinct irregularities when viewed in the context of multiple decades. In the next steps, a quality assessment matrix and threshold corridors will be developed which take these fluctuations into account. This will ultimately aid in assessing the sustainability of deep groundwater exploitation and reservoir management for balneological and geothermal uses.
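As a minimal illustration of the threshold-corridor idea outlined in this abstract, the Python sketch below flags annual concentration values that leave a baseline-derived band. The baseline period, the ±2σ corridor width, and the made-up sodium values are assumptions for illustration only, not the authors' actual criteria or data.

```python
# Hedged sketch: a "threshold corridor" check on a long-term annual concentration
# series. The corridor (mean +/- 2 sigma of a pre-1990 baseline) and the sample
# values are illustrative assumptions.
import pandas as pd

def flag_corridor_excursions(series: pd.Series, baseline_end: int = 1990, k: float = 2.0) -> pd.DataFrame:
    """Flag years whose concentration leaves a baseline-derived corridor."""
    baseline = series[series.index <= baseline_end]
    lo = baseline.mean() - k * baseline.std()
    hi = baseline.mean() + k * baseline.std()
    return pd.DataFrame({
        "value": series,
        "below_corridor": series < lo,
        "above_corridor": series > hi,
    })

# Example with invented annual sodium concentrations (mg/L) for a single site:
sodium = pd.Series(
    [420, 425, 418, 430, 460, 470, 455, 410, 395, 421],
    index=range(1986, 1996),
)
print(flag_corridor_excursions(sodium))
```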

2021 ◽  
Author(s):  
Ole Einar Tveito

Many purposes, including the estimation of climate normals, require long, continuous and preferably homogeneous time series. Many observation series do not meet these requirements, especially due to modernisation and automation of the observation network. Despite the lack of long series, there is still a need to provide climate parameters representing a longer period than the one available. A current problem is the calculation of new standard climate normals for the 1991-2020 period, where normal values must also be assigned to observation series that do not meet the WMO requirements for estimating climate normals from observations.

One possible approach to estimating monthly time series is to extract values from gridded climate anomaly fields. In this study, this approach is applied to complete time series that will form the basis for the calculation of long-term reference values.

The calculation of the long-term time series is a two-step procedure. First, monthly anomaly grids based on homogenised data series are produced. The homogenised series provide more stable and reliable spatial estimates than non-homogenised data. The homogenised data set is also complete, ensuring spatially consistent input throughout the analysis period 1991-2020.

The monthly anomalies at the location of the series to be completed are then extracted from the gridded fields. By combining the interpolated anomalies with the observations, the long-term mean value can be estimated. The study shows that this approach provides reliable estimates of long-term values, even with just a few events for calibration. The precision of the estimates depends more on the representativity of the grid estimates than on the length of the observation series. At locations where the anomaly grids represent the spatial climate variability well, stable estimates are achieved. At locations where the anomaly grids are less accurate, due to sparse data coverage or steep climate gradients, the estimates show larger variability and are thus more uncertain.
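A minimal sketch of the completion step described above, under the assumption that the gridded anomalies are expressed relative to the 1991-2020 reference and have already been interpolated to the station location: each observed month yields one estimate of the normal (observation minus anomaly), and these estimates are averaged. The variable names and synthetic numbers are illustrative only.

```python
# Hedged sketch: estimate a station's long-term (1991-2020) normal from a short
# observation record plus gridded monthly anomalies interpolated to the station.
import numpy as np

def estimate_normal(obs: np.ndarray, grid_anomaly_at_station: np.ndarray) -> float:
    """obs and grid_anomaly_at_station are aligned values for one calendar month
    (e.g. all Januaries 1991-2020); NaN marks months without observations."""
    valid = ~np.isnan(obs)
    # each observed month gives one estimate of the normal: observation minus anomaly
    per_month_estimates = obs[valid] - grid_anomaly_at_station[valid]
    return float(per_month_estimates.mean())

# Example: a station with only 5 observed Januaries within 1991-2020
obs = np.array([np.nan] * 25 + [-4.2, -6.1, -3.8, -5.0, -4.9])
anom = np.random.default_rng(0).normal(0.0, 1.5, size=30)  # gridded January anomalies
print(f"Estimated January normal: {estimate_normal(obs, anom):.1f} degC")
```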


2013 ◽  
Vol 864-867 ◽  
pp. 2213-2217 ◽  
Author(s):  
Ju Yan Zhu ◽  
Hai Peng Guo

Long-term excessive exploitation of groundwater has caused serious land subsidence in Cangzhou City, Hebei Province, China. Using GIS spatial analysis methods, this paper analyzes the quantitative relationship between deep groundwater exploitation and land subsidence in this area. The relationship was analyzed using data from both long-term and short-term time series. The long-term time series analysis indicates that the land subsidence volume accounts for 57.6% of the amount of deep groundwater exploitation, indirectly showing the proportion of water released from compression of the aquifers and aquitards during deep groundwater exploitation. Factors such as the hysteresis of subsidence may be ignored in a short-term time series analysis, so the calculated ratio becomes significantly larger. From the perspective of water resources evaluation, long-term time series analysis is better suited to analyzing the relation between land subsidence and deep groundwater exploitation.
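The core quantity here is a simple ratio; the sketch below, with invented placeholder numbers rather than the Cangzhou data, shows how the long-term and short-term ratios of subsidence volume to withdrawal volume could be compared.

```python
# Hedged sketch: ratio of cumulative land-subsidence volume to cumulative deep
# groundwater withdrawal, over the full record and over a short recent window.
# All numbers are invented placeholders; the point is the calculation, not the
# hysteresis effect discussed in the abstract.
import numpy as np

years = np.arange(1970, 2011)
withdrawal = np.random.default_rng(1).uniform(2.0, 4.0, size=years.size)        # 1e8 m^3/yr
subsidence_volume = 0.576 * withdrawal + np.random.default_rng(2).normal(0, 0.3, size=years.size)

def ratio(sub, wd):
    return sub.sum() / wd.sum()

print(f"long-term ratio : {ratio(subsidence_volume, withdrawal):.1%}")
print(f"short-term ratio: {ratio(subsidence_volume[-5:], withdrawal[-5:]):.1%}")
```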


2011 ◽  
Vol 11 (8) ◽  
pp. 21835-21875
Author(s):  
S. Pandey Deolal ◽  
D. Brunner ◽  
M. Steinbacher ◽  
U. Weers ◽  
J. Staehelin

Abstract. We present an analysis of the NOy (NOx + other oxidized species) measurements at the high alpine site Jungfraujoch (JFJ, 3580 m a.s.l.) for the period 1998–2009, which is the longest continuous NOy data set reported from the lower free troposphere worldwide. Due to stringent emission control regulations, nitrogen oxides (NOx) emissions have been reduced significantly in Europe since the late 1980s as well as during the investigation period. However, the time series of NOy at JFJ does not show a consistent trend but a maximum during 2002–2004 and a decreasing tendency thereafter. The seasonal cycle of NOy exhibits a maximum in the warm season and a minimum in the cold months, opposite to measurements in the PBL, reflecting the seasonal changes in vertical transport and mixing. Except for summer, the seasonal mean NOx concentrations at JFJ show a high year-to-year variability which is strongly controlled by short episodic pollution events obscuring any long-term trends. The low variability in mean and median NOx values in summer is quite remarkable, indicating rapid photochemical conversion of NOx to higher oxidized species (NOz) of the NOy family on a timescale shorter than the time required to transport polluted air from the boundary layer to JFJ. In order to evaluate the quality of the NOy data series, an in-situ intercomparison with a second collocated NOy analyzer with a separate inlet was performed in 2009–2010, which showed an agreement within 10% including all uncertainties and errors.
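Two routine steps mentioned here, the seasonal cycle and the analyzer intercomparison, can be illustrated with a short Python sketch. The synthetic hourly series, the second-analyzer model, and the simple mean relative difference are assumptions and do not reproduce the actual JFJ processing.

```python
# Hedged sketch: monthly-mean NOy climatology from an hourly series and the
# relative agreement of two collocated analyzers. All data are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
idx = pd.date_range("2009-01-01", "2010-12-31 23:00", freq="h")
noy_a = pd.Series(1.0 + 0.5 * np.sin(2 * np.pi * idx.dayofyear / 365)
                  + rng.normal(0, 0.2, idx.size), index=idx)
noy_b = noy_a * rng.normal(1.0, 0.03, idx.size)          # second analyzer, separate inlet

monthly_means = noy_a.groupby(noy_a.index.month).mean()  # seasonal cycle (monthly means)
rel_diff = ((noy_b - noy_a) / noy_a).abs().mean()        # mean relative difference
print(monthly_means.round(2))
print(f"mean relative difference between analyzers: {rel_diff:.1%}")
```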


2012 ◽  
Vol 12 (5) ◽  
pp. 2551-2566 ◽  
Author(s):  
S. Pandey Deolal ◽  
D. Brunner ◽  
M. Steinbacher ◽  
U. Weers ◽  
J. Staehelin

Abstract. We present an analysis of the NOy (NOx + other oxidized species) measurements at the high alpine site Jungfraujoch (JFJ, 3580 m a.s.l.) for the period 1998–2009, which is the longest continuous NOy data set reported from the lower free troposphere worldwide. Due to stringent emission control regulations, nitrogen oxides (NOx) emissions have been reduced significantly in Europe since the late 1980s as well as during the investigation period. However, the time series of NOy at JFJ does not show a consistent trend but a maximum during 2002–2004 and a decreasing tendency thereafter. The seasonal cycle of NOy exhibits a maximum in the warm season and a minimum in the cold months, opposite to measurements in the PBL, reflecting the seasonal changes in vertical transport and mixing. Except for summer, the seasonal mean NOx concentrations at JFJ show a high year-to-year variability which is strongly controlled by short episodic pollution events obscuring any long-term trends. The low variability in mean and median NOx values in summer is quite remarkable, indicating rapid photochemical conversion of NOx to higher oxidized species (NOz) of the NOy family on a timescale shorter than the time required to transport polluted air from the boundary layer to JFJ. In order to evaluate the quality of the NOy data series, an in-situ intercomparison with a second collocated NOy analyzer with a separate inlet was performed in 2009–2010, which showed an overall agreement within 10% including all uncertainties and errors.


Water ◽  
2018 ◽  
Vol 10 (10) ◽  
pp. 1477 ◽  
Author(s):  
Davide De Luca ◽  
Luciano Galasso

This study tests stationary and non-stationary approaches for modelling data series of hydro-meteorological variables. Specifically, the authors considered annual maximum rainfall accumulations observed in the Calabria region (southern Italy), and attention was focused on time series characterized by heavy rainfall events that have occurred in the study area since 1 January 2000. This choice is justified by the need to check whether the recent rainfall events of the new century can be considered very different from the events that occurred in the past. In detail, the whole data set of each considered time series (with sample size N > 40) was analyzed in order to compare recent and past rainfall accumulations at a specific site. All the proposed models were based on the Two-Component Extreme Value (TCEV) probability distribution, which is frequently applied to annual maximum time series in Calabria. The authors discussed the possible sources of uncertainty related to each framework and remarked on the crucial role played by ergodicity. In fact, if the process is assumed to be non-stationary, then ergodicity cannot hold, and possible trends must therefore be derived from external sources other than the time series of interest: in this work, Regional Climate Model (RCM) outputs were considered in order to assess possible trends of the TCEV parameters. From the obtained results, it does not seem essential to adopt non-stationary models, as no significant trends appear in the observed data, owing to the considerable number of heavy events that also occurred in the central part of the last century.
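For readers unfamiliar with the TCEV distribution, the sketch below fits its standard parameterisation, F(x) = exp(−Λ1 e^(−x/θ1) − Λ2 e^(−x/θ2)), to a synthetic series of annual maxima by maximum likelihood. The starting values and the synthetic data are assumptions for illustration, not the Calabrian series or the authors' regional estimates.

```python
# Hedged sketch: maximum-likelihood fit of the Two-Component Extreme Value (TCEV)
# distribution to annual rainfall maxima. Synthetic data and starting values only.
import numpy as np
from scipy.optimize import minimize

def tcev_neg_loglik(params, x):
    l1, t1, l2, t2 = np.exp(params)            # positivity enforced via log-parameters
    cdf = np.exp(-l1 * np.exp(-x / t1) - l2 * np.exp(-x / t2))
    pdf = cdf * (l1 / t1 * np.exp(-x / t1) + l2 / t2 * np.exp(-x / t2))
    return -np.sum(np.log(pdf + 1e-300))

# Synthetic annual maxima (mm), roughly in the range of daily extremes
annual_max = np.random.default_rng(3).gumbel(loc=60, scale=20, size=45)

start = np.log([20.0, 15.0, 0.5, 40.0])        # Lambda1, theta1, Lambda2, theta2
fit = minimize(tcev_neg_loglik, start, args=(annual_max,), method="Nelder-Mead")
print("fitted (Lambda1, theta1, Lambda2, theta2):", np.round(np.exp(fit.x), 2))
```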


2018 ◽  
Vol 611 ◽  
pp. A85 ◽  
Author(s):  
R. Silvotti ◽  
S. Schuh ◽  
S.-L. Kim ◽  
R. Lutz ◽  
M. Reed ◽  
...  

V391 Peg (alias HS 2201+2610) is a pulsating subdwarf B (sdB) star that shows both p- and g-modes. By studying the arrival times of the p-mode maxima and minima through the O–C method, the presence of a planet with an orbital period of 3.2 years and a minimum mass of 3.2 MJup was inferred in a previous article. Here we present an updated O–C analysis using a larger data set of 1066 h of photometric time series (~2.5× larger in terms of the number of data points), which covers the period between 1999 and 2012 (compared with 1999–2006 in the previous analysis). Up to the end of 2008, the new O–C diagram of the main pulsation frequency (f1) is compatible with (and improves) the previous two-component solution representing the long-term variation of the pulsation period (parabolic component) and the giant planet (sine-wave component). Since 2009, the O–C trend of f1 has changed, and the time derivative of the pulsation period (ṗ) passes from positive to negative; the reason for this change of regime is not clear and could be related to nonlinear interactions between different pulsation modes. With the new data, the O–C diagram of the secondary pulsation frequency (f2) continues to show two components (parabola and sine wave), as in the previous analysis. Various solutions are proposed to fit the O–C diagrams of f1 and f2, but in all of them the sinusoidal components of f1 and f2 differ, or at least agree less well than before. The good agreement found previously was a coincidence caused by various small effects that are carefully analyzed. Now, with a larger data set, the presence of a planet is more uncertain and would require confirmation with an independent method. The new data allow us to improve the measurement of ṗ for f1 and f2: using only the data up to the end of 2008, we obtain ṗ1 = (1.34 ± 0.04) × 10−12 and ṗ2 = (1.62 ± 0.22) × 10−12. The long-term variation of the two main pulsation periods (and the change of sign of ṗ1) is also visible in direct measurements made over several years. The absence of peaks near f1 in the Fourier transform, and the secondary peak close to f2, confirm previous identifications as l = 0 and l = 1, respectively, and suggest a stellar rotation period of about 40 days. The new data also allow us to constrain the main g-mode pulsation periods of the star.
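A minimal sketch of the two-component O–C modelling described here: a parabola for the secular period change (whose quadratic coefficient yields ṗ) plus a sinusoid for a putative companion, fitted to synthetic timing data. The pulsation period, cycle numbers, noise level, and two-step fitting procedure are illustrative assumptions and do not reproduce the V391 Peg measurements.

```python
# Hedged sketch: fit O-C values as a function of pulsation cycle number E with a
# parabola (secular period change) plus a sinusoid (companion). Synthetic data only.
import numpy as np
from scipy.optimize import curve_fit

P = 349.5                                    # assumed pulsation period in seconds
E = np.linspace(0.0, 1.2e6, 300)             # pulsation cycle numbers over ~13 years
rng = np.random.default_rng(4)

# synthetic O-C: 0.5 * P * pdot * E^2 (parabola) + sinusoid + noise
pdot_true, A_true, Porb_true = 1.4e-12, 5.3, 2.9e5
oc = (0.5 * P * pdot_true * E**2
      + A_true * np.sin(2 * np.pi * E / Porb_true + 0.3)
      + rng.normal(0.0, 1.0, E.size))

# step 1: parabolic component; the quadratic coefficient gives the period derivative
c2, c1, c0 = np.polyfit(E, oc, 2)
pdot_fit = 2.0 * c2 / P

# step 2: sinusoidal component fitted to the residuals
def sine(E, A, Porb, phi):
    return A * np.sin(2 * np.pi * E / Porb + phi)

residuals = oc - np.polyval([c2, c1, c0], E)
(A_fit, Porb_fit, phi_fit), _ = curve_fit(sine, E, residuals, p0=[4.0, 3.0e5, 0.0])

print(f"p-dot : fitted {pdot_fit:.2e}, true {pdot_true:.1e}")
print(f"A [s] : fitted {A_fit:.2f}, true {A_true}")
```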


Author(s):  
Jan-Peter Seevers ◽  
Kristina Jurczyk ◽  
Henning Meschede ◽  
Jens Hesselbach ◽  
John W. Sutherland

Abstract. Companies in the manufacturing industry are increasingly interested in using less energy in order to enhance competitiveness and reduce environmental impact. To implement technologies and make decisions that lead to lower energy demand, energy/power data are required. All too often, however, energy data are either not available, available but too aggregated to be useful, or in a form that makes the information difficult to access. Attention herein is focused on this last point. As a step toward greater energy information transparency and smart energy-monitoring systems, this paper introduces a novel, robust time series-based approach to automatically detect and analyze the electrical power cycles of manufacturing equipment. A new pattern recognition algorithm including a power peak clustering method is applied to a large real-life sensor data set of various machine tools. With the help of synthetic time series, it is shown that a cycle-detection accuracy of nearly 100% is realistic, depending on the degree of measurement noise and the measurement sampling rate. Moreover, this paper elucidates how statistical load profiling of manufacturing equipment cycles as well as statistical deviation analyses can be of value for automatic sensor and process fault detection.
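As a rough illustration of cycle detection on an electrical load signal, the sketch below separates idle from productive phases with a fixed power threshold and reports per-cycle durations and energy. This generic thresholding is an assumption standing in for, and is much simpler than, the paper's pattern-recognition and peak-clustering algorithm.

```python
# Hedged sketch: detect machine cycles in a 1 Hz power trace by thresholding the
# load, then report start/end times and energy per cycle. Synthetic signal only.
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(0, 3600, 1.0)                        # one hour at 1 Hz
base = 2.0 + rng.normal(0, 0.05, t.size)           # idle load in kW
cycle = 6.0 * (((t + 300) % 600) < 180)            # 3-min cycle every 10 min, trace starts idle
power = base + cycle

active = power > 4.0                               # assumed idle/active threshold in kW
edges = np.flatnonzero(np.diff(active.astype(int)))
starts, ends = edges[::2] + 1, edges[1::2] + 1     # rising and falling edges
for s, e in zip(starts, ends):
    energy_kwh = power[s:e].sum() / 3600.0         # 1-s samples of kW -> kWh
    print(f"cycle from t={t[s]:6.0f}s to t={t[e]:6.0f}s, energy {energy_kwh:.2f} kWh")
```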


2020 ◽  
Vol 12 (1) ◽  
pp. 54-61
Author(s):  
Abdullah M. Almarashi ◽  
Khushnoor Khan

The current study focused on modeling time series with the Bayesian Structural Time Series (BSTS) technique on a univariate data set. Real-life secondary data, namely stock prices of Flying Cement covering a period of one year, were used for the analysis. Statistical results were based on simulation procedures using the Kalman filter and Markov Chain Monte Carlo (MCMC). Although the current study involved stock price data, the same approach can be applied to complex engineering processes involving lead times. Results from the current study were compared with the classical Autoregressive Integrated Moving Average (ARIMA) technique. For working out the Bayesian posterior sampling distributions, the BSTS package in R was used. Four BSTS models were applied to a real data set to demonstrate the working of the BSTS technique. The predictive accuracy of the competing models was assessed using forecast plots and the Mean Absolute Percent Error (MAPE). An easy-to-follow approach was adopted so that both academicians and practitioners can easily replicate the mechanism. Findings from the study revealed that, for short-term forecasting, ARIMA and BSTS are equally good, but for long-term forecasting, BSTS with a local level component is the most plausible option.
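The study itself used the BSTS package in R with MCMC sampling; as a rough, non-Bayesian analogue, the Python sketch below compares a local-level structural model (statsmodels' UnobservedComponents) with ARIMA on a synthetic price series and scores both with MAPE. The synthetic data and model orders are assumptions, not the Flying Cement analysis.

```python
# Hedged sketch: local-level structural time series model vs. ARIMA on a
# univariate price series, scored with MAPE. Synthetic data only.
import numpy as np
from statsmodels.tsa.statespace.structural import UnobservedComponents
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
prices = 30 + np.cumsum(rng.normal(0, 0.4, 250))     # random-walk-like closing prices
train, test = prices[:220], prices[220:]

def mape(actual, forecast):
    return float(np.mean(np.abs((actual - forecast) / actual)) * 100)

ll_fit = UnobservedComponents(train, level="local level").fit(disp=False)
arima_fit = ARIMA(train, order=(1, 1, 1)).fit()

print("local level MAPE :", round(mape(test, ll_fit.forecast(len(test))), 2))
print("ARIMA(1,1,1) MAPE:", round(mape(test, arima_fit.forecast(len(test))), 2))
```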


Open Physics ◽  
2009 ◽  
Vol 7 (3) ◽  
Author(s):  
Shahriar Shadkhoo ◽  
Fakhteh Ghanbarnejad ◽  
Gholam Jafari ◽  
Mohammad Tabar

Abstract. In this paper, we investigate the statistical and scaling properties of the inter-event times of California earthquakes over the last 40 years. To detect long-term correlation behavior, we apply detrended fluctuation analysis (DFA), which can systematically detect and overcome nonstationarities in the data set at all time scales. We perform this analysis for earthquakes with magnitudes larger than a given threshold M. The results indicate that the Hurst exponent decreases with increasing M, following H = 0.34 + 1.53/M, so that for events with very large magnitudes M the Hurst exponent decreases to 0.50, the value expected for independent events.
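A minimal sketch of first-order DFA as applied here to inter-event times: integrate the demeaned series, detrend it in windows of increasing size, and read the scaling exponent from the slope of log F(s) versus log s. The synthetic exponentially distributed inter-event times (uncorrelated, so the exponent should come out near 0.5) and the window sizes are assumptions, not the California catalogue.

```python
# Hedged sketch: first-order detrended fluctuation analysis (DFA) of a series of
# inter-event times. Synthetic, uncorrelated input for illustration.
import numpy as np

def dfa_exponent(x, scales):
    """Return the slope of log F(s) vs log s (the DFA scaling exponent)."""
    y = np.cumsum(x - np.mean(x))                    # integrated (profile) series
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        # detrend each segment with a linear fit and collect RMS fluctuations
        rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2))
               for seg in segs]
        fluct.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

# Uncorrelated inter-event times should give an exponent near 0.5
inter_events = np.random.default_rng(7).exponential(1.0, size=5000)
scales = np.array([16, 32, 64, 128, 256, 512])
print(f"DFA exponent: {dfa_exponent(inter_events, scales):.2f}")
```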

