An Indirect Approach Based on Long Short-Term Memory Networks to Estimate Groundwater Table Depth Anomalies Across Europe With an Application for Drought Analysis

2021 ◽  
Vol 3 ◽  
Author(s):  
Yueling Ma ◽  
Carsten Montzka ◽  
Bagher Bayat ◽  
Stefan Kollet

The lack of high-quality continental-scale groundwater table depth observations necessitates an indirect method to produce reliable estimates of water table depth anomalies (wtda) over Europe and thereby facilitate European groundwater management under drought conditions. Long Short-Term Memory (LSTM) networks are a deep learning technique able to exploit long- and short-term dependencies in the input–output relationship, which have been observed in the response of groundwater dynamics to atmospheric and land surface processes. Here, we introduced different input variables, including precipitation anomalies (pra), the most common proxy of wtda, to the networks to arrive at improved wtda estimates at individual pixels over Europe in various experiments. All input and target data involved in this study were obtained from the simulated TSMP-G2A data set. We performed wavelet coherence analysis to gain a comprehensive understanding of the contributions of different input variable combinations to wtda estimates. Based on the different experiments, we derived an indirect method utilizing LSTM networks with pra and soil moisture anomalies (θa) as input, which achieved the optimal network performance. The regional medians of test R2 scores and RMSEs obtained by the method in areas with wtd ≤ 3.0 m were 76–95% and 0.17–0.30, respectively, constituting a 20–66% increase in median R2 and a 0.19–0.30 decrease in median RMSE compared with LSTM networks using pra alone as input. Our results show that introducing θa significantly improved the performance of the trained networks in predicting wtda, indicating the substantial contribution of θa to explaining groundwater anomalies. Also, the European wtda map reproduced by the method agreed well with that derived from the TSMP-G2A data set with respect to drought severity, successfully detecting ~41% of strong drought events (wtda ≥ 1.5) and ~29% of extreme drought events (wtda ≥ 2) in August 2015. The study emphasizes the importance of combining soil moisture information with precipitation information in quantifying or predicting groundwater anomalies. In the future, the indirect method derived in this study can be transferred to real-time monitoring of groundwater drought at the continental scale using remotely sensed soil moisture and precipitation observations or respective information from weather prediction models.
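To make the LSTM mechanics referenced above concrete, the following is a minimal, illustrative sketch of a single scalar LSTM step in pure Python. The weight names and the flat-dictionary parameterization are assumptions for illustration only; they are not the trained networks or the TSMP-G2A-based setup of the paper, where inputs such as pra and θa would be multivariate features.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One step of a scalar LSTM cell. `x` stands in for a monthly input
    feature (e.g., a precipitation anomaly); `w` holds illustrative weights."""
    f = sigmoid(w["wf"] * x + w["uf"] * h_prev + w["bf"])        # forget gate
    i = sigmoid(w["wi"] * x + w["ui"] * h_prev + w["bi"])        # input gate
    o = sigmoid(w["wo"] * x + w["uo"] * h_prev + w["bo"])        # output gate
    c_hat = math.tanh(w["wc"] * x + w["uc"] * h_prev + w["bc"])  # candidate state
    c = f * c_prev + i * c_hat   # cell state carries long-term memory
    h = o * math.tanh(c)         # hidden state is the short-term output
    return h, c
```

The gated cell state is what lets such networks retain the long-term dependencies between meteorological forcing and groundwater response that the abstract describes.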

2021 ◽  
Author(s):  
Yueling Ma ◽  
Carsten Montzka ◽  
Bagher Bayat ◽  
Stefan Kollet

<p>Near real-time groundwater table depth measurements are scarce over Europe, leading to challenges in monitoring groundwater resources at the continental scale. In this study, we leveraged knowledge learned from simulation results by Long Short-Term Memory (LSTM) networks to estimate monthly groundwater table depth anomaly (<em>wtd<sub>a</sub></em>) data over Europe. The LSTM networks were trained, validated, and tested at individual pixels on anomaly data derived from daily integrated hydrologic simulation results over Europe from 1996 to 2016, with a spatial resolution of 0.11° (Furusho-Percot et al., 2019), to predict monthly <em>wtd<sub>a</sub></em> based on monthly precipitation anomalies (<em>pr<sub>a</sub></em>) and soil moisture anomalies (<em>θ<sub>a</sub></em>). Without additional training, we directly fed the networks with averaged monthly <em>pr<sub>a</sub></em> and <em>θ<sub>a</sub></em> data from 1996 to 2016 obtained from commonly available observational datasets and reanalysis products, and compared the network outputs with available <em>in situ</em> borehole measurements of <em>wtd<sub>a</sub></em>. The LSTM network estimates show good agreement with the <em>in situ</em> observations, resulting in Pearson correlation coefficients of regionally averaged <em>wtd<sub>a</sub></em> data in seven PRUDENCE regions ranging from 42% to 76%, which are ~10% higher than the original simulation results except for the Iberian Peninsula. Our study demonstrates the potential of LSTM networks to transfer knowledge from simulation to reality for the estimation of <em>wtd<sub>a</sub></em> over Europe. The proposed method can be used to provide spatiotemporally continuous information at large spatial scales where ground-based observations are sparse, as is common for groundwater table depth measurements. 
Moreover, the results highlight the advantage of combining physically-based models with machine learning techniques in data processing.</p><p> </p><p>Reference:</p><p>Furusho-Percot, C., Goergen, K., Hartick, C., Kulkarni, K., Keune, J. and Kollet, S. (2019). Pan-European groundwater to atmosphere terrestrial systems climatology from a physically consistent simulation. Scientific Data, 6(1).</p>


2021 ◽  
Vol 2021 ◽  
pp. 1-10
Author(s):  
Xiaofei Zhang ◽  
Tao Wang ◽  
Qi Xiong ◽  
Yina Guo

Imagery-based brain-computer interfaces (BCIs) aim to decode different neural activities into control signals by identifying and classifying various natural commands from electroencephalogram (EEG) patterns, and then to control the corresponding equipment. However, several traditional BCI recognition algorithms suffer from the “one person, one model” issue, and convergence of the recognition model’s training process is difficult. In this study, a new BCI model with a Dense long short-term memory (Dense-LSTM) algorithm is proposed, which combines the event-related desynchronization (ERD) and the event-related synchronization (ERS) of the imagery-based BCI; model training and testing were conducted with its own data set. Furthermore, a new experimental platform was built to decode the neural activity of different subjects in a static state. Experimental evaluation of the proposed recognition algorithm shows an accuracy of 91.56%, resolving the “one person, one model” issue along with the difficulty of convergence in the training process.


2021 ◽  
Vol 17 (12) ◽  
pp. 155014772110612
Author(s):  
Zhengqiang Ge ◽  
Xinyu Liu ◽  
Qiang Li ◽  
Yu Li ◽  
Dong Guo

To significantly protect the user’s privacy and prevent disclosure of the user’s preferences from leading to malicious entrapment, we present a combination of a recommendation algorithm and a privacy protection mechanism. In this article, we present a privacy-preserving recommendation algorithm, PrivItem2Vec, and the concept of the recommended-Internet of Things, which consists of the user’s information, devices, and items. The recommended-Internet of Things uses bidirectional long short-term memory based on item2vec, which improves the algorithm’s handling of time series and the recommendation accuracy. In addition, we reconstructed the data set in conjunction with the Paillier algorithm. The data on the server are encrypted and embedded, which reduces the readability of the data and ensures the data’s security to a certain extent. Experiments show that our algorithm is superior to related works in terms of recommendation accuracy and efficiency.


Author(s):  
Dejiang Kong ◽  
Fei Wu

The wide use of positioning technology has made mining the movements of people feasible, and plenty of trajectory data have been accumulated. How to efficiently leverage these data for location prediction has become an increasingly popular research topic, as it is fundamental to location-based services (LBS). Existing methods often focus either on long-term (days or months) visit prediction (i.e., the recommendation of points of interest) or on real-time location prediction (i.e., trajectory prediction). In this paper, we are interested in the location prediction problem under a weak real-time condition and aim to predict users' movement in the next minutes or hours. We propose a Spatial-Temporal Long-Short Term Memory (ST-LSTM) model which naturally incorporates spatial-temporal influence into LSTM to mitigate the problem of data sparsity. Further, we employ a hierarchical extension of the proposed ST-LSTM (HST-LSTM) in an encoder-decoder manner, which models the contextual historic visit information in order to boost the prediction performance. The proposed HST-LSTM is evaluated on a real-world trajectory data set, and the experimental results demonstrate the effectiveness of the proposed model.


2021 ◽  
Author(s):  
Jianrong Dai

Abstract Purpose: Machine Performance Check (MPC) is a daily quality assurance (QA) tool for Varian machines. The daily QA data based on MPC tests show machine performance patterns and can potentially provide warning messages for preventive actions. This study developed a neural network model that can predict the trend of data variations quantitatively. Methods and materials: MPC data were collected daily for 3 years. A stacked long short-term memory (LSTM) model was used to develop the neural network model. For comparison with the stacked LSTM, an autoregressive integrated moving average (ARIMA) model was developed on the same data set. Cubic interpolation was used to double the amount of data to enhance prediction accuracy. The data were then divided into 3 groups: 70% for training, 15% for validation, and 15% for testing. The training and validation sets were used to train the stacked LSTM with different hyperparameters to find the optimal values. Furthermore, a greedy coordinate descent method was employed to combine different hyperparameter sets. The testing set was used to assess the performance of the model with the optimal hyperparameter combination. The accuracy of the model was quantified by the mean absolute error (MAE), root-mean-square error (RMSE), and coefficient of determination (R2). Results: A total of 867 data points were collected to predict the data for the next 5 days. The mean MAE, RMSE, and R2 across all MPC tests were 0.013, 0.020, and 0.853 for the LSTM, and 0.021, 0.030, and 0.618 for ARIMA, respectively. The results show that the LSTM outperforms ARIMA. Conclusions: In this study, the stacked LSTM model accurately predicted the daily QA data based on MPC tests. Predicting future performance data based on MPC tests can foresee possible machine failures, allowing early machine maintenance and reducing unscheduled machine downtime.
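The three accuracy metrics reported above follow their standard definitions; a minimal pure-Python sketch (not the study's code) makes them explicit:

```python
import math

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root-mean-square error."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

Lower MAE/RMSE and higher R2 indicate better forecasts, which is the basis of the LSTM-versus-ARIMA comparison in the abstract.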


2019 ◽  
Vol 15 (10) ◽  
pp. 155014771988313 ◽  
Author(s):  
Chi Hua ◽  
Erxi Zhu ◽  
Liang Kuang ◽  
Dechang Pi

Accurate prediction of the generation capacity of photovoltaic systems is fundamental to ensuring the stability of the grid and to performing scheduling arrangements correctly. In view of the temporal defects and the local minimum problem of the back-propagation neural network, a power generation forecasting method based on long short-term memory-back-propagation is proposed. On this basis, the traditional prediction data set is improved. Building on the three traditional methods listed in this article, we propose a fourth method to improve traditional short-term power generation prediction for photovoltaic power stations. Compared with the traditional methods, the long short-term memory-back-propagation neural network based on the improved data set has a lower prediction error. At the same time, a horizontal comparison with multiple linear regression and the support vector machine shows that the long short-term memory-back-propagation method has several advantages. Based on the long short-term memory-back-propagation neural network, the short-term forecasting method proposed in this article for the generating capacity of photovoltaic power stations will provide a basis for dispatch planning and optimized operation of the power grid.


Author(s):  
Mingqiang Lin ◽  
Denggao Wu ◽  
Gengfeng Zheng ◽  
Ji Wu

Lithium-ion batteries are widely used as the power source in electric vehicles. State of health (SOH) diagnosis is very important for the safety and storage capacity of lithium-ion batteries. In order to accurately and robustly estimate lithium-ion battery SOH, a novel long short-term memory network (LSTM) based on the charging curve is proposed for SOH estimation in this work. Firstly, aging features that reflect the battery degradation phenomenon are extracted from the charging curves. Then, to capture the long-term tendency of battery degradation, some improvements are made to the proposed LSTM model. A connection between the input gate and the output gate is added to better control the output information of the memory cell. Meanwhile, the forget gate and input gate are coupled into a single update gate for selectively forgetting before the accumulation of information. To achieve more reliability and robustness of the SOH estimation method, the improved LSTM network is adaptively trained online using a particle filter. Furthermore, to verify the effectiveness of the proposed method, we compare it with two data-driven methods on a public battery data set and a commercial battery data set. Experimental results demonstrate that the proposed method obtains the highest SOH accuracy.
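The gate coupling described above (forget and input gates merged into one update gate, so forgetting and admitting new information share a single control) can be sketched as follows. This is an illustrative scalar cell under assumed weight names; it shows only the coupling, not the paper's additional input-to-output-gate connection or its trained parameters.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def coupled_lstm_step(x, h_prev, c_prev, w):
    """One step of a scalar LSTM cell with coupled forget/input gates:
    a single update gate z both retains old state and admits new information,
    c_t = z * c_prev + (1 - z) * c_hat."""
    z = sigmoid(w["wz_x"] * x + w["wz_h"] * h_prev + w["bz"])        # update gate
    c_hat = math.tanh(w["wc_x"] * x + w["wc_h"] * h_prev + w["bc"])  # candidate state
    c = z * c_prev + (1.0 - z) * c_hat   # forgetting and input are complementary
    o = sigmoid(w["wo_x"] * x + w["wo_h"] * h_prev + w["bo"])        # output gate
    h = o * math.tanh(c)
    return h, c
```

Because the two gates are complementary (z and 1 - z), the cell cannot simultaneously keep all old state and admit all new input, which halves the gate parameters relative to a standard LSTM.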


2020 ◽  
Author(s):  
Li Yuheng ◽  
Tang Lihua

<p>Due to the scarcity of available surface water, many irrigated areas in the North China Plain (NCP) rely heavily on groundwater, which has resulted in groundwater overexploitation and massive environmental impacts, such as groundwater depression cones and land subsidence. Net groundwater depletion, one of the groundwater indicators, represents the actual groundwater consumption attributable to human activity. This indicator is essential for evaluating the effects of agricultural activities in well irrigation areas. However, net depletion forecasts, which can help inform the management of well irrigation areas, are generally not available through simple methods. Therefore, this study explored machine learning models, Long Short-Term Memory (LSTM) networks, to forecast net groundwater depletion in well irrigation counties of Hebei Province. Firstly, Luancheng County was selected to construct the forecasting model. The training dataset was prepared by collecting measured precipitation, remote sensing evaporation, and groundwater table data from 2006 to 2017. In addition, an agro-hydrological model (Soil-Water-Atmosphere-Plant, SWAP) with an optimization tool (Parameter ESTimation, PEST) was used to calculate the net depletion, and an unsaturated-saturated zone water balance conceptual hydrological model was constructed to calculate the net groundwater use. Secondly, to determine the effect of training data type on model accuracy, the freshwater budget (evaporation minus precipitation), change of groundwater table, and net groundwater use were chosen as training inputs by analyzing temporal characteristics related to net groundwater depletion. The response times of the training inputs relative to net groundwater depletion were also approximated using the highest cross-correlation function (CCF) value. Then, after enlarging the Luancheng datasets from 2006 to 2016 by circular bootstrapping, annual and monthly models for forecasting net depletion were trained on the enlarged datasets. 
Additionally, to test the models' ability to predict net groundwater depletion in other well irrigation areas with a similar groundwater depletion pattern, the annual and monthly forecasting scenarios were also carried out in the adjacent county, Zhaoxian. The results showed that both the monthly and annual models estimating net groundwater depletion performed well in Zhaoxian from 2006 to 2017, with NSE values of 0.91 and 0.81, respectively. According to the modelling results, further analysis showed that groundwater depletion in the studied counties mainly occurred in spring (March to May) and winter (December to February). In addition, the major factor leading to groundwater depletion in spring and winter was the freshwater budget, while in summer and autumn, soil moisture determined the depletion activity. These results demonstrate the feasibility of using LSTM networks to create annual and monthly forecasts of net groundwater depletion in well irrigation areas with a similar depletion pattern, which can provide valuable guidance for well irrigation management in the NCP within a challenging environment.</p><p><strong>Keywords: net groundwater depletion; long short-term memory; well irrigation areas</strong></p>
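Estimating a response time from the highest cross-correlation value, as described above, amounts to computing a lagged Pearson correlation and picking the lag that maximizes it. A minimal sketch (illustrative only, not the study's workflow or data):

```python
def cross_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag] over their overlap."""
    if lag >= 0:
        xs, ys = x[:len(x) - lag], y[lag:]
    else:
        xs, ys = x[-lag:], y[:len(y) + lag]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = (sum((a - mx) ** 2 for a in xs) * sum((b - my) ** 2 for b in ys)) ** 0.5
    return num / den if den else 0.0

def best_lag(x, y, max_lag):
    """Lag (in time steps) with the highest cross-correlation: the
    approximate response time of y to the driver x."""
    return max(range(0, max_lag + 1), key=lambda k: cross_corr(x, y, k))
```

Applied to a driver series (e.g., the freshwater budget) and a response series (net depletion), the returned lag approximates how many time steps the response trails the input.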


2020 ◽  
Author(s):  
Yueling Ma ◽  
Carsten Montzka ◽  
Bagher Bayat ◽  
Stefan Kollet

<p>Groundwater is the dominant source of fresh water in many European countries. However, due to a lack of near-real-time water table depth (wtd) observations, monitoring of groundwater resources is not feasible at the continental scale. Thus, an alternative approach is required to produce wtd data in near-real-time from other available observations. In this study, we propose Long Short-Term Memory (LSTM) networks to model monthly wtd anomalies over Europe utilizing monthly precipitation anomalies as input. LSTM networks are a special type of artificial neural network showing great promise in exploiting long-term dependencies between time series, which are expected in the response of groundwater to precipitation. To establish the methodology, spatially and temporally continuous data from terrestrial simulations at the continental scale, with a spatial resolution of 0.11° and ranging from the year 1996 to 2016, were applied (Furusho-Percot et al., 2019). They were divided into a training set (1996 – 2012), a validation set (2012 – 2014) and a testing set (2015 – 2016) to construct local models on selected pixels over eight PRUDENCE regions. The outputs of the LSTM networks showed good agreement with the simulation results in locations with a shallow wtd (~3 m). It is important to note that the quality of the models was strongly affected by the amount of snow cover. Moreover, with the introduction of monthly evapotranspiration anomalies as additional input, pronounced improvements of the network performance were only obtained in more arid regions (i.e., the Iberian Peninsula and the Mediterranean). 
Our results demonstrate the potential of LSTM networks to produce high-quality wtd anomalies from hydrometeorological variables that are monitored at large scales and are part of operational forecasting systems, potentially facilitating the implementation of an efficient groundwater monitoring system over Europe.</p><p>Reference:</p><p>Furusho-Percot, C., Goergen, K., Hartick, C., Kulkarni, K., Keune, J. and Kollet, S. (2019). Pan-European groundwater to atmosphere terrestrial systems climatology from a physically consistent simulation. Scientific Data, 6(1).</p>
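The anomalies used throughout these abstracts are conventionally defined as departures from the long-term mean of each calendar month (the monthly climatology). A minimal sketch of that computation, under the assumption of a simple per-calendar-month mean (the exact anomaly definition is not spelled out in the abstract):

```python
def monthly_anomalies(values, months):
    """values[i] is a monthly observation, months[i] its calendar month (1-12).
    Anomaly = value minus the long-term mean of that calendar month."""
    groups = {}
    for v, m in zip(values, months):
        groups.setdefault(m, []).append(v)
    clim = {m: sum(vs) / len(vs) for m, vs in groups.items()}  # climatology
    return [v - clim[m] for v, m in zip(values, months)]
```

Removing the seasonal cycle this way leaves the interannual signal, which is what lets anomalies of precipitation, soil moisture, and wtd from different data sources be compared on a common footing.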

