Using Long Short-Term Memory networks to connect water table depth anomalies to precipitation anomalies over Europe

2021
Vol 25 (6)
pp. 3555-3575
Author(s):  
Yueling Ma ◽  
Carsten Montzka ◽  
Bagher Bayat ◽  
Stefan Kollet

Abstract. Many European countries rely on groundwater for public and industrial water supply. Due to a scarcity of near-real-time water table depth (wtd) observations, establishing a spatially consistent groundwater monitoring system at the continental scale is a challenge. Hence, it is necessary to develop alternative methods for estimating wtd anomalies (wtda) using other hydrometeorological observations routinely available in near real time. In this work, we explore the potential of Long Short-Term Memory (LSTM) networks for producing monthly wtda using monthly precipitation anomalies (pra) as input. LSTM networks are a special category of artificial neural networks that are useful for detecting long-term dependencies within sequences, in our case time series, which are expected in the relationship between pra and wtda. In the proposed methodology, spatiotemporally continuous data were obtained from daily terrestrial simulations of the Terrestrial Systems Modeling Platform (TSMP) over Europe (hereafter termed the TSMP-G2A data set), with a spatial resolution of 0.11°, spanning the years 1996 to 2016. The data were separated into a training set (1996–2012), a validation set (2013–2014), and a test set (2015–2016) to establish local networks at selected pixels across Europe. The modeled wtda maps from the LSTM networks agreed well with the TSMP-G2A wtda maps on spatially distributed dry and wet events, with 2003 and 2015 constituting drought years over Europe. Moreover, we categorized the test performances of the networks based on intervals of yearly averaged wtd, evapotranspiration (ET), soil moisture (θ), snow water equivalent (Sw), soil type (St), and dominant plant functional type (PFT). Superior test performance was found at pixels with wtd < 3 m, ET > 200 mm, θ > 0.15 m³ m⁻³, and Sw < 10 mm, revealing a significant impact of the local factors on the ability of the networks to process information. Furthermore, results of the cross-wavelet transform (XWT) showed a change in the temporal pattern between TSMP-G2A pra and wtda at some selected pixels, which can be a reason for undesired network behavior. Our results demonstrate that LSTM networks are useful for producing high-quality wtda based on other hydrometeorological data measured and predicted at large scales, such as pra. This contribution may facilitate the establishment of an effective groundwater monitoring system over Europe that is relevant to water management.
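For illustration only, the sketch below shows what one such "local" LSTM network could look like: a single-feature LSTM that reads a window of monthly pra values for one pixel and outputs a wtda estimate for the final month. The hidden size, window length, and PyTorch framing are assumptions made for this sketch and are not taken from the paper.

```python
# Minimal sketch of a per-pixel LSTM mapping a monthly precipitation-anomaly (pra)
# sequence to a water-table-depth-anomaly (wtda) estimate. Hyperparameters are
# illustrative assumptions, not the authors' configuration.
import torch
import torch.nn as nn

class LocalWtdaLSTM(nn.Module):
    def __init__(self, hidden_size: int = 32):
        super().__init__()
        # One input feature per time step: the monthly pra value at this pixel.
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)    # map last hidden state to wtda

    def forward(self, pra_seq: torch.Tensor) -> torch.Tensor:
        # pra_seq: (batch, months, 1)
        out, _ = self.lstm(pra_seq)
        return self.head(out[:, -1, :])           # wtda estimate for the final month

# Toy usage with placeholder data for a single pixel.
model = LocalWtdaLSTM()
pra = torch.randn(8, 12, 1)                       # 8 windows of 12 monthly pra values
wtda_hat = model(pra)                             # (8, 1) predicted monthly wtda
loss = nn.MSELoss()(wtda_hat, torch.randn(8, 1))  # placeholder target wtda
```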

2020
Author(s):  
Yueling Ma ◽  
Carsten Montzka ◽  
Bagher Bayat ◽  
Stefan Kollet

Abstract. Many European countries mainly rely on groundwater for domestic water use. Due to a scarcity of near-real-time water table depth (wtd) observations, establishing a spatially consistent groundwater monitoring system at the continental scale is a challenge. Hence, it is necessary to develop alternative methods to estimate wtd anomalies (wtda) using other hydrometeorological observations routinely available in near real time. In this work, we explore the potential of Long Short-Term Memory (LSTM) networks to produce monthly wtda, using monthly precipitation anomalies (pra) as input. LSTM networks are a special category of artificial neural networks, useful for detecting long-term dependencies within sequences, in our case time series, which are expected in the relationship between pra and wtda. To set up the methodology, spatio-temporally continuous data were obtained from daily terrestrial simulations (hereafter termed the TSMP-G2A data set) with a spatial resolution of 0.11°, spanning the years 1996 to 2016. They were separated into a training set (1996–2012), a validation set (2013–2014), and a test set (2015–2016) to establish local networks at selected pixels across Europe. The modeled wtda maps from the LSTM networks agreed well with the TSMP-G2A wtda maps in 2003 and 2015, which constituted drought years over Europe. Moreover, we categorized the test performances of the networks based on yearly averaged wtd, evapotranspiration (ET), soil moisture (θ), snow water equivalent (Sw), soil type (St), and dominant plant functional type (PFT). Superior test performance was found at pixels with wtd < 3 m, ET > 200 mm, θ > 0.15 m³ m⁻³, and Sw < 10 mm.


2018
Vol 561
pp. 918-929
Author(s):  
Jianfeng Zhang ◽  
Yan Zhu ◽  
Xiaoping Zhang ◽  
Ming Ye ◽  
Jinzhong Yang

2019
Vol 31 (6)
pp. 1085-1113
Author(s):  
Po-He Tseng ◽  
Núria Armengol Urpi ◽  
Mikhail Lebedev ◽  
Miguel Nicolelis

Although many real-time neural decoding algorithms have been proposed for brain-machine interface (BMI) applications over the years, an optimal, consensual approach remains elusive. Recent advances in deep learning algorithms provide new opportunities for improving the design of BMI decoders, including the use of recurrent artificial neural networks to decode neuronal ensemble activity in real time. Here, we developed a long short-term memory (LSTM) decoder for extracting movement kinematics from the activity of large (N = 134–402) populations of neurons, sampled simultaneously from multiple cortical areas, in rhesus monkeys performing motor tasks. Recorded regions included primary motor, dorsal premotor, supplementary motor, and primary somatosensory cortical areas. The LSTM's capacity to retain information for extended periods of time enabled accurate decoding for tasks that required both movements and periods of immobility. Our LSTM algorithm significantly outperformed the state-of-the-art unscented Kalman filter when applied to three tasks: center-out arm reaching, bimanual reaching, and bipedal walking on a treadmill. Notably, LSTM units exhibited a variety of well-known physiological features of cortical neuronal activity, such as directional tuning and neuronal dynamics across task epochs. The LSTM thus modeled several key physiological attributes of cortical circuits involved in motor tasks. These findings suggest that LSTM-based approaches could yield a better algorithmic strategy for neuroprostheses that employ BMIs to restore movement in severely disabled patients.
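As a rough illustration of this kind of decoder, the sketch below maps binned spike counts from a large ensemble of simultaneously recorded units to kinematic outputs with a single LSTM layer. The ensemble size, number of time bins, and four kinematic outputs are placeholder assumptions, not the configuration used in the study.

```python
# Minimal sketch of an LSTM kinematics decoder: binned spike counts in,
# cursor/limb kinematics out. All sizes are illustrative.
import torch
import torch.nn as nn

class KinematicsLSTM(nn.Module):
    def __init__(self, n_neurons: int = 200, hidden_size: int = 128, n_kin: int = 4):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_neurons, hidden_size=hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, n_kin)   # e.g. x/y position and velocity

    def forward(self, spikes: torch.Tensor) -> torch.Tensor:
        # spikes: (batch, time_bins, n_neurons) binned firing rates
        hidden, _ = self.lstm(spikes)
        return self.readout(hidden)                    # kinematics at every time bin

decoder = KinematicsLSTM()
spikes = torch.randn(4, 50, 200)   # 4 trials, 50 bins, 200 simultaneously recorded units
kinematics = decoder(spikes)       # (4, 50, 4) decoded trajectories
```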


Optifab 2019
2019
Author(s):  
Chung-Ying Wang ◽  
Chien-Yao Huang ◽  
Jung-Hsing Wang ◽  
Jun-Cheng Chen ◽  
Wei-Cheng Lin ◽  
...  

Author(s):  
Dejiang Kong ◽  
Fei Wu

The widespread use of positioning technology has made it feasible to mine people's movements, and large volumes of trajectory data have accumulated. How to efficiently leverage these data for location prediction has become an increasingly popular research topic, as it is fundamental to location-based services (LBS). Existing methods often focus either on long-term (days or months) visit prediction (i.e., point-of-interest recommendation) or on real-time location prediction (i.e., trajectory prediction). In this paper, we are interested in the location prediction problem under a weak real-time condition and aim to predict users' movements in the next minutes or hours. We propose a Spatial-Temporal Long-Short Term Memory (ST-LSTM) model that naturally incorporates spatio-temporal influences into the LSTM to mitigate the problem of data sparsity. Further, we employ a hierarchical extension of the proposed ST-LSTM (HST-LSTM) in an encoder-decoder manner, which models contextual historical visit information to boost prediction performance. The proposed HST-LSTM is evaluated on a real-world trajectory data set, and the experimental results demonstrate the effectiveness of the proposed model.
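A minimal sketch of the spatial-temporal idea follows: each visit is represented by a location embedding concatenated with embeddings of the discretized time and distance gaps since the previous visit, and an LSTM scores candidate next locations. The embedding sizes, bucket counts, and plain concatenation are assumptions; the paper's actual ST-LSTM gating and hierarchical encoder-decoder structure are not reproduced here.

```python
# Rough sketch of spatial-temporal features feeding an LSTM next-location predictor.
# All sizes and the concatenation scheme are illustrative assumptions.
import torch
import torch.nn as nn

class SimpleSTLSTM(nn.Module):
    def __init__(self, n_locations: int, n_time_buckets: int = 24, n_dist_buckets: int = 10,
                 emb: int = 32, hidden: int = 64):
        super().__init__()
        self.loc_emb = nn.Embedding(n_locations, emb)
        self.time_emb = nn.Embedding(n_time_buckets, emb)   # discretized time gap to previous visit
        self.dist_emb = nn.Embedding(n_dist_buckets, emb)   # discretized distance gap to previous visit
        self.lstm = nn.LSTM(input_size=3 * emb, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_locations)           # scores over candidate next locations

    def forward(self, locs, time_gaps, dist_gaps):
        x = torch.cat([self.loc_emb(locs), self.time_emb(time_gaps), self.dist_emb(dist_gaps)], dim=-1)
        h, _ = self.lstm(x)
        return self.out(h[:, -1, :])                         # next-location logits

model = SimpleSTLSTM(n_locations=1000)
locs = torch.randint(0, 1000, (2, 20))        # two users, 20 recent check-ins each
tgaps = torch.randint(0, 24, (2, 20))
dgaps = torch.randint(0, 10, (2, 20))
next_loc_logits = model(locs, tgaps, dgaps)   # (2, 1000)
```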

