Near-Real-Time Forecast of Satellite-Based Soil Moisture Using Long Short-Term Memory with an Adaptive Data Integration Kernel

2020 ◽  
Vol 21 (3) ◽  
pp. 399-413 ◽  
Author(s):  
Kuai Fang ◽  
Chaopeng Shen

Abstract
Nowcasts, or near-real-time (NRT) forecasts, of soil moisture based on the Soil Moisture Active Passive (SMAP) mission could provide substantial value for a range of applications, including hazard monitoring and agricultural planning. To provide such an NRT forecast with high fidelity, we enhanced a time series deep learning architecture, long short-term memory (LSTM), with a novel data integration (DI) kernel that assimilates the most recent SMAP observations as soon as they become available. The kernel is adaptive in that it can accommodate irregular observational schedules. Tested over the contiguous United States (CONUS), this NRT forecast product delivers predictions with unprecedented accuracy when evaluated against subsequent SMAP retrievals. It showed smaller errors than NRT forecasts reported in the literature, especially at longer forecast latencies. The comparative advantage was due to LSTM’s structural improvements, as well as its ability to utilize more input variables and more training data. The DI-LSTM was compared to the original LSTM model that runs without data integration, referred to here as the projection model. We found that the DI procedure removed the autocorrelated effects of forcing errors and of errors due to processes not represented in the inputs, for example, irrigation and floodplain/lake inundation, as well as mismatches due to unseen forcing conditions. The effects of this purely data-driven DI kernel are discussed for the first time in the geosciences. Furthermore, this work presents an upper-bound estimate for the random component of the SMAP retrieval error.
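The adaptive part of the DI idea is that the model ingests the most recent available observation regardless of the irregular retrieval schedule. A minimal numpy sketch of that input construction (the function name and placeholder values are illustrative, not the authors' exact kernel):

```python
import numpy as np

def build_di_inputs(forcings, obs, obs_times, t_now):
    """Append the latest available observation and its age (lag) to the
    forcing vector at each step -- a sketch of adaptive data integration
    that tolerates irregular observation schedules."""
    rows = []
    for t in range(t_now):
        # indices of observations already available at step t
        avail = [i for i, ot in enumerate(obs_times) if ot <= t]
        if avail:
            i = avail[-1]                 # most recent observation
            last_obs, lag = obs[i], float(t - obs_times[i])
        else:
            last_obs, lag = 0.0, -1.0     # placeholder before first retrieval
        rows.append(np.concatenate([forcings[t], [last_obs, lag]]))
    return np.array(rows)
```

The augmented rows would then be fed to the LSTM in place of the raw forcings; the lag feature lets the network discount stale observations.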

2021 ◽  
Vol 3 ◽  
Author(s):  
Yueling Ma ◽  
Carsten Montzka ◽  
Bagher Bayat ◽  
Stefan Kollet

The lack of high-quality continental-scale groundwater table depth observations necessitates developing an indirect method to produce reliable estimates of water table depth anomalies (wtda) over Europe, to facilitate European groundwater management under drought conditions. Long Short-Term Memory (LSTM) networks are a deep learning technique that exploits long- and short-term dependencies in the input-output relationship, which have been observed in the response of groundwater dynamics to atmospheric and land surface processes. Here, we introduced different input variables, including precipitation anomalies (pra), the most common proxy of wtda, for the networks to arrive at improved wtda estimates at individual pixels over Europe in various experiments. All input and target data involved in this study were obtained from the simulated TSMP-G2A data set. We performed wavelet coherence analysis to gain a comprehensive understanding of the contributions of different input variable combinations to wtda estimates. Based on the different experiments, we derived an indirect method utilizing LSTM networks with pra and soil moisture anomaly (θa) as input, which achieved the optimal network performance. The regional medians of test R2 scores and RMSEs obtained by the method in areas with wtd ≤ 3.0 m were 76–95% and 0.17–0.30, respectively, constituting a 20–66% increase in median R2 and a 0.19–0.30 decrease in median RMSEs compared to LSTM networks with pra as the only input. Our results show that introducing θa significantly improved the performance of the trained networks to predict wtda, indicating the substantial contribution of θa to explaining groundwater anomalies. Also, the European wtda map reproduced by the method had good agreement with that derived from the TSMP-G2A data set with respect to drought severity, successfully detecting ~41% of strong drought events (wtda ≥ 1.5) and ~29% of extreme drought events (wtda ≥ 2) in August 2015.
The study emphasizes the importance of combining soil moisture information with precipitation information in quantifying or predicting groundwater anomalies. In the future, the indirect method derived in this study can be transferred to real-time monitoring of groundwater drought at the continental scale using remotely sensed soil moisture and precipitation observations, or respective information from weather prediction models.
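The per-pixel skill metrics quoted above (test R2 and RMSE) follow standard definitions; a minimal numpy sketch:

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 minus residual over total variance."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def rmse(y_true, y_pred):
    """Root-mean-square error."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))
```

Applied per pixel to the held-out anomaly series, the regional median of each score gives the figures reported in the abstract.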


2019 ◽  
Vol 31 (6) ◽  
pp. 1085-1113 ◽  
Author(s):  
Po-He Tseng ◽  
Núria Armengol Urpi ◽  
Mikhail Lebedev ◽  
Miguel Nicolelis

Although many real-time neural decoding algorithms have been proposed for brain-machine interface (BMI) applications over the years, an optimal, consensual approach remains elusive. Recent advances in deep learning algorithms provide new opportunities for improving the design of BMI decoders, including the use of recurrent artificial neural networks to decode neuronal ensemble activity in real time. Here, we developed a long short-term memory (LSTM) decoder for extracting movement kinematics from the activity of large (N = 134–402) populations of neurons, sampled simultaneously from multiple cortical areas, in rhesus monkeys performing motor tasks. Recorded regions included primary motor, dorsal premotor, supplementary motor, and primary somatosensory cortical areas. The LSTM's capacity to retain information for extended periods of time enabled accurate decoding for tasks that required both movements and periods of immobility. Our LSTM algorithm significantly outperformed the state-of-the-art unscented Kalman filter when applied to three tasks: center-out arm reaching, bimanual reaching, and bipedal walking on a treadmill. Notably, LSTM units exhibited a variety of well-known physiological features of cortical neuronal activity, such as directional tuning and neuronal dynamics across task epochs. LSTM modeled several key physiological attributes of cortical circuits involved in motor tasks. These findings suggest that LSTM-based approaches could yield a better algorithm strategy for neuroprostheses that employ BMIs to restore movement in severely disabled patients.


Author(s):  
Tao Gui ◽  
Qi Zhang ◽  
Lujun Zhao ◽  
Yaosong Lin ◽  
Minlong Peng ◽  
...  

In recent years, long short-term memory (LSTM) has been successfully used to model sequential data of variable length. However, LSTM can still experience difficulty in capturing long-term dependencies. In this work, we tried to alleviate this problem by introducing a dynamic skip connection, which can learn to directly connect two dependent words. Since there is no dependency information in the training data, we propose a novel reinforcement learning-based method to model the dependency relationship and connect dependent words. The proposed model computes the recurrent transition functions based on the skip connections, which provides a dynamic skipping advantage over RNNs that always tackle entire sentences sequentially. Our experimental results on three natural language processing tasks demonstrate that the proposed method can achieve better performance than existing methods. In the number prediction experiment, the proposed model outperformed LSTM with respect to accuracy by nearly 20%.
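The dynamic skip connection replaces the fixed recurrence on h_{t-1} with a learned choice among past hidden states. A minimal numpy sketch of one such step, using a greedy score in place of the paper's reinforcement-learned policy (the scoring rule here is an illustrative assumption):

```python
import numpy as np

def skip_rnn_step(h_hist, x, W_x, W_h):
    """One recurrent step with a dynamic skip connection: rather than
    always recurring on h_{t-1}, greedily pick the past hidden state
    most aligned with the projected input (a deterministic stand-in
    for the paper's reinforcement-learned skip policy)."""
    scores = np.array([float(h @ (W_x @ x)) for h in h_hist])
    h_skip = h_hist[int(np.argmax(scores))]      # learned skip target
    return np.tanh(W_x @ x + W_h @ h_skip)
```

In the paper, the choice of which past state to connect to is sampled from a learned distribution and trained with policy gradients, since no dependency supervision exists in the data.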


2020 ◽  
Vol 35 (4) ◽  
pp. 1203-1220 ◽  
Author(s):  
Qidong Yang ◽  
Chia-Ying Lee ◽  
Michael K. Tippett

Abstract
Rapid intensification (RI) is an outstanding source of error in tropical cyclone (TC) intensity predictions. RI is generally defined as a 24-h increase in TC maximum sustained surface wind speed greater than some threshold, typically 25, 30, or 35 kt (1 kt ≈ 0.51 m s−1). Here, a long short-term memory (LSTM) model for probabilistic RI predictions is developed and evaluated. The variables (features) of the model include storm characteristics (e.g., storm intensity) and environmental variables (e.g., vertical shear) over the previous 48 h. A basin-aware RI prediction model is trained (1981–2009), validated (2010–13), and tested (2014–17) on global data. Models are trained on overlapping 48-h data, which allows multiple training examples for each storm. A challenge is that the data are highly unbalanced in the sense that there are many more non-RI cases than RI cases. To cope with this data imbalance, the synthetic minority-oversampling technique (SMOTE) is used to balance the training data by generating artificial RI cases. Model ensembling is also applied to improve prediction skill further. The model’s Brier skill scores in the Atlantic and eastern North Pacific are higher than those of operational predictions for RI thresholds of 25 and 30 kt and comparable for 35 kt on the independent test data. Composites of the features associated with RI and non-RI situations provide physical insights for how the model discriminates between RI and non-RI cases. Prediction case studies are presented for some recent storms.
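SMOTE balances the training set by interpolating between minority-class samples and their nearest minority neighbours. A minimal sketch of the core interpolation step (feature scaling and class handling omitted):

```python
import numpy as np

def smote(minority, n_new, k=3, rng=None):
    """Generate n_new synthetic minority samples: pick a minority sample,
    pick one of its k nearest minority neighbours, and interpolate a
    random fraction of the way between them."""
    if rng is None:
        rng = np.random.default_rng(0)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(minority))
        d = np.linalg.norm(minority - minority[i], axis=1)
        neigh = np.argsort(d)[1:k + 1]           # skip the point itself
        j = rng.choice(neigh)
        lam = rng.random()                       # interpolation fraction
        out.append(minority[i] + lam * (minority[j] - minority[i]))
    return np.array(out)
```

Applied to the RI cases before training, this yields a class ratio closer to 1:1 without duplicating any example verbatim.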


2019 ◽  
Vol 16 (8) ◽  
pp. 3404-3409
Author(s):  
Ala Adin Baha Eldin Mustafa Abdelaziz ◽  
Ka Fei Thang ◽  
Jacqueline Lukose

Electrical energy is the most commonly used form of energy in houses, factories, buildings, and agriculture. In recent years, however, demand for electrical energy has increased with technological advancement and population growth, so a forecasting system must be developed to predict this demand as accurately as possible. For this purpose, five models were selected: Bidirectional Long Short-Term Memory (Bi-LSTM), Feed-Forward Neural Network (FFNN), Long Short-Term Memory (LSTM), Nonlinear Auto-Regressive network with eXogenous inputs (NARX), and Multiple Linear Regression (MLR). This paper demonstrates the development of these models using MATLAB and an Android mobile application used to visualize and interact with the data. The performance of the models was evaluated using the Mean Absolute Percent Error (MAPE) on historical data from Toronto, Canada and Tasmania, Australia: the years 2006 through 2016 were used as training data, and the year 2017 was held out to compute the MAPE of the models' predictions. The NARX model had the lowest MAPE in both regions: 1.9% for Toronto, Canada and 2.9% for Tasmania, Australia. Google Cloud is used as the IoT (Internet of Things) platform for the NARX model; the 2017 datasets are converted to JavaScript Object Notation (JSON) files using JavaScript, for data visualization and analysis in the Android mobile application.
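The MAPE used to compare the five models has a standard definition; a minimal sketch (in Python rather than the paper's MATLAB):

```python
import numpy as np

def mape(actual, forecast):
    """Mean Absolute Percent Error, in percent.
    Assumes no actual value is zero."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))
```

With held-out 2017 demand as `actual` and each model's output as `forecast`, the model with the smallest MAPE (here, NARX) is preferred.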


Author(s):  
Dejiang Kong ◽  
Fei Wu

The widespread use of positioning technology has made it feasible to mine people's movements, and large volumes of trajectory data have accumulated. How to efficiently leverage these data for location prediction has become an increasingly popular research topic, as it is fundamental to location-based services (LBS). Existing methods often focus either on long-term (days or months) visit prediction (i.e., point-of-interest recommendation) or on real-time location prediction (i.e., trajectory prediction). In this paper, we are interested in the location prediction problem under a weak real-time condition and aim to predict users' movements in the next minutes or hours. We propose a Spatial-Temporal Long-Short Term Memory (ST-LSTM) model that naturally combines spatial-temporal influence into LSTM to mitigate the problem of data sparsity. Further, we employ a hierarchical extension of the proposed ST-LSTM (HST-LSTM) in an encoder-decoder manner, which models contextual historical visit information in order to boost the prediction performance. The proposed HST-LSTM is evaluated on a real-world trajectory data set, and the experimental results demonstrate the effectiveness of the proposed model.
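One simple way to expose spatial-temporal influence to an LSTM is to attach the time gap and spatial gap between consecutive visits to each input. A minimal numpy sketch of that feature construction (illustrative only; the paper folds these intervals into the LSTM gates rather than the raw input):

```python
import numpy as np

def st_features(locs, times):
    """Augment each visit with the time gap and spatial gap since the
    previous visit, giving the model explicit interval information."""
    feats = []
    for t in range(len(locs)):
        if t == 0:
            dt, dd = 0.0, 0.0
        else:
            dt = float(times[t] - times[t - 1])
            dd = float(np.linalg.norm(np.asarray(locs[t]) - np.asarray(locs[t - 1])))
        feats.append([*locs[t], dt, dd])
    return np.array(feats)
```

Short gaps then signal continuations of the current trip, while long gaps signal fresh decisions, which is the sparsity-mitigating signal the ST gating exploits.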


2020 ◽  
Vol 196 ◽  
pp. 02007
Author(s):  
Vladimir Mochalov ◽  
Anastasia Mochalova

In this paper, the previously obtained results on recognition of ionograms using deep learning are expanded to predict the parameters of the ionosphere. After the ionospheric parameters have been identified on the ionogram using deep learning in real time, we can predict the parameters for some time ahead on the basis of the newly obtained data. Examples of predicting the ionosphere parameters using the long short-term memory (LSTM) recurrent neural network architecture are given. The place of the block for predicting the parameters of the ionosphere in the system for analyzing ionospheric data using deep learning methods is shown.

