MULTI-TEMPORAL LAND COVER CLASSIFICATION WITH LONG SHORT-TERM MEMORY NEURAL NETWORKS

Author(s):  
M. Rußwurm ◽  
M. Körner

Land cover classification (LCC) is a central and wide-ranging field of research in Earth observation and has already produced a variety of classification techniques. Many approaches are based on classification techniques that consider observations at a single point in time. However, some land cover classes, such as crops, change their spectral characteristics due to environmental influences and thus cannot be monitored effectively with classical mono-temporal approaches. Nevertheless, these temporal observations should be utilized to benefit the classification process. Building on the extensive research on modeling temporal dynamics with spectro-temporal profiles of vegetation indices, we propose a deep learning approach that exploits these temporal characteristics for classification tasks. In this work, we show how long short-term memory (LSTM) neural networks can be employed for crop identification with Sentinel-2A observations from large study areas and label information provided by local authorities. We compare these temporal neural network models, i.e., LSTM and recurrent neural network (RNN), with a classical non-temporal convolutional neural network (CNN) model and an additional support vector machine (SVM) baseline. With our rather straightforward LSTM variant, we exceed state-of-the-art classification performance, thus opening promising potential for further research.
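As a rough illustration of the kind of model this abstract describes, the sketch below (not the authors' code; the band count, class count, and sequence length are assumed) shows a minimal PyTorch LSTM that classifies a crop parcel from a sequence of per-date spectral vectors.

```python
# Minimal sketch: an LSTM classifier over multi-temporal Sentinel-2 pixel
# observations, assuming each sample is a sequence of per-date spectral
# vectors with a single crop label. Band/class counts are illustrative.
import torch
import torch.nn as nn

class CropLSTM(nn.Module):
    def __init__(self, n_bands=13, hidden=128, n_classes=17):
        super().__init__()
        self.lstm = nn.LSTM(n_bands, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                 # x: (batch, timesteps, bands)
        _, (h_n, _) = self.lstm(x)        # h_n: (1, batch, hidden)
        return self.head(h_n.squeeze(0))  # class logits per sample

model = CropLSTM()
x = torch.randn(8, 26, 13)               # e.g. 26 observation dates per parcel
logits = model(x)
print(logits.shape)                       # torch.Size([8, 17])
```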

2020 ◽  
Vol 2020 ◽  
pp. 1-7
Author(s):  
Xiaolu Wei ◽  
Binbin Lei ◽  
Hongbing Ouyang ◽  
Qiufeng Wu

This study attempts to predict stock index prices using multivariate time series analysis. Its motivation is the observation that stock index price series contain weak periodic patterns as well as long-term and short-term information, which traditional approaches such as autoregressive models and the support vector machine (SVM), as well as conventional neural networks, may fail to capture. To address this, the study applies a Temporal Pattern Attention Long Short-Term Memory (TPA-LSTM) model for prediction. The results show that stock index price prediction with the TPA-LSTM algorithm achieves better performance than traditional deep neural networks such as the recurrent neural network (RNN), the convolutional neural network (CNN), and the long- and short-term time-series network (LSTNet).
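A minimal sketch of the temporal pattern attention idea is given below, assuming PyTorch and illustrative series counts and window sizes (this is not the study's implementation): CNN filters slide across time over the LSTM hidden-state matrix, and attention over the resulting row vectors re-weights the final hidden state before prediction.

```python
# Sketch of temporal pattern attention on top of an LSTM (assumed sizes):
# a filter bank runs across the time axis of the hidden-state matrix, and
# attention over the resulting per-dimension vectors forms a context that
# is combined with the last hidden state.
import torch
import torch.nn as nn

class TPALSTM(nn.Module):
    def __init__(self, n_series, hidden=64, n_filters=32, window=24, horizon=1):
        super().__init__()
        self.lstm = nn.LSTM(n_series, hidden, batch_first=True)
        self.filters = nn.Conv1d(1, n_filters, kernel_size=window)
        self.w_att = nn.Linear(n_filters, hidden, bias=False)
        self.w_h = nn.Linear(hidden, hidden, bias=False)
        self.w_v = nn.Linear(n_filters, hidden, bias=False)
        self.out = nn.Linear(hidden, horizon)
        self.window = window

    def forward(self, x):                        # x: (batch, T, n_series)
        h_all, _ = self.lstm(x)                  # (batch, T, hidden)
        h_t = h_all[:, -1]                       # last hidden state
        H = h_all[:, -self.window:]              # attention window
        b, w, m = H.shape
        # run the filter bank row-wise: each hidden dimension across time
        rows = H.transpose(1, 2).reshape(b * m, 1, w)
        Hc = self.filters(rows).reshape(b, m, -1)         # (batch, hidden, k)
        scores = torch.sigmoid((self.w_att(Hc) * h_t.unsqueeze(1)).sum(-1))
        v = (scores.unsqueeze(-1) * Hc).sum(1)            # attention context
        return self.out(self.w_h(h_t) + self.w_v(v))      # next-step prediction

model = TPALSTM(n_series=5)
y = model(torch.randn(16, 24, 5))                # 24-step history of 5 series
print(y.shape)                                   # torch.Size([16, 1])
```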


Author(s):  
Ralph Sherwin A. Corpuz

Analyzing natural-language-based customer satisfaction (CS) is a tedious process, particularly when large datasets must be categorized manually. Fortunately, the advent of supervised machine learning techniques has paved the way for the design of efficient CS categorization systems. This paper presents the feasibility of designing a text categorization model using two popular and robust algorithms, the Support Vector Machine (SVM) and the Long Short-Term Memory (LSTM) neural network, to automatically categorize complaints, suggestions, feedback, and commendations. The study found that, in terms of training accuracy, SVM achieved a best rating of 98.63% while LSTM achieved 99.32%. These results indicate that the two algorithms are on par in training accuracy, but SVM trained faster than LSTM by approximately 35.47 s. The training performance of both algorithms is attributed to the limitations of the dataset size, the high dimensionality of both the English and Tagalog languages, and the applicability of the feature engineering techniques used. Interestingly, in actual implementation, both algorithms were 100% effective in predicting the correct CS categories. Hence, the preference between the two algorithms comes down to the available dataset and to the skill in optimizing them through feature engineering and implementing them in actual text categorization applications.
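For illustration, a minimal sketch of the SVM side of such a pipeline is shown below, with hypothetical example texts and labels (the paper's dataset, features, and preprocessing are not reproduced here): TF-IDF features feeding a linear support vector classifier over the four CS categories named in the abstract.

```python
# Minimal sketch (toy data): TF-IDF + linear SVM for four-way CS categorization.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

texts = [
    "The staff was slow and unhelpful.",          # complaint
    "Please add an online appointment system.",   # suggestion
    "The new portal works fine for me.",          # feedback
    "Excellent service, keep it up!",             # commendation
]
labels = ["complaint", "suggestion", "feedback", "commendation"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(texts, labels)
print(clf.predict(["Great job by the support team!"]))
```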


2019 ◽  
Vol 9 (8) ◽  
pp. 1687 ◽  
Author(s):  
Huafeng Qin ◽  
Peng Wang

Finger-vein biometrics has been extensively investigated for personal verification. One challenge is that finger-vein acquisition is affected by many factors, which produce many ambiguous regions in the finger-vein image; in such regions, the separability between vein and background is generally poor. Despite recent advances in finger-vein pattern segmentation, current solutions still lack the robustness to extract finger-vein features from raw images because they do not take into account the complex spatial dependencies of the vein pattern. This paper proposes a deep learning model that extracts vein features by combining Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) models. Firstly, we automatically assign labels based on a combination of known state-of-the-art handcrafted finger-vein image segmentation techniques and generate, for each labeled pixel, sequences along different directions. Secondly, several Stacked Convolutional Neural Network and Long Short-Term Memory (SCNN-LSTM) models are trained independently on the resulting sequences. The outputs of the various SCNN-LSTMs form a complementary and over-complete representation and are jointly fed into a Probabilistic Support Vector Machine (P-SVM) to predict the probability that a pixel is foreground (i.e., a vein pixel) given several sequences centered on it. Thirdly, we propose a supervised encoding scheme to extract the binary vein texture; a threshold is computed automatically by maximizing the separation between the inter-class and intra-class distances. In our approach, the CNN learns robust features for vein texture representation and the LSTM captures the complex spatial dependencies of vein patterns, so the pixels in any region of a test image can be classified effectively. In addition, supervised information is employed to encode the vein patterns, so the resulting encoded images contain more discriminative features. Experimental results on a public finger-vein database show that the proposed approach significantly improves finger-vein verification accuracy.
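The sketch below illustrates one plausible SCNN-LSTM branch under assumed patch and sequence sizes (it is not the authors' architecture): a small CNN encodes each patch of a directional sequence centered on a pixel, and an LSTM over the encoded sequence scores the pixel as vein or background.

```python
# Sketch of one SCNN-LSTM branch (assumed shapes): CNN patch encoder + LSTM
# over a directional patch sequence, producing a per-pixel vein probability.
import torch
import torch.nn as nn

class SCNNLSTM(nn.Module):
    def __init__(self, feat=64, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, feat),
        )
        self.lstm = nn.LSTM(feat, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, seq):              # seq: (batch, steps, 1, patch, patch)
        b, t = seq.shape[:2]
        f = self.cnn(seq.flatten(0, 1)).view(b, t, -1)   # encode each patch
        _, (h_n, _) = self.lstm(f)                       # scan along the direction
        return torch.sigmoid(self.head(h_n.squeeze(0)))  # vein probability

probs = SCNNLSTM()(torch.randn(4, 9, 1, 15, 15))   # 9 patches along one direction
print(probs.shape)                                  # torch.Size([4, 1])
```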


2020 ◽  
Vol 34 (04) ◽  
pp. 4989-4996
Author(s):  
Ekaterina Lobacheva ◽  
Nadezhda Chirkova ◽  
Alexander Markovich ◽  
Dmitry Vetrov

One of the most popular approaches to neural network compression is sparsification, i.e., learning sparse weight matrices. In structured sparsification, weights are set to zero in groups corresponding to structural units, e.g., neurons. We further develop the structured sparsification approach for gated recurrent neural networks such as the Long Short-Term Memory (LSTM). Specifically, in addition to sparsifying individual weights and neurons, we propose sparsifying the preactivations of gates. This makes some gates constant and simplifies the LSTM structure. We test our approach on text classification and language modeling tasks. Our method improves neuron-wise compression of the model in most tasks. We also observe that the resulting structure of gate sparsity depends on the task, and we connect the learned structures to the specifics of the particular tasks.
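A minimal sketch of the underlying mechanism is given below, under the assumption of a standard group-lasso penalty (not necessarily the authors' exact regularizer): the incoming weights of each gate preactivation in a PyTorch LSTM are grouped so that whole preactivations can be driven to zero, making the corresponding gate constant.

```python
# Sketch (assumed regularizer): group-lasso over the rows of weight_ih/weight_hh
# that produce each gate preactivation; a zeroed group makes that gate constant.
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)

def gate_group_lasso(lstm_layer):
    """L2 norm of each gate preactivation's incoming weights, summed."""
    # PyTorch stacks the i, f, g, o gates along dim 0 of weight_ih/weight_hh.
    w = torch.cat([lstm_layer.weight_ih_l0, lstm_layer.weight_hh_l0], dim=1)
    return w.norm(dim=1).sum()   # one group per gate preactivation (4 * hidden)

x = torch.randn(8, 20, 32)
out, _ = lstm(x)
task_loss = out.pow(2).mean()                     # stand-in for the real objective
loss = task_loss + 1e-3 * gate_group_lasso(lstm)  # sparsity strength is illustrative
loss.backward()
```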


2021 ◽  
Author(s):  
P. Jiang ◽  
I. Bychkov ◽  
J. Liu ◽  
A. Hmelnov

Forecasting air pollutant concentrations, which are influenced by pollution accumulation, traffic flow, and industrial emissions, has attracted extensive attention for decades. In this paper, we propose a spatio-temporal attention convolutional long short-term memory neural network (Attention-CNN-LSTM) for air pollutant concentration forecasting. Firstly, we analyze the Granger causalities between different stations and establish a hyperparametric Gaussian vector weight function to determine spatial autocorrelation variables, which are used as part of the input features. Secondly, a convolutional neural network (CNN) is employed to extract the temporal dependence and spatial correlation of the input, while feature maps and channels are weighted by an attention mechanism to improve the effectiveness of the features. Finally, a deep long short-term memory (LSTM) time series predictor is established to learn the long-term and short-term dependence of pollutant concentrations. To reduce the effect of diverse complex factors on the LSTM, inherent features are extracted from historical air pollutant concentration data, and meteorological data and timestamp information are incorporated into the proposed model. Extensive experiments were performed with Attention-CNN-LSTM, autoregressive integrated moving average (ARIMA), support vector regression (SVR), and traditional LSTM and CNN models. The results demonstrate the feasibility and practicability of Attention-CNN-LSTM in estimating CO and NO concentrations.
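As an illustration only, the sketch below assumes a simple squeeze-and-excitation-style channel attention and illustrative feature counts (it does not reproduce the Granger-causality preprocessing or the paper's exact attention design): a 1-D CNN over the input window, attention re-weighting its feature channels, and an LSTM producing the next-step concentration.

```python
# Sketch (assumed design): CNN feature extractor, channel attention, LSTM predictor.
import torch
import torch.nn as nn

class AttentionCNNLSTM(nn.Module):
    def __init__(self, n_features, channels=32, hidden=64):
        super().__init__()
        self.conv = nn.Conv1d(n_features, channels, kernel_size=3, padding=1)
        self.att = nn.Sequential(                     # channel attention weights
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(channels, channels), nn.Sigmoid(),
        )
        self.lstm = nn.LSTM(channels, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):                              # x: (batch, T, features)
        c = torch.relu(self.conv(x.transpose(1, 2)))   # (batch, channels, T)
        w = self.att(c).unsqueeze(-1)                  # (batch, channels, 1)
        h, _ = self.lstm((c * w).transpose(1, 2))      # attention-weighted maps
        return self.out(h[:, -1])                      # next-step concentration

model = AttentionCNNLSTM(n_features=12)   # pollutant, meteorological, time features
pred = model(torch.randn(8, 48, 12))      # 48-hour history per station
print(pred.shape)                         # torch.Size([8, 1])
```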


In this study, a new hybrid model based on deep neural networks is presented to predict the direction and magnitude of short-term Forex market movements. The overall model follows a scalping strategy and is intended for high-frequency trading. The proposed hybrid combines three deep neural network models. The first is a multi-input deep network built from Long Short-Term Memory (LSTM) layers. The second is a multi-input deep network built from one-dimensional convolutional neural network layers. The third has a simpler structure and is a multi-input model built from multi-layer perceptron layers. The overall model is based on a majority vote of the three models. The study showed that the models based on LSTM layers provided better results than the other models and even the hybrid model, with more than 70% accuracy.
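To make the ensemble concrete, the sketch below uses assumed input shapes and branch sizes (not the study's models): an LSTM branch, a one-dimensional CNN branch, and an MLP branch each classify the next move as up or down, and a simple majority vote combines them.

```python
# Sketch (assumed shapes): three branches over a price window, combined by vote.
import torch
import torch.nn as nn

class LSTMBranch(nn.Module):
    def __init__(self, feats=5, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(feats, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, 2)
    def forward(self, x):                       # x: (batch, window, feats)
        _, (h, _) = self.lstm(x)
        return self.fc(h.squeeze(0))

class CNNBranch(nn.Module):
    def __init__(self, feats=5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(feats, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(16, 2),
        )
    def forward(self, x):
        return self.net(x.transpose(1, 2))      # conv over the time axis

class MLPBranch(nn.Module):
    def __init__(self, feats=5, window=60):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(feats * window, 64),
                                 nn.ReLU(), nn.Linear(64, 2))
    def forward(self, x):
        return self.net(x)

def majority_vote(x, branches):
    votes = torch.stack([b(x).argmax(dim=1) for b in branches])  # (3, batch)
    return (votes.sum(dim=0) >= 2).long()       # "up" if at least 2 branches agree

x = torch.randn(4, 60, 5)                       # 60 ticks of OHLC + volume (assumed)
branches = [LSTMBranch(), CNNBranch(), MLPBranch()]
print(majority_vote(x, branches))               # predicted direction per sample
```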

