Comparison of Long Short-Term Memory Networks and Random Forest for Sentinel-1 Time Series Based Large Scale Crop Classification

2021 · Vol 13 (24) · pp. 5000
Author(s): Felix Reuß, Isabella Greimeister-Pfeil, Mariette Vreugdenhil, Wolfgang Wagner

To ensure future food security, improved agricultural management approaches are required. For many of these applications, precise knowledge of the distribution of crop types is essential. Various machine and deep learning models have been used for automated crop classification from microwave remote sensing time series, but the application of these approaches at large spatial and temporal scales has barely been investigated. In this study, the performance of two frequently used algorithms, Long Short-Term Memory (LSTM) networks and Random Forest (RF), is assessed for crop classification based on Sentinel-1 time series and meteorological data at a large spatial and temporal scale. For data from Austria, the Netherlands, and France and the years 2015–2019, scenarios with different spatial and temporal scales were defined. To quantify the complexity of these scenarios, the Fisher Discriminant measurement F1 (FDR1) was used. The results demonstrate that both classifiers achieve similar results for simple classification tasks with low FDR1 values. With increasing FDR1 values, however, LSTM networks outperform RF. This suggests that the ability of LSTM networks to learn long-term dependencies and to identify the relation between radar time series and meteorological data becomes increasingly important for more complex applications. The study thus underlines the importance of deep learning models, including LSTM networks, for large-scale applications.
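To make the comparison concrete, the following minimal sketch computes a two-class Fisher's discriminant ratio as a separability measure (the paper's exact FDR1 definition for multi-class, multi-temporal data may differ) and sets up the two classifier types on synthetic Sentinel-1-like sequences, using scikit-learn for the Random Forest and PyTorch for the LSTM. All shapes and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Sketch: Fisher's discriminant ratio (F1/FDR1-style measure) plus the two
# classifier types compared in the paper. Synthetic data; not the authors' code.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic "Sentinel-1 time series": 200 samples, 30 time steps, 2 features
# (e.g. VV and VH backscatter), two crop classes with shifted temporal means.
X = rng.normal(size=(200, 30, 2))
y = rng.integers(0, 2, size=200)
X[y == 1] += 0.8  # shift class 1 so the task is separable

def fisher_discriminant_ratio(X_flat, y):
    """Max over features of (mu0 - mu1)^2 / (var0 + var1), a common two-class
    form of the maximum Fisher's discriminant ratio (assumed; may differ from FDR1)."""
    ratios = []
    for f in range(X_flat.shape[1]):
        x0, x1 = X_flat[y == 0, f], X_flat[y == 1, f]
        ratios.append((x0.mean() - x1.mean()) ** 2 / (x0.var() + x1.var() + 1e-12))
    return max(ratios)

print("FDR1-style separability:", fisher_discriminant_ratio(X.reshape(len(X), -1), y))

# Classifier 1: Random Forest on the flattened time series.
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X.reshape(len(X), -1), y)

# Classifier 2: LSTM over the raw sequence (last hidden state -> class logits).
class LSTMClassifier(nn.Module):
    def __init__(self, n_features=2, hidden=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                 # x: (batch, time, features)
        _, (h, _) = self.lstm(x)          # h: (1, batch, hidden)
        return self.head(h[-1])           # class logits

model = LSTMClassifier()
logits = model(torch.tensor(X, dtype=torch.float32))
print("LSTM logits shape:", logits.shape)
```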

Author(s): Vasily D. Derbentsev, Vitalii S. Bezkorovainyi, Iryna V. Luniak

This study investigates the issues of forecasting short-term changes in currency trends using deep learning models, which is relevant both for the scientific community and for traders and investors. The purpose of this study is to build a model for forecasting the direction of change in the prices of currency quotes based on deep neural networks. The developed architecture was based on the gated recurrent unit (GRU), a modification of the Long Short-Term Memory (LSTM) model that is simpler in terms of the number of parameters and training time. Forecasts of the dynamics of quotations of the euro/dollar currency pair and the most capitalised cryptocurrency, Bitcoin/dollar, were computed using daily, four-hour and hourly datasets. The binary classification results (forecasting the direction of trend change) for daily and hourly quotations were generally better than those of time series models or neural networks of other architectures (in particular, multilayer perceptron and LSTM models). The highest classification accuracy was obtained for the daily-quotation model: about 72% for euro/dollar and about 69% for Bitcoin/dollar. For four-hour and hourly time series, the classification accuracy decreased, which can be explained both by the stronger impact of "market noise" and by probable overfitting. Computer simulation demonstrated that the models predict a rising trend better than a declining one. The study confirmed the prospects of applying deep learning models to short-term forecasting of currency quote time series. The developed models proved to be effective for both fiat and cryptocurrencies. The proposed system of models based on deep neural networks can be used as a basis for developing an automated trading system for the foreign exchange market.
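A minimal PyTorch sketch of the kind of model described, a GRU that maps a window of returns to the probability that the next move is upward, is given below; the window length, hidden size, and toy random-walk series are assumptions, not the authors' configuration.

```python
# Sketch: GRU-based binary classifier for the direction of the next price move.
# Window length, hidden size, and the toy random-walk series are illustrative
# assumptions, not the paper's tuned configuration.
import numpy as np
import torch
import torch.nn as nn

class GRUDirectionClassifier(nn.Module):
    def __init__(self, n_features=1, hidden=32):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)      # logit for P(next move is up)

    def forward(self, x):                     # x: (batch, window, features)
        _, h = self.gru(x)                    # h: (1, batch, hidden)
        return self.head(h[-1]).squeeze(-1)

def make_windows(prices, window=30):
    """Turn a 1-D price series into (window of returns, up/down label) pairs."""
    returns = prices[1:] / prices[:-1] - 1.0
    xs = [returns[i:i + window] for i in range(len(returns) - window)]
    ys = [1.0 if returns[i + window] > 0 else 0.0 for i in range(len(returns) - window)]
    x = torch.tensor(np.stack(xs), dtype=torch.float32).unsqueeze(-1)
    y = torch.tensor(ys, dtype=torch.float32)
    return x, y

# Toy usage with a random-walk series standing in for EUR/USD or BTC/USD quotes.
prices = np.exp(np.cumsum(np.random.randn(1000) * 0.01)) * 100.0
x, y = make_windows(prices)
model = GRUDirectionClassifier()
loss = nn.BCEWithLogitsLoss()(model(x), y)
print("initial BCE loss:", float(loss))
```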


2021
Author(s): Pradeep Lall, Tony Thomas, Ken Blecker

Prognostics and Remaining Useful Life (RUL) estimation of complex systems are essential for operational safety and increased efficiency, and they help schedule maintenance proactively. Modeling the remaining useful life of a system with many complexities has become possible with the rapid development of deep learning as a computational technique for failure prediction. Deep learning can adapt to the complex and nonlinear behavior of multivariate parameters, which is difficult to capture with traditional time-series models for forecasting and prediction. In this paper, a deep learning approach based on a Long Short-Term Memory (LSTM) network is used to predict the remaining useful life of the PCB at different conditions of temperature and vibration. This technique can identify the underlying patterns in the time series that are predictive of the RUL. The study involves feature vector identification and RUL estimation for SAC305, SAC105, and tin-lead solder PCBs under different vibration levels and temperature conditions. The acceleration levels of vibration are fixed at 5 g and 10 g, while the temperature levels are 55°C and 100°C. The test board is a multilayer FR4 configuration with JEDEC standard dimensions, consisting of twelve packages arranged in a rectangular pattern. Strain signals are acquired from the backside of the PCB at symmetric locations to identify the failure of all the packages during vibration. The strain signals are resistance values acquired simultaneously during the experiment until most of the packages on the board have failed. The feature vectors are identified from statistical analysis of the frequency and instantaneous frequency components of the strain signals. Principal component analysis is used as a data reduction technique to identify the distinct patterns produced by the four strain signals as packages fail during vibration. The LSTM deep learning method is used to model the RUL of the packages at each individual vibration operating condition for all three solder materials in this study. In addition, a combined RUL prediction model that accounts for changes in operating conditions is developed for each material.
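The pipeline the abstract outlines (statistical features from the strain signals, PCA reduction, LSTM regression of RUL) could be sketched as follows; the synthetic shapes, the number of principal components, and the network sizes are assumptions rather than the authors' implementation.

```python
# Sketch: PCA feature reduction on windowed strain-signal features, followed by
# an LSTM regressor for remaining useful life (RUL). Synthetic shapes and
# assumed hyperparameters; not the authors' implementation.
import numpy as np
import torch
import torch.nn as nn
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Assume features have already been extracted per time window from the four
# strain channels (e.g. frequency / instantaneous-frequency statistics):
# 50 test runs x 120 windows x 16 raw features.
features = rng.normal(size=(50, 120, 16))
rul = np.linspace(1.0, 0.0, 120)[None, :].repeat(50, axis=0)  # normalized RUL target

# Reduce the 16 features to 4 principal components (fit on all windows).
pca = PCA(n_components=4)
reduced = pca.fit_transform(features.reshape(-1, 16)).reshape(50, 120, 4)

class LSTMRul(nn.Module):
    def __init__(self, n_features=4, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                   # x: (batch, windows, components)
        out, _ = self.lstm(x)               # per-window hidden states
        return self.head(out).squeeze(-1)   # RUL estimate at every window

model = LSTMRul()
x = torch.tensor(reduced, dtype=torch.float32)
y = torch.tensor(rul, dtype=torch.float32)
pred = model(x)
print("RUL prediction shape:", pred.shape, "MSE:", float(nn.MSELoss()(pred, y)))
```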


2019 · Vol 57 (6) · pp. 114-119
Author(s): Yuxiu Hua, Zhifeng Zhao, Rongpeng Li, Xianfu Chen, Zhiming Liu, ...

Sensors · 2020 · Vol 20 (17) · pp. 4887
Author(s): Hailun Zhang, Rui Fu

At an intersection with complex traffic flow, early detection of the intentions of drivers in surrounding vehicles can enable advanced driver assistance systems (ADAS) to warn the driver in advance or prompt their subsystems to assess the risk and intervene early. Although different drivers show various driving characteristics, the kinematic parameters of human-driven vehicles can be used as a predictor of the driver's intention within a short time. In this paper, we propose a new hybrid approach for vehicle behavior recognition at intersections based on time series prediction and deep learning networks. First, the lateral position, longitudinal position, speed, and acceleration of the vehicle are predicted using the online autoregressive integrated moving average (ARIMA) algorithm. Next, a variant of the long short-term memory network, the bidirectional long short-term memory (Bi-LSTM) network, is used to detect the vehicle's turning behavior using the predicted parameters as well as the derived parameters, i.e., the lateral velocity, lateral acceleration, and heading angle. The validity of the proposed method is verified at real intersections using the public driving data of the Next Generation Simulation (NGSIM) project. The results of the turning behavior detection show that the proposed hybrid approach exhibits significant improvement over a conventional algorithm; the average recognition rates are 94.2% and 93.5% at 2 s and 1 s, respectively, before initiating the turning maneuver.
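A minimal sketch of the two-stage hybrid, an ARIMA forecast of a kinematic channel followed by a Bi-LSTM classifier over the predicted and derived parameters, is shown below using statsmodels and PyTorch; the ARIMA order, sequence length, and three-class output are illustrative assumptions, not the paper's exact configuration.

```python
# Sketch of the two-stage hybrid: an ARIMA forecast of a kinematic parameter
# followed by a bidirectional LSTM classifier over the (predicted + derived)
# parameter sequences. Orders, sizes and the three-class output are assumptions.
import numpy as np
import torch
import torch.nn as nn
from statsmodels.tsa.arima.model import ARIMA

# Stage 1: forecast the next few samples of one kinematic channel (e.g. speed).
rng = np.random.default_rng(2)
speed = np.cumsum(rng.normal(0, 0.1, size=200)) + 10.0   # toy trajectory
fit = ARIMA(speed, order=(2, 1, 1)).fit()
speed_forecast = fit.forecast(steps=10)                  # predicted future samples

# Stage 2: Bi-LSTM over sequences of 7 parameters (lateral/longitudinal position,
# speed, acceleration, lateral velocity, lateral acceleration, heading angle).
class BiLSTMIntent(nn.Module):
    def __init__(self, n_features=7, hidden=64, n_classes=3):  # e.g. left/right/straight (assumed)
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):               # x: (batch, time, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])    # logits from the last time step

model = BiLSTMIntent()
batch = torch.randn(8, 30, 7)           # 8 predicted trajectories, 30 steps each
print("first forecast samples:", speed_forecast[:3])
print("intent logits shape:", model(batch).shape)
```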


2021 · Vol 5 (4) · pp. 380
Author(s): Abdulkareem A. Hezam, Salama A. Mostafa, Zirawani Baharum, Alde Alanda, Mohd Zaki Salikon

The impacts of Distributed Denial-of-Service (DDoS) attacks are undeniably significant, and with the proliferation of IoT devices, they are expected to continue to rise in the future. Even though many solutions have been developed to identify and prevent this assault, which mainly targets IoT devices, the danger continues to exist and is now larger than ever. Denial-of-service attacks are commonly launched to prevent legitimate requests from being completed: the targeted machines or resources are swamped with false requests in an attempt to overpower the systems. There have been many efforts in recent years to use machine learning to tackle puzzle-like middle-box problems and other Artificial Intelligence (AI) problems. Modern botnets are so sophisticated that they may evolve daily, as in the case of the Mirai botnet. This research presents a deep learning method based on a real-world dataset gathered by infecting nine Internet of Things devices with two of the most destructive DDoS botnets, Mirai and Bashlite, and then analyzing the results. The paper proposes the BiLSTM-CNN model, which combines a Bidirectional Long Short-Term Memory (BiLSTM) recurrent neural network and a Convolutional Neural Network (CNN). The model employs the CNN for data processing and feature optimization, while the BiLSTM is used for classification. The model is evaluated by comparing its results with three standard deep learning models: CNN, Recurrent Neural Network (RNN), and Long Short-Term Memory Recurrent Neural Network (LSTM–RNN). There is a strong need for more realistic datasets to fully test the capabilities of such models, which is where the N-BaIoT dataset comes in, as it includes multi-device IoT data. The N-BaIoT dataset contains DDoS attacks launched with two of the most used botnets, Bashlite and Mirai. The four models are tested with 10-fold cross-validation. The obtained results show that the BiLSTM-CNN outperforms all other individual classifiers in every aspect, achieving an accuracy of 89.79%, an error rate of 0.1546, a very high precision of 93.92%, and an F1-score and recall of 85.73% and 89.11%, respectively. Among the three individual models, the RNN achieves the highest accuracy at 89.77%, followed by the LSTM at 89.71%, while the CNN achieves the lowest accuracy of all classifiers at 89.50%.
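A hedged sketch of a CNN-plus-BiLSTM classifier evaluated with 10-fold cross-validation is shown below; treating the N-BaIoT feature vector as a 1-D sequence for the convolution and all layer sizes are assumptions, and the training loop is omitted for brevity.

```python
# Sketch of a CNN + BiLSTM network for N-BaIoT-style traffic features, with a
# 10-fold cross-validation loop. Layer sizes and the treatment of the 115
# features as a 1-D "sequence" are assumptions, not the paper's exact model.
import numpy as np
import torch
import torch.nn as nn
from sklearn.model_selection import KFold

class BiLSTMCNN(nn.Module):
    def __init__(self, n_features=115, n_classes=2):
        super().__init__()
        self.conv = nn.Sequential(                 # CNN stage: feature extraction
            nn.Conv1d(1, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.lstm = nn.LSTM(32, 64, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * 64, n_classes)   # BiLSTM output -> class logits

    def forward(self, x):                          # x: (batch, n_features)
        z = self.conv(x.unsqueeze(1))              # (batch, 32, n_features // 2)
        z = z.transpose(1, 2)                      # (batch, steps, channels) for the LSTM
        out, _ = self.lstm(z)
        return self.head(out[:, -1])

# Toy 10-fold cross-validation over random data standing in for N-BaIoT records.
X = np.random.randn(500, 115).astype(np.float32)
y = np.random.randint(0, 2, size=500)
for fold, (train_idx, test_idx) in enumerate(
        KFold(n_splits=10, shuffle=True, random_state=0).split(X)):
    model = BiLSTMCNN()
    logits = model(torch.from_numpy(X[test_idx]))  # training loop omitted for brevity
    acc = (logits.argmax(1).numpy() == y[test_idx]).mean()
    print(f"fold {fold}: untrained accuracy {acc:.2f}")
```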


2021 · Vol 11 (1) · pp. 61-67
Author(s): Watthana Pongsena, Prakaidoy Sitsayabut, Nittaya Kerdprasop, Kittisak Kerdprasop, ...

Forex is the largest financial market in the world. Traditionally, fundamental and technical analysis have been the strategies Forex traders use most often. Nowadays, advanced computational technology, in particular Artificial Intelligence (AI), plays a significant role in the financial domain, and applications based on AI technologies, particularly machine learning and deep learning, are constantly being developed. The historical data of the Forex market are time-series data, where past values affect the values that will appear in the future. Several existing works from other application domains have shown that Long Short-Term Memory (LSTM), a particular kind of deep learning model suited to modeling time series, provides better performance than traditional machine learning algorithms. In this paper, we aim to develop a predictive model that forecasts the daily price changes of currency pairs in the Forex market using LSTM. In addition, we conduct an extensive experiment to demonstrate the effect of various factors on the performance of the model. The experimental results show that the optimized LSTM model predicts the direction of the future price with an accuracy of up to 61.25 percent.
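As an illustration, the sketch below trains a small LSTM to predict next-day direction from a window of daily log returns and reports directional accuracy on a chronological hold-out; the window size, network size, and stand-in price series are assumptions, not the paper's tuned setup.

```python
# Sketch: training an LSTM to predict the next-day direction of a currency pair
# from a window of daily log returns. Window size, hidden size and the stand-in
# price series are illustrative assumptions, not the paper's tuned model.
import numpy as np
import torch
import torch.nn as nn

def windows_from_prices(prices, window=20):
    rets = np.diff(np.log(prices))                      # daily log returns
    x = np.stack([rets[i:i + window] for i in range(len(rets) - window)])
    y = (rets[window:] > 0).astype(np.float32)          # 1 = price rises the next day
    return torch.tensor(x, dtype=torch.float32).unsqueeze(-1), torch.tensor(y)

class LSTMDirection(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                               # x: (batch, window, 1)
        _, (h, _) = self.lstm(x)
        return self.head(h[-1]).squeeze(-1)             # logit for "up"

prices = np.exp(np.cumsum(np.random.randn(1500) * 0.005)) * 1.1  # stand-in for daily closes
x, y = windows_from_prices(prices)
split = int(0.8 * len(x))                               # chronological train/test split
model = LSTMDirection()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()
for epoch in range(20):                                 # short demo run, full-batch updates
    opt.zero_grad()
    loss = loss_fn(model(x[:split]), y[:split])
    loss.backward()
    opt.step()
with torch.no_grad():
    acc = ((model(x[split:]) > 0).float() == y[split:]).float().mean()
print(f"directional accuracy on the held-out period: {float(acc):.2%}")
```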


2019 · Vol 120 (3) · pp. 425-441
Author(s): Sonali Shankar, P. Vigneswara Ilavarasan, Sushil Punia, Surya Prakash Singh

Purpose
Better forecasting always leads to better management and planning of operations. Container throughput data are complex and often exhibit multiple seasonality, which makes accurate forecasting difficult. The purpose of this paper is to forecast container throughput using deep learning methods and benchmark their performance against traditional time-series methods.

Design/methodology/approach
In this study, long short-term memory (LSTM) networks are implemented to forecast container throughput. The container throughput data of the Port of Singapore are used for empirical analysis. The forecasting performance of the LSTM model is compared with seven different time-series forecasting methods, namely, autoregressive integrated moving average (ARIMA), simple exponential smoothing, Holt–Winters', error-trend-seasonality, trigonometric regressors (TBATS), neural network (NN) and ARIMA + NN. The relative error matrix is used to analyze the performance of the different models with respect to bias, accuracy and uncertainty.

Findings
The results showed that LSTM outperformed all other benchmark methods. From a statistical perspective, the Diebold–Mariano test was also conducted to further substantiate the better forecasting performance of LSTM over the other methods.

Originality/value
The proposed study contributes to the literature on container throughput forecasting and adds value to the supply chain theory of forecasting. Second, it explains the architecture of the deep-learning-based LSTM method and discusses in detail the steps to implement it.
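The Diebold–Mariano comparison mentioned under Findings can be sketched as follows; squared-error loss and a one-step horizon are assumed, and the toy error series merely stand in for ARIMA and LSTM forecast errors on a common test set.

```python
# Sketch of the Diebold-Mariano test used to compare LSTM forecasts against a
# benchmark (e.g. ARIMA). Squared-error loss and a one-step horizon are assumed;
# longer horizons need more autocovariance lags in the long-run variance.
import numpy as np
from scipy.stats import norm

def diebold_mariano(e1, e2, h=1):
    """DM statistic and two-sided p-value for equal predictive accuracy.
    e1, e2: forecast errors of the two competing models on the same test set."""
    d = e1 ** 2 - e2 ** 2                 # loss differential (squared-error loss)
    n = len(d)
    d_bar = d.mean()
    # long-run variance of d_bar using h-1 autocovariance terms
    gamma = [np.cov(d[k:], d[:n - k])[0, 1] if k > 0 else d.var() for k in range(h)]
    var_dbar = (gamma[0] + 2 * sum(gamma[1:])) / n
    dm = d_bar / np.sqrt(var_dbar)
    return dm, 2 * (1 - norm.cdf(abs(dm)))

# Toy usage: the second model ("LSTM") has slightly smaller errors than the first ("ARIMA").
rng = np.random.default_rng(3)
e_arima = rng.normal(0, 1.0, 100)
e_lstm = rng.normal(0, 0.8, 100)
dm, p = diebold_mariano(e_arima, e_lstm)
print(f"DM = {dm:.2f}, p = {p:.3f}  (positive DM means the second model has the smaller loss)")
```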

