An LSTM-Based Method with Attention Mechanism for Travel Time Prediction

Sensors ◽  
2019 ◽  
Vol 19 (4) ◽  
pp. 861 ◽  
Author(s):  
Xiangdong Ran ◽  
Zhiguang Shan ◽  
Yufei Fang ◽  
Chuang Lin

Traffic prediction is based on modeling the complex non-linear spatiotemporal traffic dynamics in a road network. In recent years, long short-term memory (LSTM) has been applied to traffic prediction with good results. Existing LSTM methods for traffic prediction have two drawbacks: they do not use the departure time through the links for prediction, and their way of modeling long-term dependence in time series is not direct with respect to traffic prediction. An attention mechanism is implemented by constructing a neural network according to its task and has recently demonstrated success in a wide range of tasks. In this paper, we propose an LSTM-based method with an attention mechanism for travel time prediction. We present the proposed model in a tree structure. The model substitutes a tree structure with an attention mechanism for the unfolding scheme of a standard LSTM to construct the depth of the LSTM and to model long-term dependence. The attention mechanism operates over the output layer of each LSTM unit. The departure time is used as the aspect of the attention mechanism, which integrates it into the proposed model. We use the AdaGrad method to train the proposed model. On the datasets provided by Highways England, the experimental results show that the proposed model achieves better accuracy than LSTM and other baseline methods. The case study suggests that the departure time is effectively exploited via the attention mechanism.
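The attention step described above — scoring each LSTM unit's output against an aspect vector (here, a departure-time embedding) and pooling the outputs by the resulting weights — can be sketched in plain Python. The vectors and the aspect embedding below are illustrative stand-ins, not the paper's actual parameterization:

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, aspect):
    """Score each LSTM output vector against the aspect vector
    (e.g. a departure-time embedding) via a dot product, then
    return the attention-weighted context vector and the weights."""
    scores = [sum(h_i * a_i for h_i, a_i in zip(h, aspect))
              for h in hidden_states]
    weights = softmax(scores)
    dim = len(hidden_states[0])
    context = [sum(w * h[j] for w, h in zip(weights, hidden_states))
               for j in range(dim)]
    return context, weights
```

A hidden state that aligns with the departure-time aspect receives a larger weight, so the pooled context is dominated by the time steps most relevant to that departure time.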

Author(s):  
Tao Gui ◽  
Qi Zhang ◽  
Lujun Zhao ◽  
Yaosong Lin ◽  
Minlong Peng ◽  
...  

In recent years, long short-term memory (LSTM) has been successfully used to model sequential data of variable length. However, LSTM can still experience difficulty in capturing long-term dependencies. In this work, we tried to alleviate this problem by introducing a dynamic skip connection, which can learn to directly connect two dependent words. Since there is no dependency information in the training data, we propose a novel reinforcement learning-based method to model the dependency relationship and connect dependent words. The proposed model computes the recurrent transition functions based on the skip connections, which provides a dynamic skipping advantage over RNNs that always tackle entire sentences sequentially. Our experimental results on three natural language processing tasks demonstrate that the proposed method can achieve better performance than existing methods. In the number prediction experiment, the proposed model outperformed LSTM with respect to accuracy by nearly 20%.
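As a toy illustration of the dynamic skip connection, the recurrence below reads, at each step, a hidden state chosen by a per-step skip index rather than always the immediately preceding one. The fixed list of skip targets stands in for the learned reinforcement-learning policy, and a trivial transition function replaces the LSTM cell:

```python
def run_with_skips(tokens, skip_choices, cell):
    """Unroll a recurrence where step t reads the hidden state at index
    skip_choices[t] (any index <= t) instead of always h[t-1].
    skip_choices is a stand-in for the learned skip policy; cell is a
    stand-in for an LSTM/RNN transition function."""
    h = [0.0]  # h[0] is the initial state
    outputs = []
    for t, x in enumerate(tokens):
        prev = h[skip_choices[t]]  # dynamically chosen predecessor state
        h.append(cell(prev, x))
        outputs.append(h[-1])
    return outputs
```

Because a step can connect directly to a distant predecessor, a dependency between two far-apart words no longer has to survive every intermediate transition, which is the intuition behind the skipping advantage described above.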


Author(s):  
Shruti Patil ◽  
Venkatesh M. Mudaliar ◽  
Pooja Kamat ◽  
Shilpa Gite

A chatbot is software that can carry on a conversation, reproducing a certain level of human expression, between people and machines using natural human language. With the advent of AI, chatbots have evolved from simple rule-based models to increasingly sophisticated ones. A striking feature of current chatbot frameworks is their ability to retain and use specific features and contexts of a conversation, enabling them to interact with humans in real-time surroundings. The paper presents a detailed survey of the models used to handle the learning of long-term dependencies in a chatbot. The paper proposes a novel hybrid long short-term memory (LSTM)-based ensemble model to retain information in specific contexts. The proposed model uses a defined number of LSTM networks working together to produce the aggregate prediction class for the input query and conversation. We found that both the LSTM and GRU ensemble methods work well in different dataset environments and that the ensemble technique is an effective one in chatbot applications.
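The aggregation step of such an ensemble — each member network predicts a class for the query and the ensemble returns the majority class — can be sketched as follows. The member models here are hypothetical stand-ins for the trained LSTM/GRU networks:

```python
from collections import Counter

def ensemble_predict(models, query):
    """Collect the class predicted by each ensemble member (stand-ins
    for trained LSTM/GRU networks) and return the majority vote."""
    votes = [m(query) for m in models]
    return Counter(votes).most_common(1)[0][0]
```

Majority voting is only one possible aggregation rule; weighted or confidence-based schemes follow the same pattern with a different reduction over the votes.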


Author(s):  
А.С. БОРОДИН ◽  
А.Р. АБДЕЛЛАХ ◽  
А.Е. КУЧЕРЯВЫЙ

The use of artificial intelligence in communication networks of the fifth (5G) and subsequent generations provides completely new opportunities, including for traffic forecasting. This is especially important for Internet of Things (IoT) traffic, because the number of IoT devices is very large. The article proposes to apply deep learning to predict IoT traffic using a long short-term memory (LSTM) neural network.
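A common way to frame such traffic forecasting for an LSTM is to slice the measured series into fixed-length input windows with a one-step-ahead target. A minimal sketch, where the window length and horizon are illustrative choices rather than values from the article:

```python
def make_windows(series, n_lags, horizon=1):
    """Turn a univariate traffic series into (input window, target)
    pairs: each sample pairs n_lags consecutive measurements with the
    value `horizon` steps after the window."""
    pairs = []
    for t in range(len(series) - n_lags - horizon + 1):
        x = series[t:t + n_lags]
        y = series[t + n_lags + horizon - 1]
        pairs.append((x, y))
    return pairs
```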


2021 ◽  
Vol 11 (14) ◽  
pp. 6625
Author(s):  
Yan Su ◽  
Kailiang Weng ◽  
Chuan Lin ◽  
Zeqin Chen

An accurate dam deformation prediction model is vital to a dam safety monitoring system, as it helps assess and manage dam risks. Most traditional dam deformation prediction algorithms ignore the interpretation and evaluation of variables and lack qualitative measures. This paper proposes a data-processing framework that uses a long short-term memory (LSTM) model coupled with an attention mechanism to predict the deformation response of a dam structure. First, the random forest (RF) model is introduced to assess the relative importance of impact factors and screen input variables. Second, the density-based spatial clustering of applications with noise (DBSCAN) method is used to identify and filter equipment-based abnormal values, reducing the random error in the measurements. Finally, the coupled model is used to focus on important factors in the time dimension in order to obtain more accurate nonlinear prediction results. The results of the case study show that, of all tested methods, the proposed coupled method performed best. In addition, it was found that temperature and water level both have significant impacts on dam deformation and can serve as reliable metrics for dam management.
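The DBSCAN-based filtering step can be illustrated on one-dimensional sensor readings: a measurement survives if it is a core point (at least `min_pts` samples within `eps`, itself included) or lies within `eps` of some core point; everything else is flagged as noise. This is a simplified 1-D sketch with hypothetical `eps`/`min_pts` values, not the paper's tuned configuration:

```python
def dbscan_noise(values, eps, min_pts):
    """Return indices of measurements DBSCAN would label as noise:
    points that are neither core points nor within eps of a core point."""
    n = len(values)
    core = [sum(1 for u in values if abs(u - values[i]) <= eps) >= min_pts
            for i in range(n)]
    return [i for i in range(n)
            if not core[i]
            and not any(core[j] and abs(values[j] - values[i]) <= eps
                        for j in range(n))]
```

Dropping only density-isolated points (rather than, say, fixed-threshold clipping) is what lets the method remove equipment glitches without discarding legitimate extremes that sit inside a dense cluster of readings.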


Author(s):  
Azim Heydari ◽  
Meysam Majidi Nezhad ◽  
Davide Astiaso Garcia ◽  
Farshid Keynia ◽  
Livio De Santoli

Air pollution monitoring is constantly increasing, giving more and more attention to its consequences on human health. Since nitrogen dioxide (NO2) and sulfur dioxide (SO2) are the major pollutants, various models have been developed for predicting their potential damage. Nevertheless, providing precise predictions remains very difficult. In this study, a new hybrid intelligent model based on long short-term memory (LSTM) and the multi-verse optimization algorithm (MVO) has been developed to predict and analyze the air pollution produced by combined cycle power plants. In the proposed model, the LSTM model is the forecaster engine predicting the amount of NO2 and SO2 produced by the plant, while the MVO algorithm is used to optimize the LSTM parameters in order to achieve a lower forecasting error. In addition, to evaluate the proposed model's performance, the model has been applied to real data from a combined cycle power plant in Kerman, Iran. The datasets include wind speed, air temperature, NO2, and SO2 for five months (May–September 2019) with a 3-h time step. The model has also been tested with two different types of input parameters: type (1) includes wind speed, air temperature, and different lagged values of the output variables (NO2 and SO2); type (2) includes only lagged values of the output variables (NO2 and SO2). The obtained results show that the proposed model achieves higher accuracy than other combined forecasting benchmark models (ENN-PSO, ENN-MVO, and LSTM-PSO) across the different network input variables.
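The two input configurations differ only in whether the exogenous weather readings are concatenated with the lagged pollutant values. A minimal sketch of assembling both sample types (the lag count and the toy values are illustrative, not the study's settings):

```python
def build_inputs(exog, target, n_lags, use_exog=True):
    """Build one (features, target) sample per time step t.
    Type (1), use_exog=True: exogenous readings at t (e.g. wind speed,
    air temperature) concatenated with the last n_lags target values.
    Type (2), use_exog=False: the lagged target values only."""
    samples = []
    for t in range(n_lags, len(target)):
        lags = target[t - n_lags:t]
        x = (list(exog[t]) + lags) if use_exog else lags
        samples.append((x, target[t]))
    return samples
```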


2021 ◽  
Author(s):  
Seyed Vahid Moravvej ◽  
Mohammad Javad Maleki Kahaki ◽  
Moein Salimi Sartakhti ◽  
Abdolreza Mirzaei

2021 ◽  
pp. 1-17
Author(s):  
Enda Du ◽  
Yuetian Liu ◽  
Ziyan Cheng ◽  
Liang Xue ◽  
Jing Ma ◽  
...  

Summary: Accurate production forecasting is an essential task that accompanies the entire process of reservoir development. Limited by their prediction principles and workflows, traditional approaches struggle to make rapid predictions. With the development of artificial intelligence, data-driven models provide an alternative approach to production forecasting. To fully take the impact of interwell interference on production into account, this paper proposes a deep learning-based hybrid model (GCN-LSTM), in which a graph convolutional network (GCN) is used to capture the complicated spatial patterns between wells, and a long short-term memory (LSTM) neural network is adopted to extract intricate temporal correlations from historical production data. To implement the proposed model more efficiently, two data-preprocessing procedures are performed: outliers in the data set are removed by using box-plot visualization, and measurement noise is reduced by a wavelet transform. The robustness and applicability of the proposed model are evaluated in two scenarios with different data types using the root mean square error (RMSE), the mean absolute error (MAE), and the mean absolute percentage error (MAPE). The results show that the proposed model can effectively capture spatial and temporal correlations to make rapid and accurate oil production forecasts.
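The three evaluation metrics named above are standard and can be computed directly from the observed and predicted series:

```python
import math

def rmse(y, p):
    """Root mean square error between observed y and predicted p."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, p)) / len(y))

def mae(y, p):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, p)) / len(y)

def mape(y, p):
    """Mean absolute percentage error, in percent.
    Assumes no observed value is zero."""
    return 100.0 * sum(abs((a - b) / a) for a, b in zip(y, p)) / len(y)
```

RMSE penalizes large deviations more heavily than MAE, while MAPE expresses the error relative to production volume, which is why the three are commonly reported together.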

