Ionospheric VTEC Prediction Model in Low-Latitude Regions Based on LSTM Neural Network

2021 ◽  
Author(s):  
T-B Z ◽  
Hui-Jian Liang ◽  
Shi-Guang Wang ◽  
Chen-Guang Ouyang

Abstract: Ionospheric delay is one of the main noise sources affecting Global Navigation Satellite Systems, radio detection and ranging (radar) systems, and very-long-baseline interferometry. One of the most important and common ways to reduce this phase delay is to establish accurate nowcasting and forecasting models of ionospheric total electron content. For forecasting models, the active ionosphere at low latitudes leads to far larger differences between long-term predictions and the actual state of the ionosphere than at mid-to-high latitudes. To address the low accuracy of long-term prediction models at low latitudes, this article presents a low-latitude, long-term ionospheric prediction model based on a multi-input-multi-output, long short-term memory (LSTM) neural network. To verify the feasibility of the model, we first predicted the vertical total electron content 24 and 48 hours in advance for each day of July 2020 and then compared the predictions corresponding to a given day across all days. Furthermore, in the model-modification step, we used historical data from June 2020 as the validation set, identified large offsets in predictions made during active periods, and used the ratio of the mean absolute error of the detected results to that of the predicted results as a correction coefficient for our multi-input-multi-output LSTM model. The average root mean square error of the 24-hour-advance predictions of the modified model was 4.4 TECU, better than the 5.1 TECU of the unmodified multi-input-multi-output LSTM model and the 5.9 TECU of the IRI-2016 model.
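The correction step described above can be sketched in a few lines. This is a minimal reading of the abstract, not the paper's exact formula: the coefficient is taken as the ratio of the detected results' MAE to the predicted results' MAE on a validation month, and is assumed to be applied multiplicatively to later forecasts; the function names and the multiplicative application are this sketch's assumptions.

```python
import numpy as np

def mae(a, b):
    """Mean absolute error between two equal-length series."""
    return float(np.mean(np.abs(np.asarray(a, float) - np.asarray(b, float))))

def correction_coefficient(detected, predicted, truth):
    """Ratio of the detected results' MAE to the predicted results' MAE
    on a validation set (one plausible reading of the paper's coefficient)."""
    return mae(detected, truth) / mae(predicted, truth)

def apply_correction(forecast, k):
    """Scale a later VTEC forecast by the coefficient
    (multiplicative use is an assumption of this sketch)."""
    return np.asarray(forecast, float) * k
```

On the validation month the coefficient would be fitted once and then reused for the July 2020 forecasts.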

2020 ◽  
Vol 12 (9) ◽  
pp. 1354
Author(s):  
Maria Kaselimi ◽  
Athanasios Voulodimos ◽  
Nikolaos Doulamis ◽  
Anastasios Doulamis ◽  
Demitris Delikaraoglou

The necessity of predicting the spatio-temporal phenomenon of ionospheric variability is closely related to the requirement of many users to obtain high-accuracy positioning with low-cost equipment. The Precise Point Positioning (PPP) technique is widely accepted by the scientific community as a means of providing a high level of positioning accuracy from a single receiver. However, its main drawback is the long convergence time needed to achieve centimeter-level accuracy. Hereby, we propose a deep learning-based approach to ionospheric modeling. This method exploits the advantages of Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNN) for time-series modeling and predicts the total electron content per satellite from a specific station using a causal, supervised deep learning method. The scope of the proposed method is to compare and evaluate the between-satellites ionospheric delay estimation, and to aggregate the per-satellite Total Electron Content (TEC) outcomes into a single solution over the station, thus constructing regional TEC models in an attempt to replace Global Ionospheric Map (GIM) data. The proposed recurrent method is evaluated against the traditional Autoregressive (AR) and Autoregressive Moving Average (ARMA) methods for per-satellite prediction of vertical total electron content (VTEC) values. The proposed model achieves an error lower than 1.5 TECU, slightly better than the accuracy of current GIM products, which is about 2.0–3.0 TECU.
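The AR baseline the abstract compares against is straightforward to reproduce. A minimal sketch, fitting an AR(p) model by ordinary least squares with numpy; the function names are this sketch's own, and a real comparison would use a proper time-series library:

```python
import numpy as np

def fit_ar(series, p):
    """Fit x_t = a_1 x_{t-p} + ... + a_p x_{t-1} + c by least squares.
    Returns [a_1, ..., a_p, c], lag coefficients oldest-lag first."""
    x = np.asarray(series, dtype=float)
    rows = np.array([x[i:i + p] for i in range(len(x) - p)])
    X = np.hstack([rows, np.ones((len(rows), 1))])  # append intercept column
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef

def predict_next(series, coef, p):
    """One-step-ahead prediction from the last p observations."""
    x = np.asarray(series[-p:], dtype=float)  # oldest first, matching fit_ar
    return float(np.dot(coef[:-1], x) + coef[-1])
```

For per-satellite VTEC one such model would be fitted per satellite track, then compared against the LSTM outputs on the same held-out interval.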


2018 ◽  
Vol 7 (4.15) ◽  
pp. 25 ◽  
Author(s):  
Said Jadid Abdulkadir ◽  
Hitham Alhussian ◽  
Muhammad Nazmi ◽  
Asim A Elsheikh

Forecasting time-series data is imperative, especially when planning requires modelling with uncertain knowledge of future events. Recurrent neural network models have been applied in industry and outperform standard artificial neural networks in forecasting, but fail in long-term time-series forecasting due to the vanishing gradient problem. This study offers a robust solution for long-term forecasting using a special recurrent architecture, the Long Short-Term Memory (LSTM) model, which overcomes the vanishing gradient problem. LSTM is specifically designed to avoid the long-term dependency problem as its default behavior. Empirical analysis is performed using quantitative forecasting metrics and comparative model performance on the forecasted outputs. An evaluation validates that, in terms of error metrics, the LSTM model provides better forecasts of the Standard & Poor's 500 Index (S&P 500) than other forecasting models.
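The gating that lets LSTM sidestep the vanishing gradient problem is visible in a single forward step: the cell state is updated additively through the forget gate rather than being squashed through a nonlinearity at every step. A minimal numpy sketch of one step (gate ordering and variable names are this sketch's conventions, not from any particular paper):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM forward step. W: (4H, D), U: (4H, H), b: (4H,).
    Gates are stacked in the order [input, forget, candidate, output]."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[:H])            # input gate
    f = sigmoid(z[H:2 * H])       # forget gate
    g = np.tanh(z[2 * H:3 * H])   # candidate cell state
    o = sigmoid(z[3 * H:])        # output gate
    c = f * c_prev + i * g        # additive update: the stable gradient path
    h = o * np.tanh(c)            # exposed hidden state
    return h, c
```

The `c = f * c_prev + i * g` line is the key: gradients flow back through `c` scaled by `f` alone, so the network can learn to keep `f` near 1 and preserve information over long horizons.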


Sensors ◽  
2019 ◽  
Vol 19 (4) ◽  
pp. 861 ◽  
Author(s):  
Xiangdong Ran ◽  
Zhiguang Shan ◽  
Yufei Fang ◽  
Chuang Lin

Traffic prediction is based on modeling the complex, non-linear spatio-temporal dynamics of traffic in a road network. In recent years, Long Short-Term Memory (LSTM) has been applied to traffic prediction with good results. Existing LSTM methods for traffic prediction have two drawbacks: they do not use the departure time through the links, and their way of modeling long-term dependence in time series is not direct with respect to traffic prediction. An attention mechanism is implemented by constructing a neural network tailored to its task and has recently demonstrated success in a wide range of tasks. In this paper, we propose an LSTM-based method with an attention mechanism for travel time prediction, presented as a tree structure. The proposed model substitutes a tree structure with attention for the unfolding of standard LSTM, thereby building the depth of the LSTM and modeling long-term dependence. The attention mechanism operates over the output layer of each LSTM unit, with the departure time as its aspect, which integrates departure time into the model. We train the proposed model with the AdaGrad method. On datasets provided by Highways England, experimental results show that the proposed model achieves better accuracy than LSTM and other baseline methods. A case study suggests that the departure time is effectively exploited through the attention mechanism.
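Aspect-conditioned attention over per-step LSTM outputs, as described above, can be sketched compactly: each time step's output is scored against an aspect vector (here, an embedding of the departure time), and the scores are softmax-normalized into a weighted sum. This is a generic attention sketch, not the paper's tree-structured variant; `Wa` and the function names are this sketch's assumptions.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_pool(H, aspect, Wa):
    """Attention over per-step LSTM outputs H (T, d), scored against an
    aspect vector (e.g. a departure-time embedding) via bilinear map Wa.
    Returns the context vector and the attention weights."""
    scores = H @ (Wa @ aspect)    # (T,) one score per time step
    alpha = softmax(scores)       # weights sum to 1
    return alpha @ H, alpha       # context (d,), weights (T,)
```

Because the aspect enters only through the scores, changing the departure time re-weights the same LSTM outputs without re-running the recurrence.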


Author(s):  
Tao Gui ◽  
Qi Zhang ◽  
Lujun Zhao ◽  
Yaosong Lin ◽  
Minlong Peng ◽  
...  

In recent years, long short-term memory (LSTM) has been successfully used to model sequential data of variable length. However, LSTM can still have difficulty capturing long-term dependencies. In this work, we alleviate this problem by introducing a dynamic skip connection, which can learn to directly connect two dependent words. Since there is no dependency information in the training data, we propose a novel reinforcement learning-based method to model the dependency relationships and connect dependent words. The proposed model computes its recurrent transition functions based on the skip connections, which gives it a dynamic-skipping advantage over RNNs that always process entire sentences sequentially. Experimental results on three natural language processing tasks demonstrate that the proposed method achieves better performance than existing methods. In the number-prediction experiment, the proposed model outperformed LSTM in accuracy by nearly 20%.
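The core mechanism, computing the recurrent transition from a hidden state several steps back rather than the immediately preceding one, can be sketched with a plain tanh RNN. Here `choose_skip` stands in for the learned reinforcement-learning policy described above; the policy itself, and all names below, are this sketch's assumptions.

```python
import numpy as np

def tanh_rnn_skip(xs, choose_skip, W, U, H):
    """Tanh RNN whose recurrent input at step t is the hidden state from
    k steps back, with k given by choose_skip(t) (a stand-in for the
    paper's learned skip policy). xs: (T, D); returns hidden states (T, H)."""
    hs = [np.zeros(H)]                   # h_0 is the zero state
    for t, x in enumerate(xs, start=1):
        k = min(choose_skip(t), t)       # cannot skip past the start
        h_src = hs[t - k]                # dynamic skip connection
        hs.append(np.tanh(W @ x + U @ h_src))
    return np.stack(hs[1:])
```

With `choose_skip` fixed at 1 this reduces to a standard RNN; larger skips shorten the gradient path between distant, dependent positions.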

