Electrical load forecasting through long short term memory

Author(s):  
Debani Prasad Mishra ◽  
Sanhita Mishra ◽  
Rakesh Kumar Yadav ◽  
Rishabh Vishnoi ◽  
Surender Reddy Salkuti

For a power supplier, maintaining demand-supply equilibrium is of utmost importance. Electrical energy must be generated according to demand, since large amounts of electrical energy cannot be stored. For the proper functioning of a power supply system, an adequate load-forecasting model is a necessity. In almost every industry today, whether it be healthcare, agriculture, or consulting, growing digitization and automation are prominent features. As a result, large data sets related to these industries are being generated, which, when subjected to rigorous analysis, yield novel methods to optimize the business and services offered. This paper aims to ascertain the viability of long short-term memory (LSTM) neural networks, a recurrent neural network architecture capable of handling both long-term and short-term dependencies in data sets, for predicting the load to be met by a Dispatch Center located in a major city. The results show appreciable accuracy in forecasting future demand.
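The paper does not include code; a minimal PyTorch sketch of the kind of LSTM load forecaster it describes is given below. The layer sizes, input window length, and forecast horizon are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch of an LSTM load forecaster (hypothetical sizes; not the authors' code).
import torch
import torch.nn as nn

class LoadForecaster(nn.Module):
    def __init__(self, n_features=1, hidden_size=64, horizon=1):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, horizon)

    def forward(self, x):                  # x: (batch, window, n_features)
        out, _ = self.lstm(x)              # out: (batch, window, hidden_size)
        return self.head(out[:, -1, :])    # forecast from the last hidden state

model = LoadForecaster()
x = torch.randn(32, 24, 1)                 # e.g. 24 past hourly load readings per sample
y_hat = model(x)                           # (32, 1) next-step load forecast
```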

Water ◽  
2019 ◽  
Vol 11 (7) ◽  
pp. 1387 ◽  
Author(s):  
Le ◽  
Ho ◽  
Lee ◽  
Jung

Flood forecasting is an essential requirement in integrated water resource management. This paper suggests a Long Short-Term Memory (LSTM) neural network model for flood forecasting, where daily discharge and rainfall were used as input data. Moreover, characteristics of the data sets that may influence model performance were also of interest. The Da River basin in Vietnam was chosen, and two different combinations of input data sets from before 1985 (when the Hoa Binh dam was built) were used to forecast the flowrate one, two, and three days ahead at Hoa Binh Station. The predictive ability of the model is quite impressive: the Nash–Sutcliffe efficiency (NSE) reached 99%, 95%, and 87% for the three forecasting horizons, respectively. The findings of this study suggest a viable option for flood forecasting on the Da River in Vietnam, where the river basin stretches across several countries and downstream flows (in Vietnam) may fluctuate suddenly due to flood discharge from upstream hydroelectric reservoirs.
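The Nash–Sutcliffe efficiency cited above is a standard hydrological skill score; the sketch below shows how it is typically computed from observed and simulated discharge. This is the textbook definition, not code from the paper.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Standard NSE: 1 is a perfect fit, 0 means no better than the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)
```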


2018 ◽  
Vol 7 (4.15) ◽  
pp. 25 ◽  
Author(s):  
Said Jadid Abdulkadir ◽  
Hitham Alhussian ◽  
Muhammad Nazmi ◽  
Asim A Elsheikh

Forecasting time-series data is imperative, especially when planning must be done with uncertain knowledge of future events. Recurrent neural network models have been applied in industry and outperform standard artificial neural networks in forecasting, but fail in long-term time-series forecasting due to the vanishing gradient problem. This study offers a robust solution for long-term forecasting using a special recurrent neural network architecture, the Long Short-Term Memory (LSTM) model, which overcomes the vanishing gradient problem. The LSTM is specially designed to avoid the long-term dependency problem as its default behavior. Empirical analysis is performed using quantitative forecasting metrics and comparative model performance on the forecasted outputs. An evaluation is performed to validate that the LSTM model provides better forecasts of the Standard & Poor's 500 Index (S&P 500), in terms of error metrics, than other forecasting models.
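Long-term forecasting of an index such as the S&P 500 usually starts from sliding input windows over the price series. The helper below is a small illustrative sketch of that preprocessing step; the window length and horizon are assumptions, not values taken from the study.

```python
import numpy as np

def make_windows(series, window=30, horizon=1):
    """Slice a 1-D series into (input window, target) pairs for supervised LSTM training.
    The window and horizon are illustrative choices, not the paper's settings."""
    X, y = [], []
    for t in range(len(series) - window - horizon + 1):
        X.append(series[t:t + window])            # past `window` values as input
        y.append(series[t + window + horizon - 1])  # value `horizon` steps ahead as target
    return np.array(X)[..., None], np.array(y)   # shapes: (N, window, 1), (N,)
```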


2020 ◽  
Vol 224 (1) ◽  
pp. 669-681
Author(s):  
Sihong Wu ◽  
Qinghua Huang ◽  
Li Zhao

Late-time transient electromagnetic (TEM) data contain deep subsurface information and are important for resolving deeper electrical structures. However, due to their relatively small signal amplitudes, TEM responses later in time are often dominated by ambient noise. Therefore, noise removal is critical to the application of TEM data in imaging electrical structures at depth. De-noising techniques for TEM data have developed rapidly in recent years, but despite strong efforts to improve the quality of TEM responses, it remains a challenge to effectively extract the signals from unpredictable and irregular noise. In this study, we develop a new neural network architecture by combining the long short-term memory (LSTM) network with the autoencoder structure to suppress noise in TEM signals. The resulting LSTM-autoencoder yields excellent performance on synthetic data sets, including horizontal components of the electric field and the vertical component of the magnetic field generated by different sources such as dipole, loop, and grounded-line sources. The relative errors between the de-noised data sets and the corresponding noise-free transients are below 1% for most of the sampling points. A notable improvement in the resistivity structure inversion result is achieved using TEM data de-noised by the LSTM-autoencoder, in comparison with several widely used neural networks, especially for later-arriving signals that are important for constraining deeper structures. We demonstrate the effectiveness and general applicability of the LSTM-autoencoder by de-noising experiments using synthetic 1-D and 3-D TEM signals as well as field data sets. The field data from a fixed-loop survey using multiple receivers are greatly improved after de-noising by the LSTM-autoencoder, resulting in more consistent inversion models with significantly increased exploration depth. The LSTM-autoencoder is thus capable of enhancing the quality of TEM signals at later times, which enables us to better resolve deeper electrical structures.
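A simplified sketch of an LSTM-autoencoder for sequence de-noising of the kind described, trained on noisy/noise-free transient pairs, is shown below; the single-layer encoder/decoder and the layer sizes are illustrative assumptions rather than the authors' architecture.

```python
import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    """Sequence-to-sequence denoiser: encode the noisy transient, decode a cleaned one.
    Sizes are illustrative, not the authors' configuration."""
    def __init__(self, n_features=1, hidden_size=128):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.decoder = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, n_features)

    def forward(self, x):                 # x: (batch, time, n_features), noisy TEM decay
        enc_out, _ = self.encoder(x)
        dec_out, _ = self.decoder(enc_out)
        return self.out(dec_out)          # same length, de-noised estimate

# Training would minimise e.g. nn.MSELoss() between the output and the noise-free synthetic transient.
```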


Sensors ◽  
2019 ◽  
Vol 19 (4) ◽  
pp. 861 ◽  
Author(s):  
Xiangdong Ran ◽  
Zhiguang Shan ◽  
Yufei Fang ◽  
Chuang Lin

Traffic prediction is based on modeling the complex non-linear spatiotemporal traffic dynamics in a road network. In recent years, Long Short-Term Memory (LSTM) has been applied to traffic prediction with improved performance. Existing LSTM methods for traffic prediction have two drawbacks: they do not use the departure time through the links, and their way of modeling long-term dependence in time series is not direct. The attention mechanism is implemented by constructing a neural network tailored to its task and has recently demonstrated success in a wide range of tasks. In this paper, we propose an LSTM-based method with an attention mechanism for travel time prediction, organized as a tree structure. The proposed model substitutes a tree structure with an attention mechanism for the unfolding of a standard LSTM, which constructs the depth of the LSTM and models long-term dependence. The attention mechanism operates over the output layer of each LSTM unit, with the departure time used as its aspect, thereby integrating departure time into the model. We use the AdaGrad method to train the model. On datasets provided by Highways England, the experimental results show that the proposed model achieves better accuracy than standard LSTM and other baseline methods. The case study suggests that the departure time is effectively exploited by the attention mechanism.
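Below is a simplified sketch of attention over LSTM outputs with a departure-time embedding acting as the query; the paper's tree-structured construction is more involved, and the feature dimensions, number of time slots, and layer sizes here are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TimeAwareAttention(nn.Module):
    """Simplified sketch: a departure-time embedding queries the LSTM outputs.
    Sizes are illustrative; the tree-structured model in the paper is more involved."""
    def __init__(self, n_features=4, hidden_size=64, n_time_slots=96):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.time_embed = nn.Embedding(n_time_slots, hidden_size)  # departure-time slot -> query
        self.head = nn.Linear(hidden_size, 1)                      # travel-time estimate

    def forward(self, x, departure_slot):      # x: (batch, links, n_features)
        out, _ = self.lstm(x)                  # (batch, links, hidden)
        q = self.time_embed(departure_slot)    # (batch, hidden)
        scores = torch.bmm(out, q.unsqueeze(-1)).squeeze(-1)       # (batch, links)
        weights = F.softmax(scores, dim=1)
        context = torch.bmm(weights.unsqueeze(1), out).squeeze(1)  # (batch, hidden)
        return self.head(context).squeeze(-1)
```

Training with AdaGrad, as in the paper, could use `torch.optim.Adagrad(model.parameters(), lr=0.01)` with a mean-squared-error loss on travel times (learning rate assumed).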


Author(s):  
Tao Gui ◽  
Qi Zhang ◽  
Lujun Zhao ◽  
Yaosong Lin ◽  
Minlong Peng ◽  
...  

In recent years, long short-term memory (LSTM) has been successfully used to model sequential data of variable length. However, LSTM can still experience difficulty in capturing long-term dependencies. In this work, we tried to alleviate this problem by introducing a dynamic skip connection, which can learn to directly connect two dependent words. Since there is no dependency information in the training data, we propose a novel reinforcement learning-based method to model the dependency relationship and connect dependent words. The proposed model computes the recurrent transition functions based on the skip connections, which provides a dynamic skipping advantage over RNNs that always tackle entire sentences sequentially. Our experimental results on three natural language processing tasks demonstrate that the proposed method can achieve better performance than existing methods. In the number prediction experiment, the proposed model outperformed LSTM with respect to accuracy by nearly 20%.
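A heavily simplified, deterministic illustration of the skip idea follows: instead of learning where to skip with reinforcement learning, as the paper does, this sketch blends in the hidden state from a fixed number of steps back. The skip distance, blend weight, and all sizes are assumptions, not the authors' method.

```python
import torch
import torch.nn as nn

class FixedSkipLSTM(nn.Module):
    """Deterministic simplification of a skip connection: mix the previous hidden state
    with one from k steps back (the paper learns the skip target instead)."""
    def __init__(self, n_features=32, hidden_size=64, k=3):
        super().__init__()
        self.cell = nn.LSTMCell(n_features, hidden_size)
        self.k = k
        self.hidden_size = hidden_size

    def forward(self, x):                              # x: (batch, time, n_features)
        batch, time, _ = x.shape
        h = x.new_zeros(batch, self.hidden_size)
        c = x.new_zeros(batch, self.hidden_size)
        history = []
        for t in range(time):
            if t >= self.k:                            # blend in the state from k steps earlier
                h = 0.5 * h + 0.5 * history[t - self.k]
            h, c = self.cell(x[:, t, :], (h, c))
            history.append(h)
        return torch.stack(history, dim=1)             # (batch, time, hidden_size)
```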


2019 ◽  
Vol 16 (8) ◽  
pp. 3404-3409
Author(s):  
Ala Adin Baha Eldin Mustafa Abdelaziz ◽  
Ka Fei Thang ◽  
Jacqueline Lukose

Electrical energy is the most commonly used form of energy in houses, factories, buildings, and agriculture. In recent years, however, demand for electrical energy has increased due to technological advancement and population growth, so an appropriate forecasting system must be developed to predict this demand as accurately as possible. For this purpose, five models were selected: Bidirectional Long Short-Term Memory (Bi-LSTM), Feed-Forward Neural Network (FFNN), Long Short-Term Memory (LSTM), Nonlinear Auto-Regressive network with eXogenous inputs (NARX), and Multiple Linear Regression (MLR). This paper demonstrates the development of these models using MATLAB and an Android mobile application, which is used to visualize and interact with the data. The performance of the selected models was evaluated using the Mean Absolute Percent Error (MAPE). The historical data used for this evaluation were obtained from Toronto, Canada and Tasmania, Australia: the years 2006 to 2016 were used as training data, and the year 2017 was used to test the MAPE of the models' predictions against the historical data. The NARX model had the lowest MAPE for both regions, at 1.9% for Toronto, Canada and 2.9% for Tasmania, Australia. Google Cloud is used as the IoT (Internet of Things) platform for the NARX data model; the 2017 datasets are converted to JavaScript Object Notation (JSON) files using JavaScript for data visualization and analysis in the Android mobile application.
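MAPE, the metric used to compare the five models, is straightforward to compute; a short sketch using the standard definition (not the authors' MATLAB code) is given below.

```python
import numpy as np

def mape(actual, predicted):
    """Mean Absolute Percent Error in %, as used to compare the forecasting models."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))
```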

