Investigation of Performance of Electric Load Demand Forecasting with New Architecture Realized in Long Short-Term Memory Deep Learning Network

Author(s): M Vetri Selvi, Sukumar Mishra

2021, Vol 366 (1)
Author(s): Zhichao Wen, Shuhui Li, Lihua Li, Bowen Wu, Jianqiang Fu

2018, Vol 99, pp. 24-37
Author(s): Kostas M. Tsiouris, Vasileios C. Pezoulas, Michalis Zervakis, Spiros Konitsiotis, Dimitrios D. Koutsouris, ...

2020, Vol 3 (1)
Author(s): Jinghe Yuan, Rong Zhao, Jiachao Xu, Ming Cheng, Zidi Qin, ...

Abstract: We propose an unsupervised deep learning network to analyze the dynamics of membrane proteins from fluorescence intensity traces. The system was trained in an unsupervised manner on raw experimental time traces and synthesized ones, so neither a predefined number of states nor pre-labelling was required. With bidirectional Long Short-Term Memory (biLSTM) networks as the hidden layers, both past and future context is fully exploited to improve the predictions, and information can even be extracted from the noise distribution. The method was validated on a synthetic dataset and an experimental dataset of the monomeric fluorophore Cy5, and then successfully applied to extract membrane protein interaction dynamics from experimental data.
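As a rough illustration of the kind of architecture described above, the following is a minimal sketch of a biLSTM encoder that maps a 1-D intensity trace to per-time-step state probabilities. It assumes PyTorch; the hidden sizes, number of layers, the hypothetical number of states, and the per-step softmax head are illustrative assumptions, not the authors' actual network or their unsupervised training objective.

```python
# Sketch: biLSTM encoder over fluorescence intensity traces (illustrative only).
import torch
import torch.nn as nn

class BiLSTMTraceEncoder(nn.Module):
    def __init__(self, hidden_size=64, num_layers=2, num_states=4):
        super().__init__()
        # Input: (batch, time, 1) intensity values.
        self.bilstm = nn.LSTM(
            input_size=1,
            hidden_size=hidden_size,
            num_layers=num_layers,
            batch_first=True,
            bidirectional=True,   # uses both past and future context at every step
        )
        # Hypothetical per-time-step soft assignment to hidden states.
        self.head = nn.Linear(2 * hidden_size, num_states)

    def forward(self, traces):
        # traces: (batch, time) -> (batch, time, 1)
        features, _ = self.bilstm(traces.unsqueeze(-1))
        return self.head(features).softmax(dim=-1)

# Example: 8 synthetic traces, 500 time points each.
model = BiLSTMTraceEncoder()
state_probs = model(torch.randn(8, 500))
print(state_probs.shape)  # torch.Size([8, 500, 4])
```

The bidirectional layer is the relevant design choice here: each time step's representation is conditioned on the whole trace, which is what allows context on both sides of a transition to inform the state estimate.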


2022, Vol 355, pp. 02022
Author(s): Chenglong Zhang, Li Yao, Jinjin Zhang, Junyong Wu, Baoguo Shan, ...

In practice, power demand forecasting is affected by various uncertain factors, such as meteorological and economic conditions, as well as by the diversity of forecasting models, all of which increase the complexity of the task. To address this problem, and taking into account that the states at different time steps affect the output differently, this paper introduces an attention mechanism into the deep learning models. Improved convolutional neural network (CNN) and long short-term memory (LSTM) models that incorporate the attention mechanism are proposed. Verification on real-world examples shows that the proposed method achieves smaller errors and better prediction performance than the comparison models.
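To make the idea of weighting time-step states concrete, below is a minimal sketch of an LSTM forecaster with a simple additive attention layer over its hidden states, written in PyTorch. The hidden sizes, the number of input features, and the exact attention form are illustrative assumptions, not the paper's CNN-attention or LSTM-attention models.

```python
# Sketch: attention-weighted LSTM for one-step load forecasting (illustrative only).
import torch
import torch.nn as nn

class AttentionLSTMForecaster(nn.Module):
    def __init__(self, num_features=5, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(num_features, hidden_size, batch_first=True)
        # Scores each time-step state, so more informative steps get more weight.
        self.attn_score = nn.Linear(hidden_size, 1)
        self.out = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, time, num_features), e.g. load history plus weather/economic inputs
        states, _ = self.lstm(x)                       # (batch, time, hidden)
        weights = self.attn_score(states).softmax(1)   # attention weights over time
        context = (weights * states).sum(dim=1)        # weighted sum of time-step states
        return self.out(context).squeeze(-1)           # (batch,) predicted load

model = AttentionLSTMForecaster()
pred = model(torch.randn(16, 24, 5))  # 16 samples, 24 hourly steps, 5 features
print(pred.shape)  # torch.Size([16])
```

The attention weights replace the usual "take the last hidden state" step, so time steps whose states are more relevant to the forecast contribute more to the output, which is the mechanism the abstract appeals to.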

