Short-term load cross-forecasting using pattern-based neural models

Author(s):  
Grzegorz Dudek
2016 ◽  
Vol 39 ◽  
Author(s):  
Stephen Grossberg

Abstract: Christiansen & Chater's (C&C's) key goals for a language system have been realized by neural models for short-term storage of linguistic items in an Item-Order-Rank working memory, which inputs to Masking Fields that rapidly learn to categorize, or chunk, variable-length linguistic sequences, and choose the contextually most predictive list chunks while linguistic inputs are stored in the working memory.


Automatically generated summaries of long and short texts are widely used in digital services. This paper studies various neural models for text generation, with a focus on the Recurrent Neural Network (RNN) and its variants, and analyzes their results; an earlier, successful approach based on generative adversarial networks (GANs) is also reviewed. We generated and translated text while varying the number of training epochs, the sampling temperature, and the size of the input file, and observed how the Long Short-Term Memory (LSTM) model responded to these parameters. The LSTM performed best when a dataset of appropriate size was provided for training. The resulting model was tested on datasets of varying sizes. The evaluations show that the output generated by the model does not correlate with the corresponding datasets, i.e., the generated text differs from the training data.
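The temperature parameter mentioned above controls how sharply the model's output distribution is peaked during sampling. As a minimal sketch (not the paper's actual code), temperature-scaled sampling from a vector of logits can be written as:

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample a token index from raw logits, scaled by temperature.

    Lower temperatures sharpen the distribution (more confident,
    repetitive text); higher temperatures flatten it (more diverse,
    noisier text).
    """
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature
    # Softmax with max-subtraction for numerical stability.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```

At very low temperatures sampling collapses onto the most likely token, which matches the "improved confidence" behavior the abstract describes; at high temperatures the distribution approaches uniform and the generated text becomes noisier.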


2003 ◽  
Vol 26 (6) ◽  
pp. 737-738 ◽  
Author(s):  
Stephen Grossberg

Neural models have proposed how short-term memory (STM) storage in working memory and long-term memory (LTM) storage and recall are linked and interact, but are realized by different mechanisms that obey different laws. The authors' data can be understood in the light of these models, which suggest that the authors may have gone too far in obscuring the differences between these processes.


Author(s):  
Huang-Chi Chen ◽  
Yi-Ching Lin ◽  
Yu-Ju Chen ◽  
Chuo-Yean Chang ◽  
Huang-Chu Huang ◽  
...  

Author(s):  
Vitor Hugo Ferreira ◽  
Alexandre Pinto Alves da Silva

After 1991, the literature on load forecasting has been dominated by neural-network-based proposals. However, one major risk in using neural models is excessive training, i.e., data overfitting. The extent of nonlinearity provided by neural-network-based load forecasters, which depends on the input space representation, has traditionally been adjusted using heuristic procedures. The empirical nature of these procedures makes their application cumbersome and time-consuming. Autonomous modeling, including automatic input selection and model complexity control, has recently been proposed for short-term load forecasting. However, these techniques require the specification of an initial input set from which the model selects the most relevant variables. This paper explores chaos theory, as a tool from nonlinear time series analysis, to automatically select the lags of the load series data that will be used by the neural models. Bayesian inference applied to multilayer perceptrons and relevance vector machines is used in the development of autonomous neural models.
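The chaos-theoretic lag selection itself is not reproduced here. As a hedged sketch of the downstream step, the helper below (hypothetical, not from the paper) turns a univariate load series and a chosen set of lags into a supervised training matrix for a neural model:

```python
import numpy as np

def make_lagged_dataset(series, lags):
    """Build (X, y) pairs from a univariate load series.

    `lags` is the set of past offsets (in time steps) selected as
    model inputs, e.g. by a nonlinear time-series analysis such as
    the one described in the paper; here it is user-supplied.
    Row t of X holds series[t - k] for each lag k; y[t] is series[t].
    """
    series = np.asarray(series, dtype=float)
    max_lag = max(lags)
    X = np.column_stack(
        [series[max_lag - k : len(series) - k] for k in lags]
    )
    y = series[max_lag:]
    return X, y
```

For example, with lags {1, 2} each target value is predicted from the two preceding observations; in practice the lag set would come from the chaos-theoretic analysis, and the (X, y) pairs would feed the Bayesian multilayer perceptron or relevance vector machine.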


2021 ◽  
Vol 16 (1) ◽  
pp. 117-137
Author(s):  
Zsolt Lakatos

My present paper is a shortened version of my master's thesis in finance, presented in November 2015, in which I reported the results of research carried out at the Training Center for Bankers. In my models I combine the use of technical indicators with the predictive capabilities of neural network models. The case for technical indicators is that financial time series are autocorrelated in the short term, while neural models are well suited to modeling nonlinear relationships. Based on the results I concluded that, although the optimization capabilities of neural network models are very good and the appropriate technical indicators can be determined through their application, the models learn the data only slowly; thus, even under the smallest transaction cost, we can only delay the loss of our initial investment.
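As an illustration of the kind of technical-indicator feature such models consume (a sketch with illustrative window lengths, not the thesis's actual indicators), a simple moving-average crossover signal can be computed as:

```python
import numpy as np

def sma(prices, window):
    """Simple moving average, a basic technical indicator often
    used as a neural-network input feature."""
    prices = np.asarray(prices, dtype=float)
    return np.convolve(prices, np.ones(window) / window, mode="valid")

def crossover_signal(prices, fast=5, slow=20):
    """+1 where the fast SMA is above the slow SMA, -1 otherwise.

    Window lengths are illustrative, not taken from the paper.
    The fast series is trimmed so both averages end on the same day.
    """
    fast_ma = sma(prices, fast)[slow - fast:]
    slow_ma = sma(prices, slow)
    return np.where(fast_ma > slow_ma, 1, -1)
```

On a steadily rising price series the fast average stays above the slow one, so the signal is uniformly +1; such signals (or the raw indicator values) would then be fed to the neural network alongside lagged returns.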


2005 ◽  
Vol 3 (1) ◽  
pp. 19-26
Author(s):  
Vitor Hugo Ferreira ◽  
Alexandre P. Alves da Silva

2016 ◽  
Vol 39 ◽  
Author(s):  
Mary C. Potter

Abstract: Rapid serial visual presentation (RSVP) of words or pictured scenes provides evidence for a large-capacity conceptual short-term memory (CSTM) that momentarily provides rich associated material from long-term memory, permitting rapid chunking (Potter 1993; 2009; 2012). In perception of scenes as well as language comprehension, we make use of knowledge that briefly exceeds the supposed limits of working memory.

