Application of Neural Network Models in Modelling Economic Time Series with Non-constant Volatility

2015 ◽ Vol 34 ◽ pp. 600-607
Author(s): Lukas Falat, Zuzana Stanikova, Maria Durisova, Beata Holkova, Tatiana Potkanova
Healthcare ◽ 2020 ◽ Vol 8 (2) ◽ pp. 181
Author(s): Patricia Melin, Julio Cesar Monica, Daniela Sanchez, Oscar Castillo

In this paper, a multiple ensemble neural network model with fuzzy response aggregation for the COVID-19 time series is presented. Ensemble neural networks are composed of a set of modules, which are used to produce several predictions under different conditions. The modules are simple neural networks. Fuzzy logic is then used to aggregate the responses of the predictor modules, improving the final prediction by combining the module outputs in an intelligent way; it also handles the uncertainty in the process of making a final decision about the prediction. The complete model was tested on the COVID-19 time series for Mexico, at the level of the states and of the whole country. The simulation results of the multiple ensemble neural network models with fuzzy response integration show very good predicted values on the validation data set. In fact, the prediction errors of the multiple ensemble neural networks are significantly lower than those of traditional monolithic neural networks, thereby demonstrating the advantages of the proposed approach.
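As a rough illustration of the ensemble idea (not the authors' implementation), the sketch below trains a few small MLP modules on lagged windows of a series and combines their one-step forecasts with a simple fuzzy weight derived from each module's recent validation error; the window length, module sizes and the triangular "low error" membership function are illustrative assumptions.

```python
# Hedged sketch: an ensemble of small MLP modules whose outputs are combined with a
# simple fuzzy weighting based on recent validation error. All settings are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_windows(series, lag=7):
    # Turn a 1-D series into (lagged window, next value) pairs.
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

def low_error_membership(err, scale):
    # Triangular "low error" fuzzy set: 1 at zero error, 0 once err >= scale.
    return max(0.0, 1.0 - err / scale)

def fuzzy_ensemble_forecast(series, lag=7, n_modules=3, val_size=14):
    X, y = make_windows(np.asarray(series, dtype=float), lag)
    X_tr, y_tr = X[:-val_size], y[:-val_size]
    X_val, y_val = X[-val_size:], y[-val_size:]

    modules, weights = [], []
    for k in range(n_modules):
        m = MLPRegressor(hidden_layer_sizes=(16 * (k + 1),),
                         max_iter=3000, random_state=k)
        m.fit(X_tr, y_tr)
        err = np.mean(np.abs(m.predict(X_val) - y_val))
        weights.append(low_error_membership(err, scale=np.std(y_val) + 1e-9))
        modules.append(m)

    # One-step-ahead forecast: fuzzy-weighted average of the module outputs.
    x_last = np.asarray(series[-lag:], dtype=float).reshape(1, -1)
    outs = np.array([m.predict(x_last)[0] for m in modules])
    w = np.array(weights)
    w = w / w.sum() if w.sum() > 0 else np.full(n_modules, 1.0 / n_modules)
    return float(np.dot(w, outs))
```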


Recently, stock market prediction has become one of the essential application areas of time-series forecasting research. Successful prediction of the stock market can guide investors to maximize their profit and minimize their investment risk. Stock market data are highly complex, non-linear and dynamic, which still makes the task challenging. In recent years, deep learning has become one of the most popular machine learning approaches for time-series forecasting because of its temporal feature extraction capabilities. In this paper, we propose a novel Deep Learning-based Integrated Stacked Model (DISM) that integrates a 1D convolutional neural network and an LSTM recurrent neural network to extract the spatial and temporal features of stock market data. The proposed DISM is applied to forecast the stock market. We also compare the proposed DISM with single structured stacked LSTM and 1D convolutional neural network models, as well as some other statistical models, and observe that the proposed DISM produces better results in terms of accuracy and stability.
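A minimal Keras sketch of the general Conv1D-plus-LSTM stacking idea is shown below; it is not the paper's DISM, and the layer sizes, window length and training settings are assumptions chosen only for illustration.

```python
# Hedged sketch: a stacked Conv1D + LSTM forecaster, combining convolutional (local/spatial)
# and recurrent (temporal) feature extraction for a univariate price series.
import numpy as np
import tensorflow as tf

def build_conv_lstm(window=30):
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(window, 1)),
        tf.keras.layers.Conv1D(32, kernel_size=3, activation="relu"),  # local patterns
        tf.keras.layers.MaxPooling1D(pool_size=2),
        tf.keras.layers.LSTM(64),                                      # temporal dependencies
        tf.keras.layers.Dense(1)                                       # next-step value
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

def to_supervised(series, window=30):
    # Build (window, next value) pairs and add a channel dimension for Conv1D.
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = np.array(series[window:])
    return X[..., np.newaxis], y

# Usage (with `prices` a 1-D array of closing prices, scaled beforehand):
# X, y = to_supervised(prices); model = build_conv_lstm(); model.fit(X, y, epochs=50)
```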


2020
Author(s): Dipayan Biswas, P. Sooryakiran, V. Srinivasa Chakravarthy

Abstract Recurrent neural networks with associative memory properties are typically based on fixed-point dynamics, which is fundamentally distinct from the oscillatory dynamics of the brain. There have been proposals for oscillatory associative memories, but here too, in the majority of cases, only binary patterns are stored as oscillatory states in the network. Oscillatory neural network models typically operate at a single, common frequency. At multiple frequencies, even a pair of oscillators with real coupling exhibits the rich dynamics of Arnold tongues, not easily harnessed to achieve reliable memory storage and retrieval. Since real brain dynamics comprises a wide range of spectral components, there is a need for oscillatory neural network models that operate at multiple frequencies. We propose an oscillatory neural network that can model multiple time series simultaneously by performing a Fourier-like decomposition of the signals. We show that these enhanced properties of a network of Hopf oscillators become possible by operating in the complex-variable domain. In this model, the single neural oscillator is modeled as a Hopf oscillator, with adaptive frequency and dynamics described over the complex domain. We propose a novel form of coupling, dubbed "power coupling," between complex Hopf oscillators. With power coupling, expressed naturally only in the complex-variable domain, it is possible to achieve stable (normalized) phase relationships in a network of multifrequency oscillators. Network connections are trained either by Hebb-like learning or by the delta rule, adapted to the complex domain. The network is capable of modeling N-channel electroencephalogram time series with high accuracy and shows potential as an effective model of large-scale brain dynamics.
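For concreteness, the sketch below integrates a small network of complex-valued Hopf oscillators with an Euler scheme. The "power coupling" term, in which the driving oscillator's state is raised to the frequency ratio before coupling, is one plausible reading of the abstract rather than the paper's exact equations; the coupling strengths, step size and frequencies are illustrative assumptions.

```python
# Hedged numerical sketch: complex-valued Hopf oscillators with an assumed power-coupling term.
import numpy as np

def simulate_hopf_network(omegas, W, mu=1.0, dt=1e-3, steps=20000):
    n = len(omegas)
    z = np.exp(2j * np.pi * np.random.rand(n))      # random initial phases on the unit circle
    history = np.empty((steps, n), dtype=complex)
    for t in range(steps):
        # Intrinsic Hopf dynamics: dz/dt = z * (mu + i*omega - |z|^2)
        dz = z * (mu + 1j * omegas - np.abs(z) ** 2)
        # Assumed power coupling: oscillator j drives i through z_j ** (omega_i / omega_j),
        # rescaling the driving signal toward oscillator i's frequency band.
        for i in range(n):
            for j in range(n):
                if i != j and W[i, j] != 0:
                    dz[i] += W[i, j] * z[j] ** (omegas[i] / omegas[j])
        z = z + dt * dz
        history[t] = z
    return history

# Usage: two oscillators at 5 Hz and 10 Hz (angular frequencies), weakly and symmetrically coupled.
# hist = simulate_hopf_network(omegas=np.array([2*np.pi*5, 2*np.pi*10]),
#                              W=np.array([[0, 0.1], [0.1, 0]]))
```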


2022 ◽ Vol 2161 (1) ◽ pp. 012005
Author(s): C R Karthik, Raghunandan, B Ashwath Rao, N V Subba Reddy

Abstract A time series is an ordered sequence of observations recorded serially in time. The prime objective of time series analysis is to build mathematical models that provide reasonable descriptions of the training data; its goal is to forecast the forthcoming values of a series based on the history of that series. Forecasting stock markets is a challenging problem because of the number of possible variables, as well as the volatile noise, that may contribute to stock prices. However, the capability to analyze stock market trends can be vital to investors, traders and researchers, and hence has been of continued interest. Numerous statistical and machine learning techniques have been explored for stock analysis and forecasting/prediction. In this paper, we perform a comparative study of two capable artificial neural network models, i) a Deep Neural Network (DNN) and ii) a Long Short-Term Memory (LSTM) network, a type of recurrent neural network (RNN), in predicting the daily variance of the NIFTY IT index in the BSE (Bombay Stock Exchange) and NSE (National Stock Exchange) markets. The DNN was chosen for its capability to handle complex data with substantial performance and good generalization without saturating. The LSTM model was chosen because it contains intermediary memory that can hold historic patterns, so that each prediction depends on the values that preceded it. With both networks, measures were taken to reduce overfitting. Daily predictions of the NIFTY IT index were made to test the generalizability of the models. Both networks performed well at making daily predictions and generalized admirably on the NIFTY IT data. The LSTM-RNN outperformed the DNN in forecasting and thus holds more potential for longer-term estimates.
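The sketch below outlines the two architectures being compared, a feedforward DNN on a flat lag window and an LSTM on the same window as a sequence, each with dropout as one measure against overfitting. Layer sizes, window length and hyperparameters are illustrative assumptions, not the authors' settings.

```python
# Hedged sketch of the two model families compared: a feedforward DNN and an LSTM.
import tensorflow as tf

def build_dnn(window=20):
    return tf.keras.Sequential([
        tf.keras.Input(shape=(window,)),           # flat vector of the last `window` observations
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dropout(0.2),              # regularization against overfitting
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1)                   # next-day value of the index series
    ])

def build_lstm(window=20):
    return tf.keras.Sequential([
        tf.keras.Input(shape=(window, 1)),         # same window, treated as a sequence
        tf.keras.layers.LSTM(64),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(1)
    ])

# Both models would be compiled with, e.g., model.compile(optimizer="adam", loss="mse")
# and evaluated on held-out daily observations to compare forecast error.
```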


2010 ◽ Vol 13 (4) ◽ pp. 825-841
Author(s): Dulakshi S. K. Karunasingha, A. W. Jayawardena, W. K. Li

Artificial Neural Networks (ANNs) are now widely used in many areas of science, medicine, finance and engineering. Analysis and prediction of time series of hydrological and meteorological data is one such application. Problems that still exist in the application of ANNs are the lack of transparency and the expertise needed for training. An evolutionary algorithm-based method to train a type of neural network called Product Units Based Neural Networks (PUNN) was proposed in a 2006 study. This study investigates the applicability of this type of neural network to hydrological time series prediction. The technique, with a few small changes to improve performance, is applied to some benchmark time series as well as to a real hydrological time series for prediction. The results show that evolutionary PUNNs produce more transparent models than the widely used multilayer perceptron (MLP) neural network models. It is also seen that training PUNN models requires less expertise than training MLPs.
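As a minimal sketch of what a product unit computes (not the study's implementation), each hidden unit below forms a product of its strictly positive inputs raised to learnable exponents, and the output is a linear combination of the hidden units; the evolutionary training loop is omitted, and the sizes and positivity handling are assumptions.

```python
# Hedged sketch of a product-unit network forward pass, computed stably in log space.
import numpy as np

def punn_forward(X, exponents, out_weights, out_bias):
    """X: (n_samples, n_inputs) with X > 0; exponents: (n_hidden, n_inputs);
    out_weights: (n_hidden,); out_bias: scalar."""
    # Product units: hidden_k = prod_i x_i ** w_ki  =  exp(sum_i w_ki * log x_i)
    hidden = np.exp(np.log(X) @ exponents.T)        # shape (n_samples, n_hidden)
    return hidden @ out_weights + out_bias

# Usage with random parameters (a real model would evolve `exponents` with an
# evolutionary algorithm and fit the linear output layer, e.g. by least squares):
# rng = np.random.default_rng(0)
# y_hat = punn_forward(np.abs(X) + 1e-6,
#                      rng.normal(size=(5, X.shape[1])),
#                      rng.normal(size=5), 0.0)
```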

