Optimizing a quantum reservoir computer for time series prediction

2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Aki Kutvonen ◽  
Keisuke Fujii ◽  
Takahiro Sagawa

Abstract Quantum computing and neural networks show great promise for the future of information processing. In this paper we study a quantum reservoir computer (QRC), a framework that harnesses quantum dynamics for fast and efficient solving of temporal machine learning tasks such as speech recognition, time series prediction, and natural language processing. Specifically, we study the memory capacity and accuracy of a quantum reservoir computer based on the fully connected transverse-field Ising model by investigating different forms of inter-spin interactions and computing timescales. We show that variation in inter-spin interactions generally improves memory capacity, that the capacity can be greatly enhanced by engineering the type of interactions, and that there exists an optimal timescale at which the capacity is maximized. To connect computational capabilities to physical properties of the underlying system, we also study the out-of-time-ordered correlator and find that its faster decay implies a more accurate memory. Furthermore, as an example application on real-world data, we use the QRC to predict stock values.

Author(s):  
Julia El Zini ◽  
Yara Rizk ◽  
Mariette Awad

Abstract Recurrent neural networks (RNNs) have been successfully applied to various sequential decision-making tasks, natural language processing applications, and time-series predictions. Such networks are usually trained through back-propagation through time (BPTT), which is prohibitively expensive, especially as the length of the time dependencies and the number of hidden neurons increase. To reduce the training time, extreme learning machines (ELMs) have recently been applied to RNN training, reaching a 99% speedup on some applications. Due to its non-iterative nature, ELM training, when parallelized, has the potential to reach higher speedups than BPTT. In this work, we present Opt-PR-ELM, an optimized parallel RNN training algorithm based on ELM that takes advantage of GPU shared memory and of parallel QR factorization algorithms to efficiently reach optimal solutions. The theoretical analysis of the proposed algorithm is presented for six RNN architectures, including LSTM and GRU, and its performance is empirically tested on ten time-series prediction applications. Opt-PR-ELM is shown to reach up to a 461-fold speedup over its sequential counterpart and to require up to 20 times less training time than parallel BPTT. Such high speedups over new-generation CPUs are crucial in real-time applications and IoT environments.
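The non-iterative readout solve that makes ELM training parallelize well can be sketched with NumPy: a random, untrained hidden layer followed by a single QR-based least-squares solve for the output weights. The toy series, layer sizes, and lag structure below are illustrative assumptions, not the Opt-PR-ELM implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
T, n_lags, n_hidden = 500, 2, 64

t = np.arange(T)
series = np.sin(0.1 * t) + 0.1 * rng.normal(size=T)   # toy noisy time series

# Two-lag inputs and one-step-ahead targets.
X = np.column_stack([series[:-2], series[1:-1]])
y = series[2:]

# ELM hidden layer: random projection, never trained.
W = rng.normal(size=(n_lags, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)

# Output weights from one QR-based least-squares solve (no iteration at all).
Q, R = np.linalg.qr(H)
beta = np.linalg.solve(R, Q.T @ y)

rmse = np.sqrt(np.mean((H @ beta - y) ** 2))
```

Because the only trained quantity is `beta`, and the QR factorization itself parallelizes, the whole training step maps naturally onto a GPU, which is the property the paper exploits.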


2015 ◽  
Vol 781 ◽  
pp. 523-526 ◽  
Author(s):  
Wassanun Sangjun ◽  
Supawat Supakwong ◽  
Suttipong Thajchayapong

This paper proposes a financial time-series prediction method consisting of the à trous wavelet transform and polynomial regression. The main purpose of employing the à trous wavelet transform is to decompose financial time-series signals into different resolutions, where only relevant signal components are used for prediction. The à trous wavelet transform is also used to avoid the edge problem, since only the past and present components of the time-series signal are taken into account. The decomposed time-series signals are then fed into the polynomial regression stage to obtain predicted time-series signals. Using real-world data, a performance evaluation is conducted based on total benefit and profit/loss, and it is shown that the à trous wavelet transform contributes a significant performance improvement.
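A minimal causal à trous decomposition can be sketched as follows; the half-sample (redundant Haar) low-pass filter is an illustrative choice which, like the method described above, uses only past and present samples and so avoids leaking future information into the coefficients:

```python
import numpy as np

def causal_atrous_haar(x, levels=3):
    """Redundant (a trous) Haar decomposition using only past and present
    samples; the filter holes widen by a factor of 2 at each level."""
    x = np.asarray(x, dtype=float)
    smooth, details = x.copy(), []
    for j in range(levels):
        step = 2 ** j
        # Shift by 2^j samples into the past, padding the start by repetition.
        shifted = np.concatenate([np.full(step, smooth[0]), smooth[:-step]])
        new_smooth = 0.5 * (smooth + shifted)   # causal low-pass
        details.append(smooth - new_smooth)     # detail (wavelet) coefficients
        smooth = new_smooth
    return smooth, details

# The decomposition is redundant: the final smooth plus all detail layers
# reconstructs the original signal exactly.
sig = (np.sin(np.linspace(0, 6 * np.pi, 256))
       + 0.05 * np.random.default_rng(2).normal(size=256))
smooth, details = causal_atrous_haar(sig, levels=3)
recon = smooth + sum(details)
```

In the paper's pipeline, each resolution level would then be fed separately into a polynomial regression before recombining the per-level predictions.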


Author(s):  
Muhammad Faheem Mushtaq ◽  
Urooj Akram ◽  
Muhammad Aamir ◽  
Haseeb Ali ◽  
Muhammad Zulqarnain

Predicting time series is important because many prediction problems, such as health prediction, climate change prediction, and weather prediction, include a time component. To solve the time series prediction problem, various techniques have been developed over many years to enhance forecasting accuracy. This paper presents a review of the prediction of physical time series applications using neural network models. Neural networks (NNs) have emerged as an effective tool for time series forecasting. Moreover, to address problems related to time series data, there is a need for a network with a single layer of trainable weights, the Higher Order Neural Network (HONN), which can perform nonlinear input-output mapping. Developers have therefore focused on the HONN, which has recently been considered as a way to broaden input representation spaces. The HONN model has a functional mapping ability, demonstrated on several time series problems, and shows clear benefits compared to conventional artificial neural networks (ANNs). The goal of this research is to make the reader aware of HONNs for physical time series prediction and to highlight some benefits and challenges of using them.
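The appeal of a single trainable layer over higher-order (product) terms can be illustrated with a toy target that is purely multiplicative, which a linear layer on the raw inputs cannot capture. The degree-2 expansion below is one common HONN-style construction, assumed here for illustration rather than taken from any specific model in the review:

```python
import numpy as np
from itertools import combinations_with_replacement

def second_order_features(X):
    """Append all degree-2 monomials of the inputs (the 'higher-order' terms)."""
    cols = [X] + [(X[:, i] * X[:, j])[:, None]
                  for i, j in combinations_with_replacement(range(X.shape[1]), 2)]
    return np.hstack(cols)

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(500, 2))
y = X[:, 0] * X[:, 1]                 # purely multiplicative target

# Single trainable layer in both cases, solved in closed form.
Phi = second_order_features(X)
w_hi, *_ = np.linalg.lstsq(Phi, y, rcond=None)
w_lin, *_ = np.linalg.lstsq(X, y, rcond=None)

rmse_hi = np.sqrt(np.mean((Phi @ w_hi - y) ** 2))
rmse_lin = np.sqrt(np.mean((X @ w_lin - y) ** 2))
```

The higher-order features represent the multiplicative target exactly, while the linear model on raw inputs cannot; this is the nonlinearity-mapping advantage the review attributes to HONNs.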


2019 ◽  
Vol 15 (2) ◽  
pp. 647-659 ◽  
Author(s):  
Zahra Moeini Najafabadi ◽  
Mehdi Bijari ◽  
Mehdi Khashei

Purpose This study aims to make investment decisions in stock markets using forecasting-based Markowitz decision-making approaches. Design/methodology/approach The authors' approach uses time series prediction methods, including autoregressive, autoregressive moving average and artificial neural network models, rather than calculating the expected rate of return from the return distribution. Findings The results show that using time series prediction methods has a significant effect on improving investment decisions and the performance of the investments. Originality/value In this study, in contrast to previous studies, the alteration of the Markowitz model starts with the investment's expected rate of return. For this purpose, instead of considering the distribution of returns to determine the expected returns, time series prediction methods were used to calculate the future return of each asset. Then, the results of the different time series methods replaced the expected returns in the Markowitz model. Finally, the overall performance of the method, as well as the performance of each prediction method used, was examined against nine stock market indices.
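The substitution described above, forecasts in place of distribution-based expected returns, can be sketched on synthetic data. The AR(1) predictor and the unconstrained weights w ∝ Σ⁻¹μ below are illustrative simplifications of the models and the Markowitz formulation actually used in the study:

```python
import numpy as np

rng = np.random.default_rng(4)
T, n_assets = 250, 4
returns = 0.001 + 0.02 * rng.normal(size=(T, n_assets))   # synthetic daily returns

def ar1_forecast(r):
    """One-step AR(1) forecast fitted by least squares (stand-in predictor)."""
    x, y = r[:-1], r[1:]
    xc, yc = x - x.mean(), y - y.mean()
    phi = np.dot(xc, yc) / np.dot(xc, xc)
    c = y.mean() - phi * x.mean()
    return c + phi * r[-1]

# Forecast-based expected returns replace the historical-distribution mean.
mu = np.array([ar1_forecast(returns[:, i]) for i in range(n_assets)])
Sigma = np.cov(returns, rowvar=False)

# Unconstrained mean-variance weights, normalized to a fully invested portfolio.
w = np.linalg.solve(Sigma, mu)
w /= w.sum()
```

Swapping `ar1_forecast` for an ARMA or neural network predictor changes only `mu`, which is exactly the alteration the study makes to the Markowitz model.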


Energies ◽  
2020 ◽  
Vol 14 (1) ◽  
pp. 141
Author(s):  
Jacob Hale ◽  
Suzanna Long

Energy portfolios are overwhelmingly dependent on fossil fuel resources that perpetuate the consequences associated with climate change. Therefore, it is imperative to transition to more renewable alternatives to limit further harm to the environment. This study presents a univariate time series prediction model that evaluates sustainability outcomes of partial energy transitions. Future electricity generation at the state-level is predicted using exponential smoothing and autoregressive integrated moving average (ARIMA). The best prediction results are then used as an input for a sustainability assessment of a proposed transition by calculating carbon, water, land, and cost footprints. Missouri, USA was selected as a model testbed due to its dependence on coal. Of the time series methods, ARIMA exhibited the best performance and was used to predict annual electricity generation over a 10-year period. The proposed transition consisted of a one-percent annual decrease of coal’s portfolio share to be replaced with an equal share of solar and wind supply. The sustainability outcomes of the transition demonstrate decreases in carbon and water footprints but increases in land and cost footprints. Decision makers can use the results presented here to better inform strategic provisioning of critical resources in the context of proposed energy transitions.
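One of the two forecasting methods compared, exponential smoothing, reduces in its simplest (level-only) form to the recursion below. The generation figures are made-up stand-ins, and a real analysis would use a full ARIMA/ETS implementation such as the one in statsmodels:

```python
import numpy as np

def ses_forecast(y, alpha=0.3, horizon=10):
    """Simple (level-only) exponential smoothing; each observation pulls the
    level toward it by a fraction alpha, and the h-step-ahead forecast is
    flat at the final smoothed level."""
    level = y[0]
    for obs in y[1:]:
        level = alpha * obs + (1 - alpha) * level
    return np.full(horizon, level)

# Toy annual electricity-generation series (illustrative numbers only).
gen = np.array([82.0, 81.5, 80.9, 80.2, 79.8, 79.1, 78.6, 78.0])
fc = ses_forecast(gen, alpha=0.5, horizon=10)
```

A flat forecast is why exponential smoothing can underperform ARIMA on trending series like the coal-dependent generation data in the study, where ARIMA was ultimately selected.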


2021 ◽  
Vol 181 ◽  
pp. 973-980
Author(s):  
Leonardo Sestrem de Oliveira ◽  
Sarah Beatriz Gruetzmacher ◽  
João Paulo Teixeira

2021 ◽  
pp. 1-13
Author(s):  
Qingtian Zeng ◽  
Xishi Zhao ◽  
Xiaohui Hu ◽  
Hua Duan ◽  
Zhongying Zhao ◽  
...  

Word embeddings have been successfully applied in many natural language processing tasks due to their effectiveness. However, the state-of-the-art algorithms for learning word representations from large amounts of text documents ignore emotional information, which is a significant research problem that must be addressed. To solve this problem, we propose an emotional word embedding (EWE) model for sentiment analysis in this paper. The method first applies pre-trained word vectors to represent document features using two different linear weighting methods. Then, the resulting document vectors are input to a classification model and used to train a neural-network-based text sentiment classifier. In this way, the emotional polarity of the text is propagated into the word vectors. The experimental results on three kinds of real-world data sets demonstrate that the proposed EWE model achieves superior performance on text sentiment prediction, text similarity calculation, and word emotional expression tasks compared to other state-of-the-art models.
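The linear-weighting step that turns pre-trained word vectors into document features can be sketched as follows; the tiny vocabulary, random embedding table, and weighting scheme are illustrative stand-ins for real pre-trained vectors such as GloVe, not the paper's two specific weighting methods:

```python
import numpy as np

rng = np.random.default_rng(5)
vocab = {"great": 0, "movie": 1, "terrible": 2, "plot": 3}
E = rng.normal(size=(len(vocab), 8))     # stand-in for pre-trained embeddings

def doc_vector(tokens, weights=None):
    """Document vector as a linear combination of its word vectors:
    uniform averaging, or per-word weights normalized to sum to one."""
    idx = [vocab[t] for t in tokens if t in vocab]
    vecs = E[idx]
    if weights is None:
        return vecs.mean(axis=0)
    w = np.asarray([weights[t] for t in tokens if t in vocab], dtype=float)
    return (w[:, None] * vecs).sum(axis=0) / w.sum()

d = doc_vector(["great", "movie"])       # uniform linear weighting
```

These document vectors would then feed the sentiment classifier, whose training gradient is what propagates emotional polarity back into the word representations.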

