Feedforward versus recurrent neural networks for forecasting monthly Japanese yen exchange rates

1996 ◽  
Vol 3 (1) ◽  
pp. 59-75 ◽  
Author(s):  
Giovani Dematos ◽  
Milton S. Boyd ◽  
Bahman Kermanshahi ◽  
Nowrouz Kohzadi ◽  
Iebeling Kaastra

2005 ◽  
Vol 01 (01) ◽  
pp. 79-107 ◽  
Author(s):  
MAK KABOUDAN

Applying genetic programming and artificial neural networks to raw as well as wavelet-transformed exchange rate data showed that genetic programming may have good extended forecasting abilities. Although it is well known that exchange rate predictions produced by many alternative techniques rarely deliver better forecasts than the random walk model, in this paper employing natural computational strategies to forecast three different exchange rates produced two extended forecasts (going beyond one step ahead) that beat naïve random walk predictions. Sixteen-step-ahead forecasts obtained using genetic programming outperformed the one- and sixteen-step-ahead random walk predictions of the US dollar/Taiwan dollar exchange rate. Further, sixteen-step-ahead genetic programming forecasts of the wavelet-transformed US dollar/Japanese yen exchange rate also outperformed the corresponding sixteen-step-ahead random walk predictions. However, random walk predictions of the US dollar/British pound exchange rate outperformed all forecasts obtained using genetic programming, and random walk predictions of all three exchange rates, on both raw and wavelet-transformed data, outperformed all forecasts obtained using artificial neural networks.
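The random walk benchmark used above is simple to state: the k-step-ahead forecast made at any origin is just the last observed rate. A minimal sketch of computing such a baseline and its root-mean-square error, using hypothetical rates rather than the paper's data:

```python
import numpy as np

def random_walk_forecast(series, horizon):
    """Naive random-walk forecast: the k-step-ahead prediction made at
    each origin t is simply the last observed value y[t]."""
    series = np.asarray(series, dtype=float)
    # Forecast origins run from 0 to len(series)-horizon-1,
    # each predicting the value at t + horizon.
    return series[:-horizon]

def rmse(actual, predicted):
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.sqrt(np.mean((actual - predicted) ** 2)))

# Hypothetical exchange-rate series (illustrative values only)
rates = [1.00, 1.02, 1.01, 1.03, 1.05, 1.04, 1.06, 1.08]
horizon = 2
preds = random_walk_forecast(rates, horizon)
actuals = rates[horizon:]
error = rmse(actuals, preds)
```

A model's k-step-ahead forecasts "beat the random walk" when their RMSE over the same origins is lower than `error` computed this way.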



2002 ◽  
pp. 189-204
Author(s):  
Jing Tao Yao ◽  
Chew Lim Tan

This chapter describes the application of neural networks to forecasting the exchange rates between the American dollar and five other major currencies: the Japanese yen, Deutsche mark, British pound, Swiss franc and Australian dollar. Technical indicators and time series data are fed to neural networks to mine, or discover, the underlying “rules” of movements in currency exchange rates. The results presented in this chapter show that, without the use of extensive market data or knowledge, useful predictions can be made and significant paper profit can be achieved on out-of-sample data with simple technical indicators. The neural-network-based forecasts are also shown to compare favorably with those of the traditional statistical approach.
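The abstract does not name the specific indicators used; a moving average and a momentum term are typical simple technical indicators one might feed to such a network. A sketch of building a feature matrix from a price series (the indicator choices and window length are assumptions for illustration):

```python
import numpy as np

def technical_features(prices, window=3):
    """Build two simple technical indicators as candidate network inputs:
    a simple moving average and a momentum term (price change over the
    window). Hypothetical feature set, not the chapter's exact inputs."""
    prices = np.asarray(prices, dtype=float)
    # Moving average over each full window (length: len(prices)-window+1)
    ma = np.convolve(prices, np.ones(window) / window, mode="valid")
    # Momentum: change in price across the same window
    momentum = prices[window - 1:] - prices[:len(prices) - window + 1]
    return np.column_stack([ma, momentum])

# Hypothetical daily closing rates
prices = [1.0, 1.1, 1.2, 1.15, 1.25]
X = technical_features(prices)  # one row of features per forecast origin
```

Each row of `X` would then be paired with the next period's rate (or its direction) as the training target.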



1995 ◽  
Vol 10 (4) ◽  
pp. 347-364 ◽  
Author(s):  
Chung-Ming Kuan ◽  
Tung Liu


Author(s):  
Leong-Kwan Li ◽  
Wan-Kai Pang ◽  
Wing-Tong Yu ◽  
Marvin D. Troutt

Movements in foreign exchange rates are the result of collective human decisions, which in turn arise from the dynamics of neurons. In this chapter, we demonstrate how to model such market behavior with recurrent neural networks (RNNs). The RNN approach can help forecast the short-term trend of foreign exchange rates, and the application of forecasting techniques in the foreign exchange markets has become an important task in financial strategy. Our empirical results show that a discrete-time RNN outperforms traditional methods in forecasting short-term foreign exchange rates.
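The abstract does not detail the architecture, but a discrete-time RNN of the kind described is often an Elman-style network with state update h[t] = tanh(W_in x[t] + W_rec h[t-1]) and output y[t] = W_out h[t]. A minimal forward-pass sketch with random, untrained weights (all shapes and values are illustrative, not the chapter's model):

```python
import numpy as np

def elman_forward(x_seq, W_in, W_rec, W_out, h0=None):
    """Forward pass of a discrete-time (Elman-style) RNN:
        h[t] = tanh(W_in x[t] + W_rec h[t-1]);  y[t] = W_out h[t].
    Weights are fixed here for illustration; in practice they would be
    trained, e.g. by backpropagation through time."""
    h = np.zeros(W_rec.shape[0]) if h0 is None else h0
    outputs = []
    for x in x_seq:
        h = np.tanh(W_in @ np.atleast_1d(x) + W_rec @ h)
        outputs.append((W_out @ h).item())
    return outputs

# Hypothetical setup: 1 input (e.g. a rate return), 4 hidden units, 1 output
rng = np.random.default_rng(0)
W_in = 0.5 * rng.normal(size=(4, 1))
W_rec = 0.5 * rng.normal(size=(4, 4))
W_out = 0.5 * rng.normal(size=(1, 4))
preds = elman_forward([0.01, -0.02, 0.015], W_in, W_rec, W_out)
```

The recurrent state `h` is what lets the network carry information across time steps, which is the property the chapter exploits for short-term trend forecasting.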









2020 ◽  
Author(s):  
Dean Sumner ◽  
Jiazhen He ◽  
Amol Thakkar ◽  
Ola Engkvist ◽  
Esben Jannik Bjerrum

SMILES randomization, a form of data augmentation, has previously been shown to increase the performance of deep learning models over non-augmented baselines. Here, we propose a novel data augmentation method we call “Levenshtein augmentation”, which considers local SMILES sub-sequence similarity between reactants and their respective products when creating training pairs. The performance of Levenshtein augmentation was tested using two state-of-the-art models: a transformer and a sequence-to-sequence recurrent neural network with attention. Levenshtein augmentation demonstrated increased performance over both non-augmented data and conventional SMILES randomization when used for training the baseline models. Furthermore, Levenshtein augmentation seemingly results in what we define as attentional gain: an enhancement in the pattern recognition capabilities of the underlying network with respect to molecular motifs.
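The abstract does not spell out the pairing procedure, but Levenshtein augmentation, as the name suggests, rests on edit distance between reactant and product SMILES strings. A sketch of the classic dynamic-programming edit distance applied to a hypothetical reactant/product pair:

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance: minimum number of
    single-character insertions, deletions, and substitutions turning
    string a into string b. The augmentation method described above
    builds on sub-sequence similarity of this kind between SMILES."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,          # delete ca
                            curr[j - 1] + 1,      # insert cb
                            prev[j - 1] + (ca != cb)))  # substitute
        prev = curr
    return prev[-1]

# Hypothetical pair: ethanol ("CCO") oxidized to acetic acid ("CC(=O)O")
d = levenshtein("CCO", "CC(=O)O")
```

Pairs with high local similarity (small distance relative to length) are the ones the method favors when constructing training examples.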


