Parallel architecture of CNN‐bidirectional LSTMs for implied volatility forecast

2021 ◽  
Author(s):  
Ji‐Eun Choi ◽  
Dong Wan Shin
2017 ◽  
Vol 5 (9) ◽  
pp. 41
Author(s):  
Guillermo Benavides

There has been substantial research effort aimed at forecasting futures price return volatilities of financial assets. A significant part of the literature shows that accurate volatility forecasts are difficult to obtain regardless of the forecasting model applied. This paper examines the forecast accuracy of several volatility models for Mexican peso-USD exchange rate futures returns. The models applied here are a univariate GARCH, a multivariate ARCH (the BEKK model), two option implied volatility models and a composite forecast model; the composite model combines time-series (historical) and option implied volatility forecasts. Unlike other works in the literature, this paper analyzes the option implied volatility calculations more rigorously. The results show that the option implied models are more accurate than the historical models, and that the composite forecast model was the most accurate of the alternatives, with the lowest mean squared errors. However, the results should be taken with caution given that the coefficient of determination in the evaluation regressions was relatively low. According to these findings, a composite forecast model is recommended whenever both types of data, time-series (historical) and option implied, are available.
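The abstract does not spell out how the composite forecast is built; a common approach, and one plausible reading of the "regressions" it mentions, is an encompassing regression of realized volatility on the historical (GARCH-type) forecast and the option implied volatility. The sketch below illustrates that idea in Python on simulated placeholder series; the variable names (garch_fcst, implied_vol, realized_vol) and the data are illustrative assumptions, not the paper's actual specification or data.

```python
# Minimal sketch of a composite (encompassing-regression) volatility forecast,
# assuming the composite is a fitted linear combination of a historical
# GARCH-type forecast and an option implied volatility. All inputs are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Placeholder inputs: one-step-ahead historical (GARCH-style) forecasts,
# option implied volatilities, and the realized volatility they try to predict.
garch_fcst = 0.10 + 0.02 * rng.standard_normal(n)
implied_vol = garch_fcst + 0.01 * rng.standard_normal(n)
realized_vol = 0.02 + 0.8 * implied_vol + 0.015 * rng.standard_normal(n)

# Composite forecast: regress realized volatility on both candidate forecasts.
X = sm.add_constant(np.column_stack([garch_fcst, implied_vol]))
fit = sm.OLS(realized_vol, X).fit()
composite_fcst = fit.fittedvalues

# Compare forecasts by mean squared error, mirroring the abstract's evaluation.
def mse(forecast, target):
    return float(np.mean((forecast - target) ** 2))

print("R^2 of combining regression:", round(fit.rsquared, 3))
print("MSE GARCH only   :", mse(garch_fcst, realized_vol))
print("MSE implied only :", mse(implied_vol, realized_vol))
print("MSE composite    :", mse(composite_fcst, realized_vol))
```

Under this setup, a composite forecast with the lowest MSE alongside a modest regression R² would reproduce, qualitatively, the pattern the abstract reports: combining both information sources helps, but the overall explanatory power remains limited.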


2021 ◽  
Vol 71 ◽  
pp. 943-954
Author(s):  
Dehong Liu ◽  
Yucong Liang ◽  
Lili Zhang ◽  
Peter Lung ◽  
Rizwan Ullah

MRS Bulletin ◽  
1997 ◽  
Vol 22 (10) ◽  
pp. 5-6
Author(s):  
Horst D. Simon

Recent events in the high-performance computing industry have raised concern among scientists and the general public about a crisis, or a lack of leadership, in the field. That concern is understandable considering the industry's history from 1993 to 1996. Cray Research, the historic leader in supercomputing technology, was unable to survive financially as an independent company and was acquired by Silicon Graphics. Two ambitious new companies that introduced new technologies in the late 1980s and early 1990s, Thinking Machines and Kendall Square Research, were commercial failures and went out of business. And Intel, which introduced its Paragon supercomputer in 1994, discontinued production only two years later.

During the same time frame, scientists who had finished the laborious task of writing scientific codes to run on vector parallel supercomputers learned that those codes would have to be rewritten if they were to run on the next-generation, highly parallel architecture. Scientists who are not yet involved in high-performance computing are understandably hesitant to commit their time and energy to such an apparently unstable enterprise.

However, beneath the commercial chaos of the last several years, a technological revolution has been occurring. The good news is that the revolution is over, leading to five to ten years of predictable stability, steady improvements in system performance, and increased productivity for scientific applications. It is time for scientists who were sitting on the fence to jump in and reap the benefits of the new technology.


1991 ◽  
Author(s):  
Eric A. Brewer ◽  
Chrysanthos N. Dellarocas ◽  
Adrian Colbrook ◽  
William E. Weihl
