A spectral analysis based heteroscedastic model for the estimation of value at risk

2018 ◽  
Vol 19 (3) ◽  
pp. 295-314
Author(s):  
Yang Zhao

Purpose – This paper aims to provide a model that better captures the time-varying volatility seen in many financial time series and to obtain reliable estimates of value at risk (VaR).
Design/methodology/approach – Standard spectral-analysis techniques are used to obtain samples of the conditional mean, conditional variance and residual term. A generalized regression neural network is used to establish a time-varying non-linear model, and non-parametric kernel density estimation is applied to estimate VaR.
Findings – The proposed model is able to follow the heteroscedastic behaviour that is common in financial time series, and the estimated VaR is satisfactory.
Practical implications – The method allows hedgers, bankers, financial analysts and economists to draw better inferences from financial time series, and it yields relatively more precise VaR estimates for a given financial asset. The model and its derived estimates can also help in developing other models.
Originality/value – To date, studies that model financial time series from the viewpoint of spectral analysis are rare. The proposed model, together with a comprehensive empirical study showing favourable results for VaR estimation, enriches the existing research.
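
A minimal sketch, in Python, of the final step described above: estimating VaR from a sample of residuals with a non-parametric Gaussian kernel density. This is not the paper's code; the residuals are simulated here, whereas in the paper they come from the spectral-analysis/GRNN decomposition, and the kernel and bandwidth rule are assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_var(returns, alpha=0.05, grid_size=10_000):
    """Value at risk at level alpha from a kernel density estimate.

    VaR is reported as a positive loss: the alpha-quantile of the
    return distribution, sign-flipped.
    """
    kde = gaussian_kde(returns)                       # bandwidth via Scott's rule
    lo = returns.min() - 3 * returns.std()
    hi = returns.max() + 3 * returns.std()
    grid = np.linspace(lo, hi, grid_size)
    cdf = np.cumsum(kde(grid))
    cdf /= cdf[-1]                                    # normalise the numerical CDF
    return -grid[np.searchsorted(cdf, alpha)]         # invert the CDF at alpha

# heavy-tailed toy returns standing in for the model's residual term
rng = np.random.default_rng(0)
simulated_residuals = rng.standard_t(df=5, size=2_000) * 0.01
print(f"95% one-day VaR: {kde_var(simulated_residuals, alpha=0.05):.4f}")
```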

2021 ◽  
Vol 14 (4) ◽  
pp. 702-713
Author(s):  
N. Prabakaran ◽  
Rajasekaran Palaniappan ◽  
R. Kannadasan ◽  
Satya Vinay Dudi ◽  
V. Sasidhar

Purpose – We propose a machine learning (ML) approach that is trained on available financial data, learns the trends in those data and then uses the acquired knowledge to forecast financial series more accurately. The approach provides more precise results than older financial-series forecasting algorithms. The LSTM Classic is used to forecast the momentum of a financial series index and is also applied to its commodities. The network is trained and evaluated for accuracy on data sets of various sizes, i.e. weekly historical data of MCX, GOLD and COPPER, and the results are reported.
Design/methodology/approach – An LSTM model suitable for script-price forecasting is designed from the perspective of minimizing the mean squared error (MSE). The approach is as follows (a code sketch of steps 3-5 appears after this abstract): (1) acquire the data set; (2) define the training and testing columns in the data set; (3) transform the input values using a scaler; (4) define the custom loss function; (5) build and compile the model; (6) visualise the improvements in the results.
Findings – Financial trading is one of the oldest practices, in which traders buy and sell financial scripts and earn wealth from companies that sell a part of their business on a trading platform. Forecasting financial script prices is a complex task that involves extensive human–computer interaction. Because financial series prices are correlated over time, conventional batch-processing methods such as artificial neural networks and convolutional neural networks cannot be used efficiently for financial market analysis. We propose an online learning algorithm that uses an upgraded variant of recurrent neural networks, the long short-term memory Classic (LSTM Classic). The LSTM Classic differs from a standard LSTM in that it uses a customised loss function. Its internal storage-unit structure avoids problems with long-term dependence, which helps in forecasting financial time series. A financial series index is a combination of several commodities (time series), which makes the index more reliable than an individual financial time series: its value does not change drastically even when some of its commodities are affected. This work provides more precise results than older financial-series forecasting algorithms.
Originality/value – We built the customised-loss-function model on the LSTM scheme and experimented on the MCX index as well as on its commodities; the improvement in results is calculated for every epoch run over all rows of the data set, and the improvement in loss can be visualised for each epoch. A further refinement would recognise that the relationship between price difference and directional loss is specific to each financial script; deeper evaluation could identify the best combination for a particular stock to obtain better results.
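
A minimal sketch of steps (3)-(5) using TensorFlow/Keras on a simulated weekly series. The directional-penalty term and its weight in the custom loss are illustrative assumptions and do not reproduce the exact LSTM Classic loss; the data windowing is likewise simplified.

```python
import numpy as np
import tensorflow as tf
from sklearn.preprocessing import StandardScaler

def directional_mse(penalty=0.5):
    """MSE scaled up by the share of predictions with the wrong sign (assumed form)."""
    def loss(y_true, y_pred):
        mse = tf.reduce_mean(tf.square(y_true - y_pred))
        wrong_sign = tf.cast(tf.not_equal(tf.sign(y_true), tf.sign(y_pred)), tf.float32)
        return mse * (1.0 + penalty * tf.reduce_mean(wrong_sign))
    return loss

# toy weekly prices -> weekly changes, so the sign carries the direction of the move
rng = np.random.default_rng(0)
prices = np.cumsum(rng.standard_normal(500)).astype("float32")
changes = np.diff(prices).reshape(-1, 1)
scaled = StandardScaler().fit_transform(changes)          # step (3): scale the inputs

# windows of 10 past weeks predict the next week's (scaled) change
window = 10
X = np.stack([scaled[i:i + window] for i in range(len(scaled) - window)])
y = scaled[window:]

# steps (4)-(5): build and compile a single-layer LSTM with the custom loss
model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss=directional_mse(penalty=0.5))
history = model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(history.history["loss"])                            # step (6): loss per epoch
```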


2016 ◽  
Vol 76 (1) ◽  
pp. 76-93 ◽  
Author(s):  
Thomas W. Sproul

Purpose – Turvey (2007, Physica A) introduced a scaled variance ratio procedure for testing the random walk hypothesis (RWH) for financial time series by estimating Hurst coefficients for a fractional Brownian motion model of asset prices. The purpose of this paper is to extend his work by making the estimation procedure robust to heteroskedasticity and by addressing the multiple hypothesis testing problem.
Design/methodology/approach – Unbiased, heteroskedasticity-consistent variance ratio estimates are calculated for end-of-day price data at eight time lags over 12 agricultural commodity futures (front month) and 40 US equities from 2000 to 2014. A bootstrapped stepdown procedure is used to obtain appropriate statistical confidence for the multiplicity of hypothesis tests. The variance ratio approach is compared against regression-based testing for fractionality.
Findings – Failing to account for bias, heteroskedasticity and multiplicity of testing can lead to large numbers of erroneous rejections of the null hypothesis of efficient markets following an independent random walk. Even with these adjustments, a few futures contracts significantly violate independence for short lags at the 99 percent level, and a number of equities/lags violate independence at the 95 percent level. When testing at the asset level, futures prices are found not to contain fractional properties, while some equities do.
Research limitations/implications – Only a subsample of futures and equities, and only a limited number of lags, are evaluated. It is possible that multiplicity adjustments for larger numbers of tests would result in fewer rejections of independence.
Originality/value – This paper provides empirical evidence that violations of the RWH for financial time series are likely to exist, but are perhaps less common than previously thought.
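
A generic Lo and MacKinlay (1988) style heteroskedasticity-consistent variance ratio statistic, sketched in Python for illustration. The paper's estimator additionally applies a small-sample bias correction and a bootstrapped stepdown adjustment for the multiplicity of tests, neither of which is reproduced here.

```python
import numpy as np

def variance_ratio(log_prices, q):
    """Heteroskedasticity-consistent variance ratio test at lag q.

    Returns (VR(q), z*). Under the random walk hypothesis VR(q) = 1 and
    z* is asymptotically standard normal.
    """
    r = np.diff(log_prices)
    T = r.size
    mu = r.mean()
    dsq = (r - mu) ** 2
    var_1 = dsq.sum() / T

    # variance of overlapping q-period returns (simple, uncorrected estimator)
    rq = np.convolve(r, np.ones(q), mode="valid")
    var_q = np.sum((rq - q * mu) ** 2) / (T * q)
    vr = var_q / var_1

    # heteroskedasticity-robust asymptotic variance of VR(q) - 1
    theta = 0.0
    for j in range(1, q):
        delta_j = np.sum(dsq[j:] * dsq[:-j]) / dsq.sum() ** 2
        theta += (2.0 * (q - j) / q) ** 2 * delta_j
    return vr, (vr - 1.0) / np.sqrt(theta)

# simulated random walk: the test should (mostly) fail to reject
rng = np.random.default_rng(1)
log_prices = np.cumsum(rng.normal(0.0, 0.01, 2_500))
for lag in (2, 4, 8, 16):
    vr, z = variance_ratio(log_prices, lag)
    print(f"q={lag:2d}  VR={vr:.3f}  z*={z:+.2f}")
```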


Author(s):  
Philip L.H. Yu ◽  
Edmond H.C. Wu ◽  
W.K. Li

As a data mining technique, independent component analysis (ICA) is used to separate mixed data signals into statistically independent sources. In this chapter, we apply ICA to model the multivariate volatility of financial asset returns, a useful tool in portfolio selection and risk management. In the finance literature, the generalized autoregressive conditional heteroscedasticity (GARCH) model and its variants, such as the EGARCH and GJR-GARCH models, have become popular standard tools for modeling the volatility processes of financial time series. Although univariate GARCH models are successful in modeling the volatility of individual financial time series, modeling multivariate time series has always been challenging. Recently, Wu, Yu, & Li (2006) suggested using ICA to decompose multivariate time series into statistically independent components and then modeling each independent component separately with a univariate GARCH model. In this chapter, we extend this class of ICA-GARCH models to allow more flexible univariate GARCH-type models. We also apply the proposed models to compute the value-at-risk (VaR) for risk management applications. Backtesting and out-of-sample tests suggest that the ICA-GARCH models have a clear-cut advantage over some other approaches in value-at-risk estimation.
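
A hedged sketch of the ICA-GARCH idea on simulated returns: scikit-learn's FastICA separates the return panel into independent components, each component is fitted with a univariate GARCH(1,1) from the `arch` package, and the forecast component variances are mapped back through the mixing matrix to a conditional covariance used for a simple Gaussian portfolio VaR. The chapter's more flexible GARCH-type variants and its backtesting procedure are not shown.

```python
import numpy as np
from scipy.stats import norm
from sklearn.decomposition import FastICA
from arch import arch_model

rng = np.random.default_rng(2)
returns = rng.standard_t(df=6, size=(1_000, 3)) * 0.01     # toy three-asset return panel

ica = FastICA(n_components=3, random_state=0)
sources = ica.fit_transform(returns)                       # independent components S
A = ica.mixing_                                            # returns ~ S @ A.T + mean

# fit a univariate GARCH(1,1) to each independent component and forecast its variance
next_var = []
for k in range(sources.shape[1]):
    res = arch_model(100 * sources[:, k], p=1, q=1).fit(disp="off")
    forecast = res.forecast(horizon=1)
    next_var.append(forecast.variance.values[-1, 0] / 100 ** 2)   # undo the scaling

# conditional covariance of returns implied by the component variance forecasts
cov = A @ np.diag(next_var) @ A.T

weights = np.full(3, 1 / 3)                                # equally weighted portfolio
sigma_p = np.sqrt(weights @ cov @ weights)
var_95 = -norm.ppf(0.05) * sigma_p                         # one-day 95% Gaussian VaR
print(f"1-day 95% portfolio VaR: {var_95:.4%}")
```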


2018 ◽  
Vol 13 (5) ◽  
pp. 881-894
Author(s):  
Yanfeng Sun ◽  
Minglei Zhang ◽  
Si Chen ◽  
Xiaohu Shi

Inspired by embedding representations in Natural Language Processing (NLP), we develop a financial embedded vector representation model to abstract the temporal characteristics of financial time series. The original financial features are first discretized, and each set of discretized features is then treated as a "word" in the NLP sense, while the whole financial time series corresponds to a "sentence" or "paragraph". The embedded vector models of NLP can therefore be applied to financial time series. To test the proposed model, we use an RBF neural network as the regression model to predict the financial series, comparing the financial embedding vectors as input against the original features. Numerical results show that the prediction accuracy on the test data improves by about 4-6 orders of magnitude, indicating that the financial embedded vector has strong generalization ability.
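
A minimal sketch of the discretise-then-embed step on simulated features, using gensim's word2vec as the embedding model. The bin scheme, "sentence" length and vector size are illustrative assumptions, and the downstream RBF regression stage is omitted.

```python
import numpy as np
from gensim.models import Word2Vec

rng = np.random.default_rng(3)
# toy daily features: return and (absolute) range over 1,000 days
features = np.column_stack([rng.normal(0, 0.01, 1_000),
                            np.abs(rng.normal(0, 0.01, 1_000))])

def discretize(column, bins=8):
    """Map a real-valued feature to integer bin labels via equal-frequency bins."""
    edges = np.quantile(column, np.linspace(0, 1, bins + 1)[1:-1])
    return np.digitize(column, edges)

# one "word" per day: the joined bin labels of that day's features
labels = np.column_stack([discretize(features[:, j]) for j in range(features.shape[1])])
words = [f"r{a}_v{b}" for a, b in labels]

# one "sentence" per block of 30 trading days
window = 30
sentences = [words[i:i + window] for i in range(0, len(words), window)]

# learn the financial embedding vectors with skip-gram word2vec
model = Word2Vec(sentences, vector_size=16, window=5, min_count=1, sg=1, epochs=20)
embedded_series = np.array([model.wv[w] for w in words])   # input for the regressor
print(embedded_series.shape)                               # (1000, 16)
```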

