EFFICIENT ESTIMATION OF INTEGRATED VOLATILITY FUNCTIONALS UNDER GENERAL VOLATILITY DYNAMICS

2020 ◽  
pp. 1-44
Author(s):  
Jia Li ◽  
Yunxiao Liu

Abstract We provide an asymptotic theory for the estimation of a general class of smooth nonlinear integrated volatility functionals. Such functionals are broadly useful for measuring financial risk and estimating economic models using high-frequency transaction data. The theory is valid under general volatility dynamics, which accommodates both Itô semimartingales (e.g., jump-diffusions) and long-memory processes (e.g., fractional Brownian motions). We establish the semiparametric efficiency bound under a nonstandard nonergodic setting with infill asymptotics, and show that the proposed estimator attains this efficiency bound. These results on efficient estimation are further extended to a setting with irregularly sampled data.
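As a rough illustration of the object being estimated, the sketch below (Python, hypothetical names) implements the standard two-step plug-in recipe for the integral of g(c_s) over the sample from regularly spaced high-frequency returns: estimate the spot variance on local blocks, then take a Riemann sum of g. It omits the higher-order bias corrections and the irregular-sampling extensions that the paper's efficient estimator involves.

```python
# Minimal, hypothetical sketch of a plug-in estimator for an integrated
# volatility functional  (integral of g(spot variance) over the sample)
# from high-frequency returns: local spot-variance estimation on blocks,
# then a Riemann sum of g. No bias corrections; names are illustrative.
import numpy as np

def integrated_functional(returns, dt, g, block_size=None):
    """Estimate the integral of g(spot variance) over the sample.

    returns    : high-frequency log returns on a regular grid
    dt         : length of one sampling interval
    g          : smooth functional of spot variance, e.g. lambda c: c ** 2
    block_size : returns per local window; roughly sqrt(n) by default
    """
    r = np.asarray(returns, dtype=float)
    n = r.size
    k = block_size or max(int(np.sqrt(n)), 2)
    total = 0.0
    for start in range(0, n - k + 1, k):
        block = r[start:start + k]
        c_hat = np.sum(block ** 2) / (k * dt)   # local (spot) variance estimate
        total += g(c_hat) * k * dt              # Riemann-sum contribution
    return total

# Example: integrated quarticity-type functional g(c) = c**2
# rng = np.random.default_rng(0)
# r = 0.01 * np.sqrt(1 / 23400) * rng.standard_normal(23400)
# est = integrated_functional(r, dt=1 / 23400, g=lambda c: c ** 2)
```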

2016 ◽  
Vol 10 (5) ◽  
pp. 1 ◽  
Author(s):  
Wen Cheong Chin ◽  
Min Cherng Lee ◽  
Tan Pei Pei ◽  
Grace Lee Ching Yap ◽  
Christine Tan Nya Ling

This study explores multipower variation estimates of integrated volatility using high-frequency data from the financial stock market. The different combinations of multipower variation estimators are robust to drastic financial jumps and market microstructure noise. To examine informational market efficiency, we propose a rolling-window estimation procedure for the Hurst parameter using the modified rescaled-range approach. To test the robustness of the method, we select the S&P 500 as the empirical data. The empirical study finds that long-memory cascading volatility fluctuates across the studied period and drops sharply after the subprime mortgage crisis. This time-varying long-memory analysis allows us to understand informational market efficiency before and after the subprime mortgage crisis in the U.S.
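For concreteness, the following simplified Python sketch (not the authors' exact procedure; window lengths and scaling are illustrative) shows the two ingredients described above: a jump-robust realized bipower variation and a rolling-window Hurst exponent from the classical rescaled-range (R/S) statistic.

```python
# Simplified sketch: jump-robust bipower variation and a rolling-window
# Hurst exponent via the classical rescaled-range (R/S) statistic.
# Window sizes and the set of sub-block lengths are illustrative choices.
import numpy as np

def bipower_variation(returns):
    """Realized bipower variation: robust to finite-activity jumps."""
    r = np.abs(np.asarray(returns, dtype=float))
    mu1 = np.sqrt(2.0 / np.pi)                 # E|Z| for a standard normal Z
    return (1.0 / mu1 ** 2) * np.sum(r[1:] * r[:-1])

def hurst_rs(x):
    """Hurst exponent from the rescaled-range statistic on one window."""
    x = np.asarray(x, dtype=float)
    sizes = [s for s in (8, 16, 32, 64, 128) if s <= x.size]
    rs_vals = []
    for s in sizes:
        chunks = x[: (x.size // s) * s].reshape(-1, s)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        span = dev.max(axis=1) - dev.min(axis=1)
        std = chunks.std(axis=1)
        rs_vals.append(np.mean(span[std > 0] / std[std > 0]))
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope

def rolling_hurst(series, window=250):
    """Hurst estimates over a moving window, e.g. of daily realized volatilities."""
    s = np.asarray(series, dtype=float)
    return np.array([hurst_rs(s[i - window:i]) for i in range(window, s.size + 1)])
```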


Author(s):  
Federico Maddanu

Abstract The estimation of the long memory parameter d is a widely discussed issue in the literature. The harmonically weighted (HW) process was recently introduced for long memory time series with an unbounded spectral density at the origin. In contrast to the better-known fractionally integrated process, the HW approach does not require estimation of the d parameter, yet it may be just as able to capture long memory as the fractionally integrated model if the sample size is not too large. Our contribution is a generalization of the HW model, termed the Generalized Harmonically Weighted (GHW) process, which allows for an unbounded spectral density at k ≥ 1 frequencies away from the origin. The convergence in probability of the Whittle estimator is provided for the GHW process, along with a discussion of simulation methods. Fit and forecast performance is evaluated via an empirical application to paleoclimatic data. Our main conclusion is that the above generalization models long memory as well as its classical competitor, the fractionally differenced Gegenbauer process, does. In addition, the GHW process does not require estimation of the memory parameter, which simplifies the issue of disentangling long memory from a (moderately persistent) short memory component. This gives our formulation a clear advantage over the fractional long memory approach.
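As a point of reference for the Whittle estimation mentioned above, the generic Python sketch below (hypothetical names) minimizes the standard Whittle criterion over Fourier frequencies for a user-supplied parametric spectral density; the GHW spectral density itself is not reproduced here and would have to be plugged in by the reader.

```python
# Generic Whittle estimation sketch: minimize
#   sum_j [ log f(lambda_j; theta) + I(lambda_j) / f(lambda_j; theta) ]
# over Fourier frequencies, for a user-supplied parametric spectral density f.
# The GHW density is not reproduced here; spec_density is a placeholder.
import numpy as np
from scipy.optimize import minimize

def periodogram(x):
    """Periodogram I(lambda_j) at the positive Fourier frequencies."""
    x = np.asarray(x, dtype=float)
    n = x.size
    freqs = 2.0 * np.pi * np.arange(1, n // 2 + 1) / n
    dft = np.fft.fft(x - x.mean())[1:n // 2 + 1]
    return freqs, (np.abs(dft) ** 2) / (2.0 * np.pi * n)

def whittle_fit(x, spec_density, theta0):
    """Minimize the Whittle criterion for the spectral density f(freq, theta)."""
    freqs, I = periodogram(x)

    def neg_loglik(theta):
        f = spec_density(freqs, theta)
        if np.any(f <= 0):
            return np.inf
        return np.sum(np.log(f) + I / f)

    return minimize(neg_loglik, theta0, method="Nelder-Mead")

# Purely illustrative example with an AR(1)-type spectral density:
# f_ar1 = lambda lam, th: th[1] / (2 * np.pi * (1 - 2 * th[0] * np.cos(lam) + th[0] ** 2))
# res = whittle_fit(series, f_ar1, theta0=[0.5, 1.0])
```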


Author(s):  
Nils Damaschke ◽  
Volker Kühn ◽  
Holger Nobach

Abstract The prediction and correction of systematic errors in direct spectral estimation from irregularly sampled data taken from a stochastic process is investigated. Two kinds of sampling schemes that lead to such irregular sampling of the observed process are considered: stochastic sampling with non-equidistant intervals drawn from a continuous distribution, and nominally equidistant sampling with missing individual samples, which yields a discrete distribution of sampling intervals. For both continuous and discrete interval distributions, different sampling rules are investigated. On the one hand, purely random and independent sampling times are considered; this holds only when the occurrence of one sample at a certain time has no influence on other samples in the sequence, excluding any preferred delay intervals or external selection processes that introduce correlations between the sampling instances. On the other hand, sampling schemes with interdependency, and thus correlation, between the individual sampling instances are investigated; this holds whenever the occurrence of one sample influences further sampling instances in any way, e.g., through recovery times after an instance, preferences for certain sampling intervals (including sampling jitter), or a correlated external source influencing the validity of samples. The goal of this investigation is a bias-free estimation of the spectral content of the observed random process from such irregularly sampled data.
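To make "direct spectral estimation from irregularly sampled data" concrete, here is a minimal Python sketch (illustrative normalization, no bias correction) that evaluates the sample spectrum directly from the Fourier sum over the actual, non-equidistant sampling times; this plain estimator is precisely the kind whose systematic errors the abstract addresses.

```python
# Minimal sketch of direct spectral estimation from irregularly sampled data:
# evaluate the sample spectrum from the Fourier sum over the actual sampling
# times. No correction for sampling-scheme bias; normalization is illustrative.
import numpy as np

def direct_spectrum(times, values, freqs):
    """Sample spectrum S(f) from samples values[k] taken at times[k].

    times  : 1-D array of (irregular) sampling instants
    values : 1-D array of observed process values at those instants
    freqs  : 1-D array of frequencies (cycles per time unit) to evaluate
    """
    t = np.asarray(times, dtype=float)
    x = np.asarray(values, dtype=float) - np.mean(values)
    n = t.size
    duration = t.max() - t.min()
    spectrum = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        dft = np.sum(x * np.exp(-2.0j * np.pi * f * t))
        spectrum[i] = (np.abs(dft) ** 2) * duration / n ** 2
    return spectrum

# Example: random (Poisson-like) sampling of a noisy sinusoid
# rng = np.random.default_rng(1)
# t = np.sort(rng.uniform(0.0, 100.0, size=500))
# x = np.sin(2 * np.pi * 0.2 * t) + 0.5 * rng.standard_normal(t.size)
# S = direct_spectrum(t, x, np.linspace(0.01, 1.0, 200))
```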


2007 ◽  
Vol 27 (7) ◽  
pp. 643-668 ◽  
Author(s):  
Richard T. Baillie ◽  
Young-Wook Han ◽  
Robert J. Myers ◽  
Jeongseok Song
