Dependence structure in financial time series: Applications and evidence from wavelet analysis

2021
Author(s): Long Hai Vo

Conventional time series theory and spectral analysis have each achieved significant popularity in mainstream economics and finance research, yet each is somewhat lacking without the other. To overcome this problem, wavelet analysis has been developed to capture information localized in both time and frequency, providing an ideal tool for studying non-stationary time series. This paper explores the application of a variety of wavelet-based methodologies, in conjunction with conventional techniques such as Generalized Autoregressive Conditional Heteroskedasticity (GARCH) models and long-memory parameter estimates, to the short- and long-term dependence structure of financial returns and volatility. Specifically, by studying the long-memory property of these series we hope to identify the source of their possible predictability. Above all else, we document the indispensable role of trading activities associated with low frequencies in determining the long-run dependence of volatility. It follows that GARCH models incorporating long memory and asymmetric returns-volatility dynamics can provide reasonably accurate volatility forecasts. Additionally, the persistence parameter of returns, represented by the Hurst index, is observed to be correlated with trading profits obtained from typical technical rules designed to detect and capitalize on existing trending behaviour of stock prices. This implies that the Hurst index can serve as a good indicator of the long-memory characteristic of the market, which in turn drives such trending behaviour.
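
As a rough, self-contained illustration of the persistence measure this abstract refers to, the Python sketch below estimates a Hurst index by classical rescaled-range (R/S) analysis. The synthetic returns, window scheme, and choice of estimator are assumptions for illustration only, not the paper's wavelet-based pipeline or data.

```python
import numpy as np

def hurst_rs(x, min_window=16):
    """Estimate the Hurst index by classical rescaled-range (R/S) analysis.

    For each window size n, the series is split into blocks; within each
    block we take the range of cumulative deviations from the block mean,
    divided by the block standard deviation. The slope of log E[R/S]
    against log n estimates H.
    """
    x = np.asarray(x, dtype=float)
    sizes, rs_vals = [], []
    n = min_window
    while n <= len(x) // 2:
        rs = []
        for start in range(0, len(x) - n + 1, n):
            block = x[start:start + n]
            dev = np.cumsum(block - block.mean())
            s = block.std(ddof=1)
            if s > 0:
                rs.append((dev.max() - dev.min()) / s)
        sizes.append(n)
        rs_vals.append(np.mean(rs))
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope  # ~0.5 for i.i.d. noise, > 0.5 for persistent (trending) series

rng = np.random.default_rng(0)
returns = rng.standard_normal(4096)          # placeholder for real return data
print(f"Estimated Hurst index: {hurst_rs(returns):.3f}")  # near 0.5 here
```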


2022, Vol 9
Author(s): Xiuzhen Zhang, Riquan Zhang, Zhiping Lu

This article develops two new empirical likelihood methods for long-memory time series models, based on adjusted empirical likelihood and mean empirical likelihood. Applying the Whittle likelihood yields a score function that can be viewed as the estimating equation for the parameters of the long-memory time series model. From it we obtain an empirical likelihood ratio that is shown to be asymptotically chi-square distributed and can be used to construct confidence regions. By adding pseudo-samples, we simultaneously ensure that the empirical likelihood is well defined and enhance the coverage probability. Finite-sample properties of the empirical likelihood confidence regions are explored through Monte Carlo simulation, and some real-data applications are carried out.
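
For readers unfamiliar with the Whittle likelihood from which the estimating equation is built, here is a minimal sketch for an ARFIMA(0, d, 0) model. The model choice, the profiled form of the objective, and the optimizer are illustrative assumptions, not the article's empirical likelihood construction.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def whittle_objective(d, x):
    """Profiled negative Whittle log-likelihood (up to constants) for ARFIMA(0, d, 0).

    The spectral density shape is |2 sin(w/2)|^(-2d); the innovation scale is
    profiled out, leaving log(mean(I/f)) + mean(log f) over Fourier frequencies.
    """
    n = len(x)
    w = 2 * np.pi * np.arange(1, n // 2 + 1) / n            # Fourier frequencies
    I = np.abs(np.fft.fft(x - x.mean())[1:n // 2 + 1]) ** 2 / (2 * np.pi * n)
    f = np.abs(2 * np.sin(w / 2)) ** (-2 * d)               # spectral density shape
    return np.log(np.mean(I / f)) + np.mean(np.log(f))

rng = np.random.default_rng(1)
x = rng.standard_normal(2048)                                # white noise: true d = 0
res = minimize_scalar(whittle_objective, args=(x,),
                      bounds=(-0.49, 0.49), method="bounded")
print(f"Whittle estimate of d: {res.x:.3f}")                 # should be near 0
```

The score of this objective with respect to d is the kind of estimating equation the article feeds into its empirical likelihood machinery.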


2006, Vol 05 (04), pp. 639-658
Author(s): Benjamin W. Wah, Ming-Lun Qian

In this paper, we develop a new constrained artificial-neural-network (ANN) formulation and the associated learning algorithm for predicting stock prices, a difficult time-series prediction problem. We characterize daily stock prices as a noisy non-stationary time series and identify its predictable low-frequency components. Using a recurrent finite-impulse-response ANN, we formulate the learning problem as a constrained optimization problem, develop constraints for incorporating cross validations, and solve the learning problem using algorithms based on the theory of extended saddle points for nonlinear constrained optimization. Finally, we illustrate our prediction results on ten stock-price time series. Our main contributions in this paper are the channel-specific low-pass filtering of noisy time series obtained by wavelet decomposition, the transformation of the low-pass signals to improve their stationarity, and the incorporation of constraints on cross validation that can improve the accuracy of predictions. Our experimental results demonstrate good prediction accuracy and annual returns.
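
A minimal sketch of wavelet-based low-pass filtering in the spirit of the paper's channel-specific filtering is given below. It assumes the PyWavelets package; the wavelet, decomposition level, and toy data are illustrative assumptions, not the authors' configuration.

```python
import numpy as np
import pywt  # PyWavelets, assumed installed

def wavelet_lowpass(x, wavelet="db4", level=3):
    """Keep only the coarse approximation of a wavelet decomposition.

    Zeroing all detail coefficients and reconstructing yields a smoothed,
    low-frequency version of the series -- a simple stand-in for the
    channel-specific low-pass filtering described in the abstract.
    """
    coeffs = pywt.wavedec(x, wavelet, level=level)
    coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]  # drop detail channels
    return pywt.waverec(coeffs, wavelet)[: len(x)]

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 512)
prices = np.sin(2 * np.pi * 3 * t) + 0.5 * rng.standard_normal(512)  # toy "prices"
smooth = wavelet_lowpass(prices)
print(f"Residual std after low-pass: {np.std(prices - smooth):.3f}")
```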


Fractals, 2006, Vol 14 (03), pp. 205-222
Author(s): Yingchun Zhou, Murad S. Taqqu

A stationary time series is said to be long-range dependent (LRD) if its autocovariance function decays as a power of the lag, in such a way that the sum (over all lags) of the autocovariances diverges. The asymptotic rate of decay is determined by a parameter H, called the Hurst parameter. The time series is said to be short-range dependent (H = 1/2) if the sum converges. It is commonly believed that a random permutation of a sequence maintains the marginal distribution of each element but destroys the dependence, and in particular, that a random permutation of an LRD sequence creates a new sequence whose estimated Hurst parameter H is close to 1/2. This paper provides a theoretical basis for investigating these claims. In reality, a complete random permutation does not destroy the covariances, but merely equalizes them. The common value of the equalized covariances depends on the length N of the original sequence and decreases to 0 as N → ∞. Using the periodogram method, we explain why one is led to think, mistakenly, that the randomized sequence yields an estimated H close to 1/2.
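
The phenomenon can be checked numerically. The sketch below generates an approximate ARFIMA(0, d, 0) long-memory sequence and estimates H from the low-frequency slope of the periodogram before and after a random permutation; the generator, memory parameter, and bandwidth are assumptions for illustration, not the paper's setup.

```python
import numpy as np

def fractional_noise(n, d, rng):
    """Approximate ARFIMA(0, d, 0): MA filter with psi_k = psi_{k-1}(k-1+d)/k."""
    psi = np.empty(n)
    psi[0] = 1.0
    for k in range(1, n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    eps = rng.standard_normal(2 * n)
    return np.convolve(eps, psi, mode="valid")[:n]

def hurst_periodogram(x, m=None):
    """Periodogram (GPH-style) estimate: log I(w_j) ~ const + (1 - 2H) log w_j."""
    n = len(x)
    m = m or int(n ** 0.5)                          # low frequencies only
    w = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    slope, _ = np.polyfit(np.log(w), np.log(I), 1)
    return (1 - slope) / 2                          # H = (1 - slope) / 2

rng = np.random.default_rng(3)
x = fractional_noise(4096, d=0.3, rng=rng)          # true H = d + 1/2 = 0.8
print(f"H before permutation: {hurst_periodogram(x):.2f}")
print(f"H after  permutation: {hurst_periodogram(rng.permutation(x)):.2f}")
```

The second estimate comes out near 1/2, which is exactly the misleading impression the paper analyzes: the permutation equalizes, rather than destroys, the covariances.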


2009, Vol 25 (1), pp. 195-210
Author(s): Xiaofeng Shao

We propose generalized portmanteau-type test statistics in the frequency domain to test independence between two stationary time series. The test statistics are formed analogously to the one in Chen and Deo (2004, Econometric Theory 20, 382–416), who extended the applicability of the portmanteau goodness-of-fit test to the long memory case. Under the null hypothesis of independence, the asymptotic standard normal distributions of the proposed statistics are derived under fairly mild conditions. In particular, each time series is allowed to possess short memory, long memory, or antipersistence. A simulation study shows that the tests have reasonable size and power properties.
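
As a simplified, time-domain cousin of such portmanteau tests, the sketch below computes a Haugh-type statistic that sums squared sample cross-correlations over a band of lags. It is illustrative only and assumes two white-noise series; the paper's frequency-domain statistics generalize this idea to short memory, long memory, and antipersistence.

```python
import numpy as np
from scipy.stats import chi2

def haugh_statistic(x, y, max_lag=10):
    """Haugh-type portmanteau statistic for independence of two series.

    Sums n * r_xy(k)^2 over lags -max_lag..max_lag; under independence of
    two white-noise series it is approximately chi-square with
    2*max_lag + 1 degrees of freedom.
    """
    n = len(x)
    xc = (x - x.mean()) / x.std()
    yc = (y - y.mean()) / y.std()
    stat = 0.0
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            r = np.dot(xc[k:], yc[:n - k]) / n   # sample cross-correlation at lag k
        else:
            r = np.dot(xc[:n + k], yc[-k:]) / n
        stat += n * r ** 2
    df = 2 * max_lag + 1
    return stat, chi2.sf(stat, df)

rng = np.random.default_rng(4)
x, y = rng.standard_normal(1000), rng.standard_normal(1000)
stat, pval = haugh_statistic(x, y)
print(f"statistic = {stat:.2f}, p-value = {pval:.3f}")  # large p: no dependence found
```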


1986, Vol 23 (A), pp. 41-54
Author(s): Emanuel Parzen

An approach to time series model identification is described which involves the simultaneous use of frequency-domain, time-domain, and quantile-domain algorithms; the approach is called quantile spectral analysis. It provides a framework for integrating the analysis of long-memory (non-stationary) time series with the analysis of short-memory (stationary) time series.
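
One ingredient of quantile-domain analysis is the spectrum of quantile-crossing indicator series, which remains informative even when the original series is badly behaved in scale. The sketch below computes such a clipped-series periodogram; the quantile level and toy data are illustrative assumptions, not Parzen's full procedure.

```python
import numpy as np

def quantile_periodogram(x, tau=0.5):
    """Periodogram of the quantile-crossing indicator series 1{x_t <= q_tau}.

    Clipping the series at a quantile keeps its ordinal (quantile-domain)
    dependence information while discarding the marginal scale.
    """
    n = len(x)
    indicator = (x <= np.quantile(x, tau)).astype(float)
    freqs = np.arange(1, n // 2 + 1) / n
    I = np.abs(np.fft.fft(indicator - indicator.mean())[1:n // 2 + 1]) ** 2 / n
    return freqs, I

rng = np.random.default_rng(5)
x = np.cumsum(rng.standard_normal(512)) * 0.1 + rng.standard_normal(512)  # toy series
freqs, I = quantile_periodogram(x, tau=0.25)
print(f"Low-frequency mass share: {I[:10].sum() / I.sum():.2f}")
```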



