SEVERAL FUNDAMENTAL PROPERTIES OF DCCA CROSS-CORRELATION COEFFICIENT

Fractals ◽  
2017 ◽  
Vol 25 (02) ◽  
pp. 1750017 ◽  
Author(s):  
XIAOJUN ZHAO ◽  
PENGJIAN SHANG ◽  
JINGJING HUANG

The detrended cross-correlation analysis (DCCA) cross-correlation coefficient was proposed to measure the level of long-range cross-correlations between two non-stationary time series on multiple time scales. It has been applied to diverse areas of interest, although many properties of this method remain unclear. In this paper, we theoretically study several fundamental properties of the DCCA cross-correlation coefficient, which helps characterize the statistical behavior of this measure. We construct a synthetic time series by applying the integration and detrending procedures of the DCCA cross-correlation coefficient, which divides the estimation of the coefficient into two stages. The first stage, comprising the integration and the detrending, is proved to be a linear transformation. The second stage measures Pearson's r between the two synthetic time series. We confirm that the DCCA cross-correlation coefficient is, by definition, also a linear measure. Simulations with ARFIMA processes and multifractal binomial measures are numerically analyzed and confirm the theoretical analysis.
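The estimation procedure the abstract describes (integrate, detrend locally, then correlate the residuals) can be sketched in Python as follows. The overlapping-window scheme and first-order (linear) local detrending used here are illustrative assumptions, not necessarily the exact configuration of the paper.

```python
import numpy as np

def rho_dcca(x, y, n):
    """DCCA cross-correlation coefficient at window size n:
    detrended covariance normalized by the two detrended variances."""
    X = np.cumsum(x - np.mean(x))          # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    t = np.arange(n + 1)
    f2x = f2y = f2xy = 0.0
    for s in range(len(X) - n):            # overlapping windows of n + 1 points
        wx, wy = X[s:s + n + 1], Y[s:s + n + 1]
        rx = wx - np.polyval(np.polyfit(t, wx, 1), t)  # remove local linear trend
        ry = wy - np.polyval(np.polyfit(t, wy, 1), t)
        f2x += np.mean(rx ** 2)
        f2y += np.mean(ry ** 2)
        f2xy += np.mean(rx * ry)
    return f2xy / np.sqrt(f2x * f2y)
```

Because the final step is a Pearson-type ratio on the pooled detrended residuals, the Cauchy-Schwarz inequality keeps the value within [-1, 1], consistent with the linearity result the abstract states.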

2021 ◽  
pp. 2250012
Author(s):  
G. F. Zebende ◽  
E. F. Guedes

A correlogram is a statistical tool used to check time-series memory by computing the auto-correlation coefficient as a function of the time lag. If the time series has no memory, the auto-correlation must be close to zero for any time lag; if there is memory, the auto-correlations must be significantly different from zero. Therefore, based on the robust detrended cross-correlation coefficient, ρDCCA, we propose the detrended correlogram method in this paper, which is tested on several time series (simulated and empirical). This new statistical tool can visualize a complete map of the auto-correlation over many time lags and time scales, and can therefore analyze the memory effect of any time series.
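A minimal sketch of the idea in Python: compute a DCCA-type coefficient between the series and a lagged copy of itself for every (time scale, time lag) pair, producing the 2-D map the abstract describes. Here `dcca_rho` is a simplified stand-in for the robust coefficient used in the paper, with assumed window and detrending choices.

```python
import numpy as np

def dcca_rho(x, y, n):
    """Simplified DCCA coefficient (overlapping windows, linear detrending)."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    t = np.arange(n + 1)
    fx = fy = fxy = 0.0
    for s in range(len(X) - n):
        rx = X[s:s + n + 1] - np.polyval(np.polyfit(t, X[s:s + n + 1], 1), t)
        ry = Y[s:s + n + 1] - np.polyval(np.polyfit(t, Y[s:s + n + 1], 1), t)
        fx += np.mean(rx ** 2)
        fy += np.mean(ry ** 2)
        fxy += np.mean(rx * ry)
    return fxy / np.sqrt(fx * fy)

def detrended_correlogram(x, lags, scales):
    """Map of detrended auto-correlation: rows are scales, columns are lags."""
    x = np.asarray(x, float)
    grid = np.empty((len(scales), len(lags)))
    for i, n in enumerate(scales):
        for j, k in enumerate(lags):
            a, b = (x[:-k], x[k:]) if k else (x, x)  # series vs. lagged copy
            grid[i, j] = dcca_rho(a, b, n)
    return grid
```

The resulting grid can be rendered as a heat map; at lag zero the coefficient is identically one, and memoryless series should show near-zero values at all other lags.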


2015 ◽  
Vol 26 (06) ◽  
pp. 1550071 ◽  
Author(s):  
Wenbin Shi ◽  
Pengjian Shang

This paper is devoted to multiscale cross-correlation analysis of stock market time series, applying both the multiscale DCCA cross-correlation coefficient and multiscale cross-sample entropy (MSCE). The multiscale DCCA cross-correlation coefficient is a realization of the DCCA cross-correlation coefficient on multiple scales. The results of this method present a good scaling characterization. More significantly, this method is able to group stock markets by area. Compared to the multiscale DCCA cross-correlation coefficient, MSCE presents a more remarkable scaling characterization, and the MSCE value of each log-return series decreases as the scale factor increases. However, its grouping results are not as good as those of the multiscale DCCA cross-correlation coefficient.
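To illustrate the MSCE side of the comparison, here is a hedged Python sketch: cross-sample entropy applied to coarse-grained copies of two series. The parameter choices (m = 2, tolerance r = 0.3 on normalized series) are common defaults, not necessarily those of the paper.

```python
import numpy as np

def cross_sample_entropy(x, y, m=2, r=0.3):
    """Cross-SampEn: -log of the conditional probability that templates of
    x and y matching for m points also match for m + 1 points."""
    x = (x - np.mean(x)) / np.std(x)       # normalize both series
    y = (y - np.mean(y)) / np.std(y)
    def count(mm):
        tx = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        ty = np.array([y[i:i + mm] for i in range(len(y) - mm)])
        return sum(np.sum(np.max(np.abs(ty - t), axis=1) < r) for t in tx)
    return -np.log(count(m + 1) / count(m))

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return np.asarray(x[:n * scale], float).reshape(n, scale).mean(axis=1)

def msce(x, y, scales, m=2, r=0.3):
    """Multiscale cross-sample entropy: cross-SampEn at each scale."""
    return [cross_sample_entropy(coarse_grain(x, s), coarse_grain(y, s), m, r)
            for s in scales]
```

Strongly synchronized series yield low cross-sample entropy, while independent noise pairs yield high values, which is what makes the scale-by-scale profile informative.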


Complexity ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-10
Author(s):  
Keqiang Dong ◽  
Xiaojie Gao

In this paper, we develop a new method to measure nonlinear interactions between nonstationary time series based on detrended cross-correlation coefficient analysis. We describe how a nonlinear interaction may be obtained by eliminating the influence of other variables on two simultaneous time series. Using two artificially generated signals, we show that the new method works reliably in determining the cross-correlation behavior of two signals. We also illustrate the application of this method in finance and aeroengine systems. These analyses suggest that the proposed measure, derived from detrended cross-correlation coefficient analysis, may be used to remove the influence of other variables on the cross-correlation between two simultaneous time series.
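The core idea, removing the influence of a third variable before correlating, can be sketched with ordinary least-squares residuals. Plain Pearson correlation is used here as a simplified stand-in for the detrended coefficient of the paper; the variable names are illustrative.

```python
import numpy as np

def residual(v, z):
    """Residual of v after removing the linear (OLS) influence of z."""
    Z = np.column_stack([np.ones(len(z)), z])
    beta, *_ = np.linalg.lstsq(Z, v, rcond=None)
    return v - Z @ beta

def partial_corr(x, y, z):
    """Correlation between x and y after removing the common driver z."""
    return np.corrcoef(residual(x, z), residual(y, z))[0, 1]
```

If x and y are both driven by z plus independent noise, their raw correlation is substantial, but the partial correlation after removing z is close to zero, which is the behavior the abstract's artificial-signal experiments are meant to demonstrate.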


Author(s):  
Jia-Rong Yeh ◽  
Chung-Kang Peng ◽  
Norden E. Huang

Multi-scale entropy (MSE) was developed as a measure of the complexity of time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and that these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach that applies the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractional Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease.
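The baseline MSE computation that this approach builds on (coarse-graining followed by sample entropy, before any EMD-based detrending) can be sketched as follows; the tolerance convention (r fixed at 0.2 times the original series' standard deviation) follows common practice and is an assumption here.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): irregularity of a series (low for regular signals).
    r is an absolute tolerance."""
    x = np.asarray(x, float)
    def count(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(t) - 1):        # count template pairs within r
            c += np.sum(np.max(np.abs(t[i + 1:] - t[i]), axis=1) < r)
        return c
    return -np.log(count(m + 1) / count(m))

def multiscale_entropy(x, scales, m=2):
    """Classic MSE: coarse-grain the series, then SampEn at each scale,
    with the tolerance fixed from the original series."""
    x = np.asarray(x, float)
    r = 0.2 * np.std(x)
    out = []
    for s in scales:
        n = len(x) // s
        cg = x[:n * s].reshape(n, s).mean(axis=1)   # coarse-graining
        out.append(sample_entropy(cg, m, r))
    return out
```

A regular signal such as a sinusoid yields much lower sample entropy than white noise, which is the discriminative property MSE exploits scale by scale.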


2021 ◽  
Author(s):  
Ravi Kumar Guntu ◽  
Ankit Agarwal

Model-free gradation of the predictability of a geophysical system is essential to quantify how much inherent information is contained within the system and to evaluate different forecasting methods' performance in obtaining the best possible prediction. We conjecture that the multiscale information enclosed in a given geophysical time series is the only input source for any forecast model. In the literature, established entropic measures for grading the predictability of a time series at multiple time scales are limited. Therefore, an additional measure is needed to quantify the information at multiple time scales and thereby grade the level of predictability. This study introduces a novel measure, the Wavelet Entropy Energy Measure (WEEM), based on wavelet entropy, to investigate a time series's energy distribution. From the WEEM analysis, predictability can be graded from low to high. The gradation is based on the difference between the entropy of the wavelet energy distribution of the time series and the entropy of the wavelet energy of white noise. The metric quantifies the proportion of the deterministic component of a time series in terms of energy concentration, and it ranges from zero to one: one corresponds to high predictability due to concentrated energy, while zero represents a process similar to white noise, with a scattered energy distribution. The proposed metric is normalized, handles non-stationarity, and is independent of the length of the data. It can therefore describe the evolution of predictability for any geophysical time series (e.g. precipitation, streamflow, paleoclimate series) from the past to the present. The WEEM's performance can guide forecasting models toward the best possible prediction of a geophysical system by comparing different methods.
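A rough Python sketch of the idea, assuming a Haar wavelet decomposition and a Monte-Carlo white-noise reference; the paper's exact wavelet, normalization, and reference construction may differ.

```python
import numpy as np

def haar_level_energies(x, levels):
    """Total energy of Haar detail coefficients at each decomposition level."""
    a = np.asarray(x, float)
    energies = []
    for _ in range(levels):
        n = len(a) // 2
        even, odd = a[0:2 * n:2], a[1:2 * n:2]
        d = (even - odd) / np.sqrt(2.0)    # detail coefficients
        a = (even + odd) / np.sqrt(2.0)    # approximation, carried down
        energies.append(float(np.sum(d ** 2)))
    return np.array(energies)

def wavelet_entropy(x, levels):
    """Shannon entropy of the normalized energy distribution across levels."""
    e = haar_level_energies(x, levels)
    p = e / e.sum()
    return float(-np.sum(p * np.log(p + 1e-12)))

def weem(x, levels=6, n_ref=20, seed=0):
    """Sketch of the WEEM idea: compare the series' wavelet-energy entropy
    against a white-noise reference of the same length. Values near 1 mean
    concentrated energy (predictable); near 0 means noise-like."""
    rng = np.random.default_rng(seed)
    h_x = wavelet_entropy(x, levels)
    h_ref = np.mean([wavelet_entropy(rng.standard_normal(len(x)), levels)
                     for _ in range(n_ref)])
    return float(np.clip(1.0 - h_x / h_ref, 0.0, 1.0))
```

Under these assumptions a pure sinusoid, whose energy concentrates in one or two levels, scores much higher than white noise, whose energy is spread across all levels.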

