Wavelet-based Estimation of Generalized Fractional Process

2007 ◽  
Vol 46 (02) ◽  
pp. 117-120 ◽  
Author(s):  
A. Kawanaka ◽  
A. Gonzaga

Summary Objectives: This paper proposes an estimation procedure for the parameters of a generalized fractional process, a fairly general long-memory model applicable to biomedical signals whose autocorrelations exhibit hyperbolic decay. Methods: We derive a wavelet-based weighted least squares estimator of the long-memory parameter based on the maximal-overlap estimator of the wavelet variance. Short-memory parameters can then be estimated using standard methods. We illustrate our approach with an example using ECG heart rate data. Results and Conclusion: The proposed method is relatively efficient, both computationally and statistically, and it allows the long-memory parameter to be estimated without knowledge of the short-memory parameters. Moreover, it provides a more general model for biomedical signals that exhibit periodic long-range dependence, such as ECG data, whose relatively unobtrusive recording may be advantageous for assessing or predicting physiological or pathological conditions from the estimated parameter values.
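As a rough illustration of the wavelet-variance idea (not the authors' exact procedure): for an ordinary long-memory process the wavelet variance at scale tau_j grows roughly like tau_j^(2d-1), so d can be read off a weighted regression of log variance on log scale. The sketch below computes Haar maximal-overlap wavelet coefficients directly with numpy and uses the effective number of coefficients per scale as illustrative weights; the generalized (Gegenbauer-type) case treated in the paper, where the spectral singularity sits away from frequency zero, is not handled here.

    import numpy as np

    def haar_modwt_variance(x, num_scales=6):
        # Wavelet variance at dyadic scales tau_j = 2**(j-1) using the Haar
        # maximal-overlap (non-decimated) filter, computed by direct convolution.
        x = np.asarray(x, dtype=float)
        scales, variances = [], []
        for j in range(1, num_scales + 1):
            half = 2 ** (j - 1)
            kernel = np.concatenate([np.full(half, 1.0), np.full(half, -1.0)]) / (2 * half)
            w = np.convolve(x, kernel, mode="valid")
            scales.append(half)
            variances.append(np.mean(w ** 2))
        return np.array(scales, dtype=float), np.array(variances)

    def estimate_d_wls(x, num_scales=6):
        # Weighted least squares of log variance on log scale; the slope is
        # approximately 2d - 1, so d = (slope + 1) / 2.
        scales, nu2 = haar_modwt_variance(x, num_scales)
        X = np.column_stack([np.ones_like(scales), np.log(scales)])
        y = np.log(nu2)
        w = len(x) / scales          # illustrative weights: finer scales get more weight
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        return (beta[1] + 1.0) / 2.0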

Author(s):  
Federico Maddanu

Abstract: The estimation of the long memory parameter d is a widely discussed issue in the literature. The harmonically weighted (HW) process was recently introduced for long memory time series with an unbounded spectral density at the origin. In contrast to the better-known fractionally integrated process, the HW approach does not require estimation of the parameter d, yet it may capture long memory just as well as the fractionally integrated model when the sample size is not too large. Our contribution is a generalization of the HW model, termed the generalized harmonically weighted (GHW) process, which allows for an unbounded spectral density at $k \ge 1$ frequencies away from the origin. Convergence in probability of the Whittle estimator is established for the GHW process, along with a discussion of simulation methods. Fit and forecast performance is evaluated in an empirical application to paleoclimatic data. Our main conclusion is that this generalization models long memory as well as its classical competitor, the fractionally differenced Gegenbauer process, does. In addition, the GHW process does not require estimation of the memory parameter, simplifying the issue of how to disentangle long memory from a (moderately persistent) short memory component. This leads to a clear advantage of our formulation over the fractional long memory approach.
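A minimal sketch of a Whittle fit of the kind discussed above, written for the Gegenbauer comparison model because its spectral density has a simple closed form, f(lambda) = sigma^2/(2 pi) |2(cos lambda - cos nu)|^(-2d); the GHW spectral density itself is not reproduced here, and the optimizer, starting values, and the choice to hold the singularity location nu fixed are all illustrative assumptions.

    import numpy as np
    from scipy.optimize import minimize

    def periodogram(x):
        # Periodogram I(lambda_j) at the positive Fourier frequencies 2*pi*j/n.
        x = np.asarray(x, dtype=float)
        n = len(x)
        m = (n - 1) // 2
        freqs = 2 * np.pi * np.arange(1, m + 1) / n
        I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
        return freqs, I

    def gegenbauer_spectrum(lam, d, nu, sigma2):
        # Assumed textbook spectral density of Gegenbauer white noise.
        return sigma2 / (2 * np.pi) * np.abs(2 * (np.cos(lam) - np.cos(nu))) ** (-2 * d)

    def whittle_fit(x, nu):
        # Whittle estimate of (d, sigma2), holding the singularity location nu fixed.
        freqs, I = periodogram(x)

        def objective(theta):
            d, log_sigma2 = theta
            f = gegenbauer_spectrum(freqs, d, nu, np.exp(log_sigma2))
            return np.sum(np.log(f) + I / f)

        res = minimize(objective, x0=np.array([0.1, 0.0]), method="Nelder-Mead")
        return res.x[0], np.exp(res.x[1])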


2009 ◽  
Vol 25 (3) ◽  
pp. 764-792 ◽  
Author(s):  
Rohit Deo ◽  
Clifford M. Hurvich ◽  
Philippe Soulier ◽  
Yi Wang

We establish sufficient conditions on durations that are stationary with finite variance and memory parameter $d \in [0, \tfrac{1}{2})$ to ensure that the corresponding counting process $N(t)$ satisfies $\operatorname{Var} N(t) \sim C t^{2d+1}$ ($C > 0$) as $t \to \infty$, with the same memory parameter $d \in [0, \tfrac{1}{2})$ that was assumed for the durations. Thus, these conditions ensure that the memory parameter in durations propagates to the same memory parameter in the counts. We then show that any autoregressive conditional duration ACD(1,1) model with a sufficient number of finite moments yields short memory in counts, whereas any long memory stochastic duration model with $d > 0$ and all finite moments yields long memory in counts, with the same $d$. Finally, we provide some results about the propagation of long memory to the empirically relevant case of realized variance estimates affected by market microstructure noise contamination.
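A small simulation sketch of the propagation statement, under the assumption that durations are generated by exponentiating an ARFIMA(0,d,0) series (one convenient long-memory duration model, not necessarily the class studied in the paper): count the events falling in windows of increasing length t and check that Var N(t) grows roughly like t^(2d+1).

    import numpy as np

    def arfima_0d0(n, d, rng):
        # ARFIMA(0,d,0) via a truncated MA(infinity) representation:
        # psi_0 = 1, psi_k = psi_{k-1} * (k - 1 + d) / k.
        psi = np.empty(n)
        psi[0] = 1.0
        for k in range(1, n):
            psi[k] = psi[k - 1] * (k - 1 + d) / k
        eps = rng.standard_normal(2 * n)
        return np.convolve(eps, psi, mode="full")[n:2 * n]

    def n_events(durations, t):
        # N(t): number of events whose arrival time (cumulative duration) is <= t.
        return np.searchsorted(np.cumsum(durations), t, side="right")

    rng = np.random.default_rng(0)
    d, n_dur, n_rep = 0.3, 4000, 200
    windows = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
    counts = np.zeros((n_rep, len(windows)))
    for r in range(n_rep):
        durations = np.exp(0.5 * arfima_0d0(n_dur, d, rng))   # illustrative duration model
        counts[r] = [n_events(durations, t) for t in windows]

    slope = np.polyfit(np.log(windows), np.log(counts.var(axis=0)), 1)[0]
    print(f"Var N(t) growth exponent ~ {slope:.2f}; theory says 2d + 1 = {2 * d + 1:.2f}")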


2011 ◽  
Vol 28 (2) ◽  
pp. 457-470 ◽  
Author(s):  
Offer Lieberman ◽  
Roy Rosemarin ◽  
Judith Rousseau

Consistency, asymptotic normality, and efficiency of the maximum likelihood estimator for stationary Gaussian time series were shown to hold in the short memory case by Hannan (1973, Journal of Applied Probability 10, 130–145) and in the long memory case by Dahlhaus (1989, Annals of Statistics 17, 1749–1766). In this paper we extend these results to the entire stationarity region, including the case of antipersistence and noninvertibility.
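For the simplest stationary long-memory model, an ARFIMA(0,d,0) process, the exact Gaussian likelihood can be written down directly and the memory parameter estimated over the whole stationary range d in (-1/2, 1/2), so antipersistence (d < 0) is included. The sketch below uses the standard autocovariance recursion for this model and profiles out the innovation variance; the brute-force linear algebra and the optimizer choice are illustrative, not the approach analyzed in the paper.

    import numpy as np
    from scipy.linalg import toeplitz
    from scipy.optimize import minimize_scalar
    from scipy.special import gammaln

    def arfima_acvf(d, n):
        # Autocovariances gamma(0), ..., gamma(n-1) of ARFIMA(0,d,0) with unit
        # innovation variance, valid on the whole stationary range d in (-1/2, 1/2).
        gamma = np.empty(n)
        gamma[0] = np.exp(gammaln(1.0 - 2.0 * d) - 2.0 * gammaln(1.0 - d))
        for k in range(1, n):
            gamma[k] = gamma[k - 1] * (k - 1 + d) / (k - d)
        return gamma

    def profile_neg_loglik(d, x):
        # Gaussian negative log-likelihood (up to constants) with the innovation
        # variance profiled out: sigma2_hat = x' R^{-1} x / n.
        n = len(x)
        R = toeplitz(arfima_acvf(d, n))
        _, logdet = np.linalg.slogdet(R)
        sigma2_hat = x @ np.linalg.solve(R, x) / n
        return 0.5 * (logdet + n * np.log(sigma2_hat) + n)

    def fit_d(x):
        # MLE of d over the full stationarity region, antipersistence included.
        res = minimize_scalar(profile_neg_loglik, args=(np.asarray(x, dtype=float),),
                              bounds=(-0.49, 0.49), method="bounded")
        return res.x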


2013 ◽  
Vol 29 (6) ◽  
pp. 1196-1237 ◽  
Author(s):  
Adam McCloskey ◽  
Pierre Perron

We propose estimators of the memory parameter of a time series that are robust to a wide variety of random level shift processes, deterministic level shifts, and deterministic time trends. The estimators are simple trimmed versions of the popular log-periodogram regression estimator that employ certain sample-size-dependent and, in some cases, data-dependent trimmings that discard low-frequency components. We also show that a previously developed trimmed local Whittle estimator is robust to the same forms of data contamination. Regardless of whether the underlying long- or short-memory process is contaminated by level shifts or deterministic trends, the estimators are consistent and asymptotically normal with the same limiting variance as their standard untrimmed counterparts. Simulations show that the trimmed estimators perform their intended purpose quite well, substantially decreasing both finite-sample bias and root mean-squared error in the presence of these contaminating components. Furthermore, we assess the trade-offs involved with their use when such components are not present but the underlying process exhibits strong short-memory dynamics or is contaminated by noise. To balance the potential finite-sample biases involved in estimating the memory parameter, we recommend a particular adaptive version of the trimmed log-periodogram estimator that performs well in a wide variety of circumstances. We apply the estimators to stock market volatility data to find that various time series typically thought to be long-memory processes actually appear to be short- or very weak long-memory processes contaminated by level shifts or deterministic trends.
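A bare-bones version of the log-periodogram regression with low-frequency trimming conveys the mechanism: the lowest Fourier frequencies, which level shifts and deterministic trends contaminate most, are simply dropped from the regression. The bandwidth and trimming rates in the usage comment are illustrative; the paper's estimators use specific sample-size-dependent and data-dependent rules not reproduced here.

    import numpy as np

    def log_periodogram_d(x, m, trim=0):
        # Log-periodogram regression of log I(lambda_j) on log(4 sin^2(lambda_j / 2))
        # over Fourier frequencies j = trim+1, ..., m; the negated slope estimates d.
        # trim > 0 drops the lowest frequencies, the device used above to guard
        # against level shifts and deterministic trends.
        x = np.asarray(x, dtype=float)
        n = len(x)
        j = np.arange(trim + 1, m + 1)
        lam = 2 * np.pi * j / n
        I = np.abs(np.fft.fft(x)[j]) ** 2 / (2 * np.pi * n)
        X = np.log(4.0 * np.sin(lam / 2.0) ** 2)
        Y = np.log(I)
        Xc = X - X.mean()
        return -Xc @ (Y - Y.mean()) / (Xc @ Xc)

    # Illustrative usage (bandwidth and trimming rates chosen arbitrarily here):
    # n = len(x)
    # d_untrimmed = log_periodogram_d(x, m=int(n ** 0.65))
    # d_trimmed   = log_periodogram_d(x, m=int(n ** 0.65), trim=int(n ** 0.20))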


2018 ◽  
Vol 55 (2) ◽  
pp. 543-558 ◽  
Author(s):  
M. du Roy de Chaumaray

Abstract: We simultaneously estimate the four parameters of a subcritical Heston process. We do not restrict ourselves to the case where the stochastic volatility process never reaches zero. In order to avoid the use of unmanageable stopping times and a natural but intractable estimator, we use a weighted least-squares estimator. We establish strong consistency and asymptotic normality for this estimator. Numerical simulations are also provided, illustrating the favorable performance of our estimation procedure.
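A minimal Euler-type simulation of subcritical Heston dynamics is useful for trying an estimator on synthetic paths where the variance process can reach zero. The log-price drift, the full-truncation discretization, and the parameter values in the comment are illustrative assumptions; the weighted least-squares estimator itself is not reproduced here.

    import numpy as np

    def simulate_heston(a, b, sigma, rho, v0=0.1, s0=1.0, T=1.0, n=2000, seed=None):
        # Full-truncation Euler scheme for
        #   dV_t      = (a - b V_t) dt + sigma sqrt(V_t) dW1_t   (CIR variance, b > 0)
        #   d log S_t = -V_t / 2 dt + sqrt(V_t) (rho dW1_t + sqrt(1 - rho^2) dW2_t)
        # The log-price drift and the scheme are illustrative, not the paper's setup.
        rng = np.random.default_rng(seed)
        dt = T / n
        v = np.empty(n + 1)
        logs = np.empty(n + 1)
        v[0], logs[0] = v0, np.log(s0)
        for t in range(n):
            z1, z2 = rng.standard_normal(2)
            vp = max(v[t], 0.0)                    # truncate negative variance at zero
            v[t + 1] = v[t] + (a - b * vp) * dt + sigma * np.sqrt(vp * dt) * z1
            logs[t + 1] = logs[t] - 0.5 * vp * dt + np.sqrt(vp * dt) * (
                rho * z1 + np.sqrt(1.0 - rho ** 2) * z2)
        return np.exp(logs), np.maximum(v, 0.0)

    # Example with 2a < sigma^2, so the variance process can actually reach zero:
    # prices, variances = simulate_heston(a=0.02, b=1.5, sigma=0.4, rho=-0.5, seed=1)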


2016 ◽  
Vol 20 (4) ◽  
Author(s):  
Richard T. Baillie ◽  
George Kapetanios

Abstract: A substantial amount of recent time series research has emphasized semi-parametric estimators of the long memory parameter, and we provide a selective review of the literature on this issue. We consider such estimators applied to the problem of estimating the parameters of a short memory process that is embedded within a long memory process. We consider the fractional differencing filter and the subsequent properties of a two-step estimator of the short memory parameters. We conclude that while the semi-parametric estimators can have excellent properties for estimating the long memory parameter, they do not have good properties when used in the two-step estimation of the short memory parameters.
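The two-step idea can be sketched directly: given a semi-parametric estimate of d, apply the fractional differencing filter (1 - L)^d_hat to the data and fit the short-memory model to the filtered series. The AR(1) Yule-Walker fit below is a stand-in for whatever short-memory specification is of interest, and truncating the filter at the sample size is the usual finite-sample device.

    import numpy as np

    def frac_diff(x, d):
        # Fractional differencing filter (1 - L)^d, truncated at the sample size:
        # pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k.
        x = np.asarray(x, dtype=float)
        n = len(x)
        pi = np.empty(n)
        pi[0] = 1.0
        for k in range(1, n):
            pi[k] = pi[k - 1] * (k - 1 - d) / k
        return np.array([pi[:t + 1][::-1] @ x[:t + 1] for t in range(n)])

    def yule_walker_ar1(u):
        # Yule-Walker estimate of an AR(1) coefficient; a stand-in for whichever
        # short-memory model is fitted in the second step.
        u = u - u.mean()
        return (u[1:] @ u[:-1]) / (u @ u)

    # Two-step sketch (d_hat from any semi-parametric estimator of the long memory):
    # u = frac_diff(x, d_hat)        # step 1: remove the estimated long-memory component
    # phi_hat = yule_walker_ar1(u)   # step 2: estimate the short-memory parameter(s)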


2001 ◽  
Vol 38 (04) ◽  
pp. 1033-1054 ◽  
Author(s):  
Liudas Giraitis ◽  
Piotr Kokoszka ◽  
Remigijus Leipus

The paper studies the impact of a broadly understood trend, which includes a change point in mean and the monotonic trends studied by Bhattacharya et al. (1983), on the asymptotic behaviour of a class of tests designed to detect long memory in a stationary sequence. Our results pertain to a family of tests which are similar to Lo's (1991) modified R/S test. We show that both long memory and nonstationarity (the presence of trends or change points) can lead to rejection of the null hypothesis of short memory, so that further testing is needed to discriminate between long memory and some forms of nonstationarity. We provide a quantitative description of trends which do or do not fool the R/S-type long memory tests. We show, in particular, that a shift in mean of a magnitude larger than $N^{-1/2}$, where $N$ is the sample size, affects the asymptotic size of the tests, whereas smaller shifts do not.
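For reference, Lo's modified R/S statistic, which the family of tests above resembles, can be computed in a few lines: the range of the partial sums of the demeaned series is scaled by a Bartlett-weighted long-run standard deviation and by sqrt(N). The truncation lag q and the usage comment about mean shifts are illustrative choices.

    import numpy as np

    def lo_modified_rs(x, q):
        # Lo's (1991) modified R/S statistic, normalized by sqrt(N). The long-run
        # variance uses Bartlett weights with truncation lag q; q = 0 recovers the
        # classical R/S statistic.
        x = np.asarray(x, dtype=float)
        n = len(x)
        dev = x - x.mean()
        partial = np.cumsum(dev)
        r = partial.max() - partial.min()
        s2 = np.mean(dev ** 2)
        for j in range(1, q + 1):
            w = 1.0 - j / (q + 1.0)
            s2 += 2.0 * w * np.sum(dev[j:] * dev[:-j]) / n
        return r / (np.sqrt(s2) * np.sqrt(n))

    # Illustrative check of the size effect described above: adding a mean shift of
    # magnitude c * N**(-0.25) (i.e. larger than N**(-1/2)) halfway through a
    # short-memory series tends to push the statistic beyond its critical values.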


2009 ◽  
Vol 12 (03) ◽  
pp. 297-317 ◽  
Author(s):  
ANOUAR BEN MABROUK ◽  
HEDI KORTAS ◽  
SAMIR BEN AMMOU

In this paper, fractional integration dynamics in the return and volatility series of stock market indices are investigated. The investigation is conducted using wavelet ordinary least squares, wavelet weighted least squares, and the approximate maximum likelihood estimator. It is shown that the long memory property in stock returns is mainly associated with emerging markets rather than developed ones, while strong evidence of long range dependence is found for all volatility series. The relevance of the wavelet-based estimators, especially the approximate maximum likelihood and weighted least squares techniques, is demonstrated in terms of stability and estimation accuracy.
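A compact sketch of the difference between the wavelet OLS and WLS fits, assuming the per-scale wavelet variances and the number of coefficients per scale have already been computed (for instance with a maximal-overlap transform as sketched earlier in this collection); weighting by the coefficient counts is an illustrative choice, not necessarily the weighting used in the paper.

    import numpy as np

    def d_from_scale_regression(scales, wavelet_vars, counts=None):
        # Regress log2(wavelet variance) on log2(scale); the slope is roughly 2d - 1.
        # counts=None gives the ordinary least squares fit; passing the number of
        # wavelet coefficients per scale gives a WLS fit that downweights the coarse,
        # poorly estimated scales.
        scales = np.asarray(scales, dtype=float)
        X = np.column_stack([np.ones(len(scales)), np.log2(scales)])
        y = np.log2(np.asarray(wavelet_vars, dtype=float))
        w = np.ones(len(scales)) if counts is None else np.asarray(counts, dtype=float)
        W = np.diag(w)
        slope = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)[1]
        return (slope + 1.0) / 2.0

    # Applied separately to a return series and to a volatility proxy such as the
    # absolute or squared returns, the two fits illustrate the pattern reported above:
    # d near zero for most developed-market returns, clearly positive for volatility.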

