Diligence in determining the appropriate form of stationarity

2014 ◽  
Vol 14 (1) ◽  
Author(s):  
André Heymans ◽  
Chris Van Heerden ◽  
Jan Van Greunen ◽  
Gary Van Vuuren

Orientation: One of the most vexing problems in modelling time series data is determining the appropriate form of stationarity, as it can have a significant influence on the model's explanatory properties, which makes interpreting the results problematic.

Research purpose: This article challenged the assumption that most financial time series are first-difference stationary. The common "difference first, ask questions later" approach was revisited by taking a more systematic approach to analysing the statistical properties of financial time series data.

Motivation for the study: Since Nelson and Plosser (1982) argued that many macroeconomic time series are difference stationary, many econometricians have simply differenced data in order to achieve stationarity. However, the inherent properties of time series data have changed over the past 30 years. This necessitates a proper evaluation of the properties of the data before deciding on the appropriate course of action, in order to avoid over-differencing, which causes variables to lose their explanatory power and leads to spurious results.

Research approach, design and method: This article introduced a rigorous process that enables econometricians to determine the most appropriate form of stationarity, guided by the underlying statistical properties of several financial and economic variables.

Main findings: The results highlighted the importance of consulting the d parameter to make a more informed decision, rather than simply assuming that the data are I(1). Evidence also suggested that the appropriate form of stationarity can vary, and emphasised the importance of considering whether a series is fractionally differenced.

Practical/managerial implications: Only when data are correctly classified and transformed accordingly will they be neither under- nor over-differenced, thus enhancing the validity of the results generated by statistical models.

Contribution/value-add: By utilising this rigorous process, econometricians will be able to generate more accurate out-of-sample forecasts, as already demonstrated by Van Greunen, Heymans, Van Heerden and Van Vuuren (2014).
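The fractional differencing that the abstract's d parameter refers to can be sketched in a few lines. The code below is a generic illustration, not the authors' procedure: it expands the fractional difference operator (1 − L)^d into its binomial weights and applies them to a series, so that d = 1 recovers ordinary first differencing and d = 0 leaves the series unchanged.

```python
def frac_diff_weights(d, n):
    # Binomial-expansion weights of (1 - L)^d, truncated at n terms:
    # w_0 = 1,  w_k = -w_{k-1} * (d - k + 1) / k
    w = [1.0]
    for k in range(1, n):
        w.append(-w[-1] * (d - k + 1) / k)
    return w

def frac_diff(series, d):
    # Apply (1 - L)^d to a series, using all lags available at each t
    w = frac_diff_weights(d, len(series))
    return [sum(w[k] * series[t - k] for k in range(t + 1))
            for t in range(len(series))]
```

For d = 1 the weights collapse to (1, −1, 0, 0, …), i.e. ordinary first differences; a fractional d between 0 and 1 produces slowly decaying weights, which is what lets a fractionally differenced series retain more long-run information than an over-differenced I(1) transform.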

2007 ◽  
Vol 18 (02) ◽  
pp. 235-252 ◽  
Author(s):  
DILIP P. AHALPARA ◽  
JITENDRA C. PARIKH

The dynamics of complex systems is studied by first considering a chaotic time series generated by the Lorenz equations and adding noise to it. The trend (smooth behavior) is separated from fluctuations at different scales using wavelet analysis, and a prediction method proposed by Lorenz is applied to make out-of-sample predictions in different regions of the time series. The predictive capability of this method is studied by considering several improvements over it. We then apply this approach to a real financial time series. The smooth time series is modeled using techniques of nonlinear dynamics. Our results suggest that the modified Lorenz method gives better predictions than the original Lorenz method. Fluctuations are analyzed using probabilistic considerations.
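The two ingredients of this setup, a chaotic series from the Lorenz equations and a scale separation into trend and fluctuation, can be sketched generically. The code below is an illustration under stated assumptions (simple Euler integration, a single level of an unnormalized Haar transform), not the paper's wavelet pipeline:

```python
def lorenz_series(n, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
    # Euler integration of the Lorenz system; returns the x-component
    x, y, z = 1.0, 1.0, 1.0
    out = []
    for _ in range(n):
        dx, dy, dz = s * (y - x), x * (r - z) - y, x * y - b * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        out.append(x)
    return out

def haar_level(x):
    # One level of a Haar-style transform: pairwise averages give the
    # trend (smooth part), pairwise half-differences give the detail
    # (fluctuation) at that scale; len(x) must be even
    approx = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return approx, detail
```

Repeating `haar_level` on the `approx` output yields progressively coarser trends, which is the multiscale separation the abstract describes; the sum and difference of `approx` and `detail` reconstruct the original samples exactly.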


2021 ◽  
Vol 11 (9) ◽  
pp. 3876
Author(s):  
Weiming Mai ◽  
Raymond S. T. Lee

Chart patterns are significant for financial market behavior analysis. Many approaches have been proposed to detect specific patterns in financial time series data; most can be categorized as distance-based or training-based. In this paper, we apply a trainable continuous Hopfield Neural Network to financial time series pattern matching. The Perceptually Important Points (PIP) segmentation method is used as a data preprocessing step to reduce fluctuation. We conducted synthetic data experiments on both high-noise and low-noise data. The results show that our proposed method outperforms the Template-Based (TB) and Euclidean Distance (ED) methods and has an advantage over Dynamic Time Warping (DTW) in terms of processing time. This indicates that the Hopfield network has a potential advantage over other distance-based matching methods.
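The PIP preprocessing step mentioned here has a standard greedy form: start from the two endpoints, then repeatedly add the point farthest from the chord between its neighboring selected points. The sketch below is a generic version using vertical distance (one of several distance measures in the PIP literature), not the authors' implementation:

```python
def pip_reduce(series, n_points):
    # Reduce a series to the indices of its n Perceptually Important
    # Points, measured by vertical distance to the chord between the
    # two neighboring points already selected
    idx = [0, len(series) - 1]            # always keep the endpoints
    while len(idx) < n_points:
        best_i, best_d = None, -1.0
        for a, b in zip(idx, idx[1:]):    # scan each current segment
            for i in range(a + 1, b):
                # linear interpolation between the segment endpoints
                interp = series[a] + (series[b] - series[a]) * (i - a) / (b - a)
                d = abs(series[i] - interp)
                if d > best_d:
                    best_i, best_d = i, d
        idx.append(best_i)
        idx.sort()
    return idx
```

Reducing both the template pattern and the candidate window to the same number of PIPs is what makes a fixed-size matcher, such as a Hopfield network, applicable to segments of varying length.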


2012 ◽  
Vol 2012 ◽  
pp. 1-21 ◽  
Author(s):  
Md. Rabiul Islam ◽  
Md. Rashed-Al-Mahfuz ◽  
Shamim Ahmad ◽  
Md. Khademul Islam Molla

This paper presents a subband approach to financial time series prediction. Multivariate empirical mode decomposition (MEMD) is employed for the joint multiband representation of multichannel financial time series. An autoregressive moving average (ARMA) model is used to predict each individual subband of a time series, and the predicted subband signals are then summed to obtain the overall prediction. The ARMA model works better for stationary signals; with the multiband representation, each subband becomes a band-limited (narrowband) signal, and hence better prediction is achieved. The performance of the proposed MEMD-ARMA model is compared with classical EMD, the discrete wavelet transform (DWT), and a full-band ARMA model in terms of the signal-to-noise ratio (SNR) and mean square error (MSE) between the original and predicted time series. The simulation results show that the MEMD-ARMA-based method performs better than the other methods.
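The decompose-predict-sum structure of this approach can be sketched without the full MEMD machinery. The code below is a toy illustration under loud assumptions: the "subbands" are just a moving-average trend and its residual rather than MEMD intrinsic mode functions, and each band is forecast with a least-squares AR(1) rather than a fitted ARMA model.

```python
def ar1_forecast(x):
    # One-step AR(1) forecast: phi estimated by lag-1 least squares
    num = sum(a * b for a, b in zip(x[1:], x[:-1]))
    den = sum(a * a for a in x[:-1]) or 1.0   # guard against zero band
    phi = num / den
    return phi * x[-1]

def subband_forecast(x, window=3):
    # Split into a moving-average trend band and a residual band,
    # forecast each band separately, then sum the per-band forecasts
    trend = [sum(x[max(0, i - window + 1):i + 1])
             / len(x[max(0, i - window + 1):i + 1])
             for i in range(len(x))]
    resid = [a - t for a, t in zip(x, trend)]
    return ar1_forecast(trend) + ar1_forecast(resid)
```

The point the abstract makes carries over: each band is closer to stationary (the trend is smooth, the residual is roughly mean-zero), so a simple linear predictor per band can outperform the same predictor applied to the raw full-band series.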


2005 ◽  
Vol 50 (01) ◽  
pp. 1-8 ◽  
Author(s):  
PETER M. ROBINSON

A great deal of time series data is recorded on economic and financial variables. Statistical modeling of such data is now very well developed and has applications in forecasting. We review a variety of statistical models from the viewpoint of "memory", or strength of dependence across time, which is a helpful discriminator between different phenomena of interest. Both linear and nonlinear models are discussed.
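The "memory" notion in this review is usually gauged empirically through the sample autocorrelation function: short-memory series have autocorrelations that die out quickly, long-memory series decay slowly. A minimal diagnostic, shown here as a generic sketch rather than anything from the review itself:

```python
def autocorr(x, lag):
    # Sample autocorrelation at a given lag: a crude gauge of how
    # strongly a series depends on its own past ("memory")
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) or 1.0  # guard constant series
    return sum((x[t] - m) * (x[t - lag] - m) for t in range(lag, n)) / var
```

Plotting `autocorr(x, k)` over increasing `k` and checking how fast it decays toward zero is the usual first step in deciding between short- and long-memory model classes.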

