Robust Lag Weighted Lasso for Time Series Model

2021 · Vol 19 (1) · pp. 2-15
Author(s): Tahir R. Dikheel, Alaa Q. Yaseen

The lag-weighted lasso was introduced to deal with lag effects when identifying the true model in time series. This method depends on weights that reflect both the coefficient size and the lag effects. However, the lag-weighted lasso is not robust. To overcome this problem, we propose robust lag-weighted lasso methods. Both a simulation study and a real data example show that the proposed methods outperform the existing methods.
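For intuition, here is a minimal sketch of a lag-weighted (adaptive) lasso of the kind described above, in Python with scikit-learn. The weight form w_j = |beta_ols_j|^(-gamma) * (lag_j + 1)^delta is an illustrative assumption rather than the paper's exact choice, and the per-coefficient weights are imposed via the standard column-rescaling trick:

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

def lag_weighted_lasso(X, y, lags, lam=0.1, gamma=1.0, delta=1.0):
    """Weighted-lasso sketch: weights grow as the initial coefficients
    shrink and as the lag order rises, so distant, weak lags are
    penalized most.  (Hypothetical weight form, for illustration.)"""
    beta_init = LinearRegression().fit(X, y).coef_
    w = np.abs(beta_init) ** (-gamma) * (np.asarray(lags) + 1.0) ** delta
    # Column rescaling: fitting a plain lasso on X_j / w_j and mapping
    # back beta_j = beta_tilde_j / w_j solves
    #   min ||y - X beta||^2 + lam * sum_j w_j * |beta_j|.
    fit = Lasso(alpha=lam).fit(X / w, y)
    return fit.coef_ / w
```

A robust variant in the spirit of the paper would replace the ordinary least-squares initial fit with a robust estimator (for example sklearn.linear_model.HuberRegressor) so that outliers cannot distort the weights.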

Algorithms · 2020 · Vol 13 (4) · pp. 95
Author(s): Johannes Stübinger, Katharina Adler

This paper develops the generalized causality algorithm and applies it to a multitude of data sets from the fields of economics and finance. Specifically, our parameter-free algorithm efficiently determines the optimal non-linear mapping and identifies varying lead-lag effects between two given time series. This procedure allows an elastic adjustment of the time axis to find similar but phase-shifted sequences; structural breaks in their relationship are also captured. A large-scale simulation study validates the algorithm's outperformance in the vast majority of parameter constellations in terms of efficiency, robustness, and feasibility. Finally, the presented methodology is applied to real data from the areas of macroeconomics, finance, and metals. The highest similarity is shown by the pairs gross domestic product and consumer price index (macroeconomics), the S&P 500 index and the Deutscher Aktienindex (finance), and gold and silver (metals). In addition, the algorithm makes full use of its flexibility and identifies various structural breaks and regime patterns over time, which are (partly) well documented in the literature.
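The generalized causality algorithm itself is not reproduced here; as a rough illustration of the elastic time-axis idea, the following Python sketch uses plain dynamic time warping to align two series and reads an average lead-lag off the optimal warping path. The offset-averaging heuristic is an assumption for illustration, not the paper's procedure:

```python
import numpy as np

def dtw_lead_lag(x, y):
    """Elastically align x and y with dynamic time warping and return
    the mean index offset along the optimal path as a rough lead-lag
    estimate (positive values suggest x leads y)."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    # Backtrack the optimal warping path from (n, m) to (0, 0).
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return float(np.mean([jj - ii for ii, jj in path]))
```

On standardized series, dtw_lead_lag(x, y) > 0 would then indicate that x tends to lead y under this heuristic.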


2017 · Vol 2017 · pp. 1-8
Author(s): Ali Alkenani, Tahir R. Dikheel

The elimination of insignificant predictors and the combination of predictors with indistinguishable coefficients are the two issues raised in searching for the true model. Pairwise Absolute Clustering and Sparsity (PACS) achieves both goals. Unfortunately, PACS is sensitive to outliers due to its dependency on the least-squares loss function, which is known to be very sensitive to unusual data. In this article, the sensitivity of PACS to outliers has been studied. Robust versions of PACS (RPACS) have been proposed by replacing the least-squares loss and the nonrobust weights in PACS with MM-estimation and robust weights based on robust correlations instead of the Pearson correlation, respectively. A simulation study and two real data applications have been used to assess the effectiveness of the proposed methods.
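The PACS penalty couples pairwise terms |beta_j - beta_k| and |beta_j + beta_k| with data-driven weights. A hedged sketch of the robust-weight idea in Python/SciPy, where rank-based (Spearman) correlations stand in for the Pearson correlation; the power-of-correlation weight form is an illustrative assumption:

```python
import numpy as np
from scipy.stats import spearmanr

def robust_pacs_weights(X, gamma=1.0):
    """Build PACS-style pairwise weights from a rank-based correlation
    so that a few outlying rows cannot dominate the clustering penalty.
    Assumes X has at least three columns, so spearmanr returns a matrix.
    (Hypothetical weight form, for illustration.)"""
    rho, _ = spearmanr(X)
    p = X.shape[1]
    w_diff = np.zeros((p, p))  # weight on |beta_j - beta_k|
    w_sum = np.zeros((p, p))   # weight on |beta_j + beta_k|
    for j in range(p):
        for k in range(j + 1, p):
            # Strongly positively correlated predictors are pushed toward
            # equal coefficients; strongly negatively correlated ones
            # toward coefficients that cancel.
            w_diff[j, k] = max(rho[j, k], 0.0) ** gamma
            w_sum[j, k] = max(-rho[j, k], 0.0) ** gamma
    return w_diff, w_sum
```

The MM-estimation half of RPACS would then replace the least-squares loss in the penalized objective; that part is omitted here.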


Author(s): Haji A. Haji, Kusman Sadik, Agus Mohamad Soleh

A simulation study is used when real-world data are hard to find or time-consuming to gather; it involves generating data sets from a specific statistical model or by random sampling. Simulating the process is useful for testing theories and understanding the behavior of statistical methods. This study aimed to compare ARIMA and fuzzy time series (FTS) models in order to identify the best model for forecasting time series data, based on 100 replicates of series of length 100 generated from an ARIMA(1,0,1) model. Sixteen scenarios were used in this study, combining 4 error-variance values (0.5, 1, 3, 5) with 4 ARMA(1,1) parameter settings. Performance was evaluated using three metrics, mean absolute percentage error (MAPE), root mean squared error (RMSE), and bias, to determine the more appropriate method. The results of the study show the lowest bias for the Chen fuzzy time series model, whose error measures are smaller than those of the other models. The results also show that the Chen method is competitive with advanced forecasting techniques, providing better forecasting accuracy in all of the considered situations.
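One replicate of the described design can be sketched in Python with statsmodels; the train/forecast split and horizon h are assumptions, since the scoring setup is not spelled out above:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)

def one_replicate(phi=0.5, theta=0.3, sigma=1.0, n=100, h=10):
    """Generate one ARMA(1,1) path, fit ARIMA(1,0,1) to the first
    n - h points, and score the h-step forecasts with MAPE, RMSE,
    and bias."""
    proc = ArmaProcess(ar=[1, -phi], ma=[1, theta])  # statsmodels sign convention
    y = proc.generate_sample(nsample=n, scale=sigma, distrvs=rng.standard_normal)
    train, test = y[:-h], y[-h:]
    fc = ARIMA(train, order=(1, 0, 1)).fit().forecast(steps=h)
    err = test - fc
    mape = 100 * np.mean(np.abs(err / test))  # unstable if test values are near zero
    rmse = np.sqrt(np.mean(err ** 2))
    bias = np.mean(err)
    return mape, rmse, bias
```

Repeating one_replicate 100 times per scenario and averaging the three metrics reproduces the structure, though not the FTS side, of the comparison.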


2015 · Vol 26 (12) · pp. 1550137
Author(s): A. Q. Pei, J. Wang

A financial time series model is developed and investigated using the oriented percolation system (one of the statistical physics systems). The nonlinear and statistical behaviors of the return interval time series are studied for the proposed model and the real stock market by applying the visibility graph (VG) and multifractal detrended fluctuation analysis (MF-DFA). We investigate the fluctuation behaviors of the return intervals of the model for different parameter settings, and also compare these fluctuation patterns with those of the real financial data for different threshold values. The empirical research in this work exhibits the multifractal features of the corresponding financial time series. Further, the VGs derived from both the simulated data and the real data exhibit small-world and hierarchical structure, high clustering, and power-law tails in their degree distributions.
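The natural visibility graph construction behind the VG analysis is simple to state: two time points are linked when the straight line between them passes above every intermediate observation. A naive Python sketch (quadratic in the number of pairs, not the paper's implementation):

```python
import numpy as np

def visibility_graph(series):
    """Natural visibility graph: nodes are time indices, and (a, b) is
    an edge when every intermediate point lies strictly below the line
    joining (a, series[a]) and (b, series[b])."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            ya, yb = series[a], series[b]
            if all(series[c] < ya + (yb - ya) * (c - a) / (b - a)
                   for c in range(a + 1, b)):
                edges.add((a, b))
    return edges
```

The degree sequence of the returned edge set is what one would inspect for the power-law tails reported above.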


2007 · Vol 46 (7) · pp. 1125-1129
Author(s): Alexander Gluhovsky, Ernest Agee

Linear parametric models are commonly assumed and used for unknown data-generating mechanisms. This study demonstrates the value of inferring statistics of meteorological and climatological time series by using a computer-intensive subsampling method that allows one to avoid time series analysis anchored in parametric models with imposed perceived physical assumptions. A first-order autoregressive model, typically adopted as the default model for correlated time series in climate studies, has been selected and altered with a nonlinear component to provide insight into possible errors in estimation due to nonlinearities in the real data-generating mechanism. The nonlinearity undetected by basic diagnostic procedures is shown to invalidate statistical inference based on the linear model, whereas the inference derived through subsampling remains valid. It is argued that subsampling and other resampling methods are preferable in complex dependent-data situations that are typical for atmospheric and climatic series when the real data-generating mechanism is unknown.
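The subsampling inference referred to here follows the Politis-Romano scheme: recompute the statistic on every overlapping block of length b and use the empirical quantiles of the centred, rescaled subsample statistics, with no parametric model assumed. A minimal Python sketch for the mean of a stationary series (choosing the block length b is left to the caller):

```python
import numpy as np

def subsampling_ci(x, b, alpha=0.05):
    """Politis-Romano subsampling confidence interval for the mean of
    a stationary series: the statistic is recomputed on all n - b + 1
    overlapping blocks of length b, and the subsample roots
    sqrt(b) * (theta_block - theta_hat) approximate the sampling
    distribution of sqrt(n) * (theta_hat - theta)."""
    n = len(x)
    theta = x.mean()
    subs = np.array([x[i:i + b].mean() for i in range(n - b + 1)])
    t = np.sqrt(b) * (subs - theta)
    lo, hi = np.quantile(t, [alpha / 2, 1 - alpha / 2])
    # Invert the root at the full-sample rate sqrt(n).
    return theta - hi / np.sqrt(n), theta - lo / np.sqrt(n)
```

The same recipe applies to statistics other than the mean, which is what makes the method attractive when the data-generating mechanism is unknown.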


2019 · Vol 67 (1) · pp. 21-26
Author(s): Zakir Hossain, Atikur Rahman, Moyazzem Hossain, Jamil Hasan Karami

In time series analysis, over-differencing commonly occurs when trying to make data stationary; however, differencing more than necessary to ensure stationarity is not a good idea. In this paper, the effect of over-differencing has been investigated via a simulation study, observing how far the fitted model is from the true one. Simulation results show that, because of over-differencing, the fitted model is very different and far from the true model in most of the scenarios considered in this study. In practice, it may be worthwhile to consider a suitable transformation of the time series data as well as differencing to make it stationary. Both transformation and differencing are used to make a non-stationary time series of average monthly house prices stationary. We then analyse the data and make predictions of future values.
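The over-differencing signature is easy to reproduce: differencing an already stationary AR(1) series y_t = phi * y_{t-1} + e_t gives dy_t = phi * dy_{t-1} + e_t - e_{t-1}, an ARMA(1,1) with a non-invertible moving-average root, so the fitted MA coefficient is driven toward -1. A small Python illustration with statsmodels (parameter values are arbitrary):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)

# Simulate a stationary AR(1): y_t = 0.6 * y_{t-1} + e_t.
n, phi = 500, 0.6
y = np.empty(n)
y[0] = rng.standard_normal()
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.standard_normal()

# Correct specification: no differencing needed.
fit_level = ARIMA(y, order=(1, 0, 0)).fit()

# Over-differenced specification: the differenced series is ARMA(1,1)
# with a unit MA root, so the estimated MA(1) coefficient sits near -1,
# the classic over-differencing signature.
fit_diff = ARIMA(y, order=(1, 1, 1)).fit()
print(fit_level.params)
print(fit_diff.params)
```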


2021 · Vol 13 (15) · pp. 2906
Author(s): Anurag Kulshrestha, Ling Chang, Alfred Stein

Sinkholes are sudden disasters that are usually small in size and occur at unexpected locations. They may cause serious damage to life and property. Sinkhole-prone areas can be monitored using Interferometric Synthetic Aperture Radar (InSAR) time series. Defining a pattern using InSAR-derived spatio-temporal deformations, this study presents a sinkhole pattern detector, called the Sinkhole Scanner. The Sinkhole Scanner uses a spatio-temporal mathematical model, a two-dimensional time-evolving Gaussian function, as a kernel that moves over the study area in a sliding-window fashion. The scanner fits the model to the deformation time series of Constantly Coherent Scatterers (CCS) intersected by the window and returns the posterior variance as a measure of goodness of fit. In this way, the scanner searches a sinkhole-prone area for subsiding regions that resemble sinkhole shapes. It is designed to detect large sinkholes with high efficiency and small sinkholes with lower efficiency. It was tested at four different spatial scales, on both simulated and real deformation data. The real data were obtained from Sentinel-1A SLC data in IW mode over Ireland, where a large sinkhole occurred on 24 September 2018. The Sinkhole Scanner identified a pattern of low posterior variance zones consistent with the simulated set. For the real data, it identified significantly low posterior variance zones near the sinkhole area, with the lowest value being 51.1% of the maximum. The results of the Sinkhole Scanner over the real sinkhole site were compared with Multiple Hypothesis Testing (MHT), which identifies breakpoint and Heaviside temporal anomalies in the deformation time series of CCS. MHT identified a high likelihood of Heaviside anomalies in the deformation time series of CCS near the sinkhole site about 10 epochs before the sinkhole occurred. We show that the Sinkhole Scanner is efficient in monitoring a large area and searching for sinkholes, and that MHT can be applied subsequently to identify temporal anomalies in the vicinity of areas detected by the Sinkhole Scanner. Future research may address other sinkhole shapes, and the underlying stochastic model may be adjusted. We conclude that the Sinkhole Scanner should be applied at different levels of scale to converge on potential sinkhole centers.
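As a rough sketch of the window-level computation, assuming Python/SciPy: a two-dimensional Gaussian subsidence bowl whose depth grows linearly in time is fitted to the CCS deformation samples inside one window, and the residual variance serves as the goodness-of-fit score. The paper derives a posterior variance from its own stochastic model; the kernel form and least-squares fit below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def bowl(coords, A, x0, y0, s):
    """Hypothetical kernel: a 2-D Gaussian subsidence bowl whose depth
    grows linearly in time, d(x, y, t) = -A * t * exp(-r^2 / (2 s^2))."""
    x, y, t = coords
    r2 = (x - x0) ** 2 + (y - y0) ** 2
    return -A * t * np.exp(-r2 / (2.0 * s ** 2))

def window_score(x, y, t, d):
    """Fit the bowl to one window's CCS samples (positions x, y, epochs
    t, deformations d) and return the residual variance; low scores
    flag sinkhole-like subsidence."""
    p0 = [1.0, x.mean(), y.mean(), np.ptp(x) / 4.0 + 1e-6]
    try:
        popt, _ = curve_fit(bowl, (x, y, t), d, p0=p0, maxfev=2000)
        return float(np.var(d - bowl((x, y, t), *popt)))
    except RuntimeError:  # no convergence: treat as a poor fit
        return np.inf
```

Sliding such a window over the study area and mapping the scores is the essence of the scanner; the multi-scale behaviour comes from repeating the scan with different window sizes.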


2013 · Vol 2013 · pp. 1-11
Author(s): Francesca Ieva, Anna Maria Paganoni, Paolo Zanini

Atrial Fibrillation (AF) is the most common cardiac arrhythmia. It naturally tends to become a chronic condition, and chronic Atrial Fibrillation leads to an increased risk of death. Studying the electrocardiographic signal, and in particular the tachogram series, is a common and effective way to investigate the presence of Atrial Fibrillation and to detect when a single event starts and ends. This work presents a new statistical method for identifying Atrial Fibrillation events, based on identifying the orders of the ARIMA models used to describe the RR time series that characterize the different phases of AF (pre-, during, and post-AF). A simulation study is carried out in order to assess the performance of the proposed method. Moreover, an application to real data from patients affected by Atrial Fibrillation is presented and discussed. Since the proposed method looks at structural changes in the ARIMA models fitted to the RR time series of the AF event relative to the pre- and post-AF phases, it can identify the starting and ending points of an AF event even when the AF follows or precedes irregular heartbeat time slots.
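A hedged sketch of the order-identification idea in Python with statsmodels: select an ARIMA(p, 0, q) order by AIC in a sliding window over the RR series and flag positions where the selected order changes. Window length, step size, and the AIC criterion are illustrative assumptions, not the paper's exact procedure:

```python
import warnings
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def best_order(rr, max_p=2, max_q=2):
    """Pick the (p, q) of an ARIMA(p, 0, q) fit to one RR segment by AIC."""
    best, best_aic = (0, 0), np.inf
    for p in range(max_p + 1):
        for q in range(max_q + 1):
            try:
                with warnings.catch_warnings():
                    warnings.simplefilter("ignore")
                    aic = ARIMA(rr, order=(p, 0, q)).fit().aic
            except Exception:
                continue
            if aic < best_aic:
                best, best_aic = (p, q), aic
    return best

def order_change_points(rr, win=120, step=30):
    """Slide a window along the RR series and flag offsets where the
    selected ARIMA order changes, mimicking the structural changes
    between the pre-, during-, and post-AF phases."""
    orders = [best_order(rr[i:i + win])
              for i in range(0, len(rr) - win + 1, step)]
    return [i * step for i in range(1, len(orders)) if orders[i] != orders[i - 1]]
```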


2018 · pp. 49-68
Author(s): M. E. Mamonov

Our analysis documents that the existence of hidden "holes" in the capital of not-yet-failed banks, while creating intertemporal pressure on the actual level of capital, leads to a change in the maturity of the loans supplied rather than a contraction of their volume. Long-term loans decrease, whereas short-term loans rise, and, most remarkably, by approximately the same amounts. Ordinarily, the longer the maturity of loans, the higher the credit risk and, thus, the more loan loss reserves (LLP) banks are forced to create, increasing the pressure on capital. Banks that already hide "holes" in their capital but have not yet faced license withdrawal therefore have strong incentives to shorten the maturity of the loans they supply. On the one hand, this raises the turnover of LLP and gives more flexibility in capital management; on the other hand, it speeds up the shifting of attracted deposits into loans to related parties in domestic or foreign jurisdictions. This enlarges the potential size of the ex post revealed "hole" in the capital and therefore suggests that not every loan is good for the economy: excessive short-term and insufficient long-term lending can become a source of future losses.

