Multivariate Fractional Components Analysis

Author(s):  
Tobias Hartl ◽  
Roland Jucknewitz

Abstract We propose a setup for fractionally cointegrated time series which is formulated in terms of latent integrated and short-memory components. It accommodates nonstationary processes with different fractional orders and cointegration of different strengths and is applicable in high-dimensional settings. In an application to realized covariance matrices, we find that orthogonal short- and long-memory components provide a reasonable fit and competitive out-of-sample performance compared with several competing methods.
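To make the latent-components setup concrete, the following is a minimal sketch (not the authors' code) that simulates a bivariate series sharing one fractionally integrated factor of order d plus orthogonal short-memory AR(1) components; the memory parameter, loadings, and AR coefficient are illustrative choices.

```python
# Hedged sketch: one latent long-memory factor plus orthogonal short-memory noise.
# Parameter values (d, loadings, AR coefficient) are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
T, d = 500, 0.7                        # sample size, fractional integration order

# Type-II fractionally integrated factor: x_t = sum_j pi_j(d) * eps_{t-j}
pi = np.ones(T)
for j in range(1, T):
    pi[j] = pi[j - 1] * (j - 1 + d) / j
eps = rng.standard_normal(T)
x = np.convolve(eps, pi)[:T]           # latent long-memory component

# Orthogonal short-memory AR(1) components, one per observed series
lam = np.array([1.0, 0.6])             # hypothetical factor loadings
u = np.zeros((T, 2))
e = rng.standard_normal((T, 2))
for t in range(1, T):
    u[t] = 0.4 * u[t - 1] + e[t]

# Observed series: both are fractionally integrated, and the combination
# 0.6*y[:, 0] - 1.0*y[:, 1] removes x, leaving only short memory (fractional cointegration).
y = np.outer(x, lam) + u
```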

2012 ◽  
Vol 24 (11) ◽  
pp. 2994-3024 ◽  
Author(s):  
Varun Raj Kompella ◽  
Matthew Luciw ◽  
Jürgen Schmidhuber

We introduce here an incremental version of slow feature analysis (IncSFA), combining candid covariance-free incremental principal components analysis (CCIPCA) and covariance-free incremental minor components analysis (CIMCA). IncSFA's feature updating complexity is linear with respect to the input dimensionality, while batch SFA's (BSFA) updating complexity is cubic. IncSFA does not need to store, or even compute, any covariance matrices. The drawback to IncSFA is data efficiency: it does not use each data point as effectively as BSFA. But IncSFA allows SFA to be tractably applied, with just a few parameters, directly on high-dimensional input streams (e.g., visual input of an autonomous agent), while BSFA has to resort to hierarchical receptive-field-based architectures when the input dimension is too high. Further, IncSFA's updates have simple Hebbian and anti-Hebbian forms, extending the biological plausibility of SFA. Experimental results show IncSFA learns the same set of features as BSFA and can handle a few cases where BSFA fails.
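As a hedged illustration of the covariance-free building block, here is a sketch of a CCIPCA-style update (not the authors' implementation): the top-k principal components are estimated one sample at a time, with deflation between components, and no covariance matrix is ever formed or stored. The amnesic weighting below is an illustrative simplification.

```python
# Sketch of a CCIPCA-style incremental PCA update with deflation.
# Step-size weighting is a simplified, illustrative choice.
import numpy as np

def ccipca_update(V, x, t, amnesic=2.0):
    """One incremental step. V: (k, dim) current component estimates, x: new sample."""
    u = x.astype(float).copy()
    w_old = max(t - 1.0 - amnesic, 0.0) / t
    w_new = 1.0 - w_old
    for i in range(V.shape[0]):
        # Covariance-free update: pull V[i] toward (x x^T) V[i] without forming x x^T
        V[i] = w_old * V[i] + w_new * (u @ V[i]) / np.linalg.norm(V[i]) * u
        vn = V[i] / np.linalg.norm(V[i])
        u = u - (u @ vn) * vn            # deflate: remove component i from the sample
    return V

# Usage on synthetic streaming data
rng = np.random.default_rng(1)
dim, k = 20, 3
A = rng.standard_normal((dim, dim))
V = 0.1 * rng.standard_normal((k, dim))
for t, x in enumerate(rng.standard_normal((2000, dim)) @ A, start=1):
    V = ccipca_update(V, x, t)
```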


2019 ◽  
Vol 3 (1) ◽  
pp. 243-256
Author(s):  
Peter M. Robinson

Abstract We discuss developments and future prospects for statistical modeling and inference for spatial data that have long memory. While a number of contributions have been made, the literature is relatively small and scattered, compared to the literatures on long memory time series on the one hand, and spatial data with short memory on the other. Thus, over several topics, our discussions frequently begin by surveying relevant work in these areas that might be extended in a long memory spatial setting.


Bernoulli ◽  
2017 ◽  
Vol 23 (4A) ◽  
pp. 2299-2329 ◽  
Author(s):  
Ansgar Steland ◽  
Rainer von Sachs

2002 ◽  
Vol 18 (2) ◽  
pp. 278-296 ◽  
Author(s):  
Katsuto Tanaka

The measurement error problem that we consider in this paper is concerned with the situation where time series data of various kinds—short memory, long memory, and random walk processes—are contaminated by white noise. We suggest a unified approach to testing for the existence of such noise. It is found that the power of our test crucially depends on the underlying process.
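The following sketch is illustrative only and is not the paper's test statistic: for the random walk case, added white noise induces a strictly negative lag-1 autocorrelation in the first differences, so its sample value gives a quick moment-based diagnostic for contamination.

```python
# Illustrative diagnostic (not the paper's test): a random walk observed with
# white noise has first differences with negative lag-1 autocorrelation.
import numpy as np

rng = np.random.default_rng(2)
T = 2000
walk = np.cumsum(rng.standard_normal(T))          # latent random walk
noise = 0.8 * rng.standard_normal(T)              # white-noise measurement error
y = walk + noise

d = np.diff(y)
d = d - d.mean()
rho1 = (d[1:] @ d[:-1]) / (d @ d)                 # sample lag-1 autocorrelation
# Population value: -sigma_e^2 / (sigma_eta^2 + 2*sigma_e^2) = -0.64/2.28, about -0.28
print(f"lag-1 autocorrelation of differences: {rho1:.3f}")
```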


2021 ◽  
Vol 2021 (026) ◽  
pp. 1-52
Author(s):  
Dong Hwan Oh ◽  
Andrew J. Patton

This paper proposes a dynamic multi-factor copula for use in high dimensional time series applications. A novel feature of our model is that the assignment of individual variables to groups is estimated from the data, rather than being pre-assigned using SIC industry codes, market capitalization ranks, or other ad hoc methods. We adapt the k-means clustering algorithm for use in our application and show that it has excellent finite-sample properties. Applying the new model to returns on 110 US equities, we find around 20 clusters to be optimal. In out-of-sample forecasts, we find that a model with as few as five estimated clusters significantly outperforms an otherwise identical model with 21 clusters formed using two-digit SIC codes.
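As a hedged sketch of the clustering idea (not the paper's estimator), the example below groups assets by k-means on features of their joint dependence, here each asset's vector of rank correlations with all other assets, as a data-driven stand-in for pre-assigned SIC-code groups. The synthetic data, feature choice, and cluster count are illustrative.

```python
# Sketch: data-driven group assignment via k-means on dependence features.
# Synthetic returns, feature construction, and K are illustrative assumptions.
import numpy as np
from scipy.stats import rankdata
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
T, N, K = 1000, 30, 5

# Synthetic returns with a block dependence structure (K true groups)
groups_true = rng.integers(0, K, size=N)
common = rng.standard_normal((T, K))
returns = common[:, groups_true] + rng.standard_normal((T, N))

# Rank-transform (a crude proxy for copula data), then build dependence features:
# each row of `features` is one asset's correlation profile with all assets.
U = np.apply_along_axis(rankdata, 0, returns) / (T + 1)
features = np.corrcoef(U, rowvar=False)

labels = KMeans(n_clusters=K, n_init=10, random_state=0).fit_predict(features)
```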

