Multioutput Convolution Spectral Mixture for Gaussian Processes

Author(s):  
Kai Chen ◽  
Twan van Laarhoven ◽  
Perry Groot ◽  
Jinsong Chen ◽  
Elena Marchiori
2021 ◽  
Author(s):  
Kai Chen ◽  
Twan van Laarhoven ◽  
Elena Marchiori

Abstract: Long-term forecasting involves predicting a horizon far ahead of the last observation. It is a problem of high practical relevance, for instance for companies deciding on expensive long-term investments. Despite the recent progress and success of Gaussian processes (GPs) based on spectral mixture kernels, long-term forecasting remains challenging for these kernels because their covariance decays exponentially at large horizons. This is mainly due to their use of a mixture of Gaussians to model spectral densities. Characteristics of the signal that are important for long-term forecasting can be unravelled by investigating the distribution of the Fourier coefficients of (the training part of) the signal, which is non-smooth, heavy-tailed, sparse, and skewed. The heavy tail and skewness of such distributions in the spectral domain make it possible to capture long-range covariance of the signal in the time domain. Motivated by these observations, we propose to model spectral densities using a skewed Laplace spectral mixture (SLSM), owing to the skewness of its peaks and its sparsity, non-smoothness, and heavy-tail characteristics. Applying the inverse Fourier transform to this spectral density yields a new GP kernel for long-term forecasting. In addition, we adapt the lottery ticket method, originally developed to prune weights of a neural network, to GPs in order to select the number of kernel components automatically. Results of extensive experiments, including on a multivariate time series, show the beneficial effect of the proposed SLSM kernel for long-term extrapolation and its robustness to the choice of the number of mixture components.
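The core contrast the abstract draws can be sketched numerically: a Gaussian spectral density gives a covariance whose envelope decays as exp(-2π²σ²τ²), while a Laplace spectral density gives a Cauchy-like, polynomially decaying envelope 1/(1 + (2πbτ)²) that retains covariance at long horizons. The sketch below uses the symmetric (non-skewed) Laplace case for simplicity; the skewness term of the actual SLSM kernel is omitted, and all parameter values are illustrative.

```python
import numpy as np

def sm_kernel(tau, w, mu, sigma):
    # Spectral mixture kernel (Gaussian spectral density):
    # envelope exp(-2 pi^2 sigma^2 tau^2) decays very fast at large tau
    k = np.zeros_like(tau, dtype=float)
    for wq, mq, sq in zip(w, mu, sigma):
        k += wq * np.exp(-2 * np.pi**2 * sq**2 * tau**2) * np.cos(2 * np.pi * mq * tau)
    return k

def laplace_sm_kernel(tau, w, mu, b):
    # Symmetric Laplace spectral density -> Cauchy-like covariance:
    # envelope 1 / (1 + (2 pi b tau)^2) has a heavy, polynomial tail
    k = np.zeros_like(tau, dtype=float)
    for wq, mq, bq in zip(w, mu, b):
        k += wq * np.cos(2 * np.pi * mq * tau) / (1 + (2 * np.pi * bq * tau)**2)
    return k

tau = np.array([0.0, 5.0, 50.0])
k_gauss = sm_kernel(tau, w=[1.0], mu=[0.1], sigma=[0.05])
k_lap = laplace_sm_kernel(tau, w=[1.0], mu=[0.1], b=[0.05])
# at tau = 50 the Gaussian-density kernel is numerically zero,
# while the Laplace-density kernel still carries noticeable covariance
```

Both kernels equal the mixture weight at τ = 0; the difference only appears in the tail, which is exactly the regime that matters for long-term extrapolation.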


2020 ◽  
Vol 31 (12) ◽  
pp. 5613-5623
Author(s):  
Kai Chen ◽  
Twan van Laarhoven ◽  
Perry Groot ◽  
Jinsong Chen ◽  
Elena Marchiori

Multi-task Gaussian processes (MTGPs) are a powerful approach for modeling structured dependencies among multiple tasks, and researchers have enhanced the approach in various ways. Current MTGP methods, however, cannot model nonlinear task correlations in a general way. In this paper we address this problem. We focus on spectral mixture (SM) kernels and propose an enhancement of this type of kernel, called the multi-task generalized convolution spectral mixture (MT-GCSM) kernel. The MT-GCSM kernel can model nonlinear task correlations and mixture dependencies, including time and phase delays, not only between different tasks but also within a task at the spectral-mixture level. Each task in MT-GCSM has its own generalized convolution spectral mixture (GCSM) kernel with a different number of convolution structures, and all spectral mixtures from different tasks are dependent. Furthermore, the proposed kernel uses inner and outer full cross-convolution between base spectral mixtures, so that the base spectral mixtures of the tasks need not be aligned. Extensive experiments on synthetic and real-life datasets illustrate the difference between MT-GCSM and other kernels as well as the practical effectiveness of MT-GCSM.
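The idea of coupling base spectral mixtures across tasks can be illustrated numerically. The sketch below is not the paper's closed-form construction: it simply forms a cross spectral density as the geometric mean of two tasks' Gaussian spectral-mixture components (with deliberately misaligned frequency means) and recovers a cross-covariance by a numerical inverse cosine transform. All names and parameter values are illustrative assumptions.

```python
import numpy as np

def gaussian_sd(s, mu, sigma):
    # Gaussian spectral density of one spectral-mixture component (unnormalized)
    return np.exp(-0.5 * (s - mu) ** 2 / sigma ** 2)

def inv_cosine_transform(S, s, tau):
    # k(tau) = integral of S(s) cos(2 pi s tau) ds, approximated on the grid
    ds = s[1] - s[0]
    return np.array([np.sum(S * np.cos(2 * np.pi * s * t)) * ds for t in tau])

s = np.linspace(0.0, 2.0, 4001)            # frequency grid
S_i = gaussian_sd(s, mu=0.30, sigma=0.05)  # base component of task i
S_j = gaussian_sd(s, mu=0.35, sigma=0.08)  # base component of task j (misaligned)

# cross spectral density: geometric mean couples the two components
# even though their frequency peaks are not aligned
S_ij = np.sqrt(S_i * S_j)

tau = np.linspace(0.0, 10.0, 201)
k_ii = inv_cosine_transform(S_i, s, tau)
k_jj = inv_cosine_transform(S_j, s, tau)
k_ij = inv_cosine_transform(S_ij, s, tau)
```

By the Cauchy-Schwarz inequality the cross-covariance at lag zero is bounded by the geometric mean of the two auto-covariances, so the construction yields a valid coupling strength between misaligned components.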

