Online Learning with Radial Basis Function Networks
We investigate the benefits of feature selection, nonlinear modelling and online learning when forecasting financial time series. We consider the sequential and continual learning sub-genres of online learning. Our experiments show that online transfer learning offers a benefit beyond the sequential updating of recursive least-squares models. We show that feature representation transfer via radial basis function networks, which use clustering algorithms to construct a kernel Gram matrix, is more beneficial than treating each training vector as a separate basis function, as occurs with kernel ridge regression. We also demonstrate quantitative procedures for determining the structure of the networks. Finally, we conduct experiments on the log returns of financial time series and show that these online transfer learning models outperform a random walk baseline, whereas the offline learning models struggle to do so.
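The core contrast in the abstract, between clustering-derived radial basis functions and per-sample basis functions, can be illustrated with a minimal sketch. This is not the authors' implementation; the data, the number of clusters, the Gaussian width `gamma` and the ridge penalty `lam` are all illustrative assumptions. The idea is that clustering compresses the training inputs into k centres, so the design matrix has k columns rather than one column per training vector, as kernel ridge regression would have.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy regression data: a hypothetical stand-in for log-return features.
X = rng.normal(size=(200, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

# Step 1: choose RBF centres by clustering the inputs, so the number of
# basis functions equals the number of clusters (k), not the number of
# training samples (200), unlike kernel ridge regression.
k = 10
centres = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).cluster_centers_

def rbf_design(X, centres, gamma=1.0):
    # Gram-style design matrix: one Gaussian basis function per centre.
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

# Step 2: ridge regression on the k-dimensional RBF feature space.
Phi = rbf_design(X, centres)
lam = 1e-3
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(k), Phi.T @ y)

pred = Phi @ w
```

In an online setting, the weight vector `w` could then be maintained by recursive least squares as new observations arrive, which is what makes the compact k-column representation attractive.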