Closed-loop ignition control using online learning of locally-tuned radial basis function networks

2021 ◽  
Author(s):  
Gabriel Borrageiro

We investigate the benefits of feature selection, nonlinear modelling and online learning when forecasting financial time series. We consider the sequential and continual learning sub-genres of online learning. Our experiments show that online transfer learning, in the form of radial basis function networks, adds value beyond the sequential updating of recursive least-squares models. We show that radial basis function networks, which use clustering algorithms to construct a kernel Gram matrix, are more beneficial than treating each training vector as a separate basis function, as occurs with kernel ridge regression. We also demonstrate quantitative procedures for determining the structure of the networks. Finally, we conduct experiments on the log returns of financial time series and show that the online learning models, particularly the radial basis function networks, outperform a random-walk baseline, whereas the offline learning models struggle to do so.
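The sequentially updated recursive least-squares model that the abstract uses as a point of comparison can be sketched as follows (a minimal, numpy-only illustration; `rls_update`, the forgetting factor and all parameter values are hypothetical choices, not the authors' implementation):

```python
import numpy as np

def rls_update(w, P, x, y, lam=0.99):
    """One recursive-least-squares step with forgetting factor lam:
    updates the weight vector w and inverse-covariance matrix P
    for a newly observed input/target pair (x, y)."""
    Px = P @ x
    k = Px / (lam + x @ Px)   # gain vector
    e = y - w @ x             # a-priori prediction error
    w = w + k * e
    P = (P - np.outer(k, Px)) / lam
    return w, P

# toy sequential regression: recover a fixed linear model from a stream
rng = np.random.default_rng(1)
w_true = np.array([0.5, -1.0, 2.0])
w = np.zeros(3)
P = 1e3 * np.eye(3)           # large initial P: uninformative prior
for _ in range(500):
    x = rng.normal(size=3)
    y = w_true @ x + 0.01 * rng.normal()
    w, P = rls_update(w, P, x, y)
```

After the stream is consumed, `w` sits close to `w_true`; the forgetting factor `lam < 1` is what lets the same update track slowly drifting coefficients in nonstationary data.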


1991 ◽  
Vol 3 (2) ◽  
pp. 246-257 ◽  
Author(s):  
J. Park ◽  
I. W. Sandberg

There have been several recent studies concerning feedforward networks and the problem of approximating arbitrary functionals of a finite number of real variables. Some of these studies deal with cases in which the hidden-layer nonlinearity is not a sigmoid. This was motivated by successful applications of feedforward networks with nonsigmoidal hidden-layer units. This paper reports on a related study of radial-basis-function (RBF) networks, and it is proved that RBF networks having one hidden layer are capable of universal approximation. Here the emphasis is on the case of typical RBF networks, and the results show that a certain class of RBF networks with the same smoothing factor in each kernel node is broad enough for universal approximation.
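The flavour of this result can be checked numerically (an illustration, not the proof; the target function and smoothing factor are arbitrary choices): a one-hidden-layer Gaussian RBF network with a single shared smoothing factor, fitted by least squares, drives the sup-norm error to a continuous target down as nodes are added.

```python
import numpy as np

def rbf_fit_error(n_nodes, sigma=0.15):
    """Least-squares fit of a one-hidden-layer Gaussian RBF net, with the
    same smoothing factor sigma in every node, to f(x) = x*sin(4x) on [0,1];
    returns the maximum absolute error on a fine grid."""
    x = np.linspace(0.0, 1.0, 400)
    f = x * np.sin(4.0 * x)
    centres = np.linspace(0.0, 1.0, n_nodes)
    Phi = np.exp(-(x[:, None] - centres[None, :]) ** 2 / (2.0 * sigma ** 2))
    w, *_ = np.linalg.lstsq(Phi, f, rcond=None)
    return np.abs(Phi @ w - f).max()

# sup-norm error shrinks as the number of hidden nodes grows
errs = [rbf_fit_error(n) for n in (4, 8, 16, 32)]
```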

