Online prediction of noisy time series: Dynamic adaptive sparse kernel recursive least squares from sparse and adaptive tracking perspective

2020 ◽  
Vol 91 ◽  
pp. 103547
Author(s):  
Kai Zhong ◽  
Junzhu Ma ◽  
Min Han

2016 ◽  
Vol 2016 ◽  
pp. 1-11 ◽  
Author(s):  
Chunyuan Zhang ◽  
Qingxin Zhu ◽  
Xinzheng Niu

By incorporating sparse kernel methods, least-squares temporal difference (LSTD) algorithms can construct the feature dictionary automatically and achieve better generalization. However, previous kernel-based LSTD algorithms do not consider regularization, and their sparsification processes are batch or offline, which hinders their application to online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning; (ii) L2 and L1 regularization, which can avoid overfitting and suppress the influence of noise; (iii) recursive least squares, which eliminates matrix-inversion operations and reduces computational complexity; (iv) a sliding-window approach, which avoids caching all history samples and reduces the computational cost; and (v) fixed-point subiteration with online pruning, which makes L1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms.
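The recursive least-squares idea in (iii) — updating the weights one sample at a time while propagating an inverse-correlation matrix, so no explicit matrix inversion is ever performed — can be sketched as follows. This is a minimal linear illustration only, not the paper's kernel algorithm; the function and parameter names (`rls_fit`, `lam`, `delta`) are my own.

```python
import numpy as np

def rls_fit(X, y, lam=0.99, delta=1.0):
    """Recursive least squares: process samples one at a time,
    maintaining P, an estimate of the inverse correlation matrix,
    so no matrix inversion is needed (illustrative sketch)."""
    d = X.shape[1]
    w = np.zeros(d)
    P = np.eye(d) / delta              # initial inverse-correlation matrix
    for x, t in zip(X, y):
        Px = P @ x
        k = Px / (lam + x @ Px)        # gain vector
        e = t - w @ x                  # a priori prediction error
        w = w + k * e                  # rank-one weight update
        P = (P - np.outer(k, Px)) / lam
    return w

# usage: recover a known linear model from streaming samples
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w                         # noiseless targets for illustration
w = rls_fit(X, y, lam=1.0, delta=1e-4)
```

With forgetting factor `lam=1.0` and noiseless data, the recursion converges to the batch least-squares solution; a `lam` slightly below 1 instead tracks slowly drifting parameters, which is the adaptive-tracking setting the abstracts discuss.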


2014 ◽  
Vol 1061-1062 ◽  
pp. 935-938
Author(s):  
Xin You Wang ◽  
Guo Fei Gao ◽  
Zhan Qu ◽  
Hai Feng Pu

Predictions of chaotic time series using the least-squares support vector machine (LS-SVM) were carried out and compared with traditional SVM variants. The results show that the prediction accuracy of LS-SVM is better than that of the traditional SVM, making LS-SVM more suitable for online time series prediction.
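The LS-SVM's practical appeal for online prediction is that it replaces the standard SVM's inequality-constrained quadratic program with a single linear system in the dual variables. A minimal sketch of that formulation, assuming an RBF kernel and hypothetical names (`lssvm_train`, `gamma`, `sigma`) not taken from the article:

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row-sample arrays A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=100.0, sigma=1.0):
    """Solve the LS-SVM dual: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    K = rbf(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma   # gamma trades fit against smoothness
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]              # alpha, bias b

def lssvm_predict(Xtr, alpha, b, Xte, sigma=1.0):
    """Predict f(x) = sum_i alpha_i k(x_i, x) + b."""
    return rbf(Xte, Xtr, sigma) @ alpha + b

# usage: fit a smooth 1-D signal and evaluate on the training inputs
X = np.linspace(0.0, 2.0 * np.pi, 50)[:, None]
y = np.sin(X).ravel()
alpha, b = lssvm_train(X, y, gamma=100.0, sigma=1.0)
pred = lssvm_predict(X, alpha, b, X, sigma=1.0)
```

Because training reduces to one `linalg.solve` call, the model is cheap to refit as new samples arrive, which is one reason the comparison above favors LS-SVM for online use.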

