Predictive Nyström method for kernel methods

2017 ◽ Vol 234 ◽ pp. 116-125 ◽ Author(s): Jiangang Wu, Lizhong Ding, Shizhong Liao

2021 ◽ Vol 2021 ◽ pp. 1-11 ◽ Author(s): Ling Wang, Hongqiao Wang, Guangyuan Fu

Extensions of kernel methods for class imbalance problems have been extensively studied. Although they handle nonlinear problems well, their high computation and memory costs severely limit their application to real-world imbalanced tasks. The Nyström method is an effective technique for scaling kernel methods. However, the standard Nyström method needs to sample a sufficiently large number of landmark points to ensure an accurate approximation, which seriously affects its efficiency. In this study, we propose a multi-Nyström method based on mixtures of Nyström approximations, which avoids blowing up the size of the subkernel matrix, while the optimization of the mixture weights is embedded into the model training process via multiple kernel learning (MKL) algorithms to yield a more accurate low-rank approximation. Moreover, we select subsets of landmark points according to the imbalance distribution to reduce the model's sensitivity to skewness. We also provide a kernel stability analysis of our method and show that the model solution error is bounded by the weighted approximation errors, which can help us improve the learning process. Extensive experiments on several large-scale datasets show that our method achieves higher classification accuracy and a dramatic speedup of MKL algorithms.
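To make the idea concrete, below is a minimal NumPy sketch of a mixture of Nyström approximations: instead of one large landmark set, several small subsets each produce a low-rank approximation, and the results are combined with nonnegative weights. The kernel choice (RBF with gamma=0.5), the three landmark subsets of 15 points each, and the uniform mixture weights are illustrative assumptions; in the method described above, the weights are learned jointly with the classifier via MKL, and the landmark subsets are sampled according to the imbalance distribution.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian RBF kernel matrix between rows of X and rows of Y."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystrom(X, landmarks, gamma=0.5, eps=1e-10):
    """Standard Nystrom approximation K ~= C W^+ C^T, where
    C = K(X, landmarks) and W = K(landmarks, landmarks)."""
    C = rbf_kernel(X, landmarks, gamma)
    W = rbf_kernel(landmarks, landmarks, gamma)
    return C @ np.linalg.pinv(W, rcond=eps) @ C.T

def multi_nystrom(X, landmark_sets, weights, gamma=0.5):
    """Weighted mixture of Nystrom approximations built from
    several small landmark subsets (weights fixed here as a
    placeholder; the paper learns them via MKL)."""
    K = np.zeros((len(X), len(X)))
    for L, mu in zip(landmark_sets, weights):
        K += mu * nystrom(X, L, gamma)
    return K

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))

# Three disjoint landmark subsets of 15 points each; a class-aware
# sampling rule would replace this uniform draw on imbalanced data.
idx = rng.choice(len(X), size=3 * 15, replace=False)
landmark_sets = [X[idx[i * 15:(i + 1) * 15]] for i in range(3)]
weights = np.ones(3) / 3  # uniform placeholder for the MKL weights

K_true = rbf_kernel(X, X)
K_apx = multi_nystrom(X, landmark_sets, weights)
print("relative Frobenius error:",
      np.linalg.norm(K_true - K_apx) / np.linalg.norm(K_true))
```

Each subkernel matrix W here is only 15×15, so its pseudoinverse is cheap to compute; that is the efficiency gain the mixture buys over a single Nyström approximation with one large landmark set.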


2016 ◽ Vol 20 (5) ◽ pp. 997-1019 ◽ Author(s): Arik Nemtsov, Amir Averbuch, Alon Schclar

2020 ◽ Vol 148 ◽ pp. 107701 ◽ Author(s): César Bublitz, Fabio S. de Azevedo, Esequia Sauter

2018 ◽ Vol 129 ◽ pp. 9-15 ◽ Author(s): Liangchi Li, Shenling Wang, Shuaijing Xu, Yuqi Yang
