Low-rank decomposition meets kernel learning: A generalized Nyström method

2017 ◽  
Vol 250 ◽  
pp. 1-15 ◽  
Author(s):  
Liang Lan ◽  
Kai Zhang ◽  
Hancheng Ge ◽  
Wei Cheng ◽  
Jun Liu ◽  
...  
2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Ling Wang ◽  
Hongqiao Wang ◽  
Guangyuan Fu

Extensions of kernel methods to class imbalance problems have been extensively studied. Although they cope well with nonlinear problems, their high computation and memory costs severely limit their application to real-world imbalanced tasks. The Nyström method is an effective technique for scaling kernel methods. However, the standard Nyström method needs to sample a sufficiently large number of landmark points to ensure an accurate approximation, which seriously affects its efficiency. In this study, we propose a multi-Nyström method based on mixtures of Nyström approximations to avoid the explosion of the subkernel matrix size; the optimization of the mixture weights is embedded into the model training process via multiple kernel learning (MKL) algorithms to yield a more accurate low-rank approximation. Moreover, we select the subsets of landmark points according to the imbalanced distribution to reduce the model's sensitivity to skewness. We also provide a kernel stability analysis of our method and show that the model solution error is bounded by the weighted approximation errors, which can help us improve the learning process. Extensive experiments on several large-scale datasets show that our method achieves higher classification accuracy and a dramatic speedup of MKL algorithms.
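The two building blocks described above can be sketched briefly. The standard Nyström method approximates a full kernel matrix K as C W⁺ Cᵀ, where C holds the kernel between all points and a small landmark set and W is the kernel among the landmarks; a mixture of Nyström approximations combines several such approximations built from different small landmark subsets. The sketch below is a minimal illustration with uniform mixture weights over random landmark subsets, not the paper's method: the paper learns the weights via MKL and draws the subsets according to the class-imbalance distribution.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Pairwise RBF kernel between the rows of X and the rows of Y.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom(X, landmark_idx, gamma=0.5):
    # Standard Nystrom approximation K ~ C W^+ C^T, where
    # C = K[:, landmarks] and W = K[landmarks, landmarks].
    C = rbf_kernel(X, X[landmark_idx], gamma)
    W = rbf_kernel(X[landmark_idx], X[landmark_idx], gamma)
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
K = rbf_kernel(X, X)  # exact kernel, for comparison only

# Mixture of Nystrom approximations over several small landmark subsets.
# Uniform weights here; the paper instead learns them jointly with the
# model via MKL and samples subsets according to the class distribution.
subsets = [rng.choice(200, size=20, replace=False) for _ in range(4)]
weights = np.full(4, 0.25)
K_mix = sum(w * nystrom(X, idx) for w, idx in zip(weights, subsets))

# Relative approximation error of the mixture.
err = np.linalg.norm(K - K_mix) / np.linalg.norm(K)
```

Each subkernel W here is only 20×20, so the mixture keeps every pseudo-inverse cheap while pooling information from several landmark sets, which is the efficiency argument the abstract makes against sampling one large landmark set.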


