Adaptive learning rates for support vector machines working on data with low intrinsic dimension

2021, Vol 49 (6)
Author(s): Thomas Hamm, Ingo Steinwart
2005, Vol 03 (04), pp. 357-371
Author(s): Clint Scovel, Don Hush, Ingo Steinwart

In this paper, we address learning rates for the density level detection (DLD) problem. We begin by proving a "No Free Lunch Theorem" showing that rates cannot be obtained in general. Then, we apply a recently established classification framework to obtain rates for DLD support vector machines under mild assumptions on the density.
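To make the DLD problem concrete, here is a minimal, self-contained sketch (not the paper's method): it estimates the ρ-level set {x : f(x) > ρ} of an unknown density f from samples by thresholding a Gaussian kernel density estimate, and compares the result with the true level set of a standard normal. The bandwidth `h` and level `rho` are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(0.0, 1.0, size=5000)  # i.i.d. draws from the unknown density (here: standard normal)

def kde(x, data, h=0.2):
    # Gaussian kernel density estimate evaluated at the points x
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).mean(axis=1) / (h * np.sqrt(2.0 * np.pi))

rho = 0.2                         # density level defining the level set {f > rho}
grid = np.linspace(-4.0, 4.0, 801)
est_set = grid[kde(grid, sample) > rho]                             # estimated level set
true_set = grid[np.exp(-grid**2 / 2) / np.sqrt(2 * np.pi) > rho]    # true level set

# For the standard normal at rho = 0.2, the true level set is roughly (-1.175, 1.175),
# so the estimated boundaries should land close to +/- 1.175.
print(est_set.min(), est_set.max())
```

The paper's classification framework goes further: it reduces DLD to binary classification against a reference measure, which is what makes learning rates attainable under assumptions on the density.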


2016, Vol 28 (1), pp. 71-88
Author(s): Hongzhi Tong

We present an improved theoretical foundation for support vector machines with polynomial kernels. The sample error is estimated under Tsybakov's noise assumption. In bounding the approximation error, we take advantage of a geometric noise assumption that was introduced to analyze Gaussian kernels. Compared with the previous literature, the error analysis in this note does not require any regularity of the marginal distribution or smoothness of Bayes' rule. We thus establish learning rates for polynomial kernels for a wide class of distributions.
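As a toy illustration of learning with a polynomial kernel (not the paper's analysis), the sketch below fits a kernelized classifier to a circular decision boundary, a target that a degree-2 polynomial kernel represents exactly. For a closed-form solution it uses regularized least squares in place of the SVM's hinge loss; the data, degree, and regularization parameter `lam` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy binary problem: label = sign of x1^2 + x2^2 - 1 (unit circle boundary)
X = rng.uniform(-2.0, 2.0, size=(400, 2))
y = np.where(X[:, 0]**2 + X[:, 1]**2 > 1.0, 1.0, -1.0)

def poly_kernel(A, B, degree=2, c=1.0):
    # polynomial kernel k(x, z) = (x . z + c)^degree
    return (A @ B.T + c) ** degree

lam = 1e-3  # regularization parameter (illustrative)
K = poly_kernel(X, X)
alpha = np.linalg.solve(K + lam * len(X) * np.eye(len(X)), y)

# Evaluate on fresh samples from the same distribution
Xtest = rng.uniform(-2.0, 2.0, size=(200, 2))
ytest = np.where(Xtest[:, 0]**2 + Xtest[:, 1]**2 > 1.0, 1.0, -1.0)
pred = np.sign(poly_kernel(Xtest, X) @ alpha)
accuracy = (pred == ytest).mean()
print(accuracy)
```

Because the target boundary lies in the hypothesis space of the degree-2 polynomial kernel, the test accuracy is near perfect; the learning rates in the note quantify how fast such error decays with sample size under the stated noise assumptions.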


2018
Author(s): Nelson Marcelo Romero Aquino, Matheus Gutoski, Leandro Takeshi Hattori, Heitor Silvério Lopes
