Optimal Reduced-Set Vectors for Support Vector Machines with a Quadratic Kernel
To reduce computational cost, the discriminant function of a support vector machine (SVM) should be represented using as few vectors as possible. This problem has been tackled in different ways. In this article, we develop an explicit solution in the case of a general quadratic kernel k(x, x′) = (C + Dx⊺x′)². For a given number of vectors, this solution provides the best possible approximation and can even recover the discriminant function exactly if the number of reduced-set vectors is large enough. The key idea is to express the inhomogeneous kernel as a homogeneous kernel on a space having one dimension more than the original one and to follow the approach of Burges (1996).
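The embedding mentioned in the abstract can be sketched as follows (a minimal illustration, not the authors' code): mapping x to z = [√D·x, √C] appends one constant coordinate, so that z⊺z′ = C + D·x⊺x′ and the homogeneous kernel (z⊺z′)² on the augmented space reproduces the inhomogeneous quadratic kernel on the original space.

```python
import numpy as np

def augment(x, C, D):
    # Embed x in R^d into R^(d+1) by scaling and appending sqrt(C),
    # so that augment(x) . augment(x') = C + D * (x . x').
    return np.append(np.sqrt(D) * x, np.sqrt(C))

rng = np.random.default_rng(0)
x, xp = rng.standard_normal(5), rng.standard_normal(5)
C, D = 1.5, 2.0

k_inhom = (C + D * (x @ xp)) ** 2                      # inhomogeneous kernel on R^d
k_hom = (augment(x, C, D) @ augment(xp, C, D)) ** 2    # homogeneous kernel on R^(d+1)
assert np.isclose(k_inhom, k_hom)
```

With the kernel homogeneous, Burges' (1996) reduced-set construction can be applied directly in the augmented space.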
2000 · Vol 12 (11) · pp. 2655-2684