Support Vector Classifier Trained by Gradient Descent

Author(s):  
Fengyu Gao ◽  
Chien-Hua Chen ◽  
Jer-Guang Hsieh ◽  
Jyh-Horng Jeng
2020 ◽  
Vol 4 (2) ◽  
pp. 329-335
Author(s):  
Rusydi Umar ◽  
Imam Riadi ◽  
Purwono

The failure of most startups in Indonesia is caused by teams that are neither cohesive nor competent. Programmers are an integral part of a startup team. Social media can serve as a strategic tool for recruiting the best programmer candidates for a company, in the form of an automatic classification system for the social media posts of prospective programmers. The classification results are expected to predict each candidate's performance pattern, labeled as good or bad performance. To obtain an effective strategic tool, the classification method with the best accuracy must be selected, which requires comparing several methods. This study compares classification methods including the Support Vector Machines (SVM) algorithm, Random Forest (RF), and Stochastic Gradient Descent (SGD). With k = 10 cross-validation, the SVM algorithm reaches 81.3% accuracy, RF 74.4%, and SGD 80.1%, so the SVM method is chosen as the model for classifying programmer performance from social media activity.
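The comparison described in this abstract can be sketched with scikit-learn's 10-fold cross-validation. A synthetic dataset stands in for the labeled social media features, which are not given here, so the scores below are illustrative only:

```python
# Sketch of a 10-fold cross-validated comparison of SVM, RF, and SGD,
# as in the study above. make_classification is a placeholder for the
# real feature vectors extracted from candidates' social media posts.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

models = {
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "RF": RandomForestClassifier(random_state=0),
    "SGD": make_pipeline(StandardScaler(), SGDClassifier(random_state=0)),
}

scores = {}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=10, scoring="accuracy")
    scores[name] = acc.mean()
    print(f"{name}: {scores[name]:.3f}")
```

The highest mean accuracy then picks the model, mirroring the study's selection of SVM.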


2021 ◽  
pp. 1-1
Author(s):  
Hai Yang ◽  
Lizao Zhang ◽  
Tao Luo ◽  
Haibo Liang ◽  
Li Li ◽  
...  

2013 ◽  
Vol 842 ◽  
pp. 746-749
Author(s):  
Bo Yang ◽  
Liang Zhang

A novel sparse weighted LSSVM classifier is proposed in this paper, based on Suykens' weighted LSSVM. Unlike Suykens' method, the proposed weighting is tailored to classification: the distance between a sample and the classification boundary serves as the measure of sample importance. Based on this measure, a new weight function is designed whose parameters can adjust the sparseness of the weights. To address the class imbalance problem, a normalized weight calculation method is also proposed. Finally, the proposed method is applied to digit recognition, where comparative experiments show that the sparse weighted LSSVM effectively improves the recognition rate.
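A weighted LS-SVM of this kind can be sketched with NumPy: solve the LS-SVM linear system once without weights, measure each sample's distance to the resulting boundary, and refit with per-sample weights. The weight function below (down-weighting samples far from the boundary) is a placeholder for the paper's weight and normalization scheme, which are not given in the abstract:

```python
# Minimal weighted LS-SVM sketch. The linear system solved is the usual
# LS-SVM dual: [[0, 1^T], [1, K + diag(1/(C*v))]] [b; alpha] = [0; y].
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_fit(X, y, C=10.0, v=None):
    """Solve the LS-SVM system with per-sample weights v (v=1: unweighted)."""
    n = len(y)
    v = np.ones(n) if v is None else v
    K = rbf_kernel(X, X)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (C * v))
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]  # alpha, b

def lssvm_decision(X_train, alpha, b, X):
    return rbf_kernel(X, X_train) @ alpha + b

# Toy data: two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (40, 2)), rng.normal(1, 1, (40, 2))])
y = np.concatenate([-np.ones(40), np.ones(40)])

# First pass: unweighted fit to locate the classification boundary.
alpha, b = lssvm_fit(X, y)
dist = np.abs(lssvm_decision(X, alpha, b, X))

# Second pass: weight samples by distance to the boundary (placeholder form).
v = np.exp(-dist)
alpha_w, b_w = lssvm_fit(X, y, v=v)
pred = np.sign(lssvm_decision(X, alpha_w, b_w, X))
print("training accuracy:", (pred == y).mean())
```

Small weights v_i enlarge the regularization term 1/(C*v_i) for those samples, which is what lets a weighting scheme like this trade accuracy on outlying points for sparseness.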


2003 ◽  
Vol 15 (9) ◽  
pp. 2227-2254 ◽  
Author(s):  
Wei Chu ◽  
S. Sathiya Keerthi ◽  
Chong Jin Ong

This letter describes Bayesian techniques for support vector classification. In particular, we propose a novel differentiable loss function, called the trigonometric loss function, which has the desirable property of natural normalization in the likelihood function, and we then follow standard Gaussian process techniques to set up a Bayesian framework. In this framework, Bayesian inference is used to implement model adaptation while keeping the merits of the support vector classifier, such as sparseness and convex programming. This differs from standard Gaussian processes for classification. Moreover, we provide class probabilities with our predictions. Experimental results on benchmark data sets indicate the usefulness of this approach.
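The "natural normalization" property can be checked numerically, assuming the standard piecewise form of the trigonometric loss: l(z) = 0 for z >= 1, l(z) = 2 ln sec(pi/4 (1 - z)) for -1 < z < 1, and l(z) = +inf for z <= -1, with likelihood P(y | f) = exp(-l(y f)):

```python
# Numerical check that the trigonometric loss yields a normalized
# likelihood: P(y=+1 | f) + P(y=-1 | f) = 1 for every latent value f.
import numpy as np

def trig_loss(z):
    z = np.asarray(z, dtype=float)
    out = np.full(z.shape, np.inf)        # z <= -1: infinite loss
    out[z >= 1] = 0.0                     # z >= 1: zero loss
    mid = (z > -1) & (z < 1)
    out[mid] = 2.0 * np.log(1.0 / np.cos(np.pi / 4 * (1 - z[mid])))
    return out

f = np.linspace(-0.99, 0.99, 7)
p_pos = np.exp(-trig_loss(f))    # P(y = +1 | f)
p_neg = np.exp(-trig_loss(-f))   # P(y = -1 | f)
print(np.allclose(p_pos + p_neg, 1.0))  # True: likelihood sums to one
```

The identity holds because exp(-l(f)) = cos^2(pi/4 (1 - f)) and exp(-l(-f)) = sin^2(pi/4 (1 - f)) on (-1, 1), so the two likelihoods sum to one without any extra normalizing constant.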


2016 ◽  
Vol 20 (s1) ◽  
pp. S109-S119 ◽  
Author(s):  
G. López-González ◽  
N. Arana-Daniel ◽  
E. Bayro-Corrochano
