Ramp Loss Function — Recently Published Documents

Total documents: 2 (five years: 2)
H-index: 0 (five years: 0)

2020, Vol. 2020, pp. 1-15
Author(s): Cuiqing Zhang, Maojun Zhang, Xijun Liang, Zhonghang Xia, Jiangxia Nan

Due to its wide applicability and learning efficiency, online ordinal regression using perceptron algorithms with interval labels (PRIL) has been increasingly applied to ordinal ranking problems. However, handling noisy labels remains a challenge for the PRIL method, as the ranking results may change dramatically in their presence. To tackle this problem, in this paper we propose noise-resilient online learning algorithms using the ramp loss function, called PRIL-RAMP, and a nonlinear variant, K-PRIL-RAMP, to improve the performance of the PRIL method on noisy data streams. The proposed algorithms iteratively optimize the decision function under the framework of online gradient descent (OGD), and we justify the algorithms by showing the order preservation of thresholds. Experiments on real-world datasets validate that both approaches are more robust to label noise, and more efficient, than state-of-the-art online ordinal regression algorithms.
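The robustness described above comes from the shape of the ramp loss itself: unlike the hinge loss, it is clipped at a constant, so a single badly mislabeled example cannot contribute an unbounded gradient. A minimal sketch of this clipped hinge (the exact PRIL-RAMP formulation with thresholds and interval labels is given in the paper; the clipping point `s` here is an illustrative parameter):

```python
import numpy as np

def hinge_loss(z):
    # Standard hinge loss: grows without bound as the margin z decreases,
    # so noisy labels can dominate the gradient updates.
    return np.maximum(0.0, 1.0 - z)

def ramp_loss(z, s=-1.0):
    # Ramp loss: the hinge loss clipped at margin s (s < 1), so the
    # penalty for any single point is bounded by 1 - s. Outliers and
    # mislabeled points stop influencing the OGD update once their
    # margin falls below s.
    return np.minimum(1.0 - s, np.maximum(0.0, 1.0 - z))

# Margins: correctly classified (2.0), on the boundary (0.0),
# and badly misclassified (-5.0).
margins = np.array([2.0, 0.0, -5.0])
print(hinge_loss(margins))  # unbounded growth for the last point
print(ramp_loss(margins))   # capped at 1 - s = 2 for the last point
```

Because the loss is flat beyond the clipping point, its gradient there is zero, which is what makes an OGD update with the ramp loss ignore far-outlying noisy labels.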


2020, Vol. 34 (04), pp. 5652-5659
Author(s): Kulin Shah, Naresh Manwani

Active learning is an important technique for reducing the number of labeled examples needed in supervised learning. Active learning for binary classification has been well studied in machine learning; however, active learning of reject option classifiers remains unaddressed. In this paper, we propose novel algorithms for active learning of reject option classifiers. We develop an active learning algorithm using the double ramp loss function and provide mistake bounds for it. We also propose a new loss function for the reject option, called the double sigmoid loss, with a corresponding active learning algorithm, and offer a convergence guarantee for this algorithm. Extensive experimental results show the effectiveness of the proposed algorithms, which efficiently reduce the number of labeled examples required.
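A reject option classifier abstains from predicting when the score of an example falls inside a band around the decision boundary, trading a rejection cost `d` against the cost of a misclassification. One illustrative way to build a loss for this setting, in the spirit of the double ramp loss above, is a weighted sum of two ramp functions shifted by the rejection half-width `rho`; the exact loss, weights, and scaling used by Shah and Manwani may differ, so this is a sketch of the idea rather than the paper's formulation:

```python
def ramp(t):
    # Clipped hinge: bounded penalty, as in the ramp loss.
    return min(2.0, max(0.0, 1.0 - t))

def double_ramp_loss(z, rho=1.0, d=0.2):
    # Illustrative double ramp loss for a reject option classifier.
    # z is the signed margin y * f(x); rho is the half-width of the
    # rejection band; d in (0, 0.5) is the cost of rejecting.
    # The ramp centered at -rho penalizes outright misclassification;
    # the ramp centered at +rho penalizes scores that fall inside the
    # rejection band, at the lower weight d.
    return d * ramp(z + rho) + (1.0 - d) * ramp(z - rho)

# A confidently correct point incurs no loss; a point on the boundary
# incurs a loss dominated by the misclassification ramp.
print(double_ramp_loss(3.0))  # confident, outside the band
print(double_ramp_loss(0.0))  # on the decision boundary
```

In an active learning loop, only examples whose current margin falls near the two ramp transitions (i.e., near the rejection band edges) would be sent for labeling, which is how such algorithms cut down the number of labels requested.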

