LOSS FUNCTIONS AND DESCENT METHOD

2021 · Vol 55 (1 (254)) · pp. 29-35
Author(s): Hovhannes Z. Zohrabyan, Victor K. Ohanyan

In this paper, we show that the gradient descent method can drive the error of a loss function down to values close to those attained by its Bayesian estimator. We derived the Bayesian estimators for several loss functions analytically and then tested them using a gradient descent algorithm. Running the algorithm on Normal and Poisson distributions showed that near-minimal error values can be found without knowing the Bayesian estimator in closed form. Using Python, we tested the theory on loss functions with known Bayesian estimators, as well as on other loss functions, and obtained results that support the theory.
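To make the idea concrete, here is a minimal Python sketch (not the authors' code; the posterior, loss choices, and step size are illustrative assumptions): gradient descent on a Monte Carlo estimate of the posterior expected loss, compared against the known Bayesian estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in posterior: Normal(2, 1), represented by samples
posterior_samples = rng.normal(loc=2.0, scale=1.0, size=10_000)

def expected_loss_grad(theta, samples, loss="squared"):
    """Gradient of the Monte Carlo estimate of E[L(theta, X)]."""
    if loss == "squared":        # Bayes estimator: posterior mean
        return np.mean(2.0 * (theta - samples))
    if loss == "absolute":       # Bayes estimator: posterior median
        return np.mean(np.sign(theta - samples))
    raise ValueError(loss)

theta, lr = 0.0, 0.1
for _ in range(500):
    theta -= lr * expected_loss_grad(theta, posterior_samples, "squared")

print(theta, posterior_samples.mean())  # the two should nearly coincide
```

For the squared loss the Bayesian estimator is the posterior mean, so the two printed values should agree closely; switching to the absolute loss drives theta toward the posterior median instead, again without using the estimator's closed form.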

Sensors · 2020 · Vol 20 (9) · pp. 2510
Author(s): Nam D. Vo, Minsung Hong, Jason J. Jung

Previous recommendation systems applied the matrix factorization collaborative filtering (MFCF) technique only to single domains. Owing to data sparsity, this approach has difficulty overcoming the cold-start problem. In this study, we therefore focus on discovering latent features across domains in order to understand the relationships between them (called domain coherence). This approach uses latent knowledge from the source domain to improve the quality of recommendations in the target domain. In this paper, we apply MFCF to multiple domains: by adopting an implicit stochastic gradient descent algorithm to optimize the objective function for prediction, matrices from different domains are consolidated inside the cross-domain recommendation system (CDRS). Additionally, we design a conceptual framework for CDRS that applies to different industrial scenarios for recommenders across domains. An experiment is devised to validate the proposed method. On a real-world dataset gathered from Amazon Food and MovieLens, experimental results show that the proposed method improves computation time by 15.2% and MSE by 19.7% over other methods on a utility matrix, and that it reaches a much lower convergence value of the loss function. Furthermore, a critical analysis of the obtained results shows a dynamic balance between prediction accuracy and computational complexity.
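A rough sketch of this kind of setup (an illustration only, not the authors' implementation: it assumes user factors shared across two domains, per-domain item factors, and plain SGD over observed entries):

```python
import numpy as np

rng = np.random.default_rng(1)
n_users, k = 50, 8
R_a = rng.integers(1, 6, size=(n_users, 40)).astype(float)  # domain-A utility matrix
R_b = rng.integers(1, 6, size=(n_users, 30)).astype(float)  # domain-B utility matrix
R_a[rng.random(R_a.shape) < 0.7] = np.nan  # most entries unobserved (sparsity)
R_b[rng.random(R_b.shape) < 0.7] = np.nan

U = 0.1 * rng.standard_normal((n_users, k))         # user factors shared across domains
V_a = 0.1 * rng.standard_normal((R_a.shape[1], k))  # domain-A item factors
V_b = 0.1 * rng.standard_normal((R_b.shape[1], k))  # domain-B item factors

lr, reg = 0.01, 0.02
for epoch in range(30):
    for R, V in ((R_a, V_a), (R_b, V_b)):
        for u, i in zip(*np.nonzero(~np.isnan(R))):
            err = R[u, i] - U[u] @ V[i]             # error on one observed rating
            U[u] += lr * (err * V[i] - reg * U[u])  # shared factors see both domains
            V[i] += lr * (err * U[u] - reg * V[i])  # item factors stay per-domain
```

Because U is updated from ratings in both domains, knowledge from the source domain flows into predictions for the target domain, which is the intuition behind mitigating cold start.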


2007 · Vol 19 (8) · pp. 2183-2244
Author(s): Takafumi Kanamori, Takashi Takenouchi, Shinto Eguchi, Noboru Murata

Boosting can be viewed as a gradient descent algorithm over a loss function. It is often pointed out that the typical boosting algorithm, AdaBoost, is strongly affected by outliers. In this letter, we study loss functions for robust boosting. Based on concepts from robust statistics, we propose a transformation of loss functions that makes boosting algorithms robust against extreme outliers. We then apply truncation of the loss functions to contamination models that describe the occurrence of mislabels near decision boundaries. Numerical experiments illustrate that the proposed loss functions derived from the contamination models handle highly noisy data better than other loss functions.
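To see why truncation helps, compare the exponential loss behind AdaBoost with a capped variant (an illustrative choice; the paper's actual transformation may differ):

```python
import numpy as np

def exp_loss(margin):
    # Exponential loss underlying AdaBoost's gradient-descent view
    return np.exp(-margin)

def truncated_exp_loss(margin, cap=5.0):
    # Beyond the cap the loss is flat, so an extreme outlier contributes
    # a bounded amount and stops dominating the descent direction
    return np.minimum(np.exp(-margin), cap)

margins = np.array([-6.0, -2.0, -0.5, 0.0, 0.5, 2.0])
print(exp_loss(margins))            # the margin -6 outlier costs e^6, about 403
print(truncated_exp_loss(margins))  # the same outlier is capped at 5
```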


Entropy · 2019 · Vol 21 (7) · pp. 644
Author(s): Baobin Wang, Ting Hu

In the framework of statistical learning, we study the online gradient descent algorithm generated by correntropy-induced losses in reproducing kernel Hilbert spaces (RKHS). As a generalized correlation measure, correntropy has been widely applied in practice owing to its robustness. Although online gradient descent is an efficient way to handle the maximum correntropy criterion (MCC) in nonparametric estimation, no consistency analysis or rigorous error bounds had been given. We provide a theoretical understanding of the online algorithm for MCC and show that, with a suitably chosen scaling parameter, its convergence rate is minimax optimal (up to a logarithmic factor) for regression analysis. Our results show that the scaling parameter plays an essential role in both robustness and consistency.
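A minimal sketch of the online algorithm for MCC in an RKHS (the Gaussian kernel, step-size schedule, and scaling-parameter values are assumptions for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

def gauss_kernel(x, y, width=0.5):
    return np.exp(-(x - y) ** 2 / (2 * width ** 2))

def loss_grad(err, sigma=1.0):
    # Correntropy-induced loss sigma^2 * (1 - exp(-err^2 / (2 sigma^2)))
    # has derivative err * exp(-err^2 / (2 sigma^2)): near-quadratic for
    # small errors, vanishing for large ones, hence the robustness
    return err * np.exp(-err ** 2 / (2 * sigma ** 2))

xs, alphas = [], []  # representer expansion f_t = sum_i alpha_i K(x_i, .)
for t in range(1, 501):
    x = rng.uniform(-1, 1)
    y = np.sin(np.pi * x) + 0.1 * rng.standard_normal()  # noisy target
    f_x = sum(a * gauss_kernel(xi, x) for a, xi in zip(alphas, xs))
    eta = 1.0 / np.sqrt(t)  # decaying step size
    # Online update: f_{t+1} = f_t - eta * L'(f_t(x) - y) * K(x, .)
    xs.append(x)
    alphas.append(-eta * loss_grad(f_x - y))
```

Making sigma too small flattens the gradient and the learner barely moves; making it too large makes the loss behave like least squares and forfeits robustness, matching the abstract's point that the scaling parameter governs the robustness-consistency trade-off.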

