Statistical Learning Theory
Recently Published Documents


TOTAL DOCUMENTS: 159 (last five years: 12)

H-INDEX: 22 (last five years: 2)

2020 · Vol 39 (6) · pp. 9045-9051
Author(s): Yun Hu, Ning Li, Chenyang Luo

During the COVID-19 epidemic, college students could not return to campus, which greatly affected talent cultivation in colleges and universities. Based on statistical learning theory, this paper proposes an evaluation model for the cultivation of innovative talents in universities after the epidemic. An evaluation index system for the quality of innovation and entrepreneurship talent training in universities is constructed, composed of four first-level indexes: environment, teaching links, teachers, and students. The fuzzy comprehensive evaluation method is then applied in an empirical study. First, the factor set of the evaluation object and the grade domain of the comprehensive evaluation are determined. Next, the analytic hierarchy process (AHP) is used to determine the weights of the evaluation indexes, and an expert scoring method is used to determine the single-factor fuzzy evaluation matrix at each level. From this matrix, the fuzzy relation between the evaluation object and the evaluation set is calculated. Finally, following the principle of maximum membership degree, the grade corresponding to the maximum value in the fuzzy relation set is taken as the final evaluation result. The empirical results show that this method improves the accuracy of the evaluation model for innovation and entrepreneurship talent training and offers a useful reference for talent training in universities.
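The pipeline the abstract describes (AHP weights, an expert-scored single-factor matrix, weighted fuzzy composition, and the maximum-membership decision) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the weight vector, the evaluation matrix, and the four-grade domain below are hypothetical placeholder values.

```python
import numpy as np

# Hypothetical AHP weights for the four first-level indexes:
# environment, teaching links, teachers, students (illustrative values).
weights = np.array([0.25, 0.30, 0.25, 0.20])

# Hypothetical single-factor fuzzy evaluation matrix R from expert scoring:
# row i gives index i's membership degrees over the grade domain
# (excellent, good, fair, poor).
R = np.array([
    [0.4, 0.3, 0.2, 0.1],
    [0.5, 0.3, 0.1, 0.1],
    [0.3, 0.4, 0.2, 0.1],
    [0.2, 0.4, 0.3, 0.1],
])

# Fuzzy relation between the evaluation object and the evaluation set:
# weighted-average composition B = w · R, then normalization.
B = weights @ R
B = B / B.sum()

# Principle of maximum membership degree: the grade with the largest
# membership in B is the final evaluation result.
grades = ["excellent", "good", "fair", "poor"]
print(dict(zip(grades, np.round(B, 3))))
print("evaluation result:", grades[int(np.argmax(B))])
```

In a full multi-level version, each first-level index would first be evaluated from its own sub-index matrix in the same way, and the resulting membership vectors would be stacked to form R.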


2020 · Vol 102 (3)
Author(s): Mauro Pastore, Pietro Rotondo, Vittorio Erba, Marco Gherardi

Entropy · 2020 · Vol 22 (4) · pp. 438
Author(s): Ibrahim Alabdulmohsin

In this paper, we introduce the notion of “learning capacity” for algorithms that learn from data, which is analogous to the Shannon channel capacity for communication systems. We show how learning capacity bridges the gap between statistical learning theory and information theory, and we use it to derive generalization bounds for finite hypothesis spaces, differential privacy, and countable domains, among others. Moreover, we prove that, under the Axiom of Choice, the existence of an empirical risk minimization (ERM) rule with vanishing learning capacity is equivalent to the hypothesis space having a finite Vapnik–Chervonenkis (VC) dimension, thus establishing an equivalence between two of the most fundamental concepts in statistical learning theory and information theory. In addition, we show how the learning capacity of an algorithm yields important qualitative results, such as the relation between generalization and algorithmic stability, information leakage, and data processing. Finally, we conclude by listing some open problems and suggesting future directions of research.
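The abstract cites generalization bounds for finite hypothesis spaces as one application. For orientation, the sketch below simulates the classical uniform-convergence bound for a finite class, gap ≤ sqrt((ln|H| + ln(2/δ)) / (2n)); it does not reproduce the paper's learning-capacity bound, and the threshold hypothesis class, sample size, and label-noise rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite hypothesis space: threshold classifiers h_t(x) = 1[x >= t].
thresholds = np.linspace(0.0, 1.0, 50)  # |H| = 50 (illustrative)
n = 200                                 # training-set size (illustrative)

def err(t, x, y):
    """Zero-one error of the threshold classifier t on (x, y)."""
    return np.mean((x >= t).astype(int) != y)

gaps = []
for _ in range(500):
    # Data: x uniform on [0, 1]; labels flip the true rule 1[x >= 0.5]
    # with probability 0.1 (illustrative noise rate).
    x = rng.random(n)
    y = ((x >= 0.5) ^ (rng.random(n) < 0.1)).astype(int)
    xt = rng.random(10_000)
    yt = ((xt >= 0.5) ^ (rng.random(10_000) < 0.1)).astype(int)
    # ERM rule: pick the hypothesis with the smallest training error.
    t_hat = thresholds[np.argmin([err(t, x, y) for t in thresholds])]
    gaps.append(err(t_hat, xt, yt) - err(t_hat, x, y))

# Classical finite-class bound: with probability >= 1 - delta,
# sup over H of |test error - train error| <= sqrt((ln|H| + ln(2/delta)) / (2n)).
delta = 0.05
bound = np.sqrt((np.log(len(thresholds)) + np.log(2 / delta)) / (2 * n))
print(f"mean ERM gap: {np.mean(gaps):.3f}")
print(f"95th pct gap: {np.quantile(gaps, 0.95):.3f}")
print(f"theory bound: {bound:.3f}")
```

The observed ERM gaps should sit well inside the bound, which is the qualitative point that finite-hypothesis-space results make rigorous.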

