Jensen-Shannon Information Based Characterization of the Generalization Error of Learning Algorithms

Author(s):  
Gholamali Aminian ◽  
Laura Toni ◽  
Miguel R. D. Rodrigues
2019 ◽  
Vol 2019 ◽  
pp. 1-9
Author(s):  
Devotha G. Nyambo ◽  
Edith T. Luhanga ◽  
Zaipuna Q. Yonah

Characterization of smallholder farmers has been conducted in various studies using machine learning algorithms as well as participatory and expert-based methods. All of these approaches end with the development of subgroups known as farm typologies. The main purpose of this paper is to highlight the main approaches used to characterize smallholder farmers, presenting the pros and cons of each. Drawing on the nature and key advantages of the reviewed approaches, the paper recommends a hybrid approach for obtaining predictive farm typologies. A search for relevant research articles published between 2007 and 2018 was conducted on ScienceDirect and Google Scholar. Using a generated search query, 20 research articles related to the characterization of smallholder farmers were retained. Cluster-based algorithms appeared to be the most commonly used for characterizing smallholder farmers. However, because clustering methods are highly unpredictable and inconsistent, their use calls for a discussion of how well the developed farm typologies can predict future trends among farmers. A thorough discussion is presented, and the use of supervised models to validate unsupervised models is recommended. To achieve predictive farm typologies, three stages of characterization are recommended, as tested on smallholder dairy farmer datasets: (a) develop farm types from a comparative analysis of more than two unsupervised learning algorithms by using training models, (b) assess the training models' robustness in predicting farm types on a testing dataset, and (c) assess the predictive power of the farm types developed by each algorithm by predicting the trend of several response variables.
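The three-stage procedure in the abstract (cluster with multiple unsupervised algorithms, validate with a supervised model, then predict types for held-out farms) can be sketched as follows. This is an illustrative sketch only, on synthetic data: the features, the choice of KMeans/AgglomerativeClustering/RandomForest, and all parameters are assumptions, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import adjusted_rand_score, accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in for farm-level features (e.g. herd size,
# milk yield, land area) -- purely illustrative
X = rng.normal(size=(300, 3))
X_train, X_test = train_test_split(X, test_size=0.3, random_state=0)

# (a) Develop farm types with more than one unsupervised algorithm
# and compare the resulting typologies for consistency
labels = {
    "kmeans": KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_train),
    "agglo": AgglomerativeClustering(n_clusters=3).fit_predict(X_train),
}
print("typology agreement (ARI):",
      adjusted_rand_score(labels["kmeans"], labels["agglo"]))

# (b) Assess robustness: fit a supervised model on each typology's
# cluster labels and check how well it reproduces them
for name, y in labels.items():
    clf = RandomForestClassifier(random_state=0).fit(X_train, y)
    print(name, "train accuracy:", accuracy_score(y, clf.predict(X_train)))
    # (c) Predict farm types for held-out farms; in practice these
    # predicted types would then be related to response variables
    test_types = clf.predict(X_test)
```

The adjusted Rand index quantifies how much the two unsupervised typologies agree, which speaks directly to the abstract's concern about clustering inconsistency.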


Entropy ◽  
2021 ◽  
Vol 23 (8) ◽  
pp. 1021
Author(s):  
James Fullwood ◽  
Arthur J. Parzygnat

We provide a stochastic extension of the Baez–Fritz–Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we call conditional information loss. Although not functorial, these information measures are semi-functorial, a concept we introduce that is definable in any Markov category. We also introduce the notion of an entropic Bayes’ rule for information measures, and we provide a characterization of conditional entropy in terms of this rule.
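For orientation, the standard quantities referred to in the abstract can be written out as follows (generic notation, not necessarily the paper's):

```latex
% Shannon entropy of a finite probability measure p on X
H(p) = -\sum_{x \in X} p(x) \log p(x)

% Shannon information loss of a measure-preserving map
% f : (X, p) \to (Y, q), as characterized by Baez--Fritz--Leinster
K(f) = H(p) - H(q)

% Conditional entropy, the quantity recovered by the
% stochastic extension
H(Y \mid X) = H(X, Y) - H(X)
```

Since $f$ is deterministic and measure-preserving, $H(p) \ge H(q)$, so the information loss $K(f)$ is nonnegative; the paper's stochastic setting generalizes this to Markov kernels.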


2021 ◽  
pp. 46-56
Author(s):  
Aissa Boudjella ◽  
Manal Y. Boudjella ◽  
Mohamed E. Bellebna ◽  
Nasreddine Aoumeur ◽  
Samir Belhouari

2021 ◽  
pp. 109926
Author(s):  
Kavindu Wijesinghe ◽  
Janith Wanni ◽  
Natasha K. Banerjee ◽  
Sean Banerjee ◽  
Ajit Achuthan
