Generalized Jensen difference divergence measures and Fisher measure of information

Kybernetes ◽  
1995 ◽  
Vol 24 (2) ◽  
pp. 15-28
Author(s):  
L. Pardo ◽  
D. Morales ◽  
I.J. Taneja

1985 ◽
Vol 35 (2) ◽  
pp. 145-156 ◽  
Author(s):  
Annibal P. Sant'anna ◽  
Inder Jeet Taneja

Mathematics ◽  
2021 ◽  
Vol 9 (12) ◽  
pp. 1423
Author(s):  
Javier Bonilla ◽  
Daniel Vélez ◽  
Javier Montero ◽  
J. Tinguaro Rodríguez

In the last two decades, information entropy measures have been widely applied in fuzzy clustering problems to regularize solutions by avoiding partitions with excessively overlapping clusters. Following this idea, relative entropy or divergence measures have been applied in a similar way, in particular to let that kind of entropy-based regularization also take into account, and interact with, cluster size variables. Since Rényi divergence generalizes several other divergence measures, its application in fuzzy clustering seems promising for devising more general and potentially more effective methods. However, previous works using either Rényi entropy or Rényi divergence in fuzzy clustering have, respectively, either ignored cluster sizes (thus applying regularization in terms of entropy rather than divergence) or employed divergence without a regularization purpose. The main contribution of this work is therefore the introduction of a new regularization term, based on the Rényi relative entropy between membership degrees and observation ratios per cluster, that penalizes overlapping solutions in fuzzy clustering analysis. Specifically, this Rényi divergence-based term is added to the variance-based Fuzzy C-means objective function when cluster sizes are allowed to vary. This leads to two new fuzzy clustering methods with Rényi divergence-based regularization, the second extending the first by using a Gaussian kernel metric instead of the Euclidean distance. Iterative update expressions for these methods are derived through the explicit application of Lagrange multipliers; an interesting feature of these expressions is that the proposed methods appear to exploit a greater amount of information in the updating steps for membership degrees and observation ratios per cluster. Finally, an extensive computational study shows the feasibility and comparatively good performance of the proposed methods.
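
Below is a minimal numerical sketch, for illustration only, of the kind of objective described above: a variance-based fuzzy C-means term plus a Rényi divergence penalty between membership degrees and observation ratios per cluster. The function names, the weighting parameter lam, and the order alpha are assumptions of this sketch, not the paper's notation, and the closed-form Lagrange-multiplier update steps are not reproduced here.

    import numpy as np

    def renyi_divergence(u, pi, alpha=0.5, eps=1e-12):
        # Rényi divergence D_alpha(u || pi) of one observation's membership row u
        # (non-negative, summing to 1) from the cluster-size ratios pi (also summing to 1).
        u = np.clip(u, eps, None)
        pi = np.clip(pi, eps, None)
        return np.log(np.sum(u**alpha * pi**(1.0 - alpha))) / (alpha - 1.0)

    def regularized_objective(X, V, U, pi, lam=1.0, alpha=0.5):
        # Simplified variance term of fuzzy C-means plus a Rényi divergence penalty
        # that discourages membership rows from drifting far from the cluster ratios.
        d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2)  # squared Euclidean distances
        variance_term = np.sum(U * d2)
        penalty = sum(renyi_divergence(U[i], pi, alpha) for i in range(X.shape[0]))
        return variance_term + penalty / lam

    # Toy usage: 5 observations, 2 clusters, uniform memberships and cluster ratios.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 2))
    V = rng.normal(size=(2, 2))
    U = np.full((5, 2), 0.5)
    pi = np.array([0.5, 0.5])
    print(regularized_objective(X, V, U, pi))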


2001 ◽  
Vol 93 (1-2) ◽  
pp. 1-16 ◽  
Author(s):  
Jan Beirlant ◽  
Luc Devroye ◽  
László Györfi ◽  
Igor Vajda

2010 ◽  
Vol 47 (1) ◽  
pp. 216-234 ◽  
Author(s):  
Filia Vonta ◽  
Alex Karagrigoriou

Measures of divergence or discrepancy are used either to measure the mutual information between two variables or to construct model selection criteria. In this paper we focus on divergence measures based on Csiszár's class of divergence measures. In particular, we propose a measure of divergence between the residual lives of two items that have both survived up to some time t, as well as a measure of divergence between their past lives, both based on Csiszár's class of measures. Furthermore, we derive properties of these measures and provide examples based on the Cox model and on frailty or transformation models.
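
As a hedged sketch of the residual-life construction described above (not necessarily the paper's exact definition): the lifetime densities are conditioned on survival past t, i.e. f(x)/F_bar(t) and g(x)/G_bar(t) on (t, infinity), and plugged into Csiszár's form. The function and variable names, the Kullback-Leibler generator phi(u) = u*log(u), and the exponential example are assumptions made for this illustration.

    import numpy as np
    from scipy.integrate import quad

    def csiszar_residual_divergence(f, F_bar, g, G_bar, t, phi, upper=np.inf):
        # Csiszár-type divergence between the residual lives of two items that have
        # both survived up to time t: each density is conditioned on survival past t,
        # i.e. f(x)/F_bar(t) and g(x)/G_bar(t) on (t, infinity).
        def integrand(x):
            p = f(x) / F_bar(t)
            q = g(x) / G_bar(t)
            return q * phi(p / q)
        value, _ = quad(integrand, t, upper)
        return value

    # Example with the Kullback-Leibler generator phi(u) = u*log(u) and two
    # exponential lifetimes (rates 1 and 2); by memorylessness the result equals
    # the unconditional Kullback-Leibler divergence, 1 - log(2).
    phi_kl = lambda u: u * np.log(u)
    f = lambda x: 1.0 * np.exp(-1.0 * x)
    F_bar = lambda t: np.exp(-1.0 * t)
    g = lambda x: 2.0 * np.exp(-2.0 * x)
    G_bar = lambda t: np.exp(-2.0 * t)
    print(csiszar_residual_divergence(f, F_bar, g, G_bar, t=1.0, phi=phi_kl))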


2012 ◽  
Vol 8 (1) ◽  
pp. 17-32 ◽  
Author(s):  
K. Jain ◽  
Ram Saraswat

A New Information Inequality and Its Application in Establishing Relation Among Various f-Divergence Measures

An information inequality is established in terms of Csiszár f-divergence measures by using convexity arguments and the Jensen inequality. This inequality is applied in comparing particular divergences which play a fundamental role in information theory, such as the Kullback-Leibler distance, Hellinger discrimination, chi-square distance, J-divergence and others.
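
To illustrate how the particular divergences named above arise as instances of Csiszár's f-divergence with different convex generators, here is a small discrete sketch; the generator expressions follow standard conventions (normalizations can differ between papers), and the probability vectors are invented for the example.

    import numpy as np

    def f_divergence(p, q, f):
        # Discrete Csiszár f-divergence: sum_i q_i * f(p_i / q_i),
        # where f is convex with f(1) = 0.
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return float(np.sum(q * f(p / q)))

    # Convex generators for some of the divergences compared in this line of work.
    generators = {
        "Kullback-Leibler distance": lambda u: u * np.log(u),
        "Hellinger discrimination":  lambda u: 0.5 * (np.sqrt(u) - 1.0) ** 2,
        "Chi-square distance":       lambda u: (u - 1.0) ** 2,
        "J-divergence":              lambda u: (u - 1.0) * np.log(u),
    }

    p = [0.2, 0.5, 0.3]
    q = [0.3, 0.4, 0.3]
    for name, f in generators.items():
        print(name, f_divergence(p, q, f))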


2019 ◽  
Vol 36 (4) ◽  
pp. 3195-3209 ◽  
Author(s):  
Jiubing Liu ◽  
Xianzhong Zhou ◽  
Bing Huang ◽  
Huaxiong Li ◽  
Hengrong Ju
