A Study on Weighted Doubly Truncated Renyi Divergence

Author(s):  
Rajesh Moharana ◽  
Suchandan Kayal

Mathematics ◽  
2021 ◽  
Vol 9 (12) ◽  
pp. 1423
Author(s):  
Javier Bonilla ◽  
Daniel Vélez ◽  
Javier Montero ◽  
J. Tinguaro Rodríguez

In the last two decades, information entropy measures have been widely applied in fuzzy clustering problems to regularize solutions by avoiding the formation of partitions with excessively overlapping clusters. Following this idea, relative entropy or divergence measures have been applied similarly, in particular to enable that kind of entropy-based regularization to also take into account, and interact with, cluster size variables. Since Rényi divergence generalizes several other divergence measures, its application in fuzzy clustering seems promising for devising more general and potentially more effective methods. However, previous works using either Rényi entropy or Rényi divergence in fuzzy clustering have, respectively, not considered cluster sizes (thus applying regularization in terms of entropy, not divergence) or employed divergence without a regularization purpose. The main contribution of this work is therefore the introduction of a new regularization term, based on the Rényi relative entropy between membership degrees and observation ratios per cluster, to penalize overlapping solutions in fuzzy clustering analysis. Specifically, this Rényi divergence-based term is added to the variance-based Fuzzy C-means objective function when cluster sizes are allowed to vary. This leads to two new fuzzy clustering methods exhibiting Rényi divergence-based regularization, the second extending the first by considering a Gaussian kernel metric instead of the Euclidean distance. Iterative expressions for both methods are derived through the explicit application of Lagrange multipliers. An interesting feature of these expressions is that the proposed methods appear to exploit a greater amount of information in the updating steps for membership degrees and observation ratios per cluster. Finally, an extensive computational study shows the feasibility and comparatively good performance of the proposed methods.
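The regularization idea described above can be sketched in a few lines. The following is a minimal illustration, not the paper's actual objective: `renyi_divergence` implements the standard order-α Rényi divergence, and the membership matrix `U` and observation ratios `r` are hypothetical numbers chosen only to show how a divergence-based penalty between cluster proportions could be evaluated.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(p || q) = log(sum p^alpha * q^(1-alpha)) / (alpha - 1).

    Reduces to the Kullback-Leibler divergence as alpha -> 1.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

# Hypothetical fuzzy partition: 3 observations, 2 clusters.
U = np.array([[0.9, 0.1],
              [0.8, 0.2],
              [0.2, 0.8]])          # membership degrees (rows sum to 1)
pi = U.mean(axis=0)                 # average membership per cluster
r = np.array([2.0 / 3.0, 1.0 / 3.0])  # observation ratios per cluster

# Penalty is zero when average memberships match the cluster ratios,
# and positive otherwise; it could be added to a clustering objective.
penalty = renyi_divergence(pi, r, alpha=0.5)
```

The penalty vanishes exactly when the average membership degrees coincide with the observation ratios, which is the sense in which such a term discourages excessively overlapping partitions.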


2020 ◽  
Vol 9 (3) ◽  
pp. 613-631
Author(s):  
Khuram Ali Khan ◽  
Tasadduq Niaz ◽  
Đilda Pečarić ◽  
Josip Pečarić

Abstract In this work, some new functionals of Jensen-type inequalities are constructed using Shannon entropy, f-divergence, and Rényi divergence, and some estimates are obtained for these new functionals. Using the Zipf–Mandelbrot law and the hybrid Zipf–Mandelbrot law, we also investigate bounds for these new functionals. Furthermore, we generalize these new functionals for m-convex functions using Lidstone polynomials.
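For readers unfamiliar with the distributions mentioned above, here is a small sketch of the standard Zipf–Mandelbrot law and the Shannon entropy evaluated on it. The parameter values (`N=100`, `q=2.7`, `s=1.0`) are arbitrary illustrative choices, not values used in the paper.

```python
import math

def zipf_mandelbrot_pmf(N, q, s):
    """Zipf-Mandelbrot probabilities p_k = (k + q)^(-s) / H for k = 1..N,
    where H = sum_{j=1}^{N} (j + q)^(-s) is the normalizing constant."""
    H = sum((j + q) ** (-s) for j in range(1, N + 1))
    return [(k + q) ** (-s) / H for k in range(1, N + 1)]

def shannon_entropy(p):
    """Shannon entropy in nats: -sum p_k log p_k."""
    return -sum(pk * math.log(pk) for pk in p)

# Illustrative parameters only (assumptions, not from the paper).
p = zipf_mandelbrot_pmf(N=100, q=2.7, s=1.0)
H_p = shannon_entropy(p)
```

Since the law is strictly decreasing in the rank k, its entropy is strictly below the uniform maximum log N, which is the kind of gap the paper's Jensen-type functionals bound.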


2015 ◽  
Vol 3 (1) ◽  
pp. 18-33 ◽  
Author(s):  
Rami Atar ◽  
Kenny Chowdhary ◽  
Paul Dupuis

Entropy ◽  
2019 ◽  
Vol 21 (8) ◽  
pp. 778 ◽  
Author(s):  
Amos Lapidoth ◽  
Christoph Pfister

Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon’s mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
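The reduction to mutual information at α = 1 can be checked numerically. The sketch below evaluates the Rényi divergence between a hypothetical joint distribution and the product of its marginals; note the paper's first measure minimizes over all product distributions, so fixing the marginals as done here gives only an illustrative upper bound on that measure, not its exact definition.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Order-alpha Renyi divergence between discrete distributions,
    reducing to the Kullback-Leibler divergence as alpha -> 1."""
    p = np.asarray(p, dtype=float).ravel()
    q = np.asarray(q, dtype=float).ravel()
    if np.isclose(alpha, 1.0):
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

# Hypothetical joint distribution of two dependent binary variables.
P_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
P_x = P_xy.sum(axis=1)        # marginal of X
P_y = P_xy.sum(axis=0)        # marginal of Y
prod = np.outer(P_x, P_y)     # product of the marginals

mi = renyi_divergence(P_xy, prod, 1.0)      # Shannon mutual information
d_half = renyi_divergence(P_xy, prod, 0.5)  # order 1/2
d_two = renyi_divergence(P_xy, prod, 2.0)   # order 2
```

Because Rényi divergence is nondecreasing in its order, the three values satisfy d_half ≤ mi ≤ d_two, with the middle value equal to Shannon's mutual information.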

