On the Convexity of Some Divergence Measures Based on Entropy Functions

Author(s):  
J. Burbea ◽  
C. Radhakrishna Rao

2016 ◽  
Vol 11 (2) ◽  
pp. 205-209
Author(s):  
D.T. Siraeva

An invariant submodel of rank 2 on the subalgebra consisting of the sum of translations is presented for the hydrodynamic equations with an equation of state in which the pressure is the sum of a density function and an entropy function. In terms of Lagrangian coordinates, solutions depending on four essential constants are obtained from the nonhyperbolicity condition of the submodel. For simplicity, we consider a solution depending on two constants. The trajectories of particle motion and the motion of a parallelepiped of the same particles are studied using Maple.


2018 ◽  
Vol 13 (3) ◽  
pp. 59-63 ◽  
Author(s):  
D.T. Siraeva

Equations of hydrodynamic type with an equation of state in which the pressure separates into a sum of a density function and an entropy function are considered. Such a system of equations admits a twelve-dimensional Lie algebra, whereas the gas dynamics equations with an equation of state of general form admit an eleven-dimensional Lie algebra. For both Lie algebras, optimal systems of non-similar subalgebras have been constructed. In this paper, two partially invariant submodels of rank 3 and defect 1 are constructed for two-dimensional subalgebras of the twelve-dimensional Lie algebra. The reduction of the constructed submodels to invariant submodels of the eleven-dimensional and twelve-dimensional Lie algebras is proved.
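Both Siraeva entries above concern the same class of models; as a sketch (the function names f and h are illustrative, since the abstracts do not fix a notation), the equation of state in question has the form

```latex
p = f(\rho) + h(S),
```

where \(p\) is the pressure, \(\rho\) the density and \(S\) the entropy. It is this additive splitting that enlarges the admitted Lie algebra from eleven dimensions (general equation of state) to twelve.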


Electronics ◽  
2021 ◽  
Vol 10 (6) ◽  
pp. 657
Author(s):  
Krzysztof Gajowniczek ◽  
Tomasz Ząbkowski

This paper presents two R packages, ImbTreeEntropy and ImbTreeAUC, for handling imbalanced data problems. ImbTreeEntropy applies generalized entropy functions, such as Rényi, Tsallis, Sharma–Mittal, Sharma–Taneja and Kapur, to measure the impurity of a node. ImbTreeAUC provides non-standard measures for choosing the optimal split point for an attribute (as well as the optimal attribute for splitting) by employing local, semi-global and global AUC (Area Under the ROC Curve) measures. Both packages are applicable to binary and multiclass problems, and they support cost-sensitive learning, by defining a misclassification cost matrix, as well as weight-sensitive learning. The packages accept all types of attributes, including continuous, ordered and nominal, where the latter type is simplified for multiclass problems to reduce the computational overhead. Both applications enable optimization of the thresholds at which posterior probabilities determine final class labels, so that misclassification costs are minimized. Model overfitting can be managed either during the growing phase or at the end using post-pruning. The packages are implemented mainly in R; however, some computationally demanding functions are written in plain C++. To speed up learning, parallel processing is supported as well.
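As a minimal sketch of the generalized entropies involved (standard formulas only — the function names below are illustrative, not the ImbTreeEntropy API), the Rényi and Tsallis entropies of a node's class distribution can be computed as:

```python
import math

def shannon_entropy(p):
    """Classical Shannon entropy, recovered from both families as q -> 1."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, q):
    """Renyi entropy of order q (q > 0, q != 1) of a discrete distribution p."""
    return math.log(sum(pi ** q for pi in p if pi > 0)) / (1.0 - q)

def tsallis_entropy(p, q):
    """Tsallis entropy of order q (q != 1) of a discrete distribution p."""
    return (1.0 - sum(pi ** q for pi in p if pi > 0)) / (q - 1.0)

# A pure node has zero impurity under every measure, a balanced node maximal;
# tuning q changes how strongly rare classes contribute, which is the lever
# such impurity measures offer for imbalanced data.
pure, balanced = [1.0, 0.0], [0.5, 0.5]
print(renyi_entropy(pure, 2.0), renyi_entropy(balanced, 2.0))      # 0 vs log 2
print(tsallis_entropy(pure, 2.0), tsallis_entropy(balanced, 2.0))  # 0 vs 0.5
```

Both families reduce to the Shannon entropy in the limit q → 1, so q = 1 must be handled as a special case in practice.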


Mathematics ◽  
2021 ◽  
Vol 9 (12) ◽  
pp. 1423
Author(s):  
Javier Bonilla ◽  
Daniel Vélez ◽  
Javier Montero ◽  
J. Tinguaro Rodríguez

In the last two decades, information entropy measures have been widely applied in fuzzy clustering problems in order to regularize solutions by avoiding the formation of partitions with excessively overlapping clusters. Following this idea, relative entropy or divergence measures have been similarly applied, particularly to enable that kind of entropy-based regularization to also take into account, and interact with, cluster size variables. In particular, since the Rényi divergence generalizes several other divergence measures, its application in fuzzy clustering seems promising for devising more general and potentially more effective methods. However, previous works making use of either Rényi entropy or Rényi divergence in fuzzy clustering have, respectively, either not considered cluster sizes (thus applying regularization in terms of entropy, not divergence) or employed divergence without a regularization purpose. The main contribution of this work is therefore the introduction of a new regularization term, based on the Rényi relative entropy between membership degrees and observation ratios per cluster, that penalizes overlapping solutions in fuzzy clustering analysis. Specifically, this Rényi divergence-based term is added to the variance-based Fuzzy C-means objective function when cluster sizes are allowed to vary. This leads to two new fuzzy clustering methods exhibiting Rényi divergence-based regularization, the second extending the first by considering a Gaussian kernel metric instead of the Euclidean distance. Iterative update expressions for these methods are derived through the explicit application of Lagrange multipliers. An interesting feature of these expressions is that the proposed methods seem to take advantage of a greater amount of information in the updating steps for membership degrees and observation ratios per cluster. Finally, an extensive computational study is presented, showing the feasibility and comparatively good performance of the proposed methods.
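The Rényi divergence underlying the proposed regularization term has a short discrete form; the sketch below (standard definition only, not the authors' full objective function) evaluates it between a vector of mean membership degrees and the observation ratios per cluster:

```python
import math

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(p || q) of order alpha (alpha > 0, alpha != 1)
    between discrete distributions on the same support (q strictly positive)."""
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1.0)

# Memberships matching the observation ratios incur no penalty; concentrating
# memberships away from the observed ratios is increasingly penalized, which is
# how a divergence-based regularizer discourages degenerate, overlapping clusters.
ratios = [0.5, 0.5]
print(renyi_divergence([0.5, 0.5], ratios, 2.0))  # 0.0
print(renyi_divergence([0.9, 0.1], ratios, 2.0))  # > 0
```

As alpha → 1 this recovers the Kullback-Leibler divergence, consistent with the claim that the Rényi divergence generalizes several other divergence measures.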


2001 ◽  
Vol 93 (1-2) ◽  
pp. 1-16 ◽  
Author(s):  
Jan Beirlant ◽  
Luc Devroye ◽  
László Györfi ◽  
Igor Vajda

1995 ◽  
Vol 138 (1-3) ◽  
pp. 319-326
Author(s):  
A. Meir ◽  
J.W. Moon

2010 ◽  
Vol 47 (1) ◽  
pp. 216-234 ◽  
Author(s):  
Filia Vonta ◽  
Alex Karagrigoriou

Measures of divergence or discrepancy are used either to measure mutual information concerning two variables or to construct model selection criteria. In this paper we focus on divergence measures based on the class of measures known as Csiszár's divergence measures. In particular, we propose a measure of divergence between the residual lives of two items that have both survived up to some time t, as well as a measure of divergence between their past lives, both based on Csiszár's class of measures. Furthermore, we derive properties of these measures and provide examples based on the Cox model and on frailty or transformation models.
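For orientation (notation mine, not the authors'): Csiszár's class of divergence measures between densities \(f\) and \(g\) is defined, for a convex function \(\varphi\) with \(\varphi(1)=0\), by

```latex
D_\varphi(f, g) = \int g(x)\, \varphi\!\left( \frac{f(x)}{g(x)} \right) dx .
```

A divergence between residual lives of the kind described above would replace the densities by their versions conditioned on survival past \(t\), i.e. \(f(x)/\bar F(t)\) and \(g(x)/\bar G(t)\) on \(x > t\), with \(\bar F, \bar G\) the survival functions; the exact normalization used in the paper may differ.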


Kybernetes ◽  
1995 ◽  
Vol 24 (2) ◽  
pp. 15-28
Author(s):  
L. Pardo ◽  
D. Morales ◽  
I.J. Taneja

2012 ◽  
Vol 8 (1) ◽  
pp. 17-32 ◽  
Author(s):  
K. Jain ◽  
Ram Saraswat

A New Information Inequality and Its Application in Establishing Relation Among Various f-Divergence Measures

An information inequality is established in terms of Csiszár's f-divergence measures by using convexity arguments and the Jensen inequality. The inequality is applied to compare particular divergences which play a fundamental role in information theory, such as the Kullback–Leibler distance, Hellinger discrimination, chi-square distance, J-divergence and others.
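Each of the particular divergences named above arises from Csiszár's f-divergence by choosing a specific convex generator; a minimal sketch (conventions for constant factors, e.g. in the Hellinger discrimination, vary across the literature):

```python
import math

def f_divergence(p, q, f):
    """Csiszar f-divergence D_f(p || q) = sum_i q_i * f(p_i / q_i), for discrete
    distributions with q_i > 0 and a convex f satisfying f(1) = 0."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Particular divergences as instances of the same functional:
kl        = lambda p, q: f_divergence(p, q, lambda t: t * math.log(t) if t > 0 else 0.0)
chi2      = lambda p, q: f_divergence(p, q, lambda t: (t - 1.0) ** 2)
hellinger = lambda p, q: f_divergence(p, q, lambda t: (math.sqrt(t) - 1.0) ** 2)

p, q = [0.5, 0.5], [0.25, 0.75]
print(kl(p, q), chi2(p, q), hellinger(p, q))  # all zero iff p == q
```

The inequality in the abstract compares exactly such instances through the common functional form, using the convexity of each generator f.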

