Kullback–Leibler divergence measure based tests concerning the biasness in a sample
2014 ◽ Vol 21 ◽ pp. 88-108
Author(s): Polychronis Economou, George Tzavelas
2007 ◽ Vol 19 (3) ◽ pp. 780-791
Author(s): Raul Kompass

This letter presents a general parametric divergence measure that includes quadratic error and Kullback-Leibler divergence as special cases. A parametric generalization of the two multiplicative update rules for nonnegative matrix factorization by Lee and Seung (2001) is shown to lead to locally optimal solutions of the nonnegative matrix factorization problem under this new cost function. Numerical simulations demonstrate that the new update rule may improve convergence speed over the quadratic-distance update. A proof of convergence is given that, as in Lee and Seung, uses an auxiliary function known from the expectation-maximization framework.
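For intuition, here is a minimal NumPy sketch of multiplicative updates for a beta-divergence-style cost, which reduces to quadratic error at beta = 2 and to Kullback-Leibler divergence at beta = 1. The function name `nmf_beta` and the `beta` parameterization are illustrative assumptions, not the letter's exact formulation.

```python
import numpy as np

def nmf_beta(V, rank, beta=1.0, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative-update NMF under a beta-divergence-style cost.

    beta = 2 corresponds to quadratic (Euclidean) error,
    beta = 1 to the Kullback-Leibler divergence.
    (Illustrative sketch; not the paper's exact update rules.)
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps

    for _ in range(n_iter):
        WH = W @ H + eps
        # Multiplicative update for H (elementwise ratio of gradient parts).
        H *= (W.T @ (WH ** (beta - 2) * V)) / (W.T @ WH ** (beta - 1) + eps)
        WH = W @ H + eps
        # Symmetric update for W.
        W *= ((WH ** (beta - 2) * V) @ H.T) / (WH ** (beta - 1) @ H.T + eps)

    return W, H

# Usage: rank-2 approximation of a small nonnegative matrix.
V = np.abs(np.random.default_rng(1).random((6, 5)))
W, H = nmf_beta(V, rank=2, beta=1.0)
print(np.linalg.norm(V - W @ H))
```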


Entropy ◽ 2012 ◽ Vol 14 (9) ◽ pp. 1606-1626
Author(s): Javier E. Contreras-Reyes, Reinaldo B. Arellano-Valle

2018 ◽ Vol 97 (1)
Author(s): Carlos Granero-Belinchón, Stéphane G. Roux, Nicolas B. Garnier

2018 ◽ Vol 18 (2) ◽ pp. 155-177
Author(s): Kaiwen Man, Jeffery R. Harring, Yunbo Ouyang, Sarah L. Thomas

Author(s): Ryan Ka Yau Lai, Youngah Do

This article explores a method for constructing confidence bounds for information-theoretic measures in linguistics, such as entropy, Kullback-Leibler divergence (KLD), and mutual information. We show that a useful measure of uncertainty can be derived from simple statistical principles, namely the asymptotic distribution of the maximum likelihood estimator (MLE) and the delta method. Three case studies from phonology and corpus linguistics demonstrate how to apply the method and examine its robustness against common violations of its assumptions in linguistic data, such as insufficient sample size and non-independence of data points.
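As a concrete illustration of the MLE-plus-delta-method recipe, the sketch below computes a Wald-type confidence interval for the plug-in entropy of a categorical sample. The helper name `entropy_ci` is hypothetical, and the asymptotic variance used is the standard delta-method result for plug-in entropy, which may differ in detail from the article's derivation.

```python
import numpy as np
from scipy import stats

def entropy_ci(counts, alpha=0.05):
    """Wald-type confidence interval for Shannon entropy (in nats) of a
    categorical distribution, using the plug-in MLE and the delta method.

    Asymptotic variance: (sum_i p_i (log p_i)^2 - H^2) / n.
    (Sketch of the general recipe; not necessarily the article's exact method.)
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts / n                        # MLE of the category probabilities
    nz = p > 0                            # treat 0 * log 0 as 0
    H = -np.sum(p[nz] * np.log(p[nz]))    # plug-in entropy estimate
    var = (np.sum(p[nz] * np.log(p[nz]) ** 2) - H ** 2) / n
    half = stats.norm.ppf(1 - alpha / 2) * np.sqrt(var)
    return H, (H - half, H + half)

# Usage: toy phoneme-frequency-like counts.
H, (lo, hi) = entropy_ci([120, 80, 40, 10])
print(f"H = {H:.3f} nats, 95% CI = ({lo:.3f}, {hi:.3f})")
```

The same delta-method step carries over to KLD and mutual information by differentiating those functionals with respect to the estimated probabilities.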

