Information Theoretic Global Measures of Dirac Equation With Morse and Trigonometric Rosen–Morse Potentials

2017, Vol. 58 (5)
Author(s): S. A. Najafizade, H. Hassanabadi, S. Zarrinkamar
2013, Vol. 58 (6), pp. 523-533
Author(s): V.M. Simulik, I.Yu. Krivsky, I.L. Lamer, …

Author(s): Ryan Ka Yau Lai, Youngah Do

This article explores a method of creating confidence bounds for information-theoretic measures in linguistics, such as entropy, Kullback–Leibler divergence (KLD), and mutual information. We show that a useful measure of uncertainty can be derived from simple statistical principles, namely the asymptotic distribution of the maximum likelihood estimator (MLE) and the delta method. Three case studies from phonology and corpus linguistics demonstrate how to apply the method and examine its robustness against common violations of its assumptions in linguistic data, such as insufficient sample size and non-independence of data points.
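
A minimal sketch of the general approach described in the abstract, assuming a plug-in (MLE) estimate of entropy from categorical counts and a normal approximation of its sampling distribution via the delta method. The function name `entropy_ci`, the 95% level, and the toy counts are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

def entropy_ci(counts, alpha=0.05):
    """Plug-in (MLE) entropy of a categorical sample with a delta-method
    confidence interval. Illustrative sketch; not the authors' code."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts / n                       # MLE of the category probabilities
    nz = p > 0                           # skip empty cells (they contribute 0)
    logp = np.log(p[nz])
    h = -np.sum(p[nz] * logp)            # plug-in entropy estimate, in nats
    # Delta method: Var(H_hat) ~ Var_p[-log p] / n
    var = (np.sum(p[nz] * logp**2) - h**2) / n
    half = norm.ppf(1 - alpha / 2) * np.sqrt(var)
    return h, (h - half, h + half)

# Hypothetical phoneme counts from a toy corpus
h, (lo, hi) = entropy_ci([120, 80, 40, 10])
print(f"H = {h:.3f} nats, 95% CI = ({lo:.3f}, {hi:.3f})")
```

The same delta-method recipe carries over to KLD and mutual information by differentiating those functionals with respect to the estimated cell probabilities, which is what makes a single asymptotic argument cover all three measures.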


Author(s): І. І. Гайсак, В. С. Морохович
