Automatic Detection of Answer Copying via Kullback-Leibler Divergence and K-Index

2010 ◽ Vol 34 (6) ◽ pp. 379-392 ◽ Author(s): Dmitry I. Belov, Ronald D. Armstrong
2002 ◽ Vol 39 (2) ◽ pp. 115-132 ◽ Author(s): Leonardo S. Sotaridona, Rob R. Meijer

2014 ◽ Author(s): Douglas Martin, Rachel Swainson, Gillian Slessor, Jacqui Hutchison, Diana Marosi

2013 ◽ Vol 61 (S 01) ◽ Author(s): A Van Linden, J Kempfert, J Blumenstein, H Möllmann, WK Kim, ...

Author(s): Ryan Ka Yau Lai, Youngah Do

This article explores a method for constructing confidence bounds on information-theoretic measures used in linguistics, such as entropy, Kullback-Leibler divergence (KLD), and mutual information. We show that a useful measure of uncertainty can be derived from simple statistical principles, namely the asymptotic distribution of the maximum likelihood estimator (MLE) and the delta method. Three case studies from phonology and corpus linguistics demonstrate how to apply the method and examine its robustness against common violations of its assumptions in linguistic data, such as insufficient sample size and non-independence of data points.
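As a rough illustration of the approach the abstract describes, the sketch below computes a plug-in entropy estimate from multinomial counts and a Wald-type confidence interval from the delta-method variance of the MLE. The function name, the 95% level, and the example counts are illustrative assumptions, not details taken from the article.

```python
import numpy as np
from scipy import stats

def entropy_with_ci(counts, alpha=0.05):
    """Plug-in entropy (in nats) with a delta-method Wald confidence interval.

    The MLE of the category probabilities is p_hat = counts / n, which is
    asymptotically normal with covariance (diag(p) - p p^T) / n. Applying the
    delta method to H(p) = -sum_i p_i log p_i gives the asymptotic variance
    Var(H_hat) ~= (sum_i p_i (log p_i)^2 - (sum_i p_i log p_i)^2) / n.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts / n
    nz = p > 0                       # treat 0 * log 0 as 0
    log_p = np.log(p[nz])
    h = -np.sum(p[nz] * log_p)       # plug-in entropy estimate
    var = (np.sum(p[nz] * log_p**2) - np.sum(p[nz] * log_p) ** 2) / n
    half = stats.norm.ppf(1 - alpha / 2) * np.sqrt(var)
    return h, (h - half, h + half)

# Example with made-up counts of four phoneme categories:
h, (lo, hi) = entropy_with_ci([120, 80, 40, 10])
print(f"H = {h:.3f} nats, 95% CI = ({lo:.3f}, {hi:.3f})")
```

The same recipe carries over to KLD and mutual information by differentiating the corresponding functional with respect to the cell probabilities and reusing the multinomial covariance of the MLE.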

