Infinite Arms Bandit: Optimality via Confidence Bounds

2022
Author(s): Hock Peng Chan, Shouri Hu
Author(s): Ryan Ka Yau Lai, Youngah Do

This article explores a method for constructing confidence bounds for information-theoretic measures in linguistics, such as entropy, Kullback-Leibler divergence (KLD), and mutual information. We show that a useful measure of uncertainty can be derived from simple statistical principles, namely the asymptotic distribution of the maximum likelihood estimator (MLE) and the delta method. Three case studies from phonology and corpus linguistics demonstrate how to apply the method and examine its robustness to common violations of its assumptions in linguistic data, such as small sample sizes and non-independence of data points.
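As a minimal sketch of the approach the abstract describes, assuming multinomial sampling and a Python/NumPy environment, the snippet below builds a normal-approximation confidence interval for the plug-in (MLE) entropy of a categorical sample using the delta method. The function name entropy_ci and the example counts are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np
from scipy import stats

def entropy_ci(counts, alpha=0.05):
    """Delta-method confidence interval for the plug-in (MLE) entropy.

    Assumes the counts arise from i.i.d. multinomial sampling.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts / n                     # MLE of the category probabilities
    p = p[p > 0]                       # zero counts contribute nothing to H
    logp = np.log(p)
    h = -np.sum(p * logp)              # plug-in entropy estimate (nats)
    # Delta method: Var(H_hat) ~= Var_p(log p) / n under multinomial sampling
    var = (np.sum(p * logp**2) - np.sum(p * logp)**2) / n
    z = stats.norm.ppf(1 - alpha / 2)  # normal quantile from MLE asymptotics
    half = z * np.sqrt(var)
    return h, (h - half, h + half)

# Hypothetical example: 100 tokens spread over three categories
h, (lo, hi) = entropy_ci([50, 30, 20])
print(f"H = {h:.3f} nats, 95% CI = ({lo:.3f}, {hi:.3f})")
```

KLD and mutual information admit the same treatment in principle: differentiate the measure with respect to the cell probabilities and propagate the multinomial covariance through that gradient.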

