Testing of Coarsening Mechanisms: Coarsening at Random Versus Subgroup Independence
Author(s): Julia Plass, Marco E. G. V. Cattaneo, Georg Schollmeyer, Thomas Augustin

1998, Vol 30 (2), pp. 278-279
Author(s): Richard D. Gill

2008, Vol 36 (5), pp. 2409-2422
Author(s): Richard D. Gill, Peter D. Grünwald

2003, Vol 19, pp. 243-278
Author(s): P. D. Grünwald, J. Y. Halpern

As examples such as the Monty Hall puzzle show, applying conditioning to update a probability distribution on a "naive space", which does not take into account the protocol used, can often lead to counterintuitive results. Here we examine why. A criterion known as CAR ("coarsening at random") in the statistical literature characterizes when "naive" conditioning in a naive space works. We show that the CAR condition holds rather infrequently, and we provide a procedural characterization of it, by giving a randomized algorithm that generates all and only distributions for which CAR holds. This substantially extends previous characterizations of CAR. We also consider more generalized notions of update such as Jeffrey conditioning and minimizing relative entropy (MRE). We give a generalization of the CAR condition that characterizes when Jeffrey conditioning leads to appropriate answers, and show that there exist some very simple settings in which MRE essentially never gives the right results. This generalizes and interconnects previous results obtained in the literature on CAR and MRE.
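To make the Monty Hall failure of naive conditioning concrete, here is a small Python sketch (not from the paper; variable names are ours) that computes both answers exactly. In the full space, the host's protocol is modeled explicitly: the host always opens a goat door, breaking ties uniformly when the car is behind the player's door. In the "naive space" {1, 2, 3}, the observer merely rules out the opened door. The two updates disagree — exactly the CAR failure the abstract describes.

```python
from fractions import Fraction

# Full space: outcomes are (car, door_opened), with the player fixed on door 1.
# Host protocol: open a goat door other than door 1; if the car is behind
# door 1, the host opens door 2 or 3 with probability 1/2 each.
full = {}
for car in (1, 2, 3):
    p_car = Fraction(1, 3)
    if car == 1:
        full[(1, 2)] = p_car / 2
        full[(1, 3)] = p_car / 2
    else:
        # Host is forced to open the single remaining goat door.
        opened = 2 if car == 3 else 3
        full[(car, opened)] = p_car

# Protocol-aware conditioning on the observation "host opened door 3".
evidence = {k: v for k, v in full.items() if k[1] == 3}
z = sum(evidence.values())
p_car1_correct = evidence[(1, 3)] / z          # 1/3: switching wins 2/3.

# Naive conditioning in the naive space {1, 2, 3}: treat "host opened
# door 3" as the bare event "car is not behind door 3".
p_car1_naive = Fraction(1, 3) / (Fraction(1, 3) + Fraction(1, 3))  # 1/2

print(p_car1_correct, p_car1_naive)  # 1/3 1/2
```

Because the coarsening here is not at random — the event "door 3 is opened" is twice as likely when the car is behind door 2 as when it is behind door 1 — the naive update overstates the player's chances, illustrating why the CAR condition is needed for naive conditioning to be valid.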

