Contextuality in canonical systems of random variables

Author(s):  
Ehtibar N. Dzhafarov ◽  
Víctor H. Cervantes ◽  
Janne V. Kujala

Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue ‘Second quantum revolution: foundational questions’.
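For two binary random variables X ~ Bernoulli(p) and Y ~ Bernoulli(q), the maximal coupling is the joint distribution maximizing P(X = Y), and the maximal coincidence probability is min(p, q) + min(1 − p, 1 − q), i.e. one minus the total variation distance. A minimal sketch of this standard formula (the function name is ours, not from the paper):

```python
def max_coupling_coincidence(p: float, q: float) -> float:
    """Largest achievable P(X = Y) over all couplings of
    X ~ Bernoulli(p) and Y ~ Bernoulli(q): match the two variables
    on as much probability mass as possible in each state."""
    return min(p, q) + min(1 - p, 1 - q)

# Identically distributed variables can be coupled to coincide surely;
# the coincidence probability drops as the marginals pull apart.
print(max_coupling_coincidence(0.5, 0.5))   # 1.0
print(max_coupling_coincidence(0.7, 0.4))   # ≈ 0.7
```

Contextuality of the canonical system is then a question of whether these pairwise maximal couplings can coexist with the observed within-context joint distributions.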

1978 ◽  
Vol 15 (3) ◽ 
pp. 639-644 ◽  
Author(s):  
Peter Hall

Let Xn1 ≦ Xn2 ≦ ··· ≦ Xnn denote the order statistics from a sample of n independent, identically distributed random variables, and suppose that the variables Xnn, Xn,n−1, ···, when suitably normalized, have a non-trivial limiting joint distribution ξ1, ξ2, ···, as n → ∞. It is well known that the limiting distribution must be one of just three types. We provide a canonical representation of the stochastic process {ξn, n ≧ 1} in terms of exponential variables, and use this representation to obtain limit theorems for ξn as n → ∞.
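One concrete instance of such an exponential representation, taken here for the Gumbel type only (an illustrative special case, not the paper's general result), writes the limiting upper order statistics as ξk = −log(E1 + ··· + Ek) with E1, E2, … i.i.d. standard exponentials, so the ξk form a decreasing sequence:

```python
import math
import random

def limiting_upper_order_statistics(k: int, seed: int = 0) -> list:
    """Simulate (xi_1, ..., xi_k) via the exponential representation
    xi_j = -log(E_1 + ... + E_j), with E_i i.i.d. Exp(1); the partial
    sums are the arrival times of a unit-rate Poisson process."""
    rng = random.Random(seed)
    gamma, xis = 0.0, []
    for _ in range(k):
        gamma += rng.expovariate(1.0)  # next Poisson arrival time
        xis.append(-math.log(gamma))
    return xis

xis = limiting_upper_order_statistics(5)
# xi_1 > xi_2 > ... > xi_5, since the partial sums strictly increase
```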


2012 ◽  
Vol 49 (3) ◽  
pp. 758-772 ◽  
Author(s):  
Fred W. Huffer ◽  
Jayaram Sethuraman

An infinite sequence (Y1, Y2,…) of independent Bernoulli random variables with P(Yi = 1) = a / (a + b + i - 1), i = 1, 2,…, where a > 0 and b ≥ 0, will be called a Bern(a, b) sequence. Consider the counts Z1, Z2, Z3,… of occurrences of patterns or strings of the form {11}, {101}, {1001},…, respectively, in this sequence. The joint distribution of the counts Z1, Z2,… in the infinite Bern(a, b) sequence has been studied extensively. The counts from the initial finite sequence (Y1, Y2,…, Yn) have been studied by Holst (2007), (2008b), who obtained the joint factorial moments for Bern(a, 0) and the factorial moments of Z1, the count of the string {1, 1}, for a general Bern(a, b). We consider stopping the Bernoulli sequence at a random time and describe the joint distribution of counts, which extends Holst's results. We show that the joint distribution of counts from a class of randomly stopped Bernoulli sequences possesses the mixture of independent Poissons property: there is a random vector conditioned on which the counts are independent Poissons. To obtain these results, we extend the conditional marked Poisson process technique introduced in Huffer, Sethuraman and Sethuraman (2009). Our results avoid previous combinatorial and induction methods which generally only yield factorial moments.
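A simulation sketch of the finite setting (the function names and the truncation parameter are illustrative, not from the paper): generate (Y1, …, Yn) with P(Yi = 1) = a/(a + b + i − 1), then count the strings {11}, {101}, {1001}, … from the gaps between successive ones:

```python
import random

def bern_ab(a: float, b: float, n: int, seed: int = 0) -> list:
    """Simulate Y_1, ..., Y_n with P(Y_i = 1) = a / (a + b + i - 1)."""
    rng = random.Random(seed)
    return [int(rng.random() < a / (a + b + i - 1)) for i in range(1, n + 1)]

def pattern_counts(ys: list, max_gap: int) -> list:
    """counts[k-1] = Z_k = number of occurrences of the string
    1 0^(k-1) 1, so Z_1 counts {11}, Z_2 counts {101}, and so on."""
    counts = [0] * max_gap
    ones = [i for i, y in enumerate(ys) if y == 1]
    for u, v in zip(ones, ones[1:]):     # successive ones; between them are 0s
        gap = v - u                      # gap 1 -> {11}, gap 2 -> {101}, ...
        if gap <= max_gap:
            counts[gap - 1] += 1
    return counts

print(pattern_counts([1, 1, 0, 1, 0, 0, 1], 3))  # [1, 1, 1]
```

Stopping the sequence at a random time, as in the paper, amounts to replacing the fixed n above by a random variable.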


1972 ◽  
Vol 31 (1) ◽  
pp. 131-140 ◽  
Author(s):  
Donald W. Zimmerman

The concepts of random error and reliability of measurements that are familiar in traditional theories based on the notions of “true values” and “errors” can be represented by a probability model having a simpler formal structure and fewer special assumptions about random sampling and independence of measurements. In this model formulas that relate observable events are derived from probability axioms and from primitive terms that refer to observable events, without an intermediate structure containing variances and correlations of “true” and “error” components of scores. While more economical in language and formalism, the model at the same time is more general than classical theories and applies to stochastic processes in which joint distributions of many dependent random variables are of interest. In addition, it clarifies some long-standing problems concerning “experimental independence” of measurements and the relation of sampling of individuals to sampling of measurements.


1958 ◽  
Vol 10 ◽  
pp. 222-229 ◽  
Author(s):  
J. R. Blum ◽  
H. Chernoff ◽  
M. Rosenblatt ◽  
H. Teicher

Let {Xn} (n = 1, 2, …) be a stochastic process. The random variables comprising it, or the process itself, will be said to be interchangeable if, for any choice of distinct positive integers i1, i2, i3, …, ik, the joint distribution of Xi1, Xi2, …, Xik depends merely on k and is independent of the integers i1, i2, …, ik. It was shown by De Finetti (3) that the probability measure for any interchangeable process is a mixture of probability measures of processes each consisting of independent and identically distributed random variables.
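De Finetti's representation can be sketched directly for 0/1 sequences: draw a latent parameter p from a mixing distribution, then generate the Xi as i.i.d. Bernoulli(p) given p. The Uniform(0, 1) mixing measure below is an arbitrary illustrative choice:

```python
import random

def exchangeable_bernoulli(n: int, seed: int = 0) -> list:
    """Generate an exchangeable 0/1 sequence as a mixture of i.i.d.
    sequences: conditionally on the latent p, the X_i are independent
    Bernoulli(p), so every permutation of an outcome is equally likely."""
    rng = random.Random(seed)
    p = rng.random()                 # mixing measure: Uniform(0, 1)
    return [int(rng.random() < p) for _ in range(n)]
```

Marginally the Xi are dependent (observing many ones makes a large p, and hence another one, more likely), yet exchangeability holds because the dependence flows entirely through p.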


2012 ◽  
Vol 49 (3) ◽  
pp. 895-900 ◽ 
Author(s):  
Sheldon M. Ross

We find the joint distribution of the lengths of the shortest paths from a specified node to all other nodes in a network in which the edge lengths are assumed to be independent heterogeneous exponential random variables. We also give an efficient way to simulate these lengths that requires only one generated exponential per node, as well as efficient procedures to use the simulated data to estimate quantities of the joint distribution.
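A sketch of one reading of this construction (variable names are ours): with edge (u, v) exponential with rate λuv, the memoryless property makes the growth of the reached set Markovian, so each new node joins after an additional Exponential(total crossing rate) distance, at the cost of one generated exponential per node:

```python
import random

def simulate_shortest_paths(rates, source=0, seed=0):
    """Simulate the joint shortest-path lengths from `source` when edge
    (u, v) has an independent Exponential(rates[u][v]) length; the network
    is assumed complete with rates[u][v] > 0 for u != v.

    By memorylessness, given the set of reached nodes, the additional
    distance until the next node is reached is Exponential with rate equal
    to the total rate of the crossing edges, and the entered node is chosen
    with probability proportional to its crossing rate."""
    rng = random.Random(seed)
    n = len(rates)
    dist = {source: 0.0}
    while len(dist) < n:
        crossing = [(v, rates[u][v]) for u in dist
                    for v in range(n) if v not in dist]
        total = sum(r for _, r in crossing)
        t = max(dist.values()) + rng.expovariate(total)
        x = rng.random() * total          # pick a crossing edge by its rate
        for v, r in crossing:
            x -= r
            if x <= 0:
                dist[v] = t
                break
    return dist
```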


1990 ◽  
Vol 33 (1) ◽  
pp. 24-28 ◽  
Author(s):  
Y. H. Wang

In this paper, we consolidate two separate problems into one: dependent random variables with independent subsets, and construction of a joint distribution with given marginals. Let N = {1, 2, 3, …} and X = {Xn; n ∊ N} be a sequence of random variables with nondegenerate one-dimensional marginal distributions {Fn; n ∊ N}. An example is constructed to show that there exists a sequence of random variables Y = {Yn; n ∊ N} such that the components of a subset of Y are independent if and only if its size is ≦ k, where k ≧ 2 is a prefixed integer. Furthermore, the one-dimensional marginal distributions of Y are those of X.
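A classic finite illustration of the k = 2 case of this phenomenon (not the paper's construction): with X, Y fair coins and Z = X XOR Y, every subset of size ≦ 2 is independent, yet the full triple is not:

```python
from itertools import product

# The four equally likely outcomes (X, Y, X XOR Y).
outcomes = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

def prob(event) -> float:
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

# Every pair is independent: P(component i = 1 and component j = 1)
# equals 1/4 = (1/2)(1/2) for each of the three pairs.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    assert prob(lambda o: o[i] == 1 and o[j] == 1) == 0.25

# But the triple is dependent: P(X = Y = Z = 1) = 0, not 1/8.
assert prob(lambda o: o == (1, 1, 1)) == 0.0
```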

