Contextuality-by-Default Description of Bell Tests: Contextuality as the Rule and Not as an Exception

Entropy ◽  
2021 ◽  
Vol 23 (9) ◽  
pp. 1104
Author(s):  
Marian Kupczynski

Contextuality and entanglement are valuable resources for quantum computing and quantum information. Bell inequalities are used to certify entanglement; thus, it is important to understand why and how they are violated. Quantum mechanics and behavioural sciences teach us that random variables ‘measuring’ the same content (the answer to the same yes-or-no question) may vary if ‘measured’ jointly with other random variables. Alice’s and Bob’s raw data confirm Einsteinian non-signaling, but setting-dependent experimental protocols are used to create samples of coupled pairs of distant ±1 outcomes and to estimate correlations. Marginal expectations, estimated using these final samples, depend on distant settings. Therefore, the system of random variables ‘measured’ in Bell tests is inconsistently connected, and it should be analyzed using a Contextuality-by-Default approach, which is done for the first time in this paper. The violation of Bell inequalities and inconsistent connectedness may be explained using a contextual, locally causal probabilistic model in which setting-dependent variables describing the measuring instruments are correctly incorporated. We prove that this model does not restrict experimenters’ freedom of choice, which is a prerequisite of science. Contextuality seems to be the rule and not an exception; thus, it should be carefully tested.
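As a concrete anchor for the Bell-inequality violations discussed above, the CHSH combination of singlet-state correlations E(a, b) = −cos(a − b) reaches 2√2 (Tsirelson's bound), beyond the classical bound of 2. A minimal sketch, with the optimal angle choices written out (variable names are our own):

```python
import math

# Singlet-state correlation for analyzer angles a, b (radians): E(a, b) = -cos(a - b)
def E(a, b):
    return -math.cos(a - b)

# Optimal CHSH settings for a maximal quantum violation
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination: any local (noncontextual) model is bounded by |S| <= 2
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2.828... = 2*sqrt(2), exceeding the classical bound of 2
```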


Author(s):  
Janne V. Kujala ◽  
Ehtibar N. Dzhafarov

We discuss three measures of the degree of contextuality in contextual systems of dichotomous random variables. These measures are developed within the framework of the Contextuality-by-Default (CbD) theory, and they apply to inconsistently connected systems (those in which ‘disturbance’ is allowed). For one of these measures, presented here for the first time, we construct a corresponding measure of the degree of non-contextuality in non-contextual systems. The other two CbD-based measures do not suggest ways in which the degree of non-contextuality of a non-contextual system could be quantified. We find the same to be true for the contextual fraction measure developed by Abramsky, Barbosa, and Mansfield. That measure of contextuality is confined to consistently connected systems, but CbD allows one to generalize it to arbitrary systems. This article is part of the theme issue ‘Contextuality and probability in quantum mechanics and beyond’.
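For cyclic systems of dichotomous variables, the CbD contextuality criterion is usually stated through the function s_odd, the maximum of a signed sum taken over an odd number of minus signs; as we recall from the CbD literature, a consistently connected cyclic system of rank n is noncontextual iff s_odd of its n correlations is at most n − 2 (for inconsistently connected systems the bound is adjusted by the connection discrepancies). A brute-force sketch of s_odd, checked on CHSH-type (rank-4) correlations:

```python
import math
from itertools import product

def s_odd(xs):
    # max of sum(sign_k * x_k) over all sign vectors with an odd number of -1's
    return max(sum(s * x for s, x in zip(signs, xs))
               for signs in product((1, -1), repeat=len(xs))
               if signs.count(-1) % 2 == 1)

classical = [1.0, 1.0, 1.0, 1.0]   # deterministic correlations: noncontextual
c = 1 / math.sqrt(2)
quantum = [c, c, c, -c]            # CHSH correlations at Tsirelson's bound
print(s_odd(classical))  # 2.0 <= 4 - 2: noncontextual
print(s_odd(quantum))    # 2.828... > 2: contextual
```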



Author(s):  
Sauro Succi

Chapter 32 expounded the basic theory of the quantum LB for the case of relativistic and non-relativistic wavefunctions, namely single-particle quantum mechanics. This chapter goes on to cover extensions of the quantum LB formalism to the far more challenging arena of quantum many-body problems and quantum field theory, along with an appraisal of prospective quantum computing implementations. Solving the single-particle Schrödinger, or Dirac, equation in three dimensions is a computationally demanding task. This task, however, pales in comparison with the ordeal of solving the Schrödinger equation for the quantum many-body problem, namely a collection of many quantum particles, typically nuclei and electrons in a given atom or molecule.
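The scaling argument behind that "ordeal" can be made concrete with back-of-the-envelope numbers (our own illustrative choices, not figures from the chapter): one particle on a grid with N points per dimension needs N³ complex amplitudes, but M particles need (N³)^M, exponential in the particle number.

```python
# Storage for a many-body wavefunction discretized on a 3-D grid:
# one particle needs N**3 complex amplitudes; M particles need (N**3)**M.
N = 64               # grid points per dimension (illustrative)
BYTES_PER_AMP = 16   # one complex double
for M in (1, 2, 3):
    dim = (N ** 3) ** M
    print(f"M={M}: {dim} amplitudes, {dim * BYTES_PER_AMP / 1e9:.3f} GB")
```

Two particles already require tens of gigabytes; three push into the petabyte range, which is why direct grid methods stop at very few particles.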



Author(s):  
Petr Janas ◽  
Martin Krejsa

Abstract In probabilistic tasks, input random variables are often statistically dependent. This fact should be taken into account by correct computational procedures. In the case of the newly developed Direct Optimized Probabilistic Calculation (DOProC), statistically dependent variables can be expressed by so-called multidimensional histograms, which can be used, e.g., for probabilistic calculations and reliability assessment in the software system ProbCalc.
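The idea of representing dependent inputs by a multidimensional histogram can be sketched in a few lines (an illustrative toy, not the ProbCalc implementation): build a 2-D joint histogram from correlated samples, then read event probabilities off it instead of assuming independence.

```python
import random

random.seed(1)
BINS = 20
hist = [[0] * BINS for _ in range(BINS)]  # 2-D (joint) histogram over [-4, 4]^2
n = 100_000
for _ in range(n):
    x = random.gauss(0, 1)
    y = 0.8 * x + 0.6 * random.gauss(0, 1)   # y correlated with x (rho = 0.8)
    i = min(BINS - 1, max(0, int((x + 4) / 8 * BINS)))
    j = min(BINS - 1, max(0, int((y + 4) / 8 * BINS)))
    hist[i][j] += 1

# P(x > 0 and y > 0) estimated from the joint histogram (bins above the midpoint)
p = sum(hist[i][j] for i in range(BINS // 2, BINS)
        for j in range(BINS // 2, BINS)) / n
print(round(p, 3))  # well above the 0.25 an independence assumption would give
```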



2021 ◽  
Author(s):  
Tim C Jenkins

Abstract Superposed wavefunctions in quantum mechanics lead to a squared amplitude that introduces interference into a probability density, which has long been a puzzle because interference between probability densities exists nowhere else in probability theory. In recent years, Man’ko and coauthors have successfully reconciled quantum and classical probability using a symplectic tomographic model. Nevertheless, there remains an unexplained coincidence in quantum mechanics: mathematically, the interference term in the squared amplitude of superposed wavefunctions gives the squared amplitude the form of a variance of a sum of correlated random variables. We examine whether there could be an archetypical variable behind quantum probability that provides a mathematical foundation observing both quantum and classical probability directly. The properties that would need to be satisfied for this to be the case are identified, and a generic hidden variable that satisfies them is found that would be present everywhere, transforming into a process-specific variable wherever a quantum process is active. Uncovering this variable confirms the possibility that it could be the stochastic archetype of quantum probability.
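The coincidence referred to can be displayed side by side: the interference term in the squared amplitude plays the same formal role as the covariance term in the variance of a sum of correlated random variables.

```latex
|\psi_1 + \psi_2|^2 = |\psi_1|^2 + |\psi_2|^2
  + 2\,\operatorname{Re}\bigl(\psi_1^{*}\psi_2\bigr),
\qquad
\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y)
  + 2\operatorname{Cov}(X,Y).
```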





2021 ◽  
Vol 2056 (1) ◽  
pp. 012059
Author(s):  
I N Balaba ◽  
G S Deryabina ◽  
I A Pinchuk ◽  
I V Sergeev ◽  
S B Zabelina

Abstract The article presents a historical overview of the development of the mathematical idea of the quantum computing model, a new computational strategy that is based on the postulates of quantum mechanics and has advantages over the traditional computational model based on the Turing machine. It clarifies the features of the operation of multi-qubit quantum systems that make the creation of efficient algorithms possible, outlines the principles of quantum computing, and describes a number of efficient quantum algorithms that address the exponential growth in the complexity of certain problems.
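The multi-qubit point can be illustrated with a toy state vector (a sketch of our own, not from the article): an n-qubit register carries 2^n complex amplitudes, and one Hadamard per qubit turns |0…0⟩ into a uniform superposition over all 2^n basis states, the starting point of many quantum algorithms.

```python
import math

def hadamard_all(n):
    # H applied to each qubit of |0...0> gives equal amplitude on every basis state
    amp = 1 / math.sqrt(2 ** n)
    return [amp] * (2 ** n)

state = hadamard_all(3)
print(len(state))   # 8 amplitudes for only 3 qubits
print(state[0])     # each amplitude is 1/sqrt(8)
```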



Open Physics ◽  
2017 ◽  
Vol 15 (1) ◽  
pp. 762-768
Author(s):  
Douglas G. Danforth

Abstract The general class, Λ, of Bell hidden variables is composed of two subclasses Λ_R and Λ_N such that Λ_R ∪ Λ_N = Λ and Λ_R ∩ Λ_N = {}. The class Λ_N is very large and contains random variables whose domain is the continuum, the reals. There are uncountably many reals. Every instance of a real random variable is unique: the probability of two instances being equal is exactly zero. Λ_N induces sample independence. All correlations are context dependent, but not in the usual sense; there is no “spooky action at a distance”. Random variables belonging to Λ_N are independent from one experiment to the next. The existence of the class Λ_N makes it impossible to derive any of the standard Bell inequalities used to define quantum entanglement.
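The "probability exactly zero" claim is easy to illustrate numerically (our own sketch; finite-precision floats only approximate the continuum, so distinctness here is overwhelmingly likely rather than almost sure):

```python
import random

# Repeated draws of a continuous random variable are, with probability one,
# all distinct -- here approximated by 53-bit uniform floats.
random.seed(0)
draws = [random.random() for _ in range(10_000)]
print(len(draws) == len(set(draws)))
```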



Author(s):  
Phillip Kaye ◽  
Raymond Laflamme ◽  
Michele Mosca

We assume the reader has a strong background in elementary linear algebra. In this section we familiarize the reader with the algebraic notation used in quantum mechanics, remind the reader of some basic facts about complex vector spaces, and introduce some notions that might not have been covered in an elementary linear algebra course. The linear algebra notation used in quantum computing will likely be familiar to the student of physics, but may be alien to a student of mathematics or computer science. It is the Dirac notation, which was invented by Paul Dirac and which is used often in quantum mechanics. In mathematics and physics textbooks, vectors are often distinguished from scalars by writing an arrow over the identifying symbol: e.g. a⃗. Sometimes boldface is used for this purpose: e.g. a. In the Dirac notation, the symbol identifying a vector is written inside a ‘ket’, and looks like |a⟩. We denote the dual vector for a (defined later) with a ‘bra’, written as ⟨a|. Then inner products will be written as ‘bra-kets’ (e.g. ⟨a|b⟩). We now carefully review the definitions of the main algebraic objects of interest, using the Dirac notation. The vector spaces we consider will be over the complex numbers, and are finite-dimensional, which significantly simplifies the mathematics we need. Such vector spaces are members of a class of vector spaces called Hilbert spaces. Nothing substantial is gained at this point by defining rigorously what a Hilbert space is, but virtually all the quantum computing literature refers to a finite-dimensional complex vector space by the name ‘Hilbert space’, and so we will follow this convention. We will use H to denote such a space. Since H is finite-dimensional, we can choose a basis and alternatively represent vectors (kets) in this basis as finite column vectors, and represent operators with finite matrices.
As you see in Section 3, the Hilbert spaces of interest for quantum computing will typically have dimension 2^n, for some positive integer n. This is because, as with classical information, we will construct larger state spaces by concatenating a string of smaller systems, usually of size two.
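The bra-ket correspondence above can be sketched in plain Python, with kets as lists of complex amplitudes (the helper names `bra` and `braket` are our own, chosen for illustration):

```python
# Kets as column vectors over C; the bra is the conjugate of the ket, and
# <a|b> is the standard complex inner product.
def bra(ket):
    return [z.conjugate() for z in ket]

def braket(a, b):
    # <a|b> = sum_i conj(a_i) * b_i
    return sum(x * y for x, y in zip(bra(a), b))

zero = [1 + 0j, 0 + 0j]                            # |0>
plus = [(1 + 0j) / 2 ** 0.5, (1 + 0j) / 2 ** 0.5]  # |+> = (|0> + |1>)/sqrt(2)

print(braket(zero, plus))  # overlap of |0> with |+>: 1/sqrt(2)
print(braket(plus, plus))  # ~1: |+> is normalized
```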



Author(s):  
Arthur Fine

Bell’s theorem is concerned with the outcomes of a special type of ‘correlation experiment’ in quantum mechanics. It shows that under certain conditions these outcomes would be restricted by a system of inequalities (the ‘Bell inequalities’) that contradict the predictions of quantum mechanics. Various experimental tests confirm the quantum predictions to a high degree and hence violate the Bell inequalities. Although these tests contain loopholes due to experimental inefficiencies, they do suggest that the assumptions behind the Bell inequalities are incompatible not only with quantum theory but also with nature. A central assumption used to derive the Bell inequalities is a species of no-action-at-a-distance, called ‘locality’: roughly, that the outcomes in one wing of the experiment cannot immediately be affected by measurements performed in another wing (spatially distant from the first). For this reason the Bell theorem is sometimes cited as showing that locality is incompatible with the quantum theory, and the experimental tests as demonstrating that nature is nonlocal. These claims have been contested.



Entropy ◽  
2019 ◽  
Vol 21 (8) ◽  
pp. 768 ◽  
Author(s):  
Francesco De Martini ◽  
Fabio Sciarrino

Quantum teleportation is one of the most striking consequences of quantum mechanics and is defined as the transmission and reconstruction of an unknown quantum state over arbitrary distances. The concept was introduced in 1993 by Charles Bennett and coworkers and has since been experimentally demonstrated by several groups under different conditions of distance and number of particles, and even with feed-forward. Twenty years after its first realization, this contribution reviews the experimental implementations realized at the Quantum Optics Group of the University of Rome La Sapienza.
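The protocol itself fits in a few lines of plain Python (a toy three-qubit state-vector simulator of our own, not the optical implementations reviewed here): qubit 0 holds the unknown state, qubits 1 and 2 share a Bell pair, and after Alice's Bell-basis measurement Bob applies X and/or Z corrections.

```python
import math

H = [[1 / math.sqrt(2), 1 / math.sqrt(2)], [1 / math.sqrt(2), -1 / math.sqrt(2)]]
X = [[0, 1], [1, 0]]
Z = [[1, 0], [0, -1]]

def apply_1q(state, gate, target, n=3):
    # apply a 2x2 gate to qubit `target` (qubit 0 = most significant bit)
    pos = n - 1 - target
    out = [0j] * len(state)
    for i, amp in enumerate(state):
        bit = (i >> pos) & 1
        for new_bit in (0, 1):
            j = (i & ~(1 << pos)) | (new_bit << pos)
            out[j] += gate[new_bit][bit] * amp
    return out

def apply_cnot(state, control, target, n=3):
    # flip `target` on the basis states where `control` is 1
    out = list(state)
    for i in range(len(state)):
        if (i >> (n - 1 - control)) & 1:
            j = i ^ (1 << (n - 1 - target))
            if i < j:
                out[i], out[j] = state[j], state[i]
    return out

psi = [0.6 + 0j, 0.8j]           # unknown state to teleport
state = [0j] * 8
state[0], state[4] = psi[0], psi[1]   # |psi> on qubit 0, |00> on qubits 1,2
state = apply_1q(state, H, 1)          # Bell pair on qubits 1,2
state = apply_cnot(state, 1, 2)
state = apply_cnot(state, 0, 1)        # rotate qubits 0,1 into the Bell basis
state = apply_1q(state, H, 0)

bobs = []
for m0 in (0, 1):                      # every measurement outcome branch
    for m1 in (0, 1):
        branch = [a if ((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1 else 0j
                  for i, a in enumerate(state)]
        norm = math.sqrt(sum(abs(a) ** 2 for a in branch))
        branch = [a / norm for a in branch]
        if m1:
            branch = apply_1q(branch, X, 2)   # classically controlled corrections
        if m0:
            branch = apply_1q(branch, Z, 2)
        bobs.append([branch[(m0 << 2) | (m1 << 1) | b] for b in (0, 1)])
# every outcome branch leaves Bob holding psi
```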


