EXTENSIONS OF QUANTUM THEORY CANONICALLY ASSOCIATED TO CLASSICAL PROBABILITY MEASURES

Author(s):  
Luigi Accardi ◽  
Kamil Szpojankowski

In this paper we study characterizations of probability measures in free probability. By constancy of regressions of the random variable 𝕍^{1/2}(𝕀 − 𝕌)𝕍^{1/2} given 𝕍^{1/2}𝕌𝕍^{1/2}, where 𝕌 and 𝕍 are free, we characterize the free Poisson and free binomial distributions. Our paper is a free probability analogue of results known in classical probability (Ref. 3), where the gamma and beta distributions are characterized by constancy of 𝔼((V(1 − U))^i | UV) for i ∈ {−2, −1, 1, 2}. This paper, together with previous results (Ref. 18), exhausts all cases of characterizations from Ref. 3.
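The classical characterization alluded to above rests on the beta–gamma algebra: if U ~ Beta(a, b) and V ~ Gamma(a+b) are independent, then UV ~ Gamma(a) and V(1 − U) ~ Gamma(b) are independent, so the conditional expectations 𝔼((V(1 − U))^i | UV) are automatically constant. A minimal Monte Carlo illustration of this direct implication (assuming NumPy; the parameter values a, b are illustrative, not from the paper):

```python
import numpy as np

# Numerical illustration of the classical beta-gamma algebra behind the
# characterizations in Ref. 3: with U ~ Beta(a, b) and V ~ Gamma(a+b)
# independent, UV ~ Gamma(a) and V(1-U) ~ Gamma(b) are independent,
# so E((V(1-U))^i | UV) is constant for any power i.
rng = np.random.default_rng(0)
a, b, n = 2.0, 3.0, 200_000          # illustrative parameters
U = rng.beta(a, b, n)
V = rng.gamma(a + b, 1.0, n)

X, Y = U * V, V * (1 - U)            # should be Gamma(a), Gamma(b)
corr = np.corrcoef(X, Y)[0, 1]
print(round(corr, 3))                # ~ 0, consistent with independence
print(round(Y.mean(), 2))            # ~ b = 3, the Gamma(b) mean
```

The characterization result goes in the converse direction: constancy of the regressions forces the beta/gamma (or, in the free setting, free binomial/free Poisson) laws.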


2021 ◽  
pp. 31-92
Author(s):  
Jochen Rau

This chapter explains the approach of ‘operationalism’, which in a physical theory admits only concepts associated with concrete experimental procedures, and lays out its consequences for propositions about measurements, their logical structure, and states. It illustrates these with toy examples where the ability to perform measurements is limited by design. For systems composed of several constituents this chapter introduces the notions of composite and reduced states, statistical independence, and correlations. It examines what it means for multiple systems to be prepared identically, and how this is represented mathematically. The operational requirement that there must be procedures to measure and prepare a state is examined, and the ensuing constraints derived. It is argued that these constraints leave only one alternative to classical probability theory that is consistent, universal, and fully operational, namely, quantum theory.


2013 ◽  
Vol 55 (1) ◽  
pp. 85-94
Author(s):  
Jana Havlíčková

Abstract In classical probability, as well as in fuzzy probability theory, random events and probability measures are modelled by functions into the closed unit interval [0,1]. Using elementary methods of category theory, we present a classification of the extensions of generalized probability measures (probability measures and integrals with respect to probability measures) from a suitable class of generalized random events to a larger class having some additional (algebraic and/or topological) properties. The classification puts into perspective the classical and some recent constructions related to the extension of sequentially continuous functions.


Author(s):  
Miguel Ángel Lozada Aguilar ◽  
Andrei Khrennikov ◽  
Klaudia Oleschko ◽  
María de Jesús Correa

The paper starts with a brief review of the literature about uncertainty in geological, geophysical and petrophysical data. In particular, we present the viewpoints of experts in geophysics on the application of Bayesian inference and subjective probability. Then we present arguments that the use of classical probability theory (CP) does not completely match the structure of geophysical data. We emphasize that such data are characterized by contextuality and non-Kolmogorovness (the impossibility to use the CP model), incompleteness, as well as incompatibility of some geophysical measurements. These characteristics of geophysical data are similar to the characteristics of quantum physical data. Notably, contextuality can be seen as a major deviation of quantum theory from classical physics. In particular, the contextual probability viewpoint is the essence of the Växjö interpretation of quantum mechanics. We propose to use quantum probability (QP) for decision-making during the characterization, modelling, exploring and management of the intelligent hydrocarbon reservoir. Quantum Bayesianism (QBism), one of the recently developed information interpretations of quantum theory, can be used as the interpretational basis for such QP decision-making in geology, geophysics and petroleum project design and management. This article is part of the themed issue ‘Second quantum revolution: foundational questions’.


2013 ◽  
Vol 11 (01) ◽  
pp. 1350013 ◽  
Author(s):  
JACEK JURKOWSKI

Due to some ambiguity in the definition of mutual Tsallis entropy in classical probability theory, its generalization to quantum theory is discussed and, as a consequence, two types of generalized quantum discords, called q-discords, are defined in terms of quantum Tsallis entropy. Both q-discords for two-qubit Werner and isotropic states are determined and compared, and it is shown that one of them is non-negative, at least for the states under investigation, for all q > 0. Finally, an analytical expression for the q-discord of a certain family of two-qubit X-states is presented. Using this example, we show that both types of q-discords can take negative values for some q > 1; hence their use as correlation measures is rather limited.
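The building block of both q-discords is the quantum Tsallis entropy S_q(ρ) = (1 − Tr ρ^q)/(q − 1), which reduces to the von Neumann entropy as q → 1. A minimal sketch evaluating it on a two-qubit Werner state ρ = p|ψ⁻⟩⟨ψ⁻| + (1 − p)𝕀/4, whose eigenvalues are (1 + 3p)/4 (once) and (1 − p)/4 (three times); the value of p is illustrative, not taken from the paper:

```python
import numpy as np

# Quantum Tsallis entropy S_q(rho) = (1 - Tr rho^q)/(q - 1), evaluated
# from the eigenvalues of rho; the q -> 1 limit is the von Neumann
# entropy -Tr rho ln rho.
def tsallis_entropy(eigvals, q):
    ev = np.asarray([x for x in eigvals if x > 0])
    if abs(q - 1.0) < 1e-12:                  # q -> 1: von Neumann limit
        return float(-np.sum(ev * np.log(ev)))
    return float((1.0 - np.sum(ev ** q)) / (q - 1.0))

def werner_eigvals(p):
    # rho = p |psi-><psi-| + (1-p) I/4 for two qubits
    return [(1 + 3 * p) / 4] + 3 * [(1 - p) / 4]

p = 0.5                                        # illustrative mixing parameter
print(round(tsallis_entropy(werner_eigvals(p), 2.0), 4))   # 0.5625
print(round(tsallis_entropy(werner_eigvals(p), 1.0), 4))   # 1.0736 (von Neumann)
```

The q-discords of the paper then combine such entropies of the total state, the reduced states, and post-measurement states, minimized over local measurements; that optimization is not reproduced here.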


2016 ◽  
Vol 2016 ◽  
pp. 1-7 ◽  
Author(s):  
Karl Hess ◽  
Hans De Raedt ◽  
Kristel Michielsen

In 1862, George Boole derived an inequality for variables that represents a demarcation line between possible and impossible experience. This inequality forms an important milestone in the epistemology of probability theory and probability measures. In 1985, Leggett and Garg derived a physics-related inequality, mathematically identical to Boole’s, that according to them represents a demarcation between macroscopic realism and quantum mechanics. We show that a wide gulf separates the “sense impressions” and corresponding data, as well as the postulates of macroscopic realism, from the mathematical abstractions that are used to derive the inequality of Leggett and Garg. If the gulf can be bridged, one may indeed derive the said inequality, which is then clearly a demarcation between possible and impossible experience: it cannot be violated and is not violated by quantum theory. This implies that the Leggett-Garg inequality does not mean that the SQUID flux is not there when nobody looks, as Leggett and Garg suggest, but instead that the probability measures may not be what Leggett and Garg have assumed them to be, when no data can be secured that directly relate to them. We show that similar considerations apply to other quantum interpretation puzzles such as two-slit experiments.
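The Boole/Leggett-Garg inequality referred to above can be made concrete for three dichotomic ±1 variables: K = ⟨Q₁Q₂⟩ + ⟨Q₂Q₃⟩ − ⟨Q₁Q₃⟩ ≤ 1 holds for every joint distribution on a single probability space, because it holds pointwise for each of the eight assignments. A minimal sketch, with an illustrative precession-angle choice for the quantum correlations (not taken from the paper):

```python
import itertools, math

# Boole's point: for three +/-1 variables Q1, Q2, Q3 defined on ONE
# probability space, K = <Q1Q2> + <Q2Q3> - <Q1Q3> <= 1 for every joint
# distribution, since the bound holds pointwise:
for q1, q2, q3 in itertools.product([-1, 1], repeat=3):
    assert q1 * q2 + q2 * q3 - q1 * q3 <= 1

# Two-time quantum correlations C(t) = cos(omega * t) for a precessing
# spin measured at equal time steps can exceed the bound, signalling
# that no single joint distribution reproduces all three pair
# correlations simultaneously:
omega_t = math.pi / 3                     # illustrative choice
K = 2 * math.cos(omega_t) - math.cos(2 * omega_t)
print(round(K, 3))                        # 1.5 > 1
```

On the reading advocated in the abstract, the "violation" indicates that the three pair correlations do not belong to one common probability space, not that macroscopic realism fails.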


2019 ◽  
Vol 39 (2) ◽  
pp. 237-258 ◽  
Author(s):  
Włodzimierz Bryc ◽  
Raouf Fakhfakh ◽  
Wojciech Młotkowski

This paper studies variance functions of Cauchy–Stieltjes Kernel (CSK) families generated by compactly supported centered probability measures. We describe several operations that allow us to construct additional variance functions from known ones. We construct a class of examples which exhausts all cubic variance functions, and provide examples of polynomial variance functions of arbitrary degree. We also relate CSK families with polynomial variance functions to generalized orthogonality. Our main results are stated solely in terms of classical probability; some proofs rely on analytic machinery of free probability.


Author(s):  
Markus Müller

These lecture notes provide a basic introduction to the framework of generalized probabilistic theories (GPTs) and a sketch of a reconstruction of quantum theory (QT) from simple operational principles. To build some intuition for how physics could be even more general than quantum, I present two conceivable phenomena beyond QT: superstrong nonlocality and higher-order interference. Then I introduce the framework of GPTs, generalizing both quantum and classical probability theory. Finally, I summarize a reconstruction of QT from the principles of Tomographic Locality, Continuous Reversibility, and the Subspace Axiom. In particular, I show why a quantum bit is described by a Bloch ball, why it is three-dimensional, and how one obtains the complex numbers and operators of the usual representation of QT.
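The Bloch-ball description of a qubit mentioned at the end of the abstract is easy to make concrete: every qubit state is ρ = (𝕀 + r·σ)/2 for a Bloch vector r with |r| ≤ 1, pure exactly when |r| = 1, since Tr ρ² = (1 + |r|²)/2. A minimal sketch (assuming NumPy; the Bloch vector is illustrative):

```python
import numpy as np

# Bloch-ball parametrization of a qubit: rho = (I + r . sigma)/2,
# a valid state iff |r| <= 1, pure iff |r| = 1 (Tr rho^2 = (1+|r|^2)/2).
sigma = [np.array([[0, 1], [1, 0]], dtype=complex),      # sigma_x
         np.array([[0, -1j], [1j, 0]]),                  # sigma_y
         np.array([[1, 0], [0, -1]], dtype=complex)]     # sigma_z

def bloch_state(r):
    return 0.5 * (np.eye(2, dtype=complex)
                  + sum(ri * si for ri, si in zip(r, sigma)))

r = (0.3, 0.4, 0.5)                       # illustrative, |r|^2 = 0.5
rho = bloch_state(r)
purity = np.trace(rho @ rho).real
print(round(purity, 3))                   # (1 + 0.5) / 2 = 0.75
assert np.all(np.linalg.eigvalsh(rho) >= -1e-12)   # positive semidefinite
```

The reconstruction sketched in the notes explains why this state space is a three-dimensional ball rather than, say, a higher-dimensional convex body.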


2021 ◽  
Vol 3 (1) ◽  
pp. 242-252
Author(s):  
Emmanuel M. Pothos ◽  
Oliver J. Waddup ◽  
Prince Kouassi ◽  
James M. Yearsley

There has been a growing trend to develop cognitive models based on the mathematics of quantum theory. A common theme in the motivation of such models has been findings which apparently challenge the applicability of classical formalisms, specifically ones based on classical probability theory. Classical probability theory has had a singularly important place in cognitive theory, because of its (in general) descriptive success but, more importantly, because in decision situations with low, equivalent stakes it offers a multiply justified normative standard. Quantum cognitive models have had a degree of descriptive success and proponents of such models have argued that they reveal new intuitions or insights regarding decisions in uncertain situations. However, can quantum cognitive models further benefit from normative justifications analogous to those for classical probability models? If the answer is yes, how can we determine the rational status of a decision, which may be consistent with quantum theory, but inconsistent with classical probability theory? In this paper, we review the proposal from Pothos, Busemeyer, Shiffrin, and Yearsley (2017), that quantum decision models benefit from normative justification based on the Dutch Book Theorem, in exactly the same way as models based on classical probability theory.
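The Dutch Book Theorem invoked above says that degrees of belief violating the probability axioms expose the agent to a set of individually acceptable bets with a guaranteed net loss. A minimal sketch of the classic case (the stake and the incoherent beliefs are illustrative, not from the paper under review):

```python
# Dutch Book sketch: an agent prices a $1 bet on event A at p_A and a
# $1 bet on not-A at p_notA. If p_A + p_notA > 1 (incoherent beliefs),
# selling the agent both bets guarantees the bookie a profit, because
# exactly one bet pays out whatever happens.
p_A, p_notA = 0.6, 0.6          # incoherent: 0.6 + 0.6 = 1.2 > 1
stake = 1.0

cost = (p_A + p_notA) * stake   # what the agent pays for both bets
for a_occurs in (True, False):
    payoff = stake              # exactly one of the two bets pays out
    print(round(payoff - cost, 2))   # -0.2 either way: a sure loss
```

The proposal reviewed in the abstract is that an analogous coherence argument can be run for quantum cognitive models, giving them a normative status parallel to that of classical probability.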

