Classical probability theory
Recently Published Documents

TOTAL DOCUMENTS: 44 (FIVE YEARS: 12)
H-INDEX: 8 (FIVE YEARS: 0)

2021 ◽  
pp. 1-27
Author(s):  
Kurusch Ebrahimi-Fard ◽  
Frédéric Patras ◽  
Nikolas Tapia ◽  
Lorenzo Zambotti

Abstract: Wick polynomials and Wick products are studied in the context of noncommutative probability theory. It is shown that free, Boolean, and conditionally free Wick polynomials can be defined and related through the action of the group of characters over a particular Hopf algebra. These results generalize our previous developments of a Hopf-algebraic approach to cumulants and Wick products in classical probability theory.
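
As a point of reference for the classical case this abstract generalizes from, one standard way to define the Wick polynomials of a single real random variable $X$ (with finite moments) is through an exponential generating function; this is background, not the paper's Hopf-algebraic construction:

$$\sum_{n \ge 0} \frac{t^n}{n!}\, W_n(x) \;=\; \frac{e^{tx}}{\mathbb{E}\!\left[e^{tX}\right]},$$

so that $\mathbb{E}[W_n(X)] = 0$ for all $n \ge 1$; for a standard Gaussian $X$ the $W_n$ are the Hermite polynomials $\mathrm{He}_n$.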


Author(s):  
Andrei Khrennikov

We start with a discussion of Feynman's misapplication of classical probability theory in his analysis of the two-slit experiment (following the critical argumentation of Koopman, Ballentine, and the author of this paper). The seed of Feynman's conclusion that a classical probabilistic description of the two-slit experiment is impossible is his treatment of conditional probabilities corresponding to different experimental contexts as unconditional ones. We then move to the Bell-type inequalities. Bell applied classical probability theory in the same manner as Feynman and, as might be expected, also arrived at an impossibility statement. In contrast to Feynman, he formulated his no-go statement not in probabilistic terms but by appealing to nonlocality. This note can be considered part of the author's efforts to rid quantum physics of nonlocality.
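
For readers unfamiliar with the argument, the classical law of total probability for the two slits reads $p(b) = p(1)\,p(b\mid 1) + p(2)\,p(b\mid 2)$; what the interference pattern actually exhibits is a perturbed version of this law (a standard sketch, of the form Khrennikov has used in earlier work, not a formula specific to this note):

$$p(b) \;=\; p(1)\,p(b\mid 1) + p(2)\,p(b\mid 2) + 2\cos\theta\,\sqrt{p(1)\,p(b\mid 1)\,p(2)\,p(b\mid 2)},$$

where each conditional probability $p(b\mid i)$ is measured in a different context (slit $i$ open alone), so the additive law has no reason to hold in the both-slits context.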


2021 ◽  
pp. 31-92
Author(s):  
Jochen Rau

This chapter explains the approach of ‘operationalism’, which admits into a physical theory only concepts associated with concrete experimental procedures, and lays out its consequences for propositions about measurements, their logical structure, and states. It illustrates these with toy examples in which the ability to perform measurements is limited by design. For systems composed of several constituents, this chapter introduces the notions of composite and reduced states, statistical independence, and correlations. It examines what it means for multiple systems to be prepared identically, and how this is represented mathematically. The operational requirement that there must be procedures to measure and prepare a state is examined, and the ensuing constraints derived. It is argued that these constraints leave only one alternative to classical probability theory that is consistent, universal, and fully operational, namely, quantum theory.
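
In the classical special case, the composite and reduced states and statistical independence mentioned above take a familiar form (a minimal illustration, not the chapter's general operational definitions):

$$p_A(a) \;=\; \sum_b p_{AB}(a,b), \qquad \text{independence:}\quad p_{AB}(a,b) \;=\; p_A(a)\,p_B(b)\ \text{for all } a, b,$$

with correlations quantified by the failure of the product form. The quantum analogue replaces marginalization by the partial trace, $\rho_A = \operatorname{Tr}_B \rho_{AB}$.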


Author(s):  
Markus Müller

These lecture notes provide a basic introduction to the framework of generalized probabilistic theories (GPTs) and a sketch of a reconstruction of quantum theory (QT) from simple operational principles. To build some intuition for how physics could be even more general than quantum, I present two conceivable phenomena beyond QT: superstrong nonlocality and higher-order interference. Then I introduce the framework of GPTs, generalizing both quantum and classical probability theory. Finally, I summarize a reconstruction of QT from the principles of Tomographic Locality, Continuous Reversibility, and the Subspace Axiom. In particular, I show why a quantum bit is described by a Bloch ball, why it is three-dimensional, and how one obtains the complex numbers and operators of the usual representation of QT.
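
Two of the notions named above can be stated compactly: the Bloch-ball picture of a qubit state, and Sorkin's third-order interference term $I_3$, which vanishes in quantum theory but need not in a more general GPT (standard formulas, included here for orientation):

$$\rho \;=\; \tfrac{1}{2}\left(\mathbb{1}_2 + \vec{r}\cdot\vec{\sigma}\right), \quad |\vec{r}| \le 1; \qquad I_3 \;=\; p_{123} - p_{12} - p_{13} - p_{23} + p_1 + p_2 + p_3,$$

where $p_S$ is the detection probability with exactly the slits in $S$ open; higher-order interference means $I_3 \ne 0$.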


2021 ◽  
Vol 3 (1) ◽  
pp. 242-252
Author(s):  
Emmanuel M. Pothos ◽  
Oliver J. Waddup ◽  
Prince Kouassi ◽  
James M. Yearsley

There has been a growing trend to develop cognitive models based on the mathematics of quantum theory. A common theme in the motivation of such models has been findings which apparently challenge the applicability of classical formalisms, specifically ones based on classical probability theory. Classical probability theory has had a singularly important place in cognitive theory because of its (in general) descriptive success but, more importantly, because in decision situations with low, equivalent stakes it offers a multiply justified normative standard. Quantum cognitive models have had a degree of descriptive success, and proponents of such models have argued that they reveal new intuitions or insights regarding decisions in uncertain situations. However, can quantum cognitive models further benefit from normative justifications analogous to those for classical probability models? If the answer is yes, how can we determine the rational status of a decision which may be consistent with quantum theory but inconsistent with classical probability theory? In this paper, we review the proposal of Pothos, Busemeyer, Shiffrin, and Yearsley (2017) that quantum decision models benefit from a normative justification based on the Dutch Book theorem, in exactly the same way as models based on classical probability theory.
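
To make the contrast concrete, here is a minimal sketch (with illustrative numbers, not taken from the paper) of the kind of quantum decision model at issue: answers are projectors, answering a question collapses the state, and the two answer orders yield different probabilities, something no single classical joint distribution reproduces:

```python
import numpy as np

# Hypothetical two-question model: "yes" to questions A and B corresponds to
# projectors onto rays at an angle theta in a real 2-d belief space.
theta = np.pi / 5                      # illustrative angle, not from the paper
psi = np.array([1.0, 0.0])             # initial belief state (unit vector)

P_A = np.outer([1.0, 0.0], [1.0, 0.0])        # projector for "yes to A"
b = np.array([np.cos(theta), np.sin(theta)])
P_B = np.outer(b, b)                          # projector for "yes to B"

def seq_prob(P1, P2, state):
    """Probability of answering 'yes' to question 1, then 'yes' to question 2."""
    collapsed = P1 @ state                    # unnormalized post-answer state
    return np.linalg.norm(P2 @ collapsed) ** 2

print(seq_prob(P_A, P_B, psi))   # P(yes to A, then yes to B) ~ 0.65
print(seq_prob(P_B, P_A, psi))   # P(yes to B, then yes to A) ~ 0.43: order effect
```

Classically, the probability of a conjunction cannot depend on the order in which its conjuncts are resolved; the Dutch Book argument reviewed in the paper addresses when such order-dependent models nonetheless count as rational.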


2020 ◽  
Vol 39 (3) ◽  
pp. 2647-2655
Author(s):  
Naidan Feng ◽  
Yongquan Liang

To handle imprecise and uncertain data and knowledge, this paper proposes a novel prior assumption based on rough set theory, improving the performance of the classical Bayesian classifier. We apply the approximation operations to represent imprecise knowledge accurately, and the concept of approximation quality is applied in this setting for the first time. The paper thus provides a novel rough-set-based prior probability for the classical Bayesian classifier and the corresponding rough set prior Bayesian classifier. We chose 18 public datasets to evaluate the performance of the proposed model against the classical Bayesian classifier and a Bayesian classifier with a Dirichlet prior assumption, and the experimental results verify the effectiveness of the proposed method. The main contributions are: first, a novel methodology that combines rough set theory with classical probability theory; second, improved accuracy of the prior assumption; third, an appropriate prior probability for the classical Bayesian classifier that improves its performance purely through a more accurate prior, without any effect on the likelihood; fourth, a novel and effective way to deal with imprecise and uncertain data; and finally, a methodology that can be extended to other concepts of classical probability theory.
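
The rough-set ingredients the abstract relies on can be sketched briefly; the following toy code (hypothetical data, and not the paper's actual prior construction) computes the lower and upper approximations of a class and the approximation quality that the proposed prior builds on:

```python
from collections import defaultdict

# Toy universe: each object has discrete attribute values and a class label.
objects = [
    ({"colour": "red",  "size": "big"},   "pos"),
    ({"colour": "red",  "size": "big"},   "neg"),   # indiscernible from the first
    ({"colour": "blue", "size": "small"}, "pos"),
    ({"colour": "blue", "size": "big"},   "neg"),
]

def equivalence_classes(objs, attrs):
    """Blocks of the indiscernibility relation induced by the chosen attributes."""
    blocks = defaultdict(list)
    for i, (features, _) in enumerate(objs):
        blocks[tuple(features[a] for a in attrs)].append(i)
    return blocks.values()

def approximations(objs, attrs, label):
    """Rough-set lower and upper approximations of the set of 'label' objects."""
    target = {i for i, (_, y) in enumerate(objs) if y == label}
    lower, upper = set(), set()
    for block in equivalence_classes(objs, attrs):
        block = set(block)
        if block <= target:
            lower |= block          # certainly in the class
        if block & target:
            upper |= block          # possibly in the class
    return lower, upper

lower, upper = approximations(objects, ["colour", "size"], "pos")
gamma = len(lower) / len(objects)   # approximation quality in [0, 1]
print(lower, upper, gamma)          # {2} {0, 1, 2} 0.25
```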


2020 ◽  
Author(s):  
William Icefield

When quantum mechanics is understood as a new, generalized theory of probability (to be called quantum probability theory), mysteries and controversies regarding quantum mechanics are dissolved. In classical probability theory, the fact that measuring a system requires an additional measurement apparatus is of little importance; in quantum probability theory, this changes. The generalized-probability view has not gained much traction for one central reason, concerning a particular classical probability equation, despite the fact that it essentially echoes (and provides logical underpinnings for) the conventional wisdom that 'quantum mechanics just works as it is.' A classical probability axiom is just an initial intuition; there is no reason to cling dogmatically to axioms that can clearly be generalized. Issues with the principle of indifference in classical probability theory are emphasized, along with the quantum reconstruction project of deriving quantum mechanics from epistemic requirements and potential quantum gravity consequences of the principle of maximum entropy.


2020 ◽  
Author(s):  
Andrei Tchougreeff

The field of continuous molecular shape and symmetry (dis)similarity quantifiers, habitually called measures (specifically, continuous shape measures, CShM, or continuous symmetry measures, CSM), is hampered by the combinatorial numerical algorithms used in the field, which restrict its applicability to molecules containing up to twenty equivalent atoms. In the present paper we analyze this problem using various tools of classical probability theory as well as of one-particle and many-particle quantum mechanics. Applying these allows us to lift the combinatorial restriction and to identify the adequate renumbering of atoms (vertices) without considering all N! permutations of an N-vertex set, so that in the end a purely geometric molecular shape (dis)similarity quantifier can be defined. The developed methods can be easily implemented in the relevant computer code.
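
For context, a continuous shape measure in its usual form quantifies the normalized least-squares mismatch between a structure $Q$ and a reference shape $P$ (this standard definition is background; the paper's contribution is avoiding the minimization over all $N!$ renumberings $\sigma$):

$$S(Q,P) \;=\; 100 \times \min_{\sigma,\,T}\; \frac{\sum_{k=1}^{N} \bigl|\vec{q}_k - T\,\vec{p}_{\sigma(k)}\bigr|^2}{\sum_{k=1}^{N} \bigl|\vec{q}_k - \vec{q}_0\bigr|^2},$$

where $T$ ranges over translations, rotations, and rescalings of the reference and $\vec{q}_0$ is the centroid of $Q$.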


2019 ◽  
Vol 52 (2) ◽  
pp. 157-186
Author(s):  
Adam Burchardt

Abstract: Cumulants are a notion that comes from classical probability theory; they are an alternative to the notion of moments. We adapt the probabilistic concept of cumulants to the setting of a linear space equipped with two multiplication structures. We present an algebraic formula which involves those two multiplications as a sum of products of cumulants. In our approach, besides cumulants, we make use of standard combinatorial tools such as forests and their colourings. We also show that the resulting statement can be understood as an analogue of the Leonov–Shiryaev formula. This purely combinatorial presentation leads to some conclusions about the structure constants of Jack characters.
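
The classical starting point being generalized here is the moment-cumulant relation over set partitions (the Leonov–Shiryaev formula extends this to mixed cumulants of products):

$$m_n \;=\; \mathbb{E}\!\left[X^n\right] \;=\; \sum_{\pi \in \mathcal{P}(n)} \prod_{B \in \pi} \kappa_{|B|},$$

where $\mathcal{P}(n)$ is the set of partitions of $\{1,\dots,n\}$ and $\kappa_m$ is the $m$-th cumulant; the paper's forests and their colourings play the role that partitions play here.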

