Foundations of Probability
Recently Published Documents

Entropy ◽  
2021 ◽  
Vol 23 (8) ◽  
pp. 968
Fritiof Wallentin

It is shown that the hallmark quantum phenomenon of contextuality is present in classical statistical mechanics (CSM). First, the occurrence of contextuality is shown to be equivalent to the existence of observables that can differentiate between pure and mixed states. CSM is then formulated in the formalism of quantum mechanics (FQM), a formulation commonly known as the Koopman–von Neumann formulation (KvN). In KvN one can show that such a differentiation between mixed and pure states is indeed possible. Since contextuality is a probabilistic phenomenon exhibited in both classical physics and ordinary quantum mechanics (OQM), it is concluded that the foundational issues regarding quantum mechanics are really issues regarding the foundations of probability.
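The equivalence above turns on observable quantities that separate pure from mixed states. A minimal numerical sketch (the states and the purity functional Tr ρ² are standard textbook illustrations, not taken from the paper):

```python
import numpy as np

def purity(rho):
    """Return Tr(rho^2): equal to 1 for pure states, < 1 for proper mixtures."""
    return float(np.real(np.trace(rho @ rho)))

# Pure state |0><0| as a density matrix.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])

# Maximally mixed state I/2 (an equal-weight statistical mixture).
mixed = np.eye(2) / 2.0

print(purity(pure))   # 1.0
print(purity(mixed))  # 0.5
```

Purity is one such state-discriminating quantity: any observable whose expectation depends nonlinearly on the state can tell a proper mixture apart from a pure state.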

Synthese ◽  
2021 ◽  
Miklós Rédei ◽  
Zalán Gyenis

Abstract: It is shown that by realizing the isomorphism features of the frequency and geometric interpretations of probability, Reichenbach comes very close to the idea of identifying mathematical probability theory with measure theory in his 1949 work on the foundations of probability. Some general features of Reichenbach’s axiomatization of probability theory are pointed out as likely obstacles that prevented him from making this conceptual move. The role of isomorphisms of Kolmogorovian probability measure spaces is specified in what we call the “Maxim of Probabilism”, which states that a necessary condition for a concept to be probabilistic is its invariance with respect to measure-theoretic isomorphisms. The functioning of the Maxim of Probabilism is illustrated by the example of conditioning via conditional expectations.
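As a reminder of the conditioning device the Maxim is illustrated with, the standard measure-theoretic definition of conditional expectation can be sketched as follows (a standard formulation, given here for context only):

```latex
Let $(\Omega, \mathcal{A}, P)$ be a probability space, $X$ an integrable
random variable, and $\mathcal{F} \subseteq \mathcal{A}$ a sub-$\sigma$-algebra.
The conditional expectation $E[X \mid \mathcal{F}]$ is the $P$-a.s.\ unique
$\mathcal{F}$-measurable random variable $Y$ satisfying
\[
  \int_A Y \, dP \;=\; \int_A X \, dP
  \qquad \text{for all } A \in \mathcal{F}.
\]
```

Being defined purely in terms of the measure structure, this notion is invariant under measure-theoretic isomorphisms, which is exactly the invariance the Maxim of Probabilism demands.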

2021 ◽  
pp. 221-256
Manfred Borovcnik

In this paper, we analyse the various meanings of probability and its different applications, focusing especially on the classical, the frequentist, and the subjectivist view. We describe the problems of how probability can be measured in each of the approaches, and how each of them can be justified by a mathematical theory. We analyse the foundations of probability, where the scientific analysis of the theory that allows for a frequentist interpretation leads to unsolvable problems. Kolmogorov’s axiomatic theory does not suffice to establish statistical inference without further definitions and principles. Finally, we show how statistical inference essentially determines the meaning of probability, and how a shift emerges from purely objectivist views to a complementary conception of probability with frequentist and subjectivist constituents. For didactical purposes, the present analyses explain basic problems of teaching that originate from a biased focus on frequentist aspects of probability, and indicate a high priority for the design of suitable learning paths towards a complementary conception of probability. In applications, modellers use information pragmatically, processing it into formal mathematical models regardless of its connotation; such models are always understood to be essentially wrong but useful.
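The complementary frequentist/subjectivist reading can be made concrete on a coin-flipping toy model (a minimal sketch; the bias, sample size, and uniform Beta(1, 1) prior are illustrative assumptions, not taken from the paper):

```python
import random

random.seed(0)
true_p = 0.3
flips = [random.random() < true_p for _ in range(1000)]
heads = sum(flips)
tails = len(flips) - heads

# Frequentist reading: probability as long-run relative frequency,
# estimated by the observed proportion of heads.
freq_estimate = heads / len(flips)

# Subjectivist reading: probability as degree of belief, here the mean of
# the Beta(1 + heads, 1 + tails) posterior obtained from a uniform prior.
alpha, beta = 1 + heads, 1 + tails
bayes_estimate = alpha / (alpha + beta)

print(round(freq_estimate, 3), round(bayes_estimate, 3))
```

With enough data the two readings give numerically close answers, which is one face of the "complementary conception" the paper argues for: the formal machinery is shared, while the interpretation of the resulting number differs.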

2020 ◽  
Gane Samb LO ◽  
Aladji Babacar Niang ◽  
Lois Chinwendu Okereke

This book introduces the theory of probability from the beginning. Assuming that the reader possesses the normal mathematical level acquired at the end of secondary school, we aim to equip them with a solid basis in probability theory. The theory is preceded by a general chapter on counting methods. Probability theory is then presented in a discrete framework. Two objectives are pursued. The first is to give the reader the ability to solve a large number of problems related to probability theory, including application problems in a variety of disciplines. The second is to prepare the reader for a course on the mathematical foundations of probability theory. In that later book, the reader will concentrate more on mathematical concepts, whereas the present text mostly works with experimental frameworks. If both objectives are met, the reader will have acquired definitive experience in problem solving with the tools of probability theory, and will be ready to move on to a theoretical course on probability based on the theory of measure and integration. The book ends with a chapter that allows the reader to begin an intermediate course in mathematical statistics.
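As a flavour of such a counting-methods chapter, a short sketch (combinations, permutations, and the birthday problem are standard first exercises, assumed here for illustration):

```python
from math import comb, factorial

def permutations(n, k):
    """Number of ordered arrangements of k items chosen from n."""
    return factorial(n) // factorial(n - k)

# Unordered selections (combinations) vs ordered ones (permutations).
assert comb(5, 2) == 10
assert permutations(5, 2) == 20

def birthday_collision(n, days=365):
    """Probability that among n people at least two share a birthday."""
    p_distinct = 1.0
    for i in range(n):
        p_distinct *= (days - i) / days
    return 1.0 - p_distinct

print(round(birthday_collision(23), 4))  # ~ 0.5073
```

The birthday problem is a typical bridge from pure counting to discrete probability: the answer is obtained entirely by enumerating equally likely outcomes.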

2020 ◽  
Vol 2 ◽  
pp. 3
Tobias Fritz ◽  
Eigil Fjeldgren Rischel

Markov categories are a recent category-theoretic approach to the foundations of probability and statistics. Here we develop this approach further by treating infinite products and the Kolmogorov extension theorem. This is relevant for all aspects of probability theory in which infinitely many random variables appear at a time. These infinite tensor products ⨂_{i∈J} X_i come in two versions: a weaker but more general one for families of objects (X_i)_{i∈J} in semicartesian symmetric monoidal categories, and a stronger but more specific one for families of objects in Markov categories. As a first application, we state and prove versions of the zero-one laws of Kolmogorov and Hewitt-Savage for Markov categories. This gives general versions of these results which can be instantiated not only in measure-theoretic probability, where they specialize to the standard ones in the setting of standard Borel spaces, but also in other contexts.
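For reference, the measure-theoretic statement that the categorical Kolmogorov zero-one law specializes to (standard formulation, sketched here for context):

```latex
Let $(X_i)_{i \in \mathbb{N}}$ be independent random variables on a
probability space $(\Omega, \mathcal{A}, P)$, and let
\[
  \mathcal{T} \;=\; \bigcap_{n \ge 1} \sigma\bigl(X_n, X_{n+1}, \dots\bigr)
\]
be the tail $\sigma$-algebra. Then every tail event is trivial:
\[
  A \in \mathcal{T} \;\implies\; P(A) \in \{0, 1\}.
\]
```

Events such as "the series $\sum_i X_i$ converges" are tail events, so under independence they hold either almost surely or almost never.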

2019 ◽  
Vol 13 (3) ◽  
pp. 593-610

Abstract: This article builds on a recent paper coauthored by the present author, H. Hosni and F. Montagna. It is meant to contribute to the logical foundations of probability theory on many-valued events and, specifically, to a deeper understanding of the notion of strict coherence. In particular, we will make use of geometrical, measure-theoretical and logical methods to provide three characterizations of strict coherence on formulas of infinite-valued Łukasiewicz logic.
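For context, the truth functions of infinite-valued Łukasiewicz logic over [0, 1], on whose formulas the strict-coherence characterizations are stated, can be sketched directly (standard definitions; the sample values are illustrative):

```python
def neg(x):
    """Łukasiewicz negation."""
    return 1.0 - x

def strong_conj(x, y):
    """Strong conjunction: the Łukasiewicz t-norm."""
    return max(0.0, x + y - 1.0)

def implies(x, y):
    """Łukasiewicz implication, the residuum of the t-norm."""
    return min(1.0, 1.0 - x + y)

print(neg(0.25))             # 0.75
print(strong_conj(0.7, 0.4))
print(implies(0.7, 0.4))
```

Unlike classical logic, truth values here range continuously over [0, 1], which is what allows "events" to be many-valued and makes coherence a question about betting odds on such formulas.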

Max A. Little

Statistical machine learning and statistical DSP are built on the foundations of probability theory and random variables. Different techniques encode different dependency structures between these variables, and this structure leads to specific algorithms for inference and estimation. Many common dependency structures emerge naturally in this way; as a result, there are many common patterns of inference and estimation that suggest general algorithms. Formalizing these algorithms is the purpose of this chapter. These general algorithms can often lead to substantial computational savings over more brute-force approaches, another benefit of studying the structure of these models in the abstract.
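The computational savings from exploiting dependency structure can be sketched on a toy Markov chain (the model, sizes, and random parameters below are illustrative assumptions, not from the chapter):

```python
import itertools
import numpy as np

# Chain X1 -> X2 -> ... -> Xn with k states per variable: the joint
# factorizes as P(x1) * prod_t P(x_{t+1} | x_t).
n, k = 8, 3
rng = np.random.default_rng(0)
init = rng.dirichlet(np.ones(k))            # P(X1)
trans = rng.dirichlet(np.ones(k), size=k)   # P(X_{t+1} | X_t); rows sum to 1

def marginal_last_brute():
    """Marginal of X_n by summing the full joint: O(k^n) terms."""
    p = np.zeros(k)
    for xs in itertools.product(range(k), repeat=n):
        prob = init[xs[0]]
        for a, b in zip(xs, xs[1:]):
            prob *= trans[a, b]
        p[xs[-1]] += prob
    return p

def marginal_last_elim():
    """Same marginal by pushing sums inward: n-1 vector-matrix products."""
    p = init
    for _ in range(n - 1):
        p = p @ trans
    return p

assert np.allclose(marginal_last_brute(), marginal_last_elim())
```

Both routines compute the same marginal, but the structured version (a variable-elimination pattern) costs O(n k²) instead of O(k^n), which is exactly the kind of general algorithm the chapter sets out to formalize.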

Philosophy ◽  
2019 ◽  
Darrell P. Rowbottom

Sir Karl Popper was one of the most prolific philosophers of the 20th century, and remains one of the most influential. His most significant academic contributions were in the philosophy of science, and he collaborated with several scientists. His social and political philosophy was largely responsible for his extra-academic fame, and has had more impact. Overall, his work was exceptionally broad. He also made contributions to ancient philosophy, epistemology, metaphysics, philosophy of biology, philosophy of education, philosophy of history, philosophy of mind (and psychology), philosophy of physics, and the foundations of probability theory (both mathematical and philosophical).
