The Paradigm of Complex Probability and Thomas Bayes’ Theorem

2021 ◽ Author(s): Abdo Abou Jaoude

The mathematical probability concept was set forth by Andrey Nikolaevich Kolmogorov in 1933 through a system of five axioms. This scheme can be extended to embody the set of imaginary numbers by adding three new axioms. Accordingly, any stochastic phenomenon can be analyzed in the set C of complex probabilities, which is the sum of the set R of real probabilities and the set M of imaginary probabilities. Our objective is to add complementary imaginary dimensions to the stochastic phenomenon taking place in the “real” laboratory in R and, as a consequence, to evaluate all the corresponding probabilities in the sets R, M, and C. The probability in the entire set C = R + M is then always equal to one, independently of the distribution of the input stochastic variable in R, so the output of the random phenomenon in R can be evaluated completely in C. This is because the probability in C is calculated after subtracting the chaotic factor from the degree of our knowledge of the nondeterministic phenomenon. We will apply this novel paradigm to the classical Bayes’ theorem of probability theory.
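The invariance stated above, that the probability in C stays equal to one once the chaotic factor is subtracted from the degree of our knowledge, can be checked with a short numeric sketch. The defining relations used below (Pm = i(1 − Pr), DOK = Pr² + (1 − Pr)², Chf = −2Pr(1 − Pr)) follow the complex probability paradigm's usual notation and should be read as assumptions of this illustration:

```python
# Minimal numeric sketch of the complex-probability bookkeeping described
# above. The defining relations are assumptions of this illustration,
# taken from the paradigm's usual notation, not results derived here.

def cpp_decomposition(pr: float) -> dict:
    """Given a real probability Pr in [0, 1], return the CPP quantities."""
    pm_over_i = 1.0 - pr           # imaginary complement: Pm = i * (1 - Pr)
    dok = pr**2 + pm_over_i**2     # degree of our knowledge (DOK)
    chf = -2.0 * pr * pm_over_i    # chaotic factor, Chf = 2i * Pr * Pm <= 0
    pc2 = dok - chf                # probability in C, squared
    return {"DOK": dok, "Chf": chf, "Pc^2": pc2}

# Pc^2 = DOK - Chf telescopes to (Pr + (1 - Pr))^2 = 1 for every Pr.
for pr in (0.0, 0.25, 0.5, 0.9, 1.0):
    assert abs(cpp_decomposition(pr)["Pc^2"] - 1.0) < 1e-12
```

For every Pr in [0, 1] the difference DOK − Chf collapses to (Pr + (1 − Pr))² = 1, which is exactly the invariance the abstract describes.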


2018 ◽ Vol 7 (2) ◽ pp. 50 ◽ Author(s): Pierpaolo Angelini, Angela De Sanctis

The notion of exchangeability for random events is investigated using a geometric scheme for representing the possible alternatives. When we distribute our sensations of probability among them, this scheme brings out the multilinear essence of exchangeability. Since there is a natural one-to-one correspondence between multilinear maps and linear maps, we are able to argue that linearity is the most meaningful mathematical concept of probability theory. The exchangeability hypothesis is maintained for mixtures of Bernoulli processes in the same way. Because this kind of work is, to our knowledge, without precedent, our references are necessarily limited to those pioneering works which do not keep the real and deep meaning of the probability concept a secret, unlike the current ones.
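The exchangeability of Bernoulli mixtures mentioned above admits a compact numerical illustration: under a mixture of Bernoulli processes, the joint probability of a 0/1 sequence depends only on its number of successes, hence is invariant under permutations. The two-point mixing distribution below is an arbitrary choice made for this sketch:

```python
import math
from itertools import permutations

def seq_prob(seq, mixture):
    """P(seq) under a Bernoulli mixture: sum_j w_j * prod_t p_j^x_t (1-p_j)^(1-x_t)."""
    return sum(w * math.prod(p if x else (1 - p) for x in seq)
               for p, w in mixture)

# Mix Bernoulli(0.2) and Bernoulli(0.7) with equal weights (illustrative choice).
mixture = [(0.2, 0.5), (0.7, 0.5)]
seq = (1, 0, 0, 1)
base = seq_prob(seq, mixture)

# Every permutation of the sequence has the same probability: exchangeability.
assert all(abs(seq_prob(p, mixture) - base) < 1e-15 for p in permutations(seq))
```

The invariance is not built into `seq_prob` (it multiplies position by position); it follows from the product commuting inside each mixture component, which is the multilinear structure the abstract points at.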



1970 ◽ Vol 133 (1) ◽ pp. 98 ◽ Author(s): J. F. C. Kingman, Martin Eisen




Radiocarbon ◽ 2001 ◽ Vol 43 (2A) ◽ pp. 373-380 ◽ Author(s): Peter Steier, Werner Rom, Stephan Puchegger

The probabilistic radiocarbon calibration approach, which has largely replaced the intercept method in 14C dating, is based on Bayes’ theorem (Bayes 1763). Besides single-sample calibration, Bayesian mathematics also supplies tools for combining the 14C results of many samples with independent archaeological information such as typology or stratigraphy (Buck et al. 1996). However, specific assumptions in the “prior probabilities”, used to transform the archaeological information into mathematical probability distributions, may bias the results (Steier and Rom 2000). A general technique for guarding against such a bias is “sensitivity analysis”, in which a range of possible prior probabilities is tested. Only results that prove robust in this analysis should be used. We demonstrate the impact of this method for an assumed yet realistic case of stratigraphically ordered samples from the Hallstatt period, i.e. the Early Iron Age in Central Europe.
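The sensitivity analysis just described can be sketched with a toy grid posterior (this is not the actual calibration computation, which uses a calibration curve and stratigraphic priors): compute the posterior for a calendar date under two different priors and check how far the estimate moves. The Gaussian likelihood and both priors below are invented for the sketch:

```python
import numpy as np

# Candidate calendar dates (years BC, negative), on a uniform grid.
dates = np.linspace(-850.0, -650.0, 2001)

# Invented measurement likelihood: date ~ 750 BC with a 25-year uncertainty.
likelihood = np.exp(-0.5 * ((dates + 750.0) / 25.0) ** 2)

def posterior_mean(prior):
    """Posterior mean of the date under a given prior, on the uniform grid."""
    post = likelihood * prior
    post = post / post.sum()          # normalize
    return float((dates * post).sum())

flat = np.ones_like(dates)                    # uninformative prior
ramp = np.linspace(0.5, 1.5, dates.size)      # mildly informative alternative

m_flat, m_ramp = posterior_mean(flat), posterior_mean(ramp)

# Sensitivity check: the estimate is "robust" if it barely moves
# when the prior assumption is swapped.
assert abs(m_flat - m_ramp) < 10.0
```

With a sharper ramp (or a likelihood much wider than the prior's scale of variation) the two means drift apart, which is precisely the situation in which, per the abstract, the result should not be used.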



2021 ◽ Author(s): Abdo Abou Jaoude

The concept of mathematical probability was established in 1933 by Andrey Nikolaevich Kolmogorov through a system of five axioms. This system can be enhanced to encompass the set of imaginary numbers after the addition of three novel axioms. As a result, any random experiment can be executed in the complex probabilities set C, which is the sum of the real probabilities set R and the imaginary probabilities set M. We aim here to incorporate supplementary imaginary dimensions into the random experiment occurring in the “real” laboratory in R, and therefore to compute all the probabilities in the sets R, M, and C. Accordingly, the probability in the whole set C = R + M is constantly equal to one, independently of the distribution of the input random variable in R, so the output of the stochastic experiment in R can be determined completely in C. This is a consequence of the fact that the probability in C is computed after subtracting the chaotic factor from the degree of our knowledge of the nondeterministic experiment. We will apply this innovative paradigm to Isaac Newton’s classical mechanics and will also prove, in an original way, an important property at the foundation of statistical physics.



2014 ◽ Vol 24 (3) ◽ Author(s): Christopher P. Porter

In this paper, I discuss the extent to which Kolmogorov drew upon von Mises' work in addressing the problem of why probability is applicable to events in the real world, which I refer to as the problem of the applicability of probability, or the applicability problem for short. In particular, I highlight the role of randomness in Kolmogorov's account, and I argue that this role differs significantly from the role that randomness plays in von Mises' account.



2014 ◽ Vol 2 (1) ◽ Author(s): Alexander Kovačec, Miguel M. R. Moreira, David P. Martins

Alon and Yuster give, for independent identically distributed real- or vector-valued random variables X, Y, combinatorially proved estimates of the form Prob(∥X − Y∥ ≤ b) ≤ c Prob(∥X − Y∥ ≤ a). We derive these using copositive matrices instead. By the same method we also give estimates for the real-valued case, involving X + Y and X − Y, due to Siegmund-Schultze and von Weizsäcker, as generalized by Dong, Li, and Li. Furthermore, we formulate a version of the above inequalities as an integral inequality for monotone functions.
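A Monte Carlo sketch can make the estimate concrete for the real-valued case. The constant c = 2⌈b/a⌉ − 1 used below is the one reported by Alon and Yuster for real-valued variables and should be treated as an assumption of this illustration, as is the choice of a uniform distribution:

```python
import math
import random

# Monte Carlo check of an Alon-Yuster-type estimate
#   Prob(|X - Y| <= b) <= c * Prob(|X - Y| <= a)
# for i.i.d. real random variables, with the assumed constant c = 2*ceil(b/a) - 1.

random.seed(0)
a, b = 1.0, 2.0
c = 2 * math.ceil(b / a) - 1          # c = 3 for b/a = 2

n = 200_000
xs = [random.uniform(0.0, 10.0) for _ in range(n)]
ys = [random.uniform(0.0, 10.0) for _ in range(n)]

p_a = sum(abs(x - y) <= a for x, y in zip(xs, ys)) / n
p_b = sum(abs(x - y) <= b for x, y in zip(xs, ys)) / n

# For U(0, 10) the exact values are p_a = 0.19 and p_b = 0.36, so the
# inequality 0.36 <= 3 * 0.19 = 0.57 holds with room to spare.
assert p_b <= c * p_a
```

For U(0, L) one has Prob(|X − Y| ≤ t) = 2t/L − t²/L², which gives the exact values quoted in the comment; the simulation merely confirms them.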



Synthese ◽ 2021 ◽ Author(s): Miklós Rédei, Zalán Gyenis

It is shown that by realizing the isomorphism features of the frequency and geometric interpretations of probability, Reichenbach comes very close to the idea of identifying mathematical probability theory with measure theory in his 1949 work on the foundations of probability. Some general features of Reichenbach’s axiomatization of probability theory are pointed out as likely obstacles that prevented him from making this conceptual move. The role of isomorphisms of Kolmogorovian probability measure spaces is specified in what we call the “Maxim of Probabilism”, which states that a necessary condition for a concept to be probabilistic is its invariance with respect to measure-theoretic isomorphisms. The functioning of the Maxim of Probabilism is illustrated by the example of conditioning via conditional expectations.


