Basic probability theory

Author(s):  
Thomas P. Trappenberg

This discussion provides a refresher on probability theory, in particular the formulations that form the theoretical language of modern machine learning. Probability theory is the formalism of random numbers, and this chapter outlines what these are and how they are characterized by probability density or probability mass functions. It covers how such functions have traditionally been characterized and reviews how to work with these mathematical objects, such as transforming density functions and measuring differences between density functions. Definitions and basic operations with multiple random variables, including Bayes' rule, are covered. The chapter ends with an outline of some important approximation techniques known as Monte Carlo methods.
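The Monte Carlo idea mentioned at the close of the chapter can be illustrated with a minimal sketch: approximate an expectation by averaging over random draws. The target quantity, sampler, and sample size below are illustrative choices, not taken from the chapter.

```python
import random

def monte_carlo_expectation(f, sampler, n=100_000):
    """Estimate E[f(X)] by averaging f over n independent draws from sampler."""
    return sum(f(sampler()) for _ in range(n)) / n

random.seed(0)
# For X ~ N(0, 1), E[X^2] is the variance, which is exactly 1,
# so the estimate should land close to 1.
estimate = monte_carlo_expectation(lambda x: x * x, lambda: random.gauss(0, 1))
```

The estimator's error shrinks as 1/sqrt(n), which is what makes such sampling approximations practical when the density cannot be integrated in closed form.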

Author(s):  
Eahsan Shahriary ◽  
Amir Hajibabaee

This book offers students and researchers a unique introduction to Bayesian statistics. The authors provide a wonderful journey into the realm of Bayesian probability and inspire readers to become Bayesian statisticians. The book starts with an introduction to probability and covers Bayes' theorem, probability mass functions, probability density functions, the Beta-Binomial conjugate, Markov chain Monte Carlo (MCMC), and the Metropolis-Hastings algorithm. The book is very well written, and its topics are presented concisely with real-world applications, but it does not provide examples of computation using common open-source software.
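Since the review notes the absence of computing examples, a minimal Metropolis-Hastings sketch for the Beta-Binomial setting might look as follows. The data (7 successes in 10 trials), flat Beta(1, 1) prior, proposal width, chain length, and burn-in are all illustrative assumptions, not material from the book.

```python
import math
import random

def log_posterior(theta, k=7, n=10):
    # Binomial likelihood with a flat Beta(1, 1) prior on theta;
    # the conjugate posterior is Beta(k + 1, n - k + 1).
    if not 0 < theta < 1:
        return -math.inf
    return k * math.log(theta) + (n - k) * math.log(1 - theta)

def metropolis_hastings(n_samples=20_000, step=0.1):
    random.seed(1)
    theta = 0.5
    samples = []
    for _ in range(n_samples):
        # Symmetric random-walk proposal, so the acceptance ratio
        # reduces to the posterior ratio.
        proposal = theta + random.gauss(0, step)
        if math.log(random.random()) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal
        samples.append(theta)
    return samples[2_000:]  # discard burn-in

samples = metropolis_hastings()
posterior_mean = sum(samples) / len(samples)
# The conjugate Beta(8, 4) posterior has mean 8/12, so the chain's
# average should be close to 0.667.
```

Because this model is conjugate, the exact answer is available in closed form, which makes it a convenient sanity check for the sampler.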


Author(s):  
M. D. Edge

This chapter considers the rules of probability. Probabilities are non-negative, the probabilities of all possible outcomes sum to one, and the probability that either of two mutually exclusive events occurs is the sum of the probabilities of the two events. Two events are said to be independent if the probability that they both occur is the product of the probabilities that each occurs. Bayes' theorem is used to update probabilities on the basis of new information, and it is shown that the conditional probabilities P(A|B) and P(B|A) are not the same. Finally, the chapter discusses ways in which distributions of random variables can be described, using probability mass functions for discrete random variables and probability density functions for continuous random variables.
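The asymmetry between P(A|B) and P(B|A) is easy to see numerically. The screening scenario below is invented purely for illustration; the base rate, sensitivity, and false-positive rate are made-up numbers.

```python
# Hypothetical screening example chosen only to illustrate Bayes' theorem.
p_disease = 0.01             # P(A): prior probability of the condition
p_pos_given_disease = 0.95   # P(B|A): test sensitivity
p_pos_given_healthy = 0.05   # false-positive rate

# Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A).
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B).
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos

# P(B|A) = 0.95, yet P(A|B) is only about 0.16: the two conditionals
# differ sharply because the condition is rare.
```

Confusing the two conditionals here would overstate the evidence by a factor of roughly six, which is exactly the kind of error Bayes' theorem guards against.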


2021 ◽  
Vol 15 (1) ◽  
pp. 408-433
Author(s):  
Margaux Dugardin ◽  
Werner Schindler ◽  
Sylvain Guilley

Extra-reductions occurring in Montgomery multiplications disclose side-channel information that can be exploited even in stringent contexts. In this article, we derive stochastic attacks to defeat Rivest-Shamir-Adleman (RSA) with Montgomery-ladder regular exponentiation coupled with base blinding. Namely, we leverage precharacterized multivariate probability mass functions of extra-reductions between (multiplication, square) pairs in one iteration of the RSA algorithm and the next one(s) to build a maximum-likelihood distinguisher. The efficiency of our attack (in terms of required traces) is more than double that of the state of the art. In addition to this result, we also apply our method to the case of regular exponentiation with both base blinding and modulus blinding. Quite surprisingly, modulus blinding does not make our attack impossible, even for large sizes of the modulus-randomizing element. At the cost of larger sample sizes, our attacks tolerate noisy measurements. Fortunately, effective countermeasures exist.
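The maximum-likelihood distinguisher at the core of such an attack can be sketched in toy form: for each key-bit hypothesis, sum the log-probabilities of the observed extra-reduction pairs under that hypothesis's PMF, and pick the hypothesis with the larger total. The PMFs, the (multiply, square) tuple encoding, and the trace data below are invented placeholders, not the precharacterized distributions from the article.

```python
import math

# Toy stand-ins for the profiled extra-reduction PMFs. Each key is a
# (multiplication extra-reduction, square extra-reduction) pair of bits.
pmf_bit0 = {(0, 0): 0.50, (0, 1): 0.20, (1, 0): 0.20, (1, 1): 0.10}
pmf_bit1 = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

def log_likelihood(observations, pmf):
    """Sum of log-probabilities of the observed extra-reduction pairs."""
    return sum(math.log(pmf[obs]) for obs in observations)

def distinguish(observations):
    """Return the key-bit hypothesis whose PMF best explains the traces."""
    ll0 = log_likelihood(observations, pmf_bit0)
    ll1 = log_likelihood(observations, pmf_bit1)
    return 0 if ll0 > ll1 else 1

# Observations skewed toward (0, 0) favour hypothesis 0 under these PMFs.
traces = [(0, 0)] * 6 + [(0, 1), (1, 0), (1, 1), (1, 1)]
```

More traces sharpen the log-likelihood gap between hypotheses, which is why the attack's efficiency is naturally measured in the number of required traces.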


Mathematics ◽  
2021 ◽  
Vol 9 (12) ◽  
pp. 1409
Author(s):  
Marija Boričić Joksimović

We give some simple examples of applying well-known elementary probability-theory inequalities and properties in the field of logical argumentation. A probabilistic version of the hypothetical syllogism inference rule is as follows: if propositions A, B, C, A→B, and B→C have probabilities a, b, c, r, and s, respectively, then the probability p of A→C satisfies f(a,b,c,r,s)≤p≤g(a,b,c,r,s) for some functions f and g of the given parameters. In this paper, after a short overview of known rules related to conjunction and disjunction, we propose some probabilized forms of the hypothetical syllogism inference rule, with the best possible bounds for the probability of the conclusion, covering simultaneously the probabilistic versions of both the modus ponens and modus tollens rules, as already considered by Suppes, Hailperin, and Wagner.
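The modus ponens instance of such bounds, max(0, a+r−1) ≤ P(B) ≤ r with a = P(A), r = P(A→B), and A→B read as the material conditional ¬A∨B, can be checked numerically by brute force over random joint distributions. This check is only an illustration of the bound, not the paper's method.

```python
import random

random.seed(42)

# Verify the probabilistic modus ponens bounds
#   max(0, a + r - 1) <= P(B) <= r
# where a = P(A) and r = P(A -> B), with A -> B read as the
# material conditional (not A) or B.
for _ in range(10_000):
    # Random joint distribution over the four truth assignments of (A, B).
    w = [random.random() for _ in range(4)]
    total = sum(w)
    p_ab, p_anb, p_nab, p_nanb = (x / total for x in w)

    a = p_ab + p_anb            # P(A)
    r = p_ab + p_nab + p_nanb   # P(A -> B) = P((not A) or B)
    b = p_ab + p_nab            # P(B)

    assert max(0.0, a + r - 1) - 1e-12 <= b <= r + 1e-12
```

The lower bound is tight because a + r − 1 equals P(A∧B), which never exceeds P(B); the upper bound holds because B entails ¬A∨B.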


