Probability Theory and Some Useful Probability Distributions

Author(s):  
Edwin J. Green ◽  
Andrew O. Finley ◽  
William E. Strawderman
2018 ◽  
Author(s):  
Daniel Mortlock

Mathematics is the language of quantitative science, and probability and statistics are the extension of classical logic to real-world data analysis and experimental design. The basics of mathematical functions and probability theory are summarized here, providing the tools for statistical modeling and assessment of experimental results. There is a focus on the Bayesian approach to such problems (i.e., Bayesian data analysis); therefore, the basic laws of probability are stated, along with several standard probability distributions (e.g., binomial, Poisson, Gaussian). A number of standard classical tests (e.g., p values, the t-test) are also defined and, to the degree possible, linked to the underlying principles of probability theory. This review contains 5 figures, 1 table, and 15 references. Keywords: Bayesian data analysis, mathematical models, power analysis, probability, p values, statistical tests, statistics, survey design
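The standard distributions and classical tests named in this abstract can be sketched with a few stdlib-only Python functions. This is a generic illustration, not material from the review itself, and the function names are assumptions:

```python
from math import comb, exp, factorial, pi, sqrt

def binom_pmf(n, p, k):
    # P(K = k) for K ~ Binomial(n, p)
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    # P(K = k) for K ~ Poisson(lam)
    return lam**k * exp(-lam) / factorial(k)

def gauss_pdf(x, mu, sigma):
    # density of the Gaussian (normal) distribution at x
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

def t_statistic(xs, mu0):
    # one-sample t statistic: distance of the sample mean from mu0,
    # in units of the estimated standard error
    n = len(xs)
    mean = sum(xs) / n
    s = sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))
    return (mean - mu0) / (s / sqrt(n))
```

Converting a t statistic into a p value requires the t distribution's CDF, which the standard library does not provide; packages such as SciPy do.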


2014 ◽  
Vol DMTCS Proceedings vol. AT,... (Proceedings) ◽  
Author(s):  
Natasha Blitvić

A stabilized-interval-free (SIF) permutation on [n], introduced by Callan, is a permutation that does not stabilize any proper interval of [n]. Such permutations are known to be the irreducibles in the decomposition of permutations along non-crossing partitions. That is, if $s_n$ denotes the number of SIF permutations on [n], $S(z)=1+\sum_{n\geq1} s_n z^n$, and $F(z)=1+\sum_{n\geq1} n! z^n$, then $F(z)= S(zF(z))$. This article presents, in turn, a decomposition of SIF permutations along non-crossing partitions. Specifically, by working with a convenient diagrammatic representation, given in terms of perfect matchings on alternating binary strings, we arrive at the chord-connected permutations on [n], counted by $\{c_n\}_{n\geq1}$, whose generating function satisfies $S(z)= C(zS(z))$. The expressions at hand have immediate probabilistic interpretations, via the celebrated moment-cumulant formula of Speicher, in the context of the free probability theory of Voiculescu. The probability distributions that appear are the exponential and the complex Gaussian.
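The functional equation $F(z)=S(zF(z))$ quoted above can be checked numerically by matching power-series coefficients: the following stdlib-only Python sketch recovers the SIF counts $s_n$ from the factorials (function names are illustrative, not from the article):

```python
from math import factorial

def series_mul(a, b, N):
    # multiply two truncated power series given as coefficient lists
    # (index = power of z), keeping terms up to degree N
    c = [0] * (N + 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j <= N:
                c[i + j] += ai * bj
    return c

def sif_counts(N):
    # F(z) = 1 + sum_{n>=1} n! z^n, truncated at degree N
    F = [factorial(n) for n in range(N + 1)]
    w = [0] + F[:N]  # w = z * F(z)
    # precompute w^0, w^1, ..., w^N
    powers = [[1] + [0] * N]
    for _ in range(N):
        powers.append(series_mul(powers[-1], w, N))
    # solve F(z) = S(w) coefficient by coefficient; the z^n coefficient
    # of w^n is 1, so each s_n is determined by the lower-order terms
    s = [1]  # s_0 = 1
    for n in range(1, N + 1):
        acc = sum(s[k] * powers[k][n] for k in range(n))
        s.append(F[n] - acc)
    return s[1:]
```

Running `sif_counts(6)` reproduces the first SIF counts 1, 1, 2, 7, 34, 206.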


2015 ◽  
Author(s):  
PierGianLuca Porta Mana ◽  
Emiliano Torre ◽  
Vahid Rostami

This note summarizes some mathematical relations between the probability distributions for the states of a network of binary units and a subnetwork thereof, under an assumption of symmetry. These relations are standard results of probability theory, but seem to be rarely used in neuroscience. Some of their consequences for inferences between network and subnetwork, especially in connection with the maximum-entropy principle, are briefly discussed. The meanings and applicability of the assumption of symmetry are also discussed.
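One such standard relation can be sketched as follows: if the network distribution is symmetric (exchangeable), it is fully determined by the probabilities of the total activity count, and the count distribution on any subnetwork follows by hypergeometric weighting. A minimal Python illustration of this marginalization, with assumed function names (not code from the note itself):

```python
from math import comb

def subnetwork_count_dist(pN, M):
    # pN[j] = probability that exactly j of the N units are active;
    # a symmetric (exchangeable) distribution is fully specified by these.
    # Returns the count distribution over a subnetwork of M units.
    N = len(pN) - 1
    pM = [0.0] * (M + 1)
    for j, pj in enumerate(pN):
        for k in range(max(0, j - (N - M)), min(M, j) + 1):
            # hypergeometric probability that k of the j active units
            # land inside the chosen M-unit subnetwork
            pM[k] += pj * comb(M, k) * comb(N - M, j - k) / comb(N, j)
    return pM
```

As a sanity check, if the N units are i.i.d. Bernoulli(p), the count is Binomial(N, p) and the subnetwork count comes out Binomial(M, p), as it must.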


In this chapter, the authors discuss some basic concepts of probability theory and possibility theory that are useful when reading the subsequent chapters of this book. The multi-objective fuzzy stochastic programming models developed in this book are based on the concepts of advanced topics in fuzzy set theory and fuzzy random variables (FRVs). Therefore, for a better understanding of these advanced areas, the authors first present some basic ideas of probability theory and the probability density functions of different continuous probability distributions. Afterwards, the necessity of introducing fuzzy set theory is explained, and some important terms related to it are discussed. Different defuzzification methodologies for fuzzy numbers (FNs) that are useful in solving mathematical models in imprecisely defined decision-making environments are explored. The concept of using FRVs in decision-making contexts is defined. Finally, the development of different forms of fuzzy goal programming (FGP) techniques for solving multi-objective decision-making (MODM) problems is outlined.
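As a concrete illustration of one widely used defuzzification methodology (the centroid, or center-of-gravity, method), applied to a triangular fuzzy number it reduces to a one-line formula. This is a generic sketch under that assumption, not the book's own notation:

```python
def centroid_triangular(a, b, c):
    # centroid (center-of-gravity) defuzzification of a triangular
    # fuzzy number with support [a, c] and peak at b; for a triangular
    # membership function the centroid is the average of the vertices
    return (a + b + c) / 3.0
```

Other defuzzification schemes (mean of maxima, signed distance, etc.) yield different crisp values from the same fuzzy number.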


Author(s):  
Kalman Ziha

Abstract The probabilistic safety analysis evaluates system reliability and failure probability using statistics and probability theory, but it cannot estimate the system uncertainties due to variabilities of system state probabilities. The article first reviews how information entropy expresses the probabilistic uncertainties due to unevenness of the probability distributions of system states. Next, it argues that the conditional entropies with respect to system operational and failure states appropriately describe system redundancy and robustness, respectively. Finally, the article concludes that the joint probabilistic uncertainties of reliability, redundancy, and robustness define the integral system safety. The concept of integral system safety allows more comprehensive definitions of favorable system functional properties, as well as configuration evaluation, optimization, and decision making in engineering.
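The entropy quantities the article builds on can be illustrated with a short stdlib-only Python sketch; the partition of states into operational and failure subsets, and the function names, are schematic assumptions here rather than the article's formulation:

```python
from math import log

def entropy(ps):
    # Shannon entropy (in nats) of a discrete probability distribution
    return -sum(p * log(p) for p in ps if p > 0)

def conditional_entropy(ps, subset):
    # entropy of the state distribution conditioned on the event that
    # the system is in one of the states indexed by `subset` (e.g. the
    # operational states for redundancy, the failure states for robustness)
    q = sum(ps[i] for i in subset)
    return entropy([ps[i] / q for i in subset])
```

The more evenly probability is spread over the conditioning subset, the larger the conditional entropy, which matches the intuition that redundancy and robustness reflect how evenly the system can occupy its operational or failure states.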


1995 ◽  
Vol 7 (3) ◽  
pp. 580-595 ◽  
Author(s):  
Alan L. Yuille ◽  
Stelios M. Smirnakis ◽  
Lei Xu

Recent work by Becker and Hinton (1992) shows a promising mechanism, based on maximizing mutual information assuming spatial coherence, by which a system can self-organize to learn visual abilities such as binocular stereo. We introduce a more general criterion, based on Bayesian probability theory, and thereby demonstrate a connection to Bayesian theories of visual perception and to other organization principles for early vision (Atick and Redlich 1990). Methods for implementation using variants of stochastic learning are described.
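The mutual-information quantity underlying Becker and Hinton's criterion can be illustrated by computing $I(X;Y)$ for a discrete joint distribution; this is a generic stdlib-only sketch, not the authors' learning rule:

```python
from math import log

def mutual_information(joint):
    # joint[i][j] = P(X = i, Y = j); returns I(X; Y) in nats
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pij in enumerate(row):
            if pij > 0:
                mi += pij * log(pij / (px[i] * py[j]))
    return mi
```

Independent variables give zero mutual information, while two perfectly coherent binary signals give the maximum of $\log 2$ nats, which is the quantity a coherence-maximizing system would drive upward.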


Author(s):  
M. Vidyasagar

This chapter provides an introduction to probability and random variables. Probability theory is an attempt to formalize the notion of uncertainty in the outcome of an experiment. For instance, suppose an urn contains four balls, colored red, blue, white, and green respectively. Suppose we dip our hand in the urn and pull out one of the balls “at random.” What is the likelihood that the ball we pull out will be red? The chapter first defines a random variable and probability before discussing the function of a random variable and expected value. It then considers total variation distance, joint and marginal probability distributions, independence and conditional probability distributions, Bayes' rule, and maximum likelihood estimates. Finally, it describes random variables assuming infinitely many values, focusing on the Markov and Chebyshev inequalities, Hoeffding's inequality, Monte Carlo simulation, and Cramér's theorem.
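The urn example has likelihood $1/4$ for the red ball, and the chapter's Monte Carlo theme can be illustrated by estimating that probability from repeated simulated draws. A stdlib-only sketch (the function name is an assumption); by Hoeffding's inequality, the estimate from $10^5$ draws lies within $0.01$ of $1/4$ except with negligible probability:

```python
import random

def estimate_red_probability(n_draws, seed=0):
    # Monte Carlo estimate of P(red) for the four-ball urn:
    # draw one ball uniformly at random, n_draws times
    rng = random.Random(seed)
    urn = ["red", "blue", "white", "green"]
    hits = sum(rng.choice(urn) == "red" for _ in range(n_draws))
    return hits / n_draws
```

Hoeffding's bound gives $P(|\hat{p} - 1/4| > \epsilon) \le 2e^{-2n\epsilon^2}$, which for $n = 10^5$ and $\epsilon = 0.01$ is about $4 \times 10^{-9}$.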


Mathematics ◽  
2019 ◽  
Vol 7 (2) ◽  
pp. 191 ◽  
Author(s):  
Shouzhen Zeng ◽  
Shahzaib Asharf ◽  
Muhammad Arif ◽  
Saleem Abdullah

A divergence measure plays a crucial part in discriminating between two probability distributions and drawing inferences based on such discrimination. The intention of this study is to propose such a divergence measure based on Jensen's inequality and exponential entropy in the setting of probability theory. Further, the idea is generalized to fuzzy sets to introduce a novel picture fuzzy divergence measure. Besides establishing its validity, some of its key properties are also deliberated. Finally, two illustrative examples are solved based on the proposed picture fuzzy divergence measure, which shows the expediency and effectiveness of the proposed approach.
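The proposed picture fuzzy divergence measure itself is not reproduced here, but the classical Jensen-Shannon divergence illustrates the general recipe of building a divergence from Jensen's inequality applied to an entropy function (a generic Python sketch, not the authors' measure):

```python
from math import log

def kl(p, q):
    # Kullback-Leibler divergence D(p || q) in nats
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jensen_shannon(p, q):
    # Jensen-Shannon divergence: symmetric and bounded by log 2; its
    # non-negativity follows from Jensen's inequality applied to the
    # concave entropy function at the mixture m = (p + q) / 2
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

It vanishes exactly when the two distributions coincide and attains its maximum of $\log 2$ nats on distributions with disjoint support, the same qualitative behavior any valid divergence measure must exhibit.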

