Review Of Some Mathematical And Physical Subjects

Author(s):  
Abraham Nitzan

This chapter reviews some subjects in mathematics and physics that are used in different contexts throughout this book. The selection of subjects and the level of their coverage reflect the author’s perception of what potential users of this text were exposed to in their earlier studies. Therefore, only a brief overview is given of some subjects, while a somewhat more comprehensive discussion is given of others. In neither case can the coverage provided here substitute for actual study of these subjects, which are covered in detail by many textbooks. A random variable is an observable whose repeated determination yields a series of numerical values (“realizations” of the random variable) that vary from trial to trial in a way characteristic of the observable. The outcomes of tossing a coin or throwing a die are familiar examples of discrete random variables. The position of a dust particle in air and the lifetime of a light bulb are continuous random variables. Discrete random variables are characterized by probability distributions: Pn denotes the probability that a realization of the given random variable is n. Continuous random variables are associated with probability density functions P(x): P(x1)dx denotes the probability that a realization of the variable x will be in the interval x1 . . . x1 + dx.
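The distinction drawn above can be made concrete with a short sketch (illustrative only, not from the chapter): a die throw as a discrete random variable with distribution Pn, and a light-bulb lifetime modelled, as an assumption for the example, by an exponential density, so that P(x1)dx approximates the probability of a realization in [x1, x1 + dx].

```python
import math
import random

random.seed(0)

# Discrete random variable: repeated die throws yield a distribution
# P_n over the outcomes n = 1..6.
throws = [random.randint(1, 6) for _ in range(60000)]
P = {n: throws.count(n) / len(throws) for n in range(1, 7)}

# Continuous random variable: a light-bulb lifetime, modelled here
# (an assumption for illustration) as Exponential with mean 1.
# P(x1) * dx approximates the probability of a realization in [x1, x1+dx).
samples = [random.expovariate(1.0) for _ in range(200000)]
x1, dx = 1.0, 0.05
frac = sum(x1 <= x < x1 + dx for x in samples) / len(samples)
density = math.exp(-x1)          # exact P(x1) for the exponential model
print(frac, density * dx)        # the two values should nearly agree
```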

Author(s):  
M. D. Edge

This chapter considers the rules of probability. Probabilities are non-negative, they sum to one, and the probability that either of two mutually exclusive events occurs is the sum of the probabilities of the two events. Two events are said to be independent if the probability that they both occur is the product of the probabilities that each event occurs. Bayes’ theorem is used to update probabilities on the basis of new information, and it is shown that the conditional probabilities P(A|B) and P(B|A) are not, in general, the same. Finally, the chapter discusses ways in which distributions of random variables can be described, using probability mass functions for discrete random variables and probability density functions for continuous random variables.
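The asymmetry between P(A|B) and P(B|A) is easy to demonstrate numerically. The sketch below uses hypothetical numbers (a diagnostic-test scenario chosen for illustration, not taken from the chapter) and applies Bayes’ theorem directly:

```python
# Hypothetical numbers for illustration: a test for a rare condition.
p_disease = 0.01            # P(D), prevalence
p_pos_given_disease = 0.95  # P(+|D), test sensitivity
p_pos_given_healthy = 0.05  # P(+|not D), false-positive rate

# Law of total probability: P(+) = P(+|D)P(D) + P(+|not D)P(not D).
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(D|+) = P(+|D) P(D) / P(+).
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos

print(p_pos_given_disease)   # P(+|D) = 0.95
print(p_disease_given_pos)   # P(D|+) ≈ 0.161 -- far from 0.95
```

Even with a sensitive test, most positives come from the much larger healthy population, which is why the two conditional probabilities differ so sharply.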


Author(s):  
Therese M. Donovan ◽  
Ruth M. Mickey

This chapter focuses on probability mass functions. One of the primary uses of Bayesian inference is to estimate parameters. To do so, it is necessary to first build a good understanding of probability distributions. This chapter introduces the idea of a random variable and presents general concepts associated with probability distributions for discrete random variables. It starts off by discussing the concept of a function and goes on to describe how a random variable is a type of function. The binomial distribution and the Bernoulli distribution are then used as examples of probability mass functions (pmfs). Pmfs can be used to specify prior distributions, likelihoods, likelihood profiles, and/or posterior distributions in Bayesian inference.
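A minimal sketch (not from the chapter) of the two pmfs named above; the Bernoulli distribution appears as the n = 1 special case of the binomial:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(K = k) for K ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Bernoulli(p) is the n = 1 special case: P(1) = p, P(0) = 1 - p.
p = 0.3
assert binomial_pmf(1, 1, p) == p
assert binomial_pmf(0, 1, p) == 1 - p

# A pmf assigns a non-negative probability to every outcome,
# and the probabilities sum to one.
n = 10
probs = [binomial_pmf(k, n, p) for k in range(n + 1)]
print(sum(probs))  # ≈ 1.0
```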


2021 ◽  
pp. 109-124
Author(s):  
Timothy E. Essington

The chapter “Random Variables and Probability” serves as both a review and a reference on probability. The random variable is the core concept in understanding probability, parameter estimation, and model selection. This chapter reviews the basic idea of a random variable, covers the distinction between the two main kinds of random variables (discrete and continuous), and outlines the most common probability mass or density functions used in ecology. Advanced sections cover distributions such as the gamma distribution, Student’s t-distribution, the beta distribution, the beta-binomial distribution, and zero-inflated models.


Author(s):  
Therese M. Donovan ◽  
Ruth M. Mickey

This chapter builds on probability distributions. Its focus is on general concepts associated with probability density functions (pdfs), which are distributions associated with continuous random variables. The continuous uniform and normal distributions are highlighted as examples of pdfs. These and other pdfs can be used to specify prior distributions, likelihoods, and/or posterior distributions in Bayesian inference. Although the chapter specifically focuses on the continuous uniform and normal distributions, the concepts discussed will apply to other continuous probability distributions. By the end of the chapter, the reader should be able to define and use the following terms for a continuous random variable: random variable, probability distribution, parameter, probability density, likelihood, and likelihood profile.
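A key point about pdfs, in contrast to pmfs, is that a density yields probabilities only through integration. The sketch below (illustrative, not from the chapter) evaluates the normal density and integrates it numerically over one standard deviation on each side of the mean:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of the normal distribution N(mu, sigma^2)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

# A pdf gives probabilities only through integration:
# P(a <= X <= b) = integral of the density from a to b.
mu, sigma = 0.0, 1.0
a, b, n = -1.0, 1.0, 10000
h = (b - a) / n
# Midpoint-rule approximation of the integral.
prob = sum(normal_pdf(a + (i + 0.5) * h, mu, sigma) for i in range(n)) * h
print(prob)  # ≈ 0.6827, the familiar one-sigma probability
```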


Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or more precisely, the probability distribution of the random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
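The three quantities named above have compact definitions that a short sketch can make concrete (an illustration under the usual conventions, not code from the chapter): entropy H(p), and the Kullback–Leibler divergence D(p ∥ q) between two distributions:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits; 0 * log 0 is taken as 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits.

    Assumes q_i > 0 wherever p_i > 0 (absolute continuity).
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]

print(entropy(uniform))                # 2.0 bits, maximal for 4 outcomes
print(entropy(skewed))                 # lower: less uncertainty
print(kl_divergence(skewed, uniform))  # > 0; zero only when p == q
```

Note that D(p ∥ q) is not symmetric in its arguments, which is why it measures a “disparity” rather than a true distance between distributions.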


Author(s):  
Robert H. Swendsen

Because the components of the momenta of the particles in an ideal gas are continuous variables, the theory of probability developed in Chapter 3 for discrete random variables is extended to continuous probability distributions. The Dirac delta function is introduced as a convenient tool for transforming continuous random variables, in analogy with the use of the Kronecker delta for discrete random variables. The properties of the Dirac delta function that are needed in statistical mechanics are presented and explained. The addition of two continuous random numbers is given as a simple example, and an application of Bayesian probability is given to illustrate its significance.
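The addition example can be sketched numerically (an illustration, not the book’s own example): for Z = X + Y the density is the convolution picked out by the delta function, P(z) = ∫∫ dx dy P(x) P(y) δ(z − x − y), which for two independent Uniform(0, 1) variables gives the triangular density on (0, 2).

```python
import random

random.seed(1)

def triangular_pdf(z):
    """Density of the sum of two independent Uniform(0, 1) variables."""
    return z if z <= 1 else 2 - z

# Monte Carlo check: the fraction of sums falling in [z, z + dz)
# should approximate triangular_pdf(z) * dz.
samples = [random.random() + random.random() for _ in range(200000)]
z, dz = 0.5, 0.02
frac = sum(z <= s < z + dz for s in samples) / len(samples)
print(frac, triangular_pdf(z) * dz)  # simulation vs exact, should agree
```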


1987 ◽  
Vol 19 (3) ◽  
pp. 632-651 ◽  
Author(s):  
Ushio Sumita ◽  
Yasushi Masuda

We consider a class of functions on [0, ∞), denoted by Ω, having Laplace transforms with only negative zeros and poles. Of special interest is the class Ω+ of probability density functions in Ω. Simple and useful necessary and sufficient conditions are given for f ∈ Ω to be in Ω+. The class Ω+ contains many classes of great importance, such as mixtures of n independent exponential random variables (CMn), sums of n independent exponential random variables (PF∗n), sums of two independent random variables, one in CMr and the other in PF∗l (CMPFn with n = r + l), and sums of independent random variables in CMn (SCM). Characterization theorems for these classes are given in terms of the zeros and poles of Laplace transforms. The prevalence of these classes in applied probability models of practical importance is demonstrated. In particular, sufficient conditions are given for complete monotonicity and unimodality of modified renewal densities.
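The qualitative difference between the first two classes can be illustrated with a small sketch (not from the article) using two exponential rates chosen for the example: a CM2 mixture density is completely monotone, hence strictly decreasing, while a PF∗2 sum (hypoexponential) density is unimodal with an interior mode.

```python
import math

def mixture_pdf(x, w=0.5):
    """CM2 example: mixture of Exp(1) and Exp(2) densities.

    Completely monotone, so strictly decreasing on [0, ∞)."""
    return w * math.exp(-x) + (1 - w) * 2 * math.exp(-2 * x)

def sum_pdf(x):
    """PF*2 example: density of X + Y with X ~ Exp(1), Y ~ Exp(2).

    f(x) = 2(e^{-x} - e^{-2x}); unimodal with mode at x = ln 2."""
    return 2 * (math.exp(-x) - math.exp(-2 * x))

print(mixture_pdf(0.0))          # 1.5: the mixture starts at its maximum
print(sum_pdf(0.0))              # 0.0: the sum density rises from zero
print(sum_pdf(math.log(2)))      # 0.5: the interior mode
```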


1984 ◽  
Vol 106 (1) ◽  
pp. 5-10 ◽  
Author(s):  
J. N. Siddall

The anomalous position of probability and statistics in both mathematics and engineering is discussed, showing that there is little consensus on concepts and methods. For application in engineering design, probability is defined as strictly subjective in nature. It is argued that classical statistical methods, which generate probability density functions by estimating parameters for assumed theoretical distributions, should be used with caution, and that the use of confidence limits is not really meaningful in a design context. Preferred methods are described, and a new evolutionary technique for developing probability distributions of new random variables is proposed. Although Bayesian methods are commonly considered to be subjective, it is argued that, in the engineering sense, they are really not. A general formulation of the probabilistic optimization problem is described, including the role of subjective probability density functions.

