Uniform Random Variables: Do They Exist in the Subjective Sense?

1992 ◽  
Vol 42 (1-2) ◽  
pp. 125-128 ◽  
Author(s):  
Sudhakar Kunte ◽  
R.N. Rattihalli

Two methods of generating a random variable following a uniform distribution over [0, 1] on the basis of sequences of i.i.d. discrete random variables are discussed.
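
As a point of reference, one classical construction of this kind (not necessarily either of the two methods discussed in the paper) expands U in binary using i.i.d. fair Bernoulli bits, U = Σ_k b_k 2^(-k). A minimal Python sketch, with the bit source left as a pluggable assumption:

```python
import random

def uniform_from_bits(n_bits=53, bit=lambda: random.getrandbits(1)):
    """Approximate a Uniform(0, 1) draw as the binary expansion
    U = sum_k b_k * 2**(-k) built from i.i.d. fair Bernoulli bits."""
    u = 0.0
    for k in range(1, n_bits + 1):
        u += bit() * 2.0 ** (-k)
    return u

# Example: one draw, truncated at double precision (53 bits).
print(uniform_from_bits())
```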

1975 ◽  
Vol 7 (4) ◽  
pp. 830-844 ◽  
Author(s):  
Lajos Takács

A sequence of random variables η_0, η_1, …, η_n, … is defined by the recurrence formula η_n = max(η_{n−1} + ξ_n, 0), where η_0 is a discrete random variable taking on non-negative integers only and ξ_1, ξ_2, …, ξ_n, … is a semi-Markov sequence of discrete random variables taking on integers only. Define Δ as the smallest n = 1, 2, … for which η_n = 0. The random variable η_n can be interpreted as the content of a dam at time t = n (n = 0, 1, 2, …) and Δ as the time of first emptiness. This paper deals with the determination of the distributions of η_n and Δ by using the method of matrix factorisation.
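
The recurrence itself is easy to simulate directly. The sketch below iterates η_n = max(η_{n−1} + ξ_n, 0) and records the first emptiness time Δ; it uses an i.i.d. integer input for brevity rather than the semi-Markov input treated in the paper, and simulation rather than the paper's matrix-factorisation method:

```python
import random

def simulate_dam(eta0, xi_sampler, n_steps):
    """Iterate eta_n = max(eta_{n-1} + xi_n, 0) and report the first
    emptiness time Delta (None if the dam never empties in n_steps)."""
    eta, delta = eta0, None
    path = [eta0]
    for n in range(1, n_steps + 1):
        eta = max(eta + xi_sampler(), 0)
        path.append(eta)
        if delta is None and eta == 0:
            delta = n
    return path, delta

# Toy example with i.i.d. integer inputs (the paper allows a
# semi-Markov input sequence; i.i.d. is used here only for brevity).
path, delta = simulate_dam(eta0=3,
                           xi_sampler=lambda: random.choice([-1, 0, 1]),
                           n_steps=50)
print(path[:10], "first emptiness at n =", delta)
```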


Author(s):  
Therese M. Donovan ◽  
Ruth M. Mickey

This chapter focuses on probability mass functions. One of the primary uses of Bayesian inference is to estimate parameters. To do so, it is necessary to first build a good understanding of probability distributions. This chapter introduces the idea of a random variable and presents general concepts associated with probability distributions for discrete random variables. It starts off by discussing the concept of a function and goes on to describe how a random variable is a type of function. The binomial distribution and the Bernoulli distribution are then used as examples of probability mass functions (pmfs). Pmfs can be used to specify prior distributions, likelihoods, likelihood profiles, and/or posterior distributions in Bayesian inference.
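
As an illustration of a pmf serving these roles, the sketch below runs a grid-style Bayesian update with a binomial likelihood and a discrete prior; the grid values and data are hypothetical, not taken from the chapter:

```python
from math import comb

def binomial_pmf(y, n, p):
    """P(Y = y | n, p) for the binomial distribution."""
    return comb(n, y) * p**y * (1 - p)**(n - y)

# Discrete prior over a grid of candidate p values (hypothetical numbers).
grid = [0.1, 0.3, 0.5, 0.7, 0.9]
prior = [0.2] * 5                      # uniform prior over the grid
y, n = 7, 10                           # observed 7 successes in 10 trials

likelihood = [binomial_pmf(y, n, p) for p in grid]
unnorm = [pr * lik for pr, lik in zip(prior, likelihood)]
posterior = [u / sum(unnorm) for u in unnorm]
print(dict(zip(grid, (round(q, 3) for q in posterior))))
```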


2021 ◽  
pp. 109-124
Author(s):  
Timothy E. Essington

The chapter “Random Variables and Probability” serves as both a review and a reference on probability. The random variable is the core concept in understanding probability, parameter estimation, and model selection. This chapter reviews the basic idea of a random variable, covers the distinction between the two main kinds, discrete and continuous random variables, and outlines the most common probability mass or density functions used in ecology. Advanced sections cover distributions such as the gamma distribution, Student’s t-distribution, the beta distribution, the beta-binomial distribution, and zero-inflated models.
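
As one concrete example of the zero-inflated models mentioned, the following sketch evaluates a zero-inflated Poisson pmf; the parameter values are hypothetical and the parameterization is the common textbook one, not necessarily the chapter's:

```python
from math import exp, factorial

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson pmf: a point mass at zero with weight pi
    mixed with a Poisson(lam) component of weight 1 - pi."""
    poisson = exp(-lam) * lam**k / factorial(k)
    return pi + (1 - pi) * poisson if k == 0 else (1 - pi) * poisson

# Hypothetical parameters: 30% structural zeros, Poisson mean 2.5.
print([round(zip_pmf(k, lam=2.5, pi=0.3), 4) for k in range(6)])
```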


2020 ◽  
Author(s):  
Ahmad Sudi Pratikno

Probability describes a person's chance of obtaining or winning something in an event. A discrete random variable is typically associated with repeated experiments that form a pattern. The probability distribution of a discrete random variable can be obtained by computing the probability of each value the variable may take.
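
A minimal sketch of that idea, assuming a fair die as the discrete random variable (a hypothetical choice): the probability of each value is estimated from repeated experiments.

```python
import random
from collections import Counter

# Estimate the probability distribution of a discrete random variable
# (here a fair die, as a hypothetical example) from repeated experiments.
trials = 10_000
counts = Counter(random.randint(1, 6) for _ in range(trials))
empirical_pmf = {value: counts[value] / trials for value in sorted(counts)}
print(empirical_pmf)   # each value with its estimated probability
```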


Author(s):  
Abraham Nitzan

This chapter reviews some subjects in mathematics and physics that are used in different contexts throughout this book. The selection of subjects and the level of their coverage reflect the author’s perception of what potential users of this text were exposed to in their earlier studies. Therefore, only a brief overview is given of some subjects, while a somewhat more comprehensive discussion is given of others. In neither case can the coverage provided substitute for actual study of these subjects, which are covered in detail in many textbooks. A random variable is an observable whose repeated determination yields a series of numerical values (“realizations” of the random variable) that vary from trial to trial in a way characteristic of the observable. The outcomes of tossing a coin or throwing a die are familiar examples of discrete random variables. The position of a dust particle in air and the lifetime of a light bulb are continuous random variables. Discrete random variables are characterized by probability distributions; P_n denotes the probability that a realization of the given random variable is n. Continuous random variables are associated with probability density functions P(x): P(x_1)dx denotes the probability that a realization of the variable x will lie in the interval x_1 … x_1 + dx.
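
A small sketch of the two cases described above, assuming a fair die for the discrete variable and an exponential lifetime density for the continuous one (both hypothetical choices): probabilities are summed in the former case and obtained as P(x_1)dx in the latter.

```python
from math import exp

# Discrete case: P_n for a fair die; probabilities are summed.
P = {n: 1 / 6 for n in range(1, 7)}
print(sum(P[n] for n in (1, 2)))          # P(outcome in {1, 2}) = 1/3

# Continuous case: exponential lifetime density P(x) = r*exp(-r*x);
# the probability of the interval [x1, x1 + dx] is approximately P(x1)*dx.
r, x1, dx = 0.5, 2.0, 1e-3
print(r * exp(-r * x1) * dx)              # ≈ P(x1 <= X <= x1 + dx)
```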


2019 ◽  
Author(s):  
Tomohiro Nishiyama

The variance and the entropy power of a continuous random variable are bounded from below by the reciprocal of its Fisher information, through the Cramér-Rao bound and Stam's inequality, respectively. In this note, we introduce the Fisher information for discrete random variables and derive a discrete Cramér-Rao-type bound and a discrete Stam's inequality.
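
For orientation, the classical continuous case can be checked directly: for a Gaussian location family, the Cramér-Rao bound and Stam's inequality both hold with equality. The sketch below verifies this for variance σ² = 2; it does not use the paper's discrete definitions.

```python
from math import pi, e, log, exp

# Classical continuous case (not the paper's discrete version): for a
# Gaussian N(mu, sigma^2), the location Fisher information is J = 1/sigma^2,
# and both the Cramer-Rao bound Var >= 1/J and Stam's inequality
# N(X) * J >= 1 hold with equality.
sigma2 = 2.0
J = 1.0 / sigma2                               # Fisher information
var = sigma2
h = 0.5 * log(2 * pi * e * sigma2)             # differential entropy
entropy_power = exp(2 * h) / (2 * pi * e)
print(round(var * J, 12), round(entropy_power * J, 12))   # both 1.0 (equality)
```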


Mathematics ◽  
2021 ◽  
Vol 9 (12) ◽  
pp. 1337
Author(s):  
Gytenis Lileika ◽  
Vigirdas Mackevičius

In this paper, we construct second-order weak split-step approximations of the CKLS and CEV processes that use generation of a three-valued random variable at each discretization step, without switching to another scheme near zero, unlike other known schemes (Alfonsi, 2010; Mackevičius, 2011). To the best of our knowledge, no second-order weak approximations for the CKLS processes have been constructed before. The accuracy of the constructed approximations is illustrated by several simulation examples, with comparisons to the schemes of Alfonsi in the particular case of the CIR process and to our first-order approximations of the CKLS processes (Lileika-Mackevičius, 2020).
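
A commonly used three-valued random variable in second-order weak schemes takes the values ±√3 and 0 with probabilities 1/6, 1/6 and 2/3, matching the moments of N(0, 1) up to order five; the paper's specific three-valued variable may differ, so the sketch below only illustrates this textbook construction.

```python
import math
import random

# Classical three-point random variable used in second-order weak schemes:
# P(xi = ±sqrt(3)) = 1/6, P(xi = 0) = 2/3.  Its moments match those of
# N(0, 1) up to order five (0, 1, 0, 3, 0).
def three_point():
    u = random.random()
    if u < 1 / 6:
        return math.sqrt(3.0)
    if u < 2 / 6:
        return -math.sqrt(3.0)
    return 0.0

# Exact moments of the three-point law, orders 1 through 5.
values, probs = (math.sqrt(3.0), -math.sqrt(3.0), 0.0), (1 / 6, 1 / 6, 2 / 3)
for k in range(1, 6):
    print(k, round(sum(p * v**k for v, p in zip(values, probs)), 10))
```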


1986 ◽  
Vol 23 (04) ◽  
pp. 1013-1018
Author(s):  
B. G. Quinn ◽  
H. L. MacGillivray

Sufficient conditions are presented for the limiting normality of sequences of discrete random variables possessing unimodal distributions. The conditions are applied to obtain normal approximations directly for the hypergeometric distribution and the stationary distribution of a special birth-death process.
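
For a sense of how such a normal approximation looks numerically, the sketch below compares a hypergeometric pmf with the normal density of matching mean and variance; the parameters are hypothetical and not taken from the paper.

```python
from math import comb, sqrt, pi, exp

# Compare the hypergeometric pmf with the normal density having the same
# mean and variance (hypothetical parameters; not the paper's examples).
N, K, n = 500, 200, 50
mean = n * K / N
var = n * (K / N) * (1 - K / N) * (N - n) / (N - 1)

def hypergeom_pmf(k):
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def normal_pdf(x):
    return exp(-(x - mean) ** 2 / (2 * var)) / sqrt(2 * pi * var)

for k in (15, 20, 25):
    print(k, round(hypergeom_pmf(k), 5), round(normal_pdf(k), 5))
```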


Mathematics ◽  
2021 ◽  
Vol 9 (9) ◽  
pp. 981
Author(s):  
Patricia Ortega-Jiménez ◽  
Miguel A. Sordo ◽  
Alfonso Suárez-Llorens

The aim of this paper is twofold. First, we show that the expectation of the absolute value of the difference between two copies, not necessarily independent, of a random variable is a measure of its variability in the sense of Bickel and Lehmann (1979). Moreover, if the two copies are negatively dependent through stochastic ordering, this measure is subadditive. The second purpose of this paper is to provide sufficient conditions for comparing several distances between pairs of random variables (with possibly different distribution functions) in terms of various stochastic orderings. Applications in actuarial and financial risk management are given.
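
In the independent special case this measure reduces to the Gini mean difference, which is easy to estimate by Monte Carlo; a minimal sketch for a standard normal (a hypothetical choice), where E|X − X′| = 2/√π ≈ 1.128:

```python
import random

# Monte Carlo estimate of E|X - X'| for two independent copies of X
# (the independent special case; the paper also treats dependent copies).
random.seed(0)
m = 100_000
x = [random.gauss(0, 1) for _ in range(m)]
x_prime = [random.gauss(0, 1) for _ in range(m)]
gini_mean_diff = sum(abs(a - b) for a, b in zip(x, x_prime)) / m
print(round(gini_mean_diff, 3))   # ≈ 2/sqrt(pi) ≈ 1.128 for a standard normal
```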

