Fourier Analysis and Benford's Law

Author(s):  
Steven J. Miller

This chapter continues the development of the theory of Benford's law. It uses Fourier analysis (in particular, Poisson summation) to prove that many systems either satisfy or almost satisfy the Fundamental Equivalence, and hence either obey Benford's law or are well approximated by it. Examples range from geometric Brownian motions to random matrix theory to products and chains of random variables to special distributions. The chapter furthermore develops the notion of a Benford-good system. Unfortunately, one of the conditions here concerns the cancellation in sums of translated errors related to the cumulative distribution function, and proving the required cancellation often requires techniques specific to the system of interest.
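
As a quick illustration of the kind of system the chapter treats (a minimal sketch of our own, not code from the chapter), the sequence 2^n is a classic Benford-good example: its leading digits converge to the Benford probabilities log10(1 + 1/d).

```python
# Compare the leading-digit frequencies of 2^n against Benford's law
# P(d) = log10(1 + 1/d). The sequence 2^n is a standard Benford example.
import math
from collections import Counter

def leading_digit(x: int) -> int:
    """Return the first significant (decimal) digit of a positive integer."""
    return int(str(x)[0])

N = 10_000
counts = Counter(leading_digit(2**n) for n in range(1, N + 1))

print(f"{'d':>2} {'observed':>9} {'Benford':>9}")
for d in range(1, 10):
    observed = counts[d] / N
    benford = math.log10(1 + 1 / d)
    print(f"{d:>2} {observed:9.4f} {benford:9.4f}")
```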

Author(s):  
Robert J Marks II

In this chapter, we present applications of Fourier analysis to probability, random variables, and stochastic processes [1089, 1097, 1387, 1329]. A random variable X is the assignment of a number to the outcome of a random experiment. We can, for example, flip a coin and assign heads the outcome X = 1 and tails X = 0. Often the number is equated with the numerical outcome of the experiment, such as the number of dots on the face of a rolled die or the measurement of a voltage in a noisy circuit. The cumulative distribution function is defined by F_X(x) = Pr[X ≤ x], and the probability density function is its derivative, f_X(x) = (d/dx) F_X(x). Our treatment of random variables focuses on the use of Fourier analysis. Because of this viewpoint, the development we use is unconventional and begins immediately in the next section with a discussion of the properties of the probability density function.
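
A minimal numerical sketch of the two definitions above (the normal example and sample sizes are our own illustration, not from the chapter): estimate F_X from samples as an empirical CDF, then recover f_X by a finite difference.

```python
# Estimate F_X(x) = Pr[X <= x] from samples, and approximate the density
# f_X(x) = dF_X/dx with a central finite difference.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(loc=0.0, scale=1.0, size=100_000)  # X ~ N(0, 1)

def empirical_cdf(samples: np.ndarray, x: float) -> float:
    """F_X(x), estimated as the fraction of samples <= x."""
    return float(np.mean(samples <= x))

x, h = 0.5, 1e-2
F = empirical_cdf(samples, x)
f = (empirical_cdf(samples, x + h) - empirical_cdf(samples, x - h)) / (2 * h)
print(f"F_X({x}) ~ {F:.4f}   (exact: 0.6915)")
print(f"f_X({x}) ~ {f:.4f}   (exact: 0.3521)")
```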


Author(s):  
RONALD R. YAGER

We look at the issue of obtaining a variance-like measure associated with probability distributions over ordinal sets; we call these dissonance measures. We specify some general properties desired of these dissonance measures, point out the centrality of the cumulative distribution function in formulating the concept of dissonance, and introduce some specific examples of measures of dissonance.
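
The abstract does not reproduce the measures themselves; as a hedged illustration of how a CDF-based dissonance measure might look, one natural candidate (not necessarily the paper's) sums F_i(1 − F_i) over the ordered categories, vanishing under full consensus and peaking when the mass splits between the two extremes.

```python
# A hypothetical CDF-based dispersion measure over ordered categories:
# 0 when all probability sits on one category, maximal when the mass is
# split between the two extreme categories.
from itertools import accumulate

def dissonance(p: list[float]) -> float:
    """Sum of F_i * (1 - F_i) over the cumulative distribution function."""
    F = list(accumulate(p))  # cumulative probabilities
    return sum(Fi * (1 - Fi) for Fi in F[:-1])

print(dissonance([0, 1, 0, 0]))              # 0.0   -> full consensus
print(dissonance([0.5, 0, 0, 0.5]))          # 0.75  -> maximal dissonance
print(dissonance([0.25, 0.25, 0.25, 0.25]))  # 0.625 -> intermediate
```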


2017, Vol. 20(5), pp. 939-951
Author(s):  
Amal Almarwani ◽  
Bashair Aljohani ◽  
Rasha Almutairi ◽  
Nada Albalawi ◽  
Alya O. Al Mutairi

2018, Vol. 47(2), pp. 53-67
Author(s):  
Jalal Chachi

In this paper, first a new notion of fuzzy random variables is introduced. Then, using classical techniques in probability theory, some concepts and results associated with a random variable (including expectation, variance, covariance, correlation coefficient, etc.) are extended to this new environment. Furthermore, within this framework, we can use the tools of general probability theory to define the fuzzy cumulative distribution function of a fuzzy random variable.
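
The paper's construction is not reproduced here; the following is a hedged sketch under one common convention (triangular fuzzy numbers with componentwise arithmetic) of how the expectation of a fuzzy random variable might be computed. The paper's actual definitions may differ.

```python
# A fuzzy random variable assigns a fuzzy number to each outcome. For
# triangular fuzzy numbers, the probability-weighted expectation reduces
# to averaging the three defining points componentwise.
from dataclasses import dataclass

@dataclass
class TriangularFuzzyNumber:
    left: float   # support lower bound
    peak: float   # point of membership 1
    right: float  # support upper bound

def expectation(values: list[TriangularFuzzyNumber],
                probs: list[float]) -> TriangularFuzzyNumber:
    """Probability-weighted, componentwise mean of triangular fuzzy values."""
    return TriangularFuzzyNumber(
        left=sum(p * v.left for p, v in zip(probs, values)),
        peak=sum(p * v.peak for p, v in zip(probs, values)),
        right=sum(p * v.right for p, v in zip(probs, values)),
    )

# Fair coin whose outcomes carry fuzzy payoffs "about 0" and "about 10":
X = [TriangularFuzzyNumber(-1, 0, 1), TriangularFuzzyNumber(8, 10, 12)]
print(expectation(X, [0.5, 0.5]))  # left=3.5, peak=5.0, right=6.5
```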


Author(s):  
Arno Berger ◽  
Theodore P. Hill

This book provides the first comprehensive treatment of Benford's law, the surprising logarithmic distribution of significant digits discovered in the late nineteenth century. Establishing the mathematical and statistical principles that underpin this intriguing phenomenon, the text combines up-to-date theoretical results with overviews of the law's colorful history, rapidly growing body of empirical evidence, and wide range of applications. The book begins with basic facts about significant digits, Benford functions, sequences, and random variables, including tools from the theory of uniform distribution. After introducing the scale-, base-, and sum-invariance characterizations of the law, the book develops the significant-digit properties of both deterministic and stochastic processes, such as iterations of functions, powers of matrices, differential equations, and products, powers, and mixtures of random variables. Two concluding chapters survey the finitely additive theory and the flourishing applications of Benford's law. Carefully selected diagrams, tables, and close to 150 examples illuminate the main concepts throughout. The book includes many open problems, in addition to dozens of new basic theorems and all the main references. A distinguishing feature is the emphasis on the surprising ubiquity and robustness of the significant-digit law. The book can serve as both a primary reference and a basis for seminars and courses.


2021
Author(s):  
Nefeli Moridis ◽  
W. John Lee ◽  
Wayne Sim ◽  
Thomas Blasingame

The objective of this work is to numerically estimate the fraction of Reserves assigned to each Reserves category of the PRMS matrix through a cumulative distribution function. We selected 38 wells from a Permian Basin dataset available to Texas A&M University. Previous work has shown that Swanson's Mean (SM), which relates the Reserves categories through the CDF of a normal distribution, is an inaccurate method for determining the relationship of the Reserves categories with asymmetric distributions. Production data are lognormally distributed, regardless of basin type, and thus cannot follow the SM concept.

Gaussian quadrature (GQ) provides a methodology to accurately estimate the fractions of Reserves that lie in the 1P, 2P, and 3P categories, known as the weights. Gaussian quadrature is a numerical integration method that uses discrete random variables and a distribution that matches the original data. For this work, we associate the lognormal cumulative distribution function (CDF) with a set of discrete random variables that replace the production data and determine the associated probabilities. Because the production data for both conventional and unconventional fields are lognormally distributed, we expect that this methodology can be implemented in any field.

To do this, we performed probabilistic decline curve analysis (DCA) using Arps' hyperbolic model and Monte Carlo simulation to obtain the 1P, 2P, and 3P volumes and calculated the relative weights of each Reserves category. We also performed probabilistic rate transient analysis (RTA) using commercial software to obtain the 1P, 2P, and 3P volumes and again calculated the relative weights. We implemented the 3-, 5-, and 10-point GQ to obtain the weights and percentiles for each well, and then validated the GQ results by calculating the percent difference between the probabilistic DCA, RTA, and GQ results. Finally, we increased the standard deviation to account for the uncertainty of Contingent and Prospective Resources and implemented the 3-, 5-, and 10-point GQ on these volumes as well; this allows us to approximate their weights and track them through the life of a given project.

The probabilistic DCA, RTA, and Reserves results indicate that the SM is an inaccurate method for estimating the relative weights of each Reserves category. The 1C, 2C, and 3C Contingent Resources and the 1U, 2U, and 3U Prospective Resources are distributed in a similar way but with greater variance, incorporated in the standard deviation. The results show that the GQ is able to capture an accurate representation of the Reserves weights through a lognormal CDF. Based on these results, we believe that the GQ is accurate and can be used to approximate the relationship between the PRMS categories. This relationship will aid in booking Reserves with the SEC because it can be recreated for any field. These distributions of Reserves and resources other than Reserves (ROTR) are important for planning and for resource inventorying. The GQ provides a measure of confidence in the prediction of the Reserves weights because of the low percent difference between the probabilistic DCA, RTA, and GQ weights. This methodology can be implemented in both conventional and unconventional fields.
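
To make the discretization concrete, the following is a minimal sketch of an n-point Gaussian-quadrature rule applied to a lognormal variable, as the abstract describes; the (mu, sigma) parameters are illustrative placeholders, not values from the paper's Permian Basin dataset.

```python
# Discretize a lognormal variable into n nodes and probability weights
# using Gauss-Hermite quadrature, then check the recovered mean.
import numpy as np

def lognormal_gq(mu: float, sigma: float, n: int):
    """Discretize X = exp(mu + sigma*Z), Z ~ N(0,1), into n nodes/weights.

    Gauss-Hermite quadrature gives E[g(X)] ~ sum(w_i * g(x_i)).
    """
    z, w = np.polynomial.hermite.hermgauss(n)    # physicists' Hermite rule
    nodes = np.exp(mu + sigma * np.sqrt(2.0) * z)
    weights = w / np.sqrt(np.pi)                 # normalize to probabilities
    return nodes, weights

mu, sigma = np.log(100.0), 0.8                   # hypothetical reserves distribution
exact_mean = np.exp(mu + sigma**2 / 2)
for n in (3, 5, 10):
    x, w = lognormal_gq(mu, sigma, n)
    print(f"{n:>2}-point GQ mean ~ {np.sum(w * x):10.2f}  (exact: {exact_mean:.2f})")
```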


2018, Vol. 388, pp. 350-381
Author(s):  
Thealexa Becker ◽  
David Burt ◽  
Taylor C. Corcoran ◽  
Alec Greaves-Tunnell ◽  
Joseph R. Iafrate ◽  
...  

2016, Vol. 106(5), pp. 597-601
Author(s):  
Matthew Gentzkow ◽  
Emir Kamenica

Rothschild and Stiglitz (1970) represent random variables as convex functions (integrals of the cumulative distribution function). Combining this representation with Blackwell's Theorem (1953), we characterize distributions of posterior means that can be induced by a signal. This characterization provides a novel way to analyze a class of Bayesian persuasion problems.
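
As a hedged numerical sketch of the Rothschild-Stiglitz representation (our own illustration, not from the paper): G is a mean-preserving spread of F exactly when the integral of G's CDF lies weakly above that of F everywhere, with equal means. Under a more informative signal, the distribution of posterior means spreads out in exactly this sense.

```python
# Check the integrated-CDF (convex function) ordering between two
# sampled distributions of posterior means.
import numpy as np

grid = np.linspace(-4, 4, 2001)
dx = grid[1] - grid[0]

def integrated_cdf(samples: np.ndarray) -> np.ndarray:
    """Pointwise integral of the empirical CDF over `grid` (a convex function)."""
    s = np.sort(samples)
    cdf = np.searchsorted(s, grid, side="right") / len(s)
    return np.cumsum(cdf) * dx

rng = np.random.default_rng(1)
F = rng.normal(0, 1, 50_000)  # posterior means under a coarse signal
G = rng.normal(0, 2, 50_000)  # same mean, more informative -> more spread

gap = integrated_cdf(G) - integrated_cdf(F)
print("means:", F.mean().round(3), G.mean().round(3))
print("G a mean-preserving spread of F?", bool(np.all(gap >= -1e-3)))
```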

