Keynes Rejected the Concepts of Probabilistic Truth, True Expected Values, True Expectations, True Probability Distributions and True Probabilities: ‘Probability Begins and Ends With Probability’ (Keynes, 1921)

2020 ◽  
Author(s):  
Michael Emmett Brady


2004 ◽  
Vol 16 (4) ◽  
pp. 665-672 ◽  
Author(s):  
Rick L. Jenison ◽  
Richard A. Reale

The product-moment correlation coefficient is often viewed as a natural measure of dependence. However, this view is justified only in the context of elliptical distributions, most commonly the multivariate Gaussian, where linear correlation does fully describe the underlying dependence structure. When the true probability distributions deviate from those with elliptical contours, linear correlation may convey misleading information about the actual underlying dependencies. Probability distributions other than the Gaussian are often necessary to properly capture the stochastic nature of single neurons, which greatly complicates the construction of a flexible model of covariance. We show how arbitrary probability densities can be coupled to allow greater flexibility in the construction of multivariate neural population models.
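One standard way to realize such a coupling is through a copula, which separates the dependence structure from the marginal distributions. The sketch below illustrates the general technique (not the authors' specific construction): it couples two non-Gaussian marginals through a Gaussian copula. The marginal choices and the correlation value are assumptions for illustration.

```python
import numpy as np
from scipy import stats

# Gaussian copula sketch: draw correlated uniforms from a latent bivariate
# normal, then map them through arbitrary inverse CDFs. The marginals below
# (gamma- and exponential-shaped, loosely firing-rate-like) and rho = 0.6
# are illustrative assumptions.
rng = np.random.default_rng(0)
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])

z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)
u = stats.norm.cdf(z)                            # correlated uniforms on [0, 1]^2

x = stats.gamma(a=2.0, scale=1.5).ppf(u[:, 0])   # neuron 1: gamma marginal
y = stats.expon(scale=2.0).ppf(u[:, 1])          # neuron 2: exponential marginal

# The coupled sample preserves each marginal exactly while inducing dependence.
print(np.corrcoef(x, y)[0, 1])
```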


Author(s):  
Baisravan HomChaudhuri

Abstract This paper focuses on distributionally robust controller design for avoiding dynamic and stochastic obstacles whose exact probability distribution is unknown. The true probability distribution of the disturbance associated with an obstacle, although unknown, is assumed to belong to an ambiguity set comprising all probability distributions that share the same first two moments. The controller therefore enforces the probabilistic collision avoidance constraints for every distribution in the ambiguity set, making the solution robust to the true probability distribution of the stochastic obstacles. Techniques from robust optimization are used to model the distributionally robust probabilistic (chance) constraints as a semi-definite programming (SDP) problem with linear matrix inequality (LMI) constraints that can be solved in a computationally tractable fashion. Simulation results for a robot obstacle avoidance problem show the efficacy of our method.
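For a single half-space avoidance constraint under a moment-based ambiguity set, the distributionally robust chance constraint admits a well-known closed-form reformulation; in this simplest case it reduces to a second-order cone constraint rather than a full SDP. The sketch below (using cvxpy; the problem setup and all numerical values are illustrative assumptions, not the paper's formulation) shows the idea.

```python
import numpy as np
import cvxpy as cp

# Sketch: pick a waypoint x near a goal while guaranteeing, for EVERY
# distribution of the obstacle position o with mean mu and covariance Sigma,
# that P(a^T (x - o) >= d) >= 1 - eps. For moment-based ambiguity sets this
# chance constraint is equivalent to
#     a^T x - a^T mu - d >= sqrt((1 - eps) / eps) * || Sigma^{1/2} a ||.
# All numbers below are illustrative assumptions.
a = np.array([1.0, 0.0])           # assumed separating direction
mu = np.array([5.0, 0.0])          # obstacle mean position
Sigma = np.diag([0.5, 0.2])        # obstacle covariance
d, eps = 1.0, 0.05                 # required clearance and risk level

kappa = np.sqrt((1 - eps) / eps)   # distributionally robust safety factor
margin = kappa * np.linalg.norm(np.linalg.cholesky(Sigma).T @ a)

x = cp.Variable(2)
goal = np.array([4.0, 3.0])
prob = cp.Problem(cp.Minimize(cp.sum_squares(x - goal)),
                  [a @ x - a @ mu - d >= margin])
prob.solve()
print(x.value)
```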


Entropy ◽  
2021 ◽  
Vol 23 (9) ◽  
pp. 1122
Author(s):  
Serafín Moral ◽  
Andrés Cano ◽  
Manuel Gómez-Olmedo

Kullback–Leibler divergence KL(p,q) is the standard measure of error when a true probability distribution p is approximated by a probability distribution q. Its efficient computation is essential in many tasks, such as approximate inference or measuring error when learning a probability distribution. For high-dimensional distributions, such as those associated with Bayesian networks, direct computation can be infeasible. This paper considers the problem of efficiently computing the Kullback–Leibler divergence of two probability distributions, each coming from a different Bayesian network, possibly with different structures. The approach is based on an auxiliary deletion algorithm to compute the necessary marginal distributions, using a cache of operations with potentials in order to reuse past computations whenever possible. The algorithms are tested with Bayesian networks from the bnlearn repository. Computer code in Python is provided, built on pgmpy, a library for working with probabilistic graphical models.
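As a point of reference for what the paper's algorithm computes, the brute-force definition can be coded directly for small discrete networks. This sketch assumes binary variables and pgmpy model objects (the library the abstract mentions); it is an exhaustive enumeration for correctness checking, not the paper's cached deletion algorithm, which avoids this exponential cost.

```python
import itertools
import numpy as np

# Brute-force KL(p, q) between two Bayesian networks defined over the same
# binary variables. Exponential in the number of variables; for reference
# on small networks only. Models are assumed to be pgmpy discrete Bayesian
# networks whose CPDs expose .variables and an ndarray .values indexed by
# the states of those variables.

def joint_prob(model, assignment):
    """Probability of one full assignment under a discrete Bayesian network."""
    p = 1.0
    for cpd in model.get_cpds():
        idx = tuple(assignment[v] for v in cpd.variables)
        p *= cpd.values[idx]
    return p

def kl_divergence(p_model, q_model, variables):
    """KL(p, q) by summing over all joint assignments of binary variables."""
    kl = 0.0
    for states in itertools.product([0, 1], repeat=len(variables)):
        assignment = dict(zip(variables, states))
        p = joint_prob(p_model, assignment)
        q = joint_prob(q_model, assignment)
        if p > 0:
            kl += p * np.log(p / q)
    return kl
```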


Author(s):  
Brendan Juba

Standard approaches to probabilistic reasoning require that one possesses an explicit model of the distribution in question. But the empirical learning of models of probability distributions from partial observations is a problem for which efficient algorithms are generally not known. In this work we consider the use of bounded-degree fragments of the “sum-of-squares” logic as a probability logic. Prior work has shown that refutability for such fragments can be decided in polynomial time. We propose to use these fragments to decide queries about whether a given probability distribution satisfies a given system of constraints and bounds on expected values. We show that in answering such queries, the constraints and bounds can be implicitly learned from partial observations in polynomial time as well. It is known that this logic is capable of deriving many bounds that are useful in probabilistic analysis. We show here that it furthermore captures key polynomial-time fragments of resolution; thus, these fragments are also quite expressive.
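A minimal worked example of the flavor of reasoning involved: a degree-2 sum-of-squares refutation of an inconsistent pair of expected-value claims. The specific numbers are assumptions chosen for illustration, not an example from the paper.

```python
# Degree-2 sum-of-squares refutation sketch: no probability distribution can
# satisfy E[x] = 2 and E[x^2] = 1, because the square (x - 2)^2 would then
# have negative expectation. Any valid expectation operator is linear and
# nonnegative on squares, so expanding E[(x - 2)^2] refutes the claim:
E_x, E_x2 = 2.0, 1.0              # hypothetical queried moment values
E_square = E_x2 - 4 * E_x + 4     # E[(x - 2)^2] by linearity of expectation
assert E_square < 0               # -3 < 0: the moment claim is refuted
```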


1997 ◽  
Vol 161 ◽  
pp. 197-201 ◽  
Author(s):  
Duncan Steel

Abstract Whilst lithopanspermia depends upon massive impacts occurring at a speed above some limit, the intact delivery of organic chemicals or other volatiles to a planet requires the impact speed to be below some other limit, such that a significant fraction of that material escapes destruction. Thus the two opposite ends of the impact speed distributions are the regions of interest in the bioastronomical context, whereas much modelling work on impacts delivers, or makes use of, only the mean speed. Here the probability distributions of impact speeds upon Mars are calculated for (i) the orbital distribution of known asteroids; and (ii) the expected distribution of near-parabolic cometary orbits. It is found that cometary impacts are far more likely to eject rocks from Mars (over 99 percent of cometary impacts occur at speeds above 20 km/sec, compared with at most 5 percent of asteroidal impacts); paradoxically, the objects impacting at speeds low enough to make organic/volatile survival possible (the asteroids) are those which are depleted in such species.
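As a rough illustration of how impact speeds follow from orbital elements, an Öpik-style estimate can be sketched as below. This is a simplification assuming a circular Mars orbit and crossing orbits, not the paper's full calculation, and the example orbits are assumed values.

```python
import numpy as np

# Öpik-style sketch: encounter speed at Mars from an impactor's heliocentric
# semimajor axis a (AU), eccentricity e and inclination i, via the Tisserand
# parameter with respect to Mars, then the impact speed after gravitational
# focusing by Mars.
A_MARS = 1.524   # Mars semimajor axis, AU
V_MARS = 24.1    # Mars orbital speed, km/s
V_ESC = 5.0      # Mars escape velocity, km/s (approximate)

def impact_speed(a, e, inc_deg):
    tiss = A_MARS / a + 2 * np.cos(np.radians(inc_deg)) * np.sqrt(a / A_MARS * (1 - e**2))
    u = V_MARS * np.sqrt(max(3.0 - tiss, 0.0))   # encounter speed "at infinity"
    return np.hypot(u, V_ESC)                    # impact speed, km/s

# A near-parabolic comet hits far faster than a typical Mars-crossing asteroid:
print(impact_speed(a=20.0, e=0.95, inc_deg=60.0))   # cometary-like orbit, ~33 km/s
print(impact_speed(a=2.2, e=0.3, inc_deg=5.0))      # asteroid-like orbit, ~6 km/s
```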


Author(s):  
B. D. Athey ◽  
A. L. Stout ◽  
M. F. Smith ◽  
J. P. Langmore

Although there is general agreement that inactive chromosome fibers consist of helically packed nucleosomes, the pattern of packing is still undetermined. Only one of the proposed models, the crossed-linker model, predicts a variable diameter dependent on the length of DNA between nucleosomes. Measurements of the fiber diameter of negatively stained and frozen-hydrated chromatin from Thyone sperm (87 bp linker) and Necturus erythrocytes (48 bp linker) have previously been reported from this laboratory. We now introduce a more reliable method of measuring the diameters of electron images of fibrous objects. The procedure uses a modified version of the computer program TOTAL, which takes a two-dimensional projection of the fiber density (represented by the micrograph itself) and projects it down the fiber axis onto one dimension. We illustrate this method using high-contrast, in-focus STEM images of TMV and chromatin from Thyone and Necturus. The measured diameters are in quantitative agreement with the expected values for the crossed-linker model of chromatin structure.
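The projection step can be sketched in a few lines. This is a schematic of the general approach, not the TOTAL program itself, and the full-width-at-half-maximum criterion for the diameter is an assumption for illustration.

```python
import numpy as np

# Sketch: with the fiber running vertically in the image (along axis 0),
# summing the 2-D density over axis 0 projects it down the fiber axis,
# leaving a 1-D transverse profile whose width estimates the fiber diameter.

def transverse_profile(image):
    """Project a micrograph patch down the (vertical) fiber axis."""
    return image.sum(axis=0)

def fwhm_diameter(profile, pixel_nm):
    """Fiber diameter as the full width at half maximum of the profile, in nm."""
    prof = profile - profile.min()
    above = np.where(prof >= prof.max() / 2)[0]
    return (above[-1] - above[0] + 1) * pixel_nm
```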


2020 ◽  
Vol 3 (1) ◽  
pp. 10501-1-10501-9
Author(s):  
Christopher W. Tyler

Abstract For the visual world in which we operate, the core issue is to conceptualize how its three-dimensional structure is encoded through the neural computation of multiple depth cues and their integration to a unitary depth structure. One approach to this issue is the full Bayesian model of scene understanding, but this is shown to require selection from the implausibly large number of possible scenes. An alternative approach is to propagate the implied depth structure solution for the scene through the “belief propagation” algorithm on general probability distributions. However, a more efficient model of local slant propagation is developed as an alternative.

The overall depth percept must be derived from the combination of all available depth cues, but a simple linear summation rule across, say, a dozen different depth cues would massively overestimate the perceived depth in the scene in cases where each cue alone provides a close-to-veridical depth estimate. On the other hand, a Bayesian averaging or “modified weak fusion” model for depth cue combination does not provide for the observed enhancement of perceived depth from weak depth cues. Thus, the current models do not account for the empirical properties of perceived depth from multiple depth cues.

The present analysis shows that these problems can be addressed by an asymptotic, or hyperbolic Minkowski, approach to cue combination. With appropriate parameters, this first-order rule gives strong summation for a few depth cues, but the effect of an increasing number of cues beyond that remains too weak to account for the available degree of perceived depth magnitude. Finally, an accelerated asymptotic rule is proposed to match the empirical strength of perceived depth as measured, with appropriate behavior for any number of depth cues.
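The contrast between the combination rules can be made concrete with a toy computation. The Minkowski exponent and the unit cue values below are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

# Toy comparison of depth-cue combination rules when each of n cues alone
# gives a near-veridical depth estimate of 1.0.

def linear_sum(cues):
    return np.sum(cues)          # grossly overestimates for many cues

def bayesian_average(cues):
    return np.mean(cues)         # no enhancement from additional cues

def minkowski(cues, m=4.0):
    # Minkowski-style rule: strong summation for the first few cues,
    # saturating as more cues are added (n equal unit cues give n**(1/m)).
    return np.sum(np.asarray(cues) ** m) ** (1.0 / m)

for n in (1, 2, 3, 6, 12):
    cues = np.ones(n)
    print(n, linear_sum(cues), bayesian_average(cues), round(minkowski(cues), 2))
```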

