The overlapping coefficient as a measure of agreement between probability distributions and point estimation of the overlap of two normal densities

1989 ◽  
Vol 18 (10) ◽  
pp. 3851-3874 ◽  
Author(s):  
Henry F. Inman ◽  
Edwin L. Bradley


2019 ◽ 
Vol 20 (01) ◽  
pp. 2050008 ◽  
Author(s):  
Lifeng Xin ◽  
Xiaozhen Li ◽  
Jiaxin Zhang ◽  
Yan Zhu ◽  
Lin Xiao

Over recent decades, the resonance-related dynamics of bridge systems subjected to a moving train have been studied and discussed from the perspectives of mechanics, physics and mathematics. In the current work, new perspectives on train-induced resonance analysis are investigated by introducing a random propagation process into the train–bridge dynamic interactions. In addition, the Nataf-transformation-based point estimation method is applied to generate pseudorandom variables following arbitrarily correlated probability distributions. A three-dimensional (3D) nonlinear train–ballasted track–bridge interaction model, founded on fundamental physical and mechanical principles, is employed to represent the train–bridge interactions with random properties taken into account. Extensive applications are then illustrated in detail to reveal the statistical characteristics of the so-called “random resonance”. Numerical results show that the critical train speeds associated with resonance and cancellation are random in essence owing to the variability of system parameters; the correlation between parameters exerts an obvious influence on system dynamic behaviors; the last vehicle of a train experiences more violent vibrations than the front vehicles; and the influences of track irregularities on the wheel–rail interactions are significantly greater than those of resonance.
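As a rough sketch of the Nataf-based sampling step mentioned above, the following Python snippet draws correlated non-Gaussian parameter samples by mapping correlated standard normals through the inverse CDFs of the target marginals. It is not the authors' implementation: the marginal distributions, the correlation value, and the shortcut of using the Gaussian-copula correlation directly (the full Nataf transformation solves for it from the prescribed correlation of the target variables) are illustrative assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Assumed marginals for two system parameters, e.g. a lognormal train speed
# and a normal track stiffness (illustrative values only).
marginals = [stats.lognorm(s=0.1, scale=80.0),
             stats.norm(loc=1.2e8, scale=1.0e7)]

# Assumed correlation of the underlying Gaussian copula; the full Nataf method
# would solve this from the target correlation rather than fixing it directly.
rho_z = np.array([[1.0, 0.3],
                  [0.3, 1.0]])

# Correlated standard normals -> correlated uniforms -> target marginals.
L = np.linalg.cholesky(rho_z)
z = rng.standard_normal((10_000, 2)) @ L.T
u = stats.norm.cdf(z)
x = np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(marginals)])

print("sample correlation:\n", np.corrcoef(x, rowvar=False))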


2022 ◽  
pp. 1-24
Author(s):  
Kohei Ichikawa ◽  
Asaki Kataoka

Abstract Animals make efficient probabilistic inferences based on uncertain and noisy information from the outside environment. Probabilistic population codes, which have been proposed as a neural basis for encoding probability distributions, are known to allow general neural networks (NNs) to perform near-optimal point estimation. However, the mechanism of sampling-based probabilistic inference has not been clarified. In this study, we trained two types of artificial NNs, a feedforward NN (FFNN) and a recurrent NN (RNN), to perform sampling-based probabilistic inference, and we then analyzed and compared their sampling mechanisms. We found that sampling in the RNN was performed by a mechanism that efficiently exploits the properties of dynamical systems, unlike in the FFNN. In addition, we found that sampling in RNNs acted as an inductive bias, enabling more accurate estimation than maximum a posteriori (MAP) estimation. These results provide important arguments for discussing the relationship between dynamical systems and information processing in NNs.
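As a toy illustration of why a sampling-based estimate can beat MAP estimation under squared-error loss, the snippet below compares the two point estimates for a skewed posterior. This is not the authors' network experiment; the Gamma posterior and its parameters are assumptions chosen only to make the posterior asymmetric.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Assumed skewed posterior p(theta | data): Gamma(shape=2, rate=1).
shape, rate = 2.0, 1.0
posterior = stats.gamma(a=shape, scale=1.0 / rate)

theta_map = (shape - 1.0) / rate                                 # posterior mode
theta_sampled = posterior.rvs(10_000, random_state=rng).mean()   # sampling-based estimate

# Posterior expected squared error of each point estimate.
draws = posterior.rvs(100_000, random_state=rng)
print("expected squared error, MAP:     ", np.mean((draws - theta_map) ** 2))
print("expected squared error, sampling:", np.mean((draws - theta_sampled) ** 2))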


A change point reflects a qualitative change in a process and has found applications in the field of reliability. To estimate the position parameter of the change point, a Bayesian change-point model based on masked data and Gibbs sampling is proposed. By filling in the missing lifetime data and introducing latent variables, a simple likelihood function is obtained for a parallel system of exponentially distributed components under censored data. This paper describes the probability distributions and random generation methods of the missing lifetime variables and the latent variables, and derives the full conditional distributions of the change-point position parameter and the other unknown parameters. Gibbs sampling then yields estimates of the mean, median, and quantiles of the posterior distributions of the unknown parameters. The specific steps of the Gibbs sampler are described in detail, and its convergence is diagnosed. Simulation results show that the estimates are fairly accurate.
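To make the Gibbs machinery concrete, the sketch below runs a Gibbs sampler for a single change point in complete (uncensored) exponential lifetime data with Gamma priors on the two rates. It is a deliberately simplified stand-in, not the masked-data parallel-system model of the paper; the priors, sample size, and true parameter values are assumptions.

import numpy as np

rng = np.random.default_rng(2)

# Synthetic lifetimes: rate lam1 before the change point, lam2 after it.
n, k_true, lam1_true, lam2_true = 100, 40, 0.5, 2.0
t = np.concatenate([rng.exponential(1 / lam1_true, k_true),
                    rng.exponential(1 / lam2_true, n - k_true)])

a, b = 1.0, 1.0                      # Gamma(a, b) priors on both rates
k = n // 2                           # initial change point position
csum = np.cumsum(t)
draws = []

for it in range(5000):
    # Full conditionals of the rates given the change point position k.
    lam1 = rng.gamma(a + k, 1.0 / (b + csum[k - 1]))
    lam2 = rng.gamma(a + (n - k), 1.0 / (b + csum[-1] - csum[k - 1]))
    # Full conditional of k: evaluate the log posterior on the grid 1..n-1.
    ks = np.arange(1, n)
    logp = (ks * np.log(lam1) - lam1 * csum[ks - 1]
            + (n - ks) * np.log(lam2) - lam2 * (csum[-1] - csum[ks - 1]))
    p = np.exp(logp - logp.max())
    k = rng.choice(ks, p=p / p.sum())
    draws.append((k, lam1, lam2))

post = np.array(draws[1000:])        # discard burn-in
print("posterior mean / median of change point:",
      post[:, 0].mean(), np.median(post[:, 0]))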


2014 ◽  
Vol 10 (6) ◽  
pp. 4535-4552
Author(s):  
J. Liakka ◽  
J. T. Eronen ◽  
H. Tang ◽  
F. T. Portmann

Abstract. The combined use of proxy records and climate modelling is invaluable for obtaining a better understanding of past climates. However, many methods of model-proxy comparison in the literature are fundamentally problematic because larger errors in the proxy tend to yield a "better" agreement with the model. Here we quantify model-proxy agreement as a function of proxy uncertainty using the overlapping coefficient OVL, which measures the similarity between two probability distributions. We found that the model-proxy agreement is poor (OVL < 50%) if the proxy uncertainty (σp) is greater than three times the model variability (σm), even if the model and proxy have similar mean estimates. Hence only proxies that fulfil the condition σp < 3σm should be used for detailed quantitative evaluation of the model performance.
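The snippet below sketches how OVL can be evaluated numerically for two normal densities (the quantity treated analytically by Inman and Bradley in the 1989 paper cited at the top of this page): OVL is the area under the pointwise minimum of the two densities, equal to 1 for identical distributions and 0 for non-overlapping ones. The grid resolution and the example means and standard deviations are illustrative assumptions, not values from the study.

import numpy as np
from scipy import stats

def ovl_normal(mu_m, sigma_m, mu_p, sigma_p, n_grid=20001):
    """Overlapping coefficient: integral of min(model density, proxy density)."""
    lo = min(mu_m - 6 * sigma_m, mu_p - 6 * sigma_p)
    hi = max(mu_m + 6 * sigma_m, mu_p + 6 * sigma_p)
    x = np.linspace(lo, hi, n_grid)
    f = np.minimum(stats.norm.pdf(x, mu_m, sigma_m),
                   stats.norm.pdf(x, mu_p, sigma_p))
    return float(np.sum(f) * (x[1] - x[0]))   # simple Riemann sum on a fine grid

# Equal means: OVL decays as the proxy uncertainty grows relative to the model spread.
for ratio in (1, 2, 3, 5):
    print(f"sigma_p = {ratio} * sigma_m -> OVL = {ovl_normal(15.0, 1.0, 15.0, float(ratio)):.2f}")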


1997 ◽  
Vol 161 ◽  
pp. 197-201 ◽  
Author(s):  
Duncan Steel

Abstract Whilst lithopanspermia depends upon massive impacts occurring at a speed above some limit, the intact delivery of organic chemicals or other volatiles to a planet requires the impact speed to be below some other limit, such that a significant fraction of that material escapes destruction. Thus the two opposite ends of the impact speed distributions are the regions of interest in the bioastronomical context, whereas much modelling work on impacts delivers, or makes use of, only the mean speed. Here the probability distributions of impact speeds upon Mars are calculated for (i) the orbital distribution of known asteroids; and (ii) the expected distribution of near-parabolic cometary orbits. It is found that cometary impacts are far more likely to eject rocks from Mars (over 99 percent of cometary impacts occur at speeds above 20 km/sec, compared with at most 5 percent of asteroidal impacts); paradoxically, the objects impacting at speeds low enough to make organic/volatile survival possible (the asteroids) are those which are depleted in such species.


2020 ◽  
Vol 3 (1) ◽  
pp. 10501-1-10501-9
Author(s):  
Christopher W. Tyler

Abstract For the visual world in which we operate, the core issue is to conceptualize how its three-dimensional structure is encoded through the neural computation of multiple depth cues and their integration to a unitary depth structure. One approach to this issue is the full Bayesian model of scene understanding, but this is shown to require selection from the implausibly large number of possible scenes. An alternative approach is to propagate the implied depth structure solution for the scene through the “belief propagation” algorithm on general probability distributions. However, a more efficient model of local slant propagation is developed as an alternative.

The overall depth percept must be derived from the combination of all available depth cues, but a simple linear summation rule across, say, a dozen different depth cues would massively overestimate the perceived depth in the scene in cases where each cue alone provides a close-to-veridical depth estimate. On the other hand, a Bayesian averaging or “modified weak fusion” model for depth cue combination does not provide for the observed enhancement of perceived depth from weak depth cues. Thus, the current models do not account for the empirical properties of perceived depth from multiple depth cues.

The present analysis shows that these problems can be addressed by an asymptotic, or hyperbolic Minkowski, approach to cue combination. With appropriate parameters, this first-order rule gives strong summation for a few depth cues, but the effect of an increasing number of cues beyond that remains too weak to account for the available degree of perceived depth magnitude. Finally, an accelerated asymptotic rule is proposed to match the empirical strength of perceived depth as measured, with appropriate behavior for any number of depth cues.
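As a purely illustrative sketch of the Minkowski-style combination behavior discussed above (not Tyler's exact asymptotic or accelerated rule, whose form is not given in this abstract), the snippet below combines cue estimates with a p-norm and shows how the per-cue gain shrinks as cues are added. The exponent and the unit cue values are assumptions.

import numpy as np

def minkowski_combine(cues, m=3.0):
    """Combined depth = (sum_i d_i ** m) ** (1 / m); large m approaches the strongest single cue."""
    cues = np.asarray(cues, dtype=float)
    return float((cues ** m).sum() ** (1.0 / m))

# Each cue alone signals a depth of 1.0; the combined estimate grows with cue
# count, but the increment contributed by each additional cue keeps shrinking.
for n_cues in (1, 2, 4, 8, 12):
    print(n_cues, "cues ->", round(minkowski_combine(np.ones(n_cues)), 3))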

