Source & Sourceability: Towards a probabilistic framework for dendroprovenance based on hypothesis testing and Bayesian inference

2018 ◽  
Vol 47 ◽  
pp. 38-47 ◽  
Author(s):  
B. Lee Drake


2011 ◽  
Vol 128-129 ◽  
pp. 637-641
Author(s):  
Lan Luo ◽  
Qiong Hai Dai ◽  
Chun Xiang Xu ◽  
Shao Quan Jiang

Cipher algorithms are categorized into block ciphers, stream ciphers, and hash functions, and they are weighted under faithful transmission, which is treated as an independence condition. Under faithful transmission, ciphers are studied in terms of their root cipher. Intelligent application of ciphers is a direction that draws on Bayesian models from cognitive science. Bayesian inference is a rational engine for solving such problems within a probabilistic framework, and it is therefore at the heart of most probabilistic models for weighting ciphers. The approach of this paper is to rank ciphers, regarded as suitably weighted candidates for different kinds of networks, according to their root ciphers. The paper also shows other kinds of transformations among the different cipher algorithms themselves.
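The abstract leaves the weighting procedure unspecified; as a minimal sketch of the kind of Bayesian weighting it gestures at, the toy Python example below updates a prior over the three named cipher categories with an assumed likelihood of faithful transmission. The categories come from the abstract, but every probability here is an illustrative assumption rather than a value from the paper.

```python
# Toy sketch (not the paper's method): Bayesian updating of weights over
# cipher categories given an observed channel condition. All priors and
# likelihoods below are made-up illustrative numbers.

categories = ["block cipher", "stream cipher", "hash"]

# Prior weight for each category (assumed uniform here).
prior = {c: 1.0 / len(categories) for c in categories}

# Assumed likelihood of observing faithful transmission under each category.
likelihood = {"block cipher": 0.6, "stream cipher": 0.8, "hash": 0.3}

# Posterior via Bayes' rule: P(category | observation) is proportional to
# P(observation | category) * P(category).
unnormalized = {c: likelihood[c] * prior[c] for c in categories}
evidence = sum(unnormalized.values())
posterior = {c: unnormalized[c] / evidence for c in categories}

for c, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"{c}: {p:.3f}")
```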


2018 ◽  
Vol 1 (2) ◽  
pp. 281-295 ◽  
Author(s):  
Alexander Etz ◽  
Julia M. Haaf ◽  
Jeffrey N. Rouder ◽  
Joachim Vandekerckhove

Hypothesis testing is a special form of model selection. Once a pair of competing models is fully defined, their definition immediately leads to a measure of how strongly each model supports the data. The ratio of their support is often called the likelihood ratio or the Bayes factor. Critical in the model-selection endeavor is the specification of the models. In the case of hypothesis testing, it is of the greatest importance that the researcher specify exactly what is meant by a “null” hypothesis as well as the alternative to which it is contrasted, and that these are suitable instantiations of theoretical positions. Here, we provide an overview of different instantiations of null and alternative hypotheses that can be useful in practice, but in all cases the inferential procedure is based on the same underlying method of likelihood comparison. An associated app can be found at https://osf.io/mvp53/. This article is the work of the authors and is reformatted from the original, which was published under a CC-By Attribution 4.0 International license and is available at https://psyarxiv.com/wmf3r/.
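As a concrete instance of the likelihood comparison described above, the short sketch below computes a Bayes factor for a binomial experiment, contrasting a point null (theta = 0.5) with an alternative under which theta has a Beta(1, 1) prior. The binomial setting and the prior are illustrative choices of ours, not taken from the article or its app.

```python
import math

def log_beta(a, b):
    """Logarithm of the Beta function B(a, b)."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def bayes_factor_01(k, n, a=1.0, b=1.0):
    """Bayes factor BF01 for H0: theta = 0.5 versus H1: theta ~ Beta(a, b),
    given k successes in n binomial trials.  The binomial coefficient is the
    same in both marginal likelihoods and cancels in the ratio."""
    log_m0 = n * math.log(0.5)                            # P(data | H0)
    log_m1 = log_beta(k + a, n - k + b) - log_beta(a, b)  # P(data | H1)
    return math.exp(log_m0 - log_m1)

# Example: 60 successes in 100 trials with a uniform Beta(1, 1) prior under H1.
# Values above 1 favour H0; values below 1 favour H1.
print(bayes_factor_01(60, 100))
```

Because the alternative is defined by its prior, changing the Beta(a, b) specification changes the model being compared and hence the Bayes factor, which is exactly the specification issue the abstract emphasizes.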


2017 ◽  
Author(s):  
Guillermo CAMPITELLI

This tutorial on Bayesian inference targets psychological researchers who are trained in the null hypothesis testing approach and the use of SPSS software. There are a number of excellent tutorials on Bayesian inference, but they assume mathematical knowledge that most psychological researchers do not possess. This tutorial starts from the idea that Bayesian inference is no more difficult than the traditional approach, but that before being introduced to probability-theory notation the newcomer needs to understand simple probability principles, which can be explained without mathematical formulas or probability notation. For this purpose I use a simple tool, the parameter-data table, to explain how probability theory can easily be used to make inferences in research. I then compare the Bayesian and null hypothesis testing approaches using the same tool. Only after introducing these principles do I show the formulas and notation and explain how they relate to the parameter-data table. This tutorial is expected to increase the use of Bayesian inference by psychological researchers. Moreover, Bayesian researchers may use this tutorial to teach Bayesian inference to undergraduate or postgraduate students.
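To make the idea concrete, the sketch below builds one possible parameter-data table for a coin-flipping example: rows are candidate parameter values, columns are possible data outcomes, and cells hold joint probabilities, so conditioning on the observed column yields the posterior. The layout is our reading of the tutorial's tool, and the three candidate parameter values and uniform prior are assumptions made for illustration.

```python
from math import comb

# Minimal sketch of a parameter-data table: rows are candidate parameter
# values, columns are possible data outcomes, cells hold joint probabilities.

thetas = [0.25, 0.50, 0.75]                    # candidate values of P(heads)
prior = {t: 1 / len(thetas) for t in thetas}   # assumed uniform prior
n = 3                                          # number of coin flips

# Joint probability P(theta, k heads) = prior(theta) * Binomial(k | n, theta).
table = {
    t: {k: prior[t] * comb(n, k) * t**k * (1 - t)**(n - k) for k in range(n + 1)}
    for t in thetas
}

observed_k = 2                                 # suppose we observed 2 heads in 3 flips
column_total = sum(table[t][observed_k] for t in thetas)

# Conditioning on the observed column of the table gives the posterior.
for t in thetas:
    print(f"P(theta={t} | {observed_k} heads) = {table[t][observed_k] / column_total:.3f}")
```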


2020 ◽  
Author(s):  
Donald Ray Williams ◽  
Joris Mulder

The R package BGGM provides tools for Bayesian inference in Gaussian graphical models (GGMs). The methods are organized around two general approaches to Bayesian inference: (1) estimation and (2) hypothesis testing. The key distinction is that the former focuses on either the posterior or posterior predictive distribution (Gelman, Meng, & Stern, 1996; see section 5 in Rubin, 1984), whereas the latter focuses on model comparison with the Bayes factor (Jeffreys, 1961; Kass & Raftery, 1995).
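The sketch below illustrates the two approaches in generic Python rather than through BGGM's own interface (which is in R and is not reproduced here): it draws a posterior for one partial correlation of a small Gaussian graphical model (estimation) and forms a Savage-Dickey style Bayes factor for whether that edge is zero (testing). The conjugate Wishart prior, zero-mean assumption, and kernel-density Bayes factor are simplifications chosen for brevity, not BGGM's implementation.

```python
import numpy as np
from scipy.stats import wishart, gaussian_kde

# Conceptual sketch only -- this is NOT the BGGM API.  It contrasts, for one
# edge of a 3-node GGM:
#   (1) estimation: the posterior of a partial correlation,
#   (2) testing: a Bayes factor for "edge = 0" via a Savage-Dickey density ratio.

rng = np.random.default_rng(1)
p, n = 3, 200

# Simulate zero-mean Gaussian data with a known precision matrix.
true_precision = np.array([[1.0, 0.4, 0.0],
                           [0.4, 1.0, 0.3],
                           [0.0, 0.3, 1.0]])
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(true_precision), size=n)
S = X.T @ X

def partial_corr(K, i, j):
    """Partial correlation implied by a precision matrix K."""
    return -K[i, j] / np.sqrt(K[i, i] * K[j, j])

# Conjugate Wishart prior on the precision matrix (zero-mean likelihood).
prior_df, prior_scale = p + 2, np.eye(p)
post_df = prior_df + n
post_scale = np.linalg.inv(np.linalg.inv(prior_scale) + S)

prior_draws = wishart.rvs(df=prior_df, scale=prior_scale, size=5000, random_state=1)
post_draws = wishart.rvs(df=post_df, scale=post_scale, size=5000, random_state=2)

edge = (0, 2)  # the edge that is absent in the true model
rho_prior = np.array([partial_corr(K, *edge) for K in prior_draws])
rho_post = np.array([partial_corr(K, *edge) for K in post_draws])

# (1) Estimation: summarize the posterior of the partial correlation.
print("posterior mean:", rho_post.mean(),
      "95% interval:", np.percentile(rho_post, [2.5, 97.5]))

# (2) Testing: Savage-Dickey ratio BF01 = posterior density / prior density at 0.
bf01 = gaussian_kde(rho_post)(0.0)[0] / gaussian_kde(rho_prior)(0.0)[0]
print("BF01 (evidence for a null edge):", bf01)
```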


Author(s):  
Daniel McNamee ◽  
Daniel M. Wolpert

Rationality principles such as optimal feedback control and Bayesian inference underpin a probabilistic framework that has accounted for a range of empirical phenomena in biological sensorimotor control. To facilitate the optimization of flexible and robust behaviors consistent with these theories, the ability to construct internal models of the motor system and environmental dynamics can be crucial. In the context of this theoretic formalism, we review the computational roles played by such internal models and the neural and behavioral evidence for their implementation in the brain.
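As a toy illustration of an internal forward model in this setting (our own example, not drawn from the review), the sketch below runs a one-dimensional Kalman filter: a prediction of hand position from an efference copy of the motor command is combined with noisy sensory feedback. The dynamics and noise parameters are arbitrary.

```python
import numpy as np

# Toy 1-D internal forward model: predict the next hand position from the
# motor command (efference copy), then correct the prediction with noisy
# sensory feedback via a standard Kalman-filter update.

rng = np.random.default_rng(0)

a, b = 1.0, 0.1          # assumed dynamics: x[t+1] = a*x[t] + b*u[t] + noise
q, r = 0.01, 0.25        # process noise variance, sensory noise variance

x_true, x_hat, p_hat = 0.0, 0.0, 1.0   # true state, estimate, estimate variance
for t in range(50):
    u = 1.0                                      # constant motor command
    # True dynamics and noisy sensory observation.
    x_true = a * x_true + b * u + rng.normal(0, np.sqrt(q))
    y = x_true + rng.normal(0, np.sqrt(r))

    # Internal model: predict, then correct.
    x_pred = a * x_hat + b * u                   # forward-model prediction
    p_pred = a * a * p_hat + q
    k = p_pred / (p_pred + r)                    # Kalman gain: trust in feedback
    x_hat = x_pred + k * (y - x_pred)            # Bayesian correction
    p_hat = (1 - k) * p_pred

print(f"true position {x_true:.2f}, internal estimate {x_hat:.2f}")
```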


1976 ◽  
Vol 8 (7) ◽  
pp. 741-752 ◽  
Author(s):  
E S Sheppard

The framework of Bayesian inference is proposed as a structure for unifying those highly disparate approaches to entropy modelling that have appeared in geography to date, and is used to illuminate the possibilities and shortcomings of some of these models. The inadequacy of most descriptive entropy statistics for measuring the information in a spatially-autocorrelated map is described. The contention that entropy maximization in itself provides theoretical justification for spatial models is critically evaluated. It is concluded that entropy should, first and foremost, be regarded as a technique to expand our methods of statistical inference and hypothesis testing, rather than one of theory construction.
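A minimal illustration of the shortcoming noted above: the Shannon entropy of a map's cell-value distribution is invariant to spatial rearrangement, so a strongly clustered map and a checkerboard map with the same cell values receive identical entropy. The two toy maps below are ours, not taken from the paper.

```python
import numpy as np

# Descriptive entropy ignores spatial arrangement: a clustered map and a
# dispersed map with the same cell values have identical Shannon entropy.

def shannon_entropy(cells):
    p = np.asarray(cells, dtype=float)
    p = p / p.sum()          # treat cell values as a distribution over cells
    p = p[p > 0]
    return -np.sum(p * np.log(p))

clustered = np.array([[4, 4, 0, 0],
                      [4, 4, 0, 0],
                      [0, 0, 1, 1],
                      [0, 0, 1, 1]])
dispersed = np.array([[4, 0, 4, 0],
                      [0, 1, 0, 1],
                      [4, 0, 4, 0],
                      [0, 1, 0, 1]])

# Strong positive spatial autocorrelation versus a checkerboard pattern,
# yet the entropy of the cell-value distribution is the same for both.
print(shannon_entropy(clustered.ravel()))
print(shannon_entropy(dispersed.ravel()))
```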

