Bayesian Methods for Statistical Analysis

Author(s):  
Borek Puza
2021 · Vol 66 · pp. 126762
Author(s):  
Emma Shardlow ◽  
Caroline Linhart ◽  
Sameerah Connor ◽  
Erin Softely ◽  
Christopher Exley

2011 · Vol 34 (4) · pp. 206-207
Author(s):  
Michael D. Lee

Jones & Love (J&L) should have given more attention to Agnostic uses of Bayesian methods for the statistical analysis of models and data. Reliance on the frequentist analysis of Bayesian models has retarded their development and prevented their full evaluation. The Ecumenical integration of Bayesian statistics to analyze Bayesian models offers a better way to test their inferential and predictive capabilities.


2019
Author(s):  
Koen Derks ◽  
Jacques de Swart ◽  
Eric-Jan Wagenmakers ◽  
Jan Wille ◽  
Ruud Wetzels

Statistical theory is fundamental to many auditing guidelines and procedures. To assist auditors with the required statistical analyses, and to advocate state-of-the-art Bayesian methods, we introduce JASP for Audit (JfA). JfA is easy-to-use, free-of-charge software that automatically follows the standard audit workflow, selects the appropriate statistical analysis, interprets the results, and produces a readable report. This approach reduces the potential for statistical errors and therefore increases audit quality. In addition to the frequentist methods that currently dominate audit practice, JfA incorporates their Bayesian counterparts, which come with several advantages. For example, Bayesian statistics allows expert knowledge to be incorporated directly into the statistical analysis, which can decrease sample sizes and increase efficiency. In sum, JfA is designed with the auditor in mind: it guides the auditor through the statistical aspects of an audit and therefore has the potential to increase audit efficiency and quality.
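To make the sample-size claim concrete, here is a minimal sketch (not JfA's actual code) of how a beta prior on the population misstatement rate can shrink the required sample; the materiality threshold, confidence level, and prior parameters are illustrative assumptions.

```python
# Sketch of Bayesian audit sampling with a beta prior (assumed parameters,
# not JfA's implementation): sampling can stop once the 95th posterior
# percentile of the misstatement rate falls below performance materiality.
from scipy.stats import beta

MATERIALITY = 0.05  # tolerable misstatement rate (assumed)
CONFIDENCE = 0.95

def required_sample_size(prior_a: float, prior_b: float, errors: int = 0) -> int:
    """Smallest n (with `errors` misstatements found) whose posterior
    Beta(prior_a + errors, prior_b + n - errors) puts >= 95% mass below materiality."""
    n = errors
    while True:
        upper = beta.ppf(CONFIDENCE, prior_a + errors, prior_b + n - errors)
        if upper <= MATERIALITY:
            return n
        n += 1

# Uninformative Beta(1, 1) prior vs. a prior encoding auditor experience
# (here Beta(1, 20), i.e. roughly "19 error-free items already seen"):
print(required_sample_size(1, 1))   # 58 items
print(required_sample_size(1, 20))  # 39 items: prior knowledge reduces n
```

Under these assumed numbers, the informative prior cuts the sample by about a third, which is the efficiency gain the abstract describes.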


Author(s):  
Janet L. Peacock ◽  
Philip J. Peacock

This chapter describes the Bayesian approach to statistical analysis in contrast to the frequentist approach. It discusses how clinicians often use a Bayesian approach in interpreting clinical findings and forming management plans. It describes how Bayesian methods work including a description of prior and posterior distributions. The chapter outlines the role and choice of prior distributions and how they are combined with the data collected to provide an updated estimate of the unknown quantity being studied. It includes examples of the use of Bayesian methods in medicine, and discusses the pros and cons of the Bayesian approach compared to the frequentist approach. Finally, guidance is given on how to read and interpret Bayesian analyses in the medical literature.
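As a concrete instance of combining a prior with the data collected (a standard conjugate example, not one drawn from the chapter itself): a Beta prior for a proportion updated with binomial data yields a Beta posterior.

```latex
% Bayes' theorem and the beta-binomial conjugate update
\[
  p(\theta \mid y) \;\propto\; p(y \mid \theta)\, p(\theta)
\]
\[
  \theta \sim \mathrm{Beta}(a, b), \qquad
  y \mid \theta \sim \mathrm{Binomial}(n, \theta)
  \;\Longrightarrow\;
  \theta \mid y \sim \mathrm{Beta}(a + y,\; b + n - y)
\]
```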


2014 · Vol 30 (4) · pp. 703-714
Author(s):  
Edson Zangiacomi Martinez ◽  
Jorge Alberto Achcar

2013 marked the 250th anniversary of the presentation of Bayes' theorem by the philosopher Richard Price. Thomas Bayes was a figure little known in his own time, but in the 20th century the theorem that bears his name became widely used in many fields of research. Bayes' theorem is the basis of the so-called Bayesian methods, an approach to statistical inference that allows studies to incorporate prior knowledge about relevant data characteristics into the statistical analysis. Nowadays, Bayesian methods are widely used in many different areas such as astronomy, economics, marketing, genetics, bioinformatics and the social sciences. A number of authors have discussed recent advances in these techniques and the advantages of Bayesian methods for the analysis of epidemiological data. This article presents an overview of Bayesian methods, their application to epidemiological research, and the main areas of epidemiology that should benefit from the use of Bayesian methods in coming years.
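As a hedged illustration of the theorem in an epidemiological setting (not an example taken from the article), the sketch below updates the probability of disease after a positive diagnostic test; the prevalence, sensitivity, and specificity values are invented for the example.

```python
# Bayes' theorem for diagnostic testing: P(disease | positive test).
def positive_predictive_value(prevalence: float,
                              sensitivity: float,
                              specificity: float) -> float:
    """Posterior probability of disease given a positive test result."""
    true_pos = sensitivity * prevalence                    # P(+ and disease)
    false_pos = (1.0 - specificity) * (1.0 - prevalence)   # P(+ and healthy)
    return true_pos / (true_pos + false_pos)

# Even a good test yields a modest PPV when the disease is rare:
print(positive_predictive_value(prevalence=0.01,
                                sensitivity=0.90,
                                specificity=0.95))  # ~0.15
```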


1966 · Vol 24 · pp. 188-189
Author(s):  
T. J. Deeming

If we make a set of measurements, such as narrow-band or multicolour photo-electric measurements, which are designed to improve a scheme of classification, and in particular if they are designed to extend the number of dimensions of classification, i.e. the number of classification parameters, then some important problems of analytical procedure arise. First, it is important not to reproduce the errors of the classification scheme which we are trying to improve. Second, when trying to extend the number of dimensions of classification we have little or nothing with which to test the validity of the new parameters.

Problems similar to these have occurred in other areas of scientific research (notably psychology and education), and the branch of statistics called multivariate analysis has been developed to deal with them. The techniques of this subject are largely unknown to astronomers, but, if carefully applied, they should at the very least ensure that the astronomer gets the maximum amount of information out of his data and does not waste his time looking for information which is not there. More optimistically, these techniques are potentially capable of indicating the number of classification parameters necessary and giving specific formulas for computing them, as well as pinpointing those particular measurements which are most crucial for determining the classification parameters.
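One multivariate technique of the kind Deeming points to is principal component analysis; the sketch below is an assumption about method, not the paper's own procedure, and uses random placeholder data. The eigenvalue spectrum suggests how many classification parameters the measurements support, and the eigenvectors supply the formulas for computing them.

```python
# PCA on a matrix of photometric measurements: how many dimensions of
# classification do the data actually support?
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))           # placeholder: 200 stars, 6 passbands

Xc = X - X.mean(axis=0)                 # centre each measurement
cov = np.cov(Xc, rowvar=False)          # 6x6 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigendecomposition (ascending)
eigvals = eigvals[::-1]                 # largest variance first
eigvecs = eigvecs[:, ::-1]              # columns = "formulas" for the parameters

explained = eigvals / eigvals.sum()
# Number of classification parameters: components needed for, say, 90% of
# the total variance (threshold is an arbitrary choice for illustration).
n_params = int(np.searchsorted(np.cumsum(explained), 0.90) + 1)
print(n_params, explained.round(3))
```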


Author(s):  
Gianluigi Botton ◽  
Gilles L'Espérance

As interest in parallel EELS spectrum imaging grows in laboratories equipped with commercial spectrometers, several approaches to the technique have been reported in the literature in recent years. Spectrum images can now be obtained either by controlling both the microscope and the spectrometer with a personal computer, or by using more powerful workstations interfaced to conventional multichannel analysers, with commercially available programs to control the microscope and the spectrometer. The limits of the technique in terms of quantitative performance were reported by the present author in a systematic study of artifacts, detection limits, and statistical errors as a function of the desired spatial resolution and the range of chemical elements to be mapped. The aim of the present paper is to show an application of quantitative parallel EELS spectrum imaging in which statistical analysis is performed at each pixel, interpretation is carried out using criteria established from that analysis, and variations in composition are analyzed with the help of information retrieved from t/γ maps so that artifacts are avoided.
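A rough sketch of what per-pixel statistical screening can look like follows; this is an assumed illustration with simulated counts, not the authors' code or criteria. Pixels whose estimated counting error exceeds a chosen criterion are masked so that noise is not interpreted as real compositional variation.

```python
# Per-pixel relative-error map for a simulated elemental map, with a
# statistical acceptance criterion (all numbers are assumptions).
import numpy as np

rng = np.random.default_rng(1)
signal = rng.poisson(lam=50.0, size=(64, 64)).astype(float)  # counts per pixel

counts = signal.clip(min=1.0)                 # avoid division by zero
rel_error = np.sqrt(counts) / counts          # Poisson sigma / N = 1/sqrt(N)
MAX_REL_ERROR = 0.2                           # acceptance criterion (assumed)

valid = rel_error <= MAX_REL_ERROR
masked_map = np.where(valid, signal, np.nan)  # NaN = statistically unreliable
print(f"{valid.mean():.0%} of pixels pass the statistical criterion")
```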


2019 · Vol 62 (3) · pp. 577-586
Author(s):  
Garnett P. McMillan ◽  
John B. Cannon

Purpose: This article presents a basic exploration of Bayesian inference to inform researchers unfamiliar with this type of analysis of the many advantages this readily available approach provides. Method: First, we demonstrate the development of Bayes' theorem, the cornerstone of Bayesian statistics, into an iterative process of updating priors. Working with a few assumptions, including normality and conjugacy of the prior distribution, we express how one would calculate the posterior distribution using the prior distribution and the likelihood of the parameter. Next, we move to an example in auditory research by considering the effect of sound therapy for reducing the perceived loudness of tinnitus. In this case, as in most real-world settings, we turn to Markov chain simulations because the assumptions allowing for easy calculations no longer hold. Using Markov chain Monte Carlo methods, we illustrate several analysis solutions given by a straightforward Bayesian approach. Conclusion: Bayesian methods are widely applicable and can help scientists overcome analysis problems, including how to incorporate existing information, run interim analyses, achieve consensus through measurement, and, most importantly, interpret results correctly. Supplemental Material: https://doi.org/10.23641/asha.7822592
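A compact random-walk Metropolis sketch in the spirit of the tinnitus example follows; the data, prior, and known-variance normal likelihood are invented assumptions, not the article's model or code.

```python
# Metropolis sampler for the posterior of the mean change in perceived
# loudness (dB), assuming a normal likelihood with known sigma and a
# weakly informative normal prior on the mean.
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(loc=-4.0, scale=6.0, size=30)  # simulated loudness changes
SIGMA = 6.0                                      # assumed known
PRIOR_MEAN, PRIOR_SD = 0.0, 10.0                 # prior on mu

def log_posterior(mu: float) -> float:
    log_prior = -0.5 * ((mu - PRIOR_MEAN) / PRIOR_SD) ** 2
    log_lik = -0.5 * np.sum(((data - mu) / SIGMA) ** 2)
    return log_prior + log_lik

samples, mu = [], 0.0
for _ in range(20_000):
    proposal = mu + rng.normal(scale=1.0)        # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(mu):
        mu = proposal                            # accept; otherwise keep mu
    samples.append(mu)

post = np.array(samples[5_000:])                 # discard burn-in
print(post.mean(), np.percentile(post, [2.5, 97.5]))  # estimate + 95% CrI
```

The 95% credible interval printed at the end is directly interpretable as a range containing the parameter with 95% posterior probability, which is one of the interpretive advantages the authors emphasize.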

