Quantum Retrodiction: Foundations and Controversies

Symmetry ◽  
2021 ◽  
Vol 13 (4) ◽  
pp. 586
Author(s):  
Stephen M. Barnett ◽  
John Jeffers ◽  
David T. Pegg

Prediction is the making of statements, usually probabilistic, about future events based on current information. Retrodiction is the making of statements about past events based on current information. We present the foundations of quantum retrodiction and highlight its intimate connection with the Bayesian interpretation of probability. The close link with Bayesian methods enables us to explore controversies and misunderstandings about retrodiction that have appeared in the literature. To be clear, quantum retrodiction is universally applicable and draws its validity directly from conventional predictive quantum theory coupled with Bayes’ theorem.
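The backward use of Bayes' theorem that underpins retrodiction can be sketched with a minimal classical analogue (the paper itself treats the full quantum case; the states, priors, and detector probabilities below are invented for illustration):

```python
# Classical sketch of retrodiction via Bayes' theorem: a source
# prepares one of two states with known prior probabilities, a
# detector clicks with state-dependent probability, and we retrodict
# which state was prepared given that a click occurred.

priors = {"state0": 0.7, "state1": 0.3}    # preparation probabilities (assumed)
p_click = {"state0": 0.1, "state1": 0.9}   # detector response model (assumed)

# Bayes' theorem: P(state | click) = P(click | state) P(state) / P(click)
evidence = sum(p_click[s] * priors[s] for s in priors)
posterior = {s: p_click[s] * priors[s] / evidence for s in priors}

print(posterior)  # retrodictive probabilities for the prepared state
```

Even though "state1" was the less likely preparation a priori, the click makes it the more probable past event a posteriori, which is exactly the retrodictive inference the abstract describes.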

2019 ◽  
Vol 62 (3) ◽  
pp. 577-586 ◽  
Author(s):  
Garnett P. McMillan ◽  
John B. Cannon

Purpose This article presents a basic exploration of Bayesian inference to inform researchers unfamiliar with this type of analysis of the many advantages this readily available approach provides. Method First, we demonstrate the development of Bayes' theorem, the cornerstone of Bayesian statistics, into an iterative process of updating priors. Working with a few assumptions, including normality and conjugacy of the prior distribution, we show how one would calculate the posterior distribution from the prior distribution and the likelihood of the parameter. Next, we move to an example in auditory research by considering the effect of sound therapy for reducing the perceived loudness of tinnitus. In this case, as in most real-world settings, we turn to Markov chain simulations because the assumptions allowing for easy calculations no longer hold. Using Markov chain Monte Carlo methods, we illustrate several analysis solutions given by a straightforward Bayesian approach. Conclusion Bayesian methods are widely applicable and can help scientists overcome analysis problems, including how to incorporate existing information, run interim analyses, achieve consensus through measurement, and, most importantly, interpret results correctly. Supplemental Material https://doi.org/10.23641/asha.7822592
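The "easy calculation" case the abstract mentions, a normal prior conjugate to a normal likelihood with known variance, can be sketched as an iterative prior-updating loop (all numbers are hypothetical, not taken from the tinnitus study):

```python
import statistics

# Conjugate normal-normal update for the mean of a normal likelihood
# with known data variance: precisions add, and the posterior mean is
# a precision-weighted average of the prior mean and the sample mean.

def update_normal(prior_mean, prior_var, data, data_var):
    """Return posterior (mean, variance) under a normal prior."""
    n = len(data)
    xbar = statistics.fmean(data)
    post_prec = 1.0 / prior_var + n / data_var   # precisions add
    post_var = 1.0 / post_prec
    post_mean = post_var * (prior_mean / prior_var + n * xbar / data_var)
    return post_mean, post_var

# Iterative updating: yesterday's posterior becomes today's prior.
mean, var = 0.0, 10.0                            # vague initial prior
for batch in [[1.2, 0.8, 1.0], [0.9, 1.1]]:      # invented observations
    mean, var = update_normal(mean, var, batch, data_var=1.0)

print(round(mean, 3), round(var, 3))
```

When conjugacy fails, as in the study's real data, the same posterior would instead be approximated by MCMC sampling, as the abstract goes on to describe.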


1995 ◽  
Vol 39 ◽  
pp. 163-176
Author(s):  
Michael Redhead

Popper wrote extensively on the quantum theory. In Logik der Forschung (LSD) he devoted a whole chapter to the topic, while the whole of Volume 3 of the Postscript to the Logic of Scientific Discovery is devoted to the quantum theory. This volume, entitled Quantum Theory and the Schism in Physics (QTSP), incorporated a famous earlier essay, ‘Quantum Mechanics without “the Observer”’ (QM). In addition, Popper's development of the propensity interpretation of probability was much influenced by his views on the role of probability in quantum theory, and he also wrote an insightful critique of the 1936 paper of Birkhoff and von Neumann on nondistributive quantum logic (BNIQM).


Author(s):  
Bradley E. Alger

This chapter covers the basics of Bayesian statistics, emphasizing the conceptual framework for Bayes’ Theorem. It works through several iterations of the theorem to demonstrate how the same equation is applied in different circumstances, from constructing and updating models to parameter evaluation, to try to establish an intuitive feel for it. The chapter also covers the philosophical underpinnings of Bayesianism and compares them with the frequentist perspective described in Chapter 5. It addresses the question of whether Bayesians are inductivists. Finally, the chapter shows how the Bayesian procedures of model selection and comparison can be pressed into service to allow Bayesian methods to be used in hypothesis testing in essentially the same way that various p-tests are used in the frequentist hypothesis testing framework.


Author(s):  
Eitel J.M. Lauria

Bayesian methods provide a probabilistic approach to machine learning. The Bayesian framework allows us to make inferences from data using probability models for values we observe and about which we want to draw some hypotheses. Bayes' theorem provides the means of calculating the probability of a hypothesis (its posterior probability) from its prior probability, the probability of the observations, and the likelihood that the observational data fit the hypothesis.
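The posterior-from-prior-and-likelihood calculation described above is a one-liner once the terms are written down; here is a sketch with two invented hypotheses about a coin:

```python
# Direct transcription of Bayes' theorem for hypotheses:
# P(h | D) = P(D | h) P(h) / P(D), with P(D) obtained by summing
# the numerator over all hypotheses. Numbers are illustrative only.

prior = {"fair_coin": 0.5, "biased_coin": 0.5}
# Likelihood of the data D = "three heads in a row" under each hypothesis,
# assuming a 0.8 head probability for the biased coin.
likelihood = {"fair_coin": 0.5 ** 3, "biased_coin": 0.8 ** 3}

p_data = sum(likelihood[h] * prior[h] for h in prior)            # P(D)
posterior = {h: likelihood[h] * prior[h] / p_data for h in prior}

print(posterior)  # the data shift belief toward the biased coin
```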


1994 ◽  
Vol 6 (5) ◽  
pp. 767-794 ◽  
Author(s):  
Stephen P. Luttrell

In this paper Bayesian methods are used to analyze some of the properties of a special type of Markov chain. The forward transitions through the chain are followed by inverse transitions (using Bayes' theorem) backward through a copy of the same chain; this will be called a folded Markov chain. If an appropriately defined Euclidean error (between the original input and its “reconstruction” via Bayes' theorem) is minimized with respect to the choice of Markov chain transition probabilities, then the familiar theories of both vector quantizers and self-organizing maps emerge. This approach is also used to derive the theory of self-supervision, in which the higher layers of a multilayer network supervise the lower layers, even though overall there is no external teacher.
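The forward-then-Bayes-inverse construction can be sketched numerically for a small discrete chain (the matrices below are invented for illustration, and the variable names are ours, not the paper's):

```python
import numpy as np

# A "folded" pass through a discrete Markov chain: forward transitions
# P(y|x) from input x to code index y, then Bayes-inverse transitions
# P(x|y) backward through a copy of the same chain.

p_x = np.array([0.5, 0.3, 0.2])            # input distribution (assumed)
p_y_given_x = np.array([[0.8, 0.1, 0.1],   # forward transitions; rows sum to 1
                        [0.1, 0.8, 0.1],
                        [0.1, 0.1, 0.8]])

p_y = p_x @ p_y_given_x                    # marginal distribution of y
# Bayes' theorem: P(x|y) = P(y|x) P(x) / P(y); rows of the result are y.
p_x_given_y = (p_y_given_x * p_x[:, None]).T / p_y[:, None]

# Fold the chain: send each x forward to its most likely y, then
# reconstruct it through the Bayes-inverse transition.
reconstructions = [int(np.argmax(p_x_given_y[np.argmax(p_y_given_x[x])]))
                   for x in range(3)]
print(reconstructions)
```

With this low-noise transition matrix each input is recovered exactly; the paper's results come from minimizing the expected Euclidean reconstruction error over the choice of transition probabilities, which this toy example does not attempt.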


2019 ◽  
Vol 45 (1) ◽  
pp. 47-68 ◽  
Author(s):  
Scott M. Lynch ◽  
Bryce Bartlett

Although Bayes’ theorem has been around for more than 250 years, widespread application of the Bayesian approach only began in statistics in 1990. By 2000, Bayesian statistics had made considerable headway into social science, but even now its direct use is rare in articles in top sociology journals, perhaps because of a lack of knowledge about the topic. In this review, we provide an overview of the key ideas and terminology of Bayesian statistics, and we discuss articles in the top journals that have used or developed Bayesian methods over the last decade. In this process, we elucidate some of the advantages of the Bayesian approach. We highlight that many sociologists are, in fact, using Bayesian methods, even if they do not realize it, because techniques deployed by popular software packages often involve Bayesian logic and/or computation. Finally, we conclude by briefly discussing the future of Bayesian statistics in sociology.


Data Mining ◽  
2011 ◽  
pp. 260-277
Author(s):  
Eitel J.M. Lauria ◽  
Giri Kumar Tayi

One of the major problems faced by data-mining technologies is how to deal with uncertainty. The prime characteristic of Bayesian methods is their explicit use of probability to quantify uncertainty. Bayesian methods provide a practical way to make inferences from data using probability models for values we observe and about which we want to draw some hypotheses. Bayes' Theorem provides the means of calculating the probability of a hypothesis (posterior probability) based on its prior probability, the probability of the observations, and the likelihood that the observational data fit the hypothesis. The purpose of this chapter is twofold: to provide an overview of the theoretical framework of Bayesian methods and their application to data mining, with special emphasis on statistical modeling and machine-learning techniques; and to illustrate each theoretical concept covered with practical examples. We cover basic probability concepts, Bayes' Theorem and its implications, Bayesian classification, Bayesian belief networks, and an introduction to simulation techniques.
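The Bayesian classification the chapter covers can be sketched as a toy naive Bayes classifier (the weather features, labels, and training rows below are invented; this is an illustrative sketch, not the chapter's own example):

```python
from collections import Counter

# Toy naive Bayes: P(label | features) is proportional to
# P(label) * product of P(feature | label), assuming feature
# independence given the label.

data = [(("sunny", "hot"), "no"), (("sunny", "mild"), "yes"),
        (("rain", "mild"), "yes"), (("rain", "hot"), "no"),
        (("sunny", "mild"), "yes")]          # (features, label) pairs

prior = {y: c / len(data) for y, c in Counter(y for _, y in data).items()}

def likelihood(value, position, label):
    """P(feature = value | label) with add-one (Laplace) smoothing
    over the two values each feature takes."""
    rows = [x for x, y in data if y == label]
    hits = sum(1 for x in rows if x[position] == value)
    return (hits + 1) / (len(rows) + 2)

def classify(x):
    scores = {}
    for y in prior:
        score = prior[y]
        for i, v in enumerate(x):
            score *= likelihood(v, i, y)     # naive independence assumption
        scores[y] = score
    total = sum(scores.values())             # normalize to a posterior
    return {y: s / total for y, s in scores.items()}

print(classify(("sunny", "mild")))
```

Normalizing by the summed scores is exactly the "probability of the observations" term in Bayes' Theorem, computed implicitly over the candidate labels.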


2015 ◽  
Vol 58 (3) ◽  
pp. 121-134
Author(s):  
Igor Stefanovic

This article deals with the actuality of Hume's positive thesis about causality, specifically in modern science. According to Dauer, Hume in his Treatise of Human Nature does not deal with the kind of scientific theory that allows us, in modern times, to arrive at truth, and thence at necessity. Dauer also claims that observation alone, without theory, is useless, which is why we need science to predict future events. I intend to show that all three claims are incorrect, and to show an intimate connection between causality and our intuitions.

