God’s Dice: Bayesian Probability and Providence

2015 ◽  
Vol 27 (1) ◽  
pp. 4-24
Author(s):  
William R. Clough ◽  

The Reverend Thomas Bayes has recently become best known for the mathematical theorem that bears his name, but Bayes’ vocation, and primary identity, was that of a minister. Bayes’ writings include a tract on divine benevolence and an essay on the philosophy of calculus as well as what has come to be known as Bayes’ Theorem. Two and a half centuries ago, Bayes affirmed both the Providence of God and the probabilistic nature of reality. This essay explores some implications of Bayes’ Theorem in light of his theology. The central thesis is that it is fruitful to connect Bayes’ mathematical theory of probability, its implications when extended in time, and his view of God as the continuous in-breaking of the good tending to the benefit of all creation. In so doing, Bayes suggests ways to shed light on current theological and philosophical discussions, including theodicy, religion and science, and chance and Providence.
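The theorem itself is a short identity, P(H|E) = P(E|H)·P(H) / P(E). A minimal numerical sketch (all probabilities are hypothetical, chosen only to illustrate the update):

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
# All numbers below are hypothetical, for illustration only.
def bayes_update(prior, likelihood, marginal):
    """Posterior probability of hypothesis H given evidence E."""
    return likelihood * prior / marginal

prior = 0.01                                    # P(H): rare hypothesis
likelihood = 0.9                                # P(E|H)
p_e = likelihood * prior + 0.05 * (1 - prior)   # total probability of E
posterior = bayes_update(prior, likelihood, p_e)
print(round(posterior, 3))  # 0.154
```

Even strong evidence for a rare hypothesis leaves the posterior modest, which is the "extended in time" point: beliefs are revised incrementally as evidence accumulates.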

Author(s):  
Adriana Chira

Berlin 1996 (cited under Overviews) introduced the term “Atlantic Creoles” to describe Afro-descendants whose experiences in the age of the Atlantic slave trade were not primarily defined by the plantation. According to Berlin, Atlantic Creoles distinguished themselves through behaviors that “were more akin to those of confident, sophisticated natives than of vulnerable newcomers.” They displayed “linguistic dexterity, cultural plasticity, and social agility.” The term “Creole” is supposed to denote transformations in identity through encounters across cultural difference. Berlin applied this term to a generation that preceded the consolidation of plantation systems (prior to the 18th century), even though he alluded to the possibility of using this concept spatially, too—to describe Afro-descendants living outside plantation systems as late as the end of the 18th century. Landers 1999 (cited under Overviews) took up this latter approach systematically. Scholars have since applied the label “Atlantic Creoles” broadly to cultural and political brokers who drew on repertoires from Africa, Europe, and the Americas as seamen, traders, diplomats, litigants, settlers, wives, workers, or healers. According to Berlin, the term was not meant to obscure the violence that Afro-descendants were subjected to, but to capture a historical moment when racial categories were more fluid and some could access opportunities. Berlin’s piece has a vast legacy. It drew attention to an array of Afro-diasporic experiences and emphasized the role of West Africans in the making of early Atlantic networks. Since 1996, attention to Africans in Atlantic networks has expanded. Scholars have also examined more closely how their actions and trajectories can shed light on the arc of African history, not just the American one. Yet some scholars have critiqued the term “Atlantic Creoles” for excessive capaciousness. 
In Ferreira 2012 (cited under 18th Century and the Age of Revolutions), Roquinaldo Ferreira argues that it obliterates the specificity of African experiences within pluralistic communities in Africa. Other scholars have critiqued it for romanticizing mobility and insertion into state apparatuses. Upward mobility for some Afro-descendants could often only come with fewer opportunities for enslaved people. Finally, the term assumes a somewhat linear identity formation. In Sweet 2013 (cited under Healing, Religion, and Science), James Sweet argues that historians too often assume that Creole Afro-descendant identities move away from African cosmologies toward Western ones.


1999 ◽  
Vol 92 (9) ◽  
pp. 780-784
Author(s):  
Lawrence M. Lesser

Although introducing technology into our mathematics curricula allows us to tackle problems of size and complexity as never before, we face a danger of introducing tools to students before they have a sufficient understanding of how mathematics content within their reach can be used to shed light on the algorithms within the tools or on the use of the tools themselves. Fortunately, we can view mathematical theory and technology not as opponents but rather as partners that make the whole of mathematical understanding richer than the sum of its parts. Indeed, bringing technology into our classrooms can encourage new questions that technology-free mathematics must answer. This article focuses mainly on a common example in technology-rich mathematics curricula, namely, the line of best fit, followed by a discussion of two additional examples—interpolating polynomials and complete graphs. In each case, connections between theory and technology do not appear to be as widely known and used as they could be.
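For the line-of-best-fit example, the "mathematics within reach" behind the button is the closed-form least-squares solution. A short sketch with made-up data (not from the article):

```python
# Least-squares line of best fit from the closed-form normal equations:
# slope = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2),
# intercept = y_bar - slope * x_bar.
# Data are hypothetical, for illustration only.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
sxx = sum((x - mean_x) ** 2 for x in xs)
slope = sxy / sxx
intercept = mean_y - slope * mean_x
print(slope, intercept)
```

Working the formulas by hand on a small data set like this is exactly the kind of check that demystifies what a graphing calculator's regression command is doing.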


Dialogue ◽  
1975 ◽  
Vol 14 (2) ◽  
pp. 241-253 ◽  
Author(s):  
Michael Radner

Logicism is the doctrine that mathematics is reducible to logic. It is usually presented in two theses: (1) Every mathematical concept is definable in terms of logical concepts. (2) Every mathematical theorem is deducible from logical principles. In this paper, I am not concerned with the truth or falsity of (1) and (2). Rather, I am concerned with the underlying philosophical system. Logicism is connected with the names of Frege, Russell (and Whitehead). But the logicism which is familiar to most philosophers is not the original logicist system of Russell. Instead, we read either Russell's later Introduction to Mathematical Philosophy or articles by Carnap and Hempel. My purpose in this paper is to return to the original logicist system of Russell. This system, at least in essentials, lasts through the publication of the First Edition of Principia Mathematica. I believe that examination of this system will shed light on why certain difficulties arose in later logicism, including the logicist views of the Logical Empiricist movement. Further, the issues are closely connected with general doctrines on the nature of philosophical analysis.


2020 ◽  
Author(s):  
Xin Zhang

The study of visual illusions is an old subject and an important part of the psychology of human visual perception, but hitherto there has been no single principle able to explain radically different kinds of visual illusions conjointly. Such a principle does exist, as is to be shown, and has the virtue of being rigorous: it is the mathematical theory of Fourier analysis. A great many visual illusions are what happen when the visual objects involved undergo certain frequency filtering, a concept deduced from Fourier analysis. Phenomena thus explained belong in these distinct categories: brightness illusions, colour illusions, geometrical illusions, and motion illusions, all of which have been simulated with computer programmes based on this mathematical principle. Visual illusions obeying this principle have in fact been depicted in Western painting for centuries, and art can in certain ways shed light on the quest for the understanding of human vision.
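The frequency-filtering idea can be sketched in one dimension: low-pass filter a sharp "luminance edge" by zeroing its high-frequency discrete Fourier coefficients. The signal and cutoff below are hypothetical illustrations, not the author's simulations:

```python
import cmath

# Low-pass filtering via the discrete Fourier transform: a sharp 1-D
# luminance edge is transformed, its high-frequency coefficients are
# zeroed, and the inverse transform yields a smoothed edge with ringing.
def dft(xs):
    n = len(xs)
    return [sum(xs[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(cs):
    n = len(cs)
    return [sum(cs[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

signal = [0.0] * 8 + [1.0] * 8      # a sharp luminance edge
coeffs = dft(signal)
n = len(signal)
keep = 2                            # cutoff: keep only the lowest bands
for k in range(n):
    band = min(k, n - k)            # symmetric frequency index
    if band > keep:
        coeffs[k] = 0
blurred = idft(coeffs)              # smoothed edge, with over/undershoot
print([round(v, 2) for v in blurred])
```

The over- and undershoot near the filtered edge is the one-dimensional analogue of the brightness bands seen in several of the illusions the article discusses.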


Author(s):  
Hilmi Demir

Philosophers have used information theoretic concepts and theorems for philosophical purposes since the publication of Shannon’s seminal work, “The Mathematical Theory of Communication”. The efforts of different philosophers led to the formation of Philosophy of Information as a subfield of philosophy in the late 1990s (Floridi, in press). Although a significant part of those efforts was devoted to the mathematical formalism of information and communication theory, a thorough analysis of the fundamental mathematical properties of information-carrying relations has not yet been done. The point here is that a thorough analysis of the fundamental properties of information-carrying relations will shed light on some important controversies. The overall aim of this chapter is to begin this process of elucidation. It therefore includes a detailed examination of three semantic theories of information: Dretske’s entropy-based framework, Harms’ theory of mutual information and Cohen and Meskin’s counterfactual theory. These three theories are selected because they represent all lines of reasoning available in the literature in regard to the relevance of Shannon’s mathematical theory of information for philosophical purposes. Thus, the immediate goal is to cover the entire landscape of the literature with respect to this criterion. Moreover, this chapter offers a novel analysis of the transitivity of information-carrying relations.
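The Shannon quantities these semantic theories build on can be computed directly. A toy sketch, with a made-up joint distribution over a signal S and a world state W (not an example from the chapter):

```python
import math

# Mutual information I(S; W) = sum over (s, w) of
#   p(s, w) * log2( p(s, w) / (p(s) * p(w)) ),
# computed for a hypothetical joint distribution.
joint = {                      # P(S = s, W = w)
    ("high", "rain"): 0.4,
    ("high", "dry"):  0.1,
    ("low",  "rain"): 0.1,
    ("low",  "dry"):  0.4,
}

def marginal(joint, axis):
    out = {}
    for pair, p in joint.items():
        out[pair[axis]] = out.get(pair[axis], 0.0) + p
    return out

p_s = marginal(joint, 0)       # P(S)
p_w = marginal(joint, 1)       # P(W)

mi = sum(p * math.log2(p / (p_s[s] * p_w[w]))
         for (s, w), p in joint.items())
print(round(mi, 3))
```

A positive value means the signal carries information about the world state; whether such correlational facts suffice for *semantic* information is precisely what the three theories dispute.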


2018 ◽  
Vol 12 (1) ◽  
pp. 97-143 ◽  
Author(s):  
MARCO PANZA ◽  
ANDREA SERENI

Recent discussions on Fregean and neo-Fregean foundations for arithmetic and real analysis pay much attention to what is called either ‘Application Constraint’ ($AC$) or ‘Frege Constraint’ ($FC$), the requirement that a mathematical theory be so outlined that it immediately allows an explanation of its applicability. We distinguish between two constraints, which we denote by these two names, respectively, by showing how $AC$ generalizes Frege’s views while $FC$ comes closer to his original conceptions. Different authors diverge on the interpretation of $FC$ and on whether it applies to definitions of both natural and real numbers. Our aim is to trace the origins of $FC$ and to explore how different understandings of it can be faithful to Frege’s views about such definitions and to his foundational program. After rehearsing the essential elements of the relevant debate (§1), we appropriately distinguish $AC$ from $FC$ (§2). We discuss six rationales which may motivate the adoption of different instances of $AC$ and $FC$ (§3). We turn to the possible interpretations of $FC$ (§4), and advance a Semantic $FC$ (§4.1), arguing that while it suits Frege’s definition of natural numbers (§4.1.1), it cannot reasonably be imposed on definitions of real numbers (§4.1.2), for reasons only partly similar to those offered by Crispin Wright (§4.1.3). We then rehearse a recent exchange between Bob Hale and Vadim Batitzky to shed light on Frege’s conception of real numbers and magnitudes (§4.2). We argue that an Architectonic version of $FC$ is indeed faithful to Frege’s definition of real numbers, and compatible with his views on natural ones. Finally, we consider how attributing different instances of $FC$ to Frege and appreciating the role of the Architectonic $FC$ can provide a more perspicuous understanding of his foundational program, by questioning common pictures of his logicism (§5).


2019 ◽  
Vol 47 (6) ◽  
pp. 1733-1747 ◽  
Author(s):  
Christina Klausen ◽  
Fabian Kaiser ◽  
Birthe Stüven ◽  
Jan N. Hansen ◽  
Dagmar Wachten

The second messenger 3′,5′-cyclic adenosine monophosphate (cAMP) plays a key role in signal transduction across prokaryotes and eukaryotes. Cyclic AMP signaling is compartmentalized into microdomains to fulfil specific functions. To define the function of cAMP within these microdomains, signaling needs to be analyzed with spatio-temporal precision. To this end, optogenetic approaches and genetically encoded fluorescent biosensors are particularly well suited. Synthesis and hydrolysis of cAMP can be directly manipulated by photoactivated adenylyl cyclases (PACs) and light-regulated phosphodiesterases (PDEs), respectively. In addition, many biosensors have been designed to spatially and temporally resolve cAMP dynamics in the cell. This review provides an overview of optogenetic tools and biosensors to shed light on the subcellular organization of cAMP signaling.


2019 ◽  
Vol 62 (3) ◽  
pp. 577-586 ◽  
Author(s):  
Garnett P. McMillan ◽  
John B. Cannon

Purpose: This article presents a basic exploration of Bayesian inference to inform researchers unfamiliar with this type of analysis of the many advantages this readily available approach provides.
Method: First, we demonstrate the development of Bayes' theorem, the cornerstone of Bayesian statistics, into an iterative process of updating priors. Working with a few assumptions, including normality and conjugacy of the prior distribution, we express how one would calculate the posterior distribution using the prior distribution and the likelihood of the parameter. Next, we move to an example in auditory research by considering the effect of sound therapy for reducing the perceived loudness of tinnitus. In this case, as in most real-world settings, we turn to Markov chain simulations because the assumptions allowing for easy calculations no longer hold. Using Markov chain Monte Carlo methods, we illustrate several analysis solutions given by a straightforward Bayesian approach.
Conclusion: Bayesian methods are widely applicable and can help scientists overcome analysis problems, including how to include existing information, run interim analyses, achieve consensus through measurement, and, most importantly, interpret results correctly.
Supplemental Material: https://doi.org/10.23641/asha.7822592
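The conjugate-normal case the abstract begins with has a closed-form answer: with a Normal prior on a mean and Normal data of known variance, the posterior is again Normal. A minimal sketch; the function and all numbers (a hypothetical prior on a tinnitus-loudness reduction, plus four hypothetical ratings) are illustrative, not the authors' data:

```python
# Conjugate update: Normal prior on an unknown mean, Normal data with
# known variance. Posterior precision is the sum of the precisions;
# posterior mean is the precision-weighted average of prior and data.
def normal_posterior(prior_mean, prior_var, data, data_var):
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / data_var)
    post_mean = post_var * (prior_mean / prior_var + sum(data) / data_var)
    return post_mean, post_var

# Hypothetical: vague prior on the loudness reduction (dB), 4 observations.
mean, var = normal_posterior(prior_mean=0.0, prior_var=4.0,
                             data=[2.0, 3.0, 2.5, 3.5], data_var=1.0)
print(round(mean, 3), round(var, 3))
```

Once the prior or likelihood leaves this conjugate family, the posterior no longer has a closed form, which is exactly where the article turns to Markov chain Monte Carlo.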


2020 ◽  
Vol 29 (3S) ◽  
pp. 631-637
Author(s):  
Katja Lund ◽  
Rodrigo Ordoñez ◽  
Jens Bo Nielsen ◽  
Dorte Hammershøi

Purpose: The aim of this study was to develop a tool to gain insight into the daily experiences of new hearing aid users and to shed light on aspects of aided performance that may not be unveiled through standard questionnaires.
Method: The tool was developed based on clinical observations, patient experiences, expert involvement, and existing validated hearing rehabilitation questionnaires.
Results: An online tool for collecting data related to hearing aid use was developed. The tool is based on 453 prefabricated sentences representing experiences within 13 categories related to hearing aid use.
Conclusions: The tool has the potential to reflect a wide range of individual experiences with hearing aid use, including auditory and nonauditory aspects. These experiences may hold important knowledge for both the patient and the professional in the hearing rehabilitation process.

