How Is It Possible for Keynes's Theory of Logical Probability to Be 'Non-Probabilistic'? Answer. Only If You Have Been Confused by Reading Frank P. Ramsey's Reviews of Keynes's A Treatise on Probability

2014 ◽  
Author(s):  
Michael Emmett Brady

1978 ◽  
Vol 17 (01) ◽  
pp. 1-10 ◽  
Author(s):  
P. Tautu ◽  
G. Wagner

This paper is an analysis of the most important mathematical aspects of medical diagnosis: logical probability, rationality and decision theory, gambling models, pattern analysis, hazy and fuzzy subsets theory and, finally, the stochastic inquiry process.



Author(s):  
Paul Humphreys

The term ‘probability’ and its cognates occur frequently in both everyday and philosophical discourse. Unlike many other concepts, it is unprofitable to view ‘probability’ as having a unique meaning. Instead, there exist a number of distinct, albeit related, concepts, of which we here mention five: the classical or equiprobable view, the relative frequency view, the subjectivist or personalist view, the propensity view, and the logical probability view. None of these captures all of our legitimate uses of the term ‘probability’, which range from the clearly subjective, as in our assessment of the likelihood of one football team beating another, through the inferential, as when one set of sentences lends a degree of inductive support to another sentence, to the obviously objective, as in the physical chance of a radioactive atom decaying in the next minute. It is often said that what all these interpretations have in common is that they are all described by the same simple mathematical theory – ‘the theory of probability’ to be found in most elementary probability textbooks – and it has traditionally been the task of any interpretation to conform to that theory. But this saying does not hold up under closer examination, and it is better to consider each approach as dealing with a separate subject matter, the structure of which determines the structure of the appropriate calculus.
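The "simple mathematical theory" referred to here is standardly given by the Kolmogorov axioms together with the ratio definition of conditional probability; as a reference point (this summary is ours, not part of the original abstract):

```latex
\begin{align*}
&\text{For a sample space } \Omega \text{ and events } A, B \subseteq \Omega:\\
&\quad P(A) \ge 0, \qquad P(\Omega) = 1,\\
&\quad P(A \cup B) = P(A) + P(B) \quad \text{whenever } A \cap B = \varnothing,\\
&\quad P(A \mid B) = \frac{P(A \cap B)}{P(B)} \quad \text{for } P(B) > 0.
\end{align*}
```

The author's point is that each interpretation need not conform to this one calculus; its subject matter may dictate a different structure.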



2018 ◽  
Vol 182 ◽  
pp. 02039
Author(s):  
David Ellerman

Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences, and distinguishability, and is formalized using the distinctions (“dits”) of a partition (a pair of points distinguished by the partition). All the definitions of simple, joint, conditional, and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this talk is to outline the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., “qudits” of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates (cohered together in the pure superposition state being measured) that are distinguished by the measurement (decohered in the postmeasurement mixed state). Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory focusing on the distinguishing of quantum states.
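The "two-draw" interpretation mentioned above can be made concrete: for a partition with block probabilities $p_i$, the logical entropy is $h = 1 - \sum_i p_i^2$, the probability that two independent draws land in different blocks (i.e., form a distinction or "dit"). A minimal sketch, with illustrative names not taken from Ellerman's text:

```python
# Logical entropy of a partition as a "two-draw" probability of a distinction.
# h(pi) = 1 - sum(p_i^2), where p_i are the block probabilities.
from fractions import Fraction

def logical_entropy(block_probs):
    """Probability that two independent draws fall in different blocks
    of the partition, i.e., form a "dit" (distinction)."""
    return 1 - sum(p * p for p in block_probs)

# A partition of a 4-element set into blocks of sizes 2, 1, 1 has block
# probabilities 1/2, 1/4, 1/4. Of the 16 ordered pairs, 10 are distinctions.
probs = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 4)]
print(logical_entropy(probs))  # 5/8
```

Exact rational arithmetic makes the pair-counting interpretation visible: the finest partition (all singletons) maximizes the count of distinguished pairs, and the blob partition (one block) yields entropy zero.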



Philosophies ◽  
2020 ◽  
Vol 5 (4) ◽  
pp. 25
Author(s):  
Chenguang Lu

Many researchers want to unify probability and logic by reasonably defining logical probability or probabilistic logic. This paper instead tries to unify statistics and logic so that we can use statistical probability and logical probability at the same time. For this purpose, it proposes the P–T probability framework, which combines Shannon’s statistical probability framework for communication, Kolmogorov’s probability axioms applied to logical probability, and Zadeh’s membership functions used as truth functions. The two kinds of probability are connected by an extended Bayes’ theorem, with which we can convert a likelihood function and a truth function into each other. Hence, we can train truth functions (in logic) from sampling distributions (in statistics). This probability framework was developed through the author’s long-term studies on semantic information, statistical learning, and color vision. The paper first proposes the P–T probability framework and explains the different probabilities in it through its applications to semantic information theory. Then, the framework and the semantic information methods are applied to statistical learning, statistical mechanics, hypothesis evaluation (including falsification), confirmation, and Bayesian reasoning. These theoretical applications illustrate the reasonability and practicability of the framework, which is also helpful for interpretable AI, although interpreting neural networks requires further study.
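The conversion between truth functions and likelihoods that the abstract describes can be sketched as follows. This is an illustrative assumption about the shape of the extended Bayes' theorem, not Lu's exact formulation: given a prior $P(x)$ and a truth (membership) function $T(\theta\mid x)$ on a discrete domain, a semantic likelihood proportional to $P(x)\,T(\theta\mid x)$ is obtained by normalization.

```python
# Hypothetical sketch: converting a truth (membership) function into a
# normalized semantic likelihood via a Bayes-like rule. The function name
# and the discretization are illustrative assumptions.

def truth_to_likelihood(prior, truth):
    """Given prior P(x) and truth function T(theta|x) over the same
    discrete x-domain, return P(x|theta) proportional to P(x)*T(theta|x)."""
    weights = [p * t for p, t in zip(prior, truth)]
    total = sum(weights)
    return [w / total for w in weights]

# Uniform prior over four x-values; a fuzzy truth function peaked at x3.
prior = [0.25, 0.25, 0.25, 0.25]
truth = [0.0, 0.5, 1.0, 0.5]
posterior = truth_to_likelihood(prior, truth)
print(posterior)  # [0.0, 0.25, 0.5, 0.25]
```

The point of the construction is that a membership function need not sum to one over $x$; normalization against the prior is what turns it into something usable alongside statistical likelihoods.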



2018 ◽  
Vol 225 (2) ◽  
pp. 425-450
Author(s):  
Dr. Karim Mousa Hussein Mezban

This research elaborates the various models of probability theory adopted by a number of twentieth-century philosophers of science, and traces the debate among them over the theory's validity as it appears in the philosophical literature. The research is divided into five sections. Part (1) presents the classical theory of probability, the basic model of probability theory. Part (2) covers the frequency theory of probability adopted by the philosopher of science Hans Reichenbach to remedy the shortcomings of the classical model. Part (3) treats the theory of logical probability adopted by the philosopher of science Rudolf Carnap to fill the logical deficit in Reichenbach's frequency theory, and includes the dispute between them. Part (4) addresses the propensity theory of probability and is divided into two sections: Section 4.a covers the pragmatist account adopted by the philosopher Charles Peirce, which embeds the pragmatist tendency in the concept of probability; Section 4.b covers Karl Popper's propensity theory of probability, with which he defended the claim that probability cannot justify the method of induction. Part (5) discusses the entropic theory of probability, a contemporary account that recasts all the previous models of probability in terms of information theory.



2020 ◽  
Vol 11 (4) ◽  
pp. 429-445
Author(s):  
Nicholas Danne ◽  

To justify inductive inference and vanquish classical skepticisms about human memory, external world realism, etc., Richard Fumerton proposes his “inferential internalism,” an epistemology whereby humans ‘see’ by Russellian acquaintance Keynesian probable relations (PRs) between propositions. PRs are a priori necessary relations of logical probability, akin to but not reducible to logical entailments, such that perceiving a PR between one’s evidence E and proposition P of unknown truth value justifies rational belief in P to an objective degree. A recent critic of inferential internalism is Alan Rhoda, who questions its psychological plausibility. Rhoda argues that in order to see necessary relations between propositions E and P, one would need acquaintance with too many propositions at once, since our evidence E is often complex. In this paper, I criticize Rhoda’s implausibility objection as too quick. Referencing the causal status effect (CSE) from psychology, I argue that some of the complex features of evidence E contribute to our type-categorizing it as E-type, and thus we do not need to ‘see’ all of the complex features when we see the PR between E and P. My argument leaves unchanged Fumerton’s justificatory role for the PR, but enhances its psychological plausibility.



The Monist ◽  
2018 ◽  
Vol 101 (4) ◽  
pp. 417-440
Author(s):  
Marta Sznajder

