semantic information theory
Recently Published Documents

TOTAL DOCUMENTS: 7 (FIVE YEARS: 2)
H-INDEX: 2 (FIVE YEARS: 1)

Philosophies, 2020, Vol. 5 (4), p. 25
Author(s): Chenguang Lu

Many researchers want to unify probability and logic by reasonably defining logical probability or probabilistic logic. This paper tries to unify statistics and logic so that we can use statistical probability and logical probability at the same time. For this purpose, it proposes the P–T probability framework, which assembles Shannon's statistical probability framework for communication, Kolmogorov's probability axioms for logical probability, and Zadeh's membership functions used as truth functions. The two kinds of probabilities are connected by an extended Bayes' theorem, with which we can convert a likelihood function and a truth function into each other. Hence, we can train truth functions (in logic) with sampling distributions (in statistics). This probability framework was developed in the author's long-term studies on semantic information, statistical learning, and color vision. The paper first proposes the P–T probability framework and explains the different probabilities in it through its applications to semantic information theory. Then, the framework and the semantic information methods are applied to statistical learning, statistical mechanics, hypothesis evaluation (including falsification), confirmation, and Bayesian reasoning. These theoretical applications illustrate the reasonability and practicability of the framework, which is also helpful for interpretable AI; interpreting neural networks with it requires further study.
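The conversion the abstract describes can be sketched in code. This is a minimal, hedged illustration of the general idea, assuming the Bayes-like form P(x|θ) ∝ P(x)·T(θ|x) with the logical probability T(θ) = Σ P(x)·T(θ|x) as normalizer; the "adult" hypothesis, the ages, and the membership values are invented for illustration and are not taken from the paper.

```python
# Sketch: converting a truth (membership) function into a likelihood
# function via a Bayes-like formula, as described in the abstract above.
# All concrete numbers below are illustrative assumptions.

def semantic_bayes(prior, truth):
    """Return P(x|theta) proportional to P(x) * T(theta|x).

    prior: statistical probabilities P(x) over the instances x.
    truth: truth-function values T(theta|x) in [0, 1] for hypothesis theta.
    """
    # Logical probability T(theta) = sum_x P(x) * T(theta|x)
    t_theta = sum(p * t for p, t in zip(prior, truth))
    return [p * t / t_theta for p, t in zip(prior, truth)]

# Ages 10, 16, 20, 30 with a uniform prior, and a fuzzy truth function
# for the hypothesis "x is an adult" (membership values are made up).
prior = [0.25, 0.25, 0.25, 0.25]
truth = [0.0, 0.4, 1.0, 1.0]

likelihood = semantic_bayes(prior, truth)
print(likelihood)  # probability mass shifts toward ages where the hypothesis is truer
```

The resulting likelihood sums to one and assigns zero mass where the hypothesis is plainly false, which is the sense in which a truth function (logic) can stand in for a sampling distribution (statistics).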


Entropy, 2020, Vol. 22 (4), p. 384
Author(s): Chenguang Lu

After long arguments between positivism and falsificationism, the verification of universal hypotheses was replaced with the confirmation of uncertain major premises. Unfortunately, Hempel proposed the Raven Paradox. Then, Carnap used the increment of logical probability as the confirmation measure. Many confirmation measures have since been proposed. Among them, measure F, proposed by Kemeny and Oppenheim, possesses the symmetries and asymmetries proposed by Eells and Fitelson, the monotonicity proposed by Greco et al., and the normalizing property suggested by many researchers. Based on the semantic information theory, a measure b* similar to F is derived from the medical test. Like the likelihood ratio, measures b* and F can only indicate the quality of channels or testing means, not the quality of probability predictions. Furthermore, it is still not easy to use b*, F, or any other measure to clarify the Raven Paradox. For this reason, a measure c* similar to the correct rate is derived. Measure c* supports the Nicod Criterion and undermines the Equivalence Condition, and hence can be used to eliminate the Raven Paradox. An example indicates that measures F and b* are helpful for diagnosing novel coronavirus infections, whereas the most popular confirmation measures are not. Another example reveals that no popular confirmation measure can explain why a black raven confirms "Ravens are black" more strongly than a piece of chalk does. Measures F, b*, and c* indicate that the existence of fewer counterexamples matters more than the existence of more positive examples, and hence they are compatible with Popper's falsification thought.
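The Kemeny–Oppenheim measure F mentioned above has the known closed form F(h, e) = (P(e|h) − P(e|¬h)) / (P(e|h) + P(e|¬h)). The sketch below computes it for a medical test from sensitivity and false-positive rate; the specific test numbers are illustrative assumptions, not values from the paper.

```python
# Sketch: the Kemeny-Oppenheim confirmation measure F for a medical test.
# F lies in [-1, 1]: +1 if the evidence occurs only with the hypothesis,
# -1 if it occurs only without it, 0 if it is uninformative.

def measure_f(p_e_given_h, p_e_given_not_h):
    """F(h, e) = (P(e|h) - P(e|~h)) / (P(e|h) + P(e|~h))."""
    return (p_e_given_h - p_e_given_not_h) / (p_e_given_h + p_e_given_not_h)

# A test with 95% sensitivity and a 2% false-positive rate (made-up numbers).
# Like the likelihood ratio, F depends only on these channel properties,
# not on the disease prevalence -- which is the abstract's point that F
# rates the testing means rather than the probability prediction.
f = measure_f(0.95, 0.02)
print(round(f, 3))  # 0.959
```

Because F is built solely from P(e|h) and P(e|¬h), two populations with very different prevalence get the same F for the same test, matching the abstract's remark that F and b* grade channels, not predictions.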


2018, Vol. 22 (1), pp. 139-151
Author(s): Samir Gorsky

Several logical puzzles, riddles, and problems are defined based on the notion of games in informative contexts. Hintikka argues that epistemology, or the theory of knowledge, must be considered from the notion of information, so knowledge cannot be based only on the notions of belief and justification. The present proposal focuses on the logical structure of information, not only on the quantification of information suggested by Claude E. Shannon (1916-2001) (Shannon 1948). In many cases, bits of information, although seemingly or factually contradictory, are quite relevant. Paraconsistent systems of logic offer a formalization of reasoning that can tolerate certain contradictions. The well-known "Bar-Hillel–Carnap Paradox" (Bar-Hillel, 1964) causes embarrassment when it concludes that the informational content of a contradiction would be maximal, opposing the traditional notion that semantic information must be true and that contradictions are necessarily false.
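The paradox the abstract cites falls out of the classical semantic-information definitions cont(s) = 1 − p(s) and inf(s) = −log₂ p(s). The toy sentences and probabilities below are illustrative assumptions used only to show why a contradiction comes out maximally informative.

```python
import math

# Sketch of the Bar-Hillel-Carnap observation: under the classical
# content measure cont(s) = 1 - p(s) and information measure
# inf(s) = -log2 p(s), a contradiction (p = 0) carries maximal,
# even infinite, information content.

def cont(p):
    """Content measure: cont(s) = 1 - p(s)."""
    return 1.0 - p

def inf(p):
    """Information measure in bits: inf(s) = -log2 p(s)."""
    return math.inf if p == 0.0 else -math.log2(p)

# Illustrative probabilities for three kinds of sentences.
for label, p in [("tautology", 1.0), ("contingent", 0.25), ("contradiction", 0.0)]:
    print(f"{label}: cont={cont(p)}, inf={inf(p)}")
# The contradiction maximizes both measures -- a necessarily false
# sentence looks maximally informative, which is the paradox.
```

Paraconsistent approaches, as the abstract suggests, motivate revising these measures so that a contradictory but still usable bundle of information is not automatically scored as the most informative message possible.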

