A natural extension of natural deduction

1984 ◽  
Vol 49 (4) ◽  
pp. 1284-1300 ◽  
Author(s):  
Peter Schroeder-Heister

One of the main ideas of calculi of natural deduction, as introduced by Jaśkowski and Gentzen, is that assumptions may be discharged in the course of a derivation. As regards sentential logic, this conception will be extended in so far as not only formulas but also rules may serve as assumptions which can be discharged. The resulting calculi and derivations with rules of any finite level are informally introduced in §1, while §§2 and 3 state formal definitions of the concepts involved and basic lemmata. Within this framework, a standard form for introduction and elimination rules for arbitrary n-ary sentential operators is motivated in §4, understood as a contribution to the theory of meaning for logical signs. §5 proves that the set {&, ∨, ⊃, ⊥} of standard intuitionistic connectives is complete, i.e. &, ∨, ⊃, and ⊥ suffice to express each n-ary sentential operator having rules of the standard form given in §4. §6 makes some remarks on related approaches. For an extension of the conception presented here to quantifier logic, see [11].
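As a hedged illustration of what it means for a rule, and not just a formula, to be discharged, the following LaTeX sketch reconstructs a level-2 rule from the abstract's description; the notation (⇒ for an assumed rule, brackets for discharge) is a reconstruction, not necessarily the paper's own:

```latex
% Level-1 discharge (ordinary conditional proof): the *formula* A is
% assumed and then discharged.
\[
  \frac{\begin{array}{c}[A]\\ \vdots\\ B\end{array}}{A \supset B}
\]
% Level-2 discharge: the assumption discharged is itself a *rule*
% (A => B), as in this generalized elimination rule for implication:
% C follows from A -> B if C is derivable using the rule licensing
% passage from A to B.
\[
  \frac{A \supset B \qquad
        \begin{array}{c}[A \Rightarrow B]\\ \vdots\\ C\end{array}}
       {C}
\]
```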


2021 ◽  
Author(s):  
Alex Belikov

In this work, we propose a variant of so-called informational semantics, a technique elaborated by Voishvillo, for two infectious logics: Deutsch's ${\mathbf{S}_{\mathbf{fde}}}$ and Szmuc's $\mathbf{dS}_{\mathbf{fde}}$. We show how the machinery of informational semantics can be effectively used to analyse the truth and falsity conditions of disjunction and conjunction. Using this technique, it is possible to claim that disjunction and conjunction can rightfully be regarded as such, a claim which was disputed in the recent literature. Both ${\mathbf{S}_{\mathbf{fde}}}$ and $\mathbf{dS}_{\mathbf{fde}}$ are formalized in terms of natural deduction. This allows us to solve several problems: to develop a natural deduction calculus for ${\mathbf{S}_{\mathbf{fde}}}$ containing the standard form of disjunction elimination (in contrast to the calculus by Petrukhin), to introduce the first natural deduction calculus for $\mathbf{dS}_{\mathbf{fde}}$, and to reflect the fundamental symmetry between ${\mathbf{S}_{\mathbf{fde}}}$ and $\mathbf{dS}_{\mathbf{fde}}$ at the proof-theoretical level, forming a convenient basis for obtaining their well-known extensions $\mathbf{K}^{\mathbf{w}}_{\mathbf{3}}$ and $\mathbf{PWK}$.
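To make "infectious" concrete, here is a minimal Python sketch under the common four-valued presentation found in the literature (values t, b, n, f; n infectious in ${\mathbf{S}_{\mathbf{fde}}}$, b infectious in $\mathbf{dS}_{\mathbf{fde}}$); the value names and the linear-order trick are assumptions of this sketch, not taken from the paper:

```python
# "Infectious" truth tables: the infectious value absorbs every other input.
ORDER = {'f': 0, 'n': 1, 'b': 2, 't': 3}  # adequate once the infectious value is set aside

def conj(x, y, infectious):
    """Conjunction: any infectious input makes the output infectious."""
    if infectious in (x, y):
        return infectious
    return min(x, y, key=ORDER.get)

def disj(x, y, infectious):
    """Disjunction: dual behaviour, same infectious clause."""
    if infectious in (x, y):
        return infectious
    return max(x, y, key=ORDER.get)

# S_fde: 'n' is infectious;  dS_fde: 'b' is infectious.
print(disj('t', 'n', infectious='n'))  # -> n : a true disjunct cannot rescue it
print(conj('f', 'b', infectious='b'))  # -> b : a false conjunct cannot override it
```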



2019 ◽  
Vol 16 (5) ◽  
pp. 159
Author(s):  
Nissim Francez

The paper presents a plan for negation, proposing a paradigm shift away from the Australian plan for negation and leading to a family of contra-classical logics. The two main ideas are the following: (1) instead of shifting points of evaluation (in a frame), shift the evaluated formula; (2) introduce an incompatibility set for every atomic formula, extended to any compound formula, and impose the condition on valuations that a formula evaluates to true iff all the formulas in its incompatibility set evaluate to false. Thus, atomic sentences are not independent in their truth-values. The resulting negation, in addition to excluding the negated formula, provides a positive alternative to it. I also present sound and complete natural deduction proof systems for those logics. In addition, the kind of negation considered in this paper is shown to provide an innovative notion of grounding negation.
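A brute-force Python sketch of the valuation condition just described (the atoms and incompatibility sets below are made up purely for illustration): a valuation is admissible exactly when each atom is true iff everything in its incompatibility set is false.

```python
# Toy illustration of incompatibility-set semantics for atoms only.
# inc[p] lists the atoms incompatible with p; an admissible valuation v
# must satisfy: v[p] == True  iff  all q in inc[p] are False.
from itertools import product

atoms = ['p', 'q', 'r']
inc = {'p': {'q'}, 'q': {'p'}, 'r': {'p', 'q'}}

def admissible(v):
    return all(v[a] == all(not v[b] for b in inc[a]) for a in atoms)

for bits in product([True, False], repeat=len(atoms)):
    v = dict(zip(atoms, bits))
    if admissible(v):
        print(v)
# Only {'p': True, 'q': False, 'r': False} and
# {'p': False, 'q': True, 'r': False} survive: the atoms are
# not independent in their truth values.
```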



2016 ◽  
Vol 31 (3) ◽  
pp. 278-321 ◽  
Author(s):  
Fu Zhang ◽  
Jingwei Cheng ◽  
Zongmin Ma

Ontology, as a standard (World Wide Web Consortium recommendation) for representing knowledge in the Semantic Web, has become a fundamental and critical component for developing applications in different real-world scenarios. However, it is widely pointed out that the classical ontology model is not sufficient to deal with the imprecise and vague knowledge that strongly characterizes some real-world applications. Thus, a requirement of extending ontologies naturally arises in many practical applications of knowledge-based systems, in particular the Semantic Web. In order to provide the necessary means to handle such vague and imprecise information, there are today many proposals for fuzzy extensions to ontologies, and the literature on fuzzy ontologies has been flourishing. To investigate fuzzy ontologies, to help readers grasp their main ideas and results, and to highlight ongoing research on fuzzy approaches to knowledge representation based on ontologies as well as their applications in various domains, in this paper we provide a comprehensive overview of fuzzy ontologies. In detail, we first introduce fuzzy ontologies from the most common aspects such as representation (including categories, formal definitions, representation languages, and tools of fuzzy ontologies), reasoning (including reasoning techniques and reasoners), and applications (the most relevant applications of fuzzy ontologies). Then, other important issues on fuzzy ontologies, such as construction, mapping, integration, query, storage, evaluation, extension, and directions for future research, are discussed in detail. We also make some comparisons and analyses throughout our review.
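For readers new to the area, a tiny Python sketch (ours, not from the survey) of the core idea: membership in a fuzzy concept is a degree in [0, 1], combined here with the Zadeh min/max operators, which are just one of several operator families such surveys cover:

```python
# Fuzzy concept membership: degrees in [0, 1] instead of crisp yes/no.
# The individuals, concepts, and degrees below are illustrative only.
tall   = {'alice': 0.9, 'bob': 0.4}
runner = {'alice': 0.7, 'bob': 0.8}

def c_and(c1, c2, x):          # fuzzy concept intersection: min t-norm
    return min(c1[x], c2[x])

def c_or(c1, c2, x):           # fuzzy concept union: max t-conorm
    return max(c1[x], c2[x])

print(c_and(tall, runner, 'alice'))  # 0.7 -> alice is a TallRunner to degree 0.7
```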



2020 ◽  
pp. 32-78
Author(s):  
Pieter Adriaans

A computational theory of meaning tries to understand the phenomenon of meaning in terms of computation. Here we give an analysis in the context of Kolmogorov complexity. This theory measures the complexity of a data set in terms of the length of the smallest program that generates the data set on a universal computer. As a natural extension, the set of all programs that produce a data set on a computer can be interpreted as the set of meanings of the data set. We give an analysis of the Kolmogorov structure function and some other attempts to formulate a mathematical theory of meaning in terms of two-part optimal model selection. We show that such theories will always be context dependent: the invariance conditions that make Kolmogorov complexity a valid theory of measurement fail for this more general notion of meaning. One cause is polysemy: one data set (i.e., a string of symbols) can have different programs, sharing no mutual information, that compress it. Another cause is the existence of recursive bijections between ℕ and ℕ² for which the two-part code is always more efficient. This generates vacuous optimal two-part codes. We introduce a formal framework to study such contexts in the form of a theory that generalizes the concept of Turing machines to learning agents that have a memory and have access to each other’s functions in terms of a possible world semantics. In such a framework, the notions of randomness and informativeness become agent dependent. We show that such a rich framework explains many of the anomalies of the current theory of algorithmic complexity. It also provides perspectives for, among other things, the study of cognitive and social processes. Finally, we sketch some application paradigms of the theory.
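The recursive-bijection point can be made concrete. Below is a Python sketch using the Cantor pairing function, a standard bijection between ℕ² and ℕ (the chapter does not single out a particular bijection; this one is chosen for illustration):

```python
# Cantor pairing: a recursive bijection N x N -> N and its inverse.
import math

def pair(m, n):
    """Encode the pair (m, n) as a single natural number."""
    return (m + n) * (m + n + 1) // 2 + n

def unpair(z):
    """Recover (m, n) from z; exact inverse of pair."""
    w = (math.isqrt(8 * z + 1) - 1) // 2   # largest w with w*(w+1)/2 <= z
    t = w * (w + 1) // 2
    n = z - t
    return w - n, n

assert all(unpair(pair(m, n)) == (m, n)
           for m in range(50) for n in range(50))
# Because the encoding is lossless and cheap, any data set z can be split
# into a "model part" m and a "data part" n: an optimal two-part code
# that is vacuous, carrying no meaningful model information.
```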



Author(s):  
Rick Salmon

This first chapter reviews the fundamental principles of fluid mechanics, emphasizing the relationship between the underlying microscopic description of the fluid as a swarm of molecules, and the much more useful (but less genuine) macroscopic description of the fluid as a set of continuous fields. Although it is certainly possible to study fluids without recognizing their true particulate nature, such an approach avoids important ideas about averaging that are needed later, especially in the study of turbulence. It is best to encounter these ideas at the earliest opportunity. Moreover, even if one adopts a strictly macroscopic viewpoint (as we eventually shall), one still has the choice between Eulerian and Lagrangian field theories. The Eulerian theory is the more useful and succinct, and most textbooks employ it exclusively. However, the Lagrangian theory, which regards the fluid as a continuous field of particles, is the more complete and illuminating, and it represents a natural extension of the ideas associated with the underlying molecular dynamics to the macroscopic level of description.

All fluids are composed of molecules. We shall regard these molecules as point masses that exactly obey Newton’s laws of motion. This assumption is not precisely correct; the molecular motions are really governed by quantum mechanics. However, quantum effects are frequently unimportant, and the main ideas we want to develop are independent of the precise nature of the underlying molecular dynamics. It only matters that there be an exact underlying molecular dynamics, so that one could in principle predict the behavior of the whole fluid by solving the equations governing all of its molecules. It is of course utterly impractical to follow the motion of every molecule, because even the smallest volume of fluid contains an immense number of molecules. We are immediately forced to consider dynamical quantities that represent averages over many molecules.
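As a toy version of the averaging argument, the following Python sketch (a one-dimensional swarm with made-up numbers, not from the chapter) recovers a macroscopic density from discrete molecular positions by averaging over a window that contains many molecules:

```python
# Coarse-graining: a smooth density field from a swarm of point masses.
import random

random.seed(0)
molecules = [random.uniform(0.0, 1.0) for _ in range(100_000)]  # 1-D positions
mass = 1.0

def density(x, h=0.01):
    """Mean mass per unit length in a window of width h around x."""
    count = sum(1 for p in molecules if abs(p - x) < h / 2)
    return mass * count / h

print(density(0.5))  # fluctuates around 100000 for this uniform swarm
```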



1996 ◽  
Vol 6 (6) ◽  
pp. 579-612 ◽  
Author(s):  
Erik Barendsen ◽  
Sjaak Smetsers

We present two type systems for term graph rewriting: conventional typing and (polymorphic) uniqueness typing. The latter is introduced as a natural extension of simple algebraic and higher-order uniqueness typing. The systems are given in natural deduction style using an inductive syntax of graph denotations with familiar constructs such as let and case. The conventional system resembles traditional Curry-style typing systems in functional programming languages. Uniqueness typing extends this with reference count information. In both type systems, typing is preserved during evaluation, and types can be determined effectively. Moreover, with respect to a graph rewriting semantics, both type systems turn out to be sound.
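As a hedged illustration of "natural deduction style" typing, here is a conventional let rule reconstructed in LaTeX (our notation; the paper's actual rules operate on graph denotations and may differ):

```latex
\[
  \frac{\Gamma \vdash e_1 : \sigma \qquad
        \Gamma,\, x : \sigma \vdash e_2 : \tau}
       {\Gamma \vdash \mathsf{let}\ x = e_1\ \mathsf{in}\ e_2 : \tau}
\]
% Uniqueness typing refines such rules with reference-count information:
% a type may carry a uniqueness attribute (e.g. sigma^u for "unique"),
% and the rules ensure a value of unique type is referenced at most once.
```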





2006 ◽  
Vol 71 (1) ◽  
pp. 35-66 ◽  
Author(s):  
Ross T. Brady

Fitch-style natural deduction was first introduced into relevant logic by Anderson in [1960], for the sentential logic E of entailment and its quantificational extension EQ. This was extended by Anderson and Belnap to the sentential relevant logics R and T and some of their fragments in [ENT1], and further extended to a wide range of sentential and quantified relevant logics by Brady in [1984]. This was done by putting conditions on the elimination rules, →E, ~E, ∨E and ∃E, pertaining to the set of dependent hypotheses for formulae used in the application of the rule. Each of these rules was subjected to the same condition, this condition varying from logic to logic. These conditions, together with the set of natural deduction rules, precisely determine the particular relevant logic. Generally, this is a simpler representation of a relevant logic than the standard Routley-Meyer semantics, with its existential modelling conditions stated in terms of an otherwise arbitrary 3-place relation R, which is defined over a possibly infinite set of worlds. Readers are urged to refer to Brady [1984], if unfamiliar with the above natural deduction systems, but we will introduce in §2 a modified version in full detail.

Natural deduction for classical logic was invented by Jaśkowski and Gentzen, but it was Prawitz in [1965] who normalized natural deduction, streamlining its rules so as to allow a subformula property to be proved. (This key property ensures that each formula in the proof of a theorem is indeed a subformula of that theorem.)
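For orientation, an illustrative LaTeX reconstruction of the subscripted Fitch-style rules at issue (notation after Anderson and Belnap; the exact conditions vary from logic to logic and are only schematically indicated here):

```latex
% Formulas carry index sets recording the hypotheses they depend on.
\[
  \rightarrow\!E:\quad
  \frac{(A \rightarrow B)_a \qquad A_b}{B_{\,a \cup b}}
  \qquad \text{(subject to the logic-specific condition on } a, b\text{)}
\]
\[
  \rightarrow\!I:\quad
  \frac{\begin{array}{c} A_{\{k\}}\ \text{hyp}\\ \vdots\\ B_a \end{array}}
       {(A \rightarrow B)_{a \setminus \{k\}}}
  \qquad \text{provided } k \in a,
\]
% where "k in a" is the relevance requirement that the discharged
% hypothesis was actually used in deriving B.
```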



1979 ◽  
Vol 46 ◽  
pp. 368
Author(s):  
Clinton B. Ford

A “new charts program” for the American Association of Variable Star Observers was initiated in 1966 via the gift to the Association of the complete variable star observing records, charts, photographs, etc. of the late Prof. Charles P. Olivier of the University of Pennsylvania (USA). Adequate material covering about 60 variables, not previously charted by the AAVSO, was included in this original data, and was suitably charted in reproducible standard format.

Since 1966, much additional information has been assembled from other sources, and three Catalogs have been issued which list the new or revised charts produced and specify how copies may be obtained. The latest such Catalog is dated June 1978, and lists 670 different charts covering a total of 611 variables, none of which was charted in reproducible standard form previous to 1966.


