Normalized natural deduction systems for some relevant logics I: The logic DW

2006 ◽  
Vol 71 (1) ◽  
pp. 35-66 ◽  
Author(s):  
Ross T. Brady

Fitch-style natural deduction was first introduced into relevant logic by Anderson in [1960], for the sentential logic E of entailment and its quantificational extension EQ. This was extended by Anderson and Belnap to the sentential relevant logics R and T and some of their fragments in [ENT1], and further extended to a wide range of sentential and quantified relevant logics by Brady in [1984]. This was done by putting conditions on the elimination rules, →E, ~E, ⋁E and ∃E, pertaining to the set of dependent hypotheses for formulae used in the application of the rule. Each of these rules was subjected to the same condition, this condition varying from logic to logic. These conditions, together with the set of natural deduction rules, precisely determine the particular relevant logic. Generally, this is a simpler representation of a relevant logic than the standard Routley-Meyer semantics, with its existential modelling conditions stated in terms of an otherwise arbitrary 3-place relation R, which is defined over a possibly infinite set of worlds. Readers unfamiliar with the above natural deduction systems are urged to refer to Brady [1984], but we will introduce a modified version in full detail in §2.

Natural deduction for classical logic was invented by Jaskowski and Gentzen, but it was Prawitz in [1965] who normalized natural deduction, streamlining its rules so as to allow a subformula property to be proved. (This key property ensures that each formula in the proof of a theorem is indeed a subformula of that theorem.)
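
To indicate how the dependency mechanism works, here is a schematic sketch of the subscripting used in Anderson-Belnap-style Fitch systems (the version for R; Brady's systems vary the side conditions on the elimination rules from logic to logic):

\[
\begin{array}{ll}
(\mathrm{hyp}) & \text{a new hypothesis } A \text{ receives a fresh singleton index: } A_{\{k\}}\\[2pt]
(\to E) & \text{from } (A \to B)_{a} \text{ and } A_{b}, \text{ infer } B_{a \cup b}\\[2pt]
(\to I) & \text{from a subproof with hypothesis } A_{\{k\}} \text{ ending in } B_{b}, \text{ where } k \in b, \text{ infer } (A \to B)_{b \setminus \{k\}}
\end{array}
\]

The requirement that k ∈ b in (→I) is what enforces relevance: the hypothesis must actually have been used in deriving B, so A → A is provable while A → (B → A) is not.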

2020 ◽  
Author(s):  
Tore Fjetland Øgaard

Abstract Many relevant logics are conservatively extended by Boolean negation, but not all. This paper shows an acute form of non-conservativeness: the Boolean-free fragment of the Boolean extension of a relevant logic need not satisfy the variable-sharing property, and such an extension can in fact yield classical logic. For a vast range of relevant logics, however, it is shown that the variable-sharing property, restricted to the Boolean-free fragment, still holds for the Boolean-extended logic.
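
For reference, the variable-sharing property at issue is standardly stated as follows (the restricted version applies it only to formulas A and B from the Boolean-free fragment):

\[
\text{If } \vdash_{L} A \to B, \text{ then } A \text{ and } B \text{ share at least one propositional variable.}
\]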


1992 ◽  
Vol 57 (4) ◽  
pp. 1425-1440 ◽  
Author(s):  
Ewa Orlowska

Abstract A method is presented for constructing natural deduction-style systems for propositional relevant logics. The method consists in first translating formulas of relevant logics into ternary relations, and then defining deduction rules for a corresponding logic of ternary relations. Proof systems of that form are given for various relevant logics. A class of algebras of ternary relations is introduced that provides a relation-algebraic semantics for relevant logics.
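
For orientation, the ternary relation enters relevant logic through the Routley-Meyer truth clause for implication, which translations of this kind mirror (a standard clause, not necessarily Orlowska's exact notation):

\[
a \Vdash A \to B
\quad\Longleftrightarrow\quad
\forall b\, \forall c\, \bigl( Rabc \text{ and } b \Vdash A \;\Rightarrow\; c \Vdash B \bigr)
\]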


2019 ◽  
Vol 16 (2) ◽  
pp. 10
Author(s):  
Peter Verdée ◽  
Inge De Bal ◽  
Aleksandra Samonek

In this paper we first develop a logic-independent account of relevant implication. We propose a stipulative definition of what it means for a multiset of premises to relevantly L-imply a multiset of conclusions, where L is a Tarskian consequence relation: the premises relevantly imply the conclusions iff there is an abstraction of the pair <premises, conclusions> such that the abstracted premises L-imply the abstracted conclusions and none of the abstracted premises or conclusions can be omitted while still maintaining valid L-consequence.

Subsequently we apply this definition to the classical logic (CL) consequence relation to obtain NTR-consequence, i.e. the relevant CL-consequence relation in our sense, and present a sequent calculus for NTR that is sound and complete with respect to relevant CL-consequence. Next we add rules for an object-language relevant implication to the sequent calculus; the object-language implication reflects exactly the NTR-consequence relation. One can see the resulting logic NTR→ as a relevant logic in the traditional sense of the word.

By means of a translation into the relevant logic R, we show that NTR is very close to the relevance logics in the Anderson-Belnap-Dunn-Routley-Meyer tradition. However, unlike usual relevant logics, NTR is decidable for the full language, Disjunctive Syllogism (A and ~AvB relevantly imply B) and Adjunction (A and B relevantly imply A&B) are valid, and neither Modus Ponens nor the Cut rule is admissible.
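
Rendered schematically (with the notion of abstraction left exactly as the paper defines it, and with \(\vDash^{r}_{L}\) as an ad hoc symbol for relevant L-consequence), the definition reads:

\[
\Gamma \vDash^{r}_{L} \Delta
\iff
\text{there is an abstraction } \langle \Gamma^{*}, \Delta^{*} \rangle \text{ of } \langle \Gamma, \Delta \rangle \text{ with } \Gamma^{*} \vDash_{L} \Delta^{*},
\]
\[
\Gamma^{*} \setminus \{A\} \nvDash_{L} \Delta^{*} \ \text{ for every } A \in \Gamma^{*},
\qquad
\Gamma^{*} \nvDash_{L} \Delta^{*} \setminus \{B\} \ \text{ for every } B \in \Delta^{*}.
\]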


2021 ◽  
Vol 18 (5) ◽  
pp. 154-288
Author(s):  
Robert Meyer

The purpose of this paper is to formulate first-order Peano arithmetic within the resources of relevant logic, and to demonstrate certain properties of the system thus formulated. Striking among these properties are the facts that (1) it is trivial that relevant arithmetic is absolutely consistent, but (2) classical first-order Peano arithmetic is straightforwardly contained in relevant arithmetic. Under (1), I shall show in particular that 0 = 1 is a non-theorem of relevant arithmetic; this, of course, is exactly the formula whose unprovability was sought in the Hilbert program for proving arithmetic consistent. Under (2), I shall exhibit the requisite translation, drawing some Goedelian conclusions therefrom. Left open, however, is the critical problem whether Ackermann’s rule γ is admissible for theories of relevant arithmetic. The particular system of relevant Peano arithmetic featured in this paper shall be called R♯. Its logical base shall be the system R of relevant implication, taken in its first-order form RQ. Among other Peano arithmetics we shall consider here in particular the systems C♯, J♯, and RM3♯; these are based respectively on the classical logic C, the intuitionistic logic J, and the Sobocinski-Dunn semi-relevant logic RM3. And another feature of the paper will be the presentation of a system of natural deduction for R♯, along lines valid for first-order relevant theories in general. This formulation of R♯ makes it possible to construct relevantly valid arithmetical deductions in an easy and natural way; it is based on, but is in some respects more convenient than, the natural deduction formulations for relevant logics developed by Anderson and Belnap in Entailment.
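
As a rough indication of what formulating Peano arithmetic "within the resources of relevant logic" involves, the postulates keep their familiar shape with the relevant conditional → doing the work of material implication; the following is a standard presentation and may differ in detail from Meyer's own axiomatization of R♯:

\[
\begin{array}{ll}
x = y \to x' = y' & \qquad x' = y' \to x = y\\
\neg(x' = 0) & \qquad x = y \to (x = z \to y = z)\\
x + 0 = x & \qquad x + y' = (x + y)'\\
x \times 0 = 0 & \qquad x \times y' = (x \times y) + x
\end{array}
\]

together with induction in rule form: from \(A(0)\) and \(\forall x\,(A(x) \to A(x'))\), infer \(\forall x\, A(x)\).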


1999 ◽  
Vol 34 (1) ◽  
pp. 7-23 ◽  
Author(s):  
Stephen Read

In order to explicate Gentzen’s famous remark that the introduction-rules for logical constants give their meaning, the elimination-rules being simply consequences of the meaning so given, we develop natural deduction rules for Sheffer’s stroke, alternative denial. The first system turns out to lack Double Negation. Strengthening the introduction-rules by allowing the introduction of Sheffer’s stroke into a disjunctive context produces a complete system of classical logic, one which preserves the harmony between the rules which Gentzen wanted: all indirect proof reduces to direct proof.
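
For background (these are standard facts about alternative denial rather than Read's rules themselves): A | B is true just in case A and B are not both true, and the stroke suffices to define the other connectives, e.g.

\[
A \mid B \;\equiv\; \neg(A \wedge B), \qquad
\neg A := A \mid A, \qquad
A \wedge B := (A \mid B) \mid (A \mid B), \qquad
A \to B := A \mid (B \mid B).
\]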


Author(s):  
Timothy Williamson

The book argues that our use of conditionals is governed by imperfectly reliable heuristics, in the psychological sense of fast and frugal (or quick and dirty) ways of assessing them. The primary heuristic is this: to assess ‘If A, C’, suppose A and on that basis assess C; whatever attitude you take to C conditionally on A (such as acceptance, rejection, or something in between) take unconditionally to ‘If A, C’. This heuristic yields both the equation of the probability of ‘If A, C’ with the conditional probability of C on A and standard natural deduction rules for the conditional. However, these results can be shown to make the heuristic implicitly inconsistent, and so less than fully reliable. There is also a secondary heuristic: pass conditionals freely from one context to another under normal conditions for acceptance of sentences on the basis of memory and testimony. The effect of the secondary heuristic is to undermine interpretations on which ‘if’ introduces a special kind of context-sensitivity. On the interpretation which makes best sense of the two heuristics, ‘if’ is simply the truth-functional conditional. Apparent counterexamples to truth-functionality are artefacts of reliance on the primary heuristic in cases where it is unreliable. The second half of the book concerns counterfactual conditionals, as expressed with ‘if’ and ‘would’. It argues that ‘would’ is an independently meaningful modal operator for contextually restricted necessity: the meaning of counterfactuals is simply that derived compositionally from the meanings of their constituents, including ‘if’ and ‘would’, making them contextually restricted strict conditionals.
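
The "equation" referred to here is the familiar identification of the probability of a conditional with the corresponding conditional probability (often called the Equation, or Stalnaker's Hypothesis):

\[
P(\text{If } A,\, C) \;=\; P(C \mid A) \;=\; \frac{P(A \wedge C)}{P(A)}, \qquad \text{provided } P(A) > 0.
\]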


2021 ◽  
pp. 1-22
Author(s):  
SHAWN STANDEFER

Abstract Anderson and Belnap presented indexed Fitch-style natural deduction systems for the relevant logics R, E, and T. This work was extended by Brady to cover a range of relevant logics. In this paper I present indexed tree natural deduction systems for the Anderson–Belnap–Brady systems and show how to translate proofs in one format into proofs in the other, which establishes the adequacy of the tree systems.


2020 ◽  
pp. 31-67
Author(s):  
Timothy Williamson

This chapter argues that the Suppositional Rule is a fallible heuristic, because it has inconsistent consequences. They arise in several ways: (i) it implies standard natural deduction rules for ‘if’, and analogous but incompatible rules for refutation in place of proof; (ii) it implies the equation of the probability of ‘If A, C’ with the conditional probability of C on A, which is subject to the trivialization results of David Lewis and others; (iii) its application to complex attitudes generates further inconsistencies. The Suppositional Rule is compared to inconsistent principles built into other linguistic practices: disquotation for ‘true’ and ‘false’ generates Liar-like paradoxes; tolerance principles for vague expressions generate sorites paradoxes. Their status as fallible, semantically invalid but mostly reliable heuristics is not immediately available to competent speakers.
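
The trivialization mentioned under (ii) can be sketched in a few lines (a compressed form of Lewis's 1976 argument; assume the Equation holds for every probability function in a class closed under conditionalization, and that P(A ∧ C) and P(A ∧ ¬C) are both positive). Writing A ⇒ C for ‘If A, C’:

\[
\begin{aligned}
P(A \Rightarrow C) &= P(A \Rightarrow C \mid C)\,P(C) + P(A \Rightarrow C \mid \neg C)\,P(\neg C)\\
&= P(C \mid A \wedge C)\,P(C) + P(C \mid A \wedge \neg C)\,P(\neg C)\\
&= 1 \cdot P(C) + 0 \cdot P(\neg C) \;=\; P(C).
\end{aligned}
\]

By the Equation again, P(C | A) = P(C): A and C come out probabilistically independent whenever the displayed probabilities are defined, which no non-trivial probability function can sustain.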


2019 ◽  
Vol 29 (5) ◽  
pp. 631-663 ◽  
Author(s):  
Roberto Ciuni ◽  
Thomas Macaulay Ferguson ◽  
Damian Szmuc

Abstract A wide family of many-valued logics—for instance, those based on the weak Kleene algebra—includes a non-classical truth-value that is ‘contaminating’ in the sense that whenever the value is assigned to a formula $\varphi$, any complex formula in which $\varphi$ appears is assigned that value as well. In such systems, the contaminating value enjoys a wide range of interpretations, suggesting scenarios in which more than one of these interpretations is called for. This calls for an evaluation of systems with multiple contaminating values. In this paper, we consider the countably infinite family of multiple-conclusion consequence relations in which classical logic is enriched with one or more contaminating values whose behaviour is determined by a linear ordering between them. We consider some motivations and applications for such systems and provide general characterizations for all consequence relations in this family. Finally, we provide sequent calculi for a pair of four-valued logics including two linearly ordered contaminating values, before defining two-sided sequent calculi corresponding to each of the infinite family of many-valued logics studied in this paper.
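
To make the ‘contaminating’ behaviour concrete, the following is a minimal Python sketch of weak Kleene evaluation with a single contaminating value e: once e is assigned to any subformula, it propagates to every formula containing it. (This covers only the single-value case; the systems studied in the paper generalize it to one or more linearly ordered contaminating values.)

```python
# Minimal sketch of weak Kleene evaluation with one contaminating value "e".
# Formulas are nested tuples: ("var", name), ("not", f), ("and", f, g), ("or", f, g).

T, F, E = "t", "f", "e"  # true, false, contaminating

def wk_eval(formula, valuation):
    """Evaluate a formula under a valuation mapping variable names to T/F/E."""
    op = formula[0]
    if op == "var":
        return valuation[formula[1]]
    if op == "not":
        v = wk_eval(formula[1], valuation)
        if v == E:
            return E
        return F if v == T else T
    if op in ("and", "or"):
        a = wk_eval(formula[1], valuation)
        b = wk_eval(formula[2], valuation)
        if E in (a, b):            # contamination: e absorbs everything
            return E
        if op == "and":
            return T if a == T and b == T else F
        return T if a == T or b == T else F
    raise ValueError(f"unknown connective: {op}")

# Example: p or (q and not q). Classically true when p is true, but the
# contaminating value of q infects the whole formula.
formula = ("or", ("var", "p"), ("and", ("var", "q"), ("not", ("var", "q"))))
print(wk_eval(formula, {"p": T, "q": E}))  # -> "e"
print(wk_eval(formula, {"p": T, "q": F}))  # -> "t"
```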

