Gentzen's Proof of Normalization for Natural Deduction

2008 ◽  
Vol 14 (2) ◽  
pp. 240-257 ◽  
Author(s):  
Jan von Plato

Abstract: Gentzen writes in the published version of his doctoral thesis Untersuchungen über das logische Schliessen (Investigations into logical reasoning) that he was able to prove the normalization theorem only for intuitionistic natural deduction, not for classical. To cover the latter, he developed the classical sequent calculus and proved a corresponding theorem, the famous cut elimination result. Its proof was organized so that a cut elimination result for an intuitionistic sequent calculus came out as a special case, namely the one in which the sequents have at most one formula in the right, succedent part. Thus there was no need for a direct proof of normalization for intuitionistic natural deduction. The only traces of such a proof in the published thesis are some convertibilities, such as when an implication introduction is followed by an implication elimination [1934–35, II.5.13]. It remained to Dag Prawitz in 1965 to work out a proof of normalization. Another, less well-known proof was also given in 1965 by Andres Raggio. In February 2005 we found an early handwritten version of Gentzen's thesis, with exactly the above title but with rather different contents: most remarkably, it contains a detailed proof of normalization for what became the standard system of natural deduction. The manuscript is located in the Paul Bernays collection at the ETH Zurich, with the signum Hs. 974: 271. Bernays must have obtained it well before he was expelled from Göttingen under the racial laws in April 1933.
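
The convertibility mentioned above can be displayed schematically. The following is a minimal LaTeX sketch (assuming the bussproofs package; the schema is the standard detour conversion, not a transcription from the manuscript):

\documentclass{article}
\usepackage{bussproofs}
\begin{document}
% Detour: an implication introduction immediately followed by an
% implication elimination ...
\begin{prooftree}
  \AxiomC{$[A]^{1}$}
  \noLine \UnaryInfC{$\vdots$}
  \noLine \UnaryInfC{$B$}
  \RightLabel{$\to$I${}^{1}$}
  \UnaryInfC{$A \to B$}
  \AxiomC{$\vdots$}
  \noLine \UnaryInfC{$A$}
  \RightLabel{$\to$E}
  \BinaryInfC{$B$}
\end{prooftree}
% ... converts to the derivation in which the proof of the minor
% premiss A is substituted for the discharged assumption [A]:
\begin{prooftree}
  \AxiomC{$\vdots$}
  \noLine \UnaryInfC{$A$}
  \noLine \UnaryInfC{$\vdots$}
  \noLine \UnaryInfC{$B$}
\end{prooftree}
\end{document}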

Author(s):  
A.M. Ungar

Different presentations of the principles of logic reflect different approaches to the subject itself. The three kinds of system discussed here treat as fundamental not logical truth, but consequence, the relation holding between the premises and conclusion of a valid argument. They are, however, inspired by different conceptions of this relation. Natural deduction rules are intended to formalize the way in which mathematicians actually reason in their proofs. Tableau systems reflect the semantic conception of consequence; their rules may be interpreted as the systematic search for a counterexample to an argument. Finally, sequent calculi were developed for the sake of their metamathematical properties. All three systems employ rules rather than axioms. Each logical constant is governed by a pair of rules which do not involve the other constants and are, in some sense, inverse. Take the implication operator ‘→’, for example. In natural deduction, there is an introduction rule for ‘→’ which gives a sufficient condition for inferring an implication, and an elimination rule which gives the strongest conclusion that can be inferred from a premise having the form of an implication. Tableau systems contain a rule which gives a sufficient condition for an implication to be true, and another which gives a sufficient condition for it to be false. A sequent is an array Γ⊢Δ, where Γ and Δ are lists (or sets) of formulas. Sequent calculi have rules for introducing implication on the left of the ‘⊢’ symbol and on the right. The construction of derivations or tableaus in these systems is often more concise and intuitive than in an axiomatic one, and versions of all three have found their way into introductory logic texts. Furthermore, every natural deduction or sequent derivation can be made more direct by transforming it into a ‘normal form’. In the case of the sequent calculus, this result is known as the cut-elimination theorem. It has been applied extensively in metamathematics, most famously to obtain consistency proofs. The semantic inspiration for the rules of tableau construction suggests a very perspicuous proof of classical completeness, one which can also be adapted to the sequent calculus. The introduction and elimination rules of natural deduction are intuitionistically valid and have suggested an alternative semantics based on a conception of meaning as use. The idea is that the meaning of each logical constant is exhausted by its inferential behaviour and can therefore be characterized by its introduction and elimination rules. Although the discussion below focuses on intuitionistic and classical first-order logic, various other logics have also been formulated as sequent, natural deduction and even tableau systems: modal logics, relevance logic, and infinitary and higher-order logics, for example. There is a gain in understanding the role of the logical constants which comes from formulating introduction and elimination (or left and right) rules for them. Some authors have even suggested that one must be able to do so for an operator to count as logical.
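
As a concrete illustration of the paired rules just described, here is a minimal LaTeX sketch (assuming amsmath; one standard formulation among several) of the introduction and elimination rules for ‘→’ in natural deduction and of the corresponding right and left sequent rules, with the tableau rules indicated in comments:

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Natural deduction: introduction and elimination rules for implication.
\[
\frac{\begin{matrix}[A]\\ \vdots\\ B\end{matrix}}{A \to B}\;(\to\mathrm{I})
\qquad
\frac{A \to B \qquad A}{B}\;(\to\mathrm{E})
\]
% Sequent calculus: rules introducing implication on the right and on
% the left of the turnstile.
\[
\frac{\Gamma, A \vdash \Delta, B}{\Gamma \vdash \Delta, A \to B}\;(\to\mathrm{R})
\qquad
\frac{\Gamma \vdash \Delta, A \qquad \Gamma, B \vdash \Delta}{\Gamma, A \to B \vdash \Delta}\;(\to\mathrm{L})
\]
% Tableau rules (informally): from "A -> B true", branch on "A false"
% or "B true"; from "A -> B false", add "A true" and "B false".
\end{document}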


2017 ◽  
Vol 101 (115) ◽  
pp. 75-98
Author(s):  
Mirjana Borisavljevic

The normalization theorem for the system of extended natural deduction will be proved as a consequence of the cut-elimination theorem, by using the connections between the system of extended natural deduction and a standard system of sequents.


PMLA ◽  
1960 ◽  
Vol 75 (3) ◽  
pp. 309-319 ◽  
Author(s):  
M. E. Grenander

In recent years, critical attention has focussed increasingly on The Princess Casamassima, Henry James's novel of the international revolutionary movement seething beneath the surface of society. The sad wisdom of the mid-twentieth century no longer finds incredible the plot earlier critics dismissed as footling melodrama; and with a recognition of its probability, students of James have undertaken a re-examination of the whole novel. Oddly enough, however, little attention has been paid to its reliance on Roderick Hudson, where the Princess Casamassima first appears. The one significant exception has been a short essay by Louise Bogan, though Christina's complexity and interest have attracted other writers. Yet Roderick Hudson deserves study for its own merits; and, as Miss Bogan has pointed out, the character of the Princess is difficult to interpret unless one also remembers her as Christina Light. It is not true, as Miss Bogan asserts (p. 472), that Christina is “the only figure [James] ever ‘revived’ and carried from one book to another,” for not only do Madame Grandoni and the Prince Casamassima share her transposition, but the sculptor Gloriani, who makes his debut in Roderick Hudson, also reappears in The Ambassadors. But it is true, as Cargill more accurately points out (p. 108), that “Christina is the only major [italics mine] character that James ever revived from an earlier work,” for he questioned the wisdom of indulging wholesale the writer's “revivalist impulse” to “go on with a character.” Hence Christina Light must have struck him as a very special case. He tells us that he felt, “toward the end of ‘Roderick,’ that the Princess Casamassima had been launched, that, wound-up with the right silver key, she would go on a certain time by the motion communicated” (AN, p. 18). In the Preface to The Princess Casamassima he continues this train of thought: Christina Light, “extremely disponible” and knowing herself “striking, in the earlier connexion,… couldn't resign herself not to strike again” (AN, pp. 73, 74).


1977 ◽  
Vol 42 (1) ◽  
pp. 11-28 ◽  
Author(s):  
Jonathan P. Seldin

The sequent calculus formulation (L-formulation) of the theory of functionality without the rules allowing for conversion of subjects of [3, §14E6] fails because the (cut) elimination theorem (ET) fails. This can be seen most easily from the fact that it is easy to prove in the system … and … but not … (as is obvious if α is an atomic type [an F-simple]). The error in the “proof” of ET in [14, §3E6], [3, §14E6], and [7, §9C] occurs in Stage 3, where it is implicitly assumed that if [x]X ≡ [x]Y then X ≡ Y. In the above counterexample, we have [x]x ≡ I ≡ [x](Ix) but x ≢ Ix. Since the formulation of [2, §9F] is not really satisfactory (for reasons stated in [3, §14E]), a new sequent calculus formulation is needed for the case in which the rules for subject conversions are not present. The main part of this paper is devoted to presenting such a formulation and proving it equivalent to the natural deduction formulation (T-formulation). The paper will conclude in §6 with some remarks on the result that every ob (term) with a type (functional character) has a normal form. The conventions and definitions of [3], especially of §12D and Chapter 14, will be used throughout the paper.
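
To make the counterexample concrete, here is a small Haskell sketch (an illustration, not taken from the paper) of a bracket-abstraction algorithm that includes the clause [x](Ux) ≡ U when x does not occur in U. With that clause, both [x]x and [x](Ix) come out as I even though x and Ix are distinct terms, which is why [x]X ≡ [x]Y does not force X ≡ Y:

-- Combinatory terms: variables, the basic combinators I, K, S, and application.
data Term = Var String | I | K | S | App Term Term
  deriving (Eq, Show)

-- Does a variable occur in a term?
occurs :: String -> Term -> Bool
occurs x (Var y)   = x == y
occurs x (App u v) = occurs x u || occurs x v
occurs _ _         = False

-- Bracket abstraction [x]T, with the eta-style clause [x](U x) = U
-- when x does not occur in U (the clause the counterexample turns on).
bracket :: String -> Term -> Term
bracket x (Var y) | x == y       = I
bracket x (App u (Var y))
  | x == y && not (occurs x u)   = u
bracket x t | not (occurs x t)   = App K t
bracket x (App u v)              = App (App S (bracket x u)) (bracket x v)

-- bracket "x" (Var "x")          evaluates to I
-- bracket "x" (App I (Var "x"))  also evaluates to I
-- yet Var "x" /= App I (Var "x"), so equal abstracts need not come
-- from equal terms.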


2009 ◽  
Vol 86 (100) ◽  
pp. 27-34
Author(s):  
Mirjana Borisavljevic

Pairs of systems are considered, each consisting of a system of sequents and a natural deduction system for some part of intuitionistic logic. For each of these pairs of systems it is shown that the normalization theorem is a consequence of the cut-elimination theorem.


2012 ◽  
Vol 18 (3) ◽  
pp. 313-367 ◽  
Author(s):  
Jan von Plato

Abstract: Gentzen's systems of natural deduction and sequent calculus were byproducts in his program of proving the consistency of arithmetic and analysis. It is suggested that the central component in his results on logical calculi was the use of a tree form for derivations. It allows the composition of derivations and the permutation of the order of application of rules, with full control over the structure of derivations as a result. Recently found documents shed new light on the discovery of these calculi. In particular, Gentzen set up five different forms of natural calculi and gave a detailed proof of normalization for intuitionistic natural deduction. An early handwritten manuscript of his thesis shows that a direct translation from natural deduction to the axiomatic logic of Hilbert and Ackermann was, in addition to the influence of Paul Hertz, the second component in the discovery of sequent calculus. A system intermediate between the sequent calculus LI and axiomatic logic, denoted LIG in unpublished sources, is implicit in Gentzen's published thesis of 1934–35. The calculus has half rules, half “groundsequents,” and does not allow full cut elimination. Nevertheless, a translation from LI to LIG in the published thesis gives a subformula property for a complete class of derivations in LIG. After the thesis, Gentzen continued to work on variants of sequent calculi for ten more years, in the hope of finding a consistency proof for arithmetic within an intuitionistic calculus.
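
The composition of derivations in tree form mentioned above can be pictured schematically; the following is a small LaTeX sketch (assuming amsmath; a generic picture, not taken from the newly found documents):

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% A derivation of A from Gamma is grafted onto the places where A
% occurs as an assumption in a derivation of C from A and Delta,
% giving a derivation of C from Gamma and Delta.
\[
\begin{matrix}\Gamma\\ \vdots\\ A\end{matrix}
\quad\text{and}\quad
\begin{matrix}A, \; \Delta\\ \vdots\\ C\end{matrix}
\quad\text{compose to}\quad
\begin{matrix}\Gamma\\ \vdots\\ A \quad \Delta\\ \vdots\\ C\end{matrix}
\]
\end{document}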


2000 ◽  
Vol 10 (1) ◽  
pp. 121-134 ◽  
Author(s):  
HENK BARENDREGT ◽  
SILVIA GHILEZAN

It is well known that there is an isomorphism between natural deduction derivations and typed lambda terms. Moreover, normalising these terms corresponds to eliminating cuts in the equivalent sequent calculus derivations. Several papers have been written on this topic. The correspondence between sequent calculus derivations and natural deduction derivations is, however, not a one-to-one map, which causes some syntactic technicalities. The correspondence is best explained by two extensionally equivalent type assignment systems for untyped lambda terms, one corresponding to natural deduction (λN) and the other to sequent calculus (λL). These two systems constitute different grammars for generating the same (type assignment relation for untyped) lambda terms. The second grammar is ambiguous, but the first one is not. This fact explains the many-to-one correspondence mentioned above. Moreover, the second type assignment system has a ‘cut-free’ fragment (λLcf). This fragment generates exactly the typeable lambda terms in normal form. The cut elimination theorem becomes a simple consequence of the fact that typed lambda terms possess a normal form.
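
As a small illustration of the term side of this correspondence (a generic Haskell sketch with de Bruijn indices and simple types; it is not the λN/λL machinery of the paper), proof normalization appears as repeated beta-reduction of typed terms, and the normal forms are the terms with no remaining detours:

-- Simple types and lambda terms with de Bruijn indices.
data Ty = TVar String | Arr Ty Ty          deriving (Eq, Show)
data Tm = Var Int | Lam Ty Tm | App Tm Tm  deriving (Eq, Show)

-- Standard de Bruijn shifting and substitution.
shift :: Int -> Int -> Tm -> Tm
shift d c (Var k)   = Var (if k >= c then k + d else k)
shift d c (Lam t b) = Lam t (shift d (c + 1) b)
shift d c (App f a) = App (shift d c f) (shift d c a)

subst :: Int -> Tm -> Tm -> Tm
subst j s (Var k) | k == j    = s
                  | otherwise = Var k
subst j s (Lam t b) = Lam t (subst (j + 1) (shift 1 0 s) b)
subst j s (App f a) = App (subst j s f) (subst j s a)

-- One leftmost beta step; a redex (\x:t. b) a is the term image of an
-- implication introduction immediately followed by an elimination.
step :: Tm -> Maybe Tm
step (App (Lam _ b) a) = Just (shift (-1) 0 (subst 0 (shift 1 0 a) b))
step (App f a)         = case step f of
                           Just f' -> Just (App f' a)
                           Nothing -> App f <$> step a
step (Lam t b)         = Lam t <$> step b
step (Var _)           = Nothing

-- Normalization by iterating beta steps; it terminates on typeable terms.
normalize :: Tm -> Tm
normalize t = maybe t normalize (step t)

-- Curry-Howard reading: type checking reconstructs a natural deduction
-- derivation of the term's type from the context.
typeOf :: [Ty] -> Tm -> Maybe Ty
typeOf ctx (Var k)
  | k >= 0 && k < length ctx = Just (ctx !! k)
  | otherwise                = Nothing
typeOf ctx (Lam t b)         = Arr t <$> typeOf (t : ctx) b
typeOf ctx (App f a) =
  case (typeOf ctx f, typeOf ctx a) of
    (Just (Arr t1 t2), Just t1') | t1 == t1' -> Just t2
    _                                        -> Nothing

-- e.g. normalize (App (Lam (TVar "a") (Var 0)) (Var 0)) removes the
-- detour and returns Var 0, with the same type before and after.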


Studia Logica ◽  
2021 ◽  
Author(s):  
Martín Figallo

Abstract: The tetravalent modal logic (TML) is one of the two logics defined by Font and Rius (J Symb Log 65(2):481–518, 2000) (the other is the normal tetravalent modal logic TML^N) in connection with Monteiro's tetravalent modal algebras. These logics are expansions of the well-known Belnap–Dunn four-valued logic that combine a many-valued character (tetravalence) with a modal character. In fact, TML is the logic that preserves degrees of truth with respect to tetravalent modal algebras. As Font and Rius observed, the connection between the logic TML and the algebras is not as good as in TML^N, but, as compensation, it has better proof-theoretic behavior, since it has a strongly adequate Gentzen calculus (see Font and Rius in J Symb Log 65(2):481–518, 2000). In this work, we prove that the sequent calculus given by Font and Rius does not enjoy the cut-elimination property. Then, using a general method proposed by Avron et al. (Log Univ 1:41–69, 2006), we provide a sequent calculus for TML with the cut-elimination property. Finally, inspired by the latter, we present a natural deduction system, sound and complete with respect to the tetravalent modal logic.
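
For orientation, a small Haskell sketch of the Belnap–Dunn four-valued basis that TML expands (the lattice connectives below are the standard ones; the possibility operator nabla reflects one reading of Monteiro's operator on the four-element algebra and is an assumption, not part of the paper's calculi):

-- The four values: false, neither, both, true.
data Four = F | N | B | T deriving (Eq, Show)

-- Truth ordering: F below N and B, both below T; N and B incomparable.
leq :: Four -> Four -> Bool
leq F _ = True
leq _ T = True
leq x y = x == y

conj, disj :: Four -> Four -> Four   -- meet and join in the truth ordering
conj x y | leq x y   = x
         | leq y x   = y
         | otherwise = F             -- meet of the incomparable N and B
disj x y | leq x y   = y
         | leq y x   = x
         | otherwise = T             -- join of the incomparable N and B

neg :: Four -> Four                  -- De Morgan negation
neg T = F
neg F = T
neg x = x                            -- fixes B and N

nabla :: Four -> Four                -- assumed possibility operator: F stays F,
nabla F = F                          -- every other value goes to T
nabla _ = T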


2020 ◽  
Vol 30 (1) ◽  
pp. 157-174 ◽  
Author(s):  
Harley Eades III ◽  
Valeria de Paiva

Abstract: Full intuitionistic linear logic (FILL) was first introduced by Hyland and de Paiva, and went against the then-current belief that it was not possible to incorporate all of the linear connectives, e.g. tensor, par and implication, into an intuitionistic linear logic. Bierman showed that their formalization of FILL did not enjoy cut elimination as such, but Bellin proposed a small change to the definition of FILL, using proof nets, that regains cut elimination. In this note we adopt Bellin’s proposed change and give a direct proof of cut elimination for the sequent calculus. Then we show that a categorical model of FILL in the basic dialectica category is also a linear/non-linear model of Benton and a full tensor model of Melliès’ and Tabareau’s tensorial logic. We give a double-negation translation of linear logic into FILL that explicitly uses par in addition to tensor. Lastly, we introduce a new library to be used in the proof assistant Agda for proving properties of dialectica categories.


2019 ◽  
Vol 12 (4) ◽  
pp. 607-636 ◽  
Author(s):  
EDI PAVLOVIĆ ◽  
NORBERT GRATZL

Abstract: This article investigates the proof theory of the Quantified Argument Calculus (Quarc) as developed and systematically studied by Hanoch Ben-Yami [3, 4]. Ben-Yami makes use of natural deduction (Suppes–Lemmon style); we, however, have chosen a sequent calculus presentation, which allows for proofs of a multitude of significant meta-theoretic results with minor modifications to Gentzen's original framework, i.e., LK. As will be made clear in the course of the article, LK-Quarc will enjoy cut elimination and its corollaries (including the subformula property and thus consistency).

