Syntax and semantics of simple paracomplete logics

2013 ◽  
Vol 19 ◽  
pp. 325-333
Author(s):  
В.М. Попов ◽  
В.О. Шангин

For an arbitrary fixed element $\beta$ in $\{1, 2, 3, \ldots, \omega\}$, both a sequent calculus and a natural deduction calculus axiomatising the simple paracomplete logic $I_{2,\beta}$ are built. Additionally, a valuation semantics adequate for the logic $I_{2,\beta}$ is constructed. For an arbitrary fixed element $\gamma$ in $\{1, 2, 3, \ldots\}$, a cortege semantics adequate for the logic $I_{2,\gamma}$ is described. A number of results obtainable with the axiomatisations and semantics in question are formulated.
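The defining feature of a paracomplete logic is that a formula and its negation may both fail to hold under some valuation, so the law of excluded middle is not valid. As a generic illustration only (the three-valued strong Kleene matrix, not the valuation or cortege semantics constructed in the paper for $I_{2,\beta}$), take the values $\{0, \tfrac{1}{2}, 1\}$ with $1$ the sole designated value and
\[
v(\neg A) = 1 - v(A), \qquad v(A \lor B) = \max\bigl(v(A), v(B)\bigr).
\]
If $v(A) = \tfrac{1}{2}$, then $v(A \lor \neg A) = \tfrac{1}{2}$, which is undesignated, so $A \lor \neg A$ fails to be valid in this matrix.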

2021 ◽  
Vol 43 (2) ◽  
pp. 1-55
Author(s):  
Bernardo Toninho ◽  
Nobuko Yoshida

This work exploits the logical foundation of session types to determine what kind of type discipline for the π-calculus can exactly capture, and is captured by, λ-calculus behaviours. Leveraging the proof theoretic content of the soundness and completeness of sequent calculus and natural deduction presentations of linear logic, we develop the first mutually inverse and fully abstract processes-as-functions and functions-as-processes encodings between a polymorphic session π-calculus and a linear formulation of System F. We are then able to derive results of the session calculus from the theory of the λ-calculus: (1) we obtain a characterisation of inductive and coinductive session types via their algebraic representations in System F; and (2) we extend our results to account for value and process passing, entailing strong normalisation.
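The paper's encodings relate a polymorphic session π-calculus to a linear formulation of System F. Purely as a simpler point of reference, a functions-as-processes translation can be written as a plain syntax-to-syntax function; the Haskell sketch below renders Milner's classic call-by-name encoding of the untyped λ-calculus into the π-calculus. The datatypes Lam and Pi and the naive fresh-name supply are illustrative assumptions; nothing here models the session typing, polymorphism, or full-abstraction results of the paper.

-- Toy syntax for the untyped lambda-calculus and a fragment of the pi-calculus.
type Name = String

data Lam = Var Name | Abs Name Lam | App Lam Lam

data Pi
  = Out Name [Name] Pi   -- output:      x<y1,...,yn>. P
  | Inp Name [Name] Pi   -- input:       x(y1,...,yn). P
  | Repl Pi              -- replication: !P
  | Par Pi Pi            -- parallel:    P | Q
  | New Name Pi          -- restriction: (nu x) P
  | Nil                  -- inaction:    0

-- encode m u: a process that runs the term m and offers its behaviour on channel u
-- (Milner's call-by-name functions-as-processes translation).
encode :: Lam -> Name -> Pi
encode (Var x)   u = Out x [u] Nil
encode (Abs x m) u = Inp u [x, v] (encode m v)
  where v = fresh u
encode (App m n) u =
  New v (Par (encode m v)
             (New x (Par (Out v [x, u] Nil)
                         (Repl (Inp x [w] (encode n w))))))
  where
    v = fresh u
    x = fresh v
    w = fresh x

-- A naive fresh-name supply, adequate only for this small illustration.
fresh :: Name -> Name
fresh n = n ++ "'"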


1977 ◽  
Vol 42 (1) ◽  
pp. 11-28 ◽  
Author(s):  
Jonathan P. Seldin

The sequent calculus formulation (L-formulation) of the theory of functionality without the rules allowing for conversion of subjects of [3, §14E6] fails because the (cut) elimination theorem (ET) fails. This can be most easily seen from the fact that it is easy to prove in the system … and … but not … (as is obvious if α is an atomic type [an F-simple]). The error in the “proof” of ET in [14, §3E6], [3, §14E6], and [7, §9C] occurs in Stage 3, where it is implicitly assumed that if [x]X ≡ [x]Y then X ≡ Y. In the above counterexample, we have [x]x ≡ I ≡ [x](Ix) but x ≢ Ix. Since the formulation of [2, §9F] is not really satisfactory (for reasons stated in [3, §14E]), a new sequent calculus formulation is needed for the case in which the rules for subject conversions are not present. The main part of this paper is devoted to presenting such a formulation and proving it equivalent to the natural deduction formulation (T-formulation). The paper will conclude in §6 with some remarks on the result that every ob (term) with a type (functional character) has a normal form. The conventions and definitions of [3], especially of §12D and Chapter 14, will be used throughout the paper.
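The counterexample turns on the fact that bracket abstraction is not injective: distinct terms can have the same abstract. A minimal Haskell sketch of one standard abstraction algorithm with an η-style clause (the datatype and function names are illustrative assumptions, not Seldin's formal apparatus) shows that both x and Ix abstract to I:

-- Combinatory terms and one standard bracket-abstraction algorithm.
data Term = V String | I | K | S | App Term Term
  deriving (Eq, Show)

free :: String -> Term -> Bool
free x (V y)     = x == y
free x (App m n) = free x m || free x n
free _ _         = False

-- The eta-style clause ([x](M x) = M when x is not free in M) is what makes
-- abstraction non-injective.
abstract :: String -> Term -> Term
abstract x (V y) | x == y                         = I                 -- [x]x      = I
abstract x (App m (V y)) | x == y, not (free x m) = m                 -- [x](M x)  = M
abstract x m | not (free x m)                     = App K m           -- [x]M      = K M
abstract x (App m n) = App (App S (abstract x m)) (abstract x n)      -- [x](M N)  = S [x]M [x]N
abstract _ m         = m                                              -- unreachable for the terms above

-- abstract "x" (V "x")         == I
-- abstract "x" (App I (V "x")) == I
-- yet V "x" /= App I (V "x"):  [x]X == [x]Y does not force X == Y.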


2008 ◽  
Vol 14 (2) ◽  
pp. 240-257 ◽  
Author(s):  
Jan von Plato

Gentzen writes in the published version of his doctoral thesis Untersuchungen über das logische Schliessen (Investigations into logical reasoning) that he was able to prove the normalization theorem only for intuitionistic natural deduction, but not for classical. To cover the latter, he developed classical sequent calculus and proved a corresponding theorem, the famous cut elimination result. Its proof was organized so that a cut elimination result for an intuitionistic sequent calculus came out as a special case, namely the one in which the sequents have at most one formula in the right, succedent part. Thus, there was no need for a direct proof of normalization for intuitionistic natural deduction. The only traces of such a proof in the published thesis are some convertibilities, such as when an implication introduction is followed by an implication elimination [1934–35, II.5.13]. It remained to Dag Prawitz in 1965 to work out a proof of normalization. Another, less known proof was given also in 1965 by Andres Raggio. We found in February 2005 an early handwritten version of Gentzen's thesis, with exactly the above title, but with rather different contents: most remarkably, it contains a detailed proof of normalization for what became the standard system of natural deduction. The manuscript is located in the Paul Bernays collection at the ETH-Zurich with the signum Hs. 974: 271. Bernays must have gotten it well before the time of his being expelled from Göttingen on the basis of the racial laws in April 1933.


Author(s):  
Neil Tennant

Parallelized elimination rules in natural deduction correspond to Left rules in the sequent calculus; and introduction rules correspond to Right rules. These rules may be construed as inductive clauses in the inductive definition of the notion of sequent proof. There is a natural isomorphism between natural deductions in Core Logic and the sequent proofs that correspond to them. We examine the relations, between sequents, of concentration and dilution; and describe what it is for one sequent to strengthen another. We examine some possible global restrictions on proof-formation, designed to prevent proofs from proving dilutions of sequents already proved by a subproof. We establish the important result that the sequent rules of Core Logic maintain concentration, and we explain its importance for automated proof-search.
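The rule correspondence in the opening sentence is easiest to see for conjunction: the parallelized elimination rule discharges both conjuncts as assumptions of its minor premise, exactly as the Left rule moves both conjuncts into the antecedent. The display below is a generic illustration of that match, not a statement of Core Logic's own rules, which carry further relevance constraints:
\[
\frac{A \wedge B \qquad
      \begin{array}{c} [A]\;[B] \\ \vdots \\ C \end{array}}
     {C}\ (\wedge\mathrm{E})
\qquad\qquad
\frac{\Gamma, A, B \vdash C}
     {\Gamma, A \wedge B \vdash C}\ (\wedge\mathrm{L})
\]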


1987 ◽  
Vol 52 (3) ◽  
pp. 665-680 ◽  
Author(s):  
Neil Tennant

Relevance logic began in an attempt to avoid the so-called fallacies of relevance. These fallacies can be in implicational form or in deductive form. For example, Lewis's first paradox can beset a system in implicational form, in that the system contains as a theorem the formula (A & ∼A) → B; or it can beset it in deductive form, in that the system allows one to deduce B from the premisses A, ∼A. Relevance logic in the tradition of Anderson and Belnap has been almost exclusively concerned with characterizing a relevant conditional. Thus it has attacked the problem of relevance in its implicational form. Accordingly for a relevant conditional → one would not have as a theorem the formula (A & ∼A) → B. Other theorems even of minimal logic would also be lacking. Perhaps most important among these is the formula (A → (B → A)). It is also a well-known feature of their system R that it lacks the intuitionistically valid formula ((A ∨ B) & ∼A) → B (disjunctive syllogism). But it is not the case that any relevance logic worth the title even has to concern itself with the conditional, and hence with the problem in its implicational form. The problem arises even for a system without the conditional primitive. It would still be an exercise in relevance logic, broadly construed, to formulate a deductive system free of the fallacies of relevance in deductive form even if this were done in a language whose only connectives were, say, &, ∨ and ∼. Solving the problem of relevance in this more basic deductive form is arguably a precondition for solving it for the conditional, if we suppose (as is reasonable) that the relevant conditional is to be governed by anything like the rule of conditional proof.
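The deductive form of Lewis's first paradox can be made explicit; the following is the standard derivation, reproduced here only to show why disjunctive syllogism is the step a relevantist must scrutinise:
\[
\begin{array}{lll}
1. & A & \text{premiss}\\
2. & \sim A & \text{premiss}\\
3. & A \vee B & \text{from 1 by } \vee\text{-introduction}\\
4. & B & \text{from 3 and 2 by disjunctive syllogism}
\end{array}
\]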


2012 ◽  
Vol 18 (3) ◽  
pp. 313-367 ◽  
Author(s):  
Jan von Plato

Gentzen's systems of natural deduction and sequent calculus were byproducts in his program of proving the consistency of arithmetic and analysis. It is suggested that the central component in his results on logical calculi was the use of a tree form for derivations. It allows the composition of derivations and the permutation of the order of application of rules, with a full control over the structure of derivations as a result. Recently found documents shed new light on the discovery of these calculi. In particular, Gentzen set up five different forms of natural calculi and gave a detailed proof of normalization for intuitionistic natural deduction. An early handwritten manuscript of his thesis shows that a direct translation from natural deduction to the axiomatic logic of Hilbert and Ackermann was, in addition to the influence of Paul Hertz, the second component in the discovery of sequent calculus. A system intermediate between the sequent calculus LI and axiomatic logic, denoted LIG in unpublished sources, is implicit in Gentzen's published thesis of 1934–35. The calculus has half rules, half “groundsequents,” and does not allow full cut elimination. Nevertheless, a translation from LI to LIG in the published thesis gives a subformula property for a complete class of derivations in LIG. After the thesis, Gentzen continued to work on variants of sequent calculi for ten more years, in the hope of finding a consistency proof for arithmetic within an intuitionistic calculus.


1978 ◽  
Vol 43 (4) ◽  
pp. 643-649 ◽  
Author(s):  
Jonathan P. Seldin

In [1, §14E], a sequent calculus formulation (L-formulation) of type assignment (theory of functionality) is given for a system based either on a system of combinators with strong reduction or on a system of λη-calculus provided that the rule for subject conversion (which says that if X has type α and X cnv Y then Y has type α) is postulated for the system. This sequent calculus formulation does not work for a system based on the λβ-calculus. In [2] I introduced a sequent calculus formulation for a system without the rule of subject conversion based on any of the three systems mentioned above. Further, in [2, §5] I pointed out that if proper inclusions of the form of the statement that λx·x is a function from α to β are postulated, then functions are identified with their restrictions in the λη-calculus but not in the λβ-calculus, and that therefore there is some interest in having a sequent calculus formulation of type assignment with the rule of subject conversion for systems based on the λβ-calculus. In this paper, such a system is presented, the elimination theorem (Gentzen's Hauptsatz) is proved for it, and it is proved equivalent to the natural deduction formulation of [1, §14D]. I shall assume familiarity with the λβ-calculus, and shall use (with minor modifications) the notational conventions of [1]. Hence, the theory of type assignment (theory of functionality) will be based on an atomic constant F such that if α and β are types then Fαβ represents roughly the type of functions from α to β (more exactly it represents the type of functions whose domain includes α and under which the image of α is included in β).
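For orientation, the two kinds of rule at issue can be displayed schematically in the notation just introduced: F-elimination types an application, and subject conversion transfers a type across conversion. This is a generic rendering, not the paper's exact sequent rules:
\[
\frac{\vdash \mathrm{F}\alpha\beta\, X \qquad \vdash \alpha\, Y}
     {\vdash \beta\,(XY)}
\qquad\qquad
\frac{\vdash \alpha\, X \qquad X \ \mathrm{cnv}\ Y}
     {\vdash \alpha\, Y}
\]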

