Stabilization — an alternative to double-negation translation for classical natural deduction

2017 ◽ pp. 167-199 ◽ Author(s): Ralph Matthes
1999 ◽ Vol 34 (1) ◽ pp. 7-23 ◽ Author(s): Stephen Read

In order to explicate Gentzen’s famous remark that the introduction-rules for logical constants give their meaning, the elimination-rules being simply consequences of the meaning so given, we develop natural deduction rules for Sheffer’s stroke, alternative denial. The first system turns out to lack Double Negation. Strengthening the introduction-rules by allowing the introduction of Sheffer’s stroke into a disjunctive context produces a complete system of classical logic, one which preserves the harmony between the rules which Gentzen wanted: all indirect proof reduces to direct proof.
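A truth-table illustration of why a deduction system built on the stroke alone can aim at full classical logic (this sketches functional completeness only, not Read's introduction and elimination rules): every classical connective, including negation, is definable from alternative denial, so Double Negation is expressible even though the first system fails to prove it.

```python
# Sheffer's stroke (alternative denial): p | q is true iff not both p and q are true.
def nand(p, q):
    return not (p and q)

# The other classical connectives are all definable from the stroke alone:
def neg(p):        return nand(p, p)                    # ¬p     ≡ p | p
def conj(p, q):    return nand(nand(p, q), nand(p, q))  # p ∧ q  ≡ (p|q) | (p|q)
def disj(p, q):    return nand(nand(p, p), nand(q, q))  # p ∨ q  ≡ (p|p) | (q|q)
def implies(p, q): return nand(p, nand(q, q))           # p → q  ≡ p | (q|q)

# Check the definitions against the intended truth tables.
for p in (False, True):
    assert neg(p) == (not p)
    for q in (False, True):
        assert conj(p, q) == (p and q)
        assert disj(p, q) == (p or q)
        assert implies(p, q) == ((not p) or q)

# Double negation elimination, ¬¬p → p, is a classical tautology:
assert all(implies(neg(neg(p)), p) for p in (False, True))
```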


2008 ◽ Vol 67 (2) ◽ pp. 119-123 ◽ Author(s): Grégory Lo Monaco, Florent Lheureux, Séverine Halimi-Falkowicz

Two techniques allow systematic identification of the central core of a social representation: the calling-into-question technique (mise en cause, MEC) and the basic cognitive schemes model (schèmes cognitifs de base, SCB). Despite this contribution, both techniques have drawbacks: the MEC because of its double-negation principle, and the SCB because of its lengthy administration. A new technique has been developed: the test of context independence (test d’indépendance au contexte, TIC). It aims to capture the trans-situational or contingent character of representational elements while presenting a lower perceived cognitive cost. Two objects of representation were studied with a student population. The results reveal that the TIC appears to participants to be cognitively less costly than the MEC. Moreover, the TIC identifies the same central core as the MEC.


Author(s): Timothy Williamson

The book argues that our use of conditionals is governed by imperfectly reliable heuristics, in the psychological sense of fast and frugal (or quick and dirty) ways of assessing them. The primary heuristic is this: to assess ‘If A, C’, suppose A and on that basis assess C; whatever attitude you take to C conditionally on A (such as acceptance, rejection, or something in between) take unconditionally to ‘If A, C’. This heuristic yields both the equation of the probability of ‘If A, C’ with the conditional probability of C on A and standard natural deduction rules for the conditional. However, these results can be shown to make the heuristic implicitly inconsistent, and so less than fully reliable. There is also a secondary heuristic: pass conditionals freely from one context to another under normal conditions for acceptance of sentences on the basis of memory and testimony. The effect of the secondary heuristic is to undermine interpretations on which ‘if’ introduces a special kind of context-sensitivity. On the interpretation which makes best sense of the two heuristics, ‘if’ is simply the truth-functional conditional. Apparent counterexamples to truth-functionality are artefacts of reliance on the primary heuristic in cases where it is unreliable. The second half of the book concerns counterfactual conditionals, as expressed with ‘if’ and ‘would’. It argues that ‘would’ is an independently meaningful modal operator for contextually restricted necessity: the meaning of counterfactuals is simply that derived compositionally from the meanings of their constituents, including ‘if’ and ‘would’, making them contextually restricted strict conditionals.
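The tension the book identifies can be made concrete numerically: the probability of the truth-functional conditional ¬A ∨ C and the conditional probability of C given A come apart even on a simple uniform distribution. A small illustrative sketch (an assumption of this example, not a calculation from the book):

```python
from itertools import product
from fractions import Fraction

# Uniform distribution over the four truth-value assignments to A and C.
worlds = list(product([False, True], repeat=2))

def prob(event):
    """Probability of an event (a predicate on the truth values of A and C)."""
    hits = sum(1 for a, c in worlds if event(a, c))
    return Fraction(hits, len(worlds))

# Truth-functional reading: 'If A, C' is ¬A ∨ C.
p_material = prob(lambda a, c: (not a) or c)                # 3/4
# Conditional probability of C on A: P(A ∧ C) / P(A).
p_cond = prob(lambda a, c: a and c) / prob(lambda a, c: a)  # 1/2

assert p_material != p_cond
```

So a heuristic that equates the probability of 'If A, C' with P(C | A) cannot agree everywhere with a truth-functional 'if', which is one face of the implicit inconsistency the book diagnoses.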


2021 ◽ pp. 1-22 ◽ Author(s): Shawn Standefer

Abstract Anderson and Belnap presented indexed Fitch-style natural deduction systems for the relevant logics R, E, and T. This work was extended by Brady to cover a range of relevant logics. In this paper I present indexed tree natural deduction systems for the Anderson–Belnap–Brady systems and show how to translate proofs in one format into proofs in the other, which establishes the adequacy of the tree systems.


Linguistics ◽ 2020 ◽ Vol 58 (4) ◽ pp. 967-1008 ◽ Author(s): Mena B. Lafkioui, Vermondo Brugnatelli

Abstract Double and triple negation marking is an ancient and deep-rooted feature that is attested in almost the entire Berber-speaking area (North Africa and diaspora), regardless of the type of negators in use, i.e., discontinuous markers (preverbal and postverbal negators) and dedicated negative verb stem alternations. In this article, we deal with the main stages that have led to the present Berber negation patterns and we argue, from a typological viewpoint, that certain morphophonetic mechanisms are to be regarded as a hitherto overlooked source for new negators. Moreover, we present a number of motivations that support the hypothesis that, in Berber, those languages with both a preverbal and a postverbal negator belong to a diachronic stage prior to the attested languages with a preverbal negator only. Consequently, the study demonstrates that the Jespersen Cycle has returned to its starting point in certain Berber languages. In doing so, we also show that Berber is to be regarded as a substrate in the development of double negation in North African Arabic. In addition, the study accounts for the asymmetric nature of Berber negation, although some new developments towards more symmetrical negation configurations are also attested.


Mathematics ◽ 2021 ◽ Vol 9 (4) ◽ pp. 385 ◽ Author(s): Hyeonseung Im

A double negation translation (DNT) embeds classical logic into intuitionistic logic. Such translations correspond to continuation passing style (CPS) transformations in programming languages via the Curry-Howard isomorphism. A selective CPS transformation uses a type and effect system to selectively translate only nontrivial expressions possibly with computational effects into CPS functions. In this paper, we review the conventional call-by-value (CBV) CPS transformation and its corresponding DNT, and provide a logical account of a CBV selective CPS transformation by defining a selective DNT via the Curry-Howard isomorphism. By using an annotated proof system derived from the corresponding type and effect system, our selective DNT translates classical proofs into equivalent intuitionistic proofs, which are smaller than those obtained by the usual DNTs. We believe that our work can serve as a reference point for further study on the Curry-Howard isomorphism between CPS transformations and DNTs.
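As an illustrative sketch of the correspondence (not the paper's selective transformation), a call-by-value CPS translation sends a term of type A to one of type (A → R) → R, which is exactly the double-negation translation of A once ¬A is read as A → R. In Python:

```python
# Call-by-value CPS sketch for a tiny expression language.
# Under Curry-Howard, a CPS term with answer type R inhabits (A -> R) -> R,
# i.e. ¬¬A when ¬A is read as A -> R.

def cps_value(v):
    """A value is trivial: it just passes itself to its continuation."""
    return lambda k: k(v)

def cps_add(m, n):
    """CBV order: evaluate m, then n, then hand the sum to the continuation."""
    return lambda k: m(lambda a: n(lambda b: k(a + b)))

# Direct-style 1 + (2 + 3), translated to CPS:
expr = cps_add(cps_value(1), cps_add(cps_value(2), cps_value(3)))
result = expr(lambda x: x)  # run with the identity continuation
assert result == 6
```

A selective transformation in this spirit would leave trivial subterms such as `cps_value(1)` in direct style and only translate subterms that may perform effects, which is why the resulting proofs (terms) are smaller than those produced by a whole-program translation.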


2021 ◽ Vol 43 (2) ◽ pp. 1-55 ◽ Author(s): Bernardo Toninho, Nobuko Yoshida

This work exploits the logical foundation of session types to determine what kind of type discipline for the λ-calculus can exactly capture, and is captured by, π-calculus behaviours. Leveraging the proof-theoretic content of the soundness and completeness of sequent calculus and natural deduction presentations of linear logic, we develop the first mutually inverse and fully abstract processes-as-functions and functions-as-processes encodings between a polymorphic session π-calculus and a linear formulation of System F. We are then able to derive results of the session calculus from the theory of the λ-calculus: (1) we obtain a characterisation of inductive and coinductive session types via their algebraic representations in System F; and (2) we extend our results to account for value and process passing, entailing strong normalisation.
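The "algebraic representations in System F" behind point (1) are the familiar Church-style impredicative encodings of inductive types. As a loose illustration only (untyped Python lambdas standing in for System F terms; no claim about the paper's session-typed development), the inductive type Nat is represented as ∀X. X → (X → X) → X:

```python
# Church encoding sketch: a natural number n is the function that, given a
# "zero" case z and a "successor" step s, folds s over z exactly n times.
zero = lambda z: lambda s: z
succ = lambda n: (lambda z: lambda s: s(n(z)(s)))

def to_int(n):
    """Interpret a Church numeral by folding with ordinary integers."""
    return n(0)(lambda x: x + 1)

two = succ(succ(zero))
assert to_int(two) == 2
```

An inhabitant of the encoding is precisely its own fold (recursor), which is what makes such representations a natural bridge between inductive session types and System F.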


2015 ◽ Vol 8 (2) ◽ pp. 296-305 ◽ Author(s): Nissim Francez

Abstract The paper proposes an extension of the definition of a canonical proof, central to proof-theoretic semantics, to a definition of a canonical derivation from open assumptions. The impact of the extension on the definition of (reified) proof-theoretic meaning of logical constants is discussed. The extended definition also sheds light on a puzzle regarding the definition of local-completeness of a natural-deduction proof-system, underlying its harmony.


Lingua ◽ 2015 ◽ Vol 163 ◽ pp. 75-107 ◽ Author(s): Viviane Déprez, Susagna Tubau, Anne Cheylus, M. Teresa Espinal
