Analytic natural deduction

1965 ◽  
Vol 30 (2) ◽  
pp. 123-139 ◽  
Author(s):  
Raymond M. Smullyan

We consider some natural deduction systems for quantification theory whose only quantificational rules involve elimination of quantifiers. By imposing certain restrictions on the rules, we obtain a system which we term Analytic Natural Deduction; it has the property that the only formulas used in the proof of a given formula X are either subformulas of X, or negations of subformulas of X. By imposing further restrictions, we obtain a canonical procedure which is bound to terminate if the formula being tested is valid. The procedure (ultimately in the spirit of Herbrand [1]) can be thought of as a partial linearization of the method of semantical tableaux [2], [3]. A further linearization leads to a variant of Gentzen's system which we shall study in a sequel.

The completeness theorem for semantical tableaux rests essentially on König's lemma on infinite graphs [4]. Our completeness theorem for natural deduction uses, as a counterpart to König's lemma, a lemma on infinite “nest structures”, as they are to be defined. These structures can be looked at as the underlying combinatorial basis of a wide variety of natural deduction systems.

In § 1 we study these nest structures in complete abstraction from quantification theory; the results of this section are of a purely combinatorial nature. The applications to quantification theory are given in § 2.


1950 ◽  
Vol 15 (2) ◽  
pp. 93-102 ◽  
Author(s):  
W. V. Quine

For Gentzen's natural deduction, a formalized method of deduction in quantification theory dating from 1934, these important advantages may be claimed: it corresponds more closely than other methods of formalized quantification theory to habitual unformalized modes of reasoning, and it consequently tends to minimize the false moves involved in seeking to construct proofs. The object of this paper is to present and justify a simplification of Gentzen's method, to the end of enhancing the advantages just claimed. No acquaintance with Gentzen's work will be presupposed.

A further advantage of Gentzen's method, also somewhat enhanced in my revision of the method, is relative brevity of proofs. In the more usual systematizations of quantification theory, theorems are derived from axiom schemata by proofs which, if rendered in full, would quickly run to unwieldy lengths. Consequently an abbreviative expedient is usually adopted which consists in preserving and numbering theorems for reference in proofs of subsequent theorems. Further brevity is commonly gained by establishing metatheorems, or derived rules, for reference in proving subsequent theorems. In natural deduction, on the other hand, proofs tend to be so short that the abbreviative expedients just now mentioned may conveniently be dispensed with—at least until theorems of extraordinary complexity are embarked upon. In natural deduction accordingly it is customary to start each argument from scratch, without benefit of accumulated theorems or derived rules.



1968 ◽  
Vol 32 (4) ◽  
pp. 480-504 ◽  
Author(s):  
J. Jay Zeman

The “traditional” method of presenting the subject-matter of symbolic logic involves setting down, first of all, a basis for a propositional calculus—which basis might be a system of natural deduction, an axiom system, or a rule concerning tautologous formulas. The next step, ordinarily, consists of the introduction of quantifiers into the symbol-set of the system, and the stating of axioms or rules for quantification. In this paper I shall propose a system somewhat different from the ordinary; this system has rules for quantification and is, indeed, equivalent to classical quantification theory. It departs from the usual, however, in that it has no primitive quantifiers.



1966 ◽  
Vol 31 (3) ◽  
pp. 322-324 ◽  
Author(s):  
Raymond M. Smullyan

Our terminology and notation are the same as those of [1], of which this note is a sequel.

We wish to show that if we take the natural deduction system (N) described in [1], and delete the rules for the quantifiers, we obtain a complete system for propositional logic. [Of course we now construe “X”, “Y”, “Z” as syntactic variables ranging over formulas of propositional logic, rather than sentences of quantification theory.] Moreover, the system serves as a neat decision procedure.
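As an illustration of the kind of analytic decision procedure these abstracts describe, here is a minimal signed-tableau validity checker for propositional logic. It is a sketch, not Smullyan's system (N): the tuple encoding of formulas ('not', 'and', 'or', 'imp') and the function names are assumptions of this example. The analytic property is visible in the rules: every formula placed on a branch is a subformula of the input or the negation of one.

```python
def closes(branch):
    """A branch closes when it contains both X and ('not', X)."""
    return any(('not', f) in branch for f in branch)

def expand(branch):
    """Expand the first non-literal formula on the branch; return the
    resulting sub-branches, or None if the branch is fully expanded."""
    for f in branch:
        if isinstance(f, tuple):
            op = f[0]
            rest = [g for g in branch if g != f]
            if op == 'and':                       # alpha rule
                return [rest + [f[1], f[2]]]
            if op == 'or':                        # beta rule
                return [rest + [f[1]], rest + [f[2]]]
            if op == 'imp':                       # X -> Y behaves as ~X v Y
                return [rest + [('not', f[1])], rest + [f[2]]]
            if op == 'not':
                g = f[1]
                if isinstance(g, tuple):
                    if g[0] == 'not':             # double negation
                        return [rest + [g[1]]]
                    if g[0] == 'and':             # ~(X & Y): branch on ~X, ~Y
                        return [rest + [('not', g[1])], rest + [('not', g[2])]]
                    if g[0] == 'or':              # ~(X v Y): add ~X and ~Y
                        return [rest + [('not', g[1]), ('not', g[2])]]
                    if g[0] == 'imp':             # ~(X -> Y): add X and ~Y
                        return [rest + [g[1], ('not', g[2])]]
    return None

def valid(formula):
    """X is valid iff every branch of the tableau for ~X closes."""
    branches = [[('not', formula)]]
    while branches:
        b = branches.pop()
        if closes(b):
            continue
        subs = expand(b)
        if subs is None:   # open, fully expanded branch: a countermodel exists
            return False
        branches.extend(subs)
    return True
```

Since each rule replaces a formula by strictly smaller ones, the procedure always terminates, which is what makes the quantifier-free fragment a decision procedure rather than merely a complete proof system.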





1992 ◽  
Author(s):  
Richard Statman ◽  
Gilles Dowek


Author(s):  
Timothy Williamson

The book argues that our use of conditionals is governed by imperfectly reliable heuristics, in the psychological sense of fast and frugal (or quick and dirty) ways of assessing them. The primary heuristic is this: to assess ‘If A, C’, suppose A and on that basis assess C; whatever attitude you take to C conditionally on A (such as acceptance, rejection, or something in between) take unconditionally to ‘If A, C’. This heuristic yields both the equation of the probability of ‘If A, C’ with the conditional probability of C on A and standard natural deduction rules for the conditional. However, these results can be shown to make the heuristic implicitly inconsistent, and so less than fully reliable. There is also a secondary heuristic: pass conditionals freely from one context to another under normal conditions for acceptance of sentences on the basis of memory and testimony. The effect of the secondary heuristic is to undermine interpretations on which ‘if’ introduces a special kind of context-sensitivity. On the interpretation which makes best sense of the two heuristics, ‘if’ is simply the truth-functional conditional. Apparent counterexamples to truth-functionality are artefacts of reliance on the primary heuristic in cases where it is unreliable. The second half of the book concerns counterfactual conditionals, as expressed with ‘if’ and ‘would’. It argues that ‘would’ is an independently meaningful modal operator for contextually restricted necessity: the meaning of counterfactuals is simply that derived compositionally from the meanings of their constituents, including ‘if’ and ‘would’, making them contextually restricted strict conditionals.
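The truth-functional reading of 'if' that the book defends can be checked mechanically: 'If A, C' comes out false only when A is true and C is false, and so is vacuously true whenever A is false, which is exactly where the apparent counterexamples arise. The short sweep below is only an illustration of that reading, not anything from the book itself.

```python
from itertools import product

def material_if(a, c):
    """The material (truth-functional) conditional: false only when
    the antecedent is true and the consequent is false."""
    return (not a) or c

# Print the full truth table for 'If A, C' on the truth-functional reading.
for a, c in product([True, False], repeat=2):
    print(f"A={a!s:5} C={c!s:5}  If A, C -> {material_if(a, c)}")
# Only the row A=True, C=False makes the conditional false; the rows with
# A=False make it vacuously true, the cases where the primary heuristic
# (suppose A, then assess C) gives no guidance and so seems to conflict
# with truth-functionality.
```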



1981 ◽  
Vol 4 (4) ◽  
pp. 975-995
Author(s):  
Andrzej Szałas

A language is considered in which one can express such properties of block-structured programs with recursive functions as correctness and partial correctness. The semantics of this language is fully described by a set of axiom schemes and inference rules. The completeness theorem and the soundness theorem for this axiomatization are proved.
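The distinction the abstract draws between correctness and partial correctness can be illustrated concretely: a triple {P} S {Q} is partially correct when Q holds in every state that S terminates in from a P-state, and (totally) correct when S moreover always terminates. The toy checker below is a hypothetical illustration of that distinction, not the axiom system of the paper; the program, precondition, and postcondition are invented examples, and divergence is approximated by a step budget.

```python
def check_triple(pre, program, post, states, fuel=10_000):
    """Test {pre} program {post} over a finite set of initial states.
    Returns (partial_ok, total_ok); a run that exhausts its step
    budget is treated as divergent."""
    partial_ok = total_ok = True
    for s in states:
        if not pre(s):
            continue
        out = program(dict(s), fuel)
        if out is None:          # divergence: partial correctness unaffected
            total_ok = False
        elif not post(out):
            partial_ok = total_ok = False
    return partial_ok, total_ok

def int_div(s, fuel):
    """q, r := 0, x; while r >= y: r, q := r - y, q + 1"""
    s['q'], s['r'] = 0, s['x']
    while s['r'] >= s['y']:
        if fuel == 0:
            return None
        fuel -= 1
        s['r'] -= s['y']
        s['q'] += 1
    return s

pre  = lambda s: s['x'] >= 0 and s['y'] >= 0          # note: allows y == 0
post = lambda s: s['x'] == s['q'] * s['y'] + s['r'] and 0 <= s['r'] < s['y']
states = [{'x': x, 'y': y} for x in range(6) for y in range(3)]

print(check_triple(pre, int_div, post, states))       # (True, False)
```

With y == 0 admitted by the precondition, the loop diverges, so the triple is partially correct but not totally correct; strengthening the precondition to y >= 1 would make it correct in both senses.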


