A First Course in Logic

Author(s):  
Shawn Hedman

The ability to reason and think in a logical manner forms the basis of learning for most students of mathematics, computer science, philosophy and logic. Based on the author's teaching notes at the University of Maryland and aimed at a broad audience, this text covers the fundamental topics in classical logic in an extremely clear, thorough and accurate style that is accessible to students in all of these fields. Covering propositional logic, first-order logic, and second-order logic, as well as proof theory, computability theory, and model theory, the text also contains numerous carefully graded exercises and is ideal for a first or refresher course.

Author(s):  
Shawn Hedman

First-order logic is a richer language than propositional logic. Its lexicon contains not only the symbols ∧, ∨, ¬, →, and ↔ (and parentheses) from propositional logic, but also the symbols ∃ and ∀ for "there exists" and "for all," along with various symbols to represent variables, constants, functions, and relations. These symbols are grouped into four categories.

• Variables. Lower case letters from the end of the alphabet (…, x, y, z) are used to denote variables. Variables represent arbitrary elements of an underlying set. This, in fact, is what "first-order" refers to. Variables that represent sets of elements are called second-order. Second-order logic, discussed in Chapter 9, is distinguished by the inclusion of such variables.

• Constants. Lower case letters from the beginning of the alphabet (a, b, c, …) are usually used to denote constants. A constant represents a specific element of an underlying set.

• Functions. The lower case letters f, g, and h are commonly used to denote functions. The arguments may be parenthetically listed following the function symbol as f(x1, x2, …, xn). First-order logic has symbols for functions of any number of variables. If f is a function of one, two, or three variables, then it is called unary, binary, or ternary, respectively. In general, a function of n variables is called n-ary, and n is referred to as the arity of the function.

• Relations. Capital letters, especially P, Q, R, and S, are used to denote relations. As with functions, each relation has an associated arity.

We have an infinite number of each of these four types of symbols at our disposal. Since there are only finitely many letters, subscripts are used to provide this infinite supply; for example, x1, x2, x3, … are often used to denote variables. Of course, we can use any symbol we want in first-order logic. Ascribing the letters of the alphabet in the above manner is a convenient convention. If you turn to a random page in this book and see "R(a, x, y)," you can safely assume that R is a ternary relation, x and y are variables, and a is a constant.
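To make these conventions concrete, here is a minimal sketch (ours, not the book's; all names are illustrative) of how such a lexicon might be represented in Python, with the arity discipline the text describes enforced when symbols are applied:

```python
from dataclasses import dataclass
from typing import Tuple, Union

# Illustrative representation of a first-order lexicon (not from the book).

@dataclass(frozen=True)
class Var:
    name: str                  # variables: x, y, z, x1, x2, ...

@dataclass(frozen=True)
class Const:
    name: str                  # constants: a, b, c, ...

@dataclass(frozen=True)
class Func:
    name: str                  # function symbols: f, g, h, ...
    arity: int                 # 1 = unary, 2 = binary, 3 = ternary, ...

Term = Union["Var", "Const", "Apply"]

@dataclass(frozen=True)
class Apply:
    func: Func
    args: Tuple[Term, ...]

    def __post_init__(self):
        # An n-ary function symbol must be applied to exactly n terms.
        assert len(self.args) == self.func.arity

@dataclass(frozen=True)
class Rel:
    name: str                  # relation symbols: P, Q, R, S, ...
    arity: int

@dataclass(frozen=True)
class Atom:
    rel: Rel
    args: Tuple[Term, ...]

    def __post_init__(self):
        assert len(self.args) == self.rel.arity

# "R(a, x, y)": R is a ternary relation, a a constant, x and y variables.
R = Rel("R", 3)
atom = Atom(R, (Const("a"), Var("x"), Var("y")))
```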


2020
Author(s):  
Michał Walicki

Abstract Graph normal form, introduced earlier for propositional logic, is shown to be a normal form also for first-order logic. It allows the syntax of theories to be viewed as digraphs, and their semantics as kernels of these digraphs. Graphs are particularly well suited for studying circularity, and we provide some general means for verifying that circular or apparently circular extensions are conservative. Traditional syntactic means of ensuring conservativity, like definitional extensions or positive occurrences guaranteeing the existence of fixed points, emerge as special cases.
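As a rough illustration of the semantic side (our sketch, not code from the paper): a kernel of a digraph is an independent set of vertices that every outside vertex can reach in one step. Assuming a digraph given as a dict of out-neighbour sets:

```python
def is_kernel(graph, K):
    """Check whether K is a kernel of the digraph.

    graph maps each vertex to the set of its out-neighbours; a kernel is
    an independent set K (no edges between members) such that every
    vertex outside K has at least one out-edge into K.
    """
    K = set(K)
    independent = all(not (graph[v] & K) for v in K)
    absorbing = all(graph[v] & K for v in graph if v not in K)
    return independent and absorbing

# The 2-cycle a -> b -> a has two kernels, {a} and {b}; odd cycles have none.
two_cycle = {"a": {"b"}, "b": {"a"}}
assert is_kernel(two_cycle, {"a"}) and is_kernel(two_cycle, {"b"})
```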


1988
Vol. 53 (2)
pp. 554–570
Author(s):
Kosta Došen
Peter Schroeder-Heister

This paper is meant to be a comment on Beth's definability theorem. In it we shall make the following points.

Implicit definability as mentioned in Beth's theorem for first-order logic is a special case of a more general notion of uniqueness. If α is a nonlogical constant, Tα a set of sentences, α* an additional constant of the same syntactical category as α, and Tα* a copy of Tα with α* instead of α, then for implicit definability of α in Tα one has, in the case of predicate constants, to derive α(x1,…,xn) ↔ α*(x1,…,xn) from Tα ∪ Tα*, and similarly for constants of other syntactical categories. For uniqueness one considers sets of schemata Sα and derivability from instances of Sα ∪ Sα* in the language with both α and α*, thus allowing mixing of α and α* not only in logical axioms and rules, but also in nonlogical assumptions. In the first case, but not necessarily in the second one, explicit definability follows. It is crucial for Beth's theorem that mixing of α and α* is allowed only inside logic, not outside. This topic will be treated in §1.

Let the structural part of logic be understood roughly in the sense of Gentzen-style proof theory, i.e. as comprising only those rules which do not specifically involve logical constants. If we restrict mixing of α and α* to the structural part of logic, which we shall specify precisely, we obtain a different notion of implicit definability for which we can demonstrate a general definability theorem, where α is not confined to the syntactical categories of nonlogical expressions of first-order logic. This definability theorem is a consequence of an equally general interpolation theorem. This topic will be treated in §§2, 3, and 4.
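A toy instance may help fix ideas (our illustration, not the authors'): in the language of orders, if Tα = {∀x (α(x) ↔ ∃y (y < x))}, then Tα ∪ Tα* derives ∀x (α(x) ↔ α*(x)), so α is implicitly defined; here the explicit definition ∀x (α(x) ↔ ∃y (y < x)) can simply be read off, and Beth's theorem guarantees that an explicit definition exists whenever the implicit-definability test succeeds.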


Author(s):  
Jan Gorzny
Ezequiel Postan
Bruno Woltzenlogel Paleo

Abstract Proofs are a key feature of modern propositional and first-order theorem provers. Proofs generated by such tools serve as explanations for the unsatisfiability of statements, but these explanations are complicated when proofs are not as concise as possible. There is a wide variety of compression techniques for propositional resolution proofs, but fewer compression techniques for the first-order resolution proofs generated by automated theorem provers. This paper describes an approach to compressing first-order logic proofs based on lifting proof-compression ideas from propositional logic to first-order logic. The first lifted technique delays resolution with unit clauses, which are clauses that have a single literal. The second is partial regularization, which removes an inference η when it is redundant in the sense that its pivot literal already occurs as the pivot of another inference on every path from η to the root of the proof. This paper describes the generalization of the algorithms LowerUnits and RecyclePivotsWithIntersection (Fontaine et al., Compression of propositional resolution proofs via partial regularization, in Automated Deduction – CADE-23, Wrocław, Poland, July 31–August 5, 2011, pp. 237–251, Springer, 2011) from propositional logic to first-order logic. The generalized algorithms compress resolution proofs containing resolution and factoring inferences with unification. An empirical evaluation of these approaches is included.
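To convey the flavour of the first technique, here is a heavily simplified, propositional-only sketch of the LowerUnits idea (our reconstruction for intuition only; it assumes the lowered units are input clauses, while the paper's algorithms also handle unification, factoring, and derived units):

```python
from collections import Counter

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

class Node:
    """A resolution proof node: an input clause, or a resolvent of two premises."""
    def __init__(self, clause, left=None, right=None, pivot=None):
        self.clause = frozenset(clause)      # literals, e.g. {"p", "~q"}
        self.left, self.right = left, right  # None for input clauses
        self.pivot = pivot                   # literal removed from the left premise

def resolve(left, right, pivot):
    clause = (left.clause - {pivot}) | (right.clause - {negate(pivot)})
    return Node(clause, left, right, pivot)

def lower_units(root):
    """Delay resolutions against multiply-used unit clauses, then
    reintroduce each such unit once at the bottom of the proof."""
    # 1. Count how often each node is used as a premise.
    uses, stack, seen = Counter(), [root], set()
    while stack:
        n = stack.pop()
        if n.left is None or id(n) in seen:
            continue
        seen.add(id(n))
        for child in (n.left, n.right):
            uses[child] += 1
            stack.append(child)
    lowered = [n for n in uses if len(n.clause) == 1 and uses[n] > 1]

    # 2. Rebuild the proof, skipping resolution steps against lowered units.
    memo = {}
    def rebuild(node):
        if node.left is None:
            return node
        if node in memo:
            return memo[node]
        left, right = rebuild(node.left), rebuild(node.right)
        if left in lowered:
            out = right
        elif right in lowered:
            out = left
        elif node.pivot not in left.clause:          # pivot vanished upstream
            out = left
        elif negate(node.pivot) not in right.clause:
            out = right
        else:
            out = resolve(left, right, node.pivot)
        memo[node] = out
        return out

    # 3. Resolve each lowered unit once against the rebuilt conclusion.
    out = rebuild(root)
    for unit in lowered:
        (lit,) = unit.clause
        if negate(lit) in out.clause:
            out = resolve(out, unit, negate(lit))
    return out
```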


2010
Vol. 16 (1)
pp. 1–36
Author(s):  
Peter Koellner

Abstract In this paper we investigate strong logics of first and second order that have certain absoluteness properties. We begin with an investigation of first order logic and the strong logics ω-logic and β-logic, isolating two facets of absoluteness, namely, generic invariance and faithfulness. It turns out that absoluteness is relative in the sense that stronger background assumptions secure greater degrees of absoluteness. Our aim is to investigate the hierarchies of strong logics of first and second order that are generically invariant and faithful against the backdrop of the strongest large cardinal hypotheses. We show that there is a close correspondence between the two hierarchies and we characterize the strongest logic in each hierarchy. On the first-order side, this leads to a new presentation of Woodin's Ω-logic. On the second-order side, we compare the strongest logic with full second-order logic and argue that the comparison lends support to Quine's claim that second-order logic is really set theory in sheep's clothing.


2015
Vol. 21 (2)
pp. 123–163
Author(s):
Roy Dyckhoff
Sara Negri

Abstract That every first-order theory has a coherent conservative extension is regarded by some as obvious, even trivial, and by others as not at all obvious, but instead remarkable and valuable; the result is in any case neither sufficiently well-known nor easily found in the literature. Various approaches to the result are presented and discussed in detail, including one inspired by a problem in the proof theory of intermediate logics that led us to the proof of the present paper. It can be seen as a modification of Skolem’s argument from 1920 for his “Normal Form” theorem. “Geometric” being the infinitary version of “coherent”, it is further shown that every infinitary first-order theory, suitably restricted, has a geometric conservative extension, hence the title. The results are applied to simplify methods used in reasoning in and about modal and intermediate logics. We include also a new algorithm to generate special coherent implications from an axiom, designed to preserve the structure of formulae with relatively little use of normal forms.
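For orientation (our gloss, not the authors' wording): a coherent implication is a sentence of the form ∀x̄ (C → D), where C and D are built from atomic formulas using only ∧, ∨, and ∃; a geometric implication is the same except that D may be an infinite disjunction. For instance, ∀x∀y (x < y → ∃z (x < z ∧ z < y)) is a coherent axiom of dense order.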


2012
Vol. 7
Author(s):
Anders Søgaard
Søren Lind Kristiansen

Existing logic-based querying tools for dependency treebanks use first-order logic or monadic second-order logic. We introduce a very fast model checker based on hybrid logic with the operators ↓, @, and A, and show that it is much faster than an existing querying tool for dependency treebanks based on first-order logic, as well as an existing general-purpose hybrid logic model checker. The querying tool is made publicly available.
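For readers unfamiliar with the operators, here is a minimal hybrid-logic model checker (our sketch, not the paper's tool): ↓x binds the current node to x, @x jumps evaluation to the node bound to x, and A quantifies over all nodes.

```python
# A model is (nodes, R, val): R maps a node to its successor set, and
# val maps a node to the set of atomic propositions true there.

def check(phi, w, model, g):
    """Is formula phi true at world w under assignment g (variables -> worlds)?"""
    nodes, R, val = model
    op = phi[0]
    if op == "atom":              # ("atom", "NOUN")
        return phi[1] in val[w]
    if op == "var":               # ("var", "x"): true iff w is the world bound to x
        return g[phi[1]] == w
    if op == "not":
        return not check(phi[1], w, model, g)
    if op == "and":
        return check(phi[1], w, model, g) and check(phi[2], w, model, g)
    if op == "dia":               # ("dia", psi): psi at some R-successor of w
        return any(check(phi[1], v, model, g) for v in R.get(w, ()))
    if op == "down":              # ("down", "x", psi): the binder ↓x. psi
        return check(phi[2], w, model, {**g, phi[1]: w})
    if op == "at":                # ("at", "x", psi): the jump @x psi
        return check(phi[2], g[phi[1]], model, g)
    if op == "A":                 # ("A", psi): psi at every world
        return all(check(phi[1], v, model, g) for v in nodes)
    raise ValueError(f"unknown operator {op}")

# Query on a two-node dependency tree: "w is a VERB with a NOUN dependent".
model = ({0, 1}, {0: {1}}, {0: {"VERB"}, 1: {"NOUN"}})
phi = ("and", ("atom", "VERB"), ("dia", ("atom", "NOUN")))
assert check(phi, 0, model, {})
```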


1999
Vol. 3, No. 3
Author(s):
Thomas Schwentick
Klaus Barthelmann

Building on work of Gaifman [Gai82], it is shown that every first-order formula is logically equivalent to a formula of the form ∃x1 … ∃xl ∀y φ, where φ is r-local around y, i.e., quantification in φ is restricted to elements of the universe at distance at most r from y.

From this and related normal forms, variants of the Ehrenfeucht game for first-order and existential monadic second-order logic are developed that restrict the possible strategies for the spoiler, one of the two players. This makes proofs of the existence of a winning strategy for the duplicator, the other player, easier and can thus simplify inexpressibility proofs.

As another application, automata models are defined that have, on arbitrary classes of relational structures, exactly the expressive power of first-order logic and existential monadic second-order logic, respectively.
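(Our gloss: "distance" here is distance in the Gaifman graph of the structure, which joins two elements whenever they occur together in some tuple of a relation; since d(z, y) ≤ r is first-order definable, an r-local formula simply relativizes each quantifier to the r-ball around y, e.g. ∃z ψ becomes ∃z (d(z, y) ≤ r ∧ ψ).)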


Author(s):  
Jonathan Mai

English distinguishes between singular quantifiers like "a donkey" and plural quantifiers like "some donkeys". Pluralists hold that plural quantifiers range in an unusual, irreducibly plural way over ordinary objects, namely individuals from first-order domains, and not over set-like objects. The favoured framework of pluralism is plural first-order logic, PFO, an interpreted first-order language that is capable of expressing plural quantification. Pluralists argue for their position by claiming that the standard formal theory based on PFO is both ontologically neutral and really logic. These properties are supposed to yield many important applications concerning second-order logic and set theory that alternative theories supposedly cannot deliver. I will show that there are serious reasons for rejecting at least the claim of ontological innocence. Doubt about innocence arises from the fact that, when properly spelled out, the PFO-semantics for plural quantifiers is committed to set-like objects. The correctness of my worries presupposes the principle that for every plurality there is a coextensive set. Pluralists might reply that this principle leads straight to paradox. However, as I will argue, the true culprit of the paradox is the assumption that every definite condition determines a plurality.


Author(s):  
Timothy Smiley

The predicate calculus is the dominant system of modern logic, having displaced the traditional Aristotelian syllogistic logic that had been the previous paradigm. Like Aristotle’s, it is a logic of quantifiers – words like ‘every’, ‘some’ and ‘no’ that are used to express that a predicate applies universally or with some other distinctive kind of generality, for example ‘everyone is mortal’, ‘someone is mortal’, ‘no one is mortal’. The weakness of syllogistic logic was its inability to represent the structure of complex predicates. Thus it could not cope with argument patterns like ‘everything Fs and Gs, so everything Fs’. Nor could it cope with relations, because a logic of relations must be able to analyse cases where a quantifier is applied to a predicate that already contains one, as in ‘someone loves everyone’. Remedying the weakness required two major innovations.

One was a logic of connectives – words like ‘and’, ‘or’ and ‘if’ that form complex sentences out of simpler ones. It is often studied as a distinct system: the propositional calculus. A proposition here is a true-or-false sentence, and the guiding principle of the propositional calculus is truth-functionality, meaning that the truth-value (truth or falsity) of a compound proposition is uniquely determined by the truth-values of its components. Its principal connectives are negation, conjunction, disjunction and a ‘material’ (that is, truth-functional) conditional. Truth-functionality makes it possible to compute the truth-values of propositions of arbitrary complexity in terms of their basic propositional constituents, and so to develop the logic of tautology and tautological consequence (logical truth and consequence in virtue of the connectives).

The other invention was the quantifier-variable notation. Variables are letters used to indicate things in an unspecific way; thus ‘x is mortal’ is read as predicating of an unspecified thing x what ‘Socrates is mortal’ predicates of Socrates. The connectives can now be used to form complex predicates as well as propositions, for example ‘x is human and x is mortal’; while different variables can be used in different places to express relational predicates, for example ‘x loves y’. The quantifier goes in front of the predicate it governs, with the relevant variable repeated beside it to indicate which positions are being generalized. These radical departures from the idiom of quantification in natural languages are needed to solve the further problem of ambiguity of scope. Compare, for example, the ambiguity of ‘someone loves everyone’ with the unambiguous alternative renderings ‘there is an x such that for every y, x loves y’ and ‘for every y, there is an x such that x loves y’.

The result is a formal language based on a non-logical vocabulary of names of things and primitive predicates expressing properties and relations of things. The logical constants are the truth-functional connectives and the universal and existential quantifiers, plus a stock of variables construed as ranging over things. This is ‘the’ predicate calculus. A common option is to add the identity sign as a further logical constant, producing the predicate calculus with identity.

The first modern logic of quantification, Frege’s of 1879, was designed to express generalizations not only about individual things but also about properties of individuals. It would nowadays be classified as a second-order logic, to distinguish it from the first-order logic described above. Second-order logic is much richer in expressive power than first-order logic, but at a price: first-order logic can be axiomatized; second-order logic cannot.
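The scope distinction can be made vivid computationally (our illustration, not part of the entry): over a finite domain the two renderings are easily evaluated, and they can disagree.

```python
# Scope ambiguity of "someone loves everyone", checked over a finite domain.
# loves[x] is the set of individuals that x loves.
people = {"ann", "bob", "cal"}
loves = {"ann": {"ann", "bob"}, "bob": {"cal"}, "cal": {"bob"}}

# Reading 1: there is an x such that for every y, x loves y.
wide_exists = any(all(y in loves[x] for y in people) for x in people)

# Reading 2: for every y, there is an x such that x loves y.
wide_forall = all(any(y in loves[x] for x in people) for y in people)

print(wide_exists, wide_forall)   # False True: the two readings come apart
```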

