The equivalence of NF-style set theories with “tangled” type theories; the construction of ω-models of predicative NF (and more)

1995 ◽  
Vol 60 (1) ◽  
pp. 178-190 ◽  
Author(s):  
M. Randall Holmes

Abstract: An ω-model (a model in which all natural numbers are standard) of the predicative fragment of Quine's set theory "New Foundations" (NF) is constructed. Marcel Crabbé has shown that a theory NFI extending predicative NF is consistent, and the model constructed is actually a model of NFI as well. The construction follows the construction of ω-models of NFU (NF with urelements) by R. B. Jensen, and, like Jensen's construction for NFU, it can be used to construct α-models for any ordinal α. The construction proceeds via a model of a type theory of a peculiar kind. We first discuss such "tangled type theories" in general, exhibiting a "tangled type theory" (and also an extension of Zermelo set theory with Δ0 comprehension) which is equiconsistent with NF; its consistency problem seems no easier than the corresponding problem for NF itself, which remains open. We also point out that "tangled type theory with urelements" has a quite natural interpretation, which seems to provide an explanation for the more natural behaviour of NFU relative to the other set theories of this kind, and which can be seen, anachronistically, as underlying Jensen's consistency proof for NFU.

2016 ◽  
Vol 81 (2) ◽  
pp. 605-628 ◽  
Author(s):  
SEAN WALSH

Abstract: Frege's Grundgesetze was one of the 19th-century forerunners to contemporary set theory which was plagued by the Russell paradox. In recent years, it has been shown that subsystems of the Grundgesetze formed by restricting the comprehension schema are consistent. One aim of this paper is to ascertain how much set theory can be developed within these consistent fragments of the Grundgesetze, and our main theorem (Theorem 2.9) shows that there is a model of a fragment of the Grundgesetze which defines a model of all the axioms of Zermelo–Fraenkel set theory with the exception of the power set axiom. The proof of this result appeals to Gödel's constructible universe of sets and to Kripke and Platek's idea of the projectum, as well as to a weak version of uniformization (which does not involve knowledge of Jensen's fine structure theory). The axioms of the Grundgesetze are examples of abstraction principles, and the other primary aim of this paper is to articulate a sufficient condition for the consistency of abstraction principles with limited amounts of comprehension (Theorem 3.5). As an application, we resolve an analogue of the joint consistency problem in the predicative setting.


1954 ◽  
Vol 19 (3) ◽  
pp. 197-200 ◽  
Author(s):  
Václav Edvard Beneš

1. In this paper we construct a model for part of the system NF of [4]. Specifically, we define a relation R on the natural numbers such that the R-relativizations of all the axioms except P9 of Hailperin's finitization [2] of NF become theorems of, say, Zermelo set theory. We start with an informal explanation of the model.

2. Scrutiny of P1–P8 of [2] suggests that a model for these axioms might be constructed by, so to speak, starting with a universe that contained a "universe set" and a "cardinal 1", and passing to its closure under the operations implicit in P1–P7, viz., the Boolean, the domain, the direct product, the converse, and the mixtures of product and inverse operations represented by P3 and P4. To obtain such closure we must find a way of representing the operations that involve ordered pairs and triples.

We take as the universe of the model the set of natural numbers ω; we let 0 represent the "universe set" and 1 represent "cardinal 1". Then, in order to be able to refer in the model to the unordered pair of two sets, we determine all representatives of unordered pairs in advance by assigning them the even numbers in unique fashion (see d3 and d25); we can now define the operations that involve ordered pairs and triples, and obtain closure under them using the odd numbers. It remains to weed out, as in d26, the unnecessary sets so as to satisfy the axiom of extensionality.
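The pair-coding step above can be illustrated with a small sketch. The actual clauses d3 and d25 of the paper differ; the pairing bijection used here is only an assumption for illustration, chosen to show how every unordered pair can be assigned a unique even number in advance:

```python
# Illustrative sketch (not Benes's actual d3/d25): assign each unordered
# pair of naturals a unique even number, leaving the odd numbers free for
# the further closure operations.

def cantor(m, n):
    """Standard Cantor pairing bijection N x N -> N."""
    return (m + n) * (m + n + 1) // 2 + n

def pair_code(a, b):
    """Code the unordered pair {a, b} as an even number.
    Normalizing to min/max makes the code independent of order."""
    m, n = min(a, b), max(a, b)
    return 2 * cantor(m, n)

# The code is order-independent, even, and injective on unordered pairs:
assert pair_code(3, 5) == pair_code(5, 3)
assert pair_code(3, 5) % 2 == 0
assert pair_code(1, 2) != pair_code(1, 3)
```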


Author(s):  
Catarina Coquand

The notion of computability predicate developed by Tait gives a powerful method for proving normalization for various λ-calculi. There are actually two versions of this method: a typed and an untyped approach. In the former approach we define a computability predicate over well-typed terms. This technique was developed by Tait (1967) in order to prove normalization for Gödel's system T. His method was extended by Girard (1971) for his System F and by Martin-Löf (1971) for his type theory. The second approach is similar to Kleene's realizability interpretation (Kleene 1945), but in this approach formulas are realized by (not necessarily well-typed) λ-terms rather than Gödel numbers. This technique was developed by Tait (1975), where he uses this method to obtain a simplified proof of normalization for Girard's system F. The method was later rediscovered independently by several others, for instance Mitchell (1986).

There are two ways of defining typed λ-calculus: we have either equality as conversion (on raw terms) or equality as judgments. It is more difficult to show in the theory with equality as judgments that every well-typed term has a normal form of the same type. We can find different approaches to this problem in Altenkirch (1993), Coquand (1991), Goguen (1994) and Martin-Löf (1975). There are several papers addressing normalization for calculi with equality as untyped conversion; two relevant papers showing normalization are Altenkirch's (1994) proof for system F using the untyped method and Werner's (1994) proof for the calculus of constructions with inductive types using the typed method. Another difference is between the predicative and impredicative formulations of λ-calculi, where it is more intricate to prove normalization for the impredicative ones.

The aim of this paper is to give a simple normalization argument for a theory with "real" dependent types, i.e. a theory in which we can define types by recursion on the natural numbers (like Nn).
For this we choose to study the simplest theory that has such types, namely the N, Π, U-fragment of Martin-Löf’s polymorphic type theory (Martin-Löf 1972) which contains the natural numbers, dependent functions and one universe. This theory has equality as conversion and is predicative.
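A minimal example of such a type defined by recursion on the natural numbers, sketched here in Lean (our own rendering, not the paper's formal system): a family N n with exactly n elements, built by recursion from the empty type, in the spirit of the Nn mentioned above.

```lean
-- Illustrative sketch: a dependent family of types defined by recursion
-- on Nat, the kind of "real" dependent type the normalization argument
-- must handle. N n has exactly n elements.
def N : Nat → Type
  | 0     => Empty          -- no elements
  | n + 1 => Option (N n)   -- one more element than N n
```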


Author(s):  
Charles Sayward

Russell held that the theory of natural numbers could be derived from three primitive concepts: number, successor and zero. This leaves out multiplication and addition. Russell introduces these concepts by recursive definition. It is argued that this does not render addition or multiplication any less primitive than the other three. To this it might be replied that any recursive definition can be transformed into a complete or explicit definition with the help of a little set theory. But that is a point about set theory, not number theory. We have learned more about the distinction between logic and set theory than was known in Russell's day, especially as this affects logicist aspirations.
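The recursive definitions at issue can be rendered concretely. A minimal sketch (the function names and encoding are ours, for illustration): addition and multiplication introduced by recursion, using only zero and successor as given.

```python
# Illustrative sketch of Russell-style recursive definitions, built from
# zero and successor alone (names are ours).

def succ(n):
    return n + 1

def add(m, n):
    # m + 0 = m;  m + succ(n) = succ(m + n)
    return m if n == 0 else succ(add(m, n - 1))

def mul(m, n):
    # m * 0 = 0;  m * succ(n) = (m * n) + m
    return 0 if n == 0 else add(mul(m, n - 1), m)

assert add(2, 3) == 5
assert mul(4, 3) == 12
```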


1955 ◽  
Vol 20 (2) ◽  
pp. 123-139 ◽  
Author(s):  
Robert L. Stanley

A system SF, closely related to NF, is outlined here. SF has several novel points of simplicity and interest. (a) It uses only one basic notion, from which all the other concepts of logic and mathematics may be built definitionally. Three-notion systems are common, but Quine's two-notion IA has for some time represented the extreme in conceptual economy. (b) The theorems of SF are generated under just three rules of analysis, which unify into a single postulational principle. (c) SF is built solely in terms of what is commonly known as the "natural deduction" method, under which each theorem is attacked primarily as it stands, by means of a very small body of rules, rather than less directly, through a very large, potentially infinite backlog of theorems. Although natural deduction is by no means new as a method, its exclusive applications have previously been relatively limited, not even reaching principles of identity, much less set theory, relations, or mathematics proper. (d) SF is at least as strong as NF, yielding all of its theorems, which are expressed here in forms analogous to those of the metatheorems in ML. If NF is consistent, so is SF. The main points in the relative consistency proof are set forth below in section seven.


1998 ◽  
Vol 63 (1) ◽  
pp. 247-261 ◽ 
Author(s):  
Gian Aldo Antonelli

Quine's "New Foundations" (NF) was first presented in Quine [10] and later on in Quine [11]. Ernst Specker [15, 13], building upon a previous result of Ehrenfeucht and Mostowski [5], showed that NF is consistent if and only if there is a model of the Theory of Negative (and positive) Types (TNT) with full extensionality that admits of a "shifting automorphism," but the existence of such a model remains an open problem.

In his [8], Ronald Jensen gave a partial solution to the problem of the consistency of Quine's NF. Jensen considered a version of NF, referred to as NFU, in which the axiom of extensionality is weakened to allow for Urelemente or "atoms." He showed, modifying Specker's theorem, that the existence of a model of TNT with atoms admitting of a "shifting automorphism" implies the consistency of NFU, proceeding then to exhibit such a model.

This paper presents a reduction of the consistency problem for NF to the existence of a model of TNT with atoms containing certain "large" (unstratified) sets and admitting a shifting automorphism. In particular we show that such a model can be "collapsed" to a model of pure TNT in such a way as to preserve the shifting automorphism. By the above-mentioned result of Specker's, this implies the consistency of NF.

Let us take the time to explain the main ideas behind the construction. Suppose we have a certain universe U of sets, built up from certain individuals or "atoms." In such a universe we have only a weak version of the axiom of extensionality: two objects are the same if and only if they are both sets having the same members. We would like to obtain a universe U′ that is as close to U as possible, but in which there are no atoms (i.e., the only memberless object is the empty set).
One way of doing this is to assign to each atom ξ a set (perhaps the empty set), inductively identifying sets that have members that we are already committed to considering "the same." In doing this we obtain an equivalence relation ≃ over U that interacts nicely with the membership relation (provided we have accounted for multiplicity of members, i.e., we have allowed sets to contain "multiple copies" of the same object). Then we can take U′ = U/≃, the quotient of U with respect to ≃. It is then possible to define a "membership" relation over U′ in such a way as to have full extensionality. Relations such as ≃ are referred to as "contractions" by Hinnion and "bisimulations" by Aczel.
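The collapse just described can be sketched as a toy computation, for the finite, well-founded case only; the encoding (a dict from nodes to their members) is our own, and real TNT models are of course not finite. Each node is hereditarily replaced by the set of collapsed members, so atoms, assigned no members, all become the empty set, and sets whose members are identified become identified in turn:

```python
# Toy sketch of a "contraction" (Hinnion) / bisimulation quotient for a
# finite, well-founded universe with atoms. Encoding is ours: each node
# maps to the list of its members; atoms map to the empty list.

def collapse(x, universe, memo=None):
    """Return a hereditarily-frozenset value for node x; two nodes get
    the same value exactly when the extensional quotient identifies them."""
    if memo is None:
        memo = {}
    if x not in memo:
        memo[x] = frozenset(collapse(m, universe, memo) for m in universe[x])
    return memo[x]

# Two atoms collapse to the empty set, so the singletons built on them
# collapse too, and the set above them loses its duplicate member:
U = {"a1": [], "a2": [], "s1": ["a1"], "s2": ["a2"], "t": ["s1", "s2"]}
assert collapse("a1", U) == collapse("a2", U) == frozenset()
assert collapse("s1", U) == collapse("s2", U)
assert collapse("t", U) == frozenset({collapse("s1", U)})
```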


Author(s):  
Michael Detlefsen

Abstract: Formalism in the philosophy of mathematics has taken a variety of forms and has been advocated for widely divergent reasons. In Sects. 1 and 2, I briefly introduce the major formalist doctrines of the late nineteenth and early twentieth centuries. These are what I call empirico-semantic formalism (advocated by Heine), game formalism (advocated by Thomae) and instrumental formalism (advocated by Hilbert). After describing these views, I note some basic points of similarity and difference between them. In the remainder of the paper, I turn my attention to Hilbert's instrumental formalism. My primary aim there will be to develop its formalist elements more fully. These are, in the main, (i) its rejection of the axiom-centric focus of traditional model-construction approaches to consistency problems, (ii) its departure from the traditional understanding of the basic nature of proof and (iii) its distinctively descriptive or observational orientation with regard to the consistency problem for arithmetic. More specifically, I will highlight what I see as the salient points of connection between Hilbert's formalist attitude and his finitist standard for the consistency proof for arithmetic. I will also note what I see as a significant tension between Hilbert's observational approach to the consistency problem for arithmetic and his expressed hope that his solution of that problem would dispense with certain epistemological concerns regarding arithmetic once and for all.


Author(s):  
Cesare Gallozzi

Abstract: We introduce a family of (k, h)-interpretations for 2 ≤ k ≤ ∞ and 1 ≤ h ≤ ∞ of constructive set theory into type theory, in which sets and formulas are interpreted as types of homotopy level k and h, respectively. Depending on the values of the parameters k and h, we are able to interpret different theories, like Aczel's CZF and Myhill's CST. We also define a propositions-as-hpropositions interpretation in the context of logic-enriched type theories. The rest of the paper is devoted to characterising and analysing the interpretations considered. The formulas valid in the prop-as-hprop interpretation are characterised in terms of the axiom of unique choice. We also analyse the interpretations of CST into homotopy type theory, providing a comparative analysis with Aczel's interpretation. This is done by formulating in a logic-enriched type theory the key principles used in the proofs of the two interpretations. Finally, we characterise a class of sentences valid in the (k, ∞)-interpretations in terms of the ΠΣ axiom of choice.


1989 ◽  
Vol 54 (4) ◽  
pp. 1401-1418 ◽  
Author(s):  
M. Forti ◽  
R. Hinnion

Since Gilmore showed that some theory with a positive comprehension scheme is consistent when the axiom of extensionality is dropped and inconsistent with it (see [1] and [2]), the problem of the consistency of various positive comprehension schemes has been investigated. We give here a short classification, which shows clearly the importance of the axiom of extensionality and of the abstraction operator in these consistency problems. The most difficult problem was to show the consistency of the comprehension scheme for positive formulas, with extensionality but without the abstraction operator. In his unpublished thesis, Set theory in which the axiom of foundation fails [3], Malitz partially solved this problem, but he needed to assume the existence of some unusual kind of large cardinal; as his original construction is very interesting and his thesis is unpublished, we give a short summary of it. M. Forti solved the problem completely by working in ZF with a free-construction principle (sometimes called an anti-foundation axiom), instead of ZF with the axiom of foundation, as Malitz did. This permits one to obtain the consistency of this positive theory relative to ZF. In his general investigations about "topological set theories" (to be published), E. Weydert has independently proved the same result. The authors are grateful to the Mathematisches Forschungsinstitut Oberwolfach for giving them the opportunity of discussing these subjects and meeting E. Weydert during the meeting "New Foundations", March 1–7, 1987.

