Solutions of Extension and Limits of Some Cantorian Paradoxes

Mathematics ◽  
2020 ◽  
Vol 8 (4) ◽  
pp. 486
Author(s):  
Josué-Antonio Nescolarde-Selva ◽  
José-Luis Usó-Doménech ◽  
Lorena Segura-Abad ◽  
Kristian Alonso-Stenberg ◽  
Hugh Gash

Cantor thought of the principles of set theory, or intuitive principles, as universal forms that can apply to any actual or possible totality. This need not be accepted, however, if there are totalities which have a fundamental ontological value and do not conform to these principles. The difficulties involved do not stem from ontological problems but from certain peculiar sets, including the set of all sets that are not members of themselves, the set of all sets, and the ordinal of all ordinals. These totalities, problematic for the intuitive theory, can be treated satisfactorily with the Zermelo and Fraenkel (ZF) axioms or the von Neumann, Bernays, and Gödel (NBG) axioms, and the iterative conceptions expressed in them.
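The first of these peculiar totalities yields its contradiction in one line; as a sketch, writing R for the Russell set:

```latex
R = \{\, x \mid x \notin x \,\}
\quad\Longrightarrow\quad
R \in R \;\leftrightarrow\; R \notin R.
```

Under ZF's separation schema only the relativized set R_A = {x ∈ A | x ∉ x} exists, and the same argument then shows merely that R_A ∉ A, so no contradiction arises.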

2014 ◽  
Vol 20 (1) ◽  
pp. 94-97
Author(s):  
Natasha Dobrinen

1967 ◽  
Vol 32 (3) ◽  
pp. 319-321 ◽  
Author(s):  
Leslie H. Tharp

We are concerned here with the set theory given in [1], which we call BL (Bernays–Levy). This theory can be given an elegant syntactical presentation which allows most of the usual axioms to be deduced from the reflection principle. However, it is more convenient here to take the usual von Neumann–Bernays set theory [3] as a starting point, and to regard BL as arising from the addition of a reflection schema in which S is the formal definition of satisfaction (with respect to models which are sets) and ⌜φ⌝ is the Gödel number of the formula φ, which has a single free variable X.
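For comparison (not the BL schema itself, which involves the satisfaction predicate S), the Lévy–Montague reflection principle provable in ZF reads, for each formula φ:

```latex
\forall x_1 \cdots \forall x_n\; \exists \alpha\;
  \Bigl( x_1, \ldots, x_n \in V_\alpha \;\wedge\;
         \bigl( \varphi(x_1,\ldots,x_n) \leftrightarrow
                \varphi^{V_\alpha}(x_1,\ldots,x_n) \bigr) \Bigr).
```

BL strengthens this by reflecting satisfaction in set models uniformly, which is what pushes it beyond ZF in strength.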


2006 ◽  
Vol 28 (1) ◽  
pp. 95-109 ◽  
Author(s):  
Nicola Giocoli

The year 2003 marked the 100th anniversary of the birth of John von Neumann (1903–1957), one of the greatest geniuses of the last century. Beyond contributing to fields as diverse as set theory, quantum mechanics, atomic energy, and automatic computing, von Neumann has also had a decisive influence upon modern economics. From the invention of game theory to the axiomatization of expected utility, from the introduction of convex analysis and fixed-point techniques to the development of the balanced growth model, the von Neumann heritage can be clearly traced in several areas of our discipline. The aim of this paper is to clarify the relationship between the two concepts of rationality he devised in his classic 1944 book Theory of Games and Economic Behavior, written with the collaboration of the Austrian economist Oskar Morgenstern (von Neumann and Morgenstern 1953).


Author(s):  
Asger Törnquist ◽  
Martino Lupini

Author(s):  
Wilfried Sieg

Proof theory is a branch of mathematical logic founded by David Hilbert around 1920 to pursue Hilbert’s programme. The problems addressed by the programme had already been formulated, in some sense, at the turn of the century, for example, in Hilbert’s famous address to the Second International Congress of Mathematicians in Paris. They were closely connected to the set-theoretic foundations for analysis investigated by Cantor and Dedekind – in particular, to difficulties with the unrestricted notion of system or set; they were also related to the philosophical conflict with Kronecker on the very nature of mathematics. At that time, the central issue for Hilbert was the ‘consistency of sets’ in Cantor’s sense. Hilbert suggested that the existence of consistent sets, for example, the set of real numbers, could be secured by proving the consistency of a suitable, characterizing axiom system, but indicated only vaguely how to give such proofs model-theoretically. Four years later, Hilbert departed radically from these indications and proposed a novel way of attacking the consistency problem for theories. This approach required, first of all, a strict formalization of mathematics together with logic; then, the syntactic configurations of the joint formalism would be considered as mathematical objects; finally, mathematical arguments would be used to show that contradictory formulas cannot be derived by the logical rules. This two-pronged approach of developing substantial parts of mathematics in formal theories (set theory, second-order arithmetic, finite type theory and still others) and of proving their consistency (or the consistency of significant sub-theories) was sharpened in lectures beginning in 1917 and then pursued systematically in the 1920s by Hilbert and a group of collaborators including Paul Bernays, Wilhelm Ackermann and John von Neumann. 
In particular, the formalizability of analysis in a second-order theory was verified by Hilbert in those very early lectures. So it was possible to focus on the second prong, namely to establish the consistency of ‘arithmetic’ (second-order number theory and set theory) by elementary mathematical, ‘finitist’ means. This part of the task proved to be much more recalcitrant than expected, and only limited results were obtained. That the limitation was inevitable was explained in 1931 by Gödel’s theorems; indeed, they refuted the attempt to establish consistency on a finitist basis – as soon as it was realized that finitist considerations could be carried out in a small fragment of first-order arithmetic. This led to the formulation of a general reductive programme. Gentzen and Gödel made the first contributions to this programme by establishing the consistency of classical first-order arithmetic – Peano arithmetic (PA) – relative to intuitionistic arithmetic – Heyting arithmetic. In 1936 Gentzen proved the consistency of PA relative to a quantifier-free theory of arithmetic that included transfinite recursion up to the first epsilon number, ε0; in his 1941 Yale lectures, Gödel proved the consistency of the same theory relative to a theory of computable functionals of finite type. These two fundamental theorems turned out to be most important for subsequent proof-theoretic work. Currently it is known how to analyse, in Gentzen’s style, strong subsystems of second-order arithmetic and set theory. The first prong of proof-theoretic investigations, the actual formal development of parts of mathematics, has also been pursued – with a surprising result: the bulk of classical analysis can be developed in theories that are conservative over (fragments of) first-order arithmetic.
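The ordinal ε0 in Gentzen's theorem is the least fixed point of base-ω exponentiation, and his result is often summarized as follows (PRA is primitive recursive arithmetic; TI(ε0) is quantifier-free transfinite induction up to ε0):

```latex
\varepsilon_0 \;=\; \min\{\alpha \mid \omega^{\alpha} = \alpha\}
            \;=\; \sup\{\omega,\ \omega^{\omega},\ \omega^{\omega^{\omega}},\ \ldots\}
\qquad\text{and}\qquad
\mathrm{PRA} + \mathrm{TI}(\varepsilon_0) \vdash \mathrm{Con}(\mathrm{PA}).
```

By Gödel's second theorem, TI(ε0) is therefore not provable in PA itself, which is why ε0 serves as a measure of PA's proof-theoretic strength.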


2007 ◽  
Vol 72 (2) ◽  
pp. 625-648 ◽  
Author(s):  
Masanao Ozawa

Abstract. In 1981, Takeuti introduced quantum set theory as the quantum counterpart of Boolean-valued models of set theory by constructing a model of set theory based on the quantum logic represented by the lattice of closed subspaces of a Hilbert space, and showed that appropriate quantum counterparts of the ZFC axioms hold in the model. Here, Takeuti's formulation is extended to construct a model of set theory based on the logic represented by the lattice of projections in an arbitrary von Neumann algebra. A transfer principle is established that enables us to transfer theorems of ZFC to their quantum counterparts holding in the model. The set of real numbers in the model is shown to be in one-to-one correspondence with the set of self-adjoint operators affiliated with the von Neumann algebra generated by the logic. Despite the difficulty, pointed out by Takeuti, that the equality axioms do not generally hold in quantum set theory, it is shown that the equality axioms hold for any real numbers in the model. It is also shown that any observational proposition in quantum mechanics can be represented by a corresponding statement for real numbers in the model, with truth value consistent with the standard formulation of quantum mechanics, and that the equality relation between two real numbers in the model is equivalent to the notion of perfect correlation between the corresponding observables (self-adjoint operators) in quantum mechanics. The paper concludes with some remarks on the relevance to quantum set theory of the choice of the implication connective in quantum logic.
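What separates the lattice of projections from the Boolean algebras underlying ordinary Boolean-valued models is the failure of distributivity, which already occurs in two dimensions. A minimal pure-Python sketch (rank-one projections on R², represented as 2×2 matrices; the helper names are illustrative, not taken from the paper):

```python
def proj(x, y):
    """Rank-one projection onto the line spanned by the vector (x, y)."""
    n = x * x + y * y
    return [[x * x / n, x * y / n],
            [x * y / n, y * y / n]]

I = [[1.0, 0.0], [0.0, 1.0]]   # identity: projection onto all of R^2
O = [[0.0, 0.0], [0.0, 0.0]]   # zero: projection onto {0}

def eq(p, q, tol=1e-9):
    return all(abs(p[i][j] - q[i][j]) < tol for i in range(2) for j in range(2))

def rank(p):
    return round(p[0][0] + p[1][1])   # the trace of a projection is its rank

def join(p, q):
    """Projection onto the span of the two ranges; in R^2 only a few cases arise."""
    if eq(p, q):
        return p
    if eq(p, O):
        return q
    if eq(q, O):
        return p
    if rank(p) == 2 or rank(q) == 2:
        return I
    return I   # two distinct lines together span the plane

def comp(p):
    """Orthocomplement P' = I - P."""
    return [[I[i][j] - p[i][j] for j in range(2)] for i in range(2)]

def meet(p, q):
    """De Morgan: P meet Q = (P' join Q')'."""
    return comp(join(comp(p), comp(q)))

a, b, c = proj(1.0, 0.0), proj(0.0, 1.0), proj(1.0, 1.0)
lhs = meet(a, join(b, c))             # a ^ (b v c) = a ^ I = a
rhs = join(meet(a, b), meet(a, c))    # (a ^ b) v (a ^ c) = 0 v 0 = 0
assert eq(lhs, a) and eq(rhs, O) and not eq(lhs, rhs)
```

The non-distributivity exhibited here is the source of the difficulties with the equality axioms and the implication connective mentioned in the abstract.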


2001 ◽  
Vol 66 (3) ◽  
pp. 1321-1341 ◽  
Author(s):  
P. V. Andreev ◽  
E. I. Gordon

Abstract. We present an axiomatic framework for nonstandard analysis: Nonstandard Class Theory (NCT), which extends von Neumann–Gödel–Bernays set theory (NBG) by adding a unary predicate symbol St to the language of NBG (St(X) means that the class X is standard), together with axioms relating St to the rest of the theory: analogs of Nelson's idealization, standardization, and transfer principles. These principles are formulated as axioms, rather than axiom schemes, so that NCT is finitely axiomatizable. NCT can be considered as a theory of definable classes of the Bounded Set Theory of V. Kanovei and M. Reeken. In many respects NCT resembles the Alternative Set Theory of P. Vopenka. For example, there exist semisets (proper subclasses of sets) in NCT, and it can be proved that a set has a standard finite cardinality iff it does not contain any proper subsemiset. Semisets can be considered as external classes in NCT; thus the saturation principle can be formalized in NCT.
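For orientation, Nelson's three principles in their original scheme form from IST are sketched below (st abbreviates "standard"; the superscript "st fin" restricts to standard finite sets; NCT's contribution is to replace such schemes by single axioms about classes):

```latex
% Transfer (phi internal, with standard parameters t_1, ..., t_k):
\forall^{\mathrm{st}} t_1 \cdots \forall^{\mathrm{st}} t_k\,
  \bigl( \forall^{\mathrm{st}} x\, \varphi \;\rightarrow\; \forall x\, \varphi \bigr)

% Idealization (phi internal):
\forall^{\mathrm{st\,fin}} z\, \exists x\, \forall y \in z\, \varphi
  \;\leftrightarrow\; \exists x\, \forall^{\mathrm{st}} y\, \varphi

% Standardization (phi arbitrary):
\forall^{\mathrm{st}} x\, \exists^{\mathrm{st}} y\, \forall^{\mathrm{st}} t\,
  \bigl( t \in y \;\leftrightarrow\; t \in x \wedge \varphi \bigr)
```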


1995 ◽  
Vol 1 (4) ◽  
pp. 393-407 ◽  
Author(s):  
Ronald Jensen

In this paper, we sketch the development of two important themes of modern set theory, both of which can be regarded as growing out of work of Kurt Gödel. We begin with a review of some basic concepts and conventions of set theory. §0. The ordinal numbers were Georg Cantor's deepest contribution to mathematics. After the natural numbers 0, 1, …, n, … comes the first infinite ordinal number ω, followed by ω + 1, ω + 2, …, ω + ω, … and so forth. ω is the first limit ordinal, as it is neither 0 nor a successor ordinal. We follow the von Neumann convention, according to which each ordinal number α is identified with the set {ν ∣ ν ∈ α} of its predecessors. The ∈ relation on ordinals thus coincides with <. We have 0 = ∅ and α + 1 = α ∪ {α}. According to the usual set-theoretic conventions, ω is identified with the first infinite cardinal ℵ0, and similarly for the first uncountable ordinal number ω1 and the first uncountable cardinal number ℵ1, etc. The von Neumann hierarchy divides the class V of all sets into a hierarchy of sets Vα indexed by the ordinal numbers. The recursive definition reads: V0 = ∅; Vα+1 = P(Vα), where P(x) is the power set of x; and Vλ = ∪ν<λVν for limit ordinals λ.
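The von Neumann convention is easy to make concrete for finite ordinals; a small sketch using frozensets (the function name is illustrative):

```python
def ordinal(n):
    """Return the von Neumann ordinal n: 0 = the empty set, a + 1 = a U {a}."""
    alpha = frozenset()
    for _ in range(n):
        alpha = alpha | {alpha}   # successor step: adjoin the ordinal to itself
    return alpha

# Each ordinal is exactly the set of its predecessors, so membership
# coincides with the order relation <:
assert ordinal(0) == frozenset()
assert ordinal(2) in ordinal(3)        # 2 < 3
assert ordinal(3) not in ordinal(3)    # no ordinal is a member of itself
assert len(ordinal(5)) == 5            # a finite ordinal has itself as cardinality
```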


1990 ◽  
Vol 55 (2) ◽  
pp. 707-732 ◽  
Author(s):  
Arnon Avron

In this work we describe a new approach to the notions of relevance and paraconsistency. Unlike the works of Anderson and Belnap or da Costa (see [2], [8] and [7]), we shall mainly be guided by semantical intuitions. In the first two sections we introduce and investigate the algebraic structures that reflect those intuitions. The corresponding formal systems are briefly described in the third section (a more detailed treatment of these systems, including full proofs, will be given in another paper). Our basic intuitive idea is that of “domains of discourse” or “relevance domains”. Classical logic, so we think, is valid in as much as sentences get values inside one domain; limitations on its use can be imposed only with respect to inferences in which more than one domain is involved. There are two basic binary relations over the collection of domains. One is relevance. It is reflexive and symmetric (but not necessarily transitive). Under a given interpretation, two sentences are relevant to each other when their values are in relevant domains. Another basic relation between domains, no less important, is that of grading according to “degrees of reality”. The idea behind it is not new. Gentzen, for example, divided the world of mathematics in [9] into three grades, representing three “levels of reality”: the elementary theory of numbers has the highest level of reality, set theory has the lowest, and mathematical analysis occupies the intermediate level. In the theory of types, or in the cumulative von Neumann universe for set theory, we can find indications of a richer hierarchy.
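The required profile of the relevance relation, reflexive and symmetric but not necessarily transitive, can be checked on a toy collection of domains (the domain names and relation are invented for illustration, echoing Gentzen's three grades):

```python
domains = {"arithmetic", "analysis", "set_theory"}

# A toy relevance relation: arithmetic is relevant to analysis, and analysis
# to set_theory, but arithmetic and set_theory are not directly relevant.
relevant = {(d, d) for d in domains}   # reflexivity: every domain to itself
relevant |= {("arithmetic", "analysis"), ("analysis", "arithmetic"),
             ("analysis", "set_theory"), ("set_theory", "analysis")}

reflexive = all((d, d) in relevant for d in domains)
symmetric = all((b, a) in relevant for (a, b) in relevant)
transitive = all((a, c) in relevant
                 for (a, b) in relevant
                 for (b2, c) in relevant if b == b2)

assert reflexive and symmetric and not transitive
```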


1978 ◽  
Vol 43 (3) ◽  
pp. 613-613 ◽  
Author(s):  
Stephen C. Kleene

Gödel has called to my attention that p. 773 is misleading in regard to the discovery of the finite axiomatization and its place in his proof of the consistency of GCH. For the version in [1940], as he says on p. 1, “The system Σ of axioms for set theory which we adopt [a finite one] … is essentially due to P. Bernays …”. However, it is not at all necessary to use a finite axiom system. Gödel considers the more suggestive proof to be the one in [1939], which uses infinitely many axioms. His main achievement regarding the consistency of GCH, he says, really is that he first introduced the concept of constructible sets into set theory, defining it as in [1939], proved that the axioms of set theory (including the axiom of choice) hold for it, and conjectured that the continuum hypothesis would also hold. He told these things to von Neumann during his stay at Princeton in 1935. The discovery of the proof of this conjecture on the basis of his definition is not too difficult. Gödel did not give the proof (also for GCH) until three years later, because he had fallen ill in the meantime. This proof used a submodel of the constructible sets, in the lowest case countable, similar to the one commonly given today.

