About Symbolic Integration in the Course of Mathematical Analysis

2019 ◽  
pp. 94-106
Author(s):  
Mikhail D. Malykh ◽  
Anton L. Sevastianov ◽  
Leonid A. Sevastianov

The task of transforming a database from one format to another periodically arises in different organizations for various reasons. Today, the mechanisms for changing the format of relational databases are well developed. But with the advent of new database types such as NoSQL, the problem has been exacerbated by radical differences in how the data are organized. This article discusses a formalized method, based on set theory, for choosing the number and composition of collections for a key-value database. The initial data are the properties of the objects whose information is stored in the database, together with the set of queries that are executed most frequently or whose speed should be maximized. The method can be applied not only when creating a new key-value database, but also when transforming an existing one, when moving from relational databases to NoSQL, and when consolidating databases.

2019 ◽  
pp. 15-28
Author(s):  
Van Muon Ha ◽  
Yulia A. Shichkina ◽  
Sergey V. Kostichev

The task of transforming a database from one format to another periodically arises in different organizations for various reasons. Today, the mechanisms for changing the format of relational databases are well developed. However, with the advent of new database types, such as NoSQL, the problem has become acute because of the radically different ways data are organized across databases. This article discusses a formalized method, based on set theory, for choosing the number and composition of collections for a key-value database. The initial data are the properties of the objects whose information is stored in the database, together with the set of queries that are executed most frequently. The method can be applied not only when creating a new key-value database, but also when transforming an existing one, when moving from relational databases to NoSQL, and when consolidating databases.
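The collection-design idea can be sketched in code. The snippet below is an illustrative simplification, not the authors' exact procedure: each frequent query contributes its attribute set, overlapping sets are merged so that every frequent query can be answered from a single collection, and untouched attributes form a residual collection. All names and the greedy merging strategy are assumptions for illustration.

```python
# Illustrative sketch (not the article's exact algorithm): group object
# attributes into key-value collections so that each frequent query can
# be served from one collection. Overlapping query attribute sets are
# unioned, mirroring the set-theoretic formulation.

def design_collections(attributes, queries):
    """attributes: set of attribute names;
    queries: list of attribute sets read by the frequent queries."""
    collections = []
    for q in queries:
        merged = set(q) & attributes
        rest = []
        for c in collections:
            if c & merged:          # overlapping collections are unioned
                merged |= c
            else:
                rest.append(c)
        rest.append(merged)
        collections = rest
    # attributes untouched by any query form their own collection
    covered = set().union(*collections) if collections else set()
    leftover = attributes - covered
    if leftover:
        collections.append(leftover)
    return collections

attrs = {"id", "name", "email", "last_login", "avatar"}
workload = [{"id", "name"}, {"name", "email"}, {"last_login"}]
print(design_collections(attrs, workload))
# three collections: {id, name, email}, {last_login}, {avatar}
```

Merging on overlap is one possible policy; the article's criterion additionally weighs query frequency and speed, which a real implementation would have to encode.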


Axioms ◽  
2021 ◽  
Vol 10 (4) ◽  
pp. 263
Author(s):  
Yuri N. Lovyagin ◽  
Nikita Yu. Lovyagin

Standard elementary number theory is not a finitely axiomatized system, owing to the presence of the induction axiom scheme. The absence of a finite axiomatization is not an obstacle for most tasks, but it may be considered a flaw, since induction is strongly associated with a set theory external to the axiomatic system. A finite number of basic axioms and states is also important when logic is applied to artificial-intelligence problems. Axiomatic hyperrational analysis is the axiomatic system of the hyperrational number field. The properties of hyperrational numbers and functions allow them to be used to model the real numbers and functions of classical elementary mathematical analysis. However, hyperrational analysis rests on the well-known non-finite hyperarithmetic axiomatics. In this article we present a new finite first-order arithmetic theory designed to be the basis of axiomatic hyperrational analysis and, as a consequence, of mathematical analysis in general, as a foundation for all mathematical applications, including AI problems. It is shown that this axiomatics meets the requirements, i.e., it can serve as the basis of an axiomatic hyperrational analysis. The article in effect completes the foundation of axiomatic hyperrational analysis without calling in an arithmetic extension, since in the framework of the presented theory infinite numbers arise without invoking any new constants. The proposed system describes a class of numbers in which infinite numbers exist as natural objects of the theory itself, and no appeal is made to any “enveloping” set theory.
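As a hedged illustration of how infinite numbers can arise internally, without new constants or an enveloping set theory (the notation below is assumed, not quoted from the article): infiniteness is expressible by comparison with numerals, which are terms of the theory itself.

```latex
% Illustration (assumed notation): in a first-order theory with
% numerals \bar{k}, infiniteness is an internal property -- no
% enveloping set theory is needed to state it.
\begin{align*}
  &\text{Numerals: } \bar{k} \;=\; \underbrace{1+1+\dots+1}_{k\ \text{times}} \\
  &\Omega \text{ is \emph{infinite}} \iff \bar{k} < \Omega \text{ for every numeral } \bar{k} \\
  &\text{If } \Omega \text{ is infinite, then }
    0 < \frac{1}{\Omega} < \frac{1}{\bar{k}} \text{ for all } k,
    \text{ i.e., } \frac{1}{\Omega} \text{ is infinitesimal.}
\end{align*}
```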


Author(s):  
Nicolaas Govert de Bruijn

After millennia of mathematics we have reached a level of understanding that can be represented physically. Humankind has managed to disentangle the intricate mixture of language, metalanguage and interpretation, isolating a body of formal, abstract mathematics that can be completely verified by machines. Systems for computer-aided verification have philosophical aspects. The design and usage of such systems are influenced by the way we think about mathematics, but the influence also runs the other way. A number of aspects of this mutual influence will be discussed in this paper. In particular, attention will be given to philosophical aspects of type-theoretical systems. These definitely call for new attitudes: throughout the twentieth century most mathematicians had been trained to think in terms of untyped sets. The word “philosophy” will be used lightheartedly. It does not refer to serious professional philosophy, but just to meditation about the way one does one’s job. What used to be called philosophy of mathematics in the past was for a large part subject oriented. Most people characterized mathematics by its subject matter, classifying it as the science of space and number. From the verification system’s point of view, however, subject matter is irrelevant. Verification is concerned with the rules of mathematical reasoning, not with the subject. The picture may be a bit confused, however, by the fact that so many people consider set theory, in particular untyped set theory, as part of the language and foundation of mathematics, rather than as a particular subject treated by mathematics. The views expressed in this paper are quite personal, and can mainly be traced back to the author’s design of the Automath system in the late 1960s, where the way to look upon the meaning (philosophy) of mathematics was inspired by the usage of the verification system, and vice versa. See de Bruijn 1994b for various philosophical items concerning Automath, and Nederpelt et al. 1994, de Bruijn 1980, de Bruijn 1991a for general information about the Automath project. Some of the points of view given in this paper are matters of taste, but most of them were imposed by the task of letting a machine follow what we say, a machine without any knowledge of our mathematical culture and without any knowledge of physical laws.
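A minimal machine-checked proof can make concrete the kind of formal, abstract mathematics “completely verified by machines” discussed above. The snippet below is a hypothetical example in Lean 4, a modern descendant of the type-theoretical tradition Automath began (not Automath itself): under propositions-as-types, exhibiting a term of the stated type is the proof, and type checking is the verification.

```lean
-- Hypothetical minimal example (Lean 4, not Automath): the term is the
-- proof, and the type checker performs the verification.
theorem and_swap (p q : Prop) : p ∧ q → q ∧ p :=
  fun h => ⟨h.right, h.left⟩
```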


Author(s):  
Ludovic Liétard ◽  
Daniel Rocacher

This chapter is devoted to the evaluation of quantified statements, which can be found in many applications such as decision making, expert systems, and the flexible querying of relational databases using fuzzy set theory. Its contribution is to introduce the main techniques for evaluating such statements and to propose a new theoretical background for the evaluation of quantified statements of the types “Q X are A” and “Q B X are A.” In this context, quantified statements are interpreted using an arithmetic on gradual numbers from Nf, Zf, and Qf. It is shown that the framework of fuzzy numbers unifies previous approaches and can serve as the basis for defining new ones.
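For concreteness, a statement of the type “Q X are A” can be evaluated with Zadeh's classical sigma-count, a simpler baseline than the chapter's arithmetic on gradual numbers from Nf, Zf, and Qf. The quantifier shape and the membership degrees below are invented for illustration.

```python
# Baseline evaluation of "Q X are A" via Zadeh's sigma-count (not the
# chapter's gradual-number approach). Membership values are invented.

def sigma_count(memberships):
    # cardinality of a fuzzy set = sum of its membership degrees
    return sum(memberships)

def truth_of_Q_X_are_A(mu_A, mu_Q, n):
    """mu_A: degrees to which the n elements of X satisfy A;
    mu_Q: fuzzy quantifier mapping a proportion in [0,1] to a truth degree."""
    proportion = sigma_count(mu_A) / n
    return mu_Q(proportion)

def most(p):
    # "most": truth rises linearly as the proportion goes from 0.3 to 0.8
    return min(1.0, max(0.0, (p - 0.3) / 0.5))

mu_A = [1.0, 0.8, 0.6, 0.4, 0.2]   # degrees to which each item is A
print(truth_of_Q_X_are_A(mu_A, most, len(mu_A)))  # ~0.6 (proportion 0.6)
```

The sigma-count collapses a fuzzy cardinality to a single scalar, which is exactly the information loss the gradual-number framework is designed to avoid.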


2009 ◽  
pp. 1127-1150
Author(s):  
Theresa Beaubouef ◽  
Frederick E. Petry

This chapter discusses ways in which rough-set theory can enhance databases by allowing for the management of uncertainty. Because rough sets are a versatile theory, they can be integrated into an underlying database model, relational or object-oriented, and also used in the design and querying of databases. The authors discuss the rough relational database model, the rough object-oriented database model, and fuzzy-set and intuitionistic-set extensions to each of these models. Comparisons and benefits of the various approaches are discussed, illustrating the usefulness and versatility of rough sets for uncertainty management in databases.
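The core rough-set construction behind these models can be sketched as follows. This is a generic illustration with invented attribute names, not the authors' database model: tuples that agree on the chosen attributes are indiscernible and form equivalence classes, and a target set is bracketed between its lower (certain) and upper (possible) approximations.

```python
# Rough-set approximations over a toy table (generic illustration;
# attribute names are invented). Rows indiscernible on the chosen
# attributes form equivalence classes; a target set of row indices is
# bracketed between its lower and upper approximations.
from collections import defaultdict

def approximations(rows, attrs, target):
    classes = defaultdict(set)
    for i, row in enumerate(rows):
        classes[tuple(row[a] for a in attrs)].add(i)
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= target:
            lower |= cls    # class wholly inside the target: certain members
        if cls & target:
            upper |= cls    # class overlapping the target: possible members
    return lower, upper

rows = [
    {"dept": "sales", "city": "Oslo"},
    {"dept": "sales", "city": "Oslo"},
    {"dept": "hr",    "city": "Bergen"},
]
lower, upper = approximations(rows, ["dept"], target={0, 2})
print(lower, upper)   # lower = {2}, upper = {0, 1, 2}
```

The gap between the two approximations (here row 0 vs. rows 0 and 1) is precisely the uncertainty a rough database model manages.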


1999 ◽  
Vol 64 (4) ◽  
pp. 1601-1627 ◽  
Author(s):  
Kai Hauser

Abstract. For a canonical model of set theory whose projective theory of the real numbers is stable under set forcing extensions, a set of reals of minimal complexity is constructed which fails to be universally Baire. The construction uses a general method for generating non-universally Baire sets via the Levy collapse of a cardinal, as well as core model techniques. Along the way it is shown (extending previous results of Steel) how sufficiently iterable fine structure models recognize themselves as global core models.


Pragmatics ◽  
2006 ◽  
Vol 16 (1) ◽  
pp. 103-138 ◽  
Author(s):  
Pieter A.M. Seuren

This paper aims at an explanation of the discrepancies between natural intuitions and standard logic in terms of a distinction between NATURAL and CONSTRUCTED levels of cognition, applied to the way human cognition deals with sets. NATURAL SET THEORY (NST) restricts standard set theory, cutting it down to naturalness. The restrictions are then translated into a theory of natural logic. The predicate logic resulting from these restrictions turns out to be that proposed in Hamilton (1860) and Jespersen (1917). Since, in this logic, NO is a quantifier in its own right, different from NOT-SOME, and given the assumption that natural lexicalization processes occur at the level of basic naturalness, single-morpheme lexicalizations for NOT-ALL should not occur, just as there is no single-morpheme lexicalization for NOT-SOME at that level. An analogous argument is developed for the systematic absence of lexicalizations for NOT-AND in propositional logic.


Author(s):  
Renaud Chorlay

This article examines ways of expressing generality and epistemic configurations in which generality issues became intertwined with epistemological topics, such as rigor, or mathematical topics, such as point-set theory. In this regard, three very specific configurations are discussed: the first evolving from Niels Henrik Abel to Karl Weierstrass, the second in Joseph-Louis Lagrange’s treatises on analytic functions, and the third in Émile Borel. Using questions of generality, the article first compares two major treatises on function theory, one by Lagrange and one by Augustin Louis Cauchy. It then explores how some mathematicians adopted the sophisticated point-set-theoretic tools provided by the advocates of rigor to show that, in some way, Lagrange and Cauchy had been right all along. It also introduces the concept of embedded generality for capturing an approach to generality issues that is specific to mathematics.


Author(s):  
Ernan McMullin

Kepler’s mathematical analysis of Brahe’s observations of the motions of Mars enabled him to formulate the descriptive ‘laws’ of planetary motion, thus giving heliocentric astronomy an empirical basis far more accurate than it had before. He insisted that astronomy had to discover the causes of the motions that the laws described, in this way becoming a ‘physics of the sky’. In the pursuit of this goal, he formulated the notion of distance-dependent forces between sun and planet, and guessed that gravity could be explained as an attraction between heavy bodies and their home planets, analogous to magnetic action, thus pointing the way for Newton’s theory of gravity.
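The descriptive character of Kepler's laws is easy to verify numerically. The check below uses rounded modern values of orbital periods (in years) and semi-major axes (in astronomical units); it illustrates the third law, T² ∝ a³, rather than anything in Kepler's own computation.

```python
# Kepler's third (descriptive) law: T^2 is proportional to a^3.
# With T in years and a in astronomical units, T**2 / a**3 is ~1 for
# every planet. Orbital elements are rounded published values.

planets = {
    "Mercury": (0.2408, 0.3871),
    "Earth":   (1.0000, 1.0000),
    "Mars":    (1.8808, 1.5237),
    "Jupiter": (11.862, 5.2028),
}
for name, (T, a) in planets.items():
    print(f"{name:8s} T^2/a^3 = {T**2 / a**3:.4f}")
# every ratio comes out close to 1, as the law predicts
```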

