Building up a toolbox for Martin-Löf’s type theory: subset theory

Author(s):
Giovanni Sambin, Silvio Valentini

Beginning in 1970, Per Martin-Löf has developed an intuitionistic type theory (henceforth type theory for short) as a constructive alternative to the usual foundation of mathematics based on classical set theory. We assume the reader is aware at least of the main peculiarities of type theory, as formulated in Martin-Löf 1984 or Nordström et al. 1990; here we recall some of them to be able to introduce our point of view. The form of type theory is that of a logical calculus, where inference rules to derive judgements are at the same time set-theoretic constructions, because of the “propositions-as-sets” interpretation. The spirit of type theory—expressing our interpretation in a single sentence—is to adopt those notions and rules which keep total control of the amount of information contained in the different forms of judgement. We now briefly justify this claim. First of all, the judgement asserting the truth of a proposition A, which from an intuitionistic point of view means the existence of a verification of A, in type theory is replaced by the judgement a ∈ A, which explicitly exhibits a verification a of A. In fact, it would be unwise, for a constructivist, to throw away the specific verification of A which must be known to be able to assert the existence of a verification! The judgement that A is a set, which from an intuitionistic point of view means that there exists an inductive presentation of A, is treated in type theory in a quite similar way (even if in this case no notation analogous to a ∈ A is used), since the judgement A set in type theory becomes explicit knowledge of the specific inductive presentation of A. In fact, the rules for primitive types and for type constructors are so devised that whenever a judgement A set is proved, it means that one also has complete information on the rules which describe how canonical elements of A are formed. Such a property, which might look like a peculiarity of type theory, is as a matter of fact necessary in order to give a coherent constructive treatment of quantifiers.
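
Since the abstract assumes this background, a small illustration may help fix ideas. The following Lean 4 fragment (our own sketch, not Martin-Löf's notation) shows the propositions-as-sets reading at work: judging a proposition true is exhibiting an explicit verification a : A.

```lean
-- A minimal Lean 4 illustration (not Martin-Löf's own formalism) of the
-- "propositions-as-sets" reading: a proposition is a type, and asserting
-- its truth means exhibiting an explicit verification a : A.

-- The proposition "A and B implies B and A" is a type; its proof is a term.
def swap {A B : Prop} : A ∧ B → B ∧ A :=
  fun h => ⟨h.2, h.1⟩

-- An existential judgement carries its witness: the pair ⟨2, rfl⟩ is
-- complete information, not a bare assertion that some witness exists.
def evenFour : ∃ n : Nat, n + n = 4 := ⟨2, rfl⟩
```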

Author(s):  
Silvio Valentini

The aim of this paper is to give a simple but instructive example of the forget-restore principle, conceived by Giovanni Sambin as a discipline for the constructive development of mathematics, which first appeared in print in the introduction of Sambin and Valentini 1998. The best way to explain such a philosophical position is to quote from that paper: “To build up an abstract concept from a raw flow of data, one must disregard inessential details ... this is obtained by forgetting some information. To forget information is the same as to destroy something, in particular if there is no possibility of restoring that information ... our principle is that an abstraction is constructive ... when information ... is forgotten in such a way that it can be restored at will in any moment.” The example we want to show here refers to Martin-Löf’s intuitionistic type theory (just type theory from now on). We assume knowledge of the main peculiarities of type theory, as formulated in Martin-Löf 1984 or Nordström et al. 1990. Type theory is a logical calculus which adopts those notions and rules which keep total control of the amount of information contained in the different forms of judgement. However, type theory offers a way of “forgetting” information: that is, supposing A set, the form of judgement A true. The meaning of A true is that there exists an element a such that a ∈ A, but it does not matter which particular element a is (see also the notion of proof irrelevance in de Bruijn 1980). Thus to pass from the judgement a ∈ A to the judgement A true is a clear example of the forgetting process. We will show that it is a constructive way to forget since, provided that there is a proof of the judgement A true, an element a such that a ∈ A can be reconstructed.
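
To make the forgetting step concrete, here is a hedged Lean 4 sketch in which `Nonempty A` plays the role of the judgement A true: it records that some element exists while hiding which one. Inside Lean the witness can only be recovered through the noncomputable axiom `Classical.choice`; Valentini's point is that in type theory proper a derivation of A true is itself built from a derivation of a ∈ A, so the witness can always be read back off the derivation, keeping the step constructive.

```lean
-- Forgetting: from an explicit witness a ∈ A we may always pass to "A true",
-- modelled here by Nonempty A, which hides the particular element.
def forget {A : Type} (a : A) : Nonempty A := ⟨a⟩

-- Restoring: within Lean the witness is only recoverable noncomputably.
-- In the paper's setting the restore step is constructive, because the
-- derivation of "A true" contains a derivation of a ∈ A to read back.
noncomputable def restore {A : Type} (h : Nonempty A) : A :=
  Classical.choice h
```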


1991, Vol 56 (3), pp. 505-559
Author(s):  
Karel Eckschlager

In this review, analysis is treated as a process of gaining information on chemical composition that takes place in a stochastic system. A model of this system is outlined, and a survey of the measures and methods of information theory is presented to the extent that they are useful for qualitative (identification), quantitative, trace, and multicomponent analysis. A distinction is drawn between the information content of an analytical signal and the information gain, or amount of information, obtained by the analysis, and their interrelation is demonstrated. Some notions of analytical chemistry are quantified from the point of view of information theory and systems theory; it is also demonstrated that fuzzy set theory can be usefully applied. The review sums up the principal results of a series of 25 papers published in this journal since 1971.
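
Eckschlager's own formulas are not reproduced in this abstract, so the following Python sketch uses standard Shannon measures only to illustrate the distinction between the uncertainty before an analysis and the information gain the analysis delivers; the distributions and numbers are invented for illustration.

```python
# Illustrative sketch (standard Shannon measures, not Eckschlager's exact
# formulas) of prior uncertainty vs. information gain of an analysis.
import math

def entropy(p):
    """Shannon entropy H(p) in bits: uncertainty before the analysis."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def information_gain(prior, posterior):
    """Kullback-Leibler divergence D(posterior || prior) in bits: the
    amount of information the analytical result supplies."""
    return sum(q * math.log2(q / p)
               for p, q in zip(prior, posterior) if q > 0)

# Example: four equally plausible identities for an analyte; the measured
# signal all but settles on the first one.
prior = [0.25, 0.25, 0.25, 0.25]
posterior = [0.94, 0.02, 0.02, 0.02]
print(entropy(prior))                      # 2.0 bits of initial uncertainty
print(information_gain(prior, posterior))  # ~1.6 bits gained by the analysis
```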


Author(s):  
Cesare Gallozzi

We introduce a family of (k, h)-interpretations for 2 ≤ k ≤ ∞ and 1 ≤ h ≤ ∞ of constructive set theory into type theory, in which sets and formulas are interpreted as types of homotopy level k and h, respectively. Depending on the values of the parameters k and h, we are able to interpret different theories, such as Aczel’s CZF and Myhill’s CST. We also define a proposition-as-hproposition interpretation in the context of logic-enriched type theories. The rest of the paper is devoted to characterising and analysing the interpretations considered. The formulas valid in the prop-as-hprop interpretation are characterised in terms of the axiom of unique choice. We also analyse the interpretations of CST into homotopy type theory, providing a comparative analysis with Aczel’s interpretation. This is done by formulating in a logic-enriched type theory the key principles used in the proofs of the two interpretations. Finally, we characterise a class of sentences valid in the (k, ∞)-interpretations in terms of the ΠΣ axiom of choice.
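
For readers unfamiliar with homotopy levels, the following Lean 4 sketch spells out the hierarchy the parameters k and h range over. The definitions are our own hypothetical rendering: `Path` stands in for HoTT's identity type, since Lean's built-in equality lives in Prop, which would trivialise the hierarchy.

```lean
-- Hypothetical stand-in for HoTT's identity type.
inductive Path {A : Type} : A → A → Type where
  | refl (a : A) : Path a a

-- Level 0: contractibility (a center a connected by a path to every b).
def IsContr (A : Type) : Type :=
  Σ' a : A, ∀ b : A, Path a b

-- Level n+1: every identity type lies at level n.  h-propositions sit at
-- level 1 and h-sets at level 2, matching the paper's parameters h and k.
def IsOfHLevel : Nat → Type → Type
  | 0,     A => IsContr A
  | n + 1, A => ∀ a b : A, IsOfHLevel n (Path a b)

-- At level 1 any two elements are connected by a path, i.e. A behaves as
-- a proof-irrelevant proposition:
example (A : Type) (h : IsOfHLevel 1 A) (a b : A) : Path a b :=
  (h a b).1
```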


2021, pp. 1-24
Author(s):
Lijun Chen, Damei Luo, Pei Wang, Zhaowen Li, Ningxin Xie

An approximation space (A-space) is the basis of rough set theory, and a fuzzy approximation space (FA-space) can be seen as an A-space under a fuzzy environment. A fuzzy probability approximation space (FPA-space) is obtained by putting a probability distribution into an FA-space; in this way, it combines three types of uncertainty (i.e., fuzziness, probability, and roughness). This article is devoted to measuring the uncertainty of an FPA-space. A fuzzy relation matrix is first obtained by introducing the probability distribution into a given fuzzy relation matrix, and on this basis an FA-space is expanded to an FPA-space. Then, granularity measurement for an FPA-space is investigated. Next, information entropy measurement and rough entropy measurement for an FPA-space are proposed. Moreover, the information amount in an FPA-space is considered. Finally, a numerical example is given to verify the feasibility of the proposed measures, and an effectiveness analysis is carried out from a statistical point of view. Since three important theories (i.e., fuzzy set theory, probability theory, and rough set theory) are clustered in an FPA-space, the obtained results may be useful for dealing with practical problems involving these kinds of uncertainty.
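
The paper's exact measures are not reproduced in this abstract, so the following Python sketch only illustrates the ingredients of an FPA-space and one plausible entropy-style measure built from them; the universe, relation matrix, distribution, and formula are all invented for illustration.

```python
# Illustrative FPA-space: a universe U, a fuzzy relation matrix R whose
# entry R[i][j] is the degree to which x_i and x_j are related, and a
# probability distribution P over U.  The entropy formula below is one
# common style of definition, not necessarily the paper's measure.
import math

U = ["x1", "x2", "x3"]
R = [[1.0, 0.7, 0.2],   # fuzzy relation matrix (reflexive, symmetric)
     [0.7, 1.0, 0.4],
     [0.2, 0.4, 1.0]]
P = [0.5, 0.3, 0.2]     # probability distribution over U

def neighborhood_mass(i):
    """Expected size of x_i's fuzzy neighborhood under P."""
    return sum(P[j] * R[i][j] for j in range(len(U)))

def information_entropy():
    """Entropy-style uncertainty measure of the FPA-space: smaller fuzzy
    neighborhoods (finer granularity) give higher entropy."""
    return -sum(P[i] * math.log2(neighborhood_mass(i)) for i in range(len(U)))

print([round(neighborhood_mass(i), 3) for i in range(len(U))])
print(round(information_entropy(), 3))
```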


Author(s):  
Lev D. Lamberov

In recent decades, some epistemological issues have become especially acute in mathematics. These issues are associated with the very long proofs of various important mathematical results, as well as with the large and constantly increasing number of publications in mathematics. It is assumed that these difficulties can be resolved, at least in part, by appeal to computer proofs. However, computer proofs also turn out to be problematic from an epistemological point of view. With regard both to proofs in ordinary (informal) mathematics and to computer proofs, the problem of their surveyability appears to be fundamental. On the traditional conception, a proof must be surveyable; otherwise it will not achieve its main goal, the formation of conviction in the correctness of the mathematical result being proved. About 15 years ago, a new approach to the foundations of mathematics began to develop, combining constructivist and structuralist features with a number of advantages of the classical approach to mathematics. This approach is built on the basis of homotopy type theory and is called the univalent foundations of mathematics. Due to its powerful notion of equality, this approach can significantly reduce the length of formalized proofs, which outlines a way to resolve the epistemological difficulties that have arisen.
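
The "powerful notion of equality" referred to here is the univalence axiom; in its standard formulation (HoTT book notation):

```latex
% Univalence: the canonical map from identifications of types to
% equivalences of types is itself an equivalence, so identifying
% equal types with equivalent ones is justified:
\[
  (A =_{\mathcal{U}} B) \;\simeq\; (A \simeq B)
\]
```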


1971, Vol 36 (3), pp. 414-432
Author(s):  
Peter B. Andrews

In [8] J. A. Robinson introduced a complete refutation procedure called resolution for first order predicate calculus. Resolution is based on ideas in Herbrand's Theorem, and provides a very convenient framework in which to search for a proof of a wff believed to be a theorem. Moreover, it has proved possible to formulate many refinements of resolution which are still complete but are more efficient, at least in many contexts. However, when efficiency is a prime consideration, the restriction to first order logic is unfortunate, since many statements of mathematics (and other disciplines) can be expressed more simply and naturally in higher order logic than in first order logic. Also, the fact that in higher order logic (as in many-sorted first order logic) there is an explicit syntactic distinction between expressions which denote different types of intuitive objects is of great value where matching is involved, since one is automatically prevented from trying to make certain inappropriate matches. (One may contrast this with the situation in which mathematical statements are expressed in the symbolism of axiomatic set theory.)
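
To make the mechanism concrete, here is a minimal propositional sketch of the resolution rule (our own illustration; Robinson's first-order procedure adds unification, which is exactly where the typing constraints discussed above pay off by ruling out inappropriate matches).

```python
# Propositional resolution: clauses are frozensets of literals, and a
# literal is a (name, polarity) pair.  Resolving two clauses on a
# complementary pair of literals yields their union minus that pair.

def resolvents(c1, c2):
    """All clauses obtainable by resolving c1 with c2 on one
    complementary pair of literals."""
    out = []
    for (name, pol) in c1:
        if (name, not pol) in c2:
            resolvent = (c1 - {(name, pol)}) | (c2 - {(name, not pol)})
            out.append(frozenset(resolvent))
    return out

# Refuting {P, Q}, {~P, Q}, {~Q}: resolving the first two on P gives {Q},
# which resolves with {~Q} to the empty clause, i.e. a contradiction.
c1 = frozenset({("P", True), ("Q", True)})
c2 = frozenset({("P", False), ("Q", True)})
c3 = frozenset({("Q", False)})
step1 = resolvents(c1, c2)[0]      # frozenset({('Q', True)})
step2 = resolvents(step1, c3)[0]   # frozenset() -- the empty clause
print(step1, step2)
```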


Author(s):  
B. K. Tripathy

Granular Computing has emerged as a framework in which information granules are represented and manipulated by intelligent systems; it forms a unified conceptual and computing platform. Rough set theory, put forth by Pawlak, is based upon a single equivalence relation taken at a time; from a granular computing point of view, it is therefore single-granular computing. In 2006, Qiang et al. introduced multigranular computing using rough sets; it came to be called optimistic multigranular rough sets after they introduced a second type, pessimistic multigranular rough sets, in 2010. Since then, several properties of multigranulations have been studied, and extensions of these basic notions of multigranular rough sets have been introduced. Some of these, the Neighborhood-Based Multigranular Rough Sets (NMGRS) and the Covering-Based Multigranular Rough Sets (CBMGRS), have been added recently. In this chapter, the authors discuss all these topics on multigranular computing and suggest some problems for further study.
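
As an illustration of the optimistic/pessimistic distinction, the following Python sketch uses the standard definitions from the multigranulation literature (the universe, partitions, and target set are invented): the optimistic lower approximation requires some relation's equivalence class to fit inside X, while the pessimistic one requires every relation's class to do so.

```python
# Multigranular lower approximations over several equivalence relations,
# each represented as a partition of the universe U.

def eq_class(partition, x):
    """Block of the partition containing x."""
    return next(b for b in partition if x in b)

def lower_approx(partitions, X, U, optimistic=True):
    """Optimistic: SOME relation's class of x lies inside X.
    Pessimistic: EVERY relation's class of x lies inside X."""
    combine = any if optimistic else all
    return {x for x in U
            if combine(eq_class(p, x) <= X for p in partitions)}

U = {1, 2, 3, 4, 5, 6}
P = [{1, 2}, {3, 4}, {5, 6}]     # granulation from attribute set P
Q = [{1, 3}, {2, 4}, {5}, {6}]   # granulation from attribute set Q
X = {1, 2, 3, 5}

print(lower_approx([P, Q], X, U, optimistic=True))   # {1, 2, 3, 5}
print(lower_approx([P, Q], X, U, optimistic=False))  # {1}
```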


Author(s):
Álvaro Quijano-Solís, Guadalupe Vega-Díaz

The purpose of this chapter is to describe how concepts and principles from the Systems Approach may be helpful in understanding and modeling the collaborative group cognitive processes involved in information handling in an academic library. In order to address complexity and dynamics, this chapter analyzes several theoretical positions which together may help to shape the academic library from a comprehensive and systemic point of view (the Systems Approach, Communities of Practice, Activity Theory, and the Viable System Model). The chapter suggests focusing on the activity (performed by a community) as the basic unit of analysis in studying the complexity of academic libraries. This activity is what allows the transmission of tacit and explicit knowledge, and of skills, from an expert to a novice. Other elements in the activity are objectives, rules and regulations, and, importantly, the learning processes that occur dialectically between subjects and community. A model such as Beer's, in the form the authors present it in this chapter, is well suited to decomposing reality and synthesizing it in order to analyze the complexity in question. This may make it possible to face organizational problems by focusing on the way people act to transform inputs into products and to add value to them through teaching and learning.

