A proof-irrelevant model of Martin-Löf's logical framework

2002 ◽  
Vol 12 (6) ◽  
pp. 771-795 ◽  
Author(s):  
DANIEL FRIDLENDER

We extend the proof-irrelevant model defined in Smith (1988) to the whole of Martin-Löf's logical framework. The main difference here is the existence of a type whose objects themselves represent types rather than proof-objects. This means that the model must now be able to distinguish between objects with different degrees of relevance: those that denote proofs are irrelevant, whereas those that denote types are not. In fact, a whole hierarchy of relevance exists. Another difference is the higher level of detail in the formulation of the formal theory, such as the explicit manipulation of contexts and substitutions. This demands an equally detailed definition of the model, including the interpretation of contexts and substitutions. We are thus led to a complete reformulation of the proof-irrelevant model. We present a model that is built up from an arbitrary model of the untyped lambda calculus. We also show how to extend it when the logical framework itself is enlarged with inductive definitions; in doing so, a variant of Church numerals is introduced. As in Smith (1988), the model can only be defined in the absence of universes, and it is useful for obtaining an elementary proof of consistency and for proving the independence of Peano's fourth axiom.
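For readers unfamiliar with Church numerals, the following Haskell sketch shows the textbook encoding of natural numbers as iterators, written with a rank-2 type for readability. The abstract only says that a variant of this encoding is introduced, so this is merely the standard version for orientation, not the paper's construction.

```haskell
{-# LANGUAGE RankNTypes #-}

-- Textbook Church numerals: a numeral is a function that iterates its
-- first argument n times over its second argument.
type Church = forall a. (a -> a) -> a -> a

zero :: Church
zero = \_ z -> z

suc :: Church -> Church
suc n = \s z -> s (n s z)

add :: Church -> Church -> Church
add m n = \s z -> m s (n s z)

-- Interpret a Church numeral as an ordinary Int.
toInt :: Church -> Int
toInt n = n (+ 1) 0

main :: IO ()
main = print (toInt (add (suc zero) (suc (suc zero))))  -- prints 3
```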

2016 ◽  
Vol 27 (8) ◽  
pp. 1364-1385
Author(s):  
ULRICH BERGER ◽  
TIE HOU

We give a realizability interpretation of an intuitionistic version of Church's Simple Theory of Types (CST), which can be viewed as a formalization of intuitionistic higher-order logic. Although they are definable in CST, we include operators for monotone induction and coinduction and provide simple realizers for them. Realizers are formally represented in an untyped lambda-calculus with pairing and a case-construct. The purpose of this interpretation is to provide a foundation for the extraction of verified programs from formal proofs as an alternative to type-theoretic systems. The advantages of our approach are that (a) induction and coinduction are not restricted to the strictly positive case, (b) abstract mathematical structures and results may be imported, and (c) the formalization is technically simpler than in other systems, for example regarding the definition of realizability, which is a simple syntactical substitution, and the treatment of nested and simultaneous (co)inductive definitions.
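As a rough illustration of the realizer language described above (an untyped lambda calculus with pairing and a case-construct), here is a minimal Haskell syntax tree. The constructor names and the two-branch case are assumptions of this sketch, not the paper's syntax.

```haskell
-- Syntax of an untyped lambda calculus with pairing and a case-construct.
data Term
  = Var String                               -- variables
  | Lam String Term                          -- lambda abstraction
  | App Term Term                            -- application
  | Pair Term Term                           -- pairing
  | Fst Term                                 -- first projection
  | Snd Term                                 -- second projection
  | InL Term                                 -- left tag
  | InR Term                                 -- right tag
  | Case Term (String, Term) (String, Term)  -- case on a tagged value
  deriving Show

-- Example realizer: swap the components of a pair.
swap :: Term
swap = Lam "p" (Pair (Snd (Var "p")) (Fst (Var "p")))
```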


Author(s):  
Juan de Lara ◽  
Esther Guerra

Modelling is an essential activity in software engineering. It typically involves two meta-levels: one includes meta-models that describe modelling languages, and the other contains models built by instantiating those meta-models. Multi-level modelling generalizes this approach by allowing models to span an arbitrary number of meta-levels. A scenario that profits from multi-level modelling is the definition of language families that can be specialized (e.g., for different domains) by successive refinements at subsequent meta-levels, hence promoting language reuse. This enables an open set of variability options given by all possible specializations of the language family. However, multi-level modelling lacks the ability to express closed variability regarding the availability of language primitives or the possibility to opt between alternative primitive realizations. This limits the reuse opportunities of a language family. To improve this situation, we propose a novel combination of product lines with multi-level modelling to cover both open and closed variability. Our proposal is backed by a formal theory that guarantees correctness, enables top-down and bottom-up language variability design, and is implemented atop the MetaDepth multi-level modelling tool.


1995 ◽  
Vol 06 (03) ◽  
pp. 203-234 ◽  
Author(s):  
YUKIYOSHI KAMEYAMA

This paper studies an extension of inductive definitions in the context of a type-free theory. It is a kind of simultaneous inductive definition of two predicates where the defining formulas are monotone with respect to the first predicate but not with respect to the second. We call this inductive definition half-monotone, by analogy with Allen's term half-positive. We can regard this definition as a variant of monotone inductive definitions by introducing a refined order between tuples of predicates. We give a general theory for half-monotone inductive definitions in a type-free first-order logic. We then give a realizability interpretation to our theory and prove its soundness by extending Tatsuta's technique. The mechanism of half-monotone inductive definitions is shown to be useful in interpreting many theories, including the Logical Theory of Constructions and Martin-Löf's Type Theory. We can also formalize the provability relation "a term p is a proof of a proposition P" in a natural way. As an application of this formalization, several techniques of program/proof improvement can be formalized in our theory, and we can make use of this fact to develop programs in the paradigm of Constructive Programming. A characteristic point of our approach is that we can extract an optimization program, since our theory enjoys the program extraction theorem.
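To fix intuitions about the ordinary monotone case that half-monotone definitions refine, here is a small Haskell sketch that computes the least fixed point of a monotone operator over a finite carrier by Kleene iteration. The refined order on pairs of predicates used for half-monotone definitions is not modelled here; this is only a baseline, with names of my own choosing.

```haskell
import qualified Data.Set as Set

-- Least fixed point of a monotone operator on finite sets, by Kleene
-- iteration: start from the empty set and apply the operator until
-- nothing changes.
lfp :: Ord a => (Set.Set a -> Set.Set a) -> Set.Set a
lfp f = go Set.empty
  where
    go s = let s' = f s in if s' == s then s else go s'

-- Example: the even numbers below 100, defined inductively by the clauses
-- "0 is even" and "n + 2 is even whenever n is even" (bounded to stay finite).
evens :: Set.Set Int
evens = lfp step
  where
    step s = Set.filter (< 100) (Set.insert 0 (Set.map (+ 2) s))

main :: IO ()
main = print (Set.toAscList evens)
```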


1988 ◽  
Vol 65 (5) ◽  
pp. 2261-2264 ◽  
Author(s):  
T. A. Wilson

Standard methods for describing the mechanical properties of a linear elastic system are applied to the two- and three-compartment models of the chest wall. The compliance matrix and the experiments required to determine the entries in this matrix and thereby to describe the mechanical properties of the relaxed chest wall are described. The effective forces exerted by external loads and muscle tension are defined. The formal theory is used to identify relations among variables. From the definition of effective force, it follows that the ratio of the forces exerted by the diaphragm on the rib cage and abdomen is the same as the ratio of the dependence of diaphragm length on rib cage and abdominal volumes. As an example of relations among variables that follow from the symmetry of the compliance matrix, it is shown that the change of gastric pressure caused by raising pleural pressure is related to the change in lung volume caused by changing stomach volume.
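For orientation, the reciprocity argument can be written out as follows; the symbols and the two-compartment labelling are my own, since the paper's notation is not reproduced in the abstract.

```latex
% Hedged sketch (symbols mine): a two-compartment compliance description and
% the symmetry that drives the reciprocal relation.
\[
\begin{pmatrix} \Delta V_{1} \\ \Delta V_{2} \end{pmatrix}
=
\begin{pmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{pmatrix}
\begin{pmatrix} \Delta P_{1} \\ \Delta P_{2} \end{pmatrix},
\qquad C_{12} = C_{21}.
\]
% Because the off-diagonal compliances are equal, the response of compartment 2
% to a load on compartment 1 mirrors the response of compartment 1 to a load on
% compartment 2; this is the kind of relation the abstract draws between gastric
% pressure, pleural pressure, lung volume and stomach volume.
```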


1976 ◽  
Vol 41 (1) ◽  
pp. 188-198 ◽  
Author(s):  
Douglas Cenzer

Monotone inductive definitions occur frequently throughout mathematical logic. The set of formulas in a given language and the set of consequences of a given axiom system are examples of (monotone) inductively defined sets. The class of Borel subsets of the continuum can be given by a monotone inductive definition. Kleene's inductive definition of recursion in a higher type functional (see [6]) is fundamental to modern recursion theory; we make use of it in §2. Inductive definitions over the natural numbers have been studied extensively, beginning with Spector [11]. We list some of the results of that study in §1 for comparison with our new results on inductive definitions over the continuum. Note that for our purposes the continuum is identified with the Baire space ω^ω. It is possible to obtain simple inductive definitions over the continuum by introducing real parameters into inductive definitions over N, as in the definition of recursion in [5]. This is itself an interesting concept and is discussed further in [4]. These parametric inductive definitions, however, are in general weaker than the unrestricted set of inductive definitions, as is indicated below. In this paper we outline, for several classes of monotone inductive definitions over the continuum, solutions to the following characterization problems: (1) What is the class of sets which may be given by such inductive definitions? (2) What is the class of ordinals which are the lengths of such inductive definitions? These questions are made more precise below. Most of the results of this paper were announced in [2].
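As background for questions (1) and (2), the standard notions can be sketched as follows (notation mine): a monotone operator Γ on subsets of the continuum is iterated in stages, and the length of the definition is the ordinal at which the stages stabilize.

```latex
% Notation mine. Stages of a monotone operator \Gamma on subsets of \omega^\omega:
\[
\Gamma^{0} = \emptyset, \qquad
\Gamma^{\alpha+1} = \Gamma(\Gamma^{\alpha}), \qquad
\Gamma^{\lambda} = \bigcup_{\alpha < \lambda} \Gamma^{\alpha} \quad \text{($\lambda$ limit)}.
\]
% Monotonicity gives \Gamma^{\alpha} \subseteq \Gamma^{\beta} for \alpha \le \beta,
% so the stages reach a least fixed point \Gamma^{\infty}; the least \alpha with
% \Gamma^{\alpha+1} = \Gamma^{\alpha} is the length of the inductive definition,
% the ordinal asked about in question (2).
```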


1992 ◽  
Vol 2 (2) ◽  
pp. 231-247 ◽  
Author(s):  
Kim B. Bruce ◽  
Roberto Di Cosmo ◽  
Giuseppe Longo

A constructive characterization is given of the isomorphisms which must hold in all models of the typed lambda calculus with surjective pairing. Using the close relation between Cartesian closed categories and models of these calculi, we also produce a characterization of those isomorphisms which hold in all CCCs. Using the correspondence between these calculi and proofs in intuitionistic positive propositional logic, we thus provide a characterization of the equivalent formulae of this logic, where two formulae are equivalent when there are "invertible" proofs between them. Work of Rittri (1989) on types as search keys in program libraries provides an interesting example of the use of these characterizations.
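For concreteness, the isomorphisms commonly cited as holding in all CCCs (equivalently, as provable type isomorphisms of the typed lambda calculus with surjective pairing and a unit type 1) are listed below. The list is reproduced from memory as a hedged sketch rather than quoted from the paper.

```latex
% Hedged sketch; 1 denotes the unit (terminal) type.
\begin{align*}
A \times B &\cong B \times A  &  A \times (B \times C) &\cong (A \times B) \times C \\
(A \times B) \to C &\cong A \to (B \to C)  &  A \to (B \times C) &\cong (A \to B) \times (A \to C) \\
A \times 1 &\cong A  &  A \to 1 &\cong 1 \\
1 \to A &\cong A
\end{align*}
% Read through Curry-Howard, these are the "invertible" equivalences of
% intuitionistic positive propositional logic mentioned in the abstract.
```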


Author(s):  
PIERRE-EVARISTE DAGAND

Functional programmers from all horizons strive to use, and sometimes abuse, their favorite type system in order to capture the invariants of their programs. A widely used tool in that trade consists in defining finely indexed datatypes. Operationally, these types classify the programmer's data, following the ML tradition. Logically, these types enforce the program invariants in a novel manner. This new programming pattern, by which one programs over inductive definitions to account for some invariants, led to the development of a theory of ornaments (McBride, 2011, Ornamental Algebras, Algebraic Ornaments, unpublished). However, ornaments originate as a dependently-typed object and may thus appear rather daunting to a functional programmer of the non-dependent kind. This article aims at presenting ornaments from first principles and, in particular, at decluttering their presentation from syntactic considerations. To do so, we shall give a sufficiently abstract model of indexed datatypes by means of many-sorted signatures. In this process, we formalize our intuition that an indexed datatype is the combination of a data-structure and a data-logic. Over this abstraction of datatypes, we shall recast the definition of ornaments, effectively giving a model of ornaments. Benefiting both from the operational and abstract nature of many-sorted signatures, ornaments should appear applicable and, one hopes, of interest beyond type-theoretic circles, a case in point being languages with generalized algebraic datatypes or refinement types.
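A minimal Haskell illustration of the data-structure-plus-data-logic intuition, not the paper's many-sorted-signature formalism: lists arise from the natural numbers by decorating each successor node with an element, and forgetting the decoration recovers the underlying number.

```haskell
-- Lists as an ornament of the natural numbers.
data Nat = Zero | Suc Nat
  deriving Show

data List a = Nil | Cons a (List a)
  deriving Show

-- The ornamental forgetful map: erase the decorations (i.e., take the length).
forget :: List a -> Nat
forget Nil         = Zero
forget (Cons _ xs) = Suc (forget xs)

main :: IO ()
main = print (forget (Cons 'a' (Cons 'b' Nil)))  -- Suc (Suc Zero)
```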


10.29007/hsbm ◽  
2018 ◽  
Author(s):  
Baudouin Le Charlier ◽  
Mêton Mêton Atindehou

We present a data structure to represent and manipulate large sets of (equal) terms (or expressions). Our initial and main motivation for this data structure is the simplification of expressions with respect to a formal theory, typically an equational one. However, the data structure also turns out to be efficient for computing the congruence closure of a relation over a set of terms. We provide an abstract definition of the data structure, including a precise semantics, and we explain how to implement it efficiently. We prove the correctness of the proposed algorithms, with a complexity analysis and experimental results. We compare these algorithms with previous algorithms for computing the congruence closure, and we also sketch how we use the data structure to tackle the expression simplification problem.
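As a hedged point of reference (not the authors' data structure), the following Haskell sketch shows the standard hash-consing idea for representing many terms compactly: terms are interned into a DAG so that structurally equal subterms share a single node.

```haskell
import qualified Data.Map.Strict as Map

type NodeId = Int
-- A node: a function symbol applied to already-interned children.
type Node   = (String, [NodeId])

-- The DAG: every distinct node gets exactly one identifier.
data Dag = Dag { table :: Map.Map Node NodeId
               , next  :: NodeId }
  deriving Show

emptyDag :: Dag
emptyDag = Dag Map.empty 0

-- Intern a node, reusing the existing id when the same node was seen before.
intern :: Node -> Dag -> (NodeId, Dag)
intern node dag@(Dag tbl n) =
  case Map.lookup node tbl of
    Just i  -> (i, dag)
    Nothing -> (n, Dag (Map.insert node n tbl) (n + 1))

-- Example: build f(a, a); both occurrences of 'a' share a single node.
main :: IO ()
main =
  let (a, d1) = intern ("a", []) emptyDag
      (f, d2) = intern ("f", [a, a]) d1
  in  print (f, d2)
```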


Author(s):  
M.-O. Löwner ◽  
G. Gröger ◽  
J. Benner ◽  
F. Biljecki ◽  
C. Nagel

The Open Geospatial Consortium (OGC) CityGML standard offers a Level of Detail (LoD) concept that enables the representation of CityGML features from a very detailed to a less detailed description. Due to the growing variety of applications, the current LoD concept seems too inflexible. Here, we present a multi-representation concept (MRC) that enables a user-defined definition of LoDs. Because CityGML is an international standard, official profiles of the MRC are proposed. However, encoding the defined profiles reveals many problems, including mapping the conceptual model to the normative encoding, missing technologies, and so on. Therefore, we propose to use the MRC as a meta-model for the further definition of an LoD concept for CityGML 3.0.


Author(s):  
Elena B. Agoshkova

Systems thinking is an important factor in solving global problems. The twentieth century has witnessed the development of a systems paradigm and different spheres of systems knowledge. However, further development of systems thinking necessitates overcoming the contradictions between different schools and unifying them into a single systems conception. With this in mind, systems problems are examined in light of the theory of knowledge. It is suggested that the gnosiological definition of the notion 'system' should be used as the basis for a single approach. An analysis of the concept 'system' leads to a logically well-structured conception of system. It follows that, in addition to the general theory of systems and systems science, a non-formal theory of the whole object and a non-formal systems logic should form part of systems thinking. This would set the stage for a categorical structure and a conceptual basis for systems thinking. The development of systems thinking should be regarded as the key challenge in perfecting humanity. The elaboration of a single systems conception within the philosophy of science and the methodology of scientific knowledge should be treated as a basis for meeting this challenge.

