Higher-Order Types and Meta-Programming for Global Computing

2002 ◽  
Vol 62 ◽  
pp. 52-68 ◽  
Author(s):  
G. Ferrari ◽  
E. Moggi ◽  
R. Pugliese


2006 ◽  
Vol 16 (4-5) ◽  
pp. 375-414 ◽  
Author(s):  
MATTHIAS BLUME ◽  
DAVID McALLESTER

Even in statically typed languages it is useful to have certain invariants checked dynamically. Findler and Felleisen gave an algorithm for dynamically checking expressive higher-order types called contracts. They did not, however, give a semantics of contracts. The lack of a semantics makes it impossible to define and prove soundness and completeness of the checking algorithm. (Given a semantics, a sound checker never reports violations that do not exist under that semantics; a complete checker is – in principle – able to find violations when violations exist.) Ideally, a semantics should capture what programmers intuitively feel is the meaning of a contract or otherwise clearly point out where intuition does not match reality. In this paper we give an interpretation of contracts for which we prove the Findler-Felleisen algorithm sound and (under reasonable assumptions) complete. While our semantics mostly matches intuition, it also exposes a problem with predicate contracts where an arguably more intuitive interpretation than ours would render the checking algorithm unsound. In our semantics we have to make use of a notion of safety (which we define in the paper) to avoid unsoundness. We are able to eliminate the “leakage” of safety into the semantics by changing the language, replacing the original version of unrestricted predicate contracts with a restricted form. The corresponding loss in expressive power can be recovered by making safety explicit as a contract. This can be done either in an ad hoc fashion or by including general recursive contracts. The addition of recursive contracts has far-reaching implications, deeply affecting the formulation of our model and requiring different techniques for proving soundness.
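
To make the checking idea concrete, here is a minimal sketch, in Haskell rather than the paper's formal setting, of Findler-Felleisen style contract checking with blame: flat (predicate) contracts are checked immediately, while function contracts wrap the value and swap blame for the argument position. The `Contract` type, `assert`, and the party names are illustrative assumptions, not taken from the paper or any library.

```haskell
{-# LANGUAGE GADTs #-}
-- Illustrative names only: Contract, Party and assert are not from the paper.
module ContractSketch where

data Contract a where
  Flat :: (a -> Bool) -> Contract a                      -- predicate contract
  Func :: Contract a -> Contract b -> Contract (a -> b)  -- higher-order contract

type Party = String  -- which side is blamed when a check fails

-- Flat contracts are checked right away; function contracts are checked
-- lazily at every call, with blame flipped for the (contravariant) argument.
assert :: Party -> Party -> Contract a -> a -> a
assert pos _   (Flat p) v
  | p v       = v
  | otherwise = error ("contract violated, blame " ++ pos)
assert pos neg (Func dom rng) f =
  \x -> assert pos neg rng (f (assert neg pos dom x))

-- Example: Int -> Int restricted to non-negative inputs and outputs.
nonNeg :: Contract Int
nonNeg = Flat (>= 0)

checkedPred :: Int -> Int
checkedPred = assert "server" "client" (Func nonNeg nonNeg) (\n -> n - 1)
-- checkedPred 0    fails the range check and blames "server";
-- checkedPred (-1) fails the domain check and blames "client".
```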


1994 ◽  
Vol 4 (4) ◽  
pp. 435-477 ◽  
Author(s):  
Fritz Henglein ◽  
Harry G. Mairson

We analyse the computational complexity of type inference for untyped λ-terms in the second-order polymorphic typed λ-calculus (F2) invented by Girard and Reynolds, as well as higher-order extensions F3, F4, …, Fω proposed by Girard. We prove that recognising the F2-typable terms requires exponential time, and for Fω the problem is non-elementary. We show as well a sequence of lower bounds on recognising the Fk-typable terms, where the bound for Fk+1 is exponentially larger than that for Fk. The lower bounds are based on generic simulation of Turing Machines, where computation is simulated at the expression and type level simultaneously. Non-accepting computations are mapped to non-normalising reduction sequences, and hence non-typable terms. The accepting computations are mapped to typable terms, where higher-order types encode reduction sequences, and first-order types encode the entire computation as a circuit, based on a unification simulation of Boolean logic. A primary technical tool in this reduction is the composition of polymorphic functions having different domains and ranges. These results are the first nontrivial lower bounds on type inference for the Girard/Reynolds system as well as its higher-order extensions. We hope that the analysis provides important combinatorial insights which will prove useful in the ultimate resolution of the complexity of the type inference problem.
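
The results concern typability in the second-order system itself, where inference must discover polymorphic instantiations that ML-style (rank-1) inference cannot. The toy Haskell fragment below is purely illustrative, not the paper's encoding: without the explicit rank-2 annotation, inference assigns the argument a single monomorphic type and rejects the body.

```haskell
{-# LANGUAGE RankNTypes #-}
module RankExample where

-- Without the rank-2 signature, Hindley-Milner style inference would give
-- the argument one monomorphic type and reject the body, because `f`
-- is used at both Int and Bool.
useTwice :: (forall a. a -> a) -> (Int, Bool)
useTwice f = (f 0, f True)

-- Instantiating a polymorphic function at two different types:
example :: (Int, Bool)
example = useTwice id
```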


2013 ◽  
Vol 23 (6) ◽  
pp. 658-700
Author(s):  
MATTHEW R. LAKIN ◽  
ANDREW M. PITTS

Correct handling of names and binders is an important issue in meta-programming. This paper presents an embedding of constraint logic programming into the αML functional programming language, which provides a provably correct means of implementing proof search computations over inductive definitions involving names and binders modulo α-equivalence. We show that the execution of proof search in the αML operational semantics is sound and complete with regard to the model-theoretic semantics of formulae, and develop a theory of contextual equivalence for the subclass of αML expressions which correspond to inductive definitions and formulae. In particular, we prove that αML expressions which denote inductive definitions are contextually equivalent precisely when those inductive definitions have the same model-theoretic semantics. This paper is a revised and extended version of the conference paper (Lakin, M. R. & Pitts, A. M. (2009) Resolving inductive definitions with binders in higher-order typed functional programming. In Proceedings of the 18th European Symposium on Programming (ESOP 2009), Castagna, G. (ed), Lecture Notes in Computer Science, vol. 5502. Berlin, Germany: Springer-Verlag, pp. 47–61) and draws on material from the first author's PhD thesis (Lakin, M. R. (2010) An Executable Meta-Language for Inductive Definitions with Binders. University of Cambridge, UK).
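
As a concrete point of reference for "modulo α-equivalence", the sketch below (plain Haskell, not αML, with an illustrative term type) checks α-equivalence of terms with binders by tracking the binding depth at which each bound name was introduced.

```haskell
-- Illustrative term type; not the αML language itself.
module AlphaSketch where

import qualified Data.Map as Map

data Term
  = Var String
  | App Term Term
  | Lam String Term          -- Lam x body binds x in body
  deriving Show

-- Two terms are α-equivalent when they differ only in bound names.
-- We record, for each side, the depth at which each name was bound.
alphaEq :: Term -> Term -> Bool
alphaEq = go 0 Map.empty Map.empty
  where
    go _ envL envR (Var x) (Var y) =
      case (Map.lookup x envL, Map.lookup y envR) of
        (Just i, Just j)   -> i == j   -- both bound: same binder depth
        (Nothing, Nothing) -> x == y   -- both free: same name
        _                  -> False
    go d envL envR (App f a) (App g b) =
      go d envL envR f g && go d envL envR a b
    go d envL envR (Lam x s) (Lam y t) =
      go (d + 1) (Map.insert x d envL) (Map.insert y d envR) s t
    go _ _ _ _ _ = False

-- alphaEq (Lam "x" (Var "x")) (Lam "y" (Var "y"))  ==> True
-- alphaEq (Lam "x" (Var "z")) (Lam "y" (Var "y"))  ==> False
```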


Author(s):  
Dale Miller ◽  
Gopalan Nadathur
Keyword(s):  

2005 ◽  
Vol 5 (3) ◽  
pp. 305-354 ◽  
Author(s):  
GOPALAN NADATHUR

The logic programming paradigm provides the basis for a new intensional view of higher-order notions. This view is realized primarily by employing the terms of a typed lambda calculus as representational devices and by using a richer form of unification for probing their structures. These additions have important meta-programming applications but they also pose non-trivial implementation problems. One issue concerns the machine representation of lambda terms suitable to their intended use: an adequate encoding must facilitate comparison operations over terms in addition to supporting the usual reduction computation. Another aspect relates to the treatment of a unification operation that has a branching character and that sometimes calls for the delaying of the solution of unification problems. A final issue concerns the execution of goals whose structures become apparent only in the course of computation. These various problems are exposed in this paper and solutions to them are described. A satisfactory representation for lambda terms is developed by exploiting the nameless notation of de Bruijn as well as explicit encodings of substitutions. Special mechanisms are molded into the structure of traditional Prolog implementations to support branching in unification and carrying of unification problems over other computation steps; a premium is placed in this context on exploiting determinism and on emulating usual first-order behaviour. An extended compilation model is presented that treats higher-order unification and also handles dynamically emergent goals. The ideas described here have been employed in the Teyjus implementation of the λProlog language, a fact that is used to obtain a preliminary assessment of their efficacy.
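
The representation issue can be illustrated with a toy closure-based evaluator over de Bruijn terms, in which the substitutions generated by β-reduction are delayed in an environment rather than performed eagerly. This is only a sketch of the general idea; it is not the suspension notation or the abstract machine that Teyjus actually implements.

```haskell
-- A toy call-by-value evaluator; substitutions live in an environment.
module DeBruijnSketch where

data Term
  = Ix Int            -- de Bruijn index, 0 = innermost binder
  | Lam Term
  | App Term Term
  deriving Show

-- A value is a λ-body paired with the (delayed) environment it closed over.
data Value = VClo Env Term
  deriving Show

type Env = [Value]    -- index n is looked up as the n-th entry

-- Evaluate a closed term to weak head normal form; no term is ever
-- rewritten in place, the pending substitutions are just carried in Env.
whnf :: Env -> Term -> Value
whnf env (Ix n)    = env !! n
whnf env (Lam b)   = VClo env b
whnf env (App f a) =
  case whnf env f of
    VClo env' b -> whnf (whnf env a : env') b

-- (λx. x) (λy. y)  evaluates to the closure for  λy. y
demo :: Value
demo = whnf [] (App (Lam (Ix 0)) (Lam (Ix 0)))
```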


1998 ◽  
Vol 14 ◽  
pp. 38-51
Author(s):  
Carlos Camarão ◽  
Lucília Figueiredo
Keyword(s):  
