Strictness and Totality Analysis

1994 ◽  
Vol 23 (474) ◽  
Author(s):  
Kirsten Lackner Solberg ◽  
Hanne Riis Nielson

We define a novel inference system for strictness and totality analysis for the simply-typed lazy lambda-calculus with constants and fixpoints. Strictness information identifies those terms that definitely denote bottom (i.e., do not evaluate to WHNF), whereas totality information identifies those terms that definitely do not denote bottom (i.e., do evaluate to WHNF). The analysis is presented as an annotated type system allowing conjunctions only at "top level". We give examples of its use and prove its correctness with respect to a natural-style operational semantics.
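
To make the two kinds of annotation concrete, here is a minimal Haskell sketch (our own illustration, not the paper's inference system): `loop` definitely denotes bottom, `one` definitely does not, and `strictSucc` is strict, so it maps a bottom argument to a bottom result.

```haskell
-- A minimal illustration (not the paper's annotated type system) of the
-- properties the analysis tracks in a lazy language.

loop :: Int
loop = loop                    -- definitely bottom: never reaches WHNF

one :: Int
one = 1                        -- definitely not bottom: already in WHNF

strictSucc :: Int -> Int
strictSucc x = x `seq` x + 1   -- strict: maps bottom to bottom

main :: IO ()
main = do
  print one                    -- 1
  print (const one loop)       -- 1: laziness never forces loop
  -- print (strictSucc loop)   -- would diverge
```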

2006 ◽  
Vol 16 (3) ◽  
pp. 527-552 ◽  
Author(s):  
PETER SELINGER ◽  
BENOIT VALIRON

In this paper we develop a functional programming language for quantum computers by extending the simply-typed lambda calculus with quantum types and operations. The design of this language adheres to the ‘quantum data, classical control’ paradigm, following the first author's work on quantum flow-charts. We define a call-by-value operational semantics, and give a type system using affine intuitionistic linear logic. The main results of this paper are the safety properties of the language and the development of a type inference algorithm.
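
As a rough, hypothetical sketch of the flavour of the type language (the Haskell encoding and constructor names below are ours, not the paper's formalisation), the types combine a base type of qubits with a linear function space, tensor products, and a '!' modality for duplicable classical data.

```haskell
-- A sketch of a quantum-lambda-calculus-style type grammar; the encoding
-- is our own and omits most of the paper's connectives.

data QType
  = Qubit                 -- quantum data
  | QType :-<> QType      -- linear function space, A ⊸ B
  | QType :*: QType       -- tensor product, A ⊗ B
  | Bang QType            -- !A: duplicable (classical) data
  | Unit
  deriving (Eq, Show)

infixr 5 :-<>

-- A single-qubit unitary such as the Hadamard gate: qubit ⊸ qubit.
hadamardType :: QType
hadamardType = Qubit :-<> Qubit

-- A two-qubit gate such as CNOT: qubit ⊗ qubit ⊸ qubit ⊗ qubit.
cnotType :: QType
cnotType = (Qubit :*: Qubit) :-<> (Qubit :*: Qubit)

main :: IO ()
main = print cnotType
```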


2001 ◽  
Vol 11 (6) ◽  
pp. 591-627 ◽  
Author(s):  
E. MOGGI ◽  
AMR SABRY

Launchbury and Peyton Jones came up with an ingenious idea for embedding regions of imperative programming in a pure functional language like Haskell. The key idea was based on a simple modification of the Hindley-Milner type system. Our first contribution is to propose a more natural encapsulation construct exploiting higher-order kinds, which achieves the same encapsulation effect but avoids the ad hoc type parameter of the original proposal. The second contribution is a type safety result for encapsulation of strict state using both the original encapsulation construct and the newly introduced one. We establish this result in a more expressive context than the original proposal, namely in the context of the higher-order lambda-calculus. The third contribution is a type safety result for encapsulation of lazy state in the higher-order lambda-calculus. This result resolves an outstanding open problem on which previous proof attempts failed. In all cases, we formalize the intended implementations as simple big-step operational semantics on untyped terms, which capture interesting implementation details not captured by the reduction semantics proposed previously.
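
For orientation, here is the original encapsulation construct as it survives in today's GHC; the rank-2 quantified variable `s` in the type of `runST` is the ad hoc type parameter the abstract refers to. The higher-kinded alternative proposed in the paper is not part of standard Haskell and is not shown here.

```haskell
-- runST :: (forall s. ST s a) -> a
-- The quantified s prevents references from escaping their state thread,
-- which is what makes the function below observably pure.

import Control.Monad.ST (runST)
import Data.STRef (newSTRef, modifySTRef', readSTRef)

-- A pure function implemented with encapsulated strict state.
sumST :: [Int] -> Int
sumST xs = runST $ do
  acc <- newSTRef 0
  mapM_ (\x -> modifySTRef' acc (+ x)) xs
  readSTRef acc

main :: IO ()
main = print (sumST [1 .. 10])   -- 55
```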


2008 ◽  
Vol 18 (3) ◽  
pp. 501-553 ◽  
Author(s):  
DAVID SABEL ◽  
MANFRED SCHMIDT-SCHAUSS

We present a higher-order call-by-need lambda calculus enriched with constructors, case expressions, recursive letrec expressions, a seq operator for sequential evaluation and a non-deterministic operator amb that is locally bottom-avoiding. We use a small-step operational semantics in the form of a single-step rewriting system that defines a (non-deterministic) normal-order reduction. This strategy can be made fair by adding resources for book-keeping. As equational theory, we use contextual equivalence (that is, terms are equal if, when plugged into any program context, their termination behaviour is the same), in which we use a combination of may- and must-convergence, which is appropriate for non-deterministic computations. We show that we can drop the fairness condition for equational reasoning, since the valid equations with respect to normal-order reduction are the same as for fair normal-order reduction. We develop a number of proof tools for proving correctness of program transformations. In particular, we prove a context lemma for both may- and must-convergence that restricts the number of contexts that need to be examined for proving contextual equivalence. Combining this with so-called complete sets of commuting and forking diagrams, we show that all the deterministic reduction rules and some additional transformations preserve contextual equivalence. We also prove a standardisation theorem for fair normal-order reduction. The structure of the ordering ≤c is also analysed, and we show that Ω is not a least element and that ≤c already implies contextual equivalence with respect to may-convergence.
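
As a rough sketch of the calculus's ingredients, the data type below abbreviates the syntax the abstract lists; the constructor names and grammar are our own shorthand, not the paper's exact definition.

```haskell
-- A sketch of the surface syntax: constructors, case, recursive letrec,
-- seq, and the bottom-avoiding non-deterministic operator amb.

type Var     = String
type ConName = String

data Expr
  = V Var
  | Lam Var Expr
  | App Expr Expr
  | Con ConName [Expr]                  -- data constructors
  | Case Expr [(ConName, [Var], Expr)]  -- case with flat patterns
  | LetRec [(Var, Expr)] Expr           -- recursive letrec bindings
  | Seq Expr Expr                       -- sequential evaluation
  | Amb Expr Expr                       -- locally bottom-avoiding choice
  deriving Show

-- amb e1 e2 may return either argument's value, but if exactly one of
-- them diverges the other is chosen (local bottom-avoidance).
example :: Expr
example = Amb (LetRec [("loop", V "loop")] (V "loop")) (Con "True" [])

main :: IO ()
main = print example
```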


2003 ◽  
Vol 10 (25) ◽  
Author(s):  
Dariusz Biernacki ◽  
Olivier Danvy

Starting from a continuation-based interpreter for a simple logic programming language, propositional Prolog with cut, we derive the corresponding logic engine in the form of an abstract machine. The derivation originates in previous work (our article at PPDP 2003) where it was applied to the lambda-calculus. The key transformation here is Reynolds's defunctionalization that transforms a tail-recursive, continuation-passing interpreter into a transition system, i.e., an abstract machine. Similar denotational and operational semantics were studied by de Bruin and de Vink in previous work (their article at TAPSOFT 1989), and we compare their study with our derivation. Additionally, we present a direct-style interpreter of propositional Prolog expressed with control operators for delimited continuations.

Superseded by BRICS-RS-04-5.
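
The defunctionalization step can be illustrated on a cut-free propositional fragment; this miniature is our own illustration, not the paper's engine. A success/failure-continuation interpreter is turned into first-order continuation representations `SKd` and `FKd` plus apply functions, i.e., into the transitions of an abstract machine.

```haskell
-- Goals of a propositional logic programming language, without cut.
data Goal = Atom String | And Goal Goal | Or Goal Goal

type Program = [String]      -- the atoms that hold (the "facts")

-- Continuation-passing interpreter.  Laziness makes the failure
-- continuation a thunk that is only forced on backtracking.
type FK = Bool               -- failure continuation: the rest of the search
type SK = FK -> Bool         -- success continuation: takes a backtracking point

solve :: Program -> Goal -> SK -> FK -> Bool
solve p (Atom a)  sk fk = if a `elem` p then sk fk else fk
solve p (And g h) sk fk = solve p g (\fk' -> solve p h sk fk') fk
solve p (Or g h)  sk fk = solve p g sk (solve p h sk fk)

run :: Program -> Goal -> Bool
run p g = solve p g (\_ -> True) False

-- Defunctionalization: each lambda that builds a continuation above
-- becomes a constructor; the apply functions dispatch on them.
data SKd = SkInit | SkAnd Program Goal SKd
data FKd = FkInit | FkOr Program Goal SKd FKd

applySK :: SKd -> FKd -> Bool
applySK SkInit         _  = True
applySK (SkAnd p h sk) fk = solveD p h sk fk

applyFK :: FKd -> Bool
applyFK FkInit           = False
applyFK (FkOr p h sk fk) = solveD p h sk fk

solveD :: Program -> Goal -> SKd -> FKd -> Bool
solveD p (Atom a)  sk fk = if a `elem` p then applySK sk fk else applyFK fk
solveD p (And g h) sk fk = solveD p g (SkAnd p h sk) fk
solveD p (Or g h)  sk fk = solveD p g sk (FkOr p h sk fk)

runD :: Program -> Goal -> Bool
runD p g = solveD p g SkInit FkInit

main :: IO ()
main = do
  let prog = ["p", "q"]
      goal = And (Or (Atom "r") (Atom "p")) (Atom "q")
  print (run  prog goal)   -- True
  print (runD prog goal)   -- True
```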


2000 ◽  
Vol 11 (01) ◽  
pp. 65-87 ◽
Author(s):  
MASATOMO HASHIMOTO

This paper develops an ML-style programming language with first-class contexts, i.e., expressions with holes. The crucial operation on contexts is hole-filling. Filling a hole with an expression has the effect of dynamic binding or macro expansion, which provides the advanced feature of manipulating open program fragments. Such mechanisms are useful in many systems, including distributed/mobile programming and program modules. If we can treat a context as a first-class citizen in a programming language, then we can manipulate open program fragments in a flexible and seamless manner. The possibility of such a programming language was shown by the theory of the simply typed context calculus developed by Hashimoto and Ohori. This paper extends the simply typed system of the context calculus to an ML-style polymorphic type system, and gives an operational semantics and a sound and complete type inference algorithm.
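
The central operation can be sketched in a first-order way (this encoding is our own, not the context calculus itself): hole-filling deliberately does not rename binders, so a free variable of the filler can be captured by a binder of the context, which is exactly the dynamic-binding / macro-expansion effect described above.

```haskell
-- Expressions with a single kind of hole, and hole-filling without
-- alpha-renaming (capture is the point, not a bug).

type Var = String

data Expr
  = V Var
  | Lam Var Expr
  | App Expr Expr
  | Lit Int
  | Hole                 -- the hole of a context
  deriving Show

-- Fill every occurrence of the hole with the given expression.
fill :: Expr -> Expr -> Expr
fill ctx e = go ctx
  where
    go Hole      = e
    go (V x)     = V x
    go (Lit n)   = Lit n
    go (Lam x b) = Lam x (go b)   -- no renaming: capture is intended
    go (App f a) = App (go f) (go a)

-- The context (\x. [ ]) filled with x yields \x. x:
-- the filler's free x is captured by the context's binder.
example :: Expr
example = fill (Lam "x" Hole) (V "x")

main :: IO ()
main = print example   -- Lam "x" (V "x")
```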


1992 ◽  
Vol 2 (1) ◽  
pp. 55-91 ◽  
Author(s):  
Pierre-Louis Curien ◽  
Giorgio Ghelli

A subtyping relation ≤ between types is often accompanied by a typing rule, called subsumption: if a term a has type T and T ≤ U, then a has type U. In the presence of subsumption, a well-typed term does not codify its proof of well typing. Since a semantic interpretation is most naturally defined by induction on the structure of typing proofs, a problem of coherence arises: different typing proofs of the same term must have related meanings. We propose a proof-theoretical, rewriting approach to this problem. We focus on F≤, a second-order lambda calculus with bounded quantification, which is rich enough to make the problem interesting. We define a normalizing rewriting system on proofs, which transforms different proofs of the same typing judgement into a unique normal proof, with the further property that all the normal proofs assigning different types to a given term in a given environment differ only by a final application of the subsumption rule. This rewriting system is not defined on the proofs themselves but on the terms of an auxiliary type system, in which the terms carry complete information about their typing proof. This technique gives us three different results: (1) any semantic interpretation is coherent if and only if our rewriting rules are satisfied as equations; (2) we obtain a proof of the existence of a minimum type for each term in a given environment; (3) from an analysis of the shape of normal-form proofs, we obtain a deterministic typechecking algorithm, which is sound and complete by construction.
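
The "minimum type plus one final subsumption" shape of normal proofs can be illustrated on a much smaller fragment than F≤ (just Top and arrow types, no bounded quantification); this algorithmic sketch is ours, not the paper's rewriting system.

```haskell
-- A tiny fragment with subtyping: the checker synthesises a minimal type
-- and subsumption is applied only where the algorithm forces it.

data Ty = Top | Arrow Ty Ty deriving (Eq, Show)

data Tm = Var String | Lam String Ty Tm | App Tm Tm deriving Show

type Env = [(String, Ty)]

-- Subtyping: S <= Top; arrows are contravariant in the domain,
-- covariant in the codomain.
sub :: Ty -> Ty -> Bool
sub _ Top = True
sub (Arrow s1 s2) (Arrow t1 t2) = sub t1 s1 && sub s2 t2
sub _ _ = False

-- Synthesise the minimum type of a term, if it has one.
synth :: Env -> Tm -> Maybe Ty
synth env (Var x)     = lookup x env
synth env (Lam x t b) = Arrow t <$> synth ((x, t) : env) b
synth env (App f a)   = do
  Arrow t1 t2 <- synth env f
  t           <- synth env a
  if sub t t1 then Just t2 else Nothing   -- subtyping check at application

-- Check against an expected type: one final use of subsumption.
check :: Env -> Tm -> Ty -> Bool
check env e t = maybe False (`sub` t) (synth env e)

main :: IO ()
main = do
  let idTop = Lam "x" Top (Var "x")
  print (synth [] idTop)       -- Just (Arrow Top Top), the minimum type
  print (check [] idTop Top)   -- True, via the final subsumption
```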


2018 ◽  
Vol 28 (9) ◽  
pp. 1606-1638 ◽  
Author(s):  
ANDREW CAVE ◽  
BRIGITTE PIENTKA

Proofs with logical relations play a key role in establishing rich properties such as normalization or contextual equivalence. They are also challenging to mechanize. In this paper, we describe two case studies using the proof environment Beluga: first, we explain the mechanization of the weak normalization proof for the simply typed lambda-calculus; second, we outline how to mechanize the completeness proof of algorithmic equality for simply typed lambda-terms, where we reason about logically equivalent terms. The development of these proofs in Beluga relies on three key ingredients: (1) we encode lambda-terms together with their typing rules, operational semantics, algorithmic and declarative equality using higher-order abstract syntax (HOAS), thereby avoiding the need to manipulate and deal with binders, renaming and substitutions; (2) we take advantage of Beluga's support for representing derivations that depend on assumptions and first-class contexts to directly state inductive properties such as logical relations and inductive proofs; (3) we exploit Beluga's rich equational theory for simultaneous substitutions; as a consequence, users do not need to establish and subsequently use substitution properties, and proofs are not cluttered with references to them. We believe these examples demonstrate that Beluga provides the right level of abstractions and primitives to mechanize challenging proofs using HOAS encodings. It also may serve as a valuable benchmark for other proof environments.
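
Ingredient (1), the HOAS encoding, can be illustrated outside Beluga. The hypothetical Haskell rendering below represents object-level binders as meta-level functions, so beta-reduction needs no substitution machinery; Beluga's contextual types and first-class contexts have no counterpart in this sketch.

```haskell
-- Higher-order abstract syntax: the object language's binder is the
-- meta language's function, so substitution is function application.

data Tm = Lam (Tm -> Tm) | App Tm Tm

-- Weak-head reduction: contracting a beta redex just applies the
-- Haskell function stored in the Lam node.
whnf :: Tm -> Tm
whnf (App f a) =
  case whnf f of
    Lam body -> whnf (body a)   -- substitution for free
    f'       -> App f' a
whnf t = t

-- (\x. x) (\y. y) reduces to \y. y
example :: Tm
example = App (Lam (\x -> x)) (Lam (\y -> y))

main :: IO ()
main = case whnf example of
  Lam _ -> putStrLn "reduced to a lambda"
  _     -> putStrLn "stuck"
```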


1991 ◽  
Vol 1 (1) ◽  
pp. 3-48 ◽  
Author(s):  
Luca Cardelli ◽  
John C. Mitchell

We define a simple collection of operations for creating and manipulating record structures, where records are intended as finite associations of values to labels. A second-order type system over these operations supports both subtyping and polymorphism. We provide typechecking algorithms and limited semantic models. Our approach unifies and extends previous notions of records, bounded quantification, record extension, and parametrization by row-variables. The general aim is to provide foundations for concepts found in object-oriented languages, within a framework based on typed lambda-calculus.
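
An untyped sketch of the record operations themselves is shown below; the encoding via Data.Map is ours, and the paper's actual contribution, the second-order type system over such operations, is not modelled here.

```haskell
-- Records as finite maps from labels to values, with the usual
-- extension, restriction and selection operations.

import qualified Data.Map as M

type Label    = String
type Record v = M.Map Label v

-- Extension: defined only when the label is absent.
extend :: Label -> v -> Record v -> Maybe (Record v)
extend l v r
  | l `M.member` r = Nothing
  | otherwise      = Just (M.insert l v r)

-- Restriction: removes the field; a no-op if it is absent.
restrict :: Label -> Record v -> Record v
restrict = M.delete

-- Selection: defined only when the label is present.
select :: Label -> Record v -> Maybe v
select = M.lookup

main :: IO ()
main = do
  let r0 = M.empty :: Record Int
  print (extend "x" 1 r0 >>= extend "y" 2 >>= select "x")   -- Just 1
```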


1992 ◽  
Vol 21 (389) ◽  
Author(s):  
Jens Palsberg ◽  
Michael I. Schwartzbach

Safety analysis is an algorithm for determining if a term in the untyped lambda calculus with constants is safe, i.e., if it does not cause an error during evaluation. This ambition is also shared by algorithms for type inference. Safety analysis and type inference are based on rather different perspectives, however. Safety analysis is based on closure analysis, whereas type inference attempts to assign a type to all subterms.

In this paper we prove that safety analysis is sound, relative to both a strict and a lazy operational semantics, and superior to type inference, in the sense that it accepts strictly more safe lambda terms.

The latter result may indicate the relative potentials of static program analyses based on closure analysis and on type inference, respectively.
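
The gap between the two approaches is commonly illustrated with self-application: the closed term (\x. x x) (\y. y) evaluates without error, so it is safe, yet x x has no simple type. The evaluator and example below are our own illustration, not the paper's safety analysis.

```haskell
-- A tiny untyped call-by-value evaluator and a safe but simply-untypeable term.

type Var = String

data Tm = V Var | Lam Var Tm | App Tm Tm deriving Show

-- Substitution without freshening; fine here because the example is
-- closed and never risks variable capture.
subst :: Var -> Tm -> Tm -> Tm
subst x s (V y)     = if x == y then s else V y
subst x s (Lam y b) = if x == y then Lam y b else Lam y (subst x s b)
subst x s (App f a) = App (subst x s f) (subst x s a)

eval :: Tm -> Tm
eval (App f a) =
  case eval f of
    Lam x b -> eval (subst x (eval a) b)
    f'      -> App f' a          -- stuck: an "error" in the above sense
eval t = t

-- (\x. x x) (\y. y): safe, but x x defeats simple type inference.
selfApp :: Tm
selfApp = App (Lam "x" (App (V "x") (V "x"))) (Lam "y" (V "y"))

main :: IO ()
main = print (eval selfApp)      -- Lam "y" (V "y")
```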

