Normalization by Evaluation with Typed Abstract Syntax

2001 ◽  
Vol 8 (16) ◽  
Author(s):  
Olivier Danvy ◽  
Morten Rhiger ◽  
Kristoffer H. Rose

We present a simple way to implement typed abstract syntax for the lambda calculus in Haskell, using phantom types, and we specify normalization by evaluation (i.e., type-directed partial evaluation) to yield this typed abstract syntax. Proving that normalization by evaluation preserves types and yields normal forms then reduces to type-checking the specification.
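
To make the idea concrete, here is a minimal Haskell sketch of phantom-typed abstract syntax in the spirit of this abstract; the names (`Exp`, `Term`, `var`, `lam`, `app`, `identity`) and the smart-constructor interface are illustrative rather than the paper's actual code.

```haskell
-- Illustrative sketch: typed abstract syntax via a phantom type parameter.
-- The parameter 'a' records the object-level type but never occurs in the
-- representation itself.
newtype Exp a = Exp Term

data Term
  = Var String
  | Lam String Term
  | App Term Term
  deriving Show

-- Smart constructors are the only way to build an 'Exp a', so every value
-- of type 'Exp a' represents an object term of object type 'a'.
var :: String -> Exp a
var x = Exp (Var x)

lam :: String -> (Exp a -> Exp b) -> Exp (a -> b)
lam x f = let Exp body = f (var x) in Exp (Lam x body)

app :: Exp (a -> b) -> Exp a -> Exp b
app (Exp e1) (Exp e2) = Exp (App e1 e2)

-- Example: the typed identity combinator.
identity :: Exp (a -> a)
identity = lam "x" (\x -> x)
```

Ill-typed object terms are rejected by Haskell's type checker; this is the property that the type-checking argument in the abstract exploits.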


2000 ◽  
Vol 7 (34) ◽  
Author(s):  
Olivier Danvy ◽  
Morten Rhiger

We present a simple way to program typed abstract syntax in a language following a Hindley-Milner typing discipline, such as Haskell and ML, and we apply it to automate two proofs about normalization functions as embodied in type-directed partial evaluation for the simply typed lambda calculus: normalization functions (1) preserve types and (2) yield long beta-eta normal forms.

Keywords: type-directed partial evaluation, normalization functions, simply typed lambda calculus, higher-order abstract syntax, Haskell.
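
For concreteness, the following self-contained Haskell sketch shows the reify/reflect core of a normalization function in the style of type-directed partial evaluation; the `TD` record, the naive integer name supply, and the example are illustrative simplifications, not the authors' code.

```haskell
-- Illustrative sketch of normalization by evaluation / type-directed partial
-- evaluation: a pair of functions per object type, 'reify' mapping values to
-- residual terms and 'reflect' mapping residual terms back to values.
data Term = Var String | Lam String Term | App Term Term
  deriving Show

data TD a = TD
  { reify   :: a -> Int -> Term   -- the Int threads a (deliberately naive) name supply
  , reflect :: Term -> a
  }

-- Base type: values of base type are residual terms themselves.
base :: TD Term
base = TD { reify = \t _ -> t, reflect = id }

-- Function type: eta-expand on reify, build residual applications on reflect.
-- (The name supply restarts in 'reflect'; fine for this example, but not
-- capture-safe in general.)
func :: TD a -> TD b -> TD (a -> b)
func ta tb = TD
  { reify = \f n ->
      let x = "x" ++ show n
      in Lam x (reify tb (f (reflect ta (Var x))) (n + 1))
  , reflect = \e v -> reflect tb (App e (reify ta v 0))
  }

-- Normalizing \f x -> f x at type (o -> o) -> (o -> o) yields its long
-- beta-eta normal form: Lam "x0" (Lam "x1" (App (Var "x0") (Var "x1"))).
example :: Term
example = reify (func (func base base) (func base base)) (\f x -> f x) 0
```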



1992 ◽  
Vol 2 (1) ◽  
pp. 55-91 ◽  
Author(s):  
Pierre-Louis Curien ◽  
Giorgio Ghelli

A subtyping relation ≤ between types is often accompanied by a typing rule, called subsumption: if a term a has type T and T≤U, then a has type U. In the presence of subsumption, a well-typed term does not codify its proof of well typing. Since a semantic interpretation is most naturally defined by induction on the structure of typing proofs, a problem of coherence arises: different typing proofs of the same term must have related meanings. We propose a proof-theoretical, rewriting approach to this problem. We focus on F≤, a second-order lambda calculus with bounded quantification, which is rich enough to make the problem interesting. We define a normalizing rewriting system on proofs, which transforms different proofs of the same typing judgement into a unique normal proof, with the further property that all the normal proofs assigning different types to a given term in a given environment differ only by a final application of the subsumption rule. This rewriting system is not defined on the proofs themselves but on the terms of an auxiliary type system, in which the terms carry complete information about their typing proof. This technique gives us three different results:

- Any semantic interpretation is coherent if and only if our rewriting rules are satisfied as equations.
- We obtain a proof of the existence of a minimum type for each term in a given environment.
- From an analysis of the shape of normal form proofs, we obtain a deterministic typechecking algorithm, which is sound and complete by construction.
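
For reference, the subsumption rule described in the opening sentence reads as follows in standard inference-rule notation (the notation is conventional, not necessarily the paper's own):

```latex
\[
  \frac{\Gamma \vdash a : T \qquad T \le U}{\Gamma \vdash a : U}
\]
```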



Author(s):  
SANDRINE BLAZY ◽  
PHILIPPE FACON

This paper describes a technique and a tool that support partial evaluation of FORTRAN programs, i.e., their specialization for specific values of their input variables. The authors’ aim is to understand old programs, which have become very complex due to numerous extensions. From a given FORTRAN program and these values of its input variables, the tool provides a simplified program, which behaves like the initial program for the specific values. This tool mainly uses constant propagation and simplification of alternatives to one of their branches. The tool is specified in terms of inference rules and operates by induction on the FORTRAN abstract syntax. These rules are compiled into Prolog by the Centaur/FORTRAN programming environment. The completeness and soundness of these rules are proven using rule induction.
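
As a rough illustration of the two simplifications named above, here is a small Haskell sketch of constant propagation plus reduction of conditionals whose condition becomes known; the toy AST, the environment, and every name in it are illustrative, whereas the actual tool operates on the FORTRAN abstract syntax through inference rules compiled to Prolog.

```haskell
-- Illustrative sketch only: constant propagation and simplification of a
-- conditional to one of its branches, on a toy imperative language.
-- Booleans are encoded as the integers 0 (false) and 1 (true).
import qualified Data.Map as M

data Expr = Lit Int | VarE String | Add Expr Expr | Less Expr Expr
  deriving Show
data Stmt = Assign String Expr | If Expr [Stmt] [Stmt] | Seq [Stmt]
  deriving Show

type Env = M.Map String Int   -- variables whose values are statically known

-- Partially evaluate an expression under the known bindings.
evalE :: Env -> Expr -> Expr
evalE env e = case e of
    Lit n    -> Lit n
    VarE x   -> maybe (VarE x) Lit (M.lookup x env)
    Add a b  -> bin (+) Add (evalE env a) (evalE env b)
    Less a b -> bin (\x y -> fromEnum (x < y)) Less (evalE env a) (evalE env b)
  where
    bin op _ (Lit a) (Lit b) = Lit (op a b)
    bin _  c x       y       = c x y

-- Specialize a statement: propagate constants and keep only the branch
-- selected by a statically known condition.
spec :: Env -> Stmt -> (Env, [Stmt])
spec env (Assign x e) = case evalE env e of
  Lit n -> (M.insert x n env, [Assign x (Lit n)])   -- value now known
  e'    -> (M.delete x env, [Assign x e'])
spec env (If c t f) = case evalE env c of
  Lit 0 -> specSeq env f                            -- statically false
  Lit _ -> specSeq env t                            -- statically true
  c'    -> (M.empty, [If c' (snd (specSeq env t)) (snd (specSeq env f))])
           -- unknown condition: keep both branches and conservatively
           -- forget all bindings afterwards
spec env (Seq ss) = let (env', ss') = specSeq env ss in (env', [Seq ss'])

specSeq :: Env -> [Stmt] -> (Env, [Stmt])
specSeq env []       = (env, [])
specSeq env (s : ss) = let (env1, s')  = spec env s
                           (env2, ss') = specSeq env1 ss
                       in (env2, s' ++ ss')
```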



2018 ◽  
Vol 28 (9) ◽  
pp. 1606-1638 ◽  
Author(s):  
ANDREW CAVE ◽  
BRIGITTE PIENTKA

Proofs with logical relations play a key role in establishing rich properties such as normalization or contextual equivalence. They are also challenging to mechanize. In this paper, we describe two case studies using the proof environment Beluga: first, we explain the mechanization of the weak normalization proof for the simply typed lambda calculus; second, we outline how to mechanize the completeness proof of algorithmic equality for simply typed lambda terms, where we reason about logically equivalent terms. The development of these proofs in Beluga relies on three key ingredients: (1) we encode lambda terms together with their typing rules, operational semantics, and algorithmic and declarative equality using higher-order abstract syntax (HOAS), thereby avoiding the need to manipulate binders, renaming, and substitutions; (2) we take advantage of Beluga's support for representing derivations that depend on assumptions and first-class contexts to directly state inductive properties such as logical relations and inductive proofs; (3) we exploit Beluga's rich equational theory for simultaneous substitutions; as a consequence, users do not need to establish and subsequently use substitution properties, and proofs are not cluttered with references to them. We believe these examples demonstrate that Beluga provides the right level of abstractions and primitives to mechanize challenging proofs using HOAS encodings. It may also serve as a valuable benchmark for other proof environments.
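
The snippet below is not Beluga code; it is a toy Haskell illustration of the higher-order abstract syntax idea mentioned in point (1), where object-level binders are represented by meta-level functions so that renaming and substitution are inherited from the host language. Beluga additionally offers contextual types, first-class contexts, and first-class substitutions, none of which this toy encoding captures.

```haskell
-- Not Beluga code: a toy Haskell illustration of higher-order abstract syntax
-- (HOAS), where object-level binders are meta-level functions, so renaming
-- and substitution are inherited from the host language.
data Tm = Lam (Tm -> Tm) | App Tm Tm

-- Beta reduction of a top-level redex: substitution is just function application.
betaStep :: Tm -> Maybe Tm
betaStep (App (Lam f) a) = Just (f a)
betaStep _               = Nothing

-- Example: the object-level identity applied to itself steps to itself.
selfApp :: Maybe Tm
selfApp = betaStep (App (Lam (\x -> x)) (Lam (\x -> x)))
```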



2002 ◽  
Vol 9 (33) ◽  
Author(s):  
Vincent Balat ◽  
Olivier Danvy

We use a code generator, type-directed partial evaluation, to verify conversions between isomorphic types, or more precisely to verify that a composite function is the identity function at some complicated type. A typed functional language such as ML provides natural support to express the functions, and type-directed partial evaluation provides a convenient setting to obtain the normal form of their composition. However, off-the-shelf type-directed partial evaluation turns out to yield gigantic normal forms.

We identify that this gigantism is due to redundancies, and that these redundancies originate in the handling of sums, which uses delimited continuations. We successfully eliminate these redundancies by extending type-directed partial evaluation with memoization capabilities. The result only works for pure functional programs, but it provides an unexpected use of code generation and it yields orders-of-magnitude improvements both in time and in space for type isomorphisms.
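
A minimal sketch of the verification setup, assuming a normalizer such as type-directed partial evaluation is available separately: take the two directions of a candidate type isomorphism, compose them, and check that the normal form of the composite is the (eta-expanded) identity. The currying isomorphism and the names below are illustrative.

```haskell
-- Illustrative setup only: the two directions of the currying isomorphism
-- and the composite whose normal form should be the identity.
to :: ((a, b) -> c) -> (a -> b -> c)
to f = \a b -> f (a, b)

from :: (a -> b -> c) -> ((a, b) -> c)
from g = \(a, b) -> g a b

-- The isomorphism holds (in this direction) exactly when normalizing
-- 'roundTrip' at type ((a, b) -> c) -> ((a, b) -> c) yields the
-- eta-expanded identity; producing that normal form is the job of
-- type-directed partial evaluation and is not reproduced here.
roundTrip :: ((a, b) -> c) -> ((a, b) -> c)
roundTrip = from . to
```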



2002 ◽  
Vol 9 (52) ◽  
Author(s):  
Olivier Danvy

We present a translation from the call-by-value lambda-calculus to monadic normal forms that includes short-cut boolean evaluation. The translation is higher-order, operates in one pass, duplicates no code, generates no chains of thunks, and is properly tail recursive. It makes crucial use of symbolic computation at translation time.
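
The sketch below illustrates, in Haskell, what a one-pass, higher-order translation into a monadic normal form with short-cut boolean evaluation can look like; it is not Danvy's translation, and unlike it, this naive version duplicates the continuation at conditionals.

```haskell
-- Illustrative sketch, not the paper's translation. A translation-time
-- continuation performs let-insertion in one pass, and '&&' is translated by
-- short-cut: it becomes a conditional, and no boolean intermediate is built.
data Exp = Var String | Lit Int | Add Exp Exp | And Exp Exp

data Triv = TVar String | TLit Int
  deriving Show

data MNF
  = Ret Triv
  | LetAdd String Triv Triv MNF   -- let x = t1 + t2 in m
  | IfM Triv MNF MNF
  deriving Show

-- 'trans e n k' translates 'e', threading a name supply 'n' and a
-- translation-time continuation 'k' that receives the trivial result of 'e'.
trans :: Exp -> Int -> (Triv -> Int -> MNF) -> MNF
trans (Var x)   n k = k (TVar x) n
trans (Lit i)   n k = k (TLit i) n
trans (Add a b) n k =
  trans a n (\ta n1 ->
    trans b n1 (\tb n2 ->
      let x = "t" ++ show n2
      in LetAdd x ta tb (k (TVar x) (n2 + 1))))
trans (And a b) n k =
  -- short-cut evaluation: if 'a' is false (0), 'b' is never evaluated
  trans a n (\ta n1 -> IfM ta (trans b n1 k) (k (TLit 0) n1))

toMNF :: Exp -> MNF
toMNF e = trans e 0 (\t _ -> Ret t)

-- toMNF (And (Var "p") (Var "q"))
--   == IfM (TVar "p") (Ret (TVar "q")) (Ret (TLit 0))
```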



2021 ◽  
Vol 5 (OOPSLA) ◽  
pp. 1-29
Author(s):  
Weili Fu ◽  
Fabian Krause ◽  
Peter Thiemann

Dependently typed programming languages are gaining importance because they can guarantee a wide range of properties at compile time. Their use in practice is often hampered because programmers have to provide very precise types. Gradual typing is a means to vary the level of typing precision between program fragments and to transition smoothly towards more precisely typed programs. The combination of gradual typing and dependent types therefore seems promising as a way to promote the widespread use of dependent types. We investigate a gradual version of a minimalist value-dependent lambda calculus. Compile-time calculations, and thus dependencies, are restricted to labels drawn from a generic enumeration type. The calculus supports the usual Pi and Sigma types as well as singleton types and subtyping. It is sufficiently powerful to provide flexible encodings of variant and record types with first-class labels. We provide type-checking algorithms for the underlying label-dependent lambda calculus and its gradual extension. The gradual type checker drives the translation into a cast calculus, which extends the original language. The cast calculus comes with several innovations: refined typing for casts in the presence of singletons, type reduction in casts, and fully dependent Sigma types. Besides standard metatheoretical results, we establish the gradual guarantee for the gradual language.
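
As a loose approximation of the label-dependence idea (not the paper's calculus or its encodings), the Haskell sketch below uses a singleton label type so that a field's type varies with the label it is given, which is roughly how records with first-class labels can be encoded.

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures, RankNTypes, TypeFamilies #-}
-- Not the paper's calculus: a rough Haskell approximation of types that
-- depend on values drawn from a finite enumeration of labels. A singleton
-- label type lets a field's type vary with the label it receives.
data Label = Name | Age

data SLabel (l :: Label) where   -- singleton type: one value per label
  SName :: SLabel 'Name
  SAge  :: SLabel 'Age

type family FieldTy (l :: Label) where
  FieldTy 'Name = String
  FieldTy 'Age  = Int

-- A "record" as a label-indexed function (a dependent function over labels).
newtype Person = Person (forall l. SLabel l -> FieldTy l)

field :: Person -> SLabel l -> FieldTy l
field (Person f) = f

alice :: Person
alice = Person get
  where
    get :: SLabel l -> FieldTy l
    get SName = "Alice"
    get SAge  = 42

-- field alice SName == "Alice"   and   field alice SAge == 42
```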


