Syntactic Accidents in Program Analysis: On the Impact of the CPS Transformation

2001 ◽  
Vol 8 (54) ◽  
Author(s):  
Daniel Damian ◽  
Olivier Danvy

We show that a non-duplicating transformation into continuation-passing style (CPS) has no effect on control-flow analysis, a positive effect on binding-time analysis for traditional partial evaluation, and no effect on binding-time analysis for continuation-based partial evaluation: a monovariant control-flow analysis yields equivalent results on a direct-style program and on its CPS counterpart, a monovariant binding-time analysis yields less precise results on a direct-style program than on its CPS counterpart, and an enhanced monovariant binding-time analysis yields equivalent results on a direct-style program and on its CPS counterpart. Our proof technique amounts to constructing the CPS counterpart of flow information and of binding times.

Our results formalize and confirm a folklore theorem about traditional binding-time analysis, namely that CPS has a positive effect on binding times. What may be more surprising is that the benefit does not arise from a standard refinement of program analysis, such as duplicating continuations.

The present study is symptomatic of an unsettling property of program analyses: their quality is unpredictably vulnerable to syntactic accidents in source programs, i.e., to the way these programs are written. More reliable program analyses require a better understanding of the effect of syntactic change.
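As a minimal, hypothetical illustration (ours, not taken from the paper) of a direct-style program and its CPS counterpart, consider the following OCaml sketch: the CPS version threads an explicit continuation `k` and computes the same function.

```ocaml
(* Direct style *)
let rec fact n = if n = 0 then 1 else n * fact (n - 1)

(* CPS counterpart: the continuation k receives the result. *)
let rec fact_cps n k =
  if n = 0 then k 1
  else fact_cps (n - 1) (fun r -> k (n * r))

(* Both compute the same function. *)
let () = assert (fact 5 = fact_cps 5 (fun r -> r))
```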

2000 ◽  
Vol 7 (15) ◽  
Author(s):  
Daniel Damian ◽  
Olivier Danvy

We show that a non-duplicating transformation into continuation-passing style (CPS) has no effect on control-flow analysis and that it has a positive effect on binding-time analysis: a monovariant control-flow analysis yields equivalent results on a direct-style program and on its CPS counterpart, and a monovariant binding-time analysis yields more precise results on a CPS program than on its direct-style counterpart. Our proof technique amounts to constructing the CPS counterpart of flow information and of binding times.

Our results confirm a folklore theorem about binding-time analysis, namely that CPS has a positive effect on binding times. What may be more surprising is that this benefit holds even if contexts or continuations are not duplicated. The present study is symptomatic of an unsettling property of program analyses: their quality is unpredictably vulnerable to syntactic accidents in source programs, i.e., to the way these programs are written. More reliable program analyses require a better understanding of the effect of syntactic change.
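The following OCaml sketch is our own hypothetical illustration (not the transformation studied in the paper) of how a one-pass, call-by-value CPS transformation for a tiny language can avoid duplicating continuations: at a conditional, the current continuation is reified once and bound to a fresh variable, and both branches merely refer to that variable.

```ocaml
type term =
  | Var of string
  | Lam of string * term
  | App of term * term
  | If  of term * term * term

(* Fresh-name generator for administrative variables. *)
let gensym =
  let n = ref 0 in
  fun base -> incr n; base ^ string_of_int !n

(* [cps e k] CPS-transforms [e]; [k] is a meta-level continuation that
   builds the syntactic context of [e]'s value. *)
let rec cps (e : term) (k : term -> term) : term =
  match e with
  | Var _ | Lam _ -> k (cps_value e)
  | App (e1, e2) ->
      cps e1 (fun v1 ->
        cps e2 (fun v2 ->
          let a = gensym "a" in
          App (App (v1, v2), Lam (a, k (Var a)))))
  | If (e0, e1, e2) ->
      cps e0 (fun v0 ->
        (* The continuation is reified once, bound to kv, and each
           branch refers to kv: nothing is duplicated. *)
        let kv = gensym "k" and a = gensym "a" in
        App (Lam (kv,
                  If (v0,
                      cps e1 (fun v -> App (Var kv, v)),
                      cps e2 (fun v -> App (Var kv, v)))),
             Lam (a, k (Var a))))

and cps_value = function
  | Var x -> Var x
  | Lam (x, body) ->
      let kv = gensym "k" in
      Lam (x, Lam (kv, cps body (fun v -> App (Var kv, v))))
  | _ -> invalid_arg "cps_value"
```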


2001 ◽  
Vol 8 (55) ◽  
Author(s):  
Daniel Damian ◽  
Olivier Danvy

We build on Danvy and Nielsen's first-order program transformation into continuation-passing style (CPS) to design a new CPS transformation of flow information that is simpler and more efficient than what has been presented in previous work. The key to simplicity and efficiency is that our CPS transformation constructs the flow information in one go, instead of first computing an intermediate result and then exploiting it to construct the flow information.

More precisely, we show how to compute control-flow information for CPS-transformed programs from control-flow information for direct-style programs, and vice versa. As a corollary, we confirm that the CPS transformation has no effect on the control-flow information obtained by constraint-based control-flow analysis. The transformation has immediate applications in assessing the effect of the CPS transformation on other analyses, such as binding-time analysis.
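As a rough, hypothetical sketch of the shape of the data involved (the concrete one-pass construction is in the paper and is not reproduced here), monovariant flow information can be pictured as a table mapping variables and program points to sets of abstract lambdas; the paper's transformation maps such a table for a direct-style program to the table for its CPS counterpart, and back.

```ocaml
(* Hypothetical sketch, names ours: 0-CFA-style flow information as a
   finite map from variables/program points to sets of lambda labels. *)
module S = Set.Make (String)   (* sets of lambda labels *)
module M = Map.Make (String)   (* keyed by variable or program point *)

type flow = S.t M.t

(* Example: in  (fun{f} x -> x) (fun{g} y -> y),  0-CFA records x |-> {g}. *)
let example : flow = M.singleton "x" (S.singleton "g")
```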


1993 ◽  
Vol 3 (3) ◽  
pp. 315-346 ◽  
Author(s):  
Anders Bondorf ◽  
Jesper Jørgensen

Based on Henglein's efficient binding-time analysis for the lambda calculus (with constants and ‘fix’) (Henglein, 1991), we develop three efficient analyses for use in the preprocessing phase of Similix, a self-applicable partial evaluator for a higher-order subset of Scheme. The analyses developed in this paper are almost-linear in the size of the analysed program. (1) A flow analysis determines possible value flow between lambda-abstractions and function applications and between constructor applications and selector/predicate applications. The flow analysis is not particularly biased towards partial evaluation; the analysis corresponds to the closure analysis of Bondorf (1991b). (2) A (monovariant) binding-time analysis distinguishes static from dynamic values; the analysis treats both higher-order functions and partially static data structures. (3) A new is-used analysis, not present in Bondorf (1991b), finds a non-minimal binding-time annotation which is ‘safe’ in a certain way: a first-order value may only become static if its result is ‘needed’ during specialization; this ‘poor man's generalization’ (Holst, 1988) increases termination of specialization. The three analyses are performed sequentially in the above-mentioned order, since each depends on results from the previous analyses. The input to all three analyses is a set of constraints generated from the program being analysed. The constraints are solved efficiently by a normalizing union/find-based algorithm in almost-linear time. Whenever possible, the constraint sets are partitioned into subsets which are solved in a specific order; this simplifies constraint normalization. The framework elegantly allows expressing both forwards and backwards components of analyses. In particular, the new is-used analysis is of backwards nature. The three constraint normalization algorithms are proved correct (soundness, completeness, termination, existence of a best solution). The analyses have been implemented and integrated in the Similix system. The new analyses are indeed much more efficient than those of Bondorf (1991b); the almost-linear complexity of the new analyses is confirmed by the implementation.
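For readers unfamiliar with the almost-linear bound, the union/find machinery alluded to above looks roughly as follows. This is a generic sketch (not Similix code): equating two binding-time or flow variables during constraint normalization is a union, and looking up a representative is a find with path compression.

```ocaml
(* Generic union/find with path compression and union by rank. *)
type node = { mutable parent : int; mutable rank : int }

(* [make n] creates n singleton equivalence classes. *)
let make n = Array.init n (fun i -> { parent = i; rank = 0 })

(* [find uf i] returns the representative of i's class, compressing paths. *)
let rec find uf i =
  let p = uf.(i).parent in
  if p = i then i
  else begin
    let r = find uf p in
    uf.(i).parent <- r;
    r
  end

(* [union uf i j] merges the classes of i and j. *)
let union uf i j =
  let ri = find uf i and rj = find uf j in
  if ri <> rj then
    if uf.(ri).rank < uf.(rj).rank then uf.(ri).parent <- rj
    else if uf.(ri).rank > uf.(rj).rank then uf.(rj).parent <- ri
    else begin
      uf.(rj).parent <- ri;
      uf.(ri).rank <- uf.(ri).rank + 1
    end
```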


1995 ◽  
Vol 5 (4) ◽  
pp. 461-500 ◽  
Author(s):  
Charles Consel ◽  
Siau Cheng Khoo

This paper presents semantic specifications and correctness proofs for both on-line and off-line partial evaluation of strict first-order functional programs. To do so, our strategy consists of defining a core semantics as a basis for the specification of three non-standard evaluations: instrumented evaluation, on-line partial evaluation, and off-line partial evaluation. We then use the technique of logical relations to prove the correctness of both on-line and off-line partial evaluation semantics. The contributions of this work are as follows:

1. We provide a uniform framework for defining and proving correct both on-line and off-line partial evaluation.
2. This work required a formal specification of on-line partial evaluation with polyvariant specialization. We define criteria for its correctness with respect to an instrumented standard semantics. As a by-product, on-line partial evaluation appears to be based on a fixpoint iteration process, just like binding-time analysis.
3. We show that binding-time analysis, the preprocessing phase of off-line partial evaluation, is an abstraction of on-line partial evaluation. Therefore, its correctness can be proved with respect to on-line partial evaluation, instead of with respect to the standard semantics, as is customarily done.
4. Based on the binding-time analysis, we formally derive the specialization semantics for off-line partial evaluation. This strategy ensures the correctness of the resulting semantics.
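As a toy, hypothetical illustration of the on-line flavour (not the paper's specification), here is a specializer for a first-order expression language that decides per operation, by inspecting its arguments at specialization time, whether to compute or to residualize; an off-line specializer would instead consult binding-time annotations produced beforehand by a binding-time analysis.

```ocaml
type exp =
  | Cst of int
  | Var of string
  | Add of exp * exp

type env = (string * exp) list   (* known variables are bound to Cst _ *)

(* On-line partial evaluation: compute when both operands are static,
   residualize otherwise. *)
let rec pe (env : env) (e : exp) : exp =
  match e with
  | Cst _ -> e
  | Var x -> (try List.assoc x env with Not_found -> e)
  | Add (e1, e2) ->
      (match pe env e1, pe env e2 with
       | Cst a, Cst b -> Cst (a + b)   (* both static: compute *)
       | r1, r2 -> Add (r1, r2))       (* otherwise: residualize *)

(* Specializing  x + (1 + 2)  with x unknown yields  x + 3;
   with x bound to 4 it yields 7. *)
let _ = pe [] (Add (Var "x", Add (Cst 1, Cst 2)))
let _ = pe [ ("x", Cst 4) ] (Add (Var "x", Add (Cst 1, Cst 2)))
```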


2002 ◽  
Vol 9 (36) ◽  
Author(s):  
Daniel Damian ◽  
Olivier Danvy

We characterize the impact of a linear beta-reduction on the result of a control-flow analysis. (By "a linear beta-reduction" we mean the beta-reduction of a linear lambda-abstraction, i.e., of a lambda-abstraction whose parameter occurs exactly once in its body.)

As a corollary, we consider the administrative reductions of a Plotkin-style transformation into continuation-passing style (CPS), and how they affect the result of a constraint-based control-flow analysis and, in particular, the least element in the space of solutions. We show that administrative reductions preserve the least solution. Preservation of least solutions solves a problem that was left open in Palsberg and Wand's article "CPS Transformation of Flow Information."

Together, Palsberg and Wand's article and the present article show how to map in linear time the least solution of the flow constraints of a program into the least solution of the flow constraints of the CPS counterpart of this program, after administrative reductions. Furthermore, we show how to CPS transform control-flow information in one pass.

Supersedes BRICS-RS-01-40.
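A hypothetical sketch of the notion of linear beta-reduction defined above: the redex (fun x -> b) a is contracted only when x occurs exactly once in b. For brevity, bound variables are assumed to be distinct, so substitution need not rename.

```ocaml
type term =
  | Var of string
  | Lam of string * term
  | App of term * term

(* Number of free occurrences of x in a term. *)
let rec occurrences x = function
  | Var y -> if x = y then 1 else 0
  | Lam (y, b) -> if x = y then 0 else occurrences x b
  | App (t1, t2) -> occurrences x t1 + occurrences x t2

(* Naive substitution, safe under the distinct-bound-variables assumption. *)
let rec subst x a = function
  | Var y -> if x = y then a else Var y
  | Lam (y, b) -> if x = y then Lam (y, b) else Lam (y, subst x a b)
  | App (t1, t2) -> App (subst x a t1, subst x a t2)

(* Contract the redex only when the abstraction is linear. *)
let linear_beta = function
  | App (Lam (x, b), a) when occurrences x b = 1 -> subst x a b
  | t -> t
```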


1995 ◽  
Vol 2 (51) ◽  
Author(s):  
Rowan Davies

The Curry-Howard isomorphism identifies proofs with typed lambda-calculus terms, and correspondingly identifies propositions with types. We show how this isomorphism can be extended to relate constructive temporal logic with binding-time analysis. In particular, we show how to extend the Curry-Howard isomorphism to include the ○ ("next") operator from linear-time temporal logic. This yields the simply typed λ○-calculus, which we prove to be equivalent to a multi-level binding-time analysis like those used in partial evaluation.

Keywords: Curry-Howard isomorphism, partial evaluation, binding-time analysis, temporal logic.
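A small, hypothetical sketch of the connection to multi-level binding times (ours, not Davies's calculus): if a value of an expression type stands for a next-stage ("circle"-typed) computation, then a two-level program such as the classic power function keeps the exponent at the present stage while building next-stage code for the base.

```ocaml
(* exp stands in for the type of next-stage (circle-typed) integer code. *)
type exp = Int of int | Var of string | Mul of exp * exp

(* The exponent n is present-stage; the base x and the result are
   next-stage code. *)
let rec power (n : int) (x : exp) : exp =
  if n = 0 then Int 1 else Mul (x, power (n - 1) x)

(* power 3 (Var "x")  builds the code  x * (x * (x * 1)). *)
```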


1993 ◽  
Vol 3 (3) ◽  
pp. 365-387 ◽  
Author(s):  
Mitchell Wand

Mogensen has exhibited a very compact partial evaluator for the pure lambda calculus, using binding-time analysis followed by specialization. We give a correctness criterion for this partial evaluator and prove its correctness relative to this specification. We show that the conventional properties of partial evaluators, such as the Futamura projections, are consequences of this specification. By considering both a flow analysis and the transformation it justifies together, this proof suggests a framework for incorporating flow analyses into verified compilers.
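For reference, the Futamura projections mentioned above can be stated as follows, in the standard notation of the partial-evaluation literature, with mix a self-applicable specializer applied to an interpreter interp and a source program source.

```latex
\begin{align*}
  \mathit{target}   &= [\![\mathit{mix}]\!]\,(\mathit{interp},\ \mathit{source}) \\
  \mathit{compiler} &= [\![\mathit{mix}]\!]\,(\mathit{mix},\ \mathit{interp}) \\
  \mathit{cogen}    &= [\![\mathit{mix}]\!]\,(\mathit{mix},\ \mathit{mix})
\end{align*}
```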


1995 ◽  
Vol 2 (41) ◽  
Author(s):  
Olivier Danvy ◽  
Karoline Malmkjær ◽  
Jens Palsberg

Partial-evaluation folklore has it that massaging one's source programs can make them specialize better. In Jones, Gomard, and Sestoft's recent textbook, a whole chapter is dedicated to listing such "binding-time improvements": non-standard use of continuation-passing style, eta-expansion, and a popular transformation called "The Trick". We provide a unified view of these binding-time improvements, from a typing perspective.

Just as a proper treatment of product values in partial evaluation requires partially static values, a proper treatment of disjoint sums requires moving static contexts across dynamic case expressions. This requirement precisely accounts for the non-standard use of continuation-passing style encountered in partial evaluation. In this setting, eta-expansion acts as a uniform binding-time coercion between values and contexts, be they of function type, product type, or disjoint-sum type. For the latter case, it achieves "The Trick".

In this paper, we extend Gomard and Jones's partial evaluator for the lambda-calculus, lambda-Mix, with products and disjoint sums; we point out how eta-expansion for disjoint sums does The Trick; we generalize our earlier work by identifying that eta-expansion can be obtained in the binding-time analysis simply by adding two coercion rules; and we specify and prove the correctness of our extension to lambda-Mix.

See revised version BRICS-RS-96-17.
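A hypothetical OCaml sketch (ours) of eta-expansion at a disjoint-sum type and of how it enables "The Trick": re-matching a dynamic sum value exposes a statically known constructor in each branch, so a static context wrapped around the expansion can be moved inside and specialized per constructor.

```ocaml
type ('a, 'b) sum = Inl of 'a | Inr of 'b

(* [eta d] is observationally the same as [d] ... *)
let eta (d : ('a, 'b) sum) : ('a, 'b) sum =
  match d with Inl x -> Inl x | Inr y -> Inr y

(* ... but under a static context, a specializer can move the context into
   the branches, where the constructor is statically known:
     context (eta d)
   specializes like
     match d with Inl x -> context (Inl x) | Inr y -> context (Inr y)
   which is "The Trick" for disjoint sums. *)
```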


2001 ◽  
Vol 8 (29) ◽  
Author(s):  
Olivier Danvy ◽  
Bernd Grobauer ◽  
Morten Rhiger

Goal-directed evaluation, as embodied in Icon and Snobol, is built on the notions of backtracking and of generating successive results, and therefore it has always been something of a challenge to specify and implement. In this article, we address this challenge using computational monads and partial evaluation.

We consider a subset of Icon and we specify it with a monadic semantics and a list monad. We then consider a spectrum of monads that also fit the bill, and we relate them to each other. For example, we derive a continuation monad as a Church encoding of the list monad. The resulting semantics coincides with Gudeman's continuation semantics of Icon.

We then compile Icon programs by specializing their interpreter (i.e., by using the first Futamura projection), using type-directed partial evaluation. Through various back ends, including a run-time code generator, we generate ML code, C code, and OCaml byte code. Binding-time analysis and partial evaluation of the continuation-based interpreter automatically give rise to C programs that coincide with the result of Proebsting's optimized compiler.
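As a hypothetical sketch (names ours) of the monadic spectrum discussed above, here are the list monad and a Church-encoded version of it, i.e., a two-continuation ("success and failure") formulation of the kind used for backtracking in goal-directed evaluation.

```ocaml
(* List monad. *)
let return_l x = [ x ]
let bind_l m f = List.concat_map f m

(* Church-encoded lists: a computation is its own fold over a success
   continuation sk and a failure continuation fk. *)
type 'a m = { run : 'r. ('a -> 'r -> 'r) -> 'r -> 'r }

let return_c x = { run = (fun sk fk -> sk x fk) }
let bind_c m f =
  { run = (fun sk fk -> m.run (fun x fk' -> (f x).run sk fk') fk) }

(* Failure and alternation, the backtracking operations of goal-directed
   evaluation. *)
let fail_c = { run = (fun _ fk -> fk) }
let alt_c m1 m2 = { run = (fun sk fk -> m1.run sk (m2.run sk fk)) }

(* Converting back to lists recovers the list-monad behaviour. *)
let to_list m = m.run (fun x xs -> x :: xs) []
```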


1997 ◽  
Vol 7 (5) ◽  
pp. 507-541 ◽  
Author(s):  
JOHN HATCLIFF ◽  
OLIVIER DANVY

We formalize a partial evaluator for Eugenio Moggi's computational metalanguage. This formalization gives an evaluation-order independent view of binding-time analysis and program specialization, including a proper treatment of call unfolding. It also enables us to express the essence of ‘control-based binding-time improvements’ for let expressions. Specifically, we prove that the binding-time improvements given by ‘continuation-based specialization’ can be expressed in the metalanguage via monadic laws.
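For instance, one monadic law of the computational metalanguage that drives such let-based binding-time improvements is the associativity of let; the following is a sketch of the law (with the usual side condition that y is not free in e3), not the paper's full development.

```latex
\[
  \mathsf{let}\; x \Leftarrow (\mathsf{let}\; y \Leftarrow e_1 \;\mathsf{in}\; e_2)\ \mathsf{in}\ e_3
  \;=\;
  \mathsf{let}\; y \Leftarrow e_1 \;\mathsf{in}\; (\mathsf{let}\; x \Leftarrow e_2 \;\mathsf{in}\; e_3)
  \qquad (y \notin \mathit{FV}(e_3))
\]
```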

