Strong Normalization by Type-Directed Partial Evaluation and Run-Time Code Generation (Preliminary Version)

1997 ◽  
Vol 4 (43) ◽  
Author(s):  
Vincent Balat ◽  
Olivier Danvy

We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long beta-eta-normal form. Caml uses a virtual machine and has the capability to load byte code at run time. Representing the long beta-eta-normal forms as byte code gives us the ability to strongly normalize higher-order values (i.e., weak head normal forms in ML), to compile the resulting strong normal forms into byte code, and to load this byte code all in one go, at run time.

We conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language.
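For readers unfamiliar with the technique, here is a minimal OCaml sketch of the reify/reflect core of type-directed partial evaluation for base and arrow types. The datatype names (`term`, `value`, `ty`) and the `gensym` helper are illustrative choices, not the paper's code, and the residual terms are kept as plain data rather than being compiled to byte code as in the paper.

```ocaml
(* A minimal sketch (not the paper's implementation) of type-directed
   partial evaluation for base and arrow types. *)

type term =                         (* residual (normal-form) terms *)
  | Var of string
  | Lam of string * term
  | App of term * term

type value =                        (* semantic values *)
  | Syn of term                     (* a residual term        *)
  | Fun of (value -> value)         (* a native Caml function *)

type ty = Base | Arrow of ty * ty   (* types directing the process *)

let gensym =
  let n = ref 0 in
  fun () -> incr n; Printf.sprintf "x%d" !n

(* reify turns a value into its long beta-eta-normal form;
   reflect turns a residual term back into a value. *)
let rec reify (t : ty) (v : value) : term =
  match t, v with
  | Base, Syn e -> e
  | Arrow (a, b), Fun f ->
      let x = gensym () in
      Lam (x, reify b (f (reflect a (Var x))))
  | _ -> failwith "value does not match its type"

and reflect (t : ty) (e : term) : value =
  match t with
  | Base -> Syn e
  | Arrow (a, b) -> Fun (fun v -> reflect b (App (e, reify a v)))

(* Normalizing the native identity at type (Base -> Base) -> Base -> Base
   yields the eta-long term Lam ("x1", Lam ("x2", App (Var "x1", Var "x2"))). *)
let _ = reify (Arrow (Arrow (Base, Base), Arrow (Base, Base))) (Fun (fun v -> v))
```

In the paper, the residual term would instead be emitted as Caml byte code and loaded into the running program, which is what turns normalization into run-time code generation.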

1991 ◽  
Vol 1 (4) ◽  
pp. 459-494 ◽  
Author(s):  
Hanne Riis Nielson ◽  
Flemming Nielson

Traditional functional languages do not have an explicit distinction between binding times. It arises implicitly, however, as one typically instantiates a higher-order function with the arguments that are known, whereas the unknown arguments remain to be taken as parameters. The distinction between 'known' and 'unknown' is closely related to the distinction between binding times like 'compile-time' and 'run-time'. This leads to the use of a combination of polymorphic type inference and binding-time analysis for obtaining the required information about which arguments are known.

Following the current trend in the implementation of functional languages, we then transform the run-time level of the program (not the compile-time level) into categorical combinators. At this stage we have a natural distinction between two kinds of program transformations: partial evaluation, which involves the compile-time level of our notation, and algebraic transformations (i.e., the application of algebraic laws), which involve the run-time level of our notation.

By reinterpreting the combinators in suitable ways we obtain specifications of abstract interpretations (or data-flow analyses). In particular, the use of combinators makes it possible to use a general framework for specifying both forward and backward analyses. The results of these analyses will be used to enable program transformations that are not applicable under all circumstances.

Finally, the combinators can also be reinterpreted to specify code generation for various (abstract) machines. To improve the efficiency of the code generated, one may apply abstract interpretations to collect extra information for guiding the code generation. Peephole optimizations may be used to improve the code at the lowest level.
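As a small illustration of the compile-time/run-time distinction, here is an OCaml sketch (the function and its staging are our own illustration, not drawn from the paper, and residual code is represented as strings purely for readability): a power function whose exponent is known at specialization time and whose base is only known at run time.

```ocaml
(* A toy illustration of binding times: the exponent n is "known"
   (compile-time), the base x is "unknown" (run-time). *)

type code = string   (* run-time level: residual code, as a string for readability *)

(* The recursion over the static n is performed at specialization time;
   only run-time multiplications remain in the residual code. *)
let rec power_spec (x : code) (n : int) : code =
  if n = 0 then "1"
  else Printf.sprintf "(%s * %s)" x (power_spec x (n - 1))

let () = print_endline (power_spec "x" 3)
(* prints:  (x * (x * (x * 1)))  *)
```

A binding-time analysis is what decides, for every subexpression, whether it belongs to the compile-time level (like `n` above) or to the run-time level (like `x`), before any such specialization is attempted.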


1999 ◽  
Vol 6 (45) ◽  
Author(s):  
Torben Amtoft

We report on a case study in the application of partial evaluation, initiated by the desire to speed up a constraint-based algorithm for control-flow analysis. We designed and implemented a dedicated partial evaluator, able to specialize the analysis with respect to a given constraint graph and thus remove the interpretive overhead, and measured it with Feeley's Scheme benchmarks. Even though the gain turned out to be rather limited, our investigation yielded valuable feedback in that it provided a better understanding of the analysis, leading us to (re)invent an incremental version. We believe this phenomenon to be a quite frequent spinoff from using partial evaluation, since the removal of interpretive overhead makes the flow of control more explicit and hence pinpoints sources of inefficiency. Finally, we observed that partial evaluation in our case yields such regular, low-level specialized programs that it begs for run-time code generation.
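The following OCaml sketch (our own illustration, not Amtoft's partial evaluator) shows the interpretive overhead in question: a generic solver repeatedly traverses the constraint graph, while specializing the solver with respect to a fixed graph unfolds that traversal into straight-line propagation.

```ocaml
(* Constraints of the simple form "everything in cell i also flows into cell j". *)
module IntSet = Set.Make (Int)

type constr = Subset of int * int   (* cell i is a subset of cell j *)

(* Generic solver: interprets the list of constraints until a fixed point. *)
let solve (constraints : constr list) (cells : IntSet.t array) =
  let changed = ref true in
  while !changed do
    changed := false;
    List.iter
      (fun (Subset (i, j)) ->
        let union = IntSet.union cells.(j) cells.(i) in
        if not (IntSet.equal union cells.(j)) then begin
          cells.(j) <- union;
          changed := true
        end)
      constraints
  done

(* What specialization with respect to the fixed graph
   [Subset (0, 1); Subset (1, 2)] would ideally produce:
   straight-line propagation, no graph traversal.  (For this
   acyclic graph, a single pass in order reaches the fixed point.) *)
let solve_specialized (cells : IntSet.t array) =
  cells.(1) <- IntSet.union cells.(1) cells.(0);
  cells.(2) <- IntSet.union cells.(2) cells.(1)
```

The regularity of such residual programs is precisely why the paper suggests that run-time code generation is a natural next step.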


1990 ◽  
Vol 19 (337) ◽  
Author(s):  
Hanne Riis Nielson ◽  
Flemming Nielson

Traditional functional languages do not have an explicit distinction between binding times. It arises implicitly, however, as typically one instantiates a higher-order function with the arguments that are known, whereas the unknown arguments remain to be taken as parameters. The distinction between 'known' and 'unknown' is closely related to the distinction between binding times like 'compile-time' and 'run-time'. This leads to using a combination of polymorphic type inference and binding-time analysis for obtaining the required information about which arguments are known.

Following the current trend in the implementation of functional languages, we then transform the run-time level of the program (not the compile-time level) into categorical combinators. At this stage we have a natural distinction between two kinds of program transformations: partial evaluation, which involves the compile-time level of our notation, and algebraic transformations (i.e., the application of algebraic laws), which involve the run-time level of our notation.

By reinterpreting the combinators in suitable ways we obtain specifications of abstract interpretations (or data-flow analyses). In particular, the use of combinators makes it possible to use a general framework for specifying both forward and backward analyses. The results of these analyses will be used to enable program transformations that are not applicable under all circumstances.

Finally, the combinators can also be reinterpreted to specify code generation for various (abstract) machines. To improve the efficiency of the code generated, one may apply abstract interpretations to collect extra information for guiding the code generation. Peephole optimizations may be used to improve the code at the lowest level.
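To make the reinterpretation idea concrete, here is an OCaml sketch (the combinator set and the parity domain are invented for illustration; the paper works with categorical combinators and richer analyses): the same combinator term is given both its standard meaning and an abstract one, simply by changing the meaning assigned to each combinator.

```ocaml
(* A tiny combinator language over the integers. *)
type comb =
  | Id                       (* identity            *)
  | Comp of comb * comb      (* composition  g o f  *)
  | Succ                     (* successor           *)
  | Double                   (* doubling            *)

(* Standard interpretation. *)
let rec eval (c : comb) : int -> int =
  match c with
  | Id -> (fun n -> n)
  | Comp (g, f) -> (fun n -> eval g (eval f n))
  | Succ -> (fun n -> n + 1)
  | Double -> (fun n -> 2 * n)

(* Abstract interpretation over a toy parity domain. *)
type parity = Even | Odd | Top

let rec abs (c : comb) : parity -> parity =
  match c with
  | Id -> (fun p -> p)
  | Comp (g, f) -> (fun p -> abs g (abs f p))
  | Succ -> (function Even -> Odd | Odd -> Even | Top -> Top)
  | Double -> (fun _ -> Even)

let () =
  assert (eval (Comp (Succ, Double)) 3 = 7);       (* standard meaning *)
  assert (abs (Comp (Succ, Double)) Top = Odd)     (* abstract meaning *)
```

A third reinterpretation, in the same style, would map each combinator to instructions of an abstract machine, which is the code-generation reading mentioned in the abstract.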


2003 ◽  
Vol 38 (12) ◽  
pp. 44-56 ◽  
Author(s):  
Sam Kamin

2017 ◽  
Vol 27 (11) ◽  
pp. 1750178
Author(s):  
Lifang Cheng ◽  
Hongjun Cao

Three kinds of bifurcations of two Rulkov neurons coupled by electrical synapses are investigated in this paper. The critical normal forms are derived based on the center manifold theorem and normal form theory. For the flip and the Neimark–Sacker bifurcations, the quartic and higher-order terms in the normal forms, which originate from the Taylor expansion of the original system, are treated as higher-order terms. The effects of the quartic and quintic terms on the flip and Neimark–Sacker bifurcation structures are then discussed, verifying that the normal form is locally topologically equivalent to the original system for an infinitesimal 4-sphere of initial conditions and a tiny perturbation on the bifurcation curve. The flip–Neimark–Sacker bifurcation analysis reveals a novel firing pattern in which the orbit oscillates between two invariant cycles. Two disconnected cardioid-shaped cycles also appear, making one, two, three, four, or more turns before closing. Finally, we present a global bifurcation structure in the parameter space and exhibit the distribution of the periodic, quasi-periodic and chaotic firing patterns of the coupled neuron model.
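For orientation, the standard truncated normal forms of these two codimension-one bifurcations of maps, restricted to the center manifold, are recalled below; these are the generic textbook forms with unspecified coefficients, not the critical normal forms computed in the paper, and the quartic and higher terms are exactly the "higher order terms" referred to above.

```latex
% Flip (period-doubling) bifurcation on the one-dimensional center manifold:
\eta \;\mapsto\; -(1+\beta)\,\eta + c(\beta)\,\eta^{3} + O(\eta^{4})

% Neimark--Sacker bifurcation on the two-dimensional center manifold,
% written in a complex coordinate z:
z \;\mapsto\; (1+\beta)\, e^{i\theta(\beta)}\, z
      + c(\beta)\, z \,\lvert z \rvert^{2} + O(\lvert z \rvert^{4})
```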


2002 ◽  
Vol 9 (33) ◽  
Author(s):  
Vincent Balat ◽  
Olivier Danvy

We use a code generator, namely type-directed partial evaluation, to verify conversions between isomorphic types, or more precisely to verify that a composite function is the identity function at some complicated type. A typed functional language such as ML provides natural support for expressing the functions, and type-directed partial evaluation provides a convenient setting for obtaining the normal form of their composition. However, off-the-shelf type-directed partial evaluation turns out to yield gigantic normal forms.

We identify that this gigantism is due to redundancies, and that these redundancies originate in the handling of sums, which uses delimited continuations. We successfully eliminate these redundancies by extending type-directed partial evaluation with memoization capabilities. The result only works for pure functional programs, but it provides an unexpected use of code generation and it yields orders-of-magnitude improvements both in time and in space for type isomorphisms.
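As a concrete, hypothetical instance of the kind of verification involved, consider the isomorphism a * (b + c) ~ (a * b) + (a * c), written below in present-day OCaml with the standard-library `Either` sum type (the paper's Caml code predates it and represents sums differently). Type-directed partial evaluation normalizes the composite `round_trip`; the isomorphism is verified when that normal form is the eta-long identity, and the sum type is what forces the use of delimited continuations during normalization.

```ocaml
(* Two conversion functions witnessing  a * (b + c)  ~  (a * b) + (a * c). *)
let f : 'a * ('b, 'c) Either.t -> ('a * 'b, 'a * 'c) Either.t = function
  | (a, Either.Left b)  -> Either.Left (a, b)
  | (a, Either.Right c) -> Either.Right (a, c)

let g : ('a * 'b, 'a * 'c) Either.t -> 'a * ('b, 'c) Either.t = function
  | Either.Left (a, b)  -> (a, Either.Left b)
  | Either.Right (a, c) -> (a, Either.Right c)

(* The composite to be normalized; without memoization, residualizing the
   sum type duplicates work and yields the gigantic normal forms the
   abstract mentions. *)
let round_trip x = g (f x)
```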

