A Modern Look at GRIN, an Optimizing Functional Language Back End

2021 ◽  
Author(s):  
Peter Podlovics ◽  
Csaba Hruska ◽  
Andor Pénzes

GRIN is short for Graph Reduction Intermediate Notation, a modern back end for lazy functional languages. Most of the currently available compilers for such languages share a common flaw: they can only optimize programs on a per-module basis. The GRIN framework allows for interprocedural whole-program analysis, enabling optimizing code transformations across functions and modules as well. Some implementations of GRIN already exist, but most of them were developed only for experimentation purposes. Thus, they either compromise on low-level efficiency or contain ad hoc modifications compared to the original specification. Our goal is to provide a full-fledged implementation of GRIN by combining the best currently available technologies, such as LLVM, and to evaluate the framework's effectiveness by measuring how the optimizer improves the performance of certain programs. We also present some improvements to the existing components of the framework, including a typed representation for the intermediate language and an interprocedural program optimization, dead data elimination.
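
To give a concrete flavour of such a pass, below is a minimal, illustrative Haskell sketch (not the GRIN specification or the implementation described above) of a whole-program dead-data-elimination-style transformation: constructor fields that no Fetch anywhere in the program ever reads are dropped from every Store. The Expr type, its constructors, and the omission of field-index remapping are simplifying assumptions.

```haskell
import qualified Data.Set as Set

type Tag = String
type Var = String

data Expr
  = Store Tag [Var]     -- build a node with the given constructor fields
  | Fetch Tag Int Var   -- read field i of a node with this tag into a variable
  | Seq Expr Expr       -- sequencing of two expressions
  deriving Show

-- Collect, over the whole program, which (tag, field index) pairs are read.
usedFields :: Expr -> Set.Set (Tag, Int)
usedFields (Fetch t i _) = Set.singleton (t, i)
usedFields (Seq a b)     = usedFields a `Set.union` usedFields b
usedFields _             = Set.empty

-- Rebuild every Store, keeping only the fields some Fetch observes.
-- (A real pass would also remap the surviving field indices; omitted here.)
deadDataElim :: Expr -> Expr
deadDataElim prog = go prog
  where
    used = usedFields prog
    go (Store t fs) =
      Store t [f | (i, f) <- zip [0 ..] fs, (t, i) `Set.member` used]
    go (Seq a b) = Seq (go a) (go b)
    go e         = e

main :: IO ()
main = print (deadDataElim example)
  where
    -- Field 1 ("y") of CPair is never fetched, so it is dropped from the Store.
    example = Seq (Store "CPair" ["x", "y"]) (Fetch "CPair" 0 "r")
```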

2022 ◽  
Vol 6 (POPL) ◽  
pp. 1-28
Author(s):  
Amanda Liu ◽  
Gilbert Louis Bernstein ◽  
Adam Chlipala ◽  
Jonathan Ragan-Kelley

We present a lightweight Coq framework for optimizing tensor kernels written in a pure, functional array language. Optimizations rely on user scheduling using series of verified, semantics-preserving rewrites. Unusually for compilation targeting imperative code with arrays and nested loops, all rewrites are source-to-source within a purely functional language. Our language comprises a set of core constructs for expressing high-level computation detail and a set of what we call reshape operators, which can be derived from core constructs but trigger low-level decisions about storage patterns and ordering. We demonstrate that not only is this system capable of deriving the optimizations of existing state-of-the-art languages such as Halide and generating comparably performant code, but it is also able to schedule a family of useful program transformations beyond what is reachable in Halide.
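
As a flavour of what a semantics-preserving, source-to-source scheduling rewrite looks like, here is a small Haskell sketch (not the paper's Coq framework): a reshape-style rewrite splits a one-dimensional reduction into tiles, changing only the loop nest and traversal order, never the value computed. The function names and the assumption that the tile width divides the extent are illustrative.

```haskell
-- Reference form: a single traversal of the iteration space.
sumAll :: Int -> (Int -> Double) -> Double
sumAll n f = sum [f i | i <- [0 .. n - 1]]

-- Rewritten form: the iteration space split into tiles of width w
-- (assumes w divides n), the kind of storage/ordering decision a
-- scheduling rewrite would expose.
sumTiled :: Int -> Int -> (Int -> Double) -> Double
sumTiled n w f =
  sum [ sum [f (t * w + j) | j <- [0 .. w - 1]]
      | t <- [0 .. n `div` w - 1] ]

main :: IO ()
main = do
  let n = 16
      f i = fromIntegral i * 0.5
  -- Both sides compute the same value; the rewrite only reorders the loop nest.
  print (sumAll n f, sumTiled n 4 f)
```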


Author(s):  
Margarita Khomyakova

The author analyzes definitions of the concept of determinants of crime given by various scientists and offers her own definition. In this study, determinants of crime are understood as a set of its causes, the circumstances that contribute to its commission, and the dynamics of crime. It is noted that the Russian legislator, in Article 244 of the Criminal Code, defines the object of this criminal assault as public morality. Despite the use of evaluative concepts both in the disposition of this norm and in determining the specific object of the crime, the position of criminologists is unequivocal: crimes of this kind are immoral and are in irreconcilable conflict with generally accepted moral and legal norms. The paper also considers some views on making value judgments that could hardly apply to legal norms. According to the author, the reasons for abuse of the bodies of the dead include economic problems of the perpetrator and a low level of culture and legal awareness; this list is not exhaustive. The main circumstances that contribute to the commission of abuse of the bodies of the dead and their burial places are the following: low income and unemployment, a low level of criminological prevention, and poor maintenance and protection of medical institutions and cemeteries due to the underperformance of state and municipal bodies. This list of circumstances is also open-ended. Due to some factors, including a high level of latency, it is not possible to reflect the dynamics of such crimes objectively. At the same time, identification of the determinants of abuse of the bodies of the dead will help reduce the number of such crimes.


2019 ◽  
Vol 8 (2S8) ◽  
pp. 1463-1468

Software program optimization for improved execution speed can be achieved by modifying the program. Programs are usually written in high-level languages and then translated into low-level assembly language. More thorough optimization and performance analysis can be performed at the low level than at the high level. Optimization improvement is measured as the difference in program execution performance. Several methods are available for measuring program performance; they are classified into static approaches and dynamic approaches. This paper presents an alternative method of measuring code performance statically that is more accurate than commonly used code analysis metrics. The proposed new metrics are designed to expose the effectiveness of optimizations performed on code, specifically unroll optimizations. A loop unrolling optimization is used to demonstrate the increased accuracy of the proposed metric. The results of the study show that measuring Instructions Performed and Instruction Latency is a more accurate static metric than Instruction Count and the metrics based on it.
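
A toy sketch of the idea behind such metrics, under invented numbers: a plain instruction count ignores how often a block executes, whereas an "instructions performed" figure weights each block by its trip count, and a latency-weighted variant additionally factors in per-instruction latency. The record fields and the 4x-unrolling example below are assumptions for illustration only, not the paper's measurements.

```haskell
data Block = Block
  { instrCount :: Int      -- static instructions in the block
  , tripCount  :: Int      -- statically known (or assumed) number of executions
  , avgLatency :: Double   -- assumed average latency per instruction
  }

-- Static size only: rises when a loop body is unrolled.
instructionCount :: [Block] -> Int
instructionCount = sum . map instrCount

-- Weighted by trip count: falls when unrolling removes loop-control overhead.
instructionsPerformed :: [Block] -> Int
instructionsPerformed = sum . map (\b -> instrCount b * tripCount b)

-- Additionally weighted by an assumed per-instruction latency.
weightedLatency :: [Block] -> Double
weightedLatency =
  sum . map (\b -> fromIntegral (instrCount b * tripCount b) * avgLatency b)

main :: IO ()
main = do
  -- A 4-instruction loop body run 100 times, before and after 4x unrolling.
  let rolled   = [Block 4 100 1.0, Block 2 100 1.0]  -- body + loop control
      unrolled = [Block 16 25 1.0, Block 2 25 1.0]
  print (instructionCount rolled,   instructionsPerformed rolled,
         weightedLatency rolled)
  print (instructionCount unrolled, instructionsPerformed unrolled,
         weightedLatency unrolled)
```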


Author(s):  
Daniel Kroening

This chapter covers an application of propositional satisfiability to program analysis. We focus on the discovery of programming flaws in low-level programs, such as embedded software. The loops in the program are unwound together with a property to form a formula, which is then converted into CNF. The method supports low-level programming constructs such as bit-wise operators or pointer arithmetic.
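
A minimal sketch of the unwinding step (not the chapter's actual tool implementation): a while loop is replaced by a bounded nest of ifs over the same body, terminated by an unwinding assertion stating that the loop would have exited; the resulting loop-free program can then be translated into a formula and converted to CNF. The Stmt constructors below are illustrative assumptions.

```haskell
data Stmt
  = While String Stmt     -- while (cond) body
  | If String Stmt        -- if (cond) body
  | Assert String         -- property / unwinding assertion
  | Seq [Stmt]            -- statement sequence
  | Assign String String  -- variable := expression (both kept as text here)
  deriving Show

-- Replace every loop by k copies of its body guarded by the loop condition,
-- followed by an assertion that the condition no longer holds.
unwind :: Int -> Stmt -> Stmt
unwind k (While c body) = go k
  where
    go 0 = Assert ("!(" ++ c ++ ")")            -- unwinding assertion
    go n = If c (Seq [unwind k body, go (n - 1)])
unwind k (If c s) = If c (unwind k s)
unwind k (Seq ss) = Seq (map (unwind k) ss)
unwind _ s        = s

main :: IO ()
main = print (unwind 2 (While "i < n" (Assign "i" "i + 1")))
```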


Author(s):  
Marcos Lordello Chaim ◽  
Daniel Soares Santos ◽  
Daniela Soares Cruzes

Buffer overflow (BO) is a well-known and widely exploited security vulnerability. Despite the extensive body of research, BO still threatens security-critical applications. The authors present a comprehensive systematic review of techniques intended to detect BO vulnerabilities before software is released to production. They found that most of the studies address several vulnerabilities or memory errors and are not specific to BO detection. The authors organized them into seven categories: program analysis, testing, computational intelligence, symbolic execution, models, and code inspection. Program analysis, testing, and code inspection techniques are available for use by practitioners. However, the adoption of program analysis is hindered by a high number of false alarms; testing is broadly used, but in an ad hoc manner; and code inspection can be used in practice provided it is added as a task of the software development process. New techniques combining object code analysis with techniques from different categories seem a promising research avenue towards practical BO detection.
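
As a minimal illustration of the code-inspection / program-analysis style of detection, here is a toy Haskell scanner that flags calls to C library functions that copy into a buffer without a length bound. Real detectors reason about buffer sizes, data flow, and object code; this line-based pattern match and its list of unsafe calls are illustrative assumptions only.

```haskell
import Data.List (isInfixOf)

-- C calls that write to a destination buffer with no length bound.
unsafeCalls :: [String]
unsafeCalls = ["strcpy(", "strcat(", "sprintf(", "gets("]

-- Report every unbounded copy found on a numbered source line.
flagLine :: (Int, String) -> [String]
flagLine (n, line) =
  [ "line " ++ show n ++ ": possible buffer overflow via " ++ f
  | f <- unsafeCalls, f `isInfixOf` line ]

scan :: String -> [String]
scan src = concatMap flagLine (zip [1 ..] (lines src))

main :: IO ()
main = mapM_ putStrLn (scan example)
  where
    example = unlines
      [ "char buf[16];"
      , "strcpy(buf, user_input);   /* no bound check */"
      , "snprintf(buf, sizeof buf, \"%s\", user_input);"
      ]
```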


1998 ◽  
Vol 8 (4) ◽  
pp. 319-321 ◽  
Author(s):  
SIMON PEYTON JONES ◽  
PHIL WADLER

In the end, research on functional languages does little good unless they are used to write something other than compilers for functional languages. However, if one scans a typical functional programming conference or journal, one mainly sees papers on twists in language design, speed-ups in compiled code, clever new analyses, or refinements to semantic models. It is much less common to see a paper that considers a functional language as a tool to some other practical end. We would like to see this change. The Journal of Functional Programming carries, and will continue to carry, articles on all aspects of functional programming, from lambda calculus theory to language design to implementation. But we have especially sought, and will continue to seek, papers on functional programming practice and experience. Research and papers on practice and experience sometimes receive less attention because they are perceived as possessing less academic content. So we want to remind potential authors that we have published a number of papers on this topic in the past, and to spell out the criteria we apply to such papers.


1993 ◽  
Vol 04 (01) ◽  
pp. 5-16 ◽  
Author(s):  
ALBERTO BROGGI ◽  
VINCENZO D'ANDREA ◽  
GIULIO DESTRI

In this paper we discuss the use of the Cellular Automata (CA) computational model in computer vision applications on massively parallel architectures. Motivations and guidelines of this approach to low-level vision within the framework of the PROMETHEUS project are discussed. The hard real-time requirements of actual applications can only be satisfied using an ad hoc VLSI massively parallel architecture (PAPRICA). The hardware solutions and the specific algorithms can be efficiently verified and tested only by using, as a simulator, a general-purpose machine with a parent architecture (CM-2). An example application related to feature extraction is discussed.
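
To make the CA model concrete, below is a toy Haskell sketch (not the PAPRICA algorithms): every pixel is a cell updated by the same local rule over its 4-neighbourhood, here a binary edge detector that keeps a set pixel only if at least one neighbour is unset. The grid representation and the rule are illustrative assumptions.

```haskell
type Grid = [[Int]]

-- Read a cell, treating everything outside the grid as background (0).
at :: Grid -> Int -> Int -> Int
at g r c
  | r < 0 || c < 0 || r >= length g || c >= length (head g) = 0
  | otherwise = g !! r !! c

-- One synchronous CA step: a cell is an edge pixel if it is set and
-- at least one of its 4-neighbours is not.
step :: Grid -> Grid
step g = [ [ edge r c | c <- [0 .. cols - 1] ] | r <- [0 .. rows - 1] ]
  where
    rows = length g
    cols = length (head g)
    edge r c
      | at g r c == 1 &&
        minimum [at g (r-1) c, at g (r+1) c, at g r (c-1), at g r (c+1)] == 0 = 1
      | otherwise = 0

main :: IO ()
main = mapM_ print (step square)
  where
    -- A 5x5 image containing a filled 3x3 square; the step keeps its outline.
    square = [ [ if r > 0 && r < 4 && c > 0 && c < 4 then 1 else 0
               | c <- [0 .. 4] ] | r <- [0 .. 4] ]
```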


Author(s):  
Vanessa Grotti ◽  
Marc Brightman

In this chapter we consider the afterlife of the remains of unidentified migrants who have died while attempting to cross the Mediterranean from Albania and North Africa to Italy. Drawing on insights from long-term, multi-sited field research, we outline paths taken by human remains and consider their multiple agencies and distributed personhood through the relational modalities with which they are symbolically and materially engaged at different scales of significance. The rising number of migrant deaths related to international crossings worldwide, especially in the Mediterranean, has stimulated a large body of scholarship, which generally relies upon a hermeneutics of secular transitional justice and fraternal transnationalism. We explore an alternative approach by focusing on the material and ritual afterlife of unidentified human remains at sea, examining the effects they have on their hosting environment. The treatment of dead strangers (across the double threshold constituted by the passage from life to death on the one hand and the rupture of exile on the other) raises new questions for the anthropology of death. We offer an interpretation of both ad hoc and organized recovery operations and mortuary practices, including forensic identification procedures, and collective and single burials of dead migrants, as acts of hospitality. Hosting the dead operates at different scales: it takes the politically charged form of memorialization at the levels of the state and the local community; however, while remembrance practices for dead strangers emphasize the latter’s status as a collective category, forensic technologies of remembrance are directed toward the reconstruction of (in)dividual personhood. These ritual and technological processes of memorialization and re-attachment together awaken ghosts of Italian fascism and colonialism.


Author(s):  
José L. Revilla

Decommissioning a nuclear power plant generates an enormous amount of radioactive waste with a very low level of contamination. A risk optimisation analysis would indicate that some of these residual materials need not be handled, processed or disposed of with any reference to their radioactivity content, in order to allow a more beneficial allocation of limited social resources. This analysis could also be applied to the release of the site once a particular facility is decommissioned; remedial or restoration actions should be subjected to an optimisation process for selecting the best strategy of remedial measures. In order to make this release from regulatory control possible, it is necessary to establish conditions for the site or for these materials to be managed during their later reuse or final disposal. Authorisation for this release or clearance from control is a responsibility of the competent authority and, in the case of Spain, is carried out by the CSN (Spanish Nuclear Safety Council) on an "ad hoc", case-by-case basis. Some personal considerations linked with the exemption policy and the application of radiological protection principles and criteria to the release authorisation of sites and solid materials generated within a regulated facility are presented in the paper. The main aim of this paper is to present the management options for very low level waste materials that are considered in the Dismantling and Closure Plan authorisation granted for the Vandellós 1 NPP decommissioning project. A framework consisting of three basic possibilities to apply clearance appears in the mentioned authorisation:
• A first set of unconditional clearance levels N1, expressed in terms of gross activity concentration and surface contamination, has been issued for the unrestricted release of materials. Derived unconditional generic clearance levels, based on published international guidance, are also accepted.
• Generic use of derived conditional clearance levels N2, based on "ad hoc" internationally published guidance, has been established for particular waste streams managed in well-defined non-regulated practices (metallic scrap recycling and concrete demolition debris).
• The applicant may also propose candidate materials for other non-regulated management practices, for which specific conditional clearance levels N3 can be issued by the Nuclear Safety Council.
In all cases, control procedures have to be imposed on the licensee producing the residual materials, and these can be verified by the Safety Authority. They are based on the certification of the radionuclide content, supported by quality controls and the maintenance of records. There is, as yet, no official criterion for the remediation of land and release of the site, but probably the same radiological analysis will be used when evaluating the restoration plan application. A kind of "rubblization" is being considered by the licensee, using the above-mentioned third possibility for conditional clearance of the rubble produced in the dismantling of some particular buildings.
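
Purely as an illustration of the three-route clearance decision described above, the Haskell sketch below classifies a material by comparing its measured gross activity concentration against hypothetical N1 (unconditional), N2 (conditional, defined waste stream) and N3 (case-by-case) levels. The numeric thresholds in the example are invented placeholders, not values from the Vandellós 1 authorisation.

```haskell
data Clearance
  = Unconditional          -- below N1: unrestricted release
  | Conditional String     -- below N2: release into a defined non-regulated practice
  | CaseByCase             -- below N3: specific levels set case by case by the CSN
  | Regulated              -- otherwise: stays under regulatory control
  deriving Show

-- Classify a material from its gross activity concentration and the intended
-- non-regulated waste stream; thresholds n1 <= n2 <= n3 are assumed given.
classify :: Double -> Double -> Double -> String -> Double -> Clearance
classify n1 n2 n3 stream activity
  | activity <= n1 = Unconditional
  | activity <= n2 = Conditional stream
  | activity <= n3 = CaseByCase
  | otherwise      = Regulated

main :: IO ()
main = print (classify 0.1 1.0 10.0 "metallic scrap recycling" 0.4)
```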

