Towards a Generic Framework for Trustworthy Program Refactoring

2021
Author(s): Dániel Horpácsi, Judit Kőszegi, Dávid J. Németh

Refactoring has to preserve the dynamics of the transformed program with respect to a particular definition of semantics and behavioral equivalence. However, relating executable refactoring implementations to the formal semantics of the transformed language is invariably challenging. There are a number of approaches to specifying program transformations on various kinds of program models, but the trustworthiness of refactoring still needs to be improved by means of formal verification. We propose a specification formalism and a generic framework for processing it, designed to support semi-automatic execution and formal verification, and to be adaptable to multiple paradigms.
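
To make the preservation requirement concrete, here is a minimal sketch in Haskell (our illustration, not the authors' formalism): a toy expression language, an evaluation semantics, and the proof obligation that any trustworthy refactoring must satisfy. All names (Expr, eval, refactor, preserves) are ours.

    -- Toy expression language and its semantics (illustrative only).
    data Expr = Lit Int | Add Expr Expr | Mul Expr Expr
      deriving (Show, Eq)

    eval :: Expr -> Int
    eval (Lit n)   = n
    eval (Add a b) = eval a + eval b
    eval (Mul a b) = eval a * eval b

    -- Example refactoring: strength reduction, rewriting e * 2 to e + e.
    refactor :: Expr -> Expr
    refactor (Mul e (Lit 2)) = let e' = refactor e in Add e' e'
    refactor (Add a b)       = Add (refactor a) (refactor b)
    refactor (Mul a b)       = Mul (refactor a) (refactor b)
    refactor e               = e

    -- The obligation a verification framework would discharge for all
    -- programs: the refactoring preserves the observable semantics.
    preserves :: Expr -> Bool
    preserves e = eval (refactor e) == eval e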

Recycling, 2019, Vol 4 (2), pp. 23
Author(s): Julia Tanzer, Helmut Rechberger

Circular economy is currently characterized by various definitions, measurement approaches, and critical analyses thereof coexisting alongside each other. Whether the concept eventually prevails or collapses will depend to some extent on our success in harmonizing assessment methods among public, scientific, and private institutions, as well as across different materials and scales. In this article, we therefore present a generic material flow analysis framework that might serve as a common basis for circularity assessment, and test it by means of three case studies. It proved impossible to eliminate all subjective assumptions when transforming a real complex system into the generic framework, especially regarding the definition of by-products. However, by introducing subsystems it is at least possible to make such assumptions transparent, which provides adequate comparability across regions, materials, and scales. Moreover, the generic system allows for the coupled analysis of multiple materials simultaneously, so that interactions between them can be studied and deeper insight into the overall sustainability of the system can be gained.
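
As a toy illustration of what such a framework has to pin down (our sketch, not the authors' model; the node and flow names are invented), the two basic computations are the mass balance of each process and a circularity indicator read off the flows:

    -- Flows of a given mass between named processes of a material flow system.
    data Flow = Flow { source :: String, sink :: String, mass :: Double }

    -- Mass balance: input of a process must equal its output (steady state).
    balanced :: String -> [Flow] -> Bool
    balanced node flows = abs (inflow - outflow) < 1e-9
      where inflow  = sum [mass f | f <- flows, sink f   == node]
            outflow = sum [mass f | f <- flows, source f == node]

    -- Share of a process's input supplied by a designated recycling process
    -- (assumes the node has at least one input flow).
    inputCircularity :: String -> String -> [Flow] -> Double
    inputCircularity recycler node flows =
      sum [mass f | f <- inputs, source f == recycler] / sum (map mass inputs)
      where inputs = [f | f <- flows, sink f == node]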


2003, Vol 9 (3), pp. 273-298
Author(s): Akihiro Kanamori

For the modern set theorist the empty set Ø, the singleton {a}, and the ordered pair 〈x, y〉 are at the beginning of the systematic, axiomatic development of set theory, both as a field of mathematics and as a unifying framework for ongoing mathematics. These notions are the simplest building blocks in the abstract, generative conception of sets advanced by the initial axiomatization of Ernst Zermelo [1908a] and are quickly assimilated long before the complexities of Power Set, Replacement, and Choice are broached in the formal elaboration of the ‘set of’ {x | φ} operation. So it is surprising that, while these notions are unproblematic today, they were once sources of considerable concern and confusion among leading pioneers of mathematical logic like Frege, Russell, Dedekind, and Peano. In the development of modern mathematical logic out of the turbulence of 19th-century logic, the emergence of the empty set, the singleton, and the ordered pair as clear and elementary set-theoretic concepts serves as a motif that reflects and illuminates larger and more significant developments in mathematical logic: the shift from the intensional to the extensional viewpoint, the development of type distinctions, the logical vs. the iterative conception of set, and the emergence of various concepts and principles as distinctively set-theoretic rather than purely logical. Here there is a loose analogy with Tarski's recursive definition of truth for formal languages: the mathematical interest lies mainly in the procedure of recursion and the attendant formal semantics in model theory, whereas the philosophical interest lies mainly in the basis of the recursion, truth and meaning at the level of basic predication. Circling back to the beginning, we shall see how central the empty set, the singleton, and the ordered pair were, after all.
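
For reference, the set-theoretic renderings that eventually became standard, and whose emergence the paper traces (Kuratowski's reduction of the pair dates from 1921):

    \emptyset = \{x : x \neq x\}, \qquad
    \{a\} = \{x : x = a\}, \qquad
    \langle x, y \rangle = \{\{x\}, \{x, y\}\}

Under Kuratowski's definition, the characteristic property of the pair, $\langle x, y\rangle = \langle u, v\rangle \iff (x = u \wedge y = v)$, is a theorem rather than an axiom.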


Author(s): Shaoying Liu

FRSM (Formal Requirements Specification Method) is a structured formal language and method for requirements analysis and specification construction based on data flow analysis. It uses a formalized DeMarco data flow diagram to describe the overall structure of systems and a VDM-SL-like formal notation to describe precisely the functionality of the components in the diagrams. This paper first describes the formal syntax and semantics of FRSM and then presents an example of using the axioms and inference rules given in the definition of the formal semantics to check the consistency of specifications. A case study of applying FRSM to a practical example is described to demonstrate the principle of constructing requirements specifications and to uncover the benefits and deficiencies of FRSM.
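
A rough rendering of the idea in Haskell (our sketch; FRSM's actual VDM-SL-like notation differs): a component of a data flow diagram is specified by a precondition on its input flow and a postcondition relating input to output, and one consistency obligation is that the specified behaviour is satisfiable for every legal input.

    -- A data-flow component with a VDM-style pre/post specification.
    data Component i o = Component
      { name :: String
      , pre  :: i -> Bool        -- admissible input data flows
      , post :: i -> o -> Bool   -- required input/output relation
      }

    -- Consistency on a given input: if the precondition holds, some
    -- candidate output must satisfy the postcondition.
    satisfiableOn :: Component i o -> i -> [o] -> Bool
    satisfiableOn c x candidates = not (pre c x) || any (post c x) candidates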


2007, Vol 6 (4), pp. 191-200
Author(s): Jenny Moon

Reflection, reflective learning, reflective writing and reflective practice are used increasingly in higher education and professional development, but we do not work to one definition, and there are considerable differences in the views of educationists on issues of definition. Such discrepancies can exist between staff working with the same student group. The situation can lead to difficulties in indicating to students how to reflect and what reflective writing ‘should look like’. Once students do manage to represent their reflection broadly in the required manner (usually writing), a further problem is frequently observed: their reflection is superficial and descriptive. A consequence is that their learning from the reflective process is restricted.

This paper addresses the issue of the definition of reflection, initially by clarifying the different words used around the notion of reflection (e.g., reflection, reflective learning, reflective writing) and providing some suggested definitions. It then addresses both how we should help students to start with reflection and the problem of the superficiality of much of their work. The ‘depth’ of reflection is a concept that has not been much discussed in the literature of reflection, and yet it seems to be closely related to the quality of reflective work. The paper discusses the concept of depth and then introduces a style of exercise in which a scenario is reproduced at progressively deeper levels of reflection. The exercise is related to a generic framework for reflective writing. The rationale and justification for the exercise and the framework are discussed, and suggestions are made for their manner of use. The exercise and the generic framework for reflective writing are in Appendices 1 and 2.

The use of reflection to enhance formal learning has become increasingly common in the past 7 years. From its beginnings in the professional development of nurses and teachers, its use has spread through other professions. Now, in the form of personal development planning (PDP), there is an expectation that all students in higher education will be deliberately engaging in reflection in the next 2 years.1 In addition, there are examples of the use of reflective learning journals and other reflective techniques in most, if not all, disciplines.2

Reflection is not, however, a clearly defined and enacted concept. People hold different views of its nature, which only become revealed at stages such as assessment. For example, what is it that differentiates reflective writing from simple description? There are difficulties not only with the definition itself but also in conveying to learners what it is that we require them to do in reflection, and in encouraging reflection that is deeper than description. In this paper, we consider some issues of definition and then focus on the means of encouraging learners to produce a reflective output of good-enough quality for the task at hand. The latter is presented as an exercise for staff and learners (Appendix 1), with a framework that underpins it (Appendix 2).


Author(s): Noura Boudiaf, Farid Mokhati, Mourad Badri

Model-checking-based verification techniques represent an important issue in the field of concurrent systems quality assurance. The lack of formal semantics in existing formalisms for describing multi-agent models, combined with the complexity of multi-agent systems, is a source of several problems during their development process. The Maude language, based on rewriting logic, offers a rich notation supporting the formal specification and implementation of concurrent systems. In addition to its modeling capacity, the Maude environment integrates a model checker based on Linear Temporal Logic (LTL) for the verification of distributed systems. In this paper, we present a formal and generic framework (DIMA-Maude) supporting the formal description and verification of DIMA multi-agent models.
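
As an illustration of the kind of property such an LTL model checker can establish over an agent model (a generic example of ours, not one taken from the paper): every request an agent issues is eventually served,

    \square \bigl( \mathit{requested} \rightarrow \lozenge\, \mathit{served} \bigr)

where $\square$ reads "always" and $\lozenge$ reads "eventually".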


2014, pp. 297-323
Author(s): Paolo Arcaini, Angelo Gargantini, Elvinia Riccobene, Patrizia Scandurra

Domain Specific Languages (DSLs) are often defined in terms of metamodels capturing the abstract syntax of the language. For a complete definition of a DSL, both the syntactic and the semantic aspects of the language have to be specified. Metamodeling environments support syntactic definition issues, but they do not provide any help in defining the semantics of metamodels, which is usually given in natural language. In this chapter, the authors present an approach to formally define the semantics of metamodel-based languages. It is based on a translational technique that hooks to the language metamodel its precise and executable semantics, expressed in terms of the Abstract State Machine formal method. The chapter also shows how different techniques can be used for the formal analysis of models (i.e., instances of the language metamodel). The authors exemplify the use of their approach on a language for Petri nets.
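
The execution model that makes an ASM semantics "executable" is small; here is a toy rendering in Haskell (ours, not the authors' tooling): a state maps locations to values, a rule computes an update set, and a step applies the whole set atomically when it is consistent.

    import qualified Data.Map as M

    type Loc    = String
    type Val    = Int
    type State  = M.Map Loc Val
    type Update = (Loc, Val)
    type Rule   = State -> [Update]   -- an ASM rule yields an update set

    -- One ASM step: apply the update set atomically if no location is
    -- assigned two different values; otherwise the machine gets stuck.
    step :: Rule -> State -> Maybe State
    step rule s
      | consistent = Just (foldr (\(l, v) st -> M.insert l v st) s us)
      | otherwise  = Nothing
      where us         = rule s
            consistent = and [v == v' | (l, v) <- us, (l', v') <- us, l == l']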


2020
Author(s): Vasil Dinev Penchev

A formal model of metaphor is introduced. It models metaphor first as an interaction of “frames”, following frame semantics, and then as a wave function in Hilbert space. The practical way to assign a probability distribution, and a corresponding wave function, to a given metaphor in a given language is considered. From this, a series of formal definitions is deduced for “representation”, “reality”, “language”, “ontology”, etc., all based on Hilbert space. A few statements about a quantum computer are implied: the so-defined reality is inherent and internal to it; it can report a result only “metaphorically”; transmitting the result “literally”, i.e. absolutely exactly, would demolish it. A new and different formal definition of metaphor is then introduced, as a few entangled wave functions corresponding to different “signs” in different languages, formally defined as above. The change of frames, i.e. the change from the one formal definition of metaphor to the other, is interpreted as a formal definition of thought. Four areas of cognition are unified as different but isomorphic interpretations of the mathematical model based on Hilbert space: quantum mechanics, frame semantics, formal semantics by means of a quantum computer, and the theory of metaphor in linguistics.
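
One way to read the central construction (our paraphrase of the abstract, not a formula quoted from the paper): a metaphor in a given language is assigned a probability distribution $p_i$ over the frames it can evoke, encoded as the amplitudes of a normalized vector in Hilbert space:

    |\mathrm{metaphor}\rangle \;=\; \sum_i c_i\, |\mathrm{frame}_i\rangle,
    \qquad \sum_i |c_i|^2 = 1, \qquad p_i = |c_i|^2 .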


Author(s): Michael Backes, Aniket Kate, Praveen Manoharan, Sebastian Meiser, Esfandiar Mohammadi

Anonymous communication (AC) protocols such as the widely used Tor network have been designed to provide anonymity over the Internet to their participating users. While AC protocols have been the subject of several security and anonymity analyses in recent years, there still does not exist a framework for analyzing these complex systems and their different anonymity properties in a unified manner. In this work we present AnoA: a generic framework for defining, analyzing, and quantifying anonymity properties for AC protocols. In addition to quantifying the (additive) advantage of an adversary in an indistinguishability-based definition, AnoA uses a multiplicative factor, inspired by differential privacy. AnoA enables a unified quantitative analysis of well-established anonymity properties, such as sender anonymity, sender unlinkability, and relationship anonymity. AnoA modularly specifies adversarial capabilities by a simple wrapper construction, called adversary classes. We examine the structure of these adversary classes and identify conditions under which it suffices to establish anonymity guarantees for single messages in order to derive guarantees for arbitrarily many messages. This leads us to the definition of Plug’n’Play adversary classes (PAC), which are easy to use, expressive, and satisfy this condition. We prove that our framework is compatible with the universal composability (UC) framework and show how to apply AnoA to a simplified version of Tor against passive adversaries, leveraging a recent realization proof in the UC framework.
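
Schematically (our rendering of the indistinguishability game, with simplified notation), the differential-privacy-style guarantee says that for every adversary A distinguishing the two challenge behaviours b ∈ {0, 1},

    \Pr[\,A = 0 \mid b = 0\,] \;\le\; e^{\varepsilon} \cdot \Pr[\,A = 0 \mid b = 1\,] + \delta,

so the additive advantage $\delta$ is complemented by the multiplicative factor $e^{\varepsilon}$.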


2007, Vol 14 (14)
Author(s): Olivier Danvy, Michael Spivey

Over forty years ago, David Barron and Christopher Strachey published a startlingly elegant program for the Cartesian product of a list of lists, expressing it with three nested occurrences of the function we now call foldr. This program is remarkable for its time because of its masterful display of higher-order functions and lexical scope, and we put it forward as possibly the first ever functional pearl. We first characterize it as the result of a sequence of program transformations, and then apply similar transformations to a program for the classical power set example. We also show that using a higher-order representation of lists allows a definition of the Cartesian product function where foldr is nested only twice.
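
A reconstruction of the triply nested program in modern Haskell (our transcription; Barron and Strachey wrote in CPL, so details may differ from the original):

    -- Cartesian product of a list of lists, with foldr nested three times:
    -- over the list of lists, over each list, and over the accumulated product.
    cp :: [[a]] -> [[a]]
    cp = foldr (\xs acc -> foldr (\x r -> foldr (\ys s -> (x : ys) : s) r acc) [] xs)
               [[]]

    -- cp [[1,2],[3,4]] == [[1,3],[1,4],[2,3],[2,4]]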

