Expressing High-Level Scientific Claims with Formal Semantics

2021 ◽  
Author(s):  
Cristina-Iulia Bucur ◽  
Tobias Kuhn ◽  
Davide Ceolin ◽  
Jacco van Ossenbruggen

2008 ◽  
Vol 18 (5-6) ◽  
pp. 649-706 ◽  
Author(s):  
KEVIN DONNELLY ◽  
MATTHEW FLUET

Abstract: Concurrent programs require high-level abstractions in order to manage complexity and enable compositional reasoning. In this paper, we introduce a novel concurrency abstraction, dubbed transactional events, which combines first-class synchronous message-passing events with all-or-nothing transactions. This combination enables simple solutions to interesting problems in concurrent programming. For example, guarded synchronous receive can be implemented as an abstract transactional event, whereas in other languages it requires a non-abstract, non-modular protocol. As another example, three-way rendezvous can be implemented as an abstract transactional event, which is impossible using first-class events alone. Both solutions are easy to code and easy to reason about. The expressive power of transactional events arises from a sequencing combinator whose semantics enforces an all-or-nothing transactional property – either both of the constituent events synchronize in sequence or neither of them synchronizes. This sequencing combinator, along with a non-deterministic choice combinator, gives transactional events the compositional structure of a monad-with-plus. We provide a formal semantics for transactional events and give a detailed account of an implementation.
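The monad-with-plus structure described in the abstract can be illustrated with a minimal sketch. The paper's implementation is in Haskell; the following is an assumed Python model (names like `then_evt` and `choose_evt` are illustrative, not the paper's API) in which an event denotes the list of outcomes it could commit to, sequencing is monadic bind, and choice is list concatenation:

```python
# Illustrative model of transactional-event combinators (hypothetical Python
# sketch; the paper's implementation is in Haskell). An event is modeled as
# the list of outcomes it could synchronize on; sequencing is all-or-nothing:
# a sequenced event has an outcome only if both halves do.

class Evt:
    def __init__(self, outcomes):
        self.outcomes = list(outcomes)

def always_evt(x):              # unit: trivially synchronizes with result x
    return Evt([x])

def never_evt():                # mzero: an event that can never synchronize
    return Evt([])

def then_evt(evt, f):           # bind: sequence evt with f, all-or-nothing
    return Evt(y for x in evt.outcomes for y in f(x).outcomes)

def choose_evt(e1, e2):         # mplus: non-deterministic choice
    return Evt(e1.outcomes + e2.outcomes)

def sync(evt):                  # commit to some outcome (here: the first)
    if not evt.outcomes:
        raise RuntimeError("no possible synchronization")
    return evt.outcomes[0]
```

The all-or-nothing property shows up directly in the model: sequencing anything after `never_evt()` yields an event with no outcomes, so neither half synchronizes.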


2020 ◽  
Vol 177 (3-4) ◽  
pp. 203-234
Author(s):  
Elvira Albert ◽  
Nikolaos Bezirgiannis ◽  
Frank de Boer ◽  
Enrique Martin-Martin

We present a formal translation of a resource-aware extension of the Abstract Behavioral Specification (ABS) language to the functional language Haskell. ABS is an actor-based language tailored to the modeling of distributed systems. It combines asynchronous method calls with a suspend-and-resume mode of execution of method invocations. To cater for the resulting cooperative scheduling of an actor's method invocations, the translation compiles ABS methods into Haskell functions with continuations. The main result of this article is a correctness proof of the translation by means of a simulation relation between a formal semantics of the source language and a high-level operational semantics of the target language, i.e., a subset of Haskell. We further prove that the resource consumption of an ABS program extended with a cost model is preserved over this translation: the cost of executing the ABS program and the cost of executing its Haskell translation are equal, so the resources consumed by the original ABS program and by the generated Haskell program coincide under the given cost model. Consequently, the resource bounds automatically inferred for ABS programs extended with a cost model, using resource analysis tools, are also sound resource bounds for the translated Haskell programs. Our experimental evaluation confirms this resource preservation over a set of benchmarks featuring different asymptotic costs.
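The cooperative scheduling described above can be sketched with coroutines. The paper's target is Haskell functions with continuations; here is a hypothetical Python analogue in which each method invocation is a generator and `yield` plays the role of ABS's suspend, handing control back to the actor's scheduler (all names are illustrative):

```python
# Hypothetical sketch of ABS-style cooperative scheduling. Each method is a
# generator; `yield` suspends the invocation, capturing the rest of the
# method as a continuation that the actor's scheduler resumes later. At most
# one invocation of an actor runs at a time.

from collections import deque

class Actor:
    def __init__(self):
        self.queue = deque()    # pending method continuations

    def async_call(self, method, *args):
        self.queue.append(method(self, *args))   # asynchronous call: enqueue

    def run(self):              # cooperative scheduler for this actor
        while self.queue:
            task = self.queue.popleft()
            try:
                next(task)                  # run until the next suspend
                self.queue.append(task)     # suspended: requeue continuation
            except StopIteration:
                pass                        # method invocation finished

def worker(actor, log, name):
    log.append(name + ":start")
    yield                       # suspend; lets other invocations interleave
    log.append(name + ":end")
```

Two invocations of `worker` on the same actor interleave at the suspension point but never preempt each other mid-step, which is the scheduling discipline the translation has to preserve.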


1985 ◽  
Vol 14 (198) ◽  
Author(s):  
Kurt Jensen ◽  
Erik Meineche Schmidt

This paper describes the formal semantics of a subset of PASCAL by means of a semantic model based on a combination of denotational semantics and high-level Petri nets. It is our intention that the paper be used as part of the written material for an introductory course in Computer Science.
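The denotational half of such a semantic model can be shown in miniature. The following is an assumed example (not taken from the paper): each command of a tiny imperative language denotes a function from stores to stores, and sequential composition denotes function composition:

```python
# Minimal denotational-semantics sketch (illustrative example): a command
# denotes a store-to-store function; stores are dicts from names to values.

def skip():                         # [[skip]] = identity on stores
    return lambda store: store

def assign(x, expr):                # [[x := e]](s) = s[x -> [[e]](s)]
    return lambda store: {**store, x: expr(store)}

def seq(c1, c2):                    # [[c1; c2]] = [[c2]] o [[c1]]
    return lambda store: c2(c1(store))

def if_cmd(b, c1, c2):              # conditional selects a transformation
    return lambda store: c1(store) if b(store) else c2(store)

# Example program: y := x + 1; if y > 1 then x := y else skip
prog = seq(assign("y", lambda s: s["x"] + 1),
           if_cmd(lambda s: s["y"] > 1,
                  assign("x", lambda s: s["y"]),
                  skip()))
```

In the combined model the paper proposes, functions like these would describe the store transformations, while the Petri-net layer would handle sequencing between concurrent processes.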


Author(s):  
José Alberto Maldonado ◽  
Diego Boscá ◽  
David Moner ◽  
Montserrat Robles

Normalization of data is a prerequisite for semantic interoperability in any domain. This is even more important in the healthcare sector due to the special sensitivity of medical data: data exchange must be done in a meaningful way, avoiding any possibility of misunderstanding or misinterpretation. In this chapter, we present the LinkEHR system for clinical data standardization and exchange. The LinkEHR platform provides tools that simplify the meaningful sharing of electronic health records between different systems and organizations. Key contributions of LinkEHR are: a powerful editing framework for medical concepts, expressed in the form of archetypes, that is based on formal semantics and capable of handling multiple electronic health record architectures; the definition of high-level, non-procedural mappings that describe the relationship between archetypes and legacy clinical data; and the semi-automatic generation of XQuery scripts that transform legacy data into XML documents which comply with the underlying electronic health record architecture while satisfying the constraints imposed by the archetype.
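The idea of a non-procedural mapping can be sketched as data rather than code. The following is a hypothetical miniature (LinkEHR itself generates XQuery; the field names and the `transform` helper here are invented for illustration): a mapping table pairs each target archetype field with its legacy source field, and a generic transformer applies it to produce a normalized XML element:

```python
# Hypothetical sketch of a declarative field mapping applied to a legacy
# record (illustrative names; the real system emits XQuery scripts).

import xml.etree.ElementTree as ET

# target (archetype) field -> source field in the legacy record
MAPPING = {
    "systolic": "sys_bp",
    "diastolic": "dia_bp",
}

def transform(legacy, mapping, root_tag="observation"):
    root = ET.Element(root_tag)
    for target, source in mapping.items():
        child = ET.SubElement(root, target)
        child.text = str(legacy[source])
    return root

legacy_record = {"sys_bp": 120, "dia_bp": 80}
doc = transform(legacy_record, MAPPING)
```

Because the mapping is plain data, the same transformer serves any pair of schemas; this mirrors the separation LinkEHR draws between mapping specifications and the generated transformation scripts.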


2010 ◽  
Vol 21 (1) ◽  
pp. 21-58 ◽  
Author(s):  
SUNGWOO PARK ◽  
HYEONSEUNG IM

Abstract: In efforts to overcome the complexity of the syntax and the lack of formal semantics of conventional hardware description languages, a number of functional hardware description languages have been developed. Like conventional hardware description languages, however, functional hardware description languages eventually convert all source programs into netlists, which describe wire connections in hardware circuits at the lowest level and conceal all high-level descriptions written into source programs. We develop a calculus, called lλ (linear lambda), which may serve as an intermediate functional language just above netlists in the hierarchy of hardware description languages. In order to support higher-order functions, lλ uses a linear type system, which enforces the linear use of variables of function type. The translation of lλ into structural descriptions of hardware circuits is sound and complete in the sense that it maps expressions only to realizable hardware circuits, and that every realizable hardware circuit has a corresponding expression in lλ. To illustrate the use of lλ as a practical intermediate language for hardware description, we design a simple hardware description language that extends lλ with polymorphism, and use it to implement a fast Fourier transform circuit and a bitonic sorting network.
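The linearity discipline behind lλ can be illustrated with a small checker. This is an assumed sketch, not the paper's type system (lλ restricts linearity to variables of function type; for simplicity the model below checks every bound variable): a term is linear if each binder's variable is used exactly once in its body.

```python
# Hypothetical linearity checker over a tiny lambda-term AST:
# ("var", name) | ("lam", name, body) | ("app", fun, arg).
# A linear binder's variable may be neither dropped nor duplicated.

def count_uses(term, name):
    tag = term[0]
    if tag == "var":
        return 1 if term[1] == name else 0
    if tag == "lam":                     # ("lam", x, body)
        if term[1] == name:              # inner binder shadows `name`
            return 0
        return count_uses(term[2], name)
    if tag == "app":                     # ("app", f, arg)
        return count_uses(term[1], name) + count_uses(term[2], name)
    raise ValueError("unknown term")

def is_linear(term):
    tag = term[0]
    if tag == "var":
        return True
    if tag == "lam":
        return count_uses(term[2], term[1]) == 1 and is_linear(term[2])
    if tag == "app":
        return is_linear(term[1]) and is_linear(term[2])
    raise ValueError("unknown term")
```

Rejecting duplication is what makes the translation to circuits work: a variable used twice would correspond to fanning out a sub-circuit, which the structural translation must account for explicitly rather than implicitly.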


1982 ◽  
Vol 11 (152) ◽  
Author(s):  
Niels Damgaard Hansen ◽  
Kim Halskov Madsen

Denotational semantics has proved to be an excellent tool for the specification of nearly all kinds of declarations and commands in sequential languages, but the description of concurrent processes is in practice nearly impossible.

High-level Petri nets, on the other hand, have proved their value in the specification of communication and synchronization of concurrent processes.

We propose to combine the two models into a single approach, where denotational semantics is used to build up environments and to describe store transformations, while Petri nets are used to describe sequencing and communication.
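The Petri-net half of the proposed combination rests on a simple firing rule, which the following assumed example sketches: a marking assigns tokens to places, a transition is enabled when every input place holds a token, and firing moves tokens from input to output places.

```python
# Minimal Petri-net sketch (illustrative example): transitions are pairs
# (input_places, output_places); markings map place names to token counts.

def enabled(marking, transition):
    inputs, _ = transition
    return all(marking.get(p, 0) >= 1 for p in inputs)

def fire(marking, transition):
    inputs, outputs = transition
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

# Two processes synchronize: both "ready" places must be marked to fire.
sync_t = (["p_ready", "q_ready"], ["p_done", "q_done"])
```

The synchronization transition `sync_t` fires only when both processes have reached their ready state, which is exactly the kind of communication behaviour that denotational semantics struggles to express and Petri nets capture directly.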


Author(s):  
David P. Bazett-Jones ◽  
Mark L. Brown

A multisubunit RNA polymerase enzyme is ultimately responsible for transcription initiation and elongation of RNA, but recognition of the proper start site by the enzyme is regulated by general, temporal and gene-specific trans-factors interacting at promoter and enhancer DNA sequences. To understand the molecular mechanisms which precisely regulate the transcription initiation event, it is crucial to elucidate the structure of the transcription factor/DNA complexes involved. Electron spectroscopic imaging (ESI) provides the opportunity to visualize individual DNA molecules. Enhancement of DNA contrast with ESI is accomplished by imaging with electrons that have interacted with inner shell electrons of phosphorus in the DNA backbone. Phosphorus detection at this intermediately high level of resolution (≈1 nm) permits selective imaging of the DNA, to determine whether the protein factors compact, bend or wrap the DNA. Simultaneously, mass analysis and phosphorus content can be measured quantitatively, using adjacent DNA or tobacco mosaic virus (TMV) as mass and phosphorus standards. These two parameters provide stoichiometric information relating the ratios of protein:DNA content.
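The stoichiometric bookkeeping in the last sentences can be sketched arithmetically. The following is a hypothetical illustration (the constant, helper name, and example numbers are assumptions, not the authors' values): given a complex's measured total mass and phosphorus count, the DNA mass follows from the average mass per nucleotide, since each nucleotide carries one phosphorus atom, and the protein mass is the remainder.

```python
# Hypothetical sketch of protein:DNA stoichiometry from mass analysis and
# phosphorus counting (illustrative numbers; not the authors' calibration).

AVG_NUCLEOTIDE_MASS_DA = 330.0   # approx. average mass per nucleotide;
                                 # one phosphorus atom per nucleotide

def protein_dna_masses(total_mass_da, phosphorus_count):
    dna_mass = phosphorus_count * AVG_NUCLEOTIDE_MASS_DA
    protein_mass = total_mass_da - dna_mass
    return protein_mass, dna_mass

# e.g. a 100 kDa complex in which 100 phosphorus atoms were counted:
protein, dna = protein_dna_masses(100_000.0, 100)
```

In practice both measurements would be calibrated against the adjacent DNA or TMV standards mentioned above before the subtraction is meaningful.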


Author(s):  
J. S. Wall

The forte of the Scanning Transmission Electron Microscope (STEM) is high-resolution imaging with high contrast on thin specimens, as demonstrated by visualization of single heavy atoms. Of equal importance for biology is the efficient utilization of all available signals, permitting low-dose imaging of unstained single molecules such as DNA.

Our work at Brookhaven has concentrated on: 1) design and construction of instruments optimized for a narrow range of biological applications and 2) use of such instruments in a very active user/collaborator program. Our program is therefore highly interactive, with a strong emphasis on producing results which are interpretable with a high level of confidence.

The major challenge we face at the moment is specimen preparation. The resolution of the STEM is better than 2.5 Å, but measurements of resolution vs. dose level off at a resolution of 20 Å at a dose of 10 el/Å² on a well-behaved biological specimen such as TMV (tobacco mosaic virus). To track down this problem we are examining all aspects of specimen preparation: purification of biological material, deposition on the thin-film substrate, washing, fast freezing and freeze drying. As we attempt to improve our equipment and technique, we use image analysis of TMV internal controls included in all STEM samples as a monitor sensitive enough to detect even a few percent improvement. For delicate specimens, carbon films can be very harsh, leading to disruption of the sample. We are therefore developing conducting polymer films as alternative substrates, as described elsewhere in these Proceedings. For specimen preparation studies, we have identified (from our user/collaborator program) a variety of "canary" specimens, each uniquely sensitive to one particular aspect of sample preparation, so that we can attempt to separate the variables involved.

