Verified secure compilation for mixed-sensitivity concurrent programs

2021 ◽  
Vol 31 ◽  
Author(s):  
Robert Sison ◽ 
Toby Murray

Abstract: Proving only over source code that programs do not leak sensitive data leaves a gap between reasoning and reality that can only be filled by accounting for the behaviour of the compiler. Furthermore, software does not always have the luxury of limiting itself to single-threaded computation with resources statically dedicated to each user to ensure the confidentiality of their data. This results in mixed-sensitivity concurrent programs, which might reuse memory shared between their threads to hold data of different sensitivity levels at different times; for such programs, a compiler must preserve the value-dependent coordination of such mixed-sensitivity reuse despite the impact of concurrency. Here we demonstrate, using Isabelle/HOL, that it is feasible to verify that a compiler preserves noninterference, the strictest kind of confidentiality property, for mixed-sensitivity concurrent programs. First, we present notions of refinement that preserve a concurrent value-dependent notion of noninterference that we have designed to support such programs. As proving noninterference-preserving refinement can be considerably more complex than the standard refinements typically used to verify semantics-preserving compilation, our notions include a decomposition principle that separates the semantics preservation from security preservation concerns. Second, we demonstrate that these refinement notions are applicable to verified secure compilation, by exercising them on a single-pass compiler for mixed-sensitivity concurrent programs that synchronise using mutex locks, from a generic imperative language to a generic RISC-style assembly language. Finally, we execute our compiler on a non-trivial mixed-sensitivity concurrent program modelling a real-world use case, thus preserving its source-level noninterference properties down to an assembly-level model automatically. All results are formalised and proved in the Isabelle/HOL interactive proof assistant. Our work paves the way for more fully featured compilers to offer verified secure compilation support to developers of multithreaded software that must handle data of multiple sensitivity levels.
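As a rough illustration of the kind of program the paper targets (a hedged Python sketch, not the paper's Isabelle/HOL formalisation or its source and assembly languages), the following shows shared memory whose sensitivity is value-dependent: a label variable records whether a shared buffer currently holds secret or public data, and a mutex keeps label and buffer consistent so that no interleaving lets secret data reach a public sink.

```python
import threading

# Hedged sketch (not the paper's Isabelle/HOL development): shared memory that
# is reused for data of different sensitivity levels at different times.
lock = threading.Lock()
label = "LOW"   # value-dependent classification of `buffer` (itself public)
buffer = 0      # shared storage, sometimes secret, sometimes public

def producer_secret(secret_value):
    """Temporarily store a secret in the shared buffer."""
    global label, buffer
    with lock:                  # the mutex keeps label and buffer consistent
        label, buffer = "HIGH", secret_value

def consumer_public(sink):
    """Copy the buffer to a public sink only when it is classified LOW."""
    with lock:
        if label == "LOW":      # value-dependent check guards the public output
            sink.append(buffer)

public_sink = []
t1 = threading.Thread(target=producer_secret, args=(42,))
t2 = threading.Thread(target=consumer_public, args=(public_sink,))
t1.start(); t2.start(); t1.join(); t2.join()
print(public_sink)   # in every interleaving, the secret 42 never appears here
```

A compiler that is secure in the paper's sense must preserve exactly this lock-protected, value-dependent discipline in the assembly it emits; the sketch only names the property, it does not verify it.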

2020 ◽  
Author(s):  
Thien-An Ha ◽  
Tomás M. León ◽  
Karina Lalangui ◽  
Patricio Ponce ◽  
John M. Marshall ◽  
...  

Abstract
Background: Vector-borne diseases are a major cause of disease burden in Guayaquil, Ecuador, especially arboviruses spread by Aedes aegypti mosquitoes. Understanding which household characteristics and risk factors lead to higher Ae. aegypti densities and consequent disease risk can help inform and optimize vector control programs.
Methods: Cross-sectional entomological surveys were conducted in Guayaquil between 2013 and 2016, covering household demographics, municipal services, potential breeding containers, presence of Ae. aegypti larvae and pupae, and history of using mosquito control methods. A zero-truncated negative binomial regression model was fitted to the data to estimate the household pupal index. An additional model assessed the factors of the most productive breeding sites across all of the households.
Results: Of the surveyed households, 610 satisfied the inclusion criteria. The final household-level model found that collection of large solid items (e.g., furniture and tires) and rainfall the week of and 2 weeks before collection were negatively correlated with average pupae per container, while bed canopy use, unemployment, container water volume, and the interaction between large solid collection and rainfall 2 weeks before the sampling event were positively correlated. Selection of these variables across other top candidate models with ΔAICc < 1 was robust, with the strongest effects from large solid collection and bed canopy use. The final container-level model explaining the characteristics of breeding sites found that contaminated water was positively correlated with Ae. aegypti pupae counts, while breeding sites composed of car parts, furniture, sewerage parts, vases, ceramic material, glass material, metal material, and plastic material were all negatively correlated.
Conclusion: Having access to municipal services like bulky item pickup was effective at reducing mosquito proliferation in households. The association of bed canopy use with higher mosquito densities is unexpected, and may be a consequence of large local mosquito populations or of limited use or effectiveness of other vector control methods. The impact of rainfall on mosquito density is multifaceted, as it may both create new habitat and “wash out” existing habitat. Providing services and social/technical interventions focused on monitoring and eliminating productive breeding sites is important for reducing aquatic-stage mosquito densities in households at risk for Ae. aegypti-transmitted diseases.
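The household-level model is a zero-truncated negative binomial (ZTNB) regression. As a hedged sketch of how such a model can be fitted by maximum likelihood (the covariates, simulated data, and parameter values below are illustrative only, not the study's variables or estimates), the negative binomial likelihood can be conditioned on a positive count:

```python
import numpy as np
from scipy import optimize, stats

# Hedged sketch of a zero-truncated negative binomial (ZTNB) regression fitted
# by maximum likelihood; the covariates and data below are illustrative only.
rng = np.random.default_rng(0)
n_obs = 200
X = np.column_stack([
    np.ones(n_obs),                 # intercept
    rng.integers(0, 2, n_obs),      # e.g. large-solid-waste collection (0/1)
    rng.normal(size=n_obs),         # e.g. standardized rainfall two weeks prior
])
true_beta, true_alpha = np.array([1.0, -0.5, 0.3]), 0.7
mu = np.exp(X @ true_beta)
y = rng.negative_binomial(1 / true_alpha, 1 / (1 + true_alpha * mu))
X, y = X[y > 0], y[y > 0]           # zero-truncated: only positive counts observed

def neg_loglik(params):
    beta, log_alpha = params[:-1], params[-1]
    alpha = np.exp(log_alpha)       # dispersion > 0
    mu = np.exp(X @ beta)           # log link
    size, prob = 1 / alpha, 1 / (1 + alpha * mu)
    logpmf = stats.nbinom.logpmf(y, size, prob)
    log_p0 = stats.nbinom.logpmf(0, size, prob)          # P(Y = 0) under the NB
    return -np.sum(logpmf - np.log1p(-np.exp(log_p0)))   # condition on Y > 0

res = optimize.minimize(neg_loglik, x0=np.zeros(X.shape[1] + 1), method="BFGS")
print("coefficients:", res.x[:-1], "dispersion:", np.exp(res.x[-1]))
```

The `np.log1p(-np.exp(log_p0))` term implements the zero truncation by conditioning each observation on a positive count, which is what distinguishes the ZTNB likelihood from an ordinary negative binomial regression.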


2021 ◽  
Vol 129 ◽  
pp. 06007
Author(s):  
Eva Nahalková Tesárová ◽  
Anna Križanová

Research background: Globalization encourages greater involvement of retailers, and the market now offers these entities a new range of purchasing functions. The new economy is built primarily on information and knowledge, so the key to success is the ability to improve continuously and respond to changing market conditions and rising customer requirements. The Internet has thus become an integral part of everyday life and a tool that improves it. The expansion of the global Internet has created space for a new kind of business that benefits all e-commerce entities, and its popularity is therefore growing exponentially. At the same time, negative experiences caused by financial fraud, misuse of sensitive data, unreliable business partners, and the like are also growing. Purpose of the article: The article proceeds from the premise that the development of e-commerce is one of the important conditions for maintaining and increasing the competitiveness of the Slovak economy and its ability to participate in the international division of labor with economically developed countries. Methods: The core of the article is an analysis of the current development of e-commerce in the Slovak Republic, which has been affected by the COVID-19 pandemic, a comparison with EU countries, and an evaluation of how Slovak consumers perceive e-commerce based on a questionnaire survey. Findings & Value added: Finally, we interpret the respondents' answers obtained through the questionnaire.


2016 ◽  
Vol 13 (1) ◽  
pp. 204-211
Author(s):  
Baghdad Science Journal

The internet is a basic source of information for many specialities and uses, and such information includes sensitive data whose retrieval has become one of its basic functions. To protect this information from falling into the hands of an intruder, a VPN can be established; through a VPN, data privacy and security can be provided. Two main VPN technologies are discussed: IPSec and OpenVPN. The complexity of IPSec makes OpenVPN the better choice, owing to its portability and its flexibility across many operating systems. Within a LAN, a VPN can be implemented through OpenVPN to establish a double privacy layer (privacy inside privacy). A specific subnet is used in this paper. The key and certificate are generated by the server, and authentication and key exchange are based on the standard SSL/TLS protocol. Several operating systems, both open source and Windows, are used, each running on different hardware specifications. Tools such as tcpdump and jperf are used to verify and measure connectivity and performance. OpenVPN performance in the LAN depends on the type of operating system, its portability, and the straightforwardness of its implementation. The bandwidth captured in this experiment is influenced by the operating system rather than by memory or hard disk capacity. The relationship and interoperability between each peer and the server are discussed, while privacy for LAN users can be provided with minimal hardware requirements.
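The paper's measurements rely on jperf and tcpdump; as a minimal, hedged stand-in (not a reproduction of that setup), the sketch below measures raw TCP throughput between two LAN peers and could be pointed at addresses inside the tunnel's subnet. The port number and the OpenVPN-style 10.8.0.0/24 address mentioned in the comment are illustrative assumptions only.

```python
import socket, sys, time

# Minimal throughput probe (a stand-in for the paper's jperf measurements, not
# a reproduction of them): run `python probe.py server` on one peer and
# `python probe.py client <tunnel-ip>` on the other, where <tunnel-ip> is the
# peer's address inside the VPN subnet (e.g. an illustrative 10.8.0.0/24 one).
PORT, CHUNK, TOTAL_MB = 5201, 64 * 1024, 64

def server():
    with socket.create_server(("", PORT)) as srv:
        conn, addr = srv.accept()
        received = 0
        with conn:
            while True:
                data = conn.recv(CHUNK)
                if not data:
                    break
                received += len(data)
        print(f"received {received / 1e6:.1f} MB from {addr[0]}")

def client(host):
    payload = b"\x00" * CHUNK
    n_chunks = TOTAL_MB * 1024 * 1024 // CHUNK
    start = time.perf_counter()
    with socket.create_connection((host, PORT)) as conn:
        for _ in range(n_chunks):
            conn.sendall(payload)
    elapsed = time.perf_counter() - start
    mbits = n_chunks * CHUNK * 8 / 1e6
    print(f"throughput: {mbits / elapsed:.1f} Mbit/s over {elapsed:.2f} s")

if __name__ == "__main__":
    client(sys.argv[2]) if sys.argv[1] == "client" else server()
```

Running the probe once over the physical LAN and once over the tunnel addresses gives a rough before/after comparison of the bandwidth cost of the encryption layer, in the same spirit as the jperf runs described above.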


Author(s):  
Gwan-Hwan Hwang ◽ 
Kuo-Chung Tai ◽ 
Ting-Lu Huang

Concurrent programs are more difficult to test than sequential programs because of their non-deterministic behavior. An execution of a concurrent program non-deterministically exercises a sequence of synchronization events called a synchronization sequence (or SYN-sequence). Non-deterministic testing of a concurrent program P executes P with a given input many times in order to exercise distinct SYN-sequences. In this paper, we present a new testing approach called reachability testing. If every execution of P with input X terminates, reachability testing of P with input X derives and executes all possible SYN-sequences of P with input X. We show how to perform reachability testing of concurrent programs that use read and write operations. We also present results of empirical studies comparing reachability and non-deterministic testing. Our results indicate that reachability testing has advantages over non-deterministic testing.
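To make the contrast concrete, here is a hedged toy sketch (not the paper's reachability-testing algorithm or its replay mechanism): it exhaustively enumerates every interleaving of two threads' read and write operations on a shared variable, i.e. every SYN-sequence of this tiny program, and records each distinct outcome, whereas non-deterministic testing would rerun the program and hope to stumble on them.

```python
# Illustrative sketch (not the paper's algorithm): enumerate every interleaving
# (SYN-sequence) of two threads' read/write operations on a shared variable x
# and record the distinct final values, instead of relying on repeated
# non-deterministic runs to exercise them.
def interleavings(a, b):
    """Yield all merges of sequences a and b that preserve each one's order."""
    if not a:
        yield list(b); return
    if not b:
        yield list(a); return
    for rest in interleavings(a[1:], b):
        yield [a[0]] + rest
    for rest in interleavings(a, b[1:]):
        yield [b[0]] + rest

# Thread 1: r1 := x; x := r1 + 1.   Thread 2: r2 := x; x := r2 * 2.
def read(tid):
    return (f"{tid}:read",  lambda s: s.update({tid: s["x"]}))
def write(tid, f):
    return (f"{tid}:write", lambda s: s.update({"x": f(s[tid])}))

thread1 = [read("T1"), write("T1", lambda r: r + 1)]
thread2 = [read("T2"), write("T2", lambda r: r * 2)]

outcomes = {}
for seq in interleavings(thread1, thread2):
    state = {"x": 1}
    for _, op in seq:
        op(state)
    outcomes[tuple(lbl for lbl, _ in seq)] = state["x"]

for syn, final_x in outcomes.items():
    print(" -> ".join(syn), "| final x =", final_x)
print(f"{len(outcomes)} SYN-sequences, final values {sorted(set(outcomes.values()))}")
```

The six interleavings of this two-thread example already yield several different final values of x, which is exactly the behaviour that a single non-deterministic run may or may not expose.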


This chapter looks at the extent to which the semantic-based process mining approach of this book supports the conceptual analysis of the event logs and the resultant models. Qualitatively, the chapter draws on the case study of the research learning process domain to determine how the proposed method supports the discovery, monitoring, and enhancement of real-time processes through the abstraction levels of analysis. The chapter also quantitatively assesses the accuracy with which the classification process predicts the behaviours of unobserved instances within the underlying knowledge base. Overall, the work considers the implications of the semantic-based approach, the validation of the classification results, and their performance compared with other existing benchmark techniques/algorithms used for process mining.


2020 ◽  
Vol 5 (3) ◽  
pp. 27 ◽  
Author(s):  
Serio Angelo Maria Agriesti ◽  
Luca Studer ◽  
Giovanna Marchionni ◽  
Paolo Gandini ◽  
Xiaobo Qu

By now, it is widely acknowledged among stakeholders and academia that infrastructures will have to comprise both a physical component and a digital one. The deployment of technologies exploiting dedicated short-range communications (DSRC) is viewed as the most cost-effective solution to face the foreseen growth of mobility, yet little has been done to define the best implementation logic for DSRC. The aim of this paper is to frame the possible impacts of implementing a cooperative intelligent transport system (C-ITS) use case, roadworks warning (closure of a lane), and microsimulations are exploited to achieve this result. The results are intended to support both road operators and car-makers in defining the best operational logics and the possible benefits achievable by presenting the cooperative message at a certain distance for certain market penetrations. Moreover, whether the C-ITS message actually brings benefits or simply disrupts upstream traffic should be assessed in advance, before implementing the system. The obtained results show that the risk of disruption and of reduced traffic efficiency arises at lower market penetration levels. Nevertheless, a consistent trend of delay reduction is recorded upstream of the roadworks, the highest reduction being equal to 8.66%. Moreover, the average speed at the roadworks entrance on the closing lane increases by around 10 km/h, while the average time in the queue at the highest market penetration decreases by 60 s on the open lane and 25 s on the closing one. These results reflect the way traffic shifts from the slow to the fast lane thanks to the C-ITS system and effectively frame both the potential and the risks of the system.


2016 ◽  
Vol 13 (3-4) ◽  
pp. 173-183 ◽  
Author(s):  
Sarah K Mckenzie ◽  
Cissy Li ◽  
Gabrielle Jenkin ◽  
Sunny Collings

The impact on researchers of working with sensitive data is often not considered by ethics committees when approving research proposals. We conducted interviews with eight research assistants processing clinical notes on emergency department presentations for deliberate self-harm and suicide attempts during a suicide prevention trial. Common experiences of working with the data included feeling unprepared for the level of detail in the records, being drawn deeply into individual stories, emotional exhaustion from the cumulative exposure to the data over long periods of time while working alone, and experiencing a heightened awareness of the fragility of life and the need for safety. The research assistants also reported on some of the strategies they had developed to cope with the sensitive nature of the data and the demands of the work. The ethical implications for suicide research reliant on non-clinically trained researchers exploring sensitive data are considered. These include the need for research leaders and ethics committees to be aware of the potential adverse mental health impacts for these researchers examining sensitive data and to make appropriate arrangements to minimize the mental health impacts of such work.


Electronics ◽  
2020 ◽  
Vol 9 (9) ◽  
pp. 1435
Author(s):  
Paolo Ferrari ◽  
Emiliano Sisinni ◽  
Alessandro Depari ◽  
Alessandra Flammini ◽  
Stefano Rinaldi ◽  
...  

In Industry 4.0, the communication infrastructure is derived from the Internet of Things (IoT) and is called the Industrial IoT, or IIoT. Smart objects deployed in the field collect a large amount of data which is stored and processed in the Cloud to create innovative services. However, differently from most consumer applications, the industrial scenario is generally constrained by time-related requirements and needs real-time behavior (i.e., bounded and possibly short delays). Unfortunately, timeliness is generally ignored by traditional service providers, and the Cloud is treated as a black box. For instance, Cloud databases (generally offered as "Database as a Service", DBaaS) have an unknown or hard-to-compare impact on applications. The novelty of this work is to provide an experimental measurement methodology based on an abstract view of IIoT applications, in order to define some easy-to-evaluate metrics focused on DBaaS latency (whatever the actual implementation details are). In particular, the focus is on the impact of the DBaaS on the overall communication delays in a typical, scalable IIoT context (i.e., from the field to the Cloud and back). In order to show the effectiveness of the proposed approach, a real use case is discussed: a predictive maintenance application in which a Siemens S7 industrial controller transmits system health status information to a Cloudant DB inside the IBM Bluemix platform. Experiments carried out in this use case provide useful insights into DBaaS performance: evaluation of delays, effects of the number of involved devices (scalability and complexity), constraints of the architecture, and clear information for comparing with other implementations and for optimizing configurations. In other words, the proposed evaluation strategy helps in finding out the peculiarities of Cloud database service implementations.
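In the spirit of the paper's easy-to-evaluate metrics (but not its actual Cloudant/Bluemix instrumentation), a hedged sketch of a field-to-Cloud-and-back latency probe is shown below: it times a document write followed by a read-back against a CouchDB-style DBaaS over HTTP and reports latency percentiles. The endpoint URL, credentials, and device payload are placeholders.

```python
import time, statistics, uuid
import requests  # third-party HTTP client

# Hedged sketch of a field-to-Cloud-and-back latency metric in the spirit of
# the paper's methodology (not its actual Cloudant/Bluemix instrumentation).
# BASE_URL and the credentials are placeholders for a CouchDB-style DBaaS.
BASE_URL = "https://example-dbaas.example.com/healthdb"   # hypothetical endpoint
AUTH = ("apikey", "secret")                               # placeholder credentials
N_SAMPLES = 50

round_trips = []
for _ in range(N_SAMPLES):
    doc_id = uuid.uuid4().hex
    payload = {"device": "plc-01", "status": "ok", "ts": time.time()}
    start = time.perf_counter()
    requests.put(f"{BASE_URL}/{doc_id}", json=payload, auth=AUTH, timeout=10)
    requests.get(f"{BASE_URL}/{doc_id}", auth=AUTH, timeout=10)     # read back
    round_trips.append((time.perf_counter() - start) * 1000)        # milliseconds

quantiles = statistics.quantiles(round_trips, n=100)
print(f"median {statistics.median(round_trips):.1f} ms, "
      f"p95 {quantiles[94]:.1f} ms, max {max(round_trips):.1f} ms")
```

Repeating this probe from many simulated devices at once would correspond to the scalability dimension of the experiments described above, since the percentiles then capture how the DBaaS latency degrades with the number of concurrent publishers.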


Author(s):  
Jeong Hoon Kim ◽  
Chang Beom Choi ◽  
Tag Gon Kim

The modern naval air defense of a fleet is a critical task dictating the equipment, the operation, and the management of the fleet. Military modelers consider an improved weapon system in naval air defense (i.e., the AEGIS system) to be the most critical enabler of defense at the engagement level. However, at the mission execution level, naval air defense is a cooperative endeavor of humans and weapon systems. The weapon system and the command and control (C2) structure of a fleet engage the situation through human reporting-in and commands, as well as weapon deployments. Hence, this paper models the combination of the human and the weapon systems in naval air defense by covering the C2 hierarchy of the fleet as well as the weapon systems of warships. After developing this mission-level model, we perform battle experiments with varying parameters in the human and weapon aspects. These battle experiments inform us of the impact of changes in the human and the weapon systems. For example, the speed of incoming missiles is a critical parameter for a fleet's survival; yet the decision-making speed is another outstanding parameter, which illustrates that there is more to improve than the weapon system when considering the mission level. This modeling and these experiments provide an example suggesting a method of combining the human C2 and the weapon systems at the mission level in the military domain.
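A hedged toy calculation (not the authors' mission-level model) can illustrate why decision-making speed rivals weapon performance: an intercept succeeds only if detection, the C2 decision, and the interceptor fly-out complete before the incoming missile penetrates a keep-out range. All figures below are illustrative assumptions, not doctrine.

```python
# Toy head-on intercept timeline (not the authors' mission-level model): the
# intercept range depends on detection and C2 decision delays as well as on
# missile and interceptor speeds. All numbers are illustrative assumptions.
def intercept_range_km(detect_range_km, missile_speed_ms, interceptor_speed_ms,
                       detection_delay_s, decision_delay_s):
    """Range from the ship (km) at which a head-on intercept occurs, or 0."""
    reaction_s = detection_delay_s + decision_delay_s        # human + C2 latency
    range_at_launch_km = detect_range_km - missile_speed_ms * reaction_s / 1000
    if range_at_launch_km <= 0:
        return 0.0                                           # launched too late
    closing_ms = missile_speed_ms + interceptor_speed_ms
    time_to_meet_s = range_at_launch_km * 1000 / closing_ms
    return interceptor_speed_ms * time_to_meet_s / 1000

KEEP_OUT_KM = 2.0
for missile_speed in (300.0, 600.0, 900.0):                  # subsonic to supersonic
    for decision_delay in (5.0, 15.0, 30.0):                 # fast vs slow C2
        r = intercept_range_km(detect_range_km=40.0,
                               missile_speed_ms=missile_speed,
                               interceptor_speed_ms=800.0,
                               detection_delay_s=10.0,
                               decision_delay_s=decision_delay)
        verdict = "intercept" if r > KEEP_OUT_KM else "leaker"
        print(f"missile {missile_speed:4.0f} m/s, decision {decision_delay:4.1f} s "
              f"-> intercept at {r:5.1f} km ({verdict})")
```

Sweeping the decision delay corresponds to the human axis of the battle experiments and sweeping the missile speed to the weapon axis; the toy only shows that both axes change the outcome, which is the point the abstract makes about the mission level.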

