Entropy, 2019, Vol. 21(8), p. 753
Author(s): Xi, Lv, Sun, Ma

Advances in mobile technologies enable mobile devices to cooperate with each other on complex tasks and so satisfy users' composite service requirements. However, data with different sensitivities and heterogeneous systems with diverse security policies pose a great challenge to information flow security during service composition across multiple mobile devices. The qualitative information flow control mechanism based on non-interference provides a solid security assurance for the propagation of a customer's private data across multiple service participants. However, its strict discipline limits service availability and may cause a high failure rate in service composition. Therefore, we propose a distributed quantitative information flow evaluation approach for service composition across multiple devices in mobile environments. The quantitative approach gives a more precise way to evaluate leakage and supports customized information flow security disciplines for the diverse requirements of different customers. Considering the limited energy of mobile devices, we use a distributed evaluation approach to better balance the consumption across service participants. The experiments and evaluations indicate that our approach can effectively improve the availability of composite services while security is ensured.
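
The abstract does not spell out the authors' leakage metric, but quantitative information flow is commonly measured as the mutual information between a secret input and an attacker-observable output; the sketch below illustrates that generic computation on a toy channel (the channel matrix, prior, and names are invented for the example, not taken from the paper).

```python
import math
from collections import defaultdict

def mutual_information(prior, channel):
    """I(S; O) in bits, for a secret S with distribution `prior` and a
    discrete channel where channel[s][o] = P(O = o | S = s)."""
    joint = {(s, o): prior[s] * p
             for s, row in channel.items() for o, p in row.items()}
    p_o = defaultdict(float)
    for (s, o), p in joint.items():
        p_o[o] += p
    # I(S;O) = sum over (s,o) of p(s,o) * log2( p(s,o) / (p(s) * p(o)) )
    return sum(p * math.log2(p / (prior[s] * p_o[o]))
               for (s, o), p in joint.items() if p > 0)

# Toy example: an observable service response that partially reveals a 1-bit secret.
prior = {0: 0.5, 1: 0.5}
channel = {0: {"ok": 0.9, "err": 0.1},
           1: {"ok": 0.2, "err": 0.8}}
print(f"leakage = {mutual_information(prior, channel):.3f} bits")  # ~0.40 of 1 secret bit
```

A customized discipline of the kind the abstract describes could then be expressed as a per-customer threshold on this number, rather than the all-or-nothing verdict of non-interference.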


2012, Vol. 37(6), pp. 1-5
Author(s): Quoc-Sang Phan, Pasquale Malacaria, Oksana Tkachuk, Corina S. Păsăreanu

2015, Vol. 6(2), pp. 23-46
Author(s): Tom Chothia, Chris Novakovic, Rajiv Ranjan Singh

This paper presents a framework for calculating measures of data integrity for programs in a small imperative language. The authors develop a Markov chain semantics for their language, from which they calculate Clarkson and Schneider's definitions of data contamination, data suppression, program suppression and program transmission. The authors then propose their own definition of program integrity for probabilistic specifications. These definitions are based on conditional mutual information and entropy; the authors present a result relating them to mutual information, which can be calculated by a number of existing tools. They extend a quantitative information flow tool (CH-IMP) to calculate these measures of integrity and demonstrate the tool on examples including error-correcting codes, the Dining Cryptographers protocol and the attempts by a number of banks to influence the Libor rate.
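
For reference, the information-theoretic quantities the abstract names are the standard conditional entropy and (conditional) mutual information; the identities below are those textbook definitions, not the paper's own formulations of contamination or suppression.

```latex
% Standard definitions underlying entropy-based integrity measures
% (textbook identities; the paper's exact definitions may differ).
\begin{align}
  H(X \mid Z)   &= -\sum_{x,z} p(x,z)\,\log_2 p(x \mid z) \\
  I(X;Y \mid Z) &= H(X \mid Z) - H(X \mid Y, Z) \\
  I(X;Y)        &= H(X) - H(X \mid Y)
\end{align}
```

Reducing the conditional quantities to plain mutual information I(X;Y) is what allows existing quantitative information flow tools such as CH-IMP to be reused for the calculation.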


2014, Vol. 25(2), pp. 457-479
Author(s): Michael Backes, Boris Köpf

We provide a novel definition of quantitative information flow, called transmissible information, that is suitable for reasoning about information-theoretically secure (or non-cryptographic) systems, as well as about cryptographic systems with their polynomially bounded adversaries, error probabilities, etc. Transmissible information captures deliberate communication between two processes, and it safely over-approximates the quantity of information that a process unintentionally leaks to another process. We show that transmissible information is preserved under universal composability, which constitutes the prevalent cryptographic notion of a secure implementation. This result enables us to lift quantitative bounds on transmissible information from simple ideal functionalities of cryptographic tasks to actual cryptographic systems. We furthermore prove a connection between transmissible information in the unconditional setting and channel capacity, based on the weak converse of Shannon's coding theorem. This connection enables us to compute an upper bound on the transmissible information for a restricted class of protocols, using existing techniques from quantitative information flow.
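
As context for the channel-capacity connection, the quantities involved are sketched below in their textbook form; this is not the paper's formal statement, only the usual definition of capacity and the weak converse it relies on.

```latex
% Channel capacity and the weak converse of Shannon's coding theorem
% (standard forms; the paper's precise statement may differ).
\begin{align}
  C &= \max_{p(x)} I(X;Y)
      && \text{capacity of the channel } p(y \mid x) \\
  \frac{\log_2 M}{n} > C &\;\Longrightarrow\; P_e \not\longrightarrow 0
      && \text{weak converse: rates above } C \text{ cannot be reliable}
\end{align}
```

Because the mutual information realized over a channel never exceeds C, a capacity bound translates directly into an upper bound on how much information can be transmitted, which is the shape of the bound the abstract refers to.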

