DataProVe: Fully Automated Conformance Verification Between Data Protection Policies and System Architectures

2021 · Vol 2022 (1) · pp. 565–585
Author(s): Vinh Thong Ta, Max Hashem Eiza

Abstract Privacy and data protection by design are relevant parts of the General Data Protection Regulation (GDPR), under which businesses and organisations are encouraged to implement measures at an early stage of the system design phase to fulfil data protection requirements. This paper addresses policy and system architecture design and proposes variants of a privacy policy language and an architecture description language for specifying and verifying data protection and privacy requirements. In addition, we develop a fully automated, logic-based algorithm for verifying three types of conformance relations (privacy, data protection, and functional conformance) between a policy and an architecture specified in our language variants. Compared to related work, this approach supports a more systematic and fine-grained analysis of the privacy, data protection, and functional properties of a system. Our theoretical methods are implemented as a software tool called DataProVe, and its feasibility is demonstrated on the centralised and decentralised approaches to COVID-19 contact tracing applications.
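To make the idea of conformance checking concrete, the following minimal sketch (in Python, and not the paper's actual policy or architecture languages) treats a policy as a set of permitted (data type, purpose, retention) rules and an architecture as a set of concrete processing actions; an architecture conforms when every action is covered by some rule. All names and fields are illustrative assumptions.

    # Minimal sketch (not the paper's languages or algorithm): conformance here
    # simply means every architectural action is covered by some policy rule.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class PolicyRule:
        data_type: str        # e.g. "proximity_id"
        purpose: str          # e.g. "contact_tracing"
        max_retention: int    # maximum permitted retention, in days

    @dataclass(frozen=True)
    class ArchAction:
        data_type: str
        purpose: str
        retention: int        # days the component actually stores the data

    def permits(rule: PolicyRule, action: ArchAction) -> bool:
        return (rule.data_type == action.data_type
                and rule.purpose == action.purpose
                and action.retention <= rule.max_retention)

    def conforms(policy: set, architecture: set) -> list:
        """Return the architectural actions not justified by any policy rule."""
        return [a for a in architecture if not any(permits(r, a) for r in policy)]

    policy = {PolicyRule("proximity_id", "contact_tracing", 14)}
    arch = {ArchAction("proximity_id", "contact_tracing", 14),
            ArchAction("proximity_id", "analytics", 30)}
    print(conforms(policy, arch))   # the analytics action violates the policy

A real conformance check, as in the paper, works over formal specifications and logical inference rather than this simple set containment, but the violation-reporting shape is the same.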

2021 · Vol 11 (4) · pp. 1762
Author(s): David Sánchez, Alexandre Viejo, Montserrat Batet

To comply with the EU General Data Protection Regulation (GDPR), companies managing personal data have been forced to review their privacy policies. However, privacy policies will not solve any problems as long as users do not read them or are unable to understand them. To assist users with both issues, we present a system that automatically assesses privacy policies. Our proposal quantifies the degree of policy compliance with respect to the data protection goals stated by the GDPR and presents clear and intuitive privacy scores to the user. In this way, users become immediately aware of the risks associated with the services and their severity; this empowers them to make informed decisions when accepting (or not) the terms of a service. We leverage manual annotations and machine learning to train a model that automatically tags privacy policies according to their compliance (or not) with the data protection goals of the GDPR. In contrast with related works, we define clear annotation criteria consistent with the GDPR, which enables us to provide not only aggregated scores but also fine-grained ratings that help to understand the reasons for the assessment. The latter is aligned with the concept of explainable artificial intelligence. We have applied our method to the policies of 10 well-known internet services. Our scores are sound and consistent with the results reported in related works.
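As a rough illustration of the annotation-and-scoring idea (not the authors' actual model, annotation scheme, or data, and assuming scikit-learn is available), the sketch below tags policy segments against a single hypothetical data protection goal and aggregates the predictions into a 0–100 score.

    # Illustrative sketch only: TF-IDF + logistic regression over manually
    # annotated segments for one hypothetical goal ("data minimisation").
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    segments = [
        "We only collect the data strictly needed to provide the service.",
        "We may collect any information about you and your device.",
        "Location data is gathered even when the app is not in use.",
        "Personal data is deleted as soon as it is no longer required.",
    ]
    labels = [1, 0, 0, 1]   # 1 = consistent with the goal, 0 = not

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(segments, labels)

    def goal_score(policy_segments):
        """Share of segments tagged as compliant, scaled to 0-100."""
        return 100 * model.predict(policy_segments).mean()

    print(goal_score(["We collect everything we can.",
                      "Only the minimum necessary data is processed."]))

Per-goal scores of this kind can then be reported individually (the fine-grained ratings) or averaged into an overall policy score.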


2021 · Vol 11 (21) · pp. 9977
Author(s): Daan Storm van Leeuwen, Ali Ahmed, Craig Watterson, Nilufar Baghaei

Faced with the biggest virus outbreak in a century, world governments at the start of 2020 took unprecedented measures to protect their healthcare systems from being overwhelmed by the COVID-19 pandemic. International travel was halted and lockdowns were imposed. Many nations adopted measures to stop the transmission of the virus, such as mandatory face masks, social distancing, and limits on social gatherings. Technology was quickly developed for mobile phones, allowing governments to track people's contacts with known sources of infection (both people and places); these are called contact tracing applications. Contact tracing applications raise serious privacy and security concerns. Within Europe, two systems evolved: a centralised system, which calculates risk on a central server, and a decentralised system, which calculates risk on the user's handset. This study examined both systems from a threat perspective to design a framework that enables privacy and security for contact tracing applications; such a framework is helpful for app developers. The study found that even though both systems comply with the General Data Protection Regulation (GDPR), Europe's privacy legislation, the centralised system is exposed to severe risks from the threats identified. Experiments, research, and reviews tested the decentralised system in various settings and found that it performs better, but it still suffers from inherent shortcomings: user tracking and re-identification are possible, especially when users report themselves as infected. Based on these findings, the study identified and validated a framework that enables privacy and security. The study also found that current implementations using the decentralised Google/Apple API do not comply with the framework.
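The following sketch shows, in highly simplified form, why the decentralised design keeps risk calculation on the handset: observed identifiers and the scoring stay local, and only identifiers of self-reported infected users are downloaded. It is an assumption-laden illustration, not the Google/Apple Exposure Notification API or any deployed app.

    # Simplified on-device risk scoring for a decentralised design (illustrative
    # only): matching against downloaded infected identifiers never leaves the
    # device, so the server learns nothing about who met whom.
    from dataclasses import dataclass

    @dataclass
    class Encounter:
        rolling_id: str        # identifier broadcast by the other device
        minutes: float         # contact duration
        attenuation_db: float  # signal attenuation, a proxy for distance

    def exposure_risk(local_encounters, infected_ids, threshold=15.0):
        """Sum weighted contact minutes with identifiers of infected users."""
        risk = 0.0
        for e in local_encounters:
            if e.rolling_id in infected_ids:
                closeness = 1.0 if e.attenuation_db < 55 else 0.5
                risk += e.minutes * closeness
        return risk >= threshold, risk

    observed = [Encounter("a1f3", 20, 48), Encounter("9c2e", 5, 70)]
    print(exposure_risk(observed, infected_ids={"a1f3"}))   # (True, 20.0)

In a centralised design, by contrast, the encounter data would be uploaded and this computation performed on the server, which is what creates the tracking and re-identification risks the study highlights.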


2020 · Vol 7 (1)
Author(s): Laura Bradford, Mateo Aboy, Kathleen Liddell

Abstract Digital surveillance has played a key role in containing the COVID-19 outbreak in China, Singapore, Israel, and South Korea. Google and Apple recently announced their intention to build interfaces to allow Bluetooth contact tracing using Android and iPhone devices. In this article, we look at the compatibility of the proposed Apple/Google Bluetooth exposure notification system with Western privacy and data protection regimes and principles, including the General Data Protection Regulation (GDPR). Somewhat counter-intuitively, the GDPR's expansive scope is not a hindrance but rather an advantage in conditions of uncertainty such as a pandemic. Its principle-based approach offers a functional blueprint for system design that is compatible with fundamental rights. By contrast, narrower, sector-specific rules such as the US Health Insurance Portability and Accountability Act (HIPAA), and even the new California Consumer Privacy Act (CCPA), leave gaps that may prove difficult to bridge in the middle of an emergency.


GigaScience · 2019 · Vol 8 (12)
Author(s): Regina Becker, Pinar Alper, Valentin Grouès, Sandrine Munoz, Yohan Jarosz, ...

Abstract Background The new European legislation on data protection, namely the General Data Protection Regulation (GDPR), has introduced comprehensive requirements for documenting the processing of personal data as well as informing data subjects of its use. The GDPR's accountability principle requires institutions, projects, and data hubs to document their data processing activities and demonstrate compliance with the GDPR. In response to this requirement, we see the emergence of commercial data-mapping tools, with institutions creating GDPR data registers using such tools. One shortcoming of this approach is the genericity of the tools and their process-based model, which does not capture the project-based, collaborative nature of data processing in biomedical research. Findings We have developed a software tool that allows research institutions to comply with the GDPR accountability requirement and to map the sometimes very complex data flows in biomedical research. By analysing the transparency and record-keeping obligations of each GDPR principle, we observe that our tool effectively meets the accountability requirement. Conclusions The GDPR is bringing data protection to center stage in research data management, necessitating dedicated tools, personnel, and processes. Our tool, DAISY, is tailored specifically for biomedical research and can help institutions tackle the documentation challenge brought about by the GDPR. DAISY is made available as a free and open-source tool on GitHub. DAISY is actively being used at the Luxembourg Centre for Systems Biomedicine and the ELIXIR-Luxembourg data hub.
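As a rough illustration of what one entry in such a GDPR data register might capture (a minimal sketch, not DAISY's actual data model), the snippet below encodes an Article 30-style record of processing for a hypothetical research project and exports it as JSON; all field names and values are assumptions made for the example.

    # Illustrative Article 30-style record of processing (not DAISY's schema).
    from dataclasses import dataclass, field, asdict
    import json

    @dataclass
    class ProcessingRecord:
        project: str
        purpose: str
        legal_basis: str
        data_subjects: list
        data_categories: list
        recipients: list = field(default_factory=list)
        retention: str = "until end of project + 10 years"

    record = ProcessingRecord(
        project="COVID-19 cohort study",
        purpose="biomedical research on disease progression",
        legal_basis="GDPR Art. 6(1)(e) and Art. 9(2)(j)",
        data_subjects=["study participants"],
        data_categories=["pseudonymised clinical data", "genomic data"],
        recipients=["collaborating ELIXIR node"],
    )
    print(json.dumps(asdict(record), indent=2))   # exportable register entry

A project-oriented model like this, linking datasets, contracts, and collaborators to each record, is closer to the biomedical-research reality the authors contrast with generic process-based data-mapping tools.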

