Non-personal data sharing: Potential, pathways and problems

Author(s):  
Astha Kapoor ◽  
Amrita Nanda


2019 ◽
Vol 15 (3) ◽  
pp. 21-36
Author(s):  
Sheshadri Chatterjee ◽  
Sreenivasulu N.S.

Personal data sharing has become an important issue in both the public and private sectors. However, data subjects are often unwilling to share their data for security and privacy reasons: they fear that the data will be misused at the cost of their privacy, jeopardising their human rights. Personal data sharing is thus closely associated with human rights issues. This concern among data subjects has grown manifold with the advent of Artificial Intelligence (AI), since AI can analyse data without human intervention. Against this background, this article investigates how applications of AI, together with regulatory controls and appropriate governance, can influence the impact of personal data sharing on human rights abuses.


Cryptography ◽  
2019 ◽  
Vol 3 (1) ◽  
pp. 7 ◽  
Author(s):  
Karuna Pande Joshi ◽  
Agniva Banerjee

An essential requirement of any information management system is to protect data and resources against breaches or improper modification, while at the same time ensuring data access to legitimate users. Systems handling personal data are mandated to track its flow to comply with data protection regulations. We have built a novel framework that integrates a semantically rich data privacy knowledge graph with Hyperledger Fabric blockchain technology to develop an automated access-control and audit mechanism that enforces users' data privacy policies while sharing their data with third parties. Our blockchain-based data-sharing solution addresses two of the most critical challenges: transaction verification and permissioned data obfuscation. Our solution ensures accountability for data sharing in the cloud by incorporating a secure and efficient system for end-to-end provenance. In this paper, we describe this framework along with the comprehensive, semantically rich knowledge graph that we have developed to capture the rules embedded in data privacy policy documents. Our framework can be used by organizations to automate compliance for their cloud datasets.
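The kind of policy-mediated sharing this abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation: the rule objects stand in for the knowledge graph, and a plain append-only list stands in for the Hyperledger Fabric ledger; all names here are illustrative assumptions.

```python
# Hedged sketch of a policy check before a data-sharing transaction is
# recorded to an audit trail. Rules stand in for the knowledge graph;
# Ledger stands in for a real blockchain commit.
from dataclasses import dataclass, field

@dataclass
class PolicyRule:
    data_category: str      # e.g. "health", "location"
    allowed_purpose: str    # e.g. "research", "marketing"
    obfuscate: bool         # must the fields be masked before sharing?

@dataclass
class Ledger:
    entries: list = field(default_factory=list)

    def record(self, event: dict) -> None:
        # Append-only audit trail; a real system would commit to Fabric.
        self.entries.append(event)

def share(record: dict, category: str, purpose: str,
          rules: list, ledger: Ledger):
    """Return the (possibly obfuscated) record if policy allows, else None.
    Every decision, granted or denied, is written to the ledger."""
    for rule in rules:
        if rule.data_category == category and rule.allowed_purpose == purpose:
            out = dict(record)
            if rule.obfuscate:
                # Permissioned data obfuscation: mask values before release.
                out = {k: "***" for k in out}
            ledger.record({"category": category, "purpose": purpose,
                           "granted": True})
            return out
    ledger.record({"category": category, "purpose": purpose, "granted": False})
    return None
```

Under these assumptions, a request whose purpose matches a rule is released (masked if the rule demands it), while a non-matching request is denied, and both outcomes leave an audit entry.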


2016 ◽  
Vol 51 (1) ◽  
pp. 133-161 ◽  
Author(s):  
George R. Milne ◽  
George Pettinico ◽  
Fatima M. Hajjat ◽  
Ereni Markos

10.2196/16249 ◽  
2020 ◽  
Vol 22 (1) ◽  
pp. e16249
Author(s):  
Joanna Sleigh ◽  
Manuel Schneider ◽  
Julia Amann ◽  
Effy Vayena

Background: Data have become an essential factor in driving health research and are key to the development of personalized and precision medicine. Primary and secondary use of personal data holds significant potential for research; however, it also introduces a new set of challenges around consent processes, privacy, and data sharing. Research institutions have issued ethical guidelines to address these challenges and ensure responsible data processing and sharing. However, ethical guidelines directed at researchers and medical professionals are often complex, require readers who are familiar with specific terminology, and can be hard to understand for people without sufficient background knowledge in legislation, research, and data processing practices.
Objective: This study aimed to visually represent an ethics framework to make its content more accessible to its stakeholders. More generally, we wanted to explore the potential of visualizing policy documents to combat and prevent research misconduct by improving the capacity of actors in health research to handle data responsibly.
Methods: We used a mixed methods approach based on knowledge visualization with 3 sequential steps: qualitative content analysis (open and axial coding, among others); visualization of the knowledge structure that resulted from the previous step; and the addition of interactive functionality to access information using rapid prototyping.
Results: Through our iterative methodology, we developed a tool that allows users to explore an ethics framework for data sharing through an interactive visualization. Our results represent an approach that can make policy documents easier to understand and, therefore, more applicable in practice.
Conclusions: Meaningful communication and mutual understanding remain a challenge in various areas of health care and medicine. We contribute to advancing communication practices by introducing knowledge visualization to bioethics, offering a novel way to tackle this issue.


2018 ◽  
Vol 37 (4) ◽  
pp. 466-488 ◽  
Author(s):  
Petter Bae Brandtzaeg ◽  
Antoine Pultier ◽  
Gro Mette Moen

Personal data from mobile apps increasingly affect users' lives and privacy perceptions. However, there is a scarcity of research addressing the combination of (1) individual perceptions of mobile app privacy, (2) actual dataflows in apps, and (3) how such perceptions and dataflows relate to the actual privacy policies and terms of use of mobile apps. To address these gaps, we conducted an innovative mixed-methods study comprising a representative user survey in Norway, an analysis of personal dataflows in apps, and a content analysis of the privacy policies of 21 popular, free Android mobile apps. Our findings show that more than half the respondents in the user survey had repeatedly refrained from downloading or using apps to avoid sharing personal data. Our analysis of dataflows applied a novel methodology measuring activity in the apps over time (48 hr). The investigation showed that 19 of the 21 apps transmitted personal data to a total of approximately 600 different primary and third-party domains. From a European perspective, it is particularly noteworthy that most of these domains were associated with tech companies in the United States, where privacy laws are less strict than those governing companies operating from Europe. The investigation further revealed that some apps by default track and share user data continuously, even when the app is not in use. For some of these, the terms of use provided with the apps did not inform users about this tracking practice. A comparison of the terms of use provided in the studied apps with the actual personal dataflows identified in the analysis disclosed that three of the apps shared data in violation of their terms of use. A possible solution for the mobile app industry to strengthen user trust is privacy by design, through opt-in data sharing with the service and third parties and more granular information on personal data sharing practices. Also, based on the findings from this study, we suggest specific visualizations to enhance the transparency of personal dataflows in mobile apps. A methodological contribution is that a mixed-methods approach strengthens our understanding of the complexity of privacy issues in mobile apps.


2020 ◽  
Vol 69 (5) ◽  
pp. 457-473 ◽  
Author(s):  
Romain Meys

Abstract: This paper explores how the existing European rules on the legal and contractual protection of databases limit the re-use of non-personal data by start-ups and SMEs for the purpose of developing artificial intelligence in the European Union. The analysis aims to determine whether the recent initiatives on data mining and data sharing are adequate to ensure an appropriate level of data re-usability for that purpose. The paper argues that additional reforms are needed to establish a more balanced European framework for the legal and contractual protection of databases, and it therefore contemplates the introduction of data user rights, which would facilitate access to and re-use of non-personal data by the enterprises in question.

