The Anonymisation of Research Data — A Pyrrhic Victory for Privacy that Should Not Be Pushed Too Hard by the EU Data Protection Framework?

2017
Vol 24 (4)
pp. 347-367
Author(s):  
Paul Quinn

Personal health data is essential to many forms of scientific research. Such data may come from a large variety of sources, including electronic health records (EHRs), datasets used for previous research and data linked to biobanks. European data protection law recognises that, in addition to using consent as a legal basis for processing personal health data for scientific research, such data may be used without consent where it is in the ‘public interest’. Despite the existence of this legal option, ethics bodies in a number of states have been reticent to use it, often pushing researchers into either obtaining consent or anonymising the data in question. Whilst the latter option may be appealing from a legal point of view, anonymisation, if carried out properly, may reduce or even destroy the research value of the data.

Laws
2020
Vol 9 (1)
pp. 6
Author(s):
Mark J. Taylor
Tess Whitton

The United Kingdom’s Data Protection Act 2018 introduces a new public interest test applicable to the research processing of personal health data. The need for interpretation and application of this new safeguard creates a further opportunity to craft a health data governance landscape deserving of public trust and confidence. At a minimum, to constitute a positive contribution, the new test must be capable of distinguishing, in a meaningful, predictable and reproducible manner, instances of health research that are in the public interest from those that are not. In this article, we derive from the literature on theories of public interest a concept of public interest capable of supporting such a test. Its application can defend the position under data protection law that allows a legal route through to processing personal health data for research purposes that does not require individual consent. However, its adoption would also entail that the public interest test in the 2018 Act could only be met if all practicable steps are taken to maximise preservation of individual control over the use of personal health data for research purposes. This would require that consent is sought where practicable and objection respected in almost all circumstances. Importantly, we suggest that an advantage of relying upon this concept of the public interest, to ground the test introduced by the 2018 Act, is that it may work to promote the social legitimacy of data protection legislation and the research processing that it authorises without individual consent (and occasionally in the face of explicit objection).


2021
Author(s):  
Stergios Aidinlis

Governments across the EU are increasingly turning their attention to advanced big data analytics, aiming to use their data to inform the design and implementation of public policies. Due to limitations in expertise and resources, this is often impossible without the formation of data sharing partnerships with private actors. Yet, the prevailing view in EU data protection regulatory guidance is that the ‘public interest’ and private interests as lawful grounds for data processing under article 6 GDPR find themselves in a zero-sum relationship. The ‘public interest’ under article 6(1)(e) GDPR is construed as the exclusive realm of public authorities, which are often advised against relying on other grounds for processing, associated with private interests, such as ‘legitimate interests’ under article 6(1)(f) GDPR. This chapter argues against the presently dominant divide between public and private interests under lawful grounds for processing, sketching the emergence of Government-to-Business (G2B) research data sharing in the EU. A conceptualisation of the ‘public interest’ as not incompatible with private interests, as long as a contribution to societal well-being is made through data processing, is offered in that regard. The chapter elaborates on this conceptualisation and the requirements for ensuring protection of the fundamental rights of data subjects, while reflecting on the research questions that should concern future EU data protection law researchers with regard to its adoption.


2022
Vol 6 (GROUP)
pp. 1-22
Author(s):  
Melanie Duckert
Louise Barkhuus

It is important to keep digital health data secure, and patients' perceptions of its privacy are essential to the development of digital health records. In this paper we present people's perceptions of how data protection is communicated, in relation to their personal health data and the access to it; we focused particularly on people with chronic or long-term illness. Based on their use of personally accessible health records, we inquired into their explicit perception of security and sense of data privacy in relation to their health data. Our goal was to provide insights and guidelines to designers and developers on communicating data protection in health records in a way that is accessible to users. We analyzed their approach to and experience with their own health care records and describe the details of their challenges. A conceptual framework called 'Privacy Awareness' was developed from the findings and reflects the perspectives of the users. The conceptual framework forms the basis of a proposal for design guidelines for Digital Health Record systems, which aim to address, facilitate and improve users' awareness of the protection of their online health data.


Author(s):  
Luan Ibraimi
Qiang Tang
Pieter Hartel
Willem Jonker

Commercial Web-based Personal Health Record (PHR) systems let patients share their personal health records (PHRs) anytime, from anywhere. PHRs are highly sensitive data, and inappropriate disclosure may cause serious problems for an individual. Commercial Web-based PHR systems therefore have to ensure that patient health data is secured using state-of-the-art mechanisms. In current commercial PHR systems, even though patients have the power to define the access control policy on who can access their data, they have to trust the access-control manager of the commercial PHR system entirely to enforce these policies properly. Patients therefore hesitate to upload their health data to these systems, as the data is processed unencrypted on untrusted platforms. Recent proposals exploit encryption techniques to enforce access control policies: information is stored in encrypted form by the third party, and there is no need for an access control manager. This implies that data remains confidential even if the database maintained by the third party is compromised. In this paper we propose a new encryption technique, a type-and-identity-based proxy re-encryption scheme, which is suitable for the healthcare setting. The proposed scheme allows users (patients) to securely store their PHRs on commercial Web-based PHR systems and securely share them with other users (doctors).
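The core idea of proxy re-encryption, in which a semi-trusted proxy transforms a ciphertext for one key into a ciphertext for another without ever seeing the plaintext, can be illustrated with a classic ElGamal-based construction in the style of Blaze, Bleumer and Strauss. This is an illustrative sketch only, not the type-and-identity-based scheme proposed in the paper, and it uses deliberately tiny, insecure parameters for readability:

```python
import secrets

# Toy parameters (NOT secure): safe prime p = 2q + 1, with g generating
# the subgroup of prime order q.
p, q, g = 467, 233, 4

def keygen():
    """Return (secret key, public key) with pk = g^sk mod p."""
    sk = secrets.randbelow(q - 2) + 1
    return sk, pow(g, sk, p)

def encrypt(pk, m):
    """BBS98-style ciphertext (m * g^r, pk^r) for a message m < p."""
    r = secrets.randbelow(q - 2) + 1
    return (m * pow(g, r, p)) % p, pow(pk, r, p)

def rekey(sk_a, sk_b):
    """Re-encryption key b/a mod q, handed to the semi-trusted proxy."""
    return (sk_b * pow(sk_a, -1, q)) % q

def reencrypt(rk, ct):
    """Proxy step: (g^{ar})^{b/a} = g^{br}; the plaintext is never seen."""
    c1, c2 = ct
    return c1, pow(c2, rk, p)

def decrypt(sk, ct):
    """Recover g^r = c2^{1/sk}, then m = c1 / g^r."""
    c1, c2 = ct
    gr = pow(c2, pow(sk, -1, q), p)
    return (c1 * pow(gr, -1, p)) % p

# Patient (Alice) encrypts under her own key; the proxy re-encrypts
# for the doctor (Bob) without decrypting.
sk_a, pk_a = keygen()
sk_b, pk_b = keygen()
record = 42                                  # stand-in for an encoded PHR entry
ct = encrypt(pk_a, record)
assert decrypt(sk_a, ct) == record
ct_b = reencrypt(rekey(sk_a, sk_b), ct)
assert decrypt(sk_b, ct_b) == record
```

In this style of system, a PHR server would store only ciphertexts and patient-authorised re-encryption keys, so handing a transformed ciphertext to a doctor never exposes the plaintext or the patient's secret key to the server.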

