Individual Rights in Biobank Research Under the GDPR

Author(s):  
Ciara Staunton

Abstract The coming into force of the General Data Protection Regulation (GDPR) on 25 May 2018 has brought about considerable changes in how data may be collected, stored and used. Biobanks, which require the collection, use and re-use of large quantities of biological samples and data, will be affected by these changes. In seeking to require ‘data protection by design’, the GDPR provides data subjects with certain individual rights. They are: the right to be informed, the right of access, the right to rectification, the right to erasure, the right to restrict processing, the right to data portability, the right to object, and rights in relation to automated decision-making and profiling. This chapter will consider each of these individual rights in turn and discuss their impact on biobank research. In particular, it will discuss the challenges that biobanks now face in upholding these individual rights, the limits of these rights in light of the technical realities of biobanks, and the potential impact that they may have on the collection, sharing, use and re-use of biological data and material.

2020 ◽  
Author(s):  
Bart Sloot

The General Data Protection Regulation in Plain Language is a guide for anyone interested in the much-discussed rules of the GDPR. In this legislation, which came into force in 2018, the European Union meticulously describes what you can and cannot do with data about other people. Violating these rules can lead to a fine of up to 20 million euros. This book sets out the most important obligations of individuals and organisations that process data about others. These include taking technical security measures, carrying out an impact assessment and registering all data-processing procedures within an organisation. It also discusses the rights of citizens whose data are processed, such as the right to be forgotten, the right to information and the right to data portability.


2021 ◽  
Vol 46 (3-4) ◽  
pp. 321-345
Author(s):  
Robert Grzeszczak ◽  
Joanna Mazur

Abstract The development of automated decision-making technologies creates the threat of de-iuridification: replacement of the provisions of legal acts with automated, technological solutions. The article examines how selected provisions of the General Data Protection Regulation concerning, among other things, data protection impact assessments, the right not to be subject to automated decision-making, information obligations and the right of access are applied in the Polish national legal order. We focus on the institutional and procedural solutions regarding the involvement of expert bodies and other stakeholders in the process of specification of the norms included in the GDPR and their enforcement. We argue that the example of Poland shows that the solutions adopted in the GDPR do not shift the balance of regulatory power over automated decision-making to other stakeholders and as such do not favor a more participative approach to the regulatory processes.


2020 ◽  
pp. 146144482093403
Author(s):  
Sarah Turner ◽  
July Galindo Quintero ◽  
Simon Turner ◽  
Jessica Lis ◽  
Leonie Maria Tanczer

The right to data portability (RtDP), as outlined in the European Union’s General Data Protection Regulation (GDPR), enables data subjects to transmit their data from one service to another. This is of particular interest in the evolving Internet of Things (IoT) environment. This research delivers the first empirical analysis detailing the exercisability of the RtDP in the context of consumer IoT devices and the information provided to users about exercising the right. In Study 1, we reviewed 160 privacy policies of IoT producers to understand the level of information provided to a data subject. In Study 2, we tested four widely available IoT systems to examine whether procedures are in place to enable users to exercise the RtDP. Both studies showcase how the RtDP is not yet exercisable in the IoT environment, risking consumers being unable to unlock the long-term benefits of IoT systems.


Author(s):  
Lilian Edwards ◽  
Michael Veale

Cite as Lilian Edwards and Michael Veale, 'Slave to the Algorithm? Why a 'right to an explanation' is probably not the remedy you are looking for' (2017) 16 Duke Law and Technology Review 18–84. (First posted on SSRN 24 May 2017.)

Algorithms, particularly machine learning (ML) algorithms, are increasingly important to individuals’ lives, but have caused a range of concerns revolving mainly around unfairness, discrimination and opacity. Transparency in the form of a “right to an explanation” has emerged as a compellingly attractive remedy since it intuitively promises to “open the black box” to promote challenge, redress, and hopefully heightened accountability. Amidst the general furore over algorithmic bias we describe, any remedy in a storm has looked attractive. However, we argue that a right to an explanation in the EU General Data Protection Regulation (GDPR) is unlikely to present a complete remedy to algorithmic harms, particularly in some of the core “algorithmic war stories” that have shaped recent attitudes in this domain. Firstly, the law is restrictive, unclear, or even paradoxical concerning when any explanation-related right can be triggered. Secondly, even navigating this, the legal conception of explanations as “meaningful information about the logic of processing” may not be provided by the kind of ML “explanations” computer scientists have developed, partially in response. ML explanations are restricted both by the type of explanation sought, the dimensionality of the domain and the type of user seeking an explanation.
However, “subject-centric” explanations (SCEs), focussing on particular regions of a model around a query, show promise for interactive exploration, as do explanation systems based on learning a model from the outside rather than taking it apart (pedagogical vs decompositional explanations) in dodging developers’ worries of IP or trade secrets disclosure. Based on our analysis, we fear that the search for a “right to an explanation” in the GDPR may at best be distracting, and at worst nurture a new kind of “transparency fallacy.” But all is not lost. We argue that other parts of the GDPR related (i) to the right to erasure (“right to be forgotten”) and the right to data portability; and (ii) to privacy by design, Data Protection Impact Assessments, and certification and privacy seals, may have the seeds we can use to make algorithms more responsible, explicable, and human-centred.


2020 ◽  
Vol 3 (1) ◽  
pp. 17
Author(s):  
Kajcsa Andrea

The changes brought about by the General Data Protection Regulation since May 2018 are complex and ambitious. The General Data Protection Regulation is one of the most wide-ranging pieces of legislation passed by the EU in recent years, and it introduces many concepts that are yet to be fully explored in practice, such as the right to be forgotten, data portability and data breach notification. This paper analyzes the main obligations that public bodies in particular have now that the GDPR has entered into force, and evaluates the impact this legislative act has on the routine activities carried out by public authorities in Romania. To reach our goal, we will make reference to the obligations that are specific to public administration authorities as well as to those from which public bodies are exempted. We will also analyze the national legislative measures adopted in Romania after the GDPR entered into force, and the degree to which these have particularized the way public bodies are allowed and obliged to process personal data in Romania.


2020 ◽  
Vol 11 (1) ◽  
pp. 18-50 ◽  
Author(s):  
Maja BRKAN ◽  
Grégory BONNET

Understanding the causes and correlations of algorithmic decisions is currently one of the major challenges of computer science, addressed under the umbrella term “explainable AI (XAI)”. Being able to explain an AI-based system may help to make algorithmic decisions more satisfying and acceptable, to better control and update AI-based systems in case of failure, to build more accurate models, and to discover new knowledge directly or indirectly. On the legal side, the question whether the General Data Protection Regulation (GDPR) provides data subjects with a right to explanation in case of automated decision-making has equally been the subject of a heated doctrinal debate. While arguing that the right to explanation in the GDPR should be the result of interpretative analysis of several GDPR provisions jointly, the authors move this debate forward by discussing the technical and legal feasibility of explaining algorithmic decisions. Legal limits, in particular the secrecy of algorithms, as well as technical obstacles could potentially obstruct the practical implementation of this right. By adopting an interdisciplinary approach, the authors explore not only whether it is possible to translate the EU legal requirements for an explanation into actual machine learning decision-making, but also whether those limitations can shape the way the legal right is used in practice.

