The Right of Access under EU Data Protection Law

Author(s):  
Helena U. Vrabec

Chapter 5 focuses on Article 15 of the GDPR and explains the scope of the information that can be accessed under the right. The chapter then discusses the importance of the interface for submitting data subject access requests. The core part of Chapter 5 is the analysis of the regulatory boundaries of the right of access and the various avenues to limit the right, for instance, a conflict with the rights of another individual. Finally, the chapter illustrates how the right of access operates in the data-driven economy by applying it to three different contexts: shared data, anonymised/pseudonymised data, and automated decision-making.

2018, Vol 12 (2), pp. 221-246
Author(s):  
Angela Sobolčiaková

The paper discusses the right to obtain a copy of personal data based on the access right guaranteed in Article 15(3) and limited in Article 15(4) of the GDPR. The main question is to what extent the access right provided to the data subject under data protection rules is compatible with copyright. We argue that the subject matter of Article 15(3) of the GDPR, a copy of personal data, may infringe the copyright protection of third parties but not the copyright protection attributed to the data controllers. Firstly, the right of access and copyright may in certain circumstances be incompatible. Secondly, the data controllers are primarily responsible for balancing conflicting rights, and a neutral balancing exercise could only be applied by the Data Protection Authorities. Thirdly, the case law of the CJEU regarding this issue will need to be developed, because the copy resulting from the access right may be considered a new element in data protection law.


2020
Author(s):  
Frederik Zuiderveen Borgesius

Algorithmic decision-making and other types of artificial intelligence (AI) can be used to predict who will commit crime, who will be a good employee, who will default on a loan, etc. However, algorithmic decision-making can also threaten human rights, such as the right to non-discrimination. The paper evaluates current legal protection in Europe against discriminatory algorithmic decisions. The paper shows that non-discrimination law, in particular through the concept of indirect discrimination, prohibits many types of algorithmic discrimination. Data protection law could also help to defend people against discrimination. Proper enforcement of non-discrimination law and data protection law could help to protect people. However, the paper shows that both legal instruments have severe weaknesses when applied to artificial intelligence. The paper suggests how enforcement of current rules can be improved. The paper also explores whether additional rules are needed. The paper argues for sector-specific – rather than general – rules, and outlines an approach to regulate algorithmic decision-making.


Author(s):  
Jef Ausloos

This book critically investigates the role of data subject rights in countering information and power asymmetries online. It aims at dissecting ‘data subject empowerment’ in the information society through the lens of the right to erasure (‘right to be forgotten’) in Article 17 of the General Data Protection Regulation (GDPR). In doing so, it provides an extensive analysis of the interaction between the GDPR and the fundamental right to data protection in Article 8 of the Charter of Fundamental Rights of the EU (Charter), how data subject rights affect fair balancing of fundamental rights, and what the practical challenges are to effective data subject rights. The book starts with exploring the data-driven asymmetries that characterize individuals’ relationship with tech giants. These commercial entities increasingly anticipate and govern how people interact with each other and the world around them, affecting core values such as individual autonomy, dignity, and freedom. The book explores how data protection law, and data subject rights in particular, enable resisting, breaking down or at the very least critically engaging with these asymmetric relationships. It concludes that despite substantial legal and practical hurdles, the GDPR’s right to erasure does play a meaningful role in furthering the fundamental right to data protection (Art 8 Charter) in the face of power asymmetries online.


Author(s):  
Helena U. Vrabec

Chapter 7 analyses the right to data portability set out in Article 20 of the GDPR. It first provides an overview of several commercial and regulatory initiatives that preceded the GDPR version of the right to personal data portability. Next, it explores the language of Article 20 to demonstrate the effects of the narrow scope of the right. The chapter then shows how data portability interacts with other data subject rights, particularly the right of access and the right to be forgotten, before describing manifestations of data portability in legal areas outside of data protection law. Finally, the chapter explores the specific objective of the right to data portability under the GDPR as an enabler of data subjects' control.


2021, Vol 46 (3-4), pp. 321-345
Author(s):  
Robert Grzeszczak ◽  
Joanna Mazur

Abstract The development of automated decision-making technologies creates the threat of de-iuridification: the replacement of legal acts' provisions with automated, technological solutions. The article examines how selected provisions of the General Data Protection Regulation concerning, among other things, data protection impact assessments, the right not to be subject to automated decision-making, information obligations, and the right of access are applied in the Polish national legal order. We focus on the institutional and procedural solutions regarding the involvement of expert bodies and other stakeholders in the process of specifying the norms included in the GDPR and enforcing them. We argue that the example of Poland shows that the solutions adopted in the GDPR do not shift the balance of regulatory power over automated decision-making to other stakeholders and as such do not favor a more participative approach to regulatory processes.


2019, Vol 9 (4), pp. 3-18
Author(s):  
Joanna Mazur

Abstract The article is predicated upon the claim that there is a similarity between the scientific uncertainty surrounding the hazard which human interventions pose to the natural environment and the hazard which the development of automated decision-making techniques poses to certain aspects of human lives in the digital environment. On the basis of this claim, the analysis examines the similarities between European environmental law, which is crucial for the natural environment, and European data protection law, which is fundamental for the digital environment. As data protection law has already adopted measures from environmental law, such as impact assessments and the right of access to information, the main question of this analysis is whether further inspiration for the development of European data protection law could be drawn from environmental law, given the scientific uncertainty common to these two areas of regulation. The article examines a specific legal measure, namely the precautionary principle, as a conjectural response to the challenges linked to the development of new technologies. The experience collected in the area of environmental law concerning the precautionary principle is analysed as a source of lessons concerning the regulatory measures adopted in order to deal with scientific uncertainty, not only in the natural environment but also in the digital one.


Author(s):  
Lee A. Bygrave

Article 3(2)(b) (Monitoring of data subjects’ behaviour); Article 5 (Principles relating to processing of personal data); Article 6 (Legal grounds for processing of personal data); Article 8 (Conditions applicable to children’s consent in relation to information society services) (see also recital 38); Article 13(2)(f) (Information on the existence of automated decision-making, including profiling) (see also recital 60); Article 14(2)(g) (Information on the existence of automated decision-making, including profiling) (see also recital 60); Article 15(1)(h) (Right of access regarding automated decision-making, including profiling) (see also recital 63); Article 21 (Right to object) (see also recital 70); Article 22 (Automated decision-making, including profiling) (see also recital 71); Article 23 (Restrictions) (see also recital 73); Article 35(3)(a) (Data protection impact assessment) (see also recital 91); Article 47(2)(e) (Binding corporate rules); Article 70(1)(f) (EDPB guidelines on automated decisions based on profiling).


Author(s):  
Helena U. Vrabec

Chapter 8 focuses on the provisions of Articles 21 and 22 of the GDPR in relation to profiling. The chapter first provides the reader with the essential context on profiling as a building block of the data economy. It then discusses how the GDPR tackles the risks of profiling, providing a legal analysis of the right to object and the right not to be subject to automated decision-making. These two rights, together with the right to information and access to the extent that they refer to automated decision-making, form a cluster of rights that can be referred to as a ‘choice architecture for data subjects’ and can be particularly useful to control profiling.


Author(s):  
Lee A. Bygrave

Article 3(2)(b) (Monitoring of data subjects’ behaviour); Article 4(4) (Definition of ‘profiling’); Article 5(1)(a) (Fair and transparent processing) (see also recitals 39 and 60); Article 5(2) (Accountability); Article 6 (Legal grounds for processing of personal data); Article 8 (Conditions applicable to children’s consent in relation to information society services); Article 12 (see also recital 58); Article 13(2)(f) (Information on the existence of automated decision-making); Article 14(2)(g) (Information on the existence of automated decision-making); Article 15(1)(h) (Right of access regarding automated decision-making); Article 21 (Right to object) (see also recital 70); Article 23 (Restrictions); Article 35(3)(a) (Data protection impact assessment) (see also recital 84); Article 47(2)(e) (Binding corporate rules); Article 70(1)(f) (EDPB guidelines on automated decisions based on profiling).

