Big Data Analytics Under the EU General Data Protection Regulation

Author(s):  
E.M.L. Moerel ◽  
Alex van der Wolk
2019


Author(s):  
Peter Kieseberg ◽  
Lukas Daniel Klausner ◽  
Andreas Holzinger

In discussions on the General Data Protection Regulation (GDPR), anonymisation and deletion are frequently mentioned as suitable technical and organisational measures (TOMs) for privacy protection. The major problem of distortion in machine learning environments, along with related privacy issues, is rarely mentioned. The Big Data Analytics project addresses these issues.
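To make the tension concrete, here is a minimal Python sketch (our illustration, not code from the paper or the project): it pseudonymises a direct identifier with a salted hash, then honours an erasure request and shows how deletion shifts a simple statistic that a model might be trained on. All record values, the salt, and the helper names are hypothetical.

```python
# Illustrative sketch: pseudonymisation and deletion as TOMs, and the
# distortion that deletion introduces into data used for machine learning.
# All identifiers, values, and the salt below are hypothetical.
import hashlib
import statistics

records = [
    {"user_id": "alice", "age": 34, "spend": 120.0},
    {"user_id": "bob", "age": 29, "spend": 80.0},
    {"user_id": "carol", "age": 51, "spend": 310.0},
]

SALT = b"rotate-me-regularly"  # hypothetical salt

def pseudonymise(record):
    """Replace the direct identifier with a salted hash (a TOM; not full anonymisation)."""
    token = hashlib.sha256(SALT + record["user_id"].encode()).hexdigest()[:16]
    return {**record, "user_id": token}

def erase(dataset, user_id):
    """Honour an erasure (Art. 17) request by dropping the subject's records."""
    return [r for r in dataset if r["user_id"] != user_id]

pseudo = [pseudonymise(r) for r in records]
print("pseudonymised sample:", pseudo[0])

print("mean spend before erasure:", statistics.mean(r["spend"] for r in records))
after = erase(records, "carol")
print("mean spend after erasure: ", statistics.mean(r["spend"] for r in after))
# The training distribution shifts once carol's record is removed,
# illustrating the distortion problem the abstract refers to.
```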


Author(s):  
Sandra Wachter ◽  
Brent Mittelstadt

Big Data analytics and artificial intelligence (AI) draw non-intuitive and unverifiable inferences and predictions about the behaviors, preferences, and private lives of individuals. These inferences draw on highly diverse and feature-rich data of unpredictable value, and create new opportunities for discriminatory, biased, and invasive decision-making. Concerns about algorithmic accountability are often actually concerns about the way in which these technologies draw privacy-invasive and non-verifiable inferences about us that we cannot predict, understand, or refute.

Data protection law is meant to protect people’s privacy, identity, reputation, and autonomy, but it is currently failing to protect data subjects from the novel risks of inferential analytics. The broad concept of personal data in Europe could be interpreted to include inferences, predictions, and assumptions that refer to or impact an individual. If such inferences are treated as personal data, individuals are granted numerous rights over them under data protection law. However, the legal status of inferences is heavily disputed in legal scholarship, and marked by inconsistencies and contradictions within and between the views of the Article 29 Working Party and the European Court of Justice.

As we show in this paper, individuals are granted little control and oversight over how their personal data is used to draw inferences about them. Compared to other types of personal data, inferences are effectively ‘economy class’ personal data in the General Data Protection Regulation (GDPR). Data subjects’ rights to know about (Art 13–15), rectify (Art 16), delete (Art 17), object to (Art 21), or port (Art 20) personal data are significantly curtailed when it comes to inferences, often requiring a greater balance with the controller’s interests (e.g. trade secrets, intellectual property) than would otherwise be the case. Similarly, the GDPR provides insufficient protection against sensitive inferences (Art 9) or remedies to challenge inferences or important decisions based on them (Art 22(3)).

This situation is not accidental. In standing jurisprudence, the European Court of Justice (ECJ; Bavarian Lager, YS and M and S, and Nowak) and the Advocate General (AG; YS and M and S and Nowak) have consistently restricted the remit of data protection law to assessing the legitimacy of input personal data undergoing processing, and to rectifying, blocking, or erasing it. Critically, the ECJ has likewise made clear that data protection law is not intended to ensure the accuracy of decisions and decision-making processes involving personal data, or to make these processes fully transparent.

Conflict looms on the horizon in Europe that will further weaken the protection afforded to data subjects against inferences. Current policy proposals addressing privacy protection (the ePrivacy Regulation and the EU Digital Content Directive) fail to close the GDPR’s accountability gaps concerning inferences. At the same time, the GDPR and Europe’s new Copyright Directive aim to facilitate data mining, knowledge discovery, and Big Data analytics by limiting data subjects’ rights over personal data. And lastly, the new Trade Secrets Directive provides extensive protection of commercial interests attached to the outputs of these processes (e.g. models, algorithms, and inferences).

In this paper we argue that a new data protection right, the ‘right to reasonable inferences’, is needed to help close the accountability gap currently posed by ‘high risk inferences’, meaning inferences that are privacy-invasive or reputation-damaging and have low verifiability in the sense of being predictive or opinion-based. In cases where algorithms draw ‘high risk inferences’ about individuals, this right would require the data controller to provide an ex-ante justification establishing whether an inference is reasonable. This disclosure would address (1) why certain data are a relevant basis to draw inferences; (2) why these inferences are relevant for the chosen processing purpose or type of automated decision; and (3) whether the data and methods used to draw the inferences are accurate and statistically reliable. The ex-ante justification is bolstered by an additional ex-post mechanism enabling unreasonable inferences to be challenged. A right to reasonable inferences must, however, be reconciled with EU jurisprudence and counterbalanced with IP and trade secrets law, as well as freedom of expression and Article 16 of the EU Charter of Fundamental Rights: the freedom to conduct a business.
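As a rough illustration of those three disclosure elements, an ex-ante justification record could be captured as a simple data structure. The sketch below is our hypothetical rendering, not the authors’ proposal in code form; every field name and example value is invented for illustration.

```python
# A minimal, hypothetical sketch of an ex-ante justification record for a
# 'high risk inference', mirroring the three disclosure elements above,
# plus a flag for the ex-post challenge mechanism.
from dataclasses import dataclass, field


@dataclass
class InferenceJustification:
    inference: str                 # the inference being drawn about the data subject
    source_data: list[str]         # personal data relied upon
    data_relevance: str            # (1) why this data is a relevant basis
    purpose_relevance: str         # (2) why the inference matters for the purpose/decision
    reliability_evidence: str      # (3) accuracy / statistical reliability of data and methods
    challenged: bool = field(default=False)  # ex-post: has the subject contested it?


example = InferenceJustification(
    inference="likely creditworthiness band",
    source_data=["transaction history", "postcode"],
    data_relevance="transaction history correlates with repayment behaviour (hypothetical claim)",
    purpose_relevance="feeds an automated loan pre-approval decision (Art 22 context)",
    reliability_evidence="model AUC 0.81 on a holdout set (hypothetical figure)",
)
print(example.inference, "| challenged:", example.challenged)
```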


This new book provides an article-by-article commentary on the new EU General Data Protection Regulation. Adopted in April 2016 and applicable from May 2018, the GDPR is the centrepiece of the recent reform of the EU regulatory framework for protection of personal data. It replaces the 1995 EU Data Protection Directive and has become the most significant piece of data protection legislation anywhere in the world. This book is edited by three leading authorities and written by a team of expert specialists in the field from around the EU and representing different sectors (including academia, the EU institutions, data protection authorities, and the private sector), thus providing a pan-European analysis of the GDPR. It examines each article of the GDPR in sequential order and explains how its provisions work, allowing the reader to identify the meaning of individual articles quickly and easily. An introductory chapter provides an overview of the background to the GDPR and its place in the greater structure of EU law and human rights law. Account is also taken of closely linked legal instruments, such as the Directive on Data Protection and Law Enforcement that was adopted concurrently with the GDPR, and of the ongoing work on the proposed new E-Privacy Regulation.


AJIL Unbound ◽  
2020 ◽  
Vol 114 ◽  
pp. 5-9 ◽  
Author(s):  
Cedric Ryngaert ◽  
Mistale Taylor

The deterritorialization of the Internet and international communications technology has given rise to acute jurisdictional questions regarding who may regulate online activities. In the absence of a global regulator, states act unilaterally, applying their own laws to transborder activities. The EU's “extraterritorial” application of its data protection legislation—initially the Data Protection Directive (DPD) and, since 2018, the General Data Protection Regulation (GDPR)—is a case in point. The GDPR applies to “the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to: (a) the offering of goods or services . . . to such data subjects in the Union; or (b) the monitoring of their behaviour . . . within the Union.” It also conditions data transfers outside the EU on third states having adequate (meaning essentially equivalent) data protection standards. This essay outlines forms of extraterritoriality evident in EU data protection law, which could be legitimized by certain fundamental rights obligations. It then looks at how the EU balances data protection with third states’ countervailing interests. This approach can involve burdens not only for third states or corporations, but also for the EU political branches themselves. EU law viewed through the lens of public international law shows how local regulation is going global, despite its goal of protecting only EU data subjects.
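Reduced to its logical skeleton, the Article 3(2) test quoted above can be sketched as a boolean rule. This is an illustrative simplification only; real applicability assessments are fact-specific legal judgments, and all function and parameter names below are our own.

```python
# Hypothetical sketch of the GDPR territorial-scope test described above.
# Art. 3(1) (establishment) is simplified to a single flag; Art. 3(2) requires
# a data subject in the Union plus either limb (a) offering goods/services
# or limb (b) monitoring behaviour within the Union.
def gdpr_applies(controller_established_in_eu: bool,
                 subject_in_union: bool,
                 offers_goods_or_services_to_eu_subjects: bool,
                 monitors_behaviour_in_union: bool) -> bool:
    if controller_established_in_eu:
        return True  # Art. 3(1), simplified
    return subject_in_union and (
        offers_goods_or_services_to_eu_subjects  # Art. 3(2)(a)
        or monitors_behaviour_in_union           # Art. 3(2)(b)
    )

# A non-EU analytics firm tracking the browsing of users located in the Union:
print(gdpr_applies(False, True, False, True))  # -> True
```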


2020 ◽  
Vol 48 (S1) ◽  
pp. 187-195
Author(s):  
Edward S. Dove ◽  
Jiahong Chen

In this article, we consider the possible application of the European General Data Protection Regulation (GDPR) to “citizen scientist”-led health research with mobile devices. We argue that the GDPR likely does cover this activity, depending on the specific context and the territorial scope. Remaining open questions from our analysis lead us to call for a lex specialis that would provide greater clarity and certainty regarding the processing of health data for research purposes, including by these non-traditional researchers.


2019 ◽  
Vol 16 (1) ◽  
pp. 158-191 ◽  
Author(s):  
Christopher Kuner

The importance of personal data processing for international organizations (‘IOs’) demonstrates the need for them to implement data protection in their work. The EU General Data Protection Regulation (‘GDPR’) will be influential around the world, and will impact IOs as well. Its application to them should be determined under relevant principles of EU law and public international law, and it should be interpreted consistently with the international obligations of the EU and its Member States. However, IOs should implement data protection measures regardless of whether the GDPR applies to them in a legal sense. EU law and international law need to take better account of each other, so that IOs can enjoy their privileges and immunities with regard to EU law as well, avoid conflicts with international law, and still provide a high level of data protection in their operations.


IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 111709-111726 ◽  
Author(s):  
Mamoona N. Asghar ◽  
Nadia Kanwal ◽  
Brian Lee ◽  
Martin Fleury ◽  
Marco Herbst ◽  
...  
