A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI

Author(s):  
Sandra Wachter ◽  
Brent Mittelstadt

Big Data analytics and artificial intelligence (AI) draw non-intuitive and unverifiable inferences and predictions about the behaviors, preferences, and private lives of individuals. These inferences draw on highly diverse and feature-rich data of unpredictable value, and create new opportunities for discriminatory, biased, and invasive decision-making. Concerns about algorithmic accountability are often actually concerns about the way in which these technologies draw privacy-invasive and non-verifiable inferences about us that we cannot predict, understand, or refute.

Data protection law is meant to protect people’s privacy, identity, reputation, and autonomy, but is currently failing to protect data subjects from the novel risks of inferential analytics. The broad concept of personal data in Europe could be interpreted to include inferences, predictions, and assumptions that refer to or impact on an individual. If inferences are seen as personal data, individuals are granted numerous rights under data protection law. However, the legal status of inferences is heavily disputed in legal scholarship, and marked by inconsistencies and contradictions within and between the views of the Article 29 Working Party and the European Court of Justice.

As we show in this paper, individuals are granted little control and oversight over how their personal data is used to draw inferences about them. Compared to other types of personal data, inferences are effectively ‘economy class’ personal data in the General Data Protection Regulation (GDPR). Data subjects’ rights to know about (Art 13-15), rectify (Art 16), delete (Art 17), object to (Art 21), or port (Art 20) personal data are significantly curtailed when it comes to inferences, often requiring a greater balance with controllers’ interests (e.g. trade secrets, intellectual property) than would otherwise be the case. Similarly, the GDPR provides insufficient protection against sensitive inferences (Art 9) or remedies to challenge inferences or important decisions based on them (Art 22(3)).

This situation is not accidental. In standing jurisprudence the European Court of Justice (ECJ; Bavarian Lager, YS and M and S, and Nowak) and the Advocate General (AG; YS and M and S and Nowak) have consistently restricted the remit of data protection law to assessing the legitimacy of input personal data undergoing processing, and to rectifying, blocking, or erasing it. Critically, the ECJ has likewise made clear that data protection law is not intended to ensure the accuracy of decisions and decision-making processes involving personal data, or to make these processes fully transparent.

Conflict looms on the horizon in Europe that will further weaken the protection afforded to data subjects against inferences. Current policy proposals addressing privacy protection (the ePrivacy Regulation and the EU Digital Content Directive) fail to close the GDPR’s accountability gaps concerning inferences. At the same time, the GDPR and Europe’s new Copyright Directive aim to facilitate data mining, knowledge discovery, and Big Data analytics by limiting data subjects’ rights over personal data. And lastly, the new Trade Secrets Directive provides extensive protection of commercial interests attached to the outputs of these processes (e.g. models, algorithms, and inferences).

In this paper we argue that a new data protection right, the ‘right to reasonable inferences’, is needed to help close the accountability gap currently posed by ‘high risk inferences’, meaning inferences that are privacy-invasive or reputation-damaging and have low verifiability in the sense of being predictive or opinion-based. In cases where algorithms draw ‘high risk inferences’ about individuals, this right would require ex-ante justification to be given by the data controller to establish whether an inference is reasonable. This disclosure would address (1) why certain data is a relevant basis to draw inferences; (2) why these inferences are relevant for the chosen processing purpose or type of automated decision; and (3) whether the data and methods used to draw the inferences are accurate and statistically reliable. The ex-ante justification is bolstered by an additional ex-post mechanism enabling unreasonable inferences to be challenged. A right to reasonable inferences must, however, be reconciled with EU jurisprudence and counterbalanced with IP and trade secrets law as well as freedom of expression and Article 16 of the EU Charter of Fundamental Rights: the freedom to conduct a business.

2020 ◽  
Vol 22 (2) ◽  
pp. 139-177
Author(s):  
Niovi Vavoula

Abstract Over the past three decades, an elaborate framework of EU-wide information systems processing the personal data of third-country nationals has emerged. The vast majority of these systems (VIS, Eurodac, EES, ETIAS) are conceptualised as multi-purpose tools, with consultation for crime-related objectives listed among their ancillary objectives. As a result, immigration records may be accessed by national law enforcement authorities and Europol for the purposes of fighting terrorism and other serious crimes under specified and limited conditions. Drawing on the relevant jurisprudence of the European Court, this article evaluates whether the EU rules on law enforcement access to EU immigration databases comply with the rights to respect for private life and protection of personal data, as enshrined in Articles 7 and 8 of the EU Charter respectively. In addition, challenges posed by the forthcoming interoperability between databases are also examined.


Author(s):  
Fabiana Accardo

The purpose of this article is to explain the impact on the European scenario of the landmark decision Schrems v Data Protection Commissioner [Ireland] (Case C-362/14), delivered on 7 October 2015 by the Court of Justice. Starting from a brief analysis of the major outcomes of the Court’s ruling, it then examines the criticalities that the Safe Harbor Agreement and the subsequent Commission adequacy Decision 2000/520/EC – invalidated by the Schrems judgment – had provoked before this ruling, in relation to safeguarding the personal privacy of European citizens when their personal data are transferred outside the European Union, with particular reference to the US context. Moreover, it focuses on the most important aspects of the new EU-US agreement, the Privacy Shield: can it really be considered a safer solution for data sharing in light of the forthcoming implementation of Regulation (EU) 2016/679, which will replace Directive 95/46/EC in EU data protection law?


2007 ◽  
Vol 14 (2) ◽  
pp. 177-187 ◽  
Author(s):  
Deryck Beyleveld ◽  
Mark Taylor

Abstract This paper has three parts. In Part One, we argue that while biological samples and genetic information extracted from them are not (in terms of Directive 95/46/EC) personal data in and of themselves, each is capable of being personal data in appropriate contexts. In Part Two, we argue that if this is correct, then the requirement for sources of human biological samples to give informed consent for any use of their samples (which the European Court of Justice has maintained to be a fundamental principle of EC law but not one to be enforced via patent law) must be enforced by data protection law in the EU. Finally, in Part Three, we consider the implications of our position for the capacity of Directive 95/46/EC to adequately protect third party interests given the shared nature of genetic data.


2019 ◽  
Author(s):  
Peter Kieseberg ◽  
Lukas Daniel Klausner ◽  
Andreas Holzinger

In discussions on the General Data Protection Regulation (GDPR), anonymisation and deletion are frequently mentioned as suitable technical and organisational methods (TOMs) for privacy protection. The major problem of distortion in machine learning environments, as well as related issues with respect to privacy, is rarely mentioned. The Big Data Analytics project addresses these issues.


Author(s):  
Iryna Yavorska ◽  
Ivan Bratsuk

Research into the decisions of the European Court of Justice and of the European Court of Human Rights is crucial in the process of approximating Ukrainian legislation to EU law. This article analyses certain decisions of the Court of Justice of the EU in the area of Personal Data Protection, in particular the main principles of protection. The Court of Justice of the EU forms its decisions on Personal Data Protection as conclusions provided in response to requests for preliminary rulings from national courts. These concern enquiries from citizens on the legality of the processing of their personal data, the terms of response to such enquiries, the terms of citizens’ access to information considered Personal Data, the security of storing Personal Data, restrictions on collecting data, and the provisions in national law on the independence of the bodies responsible for collecting and storing personal data. These conclusions of the Court of Justice of the EU aim to prevent violations of the protection of personal data or of its security, which could lead to accidental or illegal destruction, loss, alteration, or unauthorised access to data. It should be noted that the term Personal Data covers not only the private sphere of citizens but also their professional or civic activity. Key words: personal data; EU; Court of Justice of the EU; EU principles of Personal Data Protection.


2021 ◽  
Vol 6 (5) ◽  
pp. 203-212
Author(s):  
Atiqah Azman ◽  
Nur Shaura Azrin Binti Azman ◽  
Nurul Sahira Binti Kamal Azwan ◽  
Sherie Aneesa Binti Johary Al Bakry ◽  
Wan Nur Afiqah Binti Wan Daud ◽  
...  

Big Data has revolutionized online activities such as marketing and advertising based on individual preferences in the eCommerce industry. In Malaysia, the integration of Big Data into the commercial and business environment is keenly felt through the establishment of the National Big Data Analytics Framework, catalyzing further economic growth in all sectors. However, the distinct features of Big Data spawn issues relating to privacy, such as data profiling, lack of transparency regarding privacy policies, accidental disclosures of data, and false data or false analytics results. Hence, this research provides an insight into the intersection between Big Data and an individual's fundamental rights. The trade-off between breaching and preserving privacy is becoming more intense due to the rapid advancement of Big Data. Using comparative analysis as the data analysis approach, the adequacy of the Malaysian Personal Data Protection Act 2010 (PDPA 2010) in governing the risks of Big Data is evaluated against the European Union General Data Protection Regulation (GDPR). It is hoped that this research will initiate improvements to the legislative framework, provide fundamentals for the formulation of national policy, and support the creation of a specific law on Big Data in Malaysia, which will subsequently benefit industry players and stakeholders.


2020 ◽  
Vol 10 (2) ◽  
pp. 1-6
Author(s):  
John M

An exponentially growing datasphere, a market growing at double-digit rates, increasing interest, many successful projects: when big data is joined by analytics, everything is possible. Big data analytics has become a protagonist of the IT market across all sectors. In the coming years, the job market will require many experts in this field, and many professions will be transformed. But with what impact on data protection? Will the GDPR help balance technology and privacy?

