Slave to the Algorithm? Why a 'right to an explanation' is probably not the remedy you are looking for

Author(s): Lilian Edwards, Michael Veale

Cite as: Lilian Edwards and Michael Veale, 'Slave to the Algorithm? Why a "right to an explanation" is probably not the remedy you are looking for' (2017) 16 Duke Law and Technology Review 18–84. (First posted on SSRN 24 May 2017.)

Algorithms, particularly machine learning (ML) algorithms, are increasingly important to individuals' lives, but have caused a range of concerns revolving mainly around unfairness, discrimination and opacity. Transparency in the form of a "right to an explanation" has emerged as a compellingly attractive remedy since it intuitively promises to "open the black box" to promote challenge, redress, and hopefully heightened accountability. Amidst the general furore over algorithmic bias we describe, any remedy in a storm has looked attractive.

However, we argue that a right to an explanation in the EU General Data Protection Regulation (GDPR) is unlikely to present a complete remedy to algorithmic harms, particularly in some of the core "algorithmic war stories" that have shaped recent attitudes in this domain. Firstly, the law is restrictive, unclear, or even paradoxical concerning when any explanation-related right can be triggered. Secondly, even navigating this, the legal conception of explanations as "meaningful information about the logic of processing" may not be provided by the kind of ML "explanations" computer scientists have developed, partially in response. ML explanations are restricted both by the type of explanation sought, the dimensionality of the domain and the type of user seeking an explanation. However, "subject-centric" explanations (SCEs), focussing on particular regions of a model around a query, show promise for interactive exploration, as do explanation systems based on learning a model from outside rather than taking it apart (pedagogical vs decompositional explanations) in dodging developers' worries of IP or trade secrets disclosure.

Based on our analysis, we fear that the search for a "right to an explanation" in the GDPR may be at best distracting, and at worst nurture a new kind of "transparency fallacy." But all is not lost. We argue that other parts of the GDPR related (i) to the right to erasure ("right to be forgotten") and the right to data portability; and (ii) to privacy by design, Data Protection Impact Assessments and certification and privacy seals, may have the seeds we can use to make algorithms more responsible, explicable, and human-centred.

2020, Vol 3 (1), pp. 17
Author(s): Kajcsa Andrea

The changes brought about by the General Data Protection Regulation since May 2018 are complex and ambitious. The General Data Protection Regulation is one of the most wide-ranging pieces of legislation passed by the EU in recent years, and it introduces many concepts that are yet to be fully explored in practice, such as the right to be forgotten, data portability and data breach notification. This paper analyzes the main obligations that public bodies, in particular, bear now that the GDPR has entered into force, and evaluates the impact this legislative act has on the routine activities carried out by public authorities in Romania. To reach our goal, we will refer to the obligations that are specific to public administration authorities as well as to those from which public bodies are exempted. We will also analyze the national legislative measures adopted in Romania after the GDPR entered into force, and the degree to which they have particularized the way public bodies are allowed and obliged to process personal data in Romania.


Author(s): Anabelen Casares Marcos

The right to informational self-determination has raised bitter debate over the last decade as to the opportunity and possible scope of a right to demand the withdrawal from the internet of personal information which, although true, may represent a detriment that the person concerned has no legal duty to endure. The leading case on this topic is that of Mario Costeja, Judgment of the EU Court of Justice of 13 May 2014. The interest of recent European jurisprudence lies not so much in the recognition of such a right as in the appreciation of certain limits to its implementation, assisting data protection authorities in balancing the rights at stake in each case. Reflection on the current status of the issue considers the rights and duties imposed in this matter by Regulation (EU) 2016/679 of 27 April 2016, known as the new General Data Protection Regulation.


2020, Vol 9 (1), pp. 86-101
Author(s): Aleksandra Gebuza

The main aim of the article is to provide an analysis of the notion of the right to be forgotten as developed by the CJEU in its ruling in Google v. AEPD & Gonzalez and by the General Data Protection Regulation, within the context of the processing of personal data on the Internet. The analysis compares the European and American approaches to the notion in jurisprudence and doctrine, in order to demonstrate the scale of difficulty in applying the concept in practice.


2020
Author(s): Bart Sloot

The General Data Protection Regulation in Plain Language is a guide for anyone interested in the much-discussed rules of the GDPR. In this legislation, which came into force in 2018, the European Union meticulously describes what you can and cannot do with data about other people. Violating these rules can lead to a fine of up to 20 million euros. This book sets out the most important obligations of individuals and organisations that process data about others. These include taking technical security measures, carrying out an impact assessment and registering all data-processing procedures within an organisation. It also discusses the rights of citizens whose data are processed, such as the right to be forgotten, the right to information and the right to data portability.


2017, Vol 19 (5), pp. 765-779
Author(s): Milda Macenaite

The new European Union (EU) General Data Protection Regulation aims to adapt children's right to privacy to the 'digital age'. It explicitly recognizes that children deserve specific protection of their personal data, and introduces additional rights and safeguards for children. This article explores the dilemmas that the introduction of the child-tailored online privacy protection regime creates – the 'empowerment versus protection' and the 'individualized versus average child' dilemmas. It concludes that by favouring protection over the empowerment of children, the Regulation risks limiting children's online opportunities, and that by relying on an average child criterion, it fails to consider the evolving capacities and best interests of the child.


2020, pp. 146144482093403
Author(s): Sarah Turner, July Galindo Quintero, Simon Turner, Jessica Lis, Leonie Maria Tanczer

The right to data portability (RtDP), as outlined in the European Union’s General Data Protection Regulation (GDPR), enables data subjects to transmit their data from one service to another. This is of particular interest in the evolving Internet of Things (IoT) environment. This research delivers the first empirical analysis detailing the exercisability of the RtDP in the context of consumer IoT devices and the information provided to users about exercising the right. In Study 1, we reviewed 160 privacy policies of IoT producers to understand the level of information provided to a data subject. In Study 2, we tested four widely available IoT systems to examine whether procedures are in place to enable users to exercise the RtDP. Both studies showcase how the RtDP is not yet exercisable in the IoT environment, risking consumers being unable to unlock the long-term benefits of IoT systems.


Author(s): Giovanni Sartor

This chapter explores the connection between host providers’ liability and data protection, particularly the right to be forgotten. A conceptual analysis provides basic ideas including privacy, publicity, and neutrality. Subsequently, host providers’ immunities in EU law are compared with safe harbour provisions in US law. Data protection exceptionalism, namely, the view that providers’ immunities do not apply to violations of data protection, is critically considered. Knowledge of illegality of hosted content as a condition for providers’ liability is examined, focusing on how different understandings of this requirement may affect providers’ behaviour. The EU General Data Protection Regulation is then considered, addressing the way it defines the interface between data protection and the role/liabilities of providers. Finally, an analysis of the right to be forgotten is proposed, focusing on how the passage of time affects the legally relevant interests involved and on how sanctions are likely to affect the actions of host providers/users.


Author(s): Ciara Staunton

The coming into force of the General Data Protection Regulation (GDPR) on 25 May 2018 has brought about considerable changes in how data may be collected, stored and used. Biobanks, which require the collection, use and re-use of large quantities of biological samples and data, will be affected by these changes. In seeking to require 'data protection by design', the GDPR provides data subjects with certain individual rights: the right to be informed, the right of access, the right to rectification, the right to erasure, the right to restrict processing, the right to data portability, the right to object, and rights in relation to automated decision making and profiling.

This chapter will consider each of these individual rights in turn and discuss their impact on biobank research. In particular, it will discuss the challenges that biobanks now face in upholding these individual rights, the limits of these rights in light of the technical realities of biobanks, and the potential impact that they may have on the collection, sharing, use and re-use of biological data and material.


2021
Author(s): Aurelia Tamo-Larrieux, Zaira Zihlmann, Kimberly Garcia, Simon Mayer

Using a digital service is often framed in a binary way: Either one agrees to the service provider's data processing practices, and is granted access to the service, or one does not, and is denied the service. Many scholars have lamented these ‘take-it-or-leave-it’ situations, as this goes against the ideals of data protection law. To address this inadequacy, computer scientists and legal scholars have tried to come up with approaches to enable more privacy-friendly products and services. In this article, we call for a right to customize the processing of user data. Our arguments build upon technology-driven approaches as well as on the ideals of privacy by design and the now codified data protection by design and default norm within the General Data Protection Regulation. In addition, we draw upon the right to repair that is propagated to empower consumers and enable a more circular economy. We propose two technologically-oriented approaches, termed ‘variants’ and ‘alternatives’ that could enable the technical implementation of a right to customization. We posit that these approaches cannot be demanded without limitation, and that restrictions will depend on how reasonable a customization demand is.

