The Right to Explanation, Explained

Author(s):  
Margot Kaminski

Many have called for algorithmic accountability: laws governing decision-making by complex algorithms, or AI. The EU’s General Data Protection Regulation (GDPR) now establishes exactly this. The recent debate over the right to explanation (a right to information about individual decisions made by algorithms) has obscured the significant algorithmic accountability regime established by the GDPR. The GDPR’s provisions on algorithmic accountability, which include a right to explanation, have the potential to be broader, stronger, and deeper than the preceding requirements of the Data Protection Directive. This Essay clarifies, largely for a U.S. audience, what the GDPR actually requires, incorporating recently released authoritative guidelines.

2020
Author(s):  
Bart van der Sloot

The General Data Protection Regulation in Plain Language is a guide for anyone interested in the much-discussed rules of the GDPR. In this legislation, which became applicable in 2018, the European Union meticulously describes what you can and cannot do with data about other people. Violating these rules can lead to a fine of up to 20 million euros. This book sets out the most important obligations of individuals and organisations that process data about others. These include taking technical security measures, carrying out an impact assessment and registering all data-processing procedures within an organisation. It also discusses the rights of citizens whose data are processed, such as the right to be forgotten, the right to information and the right to data portability.


2021
Vol 46 (3-4)
pp. 321-345
Author(s):  
Robert Grzeszczak
Joanna Mazur

The development of automated decision-making technologies creates the threat of de-iuridification: the replacement of legal acts' provisions with automated, technological solutions. The article examines how selected provisions of the General Data Protection Regulation concerning, among other things, data protection impact assessments, the right not to be subject to automated decision-making, information obligations and the right of access are applied in the Polish national legal order. We focus on the institutional and procedural solutions regarding the involvement of expert bodies and other stakeholders in the specification of the norms included in the GDPR and in their enforcement. We argue that the example of Poland shows that the solutions adopted in the GDPR do not shift the balance of regulatory power over automated decision-making towards other stakeholders and, as such, do not favour a more participative approach to regulatory processes.


2021
Vol 11 (4)
pp. 149
Author(s):  
Emily M. Weitzenboeck

Norway has a high degree of digitalisation. In the public sector, there is a long tradition of automating parts of case management. This includes automation of cases where a public sector body makes a so-called individual administrative decision, that is, a decision made in the exercise of public authority through which the rights or duties of one or more specified private persons are determined. In the last five years, various amendments to public sector legislation have been proposed by a number of government departments and agencies in Norway to ensure that the relevant administrative agency has a legal basis to make fully automated individual decisions. This is challenging from both an administrative law and a data protection law standpoint. Among the main reasons for the move towards fully automated legal decision-making mentioned in the preparatory works to the proposed amendments are greater efficiency in decision-making, equal treatment of citizens and a claim that such decisions will be less prone to error than human decisions. This paper examines this trend in Norway and identifies the statutes and regulations that have been amended or are in the process of being amended. It analyses the measures specified in these amendments to safeguard the individual party's rights, freedoms and legitimate interests. Finally, it discusses the tightrope that must be walked to safeguard important administrative law principles and rules such as protection from arbitrary decisions, the audi alteram partem rule and the right under the European Union's General Data Protection Regulation not to be subject to fully automated decisions.


2020
Vol 11 (1)
pp. 18-50
Author(s):  
Maja Brkan
Grégory Bonnet

Understanding the causes of and correlations behind algorithmic decisions is currently one of the major challenges of computer science, addressed under the umbrella term "explainable AI" (XAI). Being able to explain an AI-based system may help to make algorithmic decisions more satisfying and acceptable, to better control and update AI-based systems in case of failure, to build more accurate models, and to discover new knowledge directly or indirectly. On the legal side, the question of whether the General Data Protection Regulation (GDPR) provides data subjects with a right to explanation in the case of automated decision-making has likewise been the subject of heated doctrinal debate. While arguing that the right to explanation in the GDPR should be derived from an interpretative analysis of several GDPR provisions read jointly, the authors move this debate forward by discussing the technical and legal feasibility of explaining algorithmic decisions. Legal limits, in particular the secrecy of algorithms, as well as technical obstacles, could obstruct the practical implementation of this right. Adopting an interdisciplinary approach, the authors explore not only whether it is possible to translate the EU legal requirements for an explanation into actual machine learning decision-making, but also whether those limitations can shape the way the legal right is used in practice.
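To make the technical side of this debate concrete, the sketch below is a minimal, hypothetical illustration (not a method proposed by the authors) of one form an explanation of an automated decision could take: ranking each feature's contribution to a single prediction of a simple scikit-learn logistic-regression model. The feature names and training data are invented for the example.

```python
# Minimal sketch: a per-decision explanation for a logistic-regression model.
# All data and feature names below are hypothetical, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
feature_names = ["income", "debt_ratio", "late_payments"]  # hypothetical features

# Hypothetical training data: 200 applicants, 3 features, binary outcome.
X = rng.normal(size=(200, 3))
y = (X[:, 0] - X[:, 1] - X[:, 2] + rng.normal(scale=0.5, size=200) > 0).astype(int)

model = LogisticRegression().fit(X, y)

def explain_decision(x):
    """Return the decision for one applicant and each feature's contribution
    (coefficient * feature value) to the log-odds score, ranked by magnitude."""
    contributions = model.coef_[0] * x
    decision = int(model.predict(x.reshape(1, -1))[0])
    ranked = sorted(zip(feature_names, contributions), key=lambda t: -abs(t[1]))
    return decision, ranked

decision, reasons = explain_decision(X[0])
print("decision:", "granted" if decision == 1 else "refused")
for name, contribution in reasons:
    print(f"  {name}: {contribution:+.2f}")
```

For more complex, opaque models this simple coefficient-based reading is unavailable, and model-agnostic techniques (for example surrogate models or perturbation-based attributions) would be needed; that gap is where the legal and technical feasibility questions raised in the article become acute.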


This book provides an article-by-article commentary on the new EU General Data Protection Regulation. Adopted in April 2016 and applicable from May 2018, the GDPR is the centrepiece of the recent reform of the EU regulatory framework for the protection of personal data. It replaces the 1995 EU Data Protection Directive and has become the most significant piece of data protection legislation anywhere in the world. The book is edited by three leading authorities and written by a team of expert specialists in the field from around the EU, representing different sectors (including academia, the EU institutions, data protection authorities and the private sector), and thus provides a pan-European analysis of the GDPR. It examines each article of the GDPR in sequential order and explains how its provisions work, allowing the reader to quickly and easily grasp the meaning of individual articles. An introductory chapter provides an overview of the background to the GDPR and its place in the broader structure of EU law and human rights law. Account is also taken of closely linked legal instruments, such as the Directive on Data Protection and Law Enforcement adopted concurrently with the GDPR, and of the ongoing work on the proposed new E-Privacy Regulation.


2021
pp. 77-91
Author(s):  
Kieron O’Hara

This chapter describes the Brussels Bourgeois Internet. The ideal consists of positive, managed liberty in which the rights of others are respected, as in the bourgeois public space, where liberty follows only once rights are secured. The exemplar of this approach is the European Union, which uses administrative means, soft law, and regulation to project its vision across the Internet. Privacy and data protection have become the most emblematic struggles. Under the Data Protection Directive of 1995, the European Union developed data-protection law and numerous privacy rights, including a right to be forgotten, won in a case against Google Spain in 2014, the arguments of which are dissected. The General Data Protection Regulation (GDPR) followed in 2018, amplifying this approach. The GDPR is having the effect of enforcing European data-protection law on international players (the 'Brussels effect'), while the European Union has over the years developed unmatched expertise in data-protection law.


AJIL Unbound
2020
Vol 114
pp. 5-9
Author(s):  
Cedric Ryngaert
Mistale Taylor

The deterritorialization of the Internet and international communications technology has given rise to acute jurisdictional questions regarding who may regulate online activities. In the absence of a global regulator, states act unilaterally, applying their own laws to transborder activities. The EU's “extraterritorial” application of its data protection legislation—initially the Data Protection Directive (DPD) and, since 2018, the General Data Protection Regulation (GDPR)—is a case in point. The GDPR applies to “the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to: (a) the offering of goods or services . . . to such data subjects in the Union; or (b) the monitoring of their behaviour . . . within the Union.” It also conditions data transfers outside the EU on third states having adequate (meaning essentially equivalent) data protection standards. This essay outlines forms of extraterritoriality evident in EU data protection law, which could be legitimized by certain fundamental rights obligations. It then looks at how the EU balances data protection with third states’ countervailing interests. This approach can involve burdens not only for third states or corporations, but also for the EU political branches themselves. EU law viewed through the lens of public international law shows how local regulation is going global, despite its goal of protecting only EU data subjects.


2019
pp. 245-259
Author(s):  
Bernard Łukanko

The study is concerned with the mutual relationship between failure to comply with the laws on personal data protection and the regulations relating to the protection of personal interests, including in particular the right to privacy. The article presents the views held by the Supreme Court with respect to the possibility of treating acts infringing the provisions of the Personal Data Protection Act of 1997 (until 24 May 2018) and of the General Data Protection Regulation (after 25 May 2018) as violations of personal interests, such as the right to privacy. The author shares the view expressed in the case law that, if in specific circumstances the processing of personal data violates the right to privacy, the party concerned may seek a remedy on the grounds of Articles 23 and 24 of the Polish Civil Code. This position remains relevant after the entry into force of the GDPR, which regulates civil-law liability for infringements of its provisions in a comprehensive and exhaustive manner, directly applicable in all Member States. According to the position expressed in the professional literature, however, this does not exclude the concurrence of claims where the same event also violates the provisions on the protection of personal interests. In cases of improper processing of personal data, the remedies available under domestic law on the protection of personal interests may be of particular importance outside the subject-matter scope of the GDPR's applicability.

