Algorithms, artificial intelligence and automated decisions concerning workers and the risks of discrimination: the necessary collective governance of data protection

2019 ◽  
Vol 25 (4) ◽  
pp. 465-481 ◽  
Author(s):  
Adrián Todolí-Signes

Big data, algorithms and artificial intelligence now allow employers to process information on their employees and potential employees in a far more efficient manner and at a much lower cost than in the past. This makes it possible to profile workers automatically and even allows technology itself to replace human resources personnel in making decisions that have legal effects on employees (recruitment, promotion, dismissals, etc.). This entails great risks of worker discrimination and defencelessness, with workers unaware of the reasons underlying any such decision. This article analyses the protections established in the EU General Data Protection Regulation (GDPR) for safeguarding employees against discrimination. One of the main conclusions that can be drawn is that, in the face of the inadequacy of the GDPR in the field of labour relations, there is a need for the collective governance of workplace data protection, requiring the participation of workers’ representatives in establishing safeguards.

Pro Futuro ◽  
2021 ◽  
Vol 10 (4) ◽  
Author(s):  
Gizem Gültekin Várkonyi

This paper presents a general overview of the problems concerning the regulation of artificial intelligence (AI) raised in the official published works of the European Union (EU) and interprets these problems from the perspective of Hungarian experts as a case study. Even though a new regulation on AI has already been proposed at the EU level, the paper evaluates specific rules and principles regarding data protection, since data is the lifeblood of AI systems and the protection of such data is a fundamental right enshrined in EU legislation via the General Data Protection Regulation (GDPR). The results of the study show that the efficient and uniform application of the GDPR to AI systems might be at stake, since the experts produced differing answers to the same legal questions arising from a presented scenario.


2018 ◽  
Vol 25 (3) ◽  
pp. 284-307
Author(s):  
Giovanni Comandè ◽  
Giulia Schneider

Abstract Health data are the most special of the ‘special categories’ of data under Art. 9 of the General Data Protection Regulation (GDPR). The same Art. 9 GDPR prohibits, with broad exceptions, the processing of ‘data concerning health’. Our thesis is that, through data mining technologies, health data have progressively undergone a process of distancing from the healthcare sphere as far as the generation, the processing and the uses are concerned. The case study aims thus to test the endurance of the ‘special category’ of health data in the face of data mining technologies and the never-ending lifecycles of health data they feed. At a more general level of analysis, the case of health data shows that data mining techniques challenge core data protection notions, such as the distinction between sensitive and non-sensitive personal data, requiring a shift in terms of systemic perspectives that the GDPR only partly addresses.


This new book provides an article-by-article commentary on the new EU General Data Protection Regulation. Adopted in April 2016 and applicable from May 2018, the GDPR is the centrepiece of the recent reform of the EU regulatory framework for protection of personal data. It replaces the 1995 EU Data Protection Directive and has become the most significant piece of data protection legislation anywhere in the world. This book is edited by three leading authorities and written by a team of expert specialists in the field from around the EU and representing different sectors (including academia, the EU institutions, data protection authorities, and the private sector), thus providing a pan-European analysis of the GDPR. It examines each article of the GDPR in sequential order and explains how its provisions work, thus allowing the reader to easily and quickly elucidate the meaning of individual articles. An introductory chapter provides an overview of the background to the GDPR and its place in the greater structure of EU law and human rights law. Account is also taken of closely linked legal instruments, such as the Directive on Data Protection and Law Enforcement that was adopted concurrently with the GDPR, and of the ongoing work on the proposed new E-Privacy Regulation.


AJIL Unbound ◽  
2020 ◽  
Vol 114 ◽  
pp. 5-9 ◽  
Author(s):  
Cedric Ryngaert ◽  
Mistale Taylor

The deterritorialization of the Internet and international communications technology has given rise to acute jurisdictional questions regarding who may regulate online activities. In the absence of a global regulator, states act unilaterally, applying their own laws to transborder activities. The EU's “extraterritorial” application of its data protection legislation—initially the Data Protection Directive (DPD) and, since 2018, the General Data Protection Regulation (GDPR)—is a case in point. The GDPR applies to “the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to: (a) the offering of goods or services . . . to such data subjects in the Union; or (b) the monitoring of their behaviour . . . within the Union.” It also conditions data transfers outside the EU on third states having adequate (meaning essentially equivalent) data protection standards. This essay outlines forms of extraterritoriality evident in EU data protection law, which could be legitimized by certain fundamental rights obligations. It then looks at how the EU balances data protection with third states’ countervailing interests. This approach can involve burdens not only for third states or corporations, but also for the EU political branches themselves. EU law viewed through the lens of public international law shows how local regulation is going global, despite its goal of protecting only EU data subjects.


2019 ◽  
Vol 6 (1) ◽  
pp. 205395171986054 ◽  
Author(s):  
Heike Felzmann ◽  
Eduard Fosch Villaronga ◽  
Christoph Lutz ◽  
Aurelia Tamò-Larrieux

Transparency is now a fundamental principle for data processing under the General Data Protection Regulation. We explore what this requirement entails for artificial intelligence and automated decision-making systems. We address the topic of transparency in artificial intelligence by integrating legal, social, and ethical aspects. We first investigate the ratio legis of the transparency requirement in the General Data Protection Regulation and its ethical underpinnings, showing its focus on the provision of information and explanation. We then discuss the pitfalls with respect to this requirement by focusing on the significance of contextual and performative factors in the implementation of transparency. We show that the human–computer interaction and human–robot interaction literature does not provide clear results with respect to the benefits of transparency for users of artificial intelligence technologies, due to the impact of a wide range of contextual factors, including performative aspects. We conclude by integrating the information- and explanation-based approach to transparency with the critical contextual approach, proposing that transparency as required by the General Data Protection Regulation may in itself be insufficient to achieve the positive goals associated with transparency. Instead, we propose to understand transparency relationally, where information provision is conceptualized as communication between technology providers and users, and where assessments of trustworthiness based on contextual factors mediate the value of transparency communications. This relational concept of transparency points to future research directions for the study of transparency in artificial intelligence systems and should be taken into account in policymaking.


2020 ◽  
Vol 48 (S1) ◽  
pp. 187-195
Author(s):  
Edward S. Dove ◽  
Jiahong Chen

In this article, we consider the possible application of the European General Data Protection Regulation (GDPR) to “citizen scientist”-led health research with mobile devices. We argue that the GDPR likely does cover this activity, depending on the specific context and the territorial scope. Remaining open questions that result from our analysis lead us to call for a lex specialis that would provide greater clarity and certainty regarding the processing of health data for research purposes, including by these non-traditional researchers.


2019 ◽  
Vol 16 (1) ◽  
pp. 158-191 ◽  
Author(s):  
Christopher Kuner

The importance of personal data processing for international organizations (‘IOs’) demonstrates the need for them to implement data protection in their work. The EU General Data Protection Regulation (‘GDPR’) will be influential around the world, and will impact IOs as well. Its application to them should be determined under relevant principles of EU law and public international law, and it should be interpreted consistently with the international obligations of the EU and its Member States. However, IOs should implement data protection measures regardless of whether the GDPR applies to them in a legal sense. There is a need for EU law and international law to take each other better into account, so that IOs can enjoy their privileges and immunities also with regard to EU law and avoid conflicts with international law, while still providing a high level of data protection in their operations.


IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 111709-111726 ◽  
Author(s):  
Mamoona N. Asghar ◽  
Nadia Kanwal ◽  
Brian Lee ◽  
Martin Fleury ◽  
Marco Herbst ◽  
...  
