Minding the Machine v2.0

2019 ◽  
pp. 248-262
Author(s):  
Lee A. Bygrave

This chapter focuses on Articles 22 and 25 of the EU’s General Data Protection Regulation (Regulation 2016/679). It examines how these provisions will impact automated decisional systems. Article 22 gives a person a qualified right ‘not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her’. Article 25 imposes a duty on controllers of personal data to implement technical and organizational measures so that the processing of the data will meet the Regulation’s requirements and otherwise ensure protection of the data subject’s rights. Both sets of rules are aimed squarely at subjecting automated decisional systems to greater accountability. The chapter argues that the rules suffer from significant weaknesses that are likely to hamper their ability to meet this aim.

2020 ◽  
pp. 155-186
Author(s):  
María Dolores Mas Badia

Despite the differences between credit risk and insurance risk, in many countries large insurance companies include credit history among the information taken into account when assigning consumers to risk pools, deciding whether or not to offer them an auto or homeowner insurance policy, or determining the premium they should pay. In this study, I try to establish some conclusions concerning the requirements and limits to which the use of credit history data by insurers in the European Union should be subject. To do this, I focus primarily on Regulation (EU) 2016/679. This regulation, which has applied since 25 May 2018, not only forms the backbone of personal data protection in the EU, but is also set to become a model for regulation beyond the borders of the Union. This article concentrates on two main aspects: the lawful basis for the processing of credit history data by insurers, and the rules that should apply to decisions based solely on automated processing, including profiling.
Received: 30 December 2019 ◽ Accepted: 07 February 2020 ◽ Published online: 02 April 2020


Author(s):  
Ammar Younas

The increasing ‘datafication of society’ and ubiquitous computing have resulted in high privacy risks such as commercial exploitation of personal data, discrimination, identity theft and profiling (automated processing of personal data). Minor data subjects in particular are more likely to be victims of unfair commercial practices, owing to their behavioural characteristics (emotional volatility and impulsiveness) and their unawareness of the consequences of their virtual activities. Accordingly, it has been claimed that thousands of mobile apps used by children collected their data, used it to track their location, processed it to develop child profiles so as to tailor behavioural advertising targeted at them, and shared it with third parties without the children’s or parents’ knowledge. Following these concerns, the recently adopted EU General Data Protection Regulation (2016/679) departed from the Data Protection Directive (DPD) in terms of children’s data protection by explicitly recognizing that minors need more protection than adults and by providing specific provisions aimed at protecting children’s right to data protection. Unlike the GDPR, the DPD was designed to provide “equal” protection for all data subjects irrespective of their age. This paper argues that the consent principle, along with the requirement of parental consent, cannot be implemented effectively for the protection of children’s data, owing to the lack of actual choice, verification issues and the complexity of data processing; the impact of privacy notices in a child-appropriate form is also limited. However, other mechanisms and restrictions embodied in the GDPR provide opportunities for the protection of children’s data by placing the burden on data controllers rather than data subjects.


2019 ◽  
pp. 79-101 ◽  
Author(s):  
Aleksandra Pyka

This article addresses personal data processing conducted in connection with scientific research under Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). It is not uncommon for scientific research to involve the processing of personal data, which entails the obligation to respect the rights of the data subjects involved. Entities conducting scientific research that process personal data for this purpose are required to apply the general regulation governing, among other things, the obligations imposed on controllers. The processing of personal data for scientific research purposes has also been regulated in national legislation in connection with the need to apply the General Data Protection Regulation. The article discusses the grounds for the admissibility of data processing for the needs of scientific research; the provision of personal data concerning criminal convictions and offences, extracted from public registers at the request of the entity conducting scientific research; the exercise of the rights of the data subjects concerned; and the implementation of appropriate technical and organizational measures to ensure the security of data processing. In addition, the article discusses the anonymization of personal data carried out once the purpose of the processing has been achieved, as well as the processing of special categories of personal data. These topics are not treated exhaustively, as that would require a publication of much greater length.
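One of the "technical and organizational measures" commonly discussed in this context is keyed pseudonymization of subject identifiers before research use. The sketch below is illustrative only and not taken from the article; the key, field names and values are hypothetical. A keyed HMAC (rather than a plain hash) keeps pseudonyms consistent across datasets while preventing re-identification by anyone who lacks the key, which matters when the identifier space is small and guessable.

```python
import hmac
import hashlib

# Hypothetical key, held by the controller separately from the research data.
SECRET_KEY = b"research-project-key"

def pseudonymize(identifier: str) -> str:
    # HMAC-SHA256 maps each identifier to a stable 64-hex-character pseudonym.
    # Plain SHA-256 of a small ID space could be reversed by brute force;
    # the secret key prevents that.
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"subject_id": "patient-0042", "measurement": 7.3}
safe_record = {**record, "subject_id": pseudonymize(record["subject_id"])}
```

Note that under the GDPR pseudonymized data remain personal data as long as the key exists; only irreversible anonymization takes the data outside the Regulation's scope.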


Author(s):  
Raphaël Gellert

The main goal of this book is to provide an understanding of what is commonly referred to as “the risk-based approach to data protection”, an expression that came to the fore during the overhaul of the EU’s General Data Protection Regulation (GDPR), even though it can also be found in other statutes under different meanings. At its core, it consists in endowing the regulated organisations that process personal data with increased responsibility for complying with data protection mandates. Such increased compliance duties are performed through risk management tools. The book addresses this topic from various perspectives. In framing the risk-based approach as the latest in a series of regulation models, it analyses data protection law from the perspective of regulation theory as well as the risk and risk management literatures, and their mutual interlinkages. Further, it provides an overview of the policy developments that led to the adoption of such an approach, which it discusses in the light of regulation theory. It also includes discussions of the risk-based approach’s scope and meaning, the way it has been taken up in statutes, including key provisions such as accountability and data protection impact assessments, and its potential and limitations. Finally, it analyses how the risk-based approach can be implemented in practice by providing technical analyses of various data protection risk management methodologies.


2021 ◽  
Vol 11 (10) ◽  
pp. 4537
Author(s):  
Christian Delgado-von-Eitzen ◽  
Luis Anido-Rifón ◽  
Manuel J. Fernández-Iglesias

In recent years, blockchain technologies have attracted the interest of actors in various sectors, among them education, where their application is being studied to improve the traceability, accountability, and integrity of information while guaranteeing its privacy, transparency, robustness, trustworthiness, and authenticity. Several interesting proposals and projects have been launched and are currently under development. Nevertheless, some issues remain inadequately addressed, such as scalability, privacy, and compliance with international regulations such as the General Data Protection Regulation in Europe. This paper analyses the application of blockchain technologies, and the related challenges, to the issuance and verification of educational data, and proposes an innovative solution to tackle them. The proposed model supports the issuance, storage, and verification of different types of academic information, both formal and informal, and complies with applicable regulations, protecting the privacy of users’ personal data. The proposal also addresses the scalability challenges and paves the way for a global academic certification system.
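A common pattern for reconciling blockchain-based credential verification with data protection rules is to keep personal data off-chain and record only salted fingerprints of credentials on the ledger. The sketch below is a minimal illustration of that general pattern, not the authors' system; the registry, field names and salt are hypothetical, and a Python set stands in for an append-only blockchain.

```python
import hashlib
import json

# Stand-in for an append-only on-chain registry: it stores only hashes,
# never the credential itself, so no personal data is written to the chain.
ledger = set()

def fingerprint(credential: dict, salt: str) -> str:
    # Canonical JSON serialization makes the hash independent of key order;
    # the salt prevents dictionary attacks against guessable credentials.
    payload = json.dumps(credential, sort_keys=True) + salt
    return hashlib.sha256(payload.encode()).hexdigest()

def issue(credential: dict, salt: str) -> None:
    # The issuing institution anchors the credential's fingerprint on-chain.
    ledger.add(fingerprint(credential, salt))

def verify(credential: dict, salt: str) -> bool:
    # The holder presents the credential and salt off-chain; any verifier
    # recomputes the fingerprint and checks it against the registry.
    return fingerprint(credential, salt) in ledger

diploma = {"holder": "Alice", "degree": "MSc Informatics", "year": 2021}
issue(diploma, salt="r4nd0m")
print(verify(diploma, "r4nd0m"))                       # prints True
print(verify({**diploma, "year": 2020}, "r4nd0m"))     # tampered, prints False
```

Because the ledger holds only hashes, discarding the off-chain credential and salt leaves nothing personally identifiable on-chain, which is one way such designs attempt to accommodate GDPR erasure requests despite the ledger's immutability.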


Author(s):  
Michael Veale ◽  
Reuben Binns ◽  
Lilian Edwards

Many individuals are concerned about the governance of machine learning systems and the prevention of algorithmic harms. The EU's recent General Data Protection Regulation (GDPR) has been seen as a core tool for achieving better governance of this area. While the GDPR does apply to the use of models in some limited situations, most of its provisions relate to the governance of personal data, while models have traditionally been seen as intellectual property. We present recent work from the information security literature on ‘model inversion’ and ‘membership inference’ attacks, which indicates that the process of turning training data into machine-learned systems is not one-way, and demonstrate how this could lead some models to be legally classified as personal data. Taking this as a probing experiment, we explore the different rights and obligations this would trigger and their utility, and posit future directions for algorithmic governance and regulation. This article is part of the theme issue ‘Governing artificial intelligence: ethical, legal, and technical opportunities and challenges’.
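The core intuition behind a membership inference attack is that models often behave more confidently on inputs they were trained on, so an attacker with only query access can guess whether a given record was in the training set. The toy sketch below is not from the article: it uses a deliberately extreme "model" (a lookup table that fully memorizes its training data) and hypothetical feature values to make the confidence gap, and hence the leakage of personal data, easy to see.

```python
# Toy membership inference demo: the "model" memorizes training records,
# so its output confidence alone reveals training-set membership.

def train(records):
    # The model is a lookup table from feature tuples to labels:
    # an extreme case of the memorization real models exhibit in part.
    return dict(records)

def predict(model, x):
    # Returns (label, confidence). Memorized inputs come back with
    # confidence 1.0; unseen inputs get a low-confidence default guess.
    if x in model:
        return model[x], 1.0
    return "unknown", 0.5

def membership_inference(model, x, threshold=0.9):
    # The attacker never sees the training data, only the model's outputs:
    # high confidence is taken as evidence that x was a training record.
    _, conf = predict(model, x)
    return conf >= threshold

training_data = [(("age:34", "zip:90210"), "high_risk")]
model = train(training_data)

print(membership_inference(model, ("age:34", "zip:90210")))  # prints True
print(membership_inference(model, ("age:51", "zip:10001")))  # prints False
```

Real attacks work against models that only partially memorize, using shadow models and calibrated thresholds rather than exact lookups, but the legal point is the same: if membership of an identifiable person's record can be inferred from the model, the model itself carries personal data.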


2018 ◽  
Vol 25 (3) ◽  
pp. 284-307
Author(s):  
Giovanni Comandè ◽  
Giulia Schneider

Health data are the most special of the ‘special categories’ of data under Art. 9 of the General Data Protection Regulation (GDPR). The same Art. 9 GDPR prohibits, with broad exceptions, the processing of ‘data concerning health’. Our thesis is that, through data mining technologies, health data have progressively undergone a process of distancing from the healthcare sphere as far as their generation, processing and uses are concerned. The case study thus aims to test the endurance of the ‘special category’ of health data in the face of data mining technologies and the never-ending lifecycles of health data they feed. At a more general level of analysis, the case of health data shows that data mining techniques challenge core data protection notions, such as the distinction between sensitive and non-sensitive personal data, requiring a shift in systemic perspective that the GDPR only partly addresses.


Hypertension ◽  
2021 ◽  
Vol 77 (4) ◽  
pp. 1029-1035
Author(s):  
Antonia Vlahou ◽  
Dara Hallinan ◽  
Rolf Apweiler ◽  
Angel Argiles ◽  
Joachim Beige ◽  
...  

The General Data Protection Regulation (GDPR) became binding law in the European Union Member States in 2018, as a step toward harmonizing personal data protection legislation in the European Union. The Regulation governs almost all types of personal data processing, hence, also, those pertaining to biomedical research. The purpose of this article is to highlight the main practical issues related to data and biological sample sharing that biomedical researchers face regularly, and to specify how these are addressed in the context of GDPR, after consulting with ethics/legal experts. We identify areas in which clarifications of the GDPR are needed, particularly those related to consent requirements by study participants. Amendments should target the following: (1) restricting exceptions based on national laws and increasing harmonization, (2) confirming the concept of broad consent, and (3) defining a roadmap for secondary use of data. These changes will be achieved by acknowledged learned societies in the field taking the lead in preparing a document giving guidance for the optimal interpretation of the GDPR, which will be finalized following a period of commenting by a broad multistakeholder audience. In parallel, promoting engagement and education of the public in the relevant issues (such as different consent types or residual risk for re-identification), on both local/national and international levels, is considered critical for advancement. We hope that this article will open this broad discussion involving all major stakeholders, toward optimizing the GDPR and allowing a harmonized transnational research approach.


This new book provides an article-by-article commentary on the new EU General Data Protection Regulation. Adopted in April 2016 and applicable from May 2018, the GDPR is the centrepiece of the recent reform of the EU regulatory framework for protection of personal data. It replaces the 1995 EU Data Protection Directive and has become the most significant piece of data protection legislation anywhere in the world. This book is edited by three leading authorities and written by a team of expert specialists in the field from around the EU and representing different sectors (including academia, the EU institutions, data protection authorities, and the private sector), thus providing a pan-European analysis of the GDPR. It examines each article of the GDPR in sequential order and explains how its provisions work, thus allowing the reader to easily and quickly elucidate the meaning of individual articles. An introductory chapter provides an overview of the background to the GDPR and its place in the greater structure of EU law and human rights law. Account is also taken of closely linked legal instruments, such as the Directive on Data Protection and Law Enforcement that was adopted concurrently with the GDPR, and of the ongoing work on the proposed new E-Privacy Regulation.


Author(s):  
Yola Georgiadou ◽  
Rolf de By ◽  
Ourania Kounadi

The General Data Protection Regulation (GDPR) protects the personal data of natural persons and at the same time allows the free movement of such data within the European Union (EU). Hailed as majestic by admirers and dismissed as protectionist by critics, the Regulation is expected to have a profound impact around the world, including in the African Union (AU). For European–African consortia conducting research that may affect the privacy of African citizens, the question is ‘how to protect personal data of data subjects while at the same time ensuring a just distribution of the benefits of a global digital ecosystem?’ We use location privacy as a point of departure, because information about an individual’s location is different from other kinds of personally identifiable information. We analyse privacy at two levels, individual and cultural. Our perspective is interdisciplinary: we draw from computer science to describe three scenarios of transformation of volunteered/observed information to inferred information about a natural person and from cultural theory to distinguish four privacy cultures emerging within the EU in the wake of GDPR. We highlight recent data protection legislation in the AU and discuss factors that may accelerate or inhibit the alignment of data protection legislation in the AU with the GDPR.

