Soft ethics, the governance of the digital and the General Data Protection Regulation

Author(s):  
Luciano Floridi

The article discusses the governance of the digital as the new challenge posed by technological innovation. It then introduces a new distinction between soft ethics, which applies after legal compliance with legislation, such as the General Data Protection Regulation in the European Union, and hard ethics, which precedes and contributes to shaping legislation. It concludes by developing an analysis of the role of digital ethics with respect to digital regulation and digital governance. This article is part of the theme issue ‘Governing artificial intelligence: ethical, legal, and technical opportunities and challenges’.

Author(s):  
Meg Leta Jones ◽  
Elizabeth Edenberg

This chapter addresses the controversy over the role of consent in data protection, as artificial intelligence systems have proliferated in people’s daily lives. Digital consent has been criticized as a meaningless, procedural act because users encounter so many different, long, and complicated terms of service that do not help them effectively assess potential harms or threats. AI systems have played a role in exacerbating existing issues, creating new challenges, and presenting alternative solutions. Most of the critiques and cures for this broken arrangement address choice-making, not consent. As the United States debates whether and why to break up big tech, and the European Union considers enforcement actions under the General Data Protection Regulation and how to update its laws to address tracking techniques in a new AI-driven smart world, consent cannot be confused with choice. Consent must be defined by its moral core, involving clear background conditions, defined scope, knowledge, voluntariness, and fairness. When consent meets these demands, it remains a powerful tool for contributing to meaningful data protection at the individual and societal levels.


Author(s):  
Michael Veale ◽  
Reuben Binns ◽  
Lilian Edwards

Many individuals are concerned about the governance of machine learning systems and the prevention of algorithmic harms. The EU's recent General Data Protection Regulation (GDPR) has been seen as a core tool for achieving better governance of this area. While the GDPR does apply to the use of models in some limited situations, most of its provisions relate to the governance of personal data, while models have traditionally been seen as intellectual property. We present recent work from the information security literature around ‘model inversion’ and ‘membership inference’ attacks, which indicates that the process of turning training data into machine-learned systems is not one-way, and demonstrate how this could lead some models to be legally classified as personal data. Taking this as a probing experiment, we explore the different rights and obligations this would trigger and their utility, and posit future directions for algorithmic governance and regulation. This article is part of the theme issue ‘Governing artificial intelligence: ethical, legal, and technical opportunities and challenges’.
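The membership inference point is easiest to see with a small sketch. The snippet below is not from the paper; it is a minimal illustration, using scikit-learn and a simple confidence-thresholding heuristic, of how a trained model's behaviour can hint at whether a particular record was in its training data, which is why such models may come to be treated as personal data.

```python
# Minimal, illustrative membership inference sketch (confidence thresholding).
# Assumption: an overfitted model is more confident on records it was trained on,
# so unusually high confidence suggests the record was a training-set "member".
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

# A deliberately flexible model, so that it overfits its training set.
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

def confidence_on_true_label(model, X, y):
    """Probability the model assigns to each example's true class."""
    proba = model.predict_proba(X)
    return proba[np.arange(len(y)), y]

# Attack: guess "training-set member" whenever confidence exceeds a threshold.
threshold = 0.95
flagged_train = confidence_on_true_label(model, X_train, y_train) > threshold
flagged_test = confidence_on_true_label(model, X_test, y_test) > threshold

# If the model leaked nothing about its training data, the two rates would be
# similar; a clear gap shows the trained model reveals information about the
# records it was built from.
print("flagged as members (train):", flagged_train.mean())
print("flagged as members (test): ", flagged_test.mean())
```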


Hypertension ◽  
2021 ◽  
Vol 77 (4) ◽  
pp. 1029-1035
Author(s):  
Antonia Vlahou ◽  
Dara Hallinan ◽  
Rolf Apweiler ◽  
Angel Argiles ◽  
Joachim Beige ◽  
...  

The General Data Protection Regulation (GDPR) became binding law in the European Union Member States in 2018, as a step toward harmonizing personal data protection legislation in the European Union. The Regulation governs almost all types of personal data processing, hence also those pertaining to biomedical research. The purpose of this article is to highlight the main practical issues related to data and biological sample sharing that biomedical researchers face regularly, and to specify how these are addressed in the context of the GDPR, after consulting with ethics/legal experts. We identify areas in which clarifications of the GDPR are needed, particularly those related to consent requirements by study participants. Amendments should target the following: (1) restricting exceptions based on national laws and increasing harmonization, (2) confirming the concept of broad consent, and (3) defining a roadmap for secondary use of data. These changes could be achieved by acknowledged learned societies in the field taking the lead in preparing a guidance document on the optimal interpretation of the GDPR, to be finalized after a period of comment by a broad multistakeholder audience. In parallel, promoting engagement and education of the public in the relevant issues (such as different consent types or the residual risk of re-identification), on both the local/national and international levels, is considered critical for advancement. We hope that this article will open this broad discussion involving all major stakeholders, toward optimizing the GDPR and allowing a harmonized transnational research approach.


Author(s):  
Yola Georgiadou ◽  
Rolf de By ◽  
Ourania Kounadi

The General Data Protection Regulation (GDPR) protects the personal data of natural persons and at the same time allows the free movement of such data within the European Union (EU). Hailed as majestic by admirers and dismissed as protectionist by critics, the Regulation is expected to have a profound impact around the world, including in the African Union (AU). For European–African consortia conducting research that may affect the privacy of African citizens, the question is ‘how to protect personal data of data subjects while at the same time ensuring a just distribution of the benefits of a global digital ecosystem?’ We use location privacy as a point of departure, because information about an individual’s location is different from other kinds of personally identifiable information. We analyse privacy at two levels, individual and cultural. Our perspective is interdisciplinary: we draw from computer science to describe three scenarios of transformation of volunteered/observed information to inferred information about a natural person and from cultural theory to distinguish four privacy cultures emerging within the EU in the wake of GDPR. We highlight recent data protection legislation in the AU and discuss factors that may accelerate or inhibit the alignment of data protection legislation in the AU with the GDPR.
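The paper's move from volunteered/observed to inferred location information can be made concrete with a small sketch. The snippet below is a hypothetical illustration rather than the authors' method; the coordinates, field layout, and thresholds are invented. It infers a likely home location from a handful of night-time GPS fixes, which is exactly the kind of derived personal information that makes location data different from other identifiers.

```python
# Hypothetical sketch: inferring a "home" location from observed GPS fixes by
# taking the most frequent night-time cell (coordinates rounded to roughly 100 m).
from collections import Counter
from datetime import datetime

# (timestamp, latitude, longitude) fixes volunteered by or observed about one person
fixes = [
    ("2024-03-01T23:40:00", 52.2215, 6.8937),
    ("2024-03-02T01:10:00", 52.2214, 6.8939),
    ("2024-03-02T13:05:00", 52.2390, 6.8520),  # daytime fix, e.g. a workplace
    ("2024-03-03T00:20:00", 52.2216, 6.8936),
]

def infer_home(fixes, decimals=3):
    """Return the most common night-time (22:00-06:00) coarse location cell."""
    night_cells = Counter()
    for ts, lat, lon in fixes:
        hour = datetime.fromisoformat(ts).hour
        if hour >= 22 or hour < 6:
            night_cells[(round(lat, decimals), round(lon, decimals))] += 1
    return night_cells.most_common(1)[0][0] if night_cells else None

print("inferred home cell:", infer_home(fixes))
```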


2021 ◽  
pp. 77-91
Author(s):  
Kieron O’Hara

This chapter describes the Brussels Bourgeois Internet. The ideal consists of positive, managed liberty where the rights of others are respected, as in the bourgeois public space, where liberty follows only when rights are secured. The exemplar of this approach is the European Union, which uses administrative means, soft law, and regulation to project its vision across the Internet. Privacy and data protection have become the most emblematic struggles. Under the Data Protection Directive of 1995, the European Union developed data-protection law and numerous privacy rights, including a right to be forgotten, won in a case against Google Spain in 2014, the arguments about which are dissected. The General Data Protection Regulation (GDPR) followed in 2018, amplifying this approach. The GDPR is having the effect of enforcing European data-protection law on international players (the ‘Brussels effect’), while the European Union over the years has developed unmatched expertise in data-protection law.


2021 ◽  
Vol 273 ◽  
pp. 08099
Author(s):  
Mikhail Smolenskiy ◽  
Nikolay Levshin

The EU’s General Data Protection Regulation (GDPR) applies not only within the territory of the European Union but also to information systems around the world that hold the data of EU citizens. Misusing or carelessly handling personal data brings fines of up to 20 million euros or 4% of the offending company’s annual turnover. This article analyzes the main trends in the global implementation of the GDPR. The authors considered and analyzed the results of personal data protection measures in nineteen regions, including the USA, Canada, China, France, Germany, India, Kazakhstan, Nigeria, Russia, South Korea and Thailand, as well as the European Union and a handful of others. This allowed them to identify a direct pattern between the global tightening of protection for EU citizens’ personal data and the fragmentation of the global mediasphere into separate national segments. As a result of the study, the authors conclude that the GDPR has ultimately slowed down the globalization of the online mediasphere, playing a major role in its regional fragmentation.
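As a rough illustration of how the fine ceiling mentioned above works in practice (a sketch assuming the ‘whichever is higher’ rule of Article 83(5) GDPR; the turnover figure is invented):

```python
# Illustrative sketch of the upper administrative fine cap under Art. 83(5) GDPR:
# up to EUR 20 million or 4% of total worldwide annual turnover, whichever is higher.
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# Hypothetical company with EUR 1 billion annual turnover: the 4% limb applies.
print(gdpr_max_fine(1_000_000_000))  # 40000000.0
```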


2020 ◽  
pp. 155-186
Author(s):  
María Dolores Mas Badia

Despite the differences between credit risk and insurance risk, in many countries large insurance companies include credit history amongst the information to be taken into account when assigning consumers to risk pools and deciding whether or not to offer them an auto or homeowner insurance policy, or when determining the premium that they should pay. In this study, I try to establish some conclusions concerning the requirements and limits to which the use of credit history data by insurers in the European Union should be subject. To do so, I focus primarily on Regulation (EU) 2016/679. This regulation, which came into force on 24 May 2016 and has applied since 25 May 2018, not only forms the backbone of personal data protection in the EU, but is also set to become a model for regulation beyond the borders of the Union. This article concentrates on two main aspects: the lawful basis for the processing of credit history data by insurers, and the rules that should apply to decisions based solely on automated processing, including profiling.


2018 ◽  
Author(s):  
Duarte Gonçalves-Ferreira ◽  
Mariana Sousa ◽  
Gustavo M Bacelar-Silva ◽  
Samuel Frade ◽  
Luís Filipe Antunes ◽  
...  

BACKGROUND Concerns about privacy and personal data protection resulted in reforms of the existing legislation in the European Union (EU). The General Data Protection Regulation (GDPR) aims to reform the existing directive on personal data protection of EU citizens, with a strong emphasis on giving citizens more control over their data and on establishing rules for the processing of personal data. openEHR is a standard that embodies many principles of interoperable and secure software for electronic health records (EHRs) and has been advocated as the best approach for the development of hospital information systems. OBJECTIVE This study aimed to understand to what extent the openEHR standard can help EHR systems comply with GDPR requirements. METHODS A list of requirements for an EHR to support GDPR compliance and a list of the openEHR design principles were compiled. The requirements were categorized and compared with the principles by experts on openEHR and the GDPR. RESULTS A total of 50 GDPR requirements and 8 openEHR design principles were identified. The openEHR principles conformed to 30% (15/50) of the GDPR requirements, and all of the openEHR principles were aligned with GDPR requirements. CONCLUSIONS This study showed that the openEHR principles conform well to the GDPR, underlining the common wisdom that truly realizing security and privacy requires them to be built in from the start. By using an openEHR-based EHR, institutions are closer to becoming compliant with the GDPR while safeguarding medical data.


2020 ◽  
Vol 4 (2) ◽  
pp. 81-94
Author(s):  
Matúš Mesarčík

A new era of data protection laws has arisen following the adoption of the General Data Protection Regulation (GDPR) in the European Union. One of the newly adopted regulations on the processing of personal data is the California Consumer Privacy Act, commonly referred to as the CCPA. The article aims to fill this gap by offering a deep analysis of the territorial scope of both acts and the practical consequences of their application. The article starts with a brief overview of privacy regulation in the EU and the USA. An introduction to the GDPR and the CCPA follows, focusing on the territorial scope of the respective legislation. Three scenarios of applicability are derived in the following part, including practical examples.


Author(s):  
Alexander Gurkov

This chapter considers the legal framework of data protection in Russia. The adoption of the Yarovaya laws, the data localization requirement, and the enactment of the sovereign Runet regulations allowing for isolation of the internet in Russia paint a grim picture of state control over data flows in Russia. Upon closer examination, it can be seen that the development of data protection in Russia follows many of the steps taken at the EU level, although some EU measures violated fundamental rights and were invalidated. Specific rules in this sphere in Russia are similar to the European General Data Protection Regulation. This chapter shows the special role of Roskomnadzor in forming data protection regulations by construing vaguely defined rules of legislation.

