General Data Protection Regulation and Horizon 2020 Ethics Review Process: Ethics Compliance under GDPR

Bioethica ◽  
2019 ◽  
Vol 5 (1) ◽  
pp. 6
Author(s):  
Albena Kuyumdzhieva

The present manuscript examines the new ethics data protection requirements introduced for research projects funded by the European Programme Horizon 2020. Initially, reference is made to the basic data protection principles introduced by the General Data Protection Regulation (GDPR) and to the derogations permitted in the research field in favor of scientific advancement. Although these derogations are subject to a number of safeguards to protect personal data, new ethics requirements have been introduced for research projects funded by Horizon 2020. The aim of these safeguards is increased transparency and accountability in data processing and, consequently, enhanced protection of individuals’ rights. These requirements are geared to the main research ethics postulate, which requires the free, voluntary and informed participation of research subjects. Under these new requirements, Horizon 2020 applicants/beneficiaries must comply with a set of predefined standards reflecting their ethical and legal obligations, provide a detailed and precise description of the technical and organisational measures that will be implemented in order to safeguard the rights of research participants, and demonstrate their observance. In addition, depending on the type of data being processed and the data processing techniques used, H2020 applicants/beneficiaries may need to provide a number of additional documents/explanations and implement further measures.

2021 ◽  
Vol 11 (22) ◽  
pp. 10574
Author(s):  
Sung-Soo Jung ◽  
Sang-Joon Lee ◽  
Ieck-Chae Euom

With growing awareness of the importance of personal data protection, many countries have established laws and regulations to ensure data privacy and are supervising compliance with them. Although various studies have suggested methods for complying with the General Data Protection Regulation (GDPR) when processing personal data, no method exists that can ensure the reliability and integrity of a data subject’s personal data processing request records so that they can serve as GDPR compliance audit evidence for an auditor. In this paper, we propose a delegation-based personal data processing request notarization framework for the GDPR using a private blockchain. The proposed notarization framework allows the data subject to delegate requests for the processing of personal data; the framework then makes the requests to the data controller, which performs the processing. The generated data processing request and processing result data are stored in the blockchain ledger and notarized via a trusted institution of the blockchain network. The Hyperledger Fabric implementation of the framework demonstrates the fulfillment of the system requirements and the feasibility of implementing a GDPR compliance audit for the processing of personal data. The analysis results, with comparisons to related works, indicate that the proposed framework provides better reliability and feasibility for the GDPR audit of personal data processing requests than extant methods.
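To make the notarization idea more concrete, the sketch below models, in plain Python, how each delegated processing request and its result could be serialized, hashed and chained into a tamper-evident log that an auditor can verify later. It is a simplified illustration under stated assumptions, not the authors' Hyperledger Fabric chaincode; the class and method names (NotarizationLog, record, verify) are invented for this example.

```python
import hashlib
import json
import time

class NotarizationLog:
    """Hypothetical hash-chained log standing in for the blockchain ledger."""

    def __init__(self):
        self.entries = []
        self.prev_hash = "0" * 64  # genesis value for the chain

    def record(self, data_subject: str, controller: str, request: str, result: str) -> dict:
        # Serialize the delegated request and its result, link it to the previous
        # entry, and fix its content with a SHA-256 digest.
        entry = {
            "timestamp": time.time(),
            "data_subject": data_subject,   # e.g. a pseudonymous subject identifier
            "controller": controller,
            "request": request,             # e.g. "access", "rectification", "erasure"
            "result": result,               # outcome reported by the controller
            "prev_hash": self.prev_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self.prev_hash = entry["hash"]
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the hash chain, as an auditor would during a compliance audit."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if body["prev_hash"] != prev:
                return False
            if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

log = NotarizationLog()
log.record("subject-123", "controller-A", "erasure", "completed")
print(log.verify())  # True while the log is untampered
```

In the proposed framework the blockchain ledger and the notarizing institution take the place of this in-memory list, but the integrity check available to an auditor is conceptually the same kind of hash-chain verification.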


2019 ◽  
pp. 79-101 ◽  
Author(s):  
Aleksandra Pyka

This article addresses the issue of personal data processing conducted in connection with scientific research and in accordance with the provisions of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). It is not uncommon for personal data to be processed for the purposes of scientific research, which entails the obligation to respect the rights of the data subjects involved. Entities conducting scientific research that process personal data for this purpose are required to apply the general regulation governing, among others, the obligations imposed on controllers. The issue of personal data processing for scientific research purposes has also been regulated in national legislation in connection with the need to apply the General Data Protection Regulation. The article discusses the grounds for the admissibility of data processing for the needs of scientific research; the provision of personal data regarding criminal convictions and offences extracted from public registers at the request of the entity conducting scientific research; the exercise of the rights of the data subjects concerned; as well as the implementation of appropriate technical and organizational measures to ensure the security of data processing. In addition, the article discusses the anonymization of personal data carried out after the purpose of the processing has been achieved, as well as the processing of special categories of personal data. The topics raised are not discussed exhaustively, as this would require further elaboration in a publication of a much wider scope.
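Purely as an illustrative aside on what the technical measures mentioned above can look like in code, the sketch below shows one simple reading of post-purpose anonymization: stripping direct identifiers and coarsening quasi-identifiers in a research record. The field names and rules are hypothetical, and whether such a transformation actually renders data anonymous in the GDPR sense, rather than merely pseudonymous, depends on the residual re-identification risk; that is a legal assessment the code alone cannot settle.

```python
from datetime import date

# Hypothetical example: remove direct identifiers, generalize quasi-identifiers.
DIRECT_IDENTIFIERS = {"name", "email", "national_id"}

def anonymize(record: dict) -> dict:
    """Drop direct identifiers and coarsen quasi-identifiers in a research record."""
    result = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "birth_date" in result:                   # keep only the birth year
        result["birth_year"] = result.pop("birth_date").year
    if "postal_code" in result:                  # truncate to a coarser region code
        result["postal_region"] = result.pop("postal_code")[:2]
    return result

record = {
    "name": "Jan Kowalski",
    "email": "jan@example.org",
    "birth_date": date(1980, 5, 17),
    "postal_code": "30-001",
    "survey_score": 42,
}
print(anonymize(record))
# {'survey_score': 42, 'birth_year': 1980, 'postal_region': '30'}
```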


2021 ◽  
Vol 11 (2) ◽  
pp. 3-24
Author(s):  
Jozef Andraško ◽  
Matúš Mesarčík

Abstract The article focuses on the intersections between the regulation of electronic identification as provided in the eIDAS Regulation and data protection rules in the European Union. The first part of the article is devoted to explaining the basic notions and framework related to electronic identity in the European Union, namely the eIDAS Regulation. The second part discusses specific intersections of the eIDAS Regulation with the General Data Protection Regulation (GDPR), specifically the scope, the general data protection clause and, above all, personal data processing in the context of the mutual recognition of electronic identification means. The article aims to discuss the overlapping issues in the regulation of the GDPR and the eIDAS Regulation and provides further guidance for the interpretation and implementation of the outcomes in practice.


Author(s):  
Teodora Lalova ◽  
Anastassia Negrouk ◽  
Laurent Dollé ◽  
Sofie Bekaert ◽  
Annelies Debucquoy ◽  
...  

Abstract This contribution aims to present in a clear and concise manner the intricate legal framework for biobank research in Belgium. In Part 1, we describe the Belgian biobank infrastructure, with a focus on the concept of a biobank. In Part 2, we provide an overview of the applicable legal framework, namely the Act of 19 December 2008 on Human Body Material (HBM) and its amendments. Attention is given to an essential piece of self-regulation, namely the Compendium on biobanks issued by the Federal Agency for Medicines and Health Products (FAMHP). Furthermore, we delineate the interplay with the relevant data protection rules. Part 3 is dedicated to the main research oversight bodies in the field of biobanking. In Part 4, we provide several examples of the ‘law in context’. In particular, we discuss issues pertaining to presumed consent, the processing of personal data associated with HBM, and the information provided to the donor of HBM. Finally, Parts 5 and 6 address the impact of the EU General Data Protection Regulation (GDPR), suggest lines for further research, and outline future possibilities for biobanking in Belgium.


Author(s):  
Raphaël Gellert

The main goal of this book is to provide an understanding of what is commonly referred to as “the risk-based approach to data protection”, an expression that came to the fore during the overhaul process of the EU’s General Data Protection Regulation (GDPR), even though it can also be found in other statutes under different acceptations. At its core, it consists in endowing the regulated organisations that process personal data with increased responsibility for complying with data protection mandates. Such increased compliance duties are performed through risk management tools. The book addresses this topic from various perspectives. In framing the risk-based approach as the latest in a series of regulation models, the book provides an analysis of data protection law from the perspective of regulation theory as well as the risk and risk management literatures, and their mutual interlinkages. Further, it provides an overview of the policy developments that led to the adoption of such an approach, which it discusses in the light of regulation theory. It also includes various discussions pertaining to the risk-based approach’s scope and meaning, to the way it has been taken up in statutes, including key provisions such as accountability and data protection impact assessments, and to its potential and limitations. Finally, it analyses how the risk-based approach can be implemented in practice by providing technical analyses of various data protection risk management methodologies.
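Purely as an illustration of the kind of risk-management arithmetic such methodologies rely on, and not a method taken from the book or from any specific standard, the following sketch scores hypothetical processing risks by likelihood and severity and flags those above an invented treatment threshold.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    likelihood: int  # 1 (rare) .. 4 (very likely) - invented scale
    severity: int    # 1 (negligible impact on data subjects) .. 4 (severe) - invented scale

    @property
    def score(self) -> int:
        # Classic likelihood x severity scoring used in many risk matrices.
        return self.likelihood * self.severity

risks = [
    Risk("Unauthorised access to the research database", likelihood=2, severity=4),
    Risk("Re-identification of pseudonymised records", likelihood=3, severity=3),
    Risk("Loss of a paper consent form", likelihood=2, severity=2),
]

TREATMENT_THRESHOLD = 8  # hypothetical cut-off above which mitigation is required

for risk in sorted(risks, key=lambda r: r.score, reverse=True):
    action = "mitigate" if risk.score >= TREATMENT_THRESHOLD else "accept/monitor"
    print(f"{risk.score:>2}  {action:15}  {risk.description}")
```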


2021 ◽  
Vol 11 (10) ◽  
pp. 4537
Author(s):  
Christian Delgado-von-Eitzen ◽  
Luis Anido-Rifón ◽  
Manuel J. Fernández-Iglesias

In recent years, blockchain technologies have attracted the interest of actors in various sectors, among them the education field, which is studying the application of these technologies to improve information traceability, accountability, and integrity, while guaranteeing privacy, transparency, robustness, trustworthiness, and authenticity. Different interesting proposals and projects have been launched and are currently being developed. Nevertheless, there are still issues that are not adequately addressed, such as scalability, privacy, and compliance with international regulations such as the General Data Protection Regulation in Europe. This paper analyzes the application of blockchain technologies to the issuance and verification of educational data, together with the related challenges, and proposes an innovative solution to tackle them. The proposed model supports the issuance, storage, and verification of different types of academic information, both formal and informal, and complies with applicable regulations, protecting the privacy of users’ personal data. This proposal also addresses the scalability challenges and paves the way for a global academic certification system.
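One pattern commonly discussed for reconciling on-chain verification with data protection, useful for picturing how such a model can avoid writing personal data to the ledger, is anchoring only a salted hash of each credential on chain while the credential itself stays off chain. The sketch below illustrates that general pattern; the function names and data are invented, and it is not presented as the exact model proposed in the paper.

```python
import hashlib
import json
import secrets

def issue(credential: dict) -> tuple[dict, str, str]:
    """Return the off-chain credential, its salt, and the digest to anchor on-chain."""
    salt = secrets.token_hex(16)
    digest = hashlib.sha256((salt + json.dumps(credential, sort_keys=True)).encode()).hexdigest()
    return credential, salt, digest

def verify(credential: dict, salt: str, anchored_digest: str) -> bool:
    """A verifier recomputes the digest from the presented credential and salt."""
    digest = hashlib.sha256((salt + json.dumps(credential, sort_keys=True)).encode()).hexdigest()
    return digest == anchored_digest

credential = {"holder": "student-42", "degree": "MSc Computer Science", "year": 2021}
cred, salt, digest = issue(credential)   # only `digest` would be written on-chain
print(verify(cred, salt, digest))        # True: credential matches the on-chain anchor
```

Because only the digest is anchored, the on-chain record alone reveals nothing about the holder, while any verifier given the credential and its salt can confirm that it matches the anchor.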


2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Iwona Karasek-Wojciechowicz

Abstract This article is an attempt to reconcile the requirements of the EU General Data Protection Regulation (GDPR) and the anti-money laundering and countering the financing of terrorism (AML/CFT) instruments used in permissionless ecosystems based on distributed ledger technology (DLT). Usually, analysis is focused on only one of these regulations. By covering the interplay between both regulations, this research reveals their incoherencies in relation to permissionless DLT. The GDPR requirements force permissionless blockchain communities to use anonymization or, at the very least, strong pseudonymization technologies to ensure that data processing complies with the GDPR. At the same time, the instruments of global AML/CFT policy presently being implemented in many countries, following the recommendations of the Financial Action Task Force, counteract the anonymity-enhancing technologies built into blockchain protocols. The solutions suggested in this article aim to induce the shaping of permissionless DLT-based networks in ways that would secure the protection of personal data according to the GDPR rules while also addressing the money laundering and terrorist financing risks created by transactions in anonymous blockchain spaces or those using strong pseudonyms. Searching for new policy instruments is necessary to ensure that governments do not combat the development of all privacy-blockchains, so as to enable a high level of privacy protection and GDPR-compliant data processing. This article indicates two AML/CFT tools which may be helpful for shaping privacy-blockchains in a way that keeps such tools feasible. The first tool is exceptional government access to transactional data written on non-transparent ledgers, obfuscated by advanced anonymization cryptography. This tool should be optional for networks as long as other effective AML/CFT measures are accessible to the intermediaries or to the government in relation to a given network. If these other measures are not available and the network does not grant exceptional access, the regulations should allow governments to combat the development of those networks. Effective tools in that scope should target the value of the privacy-cryptocurrency, not its users. Such tools could include, as a last resort, state attacks that would undermine the community’s trust in a specific network.
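For readers unfamiliar with the term, the "strong pseudonymization" mentioned above can be pictured as deriving identifiers through keyed hashing, so that linking a pseudonym back to a person requires a secret key. The snippet below is a generic illustration of that technique, not a mechanism proposed in the article, and the key handling is deliberately simplified.

```python
import hmac
import hashlib

# Hypothetical key; in practice it would be held in an HSM/KMS, not in source code.
SECRET_KEY = b"replace-with-a-securely-managed-key"

def pseudonymize(identifier: str) -> str:
    """Derive a deterministic pseudonym; without SECRET_KEY it cannot be reversed or linked."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

wallet_owner = "alice@example.org"
print(pseudonymize(wallet_owner))  # same input + key -> same pseudonym; different key -> unlinkable
```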


Author(s):  
Michael Veale ◽  
Reuben Binns ◽  
Lilian Edwards

Many individuals are concerned about the governance of machine learning systems and the prevention of algorithmic harms. The EU's recent General Data Protection Regulation (GDPR) has been seen as a core tool for achieving better governance of this area. While the GDPR does apply to the use of models in some limited situations, most of its provisions relate to the governance of personal data, while models have traditionally been seen as intellectual property. We present recent work from the information security literature around ‘model inversion’ and ‘membership inference’ attacks, which indicates that the process of turning training data into machine-learned systems is not one-way, and demonstrate how this could lead some models to be legally classified as personal data. Taking this as a probing experiment, we explore the different rights and obligations this would trigger and their utility, and posit future directions for algorithmic governance and regulation. This article is part of the theme issue ‘Governing artificial intelligence: ethical, legal, and technical opportunities and challenges’.
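To give a feel for why such attacks blur the line between a model and the personal data it was trained on, here is a minimal sketch of a confidence-threshold membership inference attack, one of the simplest forms discussed in the literature the authors draw on. The dataset, model, and threshold are illustrative only; practical attacks (e.g. shadow-model attacks) are considerably more sophisticated.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic data standing in for records about individuals.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def infer_membership(model, X, threshold=0.9):
    """Guess 'member of the training set' whenever the model is unusually confident."""
    confidence = model.predict_proba(X).max(axis=1)
    return confidence >= threshold

in_rate = infer_membership(model, X_train).mean()   # fraction of training records flagged
out_rate = infer_membership(model, X_test).mean()   # fraction of held-out records flagged
print(f"flagged as members: train {in_rate:.2f} vs held-out {out_rate:.2f}")
# Any gap between the two rates is the leakage such attacks exploit;
# overfitted models typically show a much larger gap.
```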

