The Data Protection Impact Assessment, or: How the General Data Protection Regulation May Still Come to Foster Ethically Responsible Data Processing

2015 ◽  
Author(s):  
Claudia Quelle


2021 ◽  
Vol 11 (22) ◽  
pp. 10574
Author(s):  
Sung-Soo Jung ◽  
Sang-Joon Lee ◽  
Ieck-Chae Euom

With growing awareness of the importance of personal data protection, many countries have established laws and regulations to ensure data privacy and supervise organizations' compliance with them. Although various studies have suggested methods for complying with the General Data Protection Regulation (GDPR) for personal data, no existing method can ensure the reliability and integrity of a data subject's personal data processing request records so that they can serve as audit evidence of GDPR compliance for an auditor. In this paper, we propose a delegation-based personal data processing request notarization framework for the GDPR using a private blockchain. The proposed notarization framework allows the data subject to delegate personal data processing requests; the framework forwards the requests to the data controller, which performs the processing. The generated data processing request and processing result data are stored in the blockchain ledger and notarized by a trusted institution of the blockchain network. The Hyperledger Fabric implementation of the framework demonstrates the fulfillment of the system requirements and the feasibility of implementing a GDPR compliance audit for the processing of personal data. The analysis results, including comparisons with related works, indicate that the proposed framework provides better reliability and feasibility for the GDPR audit of personal data processing requests than extant methods.
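The abstract's core idea (recording each delegated processing request and its result in an append-only, hash-chained ledger so an auditor can later verify integrity) can be illustrated with a minimal sketch. This is not the authors' Hyperledger Fabric implementation; the ledger class, record fields, and function names below are hypothetical stand-ins for illustration only.

```python
import hashlib
import json


class NotarizationLedger:
    """Toy append-only, hash-chained ledger standing in for the private blockchain."""

    def __init__(self):
        self.entries = []

    def append(self, record: dict) -> str:
        # Chain each entry to the previous one, so tampering with any
        # stored record invalidates all subsequent hashes.
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps(record, sort_keys=True)
        entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        self.entries.append({"record": record, "prev": prev_hash, "hash": entry_hash})
        return entry_hash

    def verify(self) -> bool:
        # An auditor replays the chain to confirm no record was altered.
        prev = "0" * 64
        for e in self.entries:
            payload = json.dumps(e["record"], sort_keys=True)
            if e["prev"] != prev:
                return False
            if hashlib.sha256((prev + payload).encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True


def notarize_request(ledger: NotarizationLedger, subject_id: str,
                     request_type: str, result: str) -> str:
    # The data subject delegates the request; both the request and the
    # controller's processing result are recorded for later audit.
    record = {
        "subject": subject_id,
        "request": request_type,  # e.g. "erasure" (GDPR Art. 17)
        "result": result,
    }
    return ledger.append(record)


ledger = NotarizationLedger()
notarize_request(ledger, "subject-42", "erasure", "completed")
assert ledger.verify()
```

In the paper's setting, the append and verify roles would be played by chaincode on a permissioned network with a trusted notarizing institution, rather than an in-memory list; the hash chain here only conveys why such records can serve as tamper-evident audit proof.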


2018 ◽  
Vol 25 (2) ◽  
pp. 36-41
Author(s):  
Cassandra Yuill

Abstract
In May 2018, the European Union (EU) introduced the General Data Protection Regulation (GDPR) with the aim of increasing transparency in data processing and enhancing the rights of data subjects. Within anthropology, concerns have been raised about how the new legislation will affect ethnographic fieldwork and whether the laws contradict the discipline’s core tenets. To address these questions, the School of Oriental and African Studies (SOAS) at the University of London hosted an event on 25 May 2018 entitled ‘Is Anthropology Legal?’, bringing together researchers and data managers to begin a dialogue about the future of anthropological work in the context of the GDPR. In this article, I report and reflect on the event and on the possible implications for anthropological research within this climate of increasing governance.


2018 ◽  
Vol 18 (2) ◽  
pp. 76-79 ◽  
Author(s):  
Susan Doe

Abstract
In this article Susan Doe reports from the perspective of the law firm sector on the progress towards the introduction of the General Data Protection Regulation that became automatically ‘live’ on 25 May 2018. She provides an introduction to the Regulation, highlights some practicalities for law firms when considering compliance with GDPR and offers a ‘to do’ list with reference to the record of data processing, training needs, security, and contracts and documentation. She also provides advice on what should be considered especially in respect of client demands.


2021 ◽  
Vol 1 (1) ◽  
pp. 16-28
Author(s):  
Gianclaudio Malgieri

Abstract
This paper argues that if we want a sustainable environment of desirable AI systems, we should aim not only at transparent, explainable, fair, lawful, and accountable algorithms, but should also seek “just” algorithms, that is, automated decision-making systems that include all of the above-mentioned qualities (transparency, explainability, fairness, lawfulness, and accountability). This is possible through a practical “justification” statement and process (eventually derived from an algorithmic impact assessment) through which the data controller proves, in practical ways, why the AI system is not unfair, not discriminatory, not obscure, not unlawful, etc. In other words, this justification (eventually derived from a data protection impact assessment of the AI system) proves the legality of the system with respect to all data protection principles (fairness, lawfulness, transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity, and accountability). All these principles are necessary components of a broader concept of just algorithmic decision-making and are already required by the GDPR, in particular considering: the data protection principles (Article 5), the need to enable (meaningful) contestation of automated decisions (Article 22), and the need to assess the AI system’s necessity, proportionality, and legality under the Data Protection Impact Assessment model framework (Article 35).

