A Process for Data Protection Impact Assessment Under the European General Data Protection Regulation

Author(s): Felix Bieker, Michael Friedewald, Marit Hansen, Hannah Obersteller, Martin Rost
2021, Vol 1 (1), pp. 16-28
Author(s): Gianclaudio Malgieri

Abstract: This paper argues that if we want a sustainable environment of desirable AI systems, we should aim not only at transparent, explainable, fair, lawful, and accountable algorithms, but should also seek "just" algorithms, that is, automated decision-making systems that combine all of the above-mentioned qualities (transparency, explainability, fairness, lawfulness, and accountability). This is possible through a practical "justification" statement and process (possibly derived from an algorithmic impact assessment) through which the data controller demonstrates, in practical terms, why the AI system is not unfair, not discriminatory, not obscure, not unlawful, and so on. In other words, this justification (possibly derived from a data protection impact assessment of the AI system) proves the legality of the system with respect to all data protection principles (fairness, lawfulness, transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity and confidentiality, and accountability). All of these principles are necessary components of a broader concept of just algorithmic decision-making, and they are already required by the GDPR, in particular considering the data protection principles (Article 5), the need to enable (meaningful) contestation of automated decisions (Article 22), and the need to assess the necessity, proportionality and legality of the AI system under the Data Protection Impact Assessment framework (Article 35).
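Purely as an illustrative sketch, and not as part of the paper itself, one way to make such a justification statement concrete is to record, for each GDPR Article 5 principle, the controller's practical justification and to check that none is missing. The Article 5 principle list is real; the function name, data shape, and example entries below are hypothetical.

```python
# Illustrative sketch only: a per-principle justification record for an AI system,
# in the spirit of the "justification statement" described in the abstract.
# The GDPR Article 5 principles are real; everything else is a hypothetical structure.
ARTICLE_5_PRINCIPLES = [
    "lawfulness, fairness and transparency",
    "purpose limitation",
    "data minimisation",
    "accuracy",
    "storage limitation",
    "integrity and confidentiality",
    "accountability",
]


def missing_justifications(justifications: dict[str, str]) -> list[str]:
    """Return the principles for which the controller has not yet recorded
    a practical justification (e.g. why the system is not unfair or obscure)."""
    return [p for p in ARTICLE_5_PRINCIPLES if not justifications.get(p, "").strip()]


# Hypothetical usage: a partially completed justification statement.
statement = {
    "purpose limitation": "Outputs are used only for the single purpose stated in the DPIA.",
    "data minimisation": "Training data are limited to the features listed in the DPIA.",
}
print(missing_justifications(statement))  # principles that still lack a justification
```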


2020
Author(s): Dariusz Kloza, Alessandra Calvi, Simone Casiraghi, Sergi Vazquez Maymir, Nikolaos Ioannidis, ...

This Policy Brief proposes a template for a report produced in a process of data protection impact assessment (DPIA) in the European Union (EU). Grounded in the previously elaborated framework (cf. Policy Brief No. 1/2017) and method for impact assessment (cf. Policy Brief No. 1/2019), the proposed template conforms to the requirements of Articles 35–36 of the General Data Protection Regulation (GDPR) and reflects best practices for impact assessment, while offering five novel aspects. First, it aims at comprehensiveness, to arrive at the most robust advice for decision-making. Second, it aims at efficiency, that is, at producing effects with the least use of resources. Third, it aims at exploring and accommodating the perspectives of various stakeholders, although the perspective of individuals dominates; it therefore fosters fundamental rights thinking by, for example, requiring a justification for each choice, hence going beyond a mere 'tick-box' exercise. Fourth, it adheres to the legal design approach to guide the assessors in a practical, easy and intuitive manner through the 11-step assessment process, providing the necessary explanations for each step and being structured in expandable and modifiable tables and fields to fill in. Fifth, it is not final: it will need to be revised as experience with its use grows. The template is addressed predominantly to assessors entrusted by data controllers to perform the assessment process, yet it may also assist data protection authorities (DPAs) in the EU in developing tailored DPIA templates for their own jurisdictions.
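As an illustration only, the sketch below shows one way such a template could be represented programmatically, with each assessment step carrying its explanation, fill-in fields, and a per-choice justification so that unjustified answers can be flagged. The class names, field names, and example step are hypothetical and are not taken from the Policy Brief itself.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Choice:
    """A single fill-in field; every answer should carry a justification,
    so the assessment goes beyond a mere 'tick-box' exercise."""
    question: str
    answer: Optional[str] = None
    justification: Optional[str] = None


@dataclass
class AssessmentStep:
    """One step of the assessment process, with its explanation and
    an expandable list of fields to fill in."""
    title: str
    explanation: str
    choices: List[Choice] = field(default_factory=list)

    def add_choice(self, question: str) -> Choice:
        choice = Choice(question=question)
        self.choices.append(choice)
        return choice


@dataclass
class DPIAReport:
    """A report prepared by assessors entrusted by the data controller."""
    processing_operation: str
    assessors: List[str]
    steps: List[AssessmentStep] = field(default_factory=list)

    def unjustified_choices(self) -> List[Choice]:
        """Return answered fields that lack the required justification."""
        return [c for s in self.steps for c in s.choices
                if c.answer is not None and not c.justification]


# Hypothetical usage: one of the (in reality eleven) steps, partially filled in.
report = DPIAReport(
    processing_operation="example processing operation",
    assessors=["assessor entrusted by the data controller"],
)
step = AssessmentStep(title="Describe the processing",
                      explanation="Set out the nature, scope, context and purposes.")
step.add_choice("What personal data are processed?").answer = "contact details"
report.steps.append(step)
print(len(report.unjustified_choices()))  # -> 1: an answer was given without a justification
```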

