“Just” Algorithms: Justification (Beyond Explanation) of Automated Decisions Under the General Data Protection Regulation

2021 ◽  
Vol 1 (1) ◽  
pp. 16-28
Author(s):  
Gianclaudio Malgieri

Abstract: This paper argues that if we want a sustainable environment of desirable AI systems, we should aim not only at transparent, explainable, fair, lawful, and accountable algorithms, but should also seek “just” algorithms, that is, automated decision-making systems that include all the above-mentioned qualities (transparency, explainability, fairness, lawfulness, and accountability). This is possible through a practical “justification” statement and process (eventually derived from an algorithmic impact assessment) through which the data controller proves, in practical ways, why the AI system is not unfair, not discriminatory, not obscure, not unlawful, etc. In other words, this justification (eventually derived from a data protection impact assessment on the AI system) proves the legality of the system with respect to all data protection principles (fairness, lawfulness, transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity, and accountability). All these principles are necessary components of a broader concept of just algorithmic decision-making, which is already required by the GDPR, in particular considering: the data protection principles (Article 5), the need to enable (meaningful) contestation of automated decisions (Article 22), and the need to assess the AI system’s necessity, proportionality, and legality under the Data Protection Impact Assessment model framework (Article 35).

2021 ◽  
Vol 11 (22) ◽  
pp. 10574
Author(s):  
Sung-Soo Jung ◽  
Sang-Joon Lee ◽  
Ieck-Chae Euom

With the growing awareness of the importance of personal data protection, many countries have established laws and regulations to ensure data privacy and are supervising organizations to ensure they comply. Although various studies have suggested methods for complying with the General Data Protection Regulation (GDPR) for personal data, no method exists that can ensure the reliability and integrity of a data subject’s personal data processing request records so that they can be used as GDPR compliance audit evidence by an auditor. In this paper, we propose a delegation-based personal data processing request notarization framework for the GDPR using a private blockchain. The proposed notarization framework allows the data subject to delegate personal data processing requests; the framework then submits the requests to the data controller, which performs the processing. The generated data processing request and processing result data are stored in the blockchain ledger and notarized by a trusted institution of the blockchain network. The Hyperledger Fabric implementation of the framework demonstrates that the system requirements are fulfilled and that a GDPR compliance audit of personal data processing is feasible. A comparative analysis with related work indicates that the proposed framework provides better reliability and feasibility for the GDPR audit of personal data processing requests than extant methods.
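To make the notarization flow described above more concrete, the following is a minimal sketch of Hyperledger Fabric chaincode in Go that records a delegated processing request and its result on the ledger. It is not the authors’ implementation: the contract name, record fields, and functions (ProcessingRequest, RecordRequest, RecordResult) are illustrative assumptions, and only the standard fabric-contract-api-go calls (PutState, GetState) are taken from the real Fabric API.

```go
package main

import (
	"encoding/json"
	"fmt"

	"github.com/hyperledger/fabric-contract-api-go/contractapi"
)

// ProcessingRequest is a hypothetical record shape for a notarized personal
// data processing request; the field names are illustrative, not taken from
// the paper.
type ProcessingRequest struct {
	RequestID   string `json:"requestID"`
	SubjectID   string `json:"subjectID"`   // data subject on whose behalf the request is made
	Delegate    string `json:"delegate"`    // delegated service submitting the request
	RequestType string `json:"requestType"` // e.g. "access", "rectification", "erasure"
	Result      string `json:"result"`      // processing result reported by the controller
	Timestamp   string `json:"timestamp"`
}

// NotarizationContract sketches chaincode that stores requests and results on
// the ledger so they can later serve as audit evidence.
type NotarizationContract struct {
	contractapi.Contract
}

// RecordRequest stores a delegated processing request on the ledger under its ID.
func (c *NotarizationContract) RecordRequest(ctx contractapi.TransactionContextInterface,
	requestID, subjectID, delegate, requestType, timestamp string) error {

	req := ProcessingRequest{
		RequestID:   requestID,
		SubjectID:   subjectID,
		Delegate:    delegate,
		RequestType: requestType,
		Timestamp:   timestamp,
	}
	bytes, err := json.Marshal(req)
	if err != nil {
		return err
	}
	return ctx.GetStub().PutState(requestID, bytes)
}

// RecordResult attaches the controller's processing result to an existing request.
func (c *NotarizationContract) RecordResult(ctx contractapi.TransactionContextInterface,
	requestID, result string) error {

	bytes, err := ctx.GetStub().GetState(requestID)
	if err != nil {
		return err
	}
	if bytes == nil {
		return fmt.Errorf("request %s not found", requestID)
	}
	var req ProcessingRequest
	if err := json.Unmarshal(bytes, &req); err != nil {
		return err
	}
	req.Result = result
	updated, err := json.Marshal(req)
	if err != nil {
		return err
	}
	return ctx.GetStub().PutState(requestID, updated)
}

func main() {
	chaincode, err := contractapi.NewChaincode(&NotarizationContract{})
	if err != nil {
		panic(err)
	}
	if err := chaincode.Start(); err != nil {
		panic(err)
	}
}
```

In a deployment along these lines, the delegated service would invoke RecordRequest on behalf of the data subject and the data controller would later invoke RecordResult, so that an auditor can query the ledger for tamper-evident records of both the request and its outcome.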


2020 ◽  
Vol 1 (1) ◽  
Author(s):  
Marta Choroszewicz ◽  
Beata Mäihäniemi

This article uses a sociolegal perspective to address current problems surrounding data protection and the experimental use of automated decision-making systems. It outlines and discusses the hard law regarding national adaptations of the European General Data Protection Regulation and other regulations, as well as the use of automated decision-making in the public sector, in six European countries (Denmark, Sweden, Germany, Finland, France, and the Netherlands). Despite its limitations, the General Data Protection Regulation has impacted the geopolitics of the global data market by empowering citizens and data protection authorities to voice their complaints and conduct investigations regarding data breaches. We draw on the Esping-Andersen welfare state typology to advance our understanding of how states’ approaches to citizens’ data protection and to the use of data for automated decision-making differ between countries in the Nordic regime and the Conservative-Corporatist regime. Our study clearly indicates a need for additional legislation regarding the use of citizens’ data for automated decision-making and for regulation of automated decision-making itself. Our results also indicate that legislation in Finland, Sweden, and Denmark draws upon the mutual trust between public administrations and citizens and thus offers only general guarantees regarding the use of citizens’ data. In contrast, Germany, France, and the Netherlands have enacted a combination of general and sectoral regulations to protect and restrict citizens’ rights. We also identify some problematic national policy responses to the General Data Protection Regulation that empower governments and related institutions to hold citizens accountable to stricter state obligations and tougher sanctions. The article contributes to the discussion on the current phase of the developing digital welfare state in Europe and the role of new technologies (i.e., automated decision-making) in this phase. We argue that states and public institutions should play a central role in strengthening the social norms associated with data privacy and protection as well as citizens’ right to social security.


2016 ◽  
Vol 19 ◽  
pp. 252-286
Author(s):  
Orla LYNSKEY

Abstract: EU data protection law has, to date, been monitored and enforced in a decentralised way by independent supervisory authorities in each Member State. While the independence of these supervisory authorities is an essential element of EU data protection law, this decentralised governance structure has led to competing claims from supervisory authorities regarding the national law applicable to a data processing operation and the national authority responsible for enforcing the data protection rules. These competing claims – evident in investigations conducted into the data protection compliance of Google and Facebook – jeopardise the objectives of the EU data protection regime. The new General Data Protection Regulation will revolutionise data protection governance by providing for a centralised decision-making body, the European Data Protection Board. While this agency will ensure the ‘Europeanisation’ of data protection law, given the nature and the extent of this Board’s powers, it marks another significant shift in the EU’s agency-creating process and must, therefore, also be considered in its broader EU context.


2021 ◽  
Vol 28 (2) ◽  
pp. 531-565
Author(s):  
Md. Toriqul Islam ◽  
Mohammad Ershadul Karim

The General Data Protection Regulation (the GDPR) of the European Union (EU) has emerged as a hot-button issue in contemporary global politics, policy, and business. With its omnibus legal substance, extensive extraterritorial scope, and influential market power, it appears to be a standard for global data protection regulation, as can be witnessed by the growing tendency across the globe to adopt or adjust relevant national laws following the instrument. Under Article 3, the GDPR applies to any data controller or processor, within or outside the EU, that processes the personal data of EU residents. The long arm of the GDPR therefore extends to cover the whole world, including Malaysia. This gives rise to tension worldwide, as non-compliance can lead to severe fines of up to €20 million or 4% of annual turnover. This is not a hypothetical possibility but a reality, as substantial fines have already been imposed on many foreign companies, such as Google, Facebook, Uber, and Equifax, to name a few. Given the principle of state sovereignty under international law, this scenario has made researchers around the world curious about several questions: why did the EU adopt an instrument with extraterritorial application; is the extraterritorial scope legitimate under normative international law; how can the provisions of this instrument be enforced; and how are they justified? This article attempts to answer those questions by analyzing the relevant rules and norms of international law and the techniques employed by the EU. The article concludes with the finding that the extraterritorial scope of the GDPR is justified under international law in a changed global context. The findings of this article will enlighten the relevant stakeholders, including Malaysian policymakers and business entities, as to the theoretical basis for the GDPR’s extraterritorial reach, and this understanding may help them map their future strategies.


2020 ◽  
Vol 11 (3) ◽  
pp. 167-185
Author(s):  
Goran Vojković ◽  
Melita Milenković ◽  
Tihomir Katulić

Abstract:
Background: IoT and smart devices have become extremely popular in the last few years. With their capabilities to collect data, it is reasonable to have concerns about the protection of users’ personal information and privacy in general.
Objectives: Comparing existing regulations on data protection and information security rules with the new capabilities provided by IoT and smart devices.
Methods/approach: This paper analyses information on data collected by IoT and smart devices and the corresponding legal framework to explore whether the legal framework also covers these new devices and their functionalities.
Results: Various IoT and smart devices pose a high risk to an individual’s privacy. The General Data Protection Regulation, although a relatively recent law, may not adequately regulate all instances and uses of this technology. Also, due to inadequate technological protection, abuse of such devices by unauthorized persons is possible and even likely.
Conclusions: The number of IoT and smart devices is rapidly increasing, and the number of IoT and smart home device security incidents is on the rise. The regulatory framework to ensure data controller and processor compliance needs to be improved in order to create a safer environment for new innovative IoT services and products without jeopardizing the rights and freedoms of data subjects. It is also important to increase homeowners’ awareness of potential security threats when using IoT and smart devices and services.


2021 ◽  
Vol 46 (3-4) ◽  
pp. 321-345
Author(s):  
Robert Grzeszczak ◽  
Joanna Mazur

Abstract: The development of automated decision-making technologies creates the threat of de-iuridification: the replacement of legal acts’ provisions with automated, technological solutions. The article examines how selected provisions of the General Data Protection Regulation concerning, among other things, data protection impact assessments, the right not to be subject to automated decision-making, information obligations, and the right of access are applied in the Polish national legal order. We focus on the institutional and procedural solutions regarding the involvement of expert bodies and other stakeholders in the process of specifying the norms included in the GDPR and their enforcement. We argue that the example of Poland shows that the solutions adopted in the GDPR do not shift the balance of regulatory power with regard to automated decision-making to other stakeholders and as such do not favor a more participative approach to the regulatory processes.

