Cookie Banners and Privacy Policies: Measuring the Impact of the GDPR on the Web

2021 ◽  
Vol 15 (4) ◽  
pp. 1-42
Author(s):  
Michael Kretschmer ◽  
Jan Pennekamp ◽  
Klaus Wehrle

The General Data Protection Regulation (GDPR) has been in effect since May 2018. As one of the most comprehensive pieces of legislation concerning privacy, it sparked much discussion about the effect it would have on users and providers of online services in particular, due to the large amount of personal data processed in this context. Almost three years later, we are interested in revisiting this question to summarize the impact this new regulation has had on actors in the World Wide Web. Using Scopus, we obtain a vast corpus of academic work to survey studies related to changes on websites since and around the time the GDPR went into force. Our findings show that the emphasis on privacy has increased with respect to online services, but plenty of potential for improvement remains. Although online services are on average more transparent regarding data processing practices in their public privacy policies, a majority of these policies still either lack information required by the GDPR (e.g., contact information for users to file privacy inquiries) or do not provide this information in a user-friendly form. Additionally, we find that online services more often provide means for their users to opt out of data processing, but regularly obstruct convenient access to such means through unnecessarily complex and sometimes illegitimate interface design. Our survey further details that this situation contradicts the preferences expressed by users both verbally and through their actions, and that researchers have proposed multiple approaches to facilitate GDPR-conformant data processing without negatively impacting the user experience. Thus, we compile recurring points of criticism by privacy researchers and data protection authorities into a list of four guidelines for service providers to consider.
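
A recurring criticism compiled in this survey is that withdrawing consent is made harder than granting it. As a purely illustrative sketch (the package, type, and method names are our own, not drawn from the surveyed studies), a per-purpose consent store can make withdrawal exactly as easy as granting, in the spirit of GDPR Art. 7(3):

```go
// Package consent sketches a per-purpose consent record in which granting
// and withdrawing consent are symmetric, single-step operations -- one of
// the recurring demands behind the criticism summarized above. Illustrative
// only; not taken from any surveyed study.
package consent

import "time"

// Purpose identifies one data-processing purpose (e.g., analytics, ads).
type Purpose string

// Record tracks the user's current choice per purpose, plus an audit trail.
type Record struct {
	choices map[Purpose]bool
	history []event
}

type event struct {
	when    time.Time
	purpose Purpose
	granted bool
}

func New() *Record { return &Record{choices: make(map[Purpose]bool)} }

// Set grants or withdraws consent for one purpose. Withdrawal deliberately
// takes exactly the same call as granting, mirroring GDPR Art. 7(3): it
// shall be as easy to withdraw as to give consent.
func (r *Record) Set(p Purpose, granted bool) {
	r.choices[p] = granted
	r.history = append(r.history, event{time.Now(), p, granted})
}

// Allowed reports whether processing for a purpose is currently consented to;
// the zero value is false, i.e., no processing without an explicit opt-in.
func (r *Record) Allowed(p Purpose) bool { return r.choices[p] }
```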

2019 ◽  
Vol 30 (3) ◽  
pp. 607-625 ◽  
Author(s):  
Jan Hendrik Betzing ◽  
Matthias Tietz ◽  
Jan vom Brocke ◽  
Jörg Becker

Smart devices provide unprecedented access to users' personal information, on which businesses capitalize to offer personalized services. Although users must grant permission before their personal information is shared, they often do so without knowing the consequences of their decision. Based on the EU General Data Protection Regulation, which mandates that service providers comprehensively inform users about the purpose and terms of personal data processing, this article examines how increased transparency regarding personal data processing practices in mobile permission requests affects users' ability to make informed decisions. We conducted an online experiment with 307 participants to test the effect of transparency on users' decisions about, and comprehension of, the requested permission. The results indicate increased comprehension of data processing practices when privacy policies are transparently disclosed, whereas acceptance rates do not vary significantly. We condense our findings into principles that service providers can apply to design privacy-transparent mobile apps.
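
As a hedged sketch of what a privacy-transparent permission request might carry, the structure below bundles the technical permission with plain-language disclosures about purpose, recipients, and retention. All names are illustrative assumptions, not the paper's actual design principles:

```go
// Package permission sketches a mobile permission request enriched with
// GDPR Art. 13-style disclosures shown to the user up front. Field names
// are our own assumptions.
package permission

// Request pairs the technical permission with the transparency information
// the experiment varied.
type Request struct {
	Permission  string   // e.g., "location"
	Purpose     string   // why the data is needed, in plain language
	Recipients  []string // third parties the data is shared with, if any
	Retention   string   // how long the data will be kept
	OptionalFor []string // features that still work if the user declines
}
```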


2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Iwona Karasek-Wojciechowicz

This article is an attempt to reconcile the requirements of the EU General Data Protection Regulation (GDPR) and the anti-money laundering and countering the financing of terrorism (AML/CFT) instruments used in permissionless ecosystems based on distributed ledger technology (DLT). Usually, analyses focus on only one of these regulations; covering the interplay between both reveals their incoherencies in relation to permissionless DLT. The GDPR requirements force permissionless blockchain communities to use anonymization or, at the very least, strong pseudonymization technologies to ensure that data processing complies with the GDPR. At the same time, instruments of global AML/CFT policy presently being implemented in many countries, following the recommendations of the Financial Action Task Force, counteract the anonymity-enhancing technologies built into blockchain protocols. Solutions suggested in this article aim to shape permissionless DLT-based networks in ways that both secure the protection of personal data according to the GDPR rules and address the money laundering and terrorist financing risks created by transactions in anonymous blockchain spaces or those with strong pseudonyms. Searching for new policy instruments is necessary to ensure that governments do not combat the development of all privacy-blockchains, so as to enable a high level of privacy protection and GDPR-compliant data processing. This article indicates two AML/CFT tools that may be helpful for shaping privacy-blockchains able to accommodate them. The first tool is exceptional government access to transactional data written on non-transparent ledgers, obfuscated by advanced anonymization cryptography. The tool should be optional for networks as long as other effective AML/CFT measures are accessible to intermediaries or to the government in relation to a given network. If these other measures are not available and the network does not grant exceptional access, the regulations should allow governments to combat the development of those networks. Effective tools in that scope should target the value of the privacy-cryptocurrency, not its users. Such tools could include, as a last resort, state attacks that would undermine the community's trust in a specific network.
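
For readers unfamiliar with the term, "strong pseudonymization" can be illustrated with a keyed hash: re-identification then requires a secret held off-ledger. This is a generic sketch under our own assumptions, not a technique the article endorses; whether any pseudonym on an immutable ledger satisfies the GDPR is precisely the tension the article examines:

```go
// Package pseudonym sketches keyed-HMAC pseudonymization: an identifier is
// replaced with a stable pseudonym that can only be linked back with a
// secret key kept off-ledger. Illustrative; not the article's proposal.
package pseudonym

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/hex"
)

// Pseudonymize maps an identifier to a stable pseudonym under a secret key.
// Without the key, linking the pseudonym back to the identifier requires a
// brute-force search over the identifier space.
func Pseudonymize(key []byte, identifier string) string {
	mac := hmac.New(sha256.New, key)
	mac.Write([]byte(identifier))
	return hex.EncodeToString(mac.Sum(nil))
}
```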


2021 ◽  
Vol 11 (22) ◽  
pp. 10574
Author(s):  
Sung-Soo Jung ◽  
Sang-Joon Lee ◽  
Ieck-Chae Euom

With the growing awareness of the importance of personal data protection, many countries have established laws and regulations to ensure data privacy and are supervising organizations' compliance with them. Although various studies have suggested methods of complying with the General Data Protection Regulation (GDPR) for personal data, no method exists that ensures the reliability and integrity of a data subject's personal data processing request records so that they can serve an auditor as proof in a GDPR compliance audit. In this paper, we propose a delegation-based personal data processing request notarization framework for the GDPR using a private blockchain. The proposed notarization framework allows the data subject to delegate personal data processing requests; the framework forwards the requests to the data controller, which performs the processing. The generated data processing request and processing result data are stored in the blockchain ledger and notarized via a trusted institution of the blockchain network. The Hyperledger Fabric implementation of the framework demonstrates the fulfillment of the system requirements and the feasibility of implementing a GDPR compliance audit for the processing of personal data. The analysis results, with comparisons among related works, indicate that the proposed framework provides better reliability and feasibility for the GDPR audit of personal data processing requests than extant methods.
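
A minimal sketch of how the framework's ledger writes might look as Hyperledger Fabric chaincode (the paper's implementation platform). The contract, struct, and method names below are our own assumptions, not the authors' code:

```go
// A sketch of chaincode that notarizes delegated processing requests and
// their results on a Fabric ledger for later GDPR audits. Names are
// illustrative assumptions.
package main

import (
	"encoding/json"
	"fmt"

	"github.com/hyperledger/fabric-contract-api-go/contractapi"
)

// ProcessingRequest is one delegated personal-data processing request plus
// its result, recorded immutably for auditors.
type ProcessingRequest struct {
	ID        string `json:"id"`
	Subject   string `json:"subject"`   // data subject (pseudonymous ID)
	Operation string `json:"operation"` // e.g., "erasure", "rectification"
	Result    string `json:"result"`    // outcome reported by the controller
}

type NotaryContract struct {
	contractapi.Contract
}

// RecordRequest notarizes a processing request and its result on the ledger.
func (c *NotaryContract) RecordRequest(ctx contractapi.TransactionContextInterface, id, subject, operation, result string) error {
	req := ProcessingRequest{ID: id, Subject: subject, Operation: operation, Result: result}
	data, err := json.Marshal(req)
	if err != nil {
		return err
	}
	return ctx.GetStub().PutState(id, data)
}

// GetRequest returns a notarized request so an auditor can verify it.
func (c *NotaryContract) GetRequest(ctx contractapi.TransactionContextInterface, id string) (*ProcessingRequest, error) {
	data, err := ctx.GetStub().GetState(id)
	if err != nil || data == nil {
		return nil, fmt.Errorf("request %s not found", id)
	}
	var req ProcessingRequest
	if err := json.Unmarshal(data, &req); err != nil {
		return nil, err
	}
	return &req, nil
}

func main() {
	cc, err := contractapi.NewChaincode(&NotaryContract{})
	if err != nil {
		panic(err)
	}
	if err := cc.Start(); err != nil {
		panic(err)
	}
}
```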


2021 ◽  
Vol 12 ◽  
Author(s):  
Dorota Krekora-Zając ◽  
Błażej Marciniak ◽  
Jakub Pawlikowski

Personal data protection has become a fundamental normative challenge for biobankers and scientists researching human biological samples and associated data. The General Data Protection Regulation (GDPR) harmonises the law on protecting personal data throughout Europe and allows the development of codes of conduct for processing personal data based on GDPR art. 40. Codes of conduct are a soft law measure for creating protective standards for data processing adapted to a specific area, among others the biobanking of human biological material. Challenges in this area have been noted by the European Data Protection Supervisor and by the Biobanking and BioMolecular Resources Research Infrastructure–European Research Infrastructure Consortium (BBMRI-ERIC). They concern mainly the specification of the definitions of the GDPR and the determination of the appropriate legal basis for data processing, particularly for transferring data to other European countries. The recommendations indicated in the article, which are based on the GDPR, on guidelines published by supervisory authorities and expert bodies, and on our experience creating the Polish code of conduct, should help shape how a code of conduct for processing personal data in biobanks is developed.


2012 ◽  
Vol 13 (2) ◽  
Author(s):  
Peter Traung

Among other things, the proposed General Data Protection Regulation aims to substantially reduce fragmentation, administrative burden, and cost, and to provide clear rules that simplify the legal environment. This article argues that considerable work is needed to achieve those goals, and that the proposal provides neither substantial legal certainty nor simplification; it adds administrative burden while leaving ample risk of fragmentation. In particular, the proposal misses the opportunity to strengthen data protection while achieving substantial simplification by abolishing the controller/processor distinction and allowing transfers with no reduction of the controller's liability. Large parts of the proposal depend entirely on clarification through delegated acts issued by the Commission, and the prospects of those being adopted look dire. Failing either delegated acts or substantial redrafting, those parts may become a dead letter or worse. There is a highly problematic obligation to "demonstrate compliance" with the law. The proportionate alternative to a number of other obligations on controllers, such as maintaining various documentation or appointing data protection officers, is to include such obligations as possible behavioural sanctions in case of a proven breach of the law. The proposal also appears to raise issues regarding freedom of movement. The impact assessment largely fails to demonstrate a need for, and net benefit from, the proposed additional obligations. It also appears to severely underestimate the costs of the proposals, partly due to what appear to be arithmetic errors. Interestingly, the proposal does put a rudimentary value on personal data, an approach that could be extended.


Author(s):  
Antonia Russo ◽  
Gianluca Lax ◽  
Baptiste Dromard ◽  
Menad Mezred

The General Data Protection Regulation highlights the principle of data minimization, which means that only the data required to successfully accomplish a given task should be processed. In this paper, we propose a blockchain-based scheme that allows users to control the personal data revealed when accessing a service. The proposed solution does not rely on sophisticated cryptographic primitives, provides mechanisms for revoking the authorization to access a service and for recovering the identity of a user only in cases of need, and is compliant with the recent eIDAS Regulation. We prove that the proposed scheme is secure and achieves the expected goal, and we present an Ethereum-based implementation to show the effectiveness of the proposed solution.
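
The authors implement their scheme on Ethereum; since their contract code is not reproduced here, the Go sketch below only mimics the core registry logic, granting a service access to a minimal attribute set, revoking it, and holding an encrypted identity for escrow, under our own naming assumptions:

```go
// Package registry mimics, off-chain and for illustration only, the kind of
// authorization registry the paper's Ethereum contract provides: minimal
// disclosure, revocation, and identity escrow "in cases of need".
package registry

import "errors"

// Grant is one authorization: which attributes a service may learn, and
// an encrypted identity openable only by an escrow authority.
type Grant struct {
	Attributes []string // minimal disclosed data, per data minimization
	Revoked    bool
	EscrowID   []byte // encrypted identity, for exceptional recovery only
}

type Registry struct {
	grants map[string]*Grant // key: user + "|" + service
}

func New() *Registry { return &Registry{grants: make(map[string]*Grant)} }

// Authorize records which attributes the user agrees to reveal to a service.
func (r *Registry) Authorize(user, service string, attrs []string, escrowID []byte) {
	r.grants[user+"|"+service] = &Grant{Attributes: attrs, EscrowID: escrowID}
}

// Revoke withdraws a previously granted authorization.
func (r *Registry) Revoke(user, service string) error {
	g, ok := r.grants[user+"|"+service]
	if !ok {
		return errors.New("no such grant")
	}
	g.Revoked = true
	return nil
}

// Disclosed returns only the attributes the user agreed to reveal, and
// nothing once the grant is revoked.
func (r *Registry) Disclosed(user, service string) ([]string, error) {
	g, ok := r.grants[user+"|"+service]
	if !ok || g.Revoked {
		return nil, errors.New("not authorized")
	}
	return g.Attributes, nil
}
```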


2019 ◽  
pp. 79-101 ◽  
Author(s):  
Aleksandra Pyka

This article addresses the issue of personal data processing conducted in connection with scientific research, in accordance with the provisions of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). It is not uncommon for scientific research to involve the processing of personal data, which entails the obligation to respect the rights of the data subjects involved. Entities conducting scientific research that process personal data for this purpose are required to apply the general regulation governing, among other things, the obligations imposed on controllers. The issue of personal data processing for scientific research purposes has also been regulated in national legislation in connection with the need to apply the General Data Protection Regulation. The article discusses the bases of the admissibility of data processing for the needs of scientific research; the provision of personal data regarding criminal convictions and offences extracted from public registers at the request of an entity conducting scientific research; the exercise of the rights of the data subjects concerned; and the implementation of appropriate technical and organizational measures to ensure the security of data processing. In addition, the article discusses the anonymization of personal data carried out after the purpose of the processing has been achieved, as well as the processing of special categories of personal data. The topics raised are not treated exhaustively, as that would require further elaboration in a publication of much wider scope.
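
The anonymization step mentioned above, carried out once the research purpose is achieved, can be sketched as an irreversible projection that drops direct identifiers and generalizes quasi-identifiers. All field names are illustrative, not drawn from the article:

```go
// Package anonymize sketches post-purpose anonymization: direct identifiers
// are not carried over anywhere, so the step cannot be undone. Illustrative
// field names; not from the article.
package anonymize

import "fmt"

// ResearchRecord as held during the study.
type ResearchRecord struct {
	Name        string
	Email       string
	BirthYear   int
	Diagnosis   string
	Measurement float64
}

// AnonymousRecord retains only what the analysis needs; note there is no
// field that could carry a name, email, or other direct identifier.
type AnonymousRecord struct {
	AgeBand     string // generalized from BirthYear, e.g., "40-49"
	Diagnosis   string
	Measurement float64
}

// Anonymize projects a record onto its non-identifying fields, generalizing
// birth year into a ten-year age band.
func Anonymize(r ResearchRecord, currentYear int) AnonymousRecord {
	band := "unknown"
	if age := currentYear - r.BirthYear; age >= 0 {
		lo := (age / 10) * 10
		band = fmt.Sprintf("%d-%d", lo, lo+9)
	}
	return AnonymousRecord{AgeBand: band, Diagnosis: r.Diagnosis, Measurement: r.Measurement}
}
```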


2021 ◽  
Vol 11 (2) ◽  
pp. 3-24
Author(s):  
Jozef Andraško ◽  
Matúš Mesarčík

The article focuses on the intersections between the regulation of electronic identification as provided in the eIDAS Regulation and data protection rules in the European Union. The first part of the article is devoted to explaining the basic notions and framework related to electronic identity in the European Union, i.e., the eIDAS Regulation. The second part discusses specific intersections of the eIDAS Regulation with the General Data Protection Regulation (GDPR), specifically its scope, the general data protection clause, and mainly personal data processing in the context of the mutual recognition of electronic identification means. The article aims to discuss the overlapping issues in the regulation of the GDPR and the eIDAS Regulation and to provide further guidance for interpreting and implementing the outcomes in practice.


2020 ◽  
pp. 161-180
Author(s):  
Aleksandra Pyka

This article deals with the data protection impact assessment (DPIA), a new obligation for the controller. The article presents the essence of the DPIA, the exclusions from the obligation to carry it out, the prerequisites for a mandatory DPIA, the role of the data protection officer, and the powers of the supervisory authority. The analysis of the legal provisions related to the impact assessment presented here does not refer to specific situations, due to the wide scope for interpreting specific phrases contained in the General Regulation. Nevertheless, the article discusses conducting data protection impact assessments as one of the most problematic obligations incumbent on the controller, one which in practice raises many doubts. The DPIA has been imprecisely regulated by the EU legislator, leaving controllers plenty of leeway to interpret the terms used in the General Regulation. In addition, carrying out a DPIA in practice (as a new obligation on the entities setting the purposes and means of data processing) can be problematic due to the lack of harmonized methods for conducting a data protection impact assessment. However, controllers cannot delegate the DPIA to other entities involved in data processing, such as an entity processing personal data on behalf of another. Entities setting the purposes and means of data processing should take into account not only the provisions of the General Regulation but also the list of data processing operations that are obligatorily subject to a DPIA. Controllers fulfilling the obligation to carry out a data protection impact assessment may also be required by the supervisory authority to demonstrate how it was carried out.
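
As a hedged illustration of the triage a controller faces, the sketch below encodes only the mandatory triggers of GDPR Art. 35(3); it deliberately omits the supervisory authority's national list of operations requiring a DPIA, which, as the article notes, must also be consulted:

```go
// Package dpia sketches a triage helper for the mandatory DPIA cases in
// GDPR Art. 35(3). Illustrative only; real assessments must also consult
// the supervisory authority's list, which this sketch does not encode.
package dpia

// Processing describes the traits relevant to the Art. 35(3) triggers.
type Processing struct {
	SystematicExtensiveProfiling bool // Art. 35(3)(a): automated evaluation with legal or similar effects
	LargeScaleSpecialCategories  bool // Art. 35(3)(b): Art. 9/10 data processed at scale
	PublicAreaMonitoring         bool // Art. 35(3)(c): systematic large-scale monitoring of public areas
}

// DPIARequired reports whether one of the mandatory triggers applies. A
// false result does not make a DPIA unnecessary: Art. 35(1) still requires
// one whenever processing is likely to result in a high risk.
func DPIARequired(p Processing) bool {
	return p.SystematicExtensiveProfiling ||
		p.LargeScaleSpecialCategories ||
		p.PublicAreaMonitoring
}
```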

