Oblivion Is Full of Memory

Author(s):  
Ana Belén Casares Marcos

The right to informational self-determination has provoked bitter debate over the last decade as to the appropriateness and possible scope of a right to demand the removal from the internet of personal information that, although accurate, may cause a detriment the data subject has no legal duty to bear. The leading case on this topic is that of Mario Costeja (Judgment of the Court of Justice of the EU of 13 May 2014). The interest of recent European jurisprudence lies not so much in the recognition of such a right as in the appreciation of certain limits to its implementation, assisting data protection authorities in balancing the rights at stake in each case. Reflection on the current state of the issue considers the rights and duties imposed in this matter by Regulation (EU) 2016/679 of 27 April 2016, known as the new General Data Protection Regulation.

2020, Vol. 9 (1), pp. 86–101
Author(s):  
Aleksandra Gebuza

The main aim of the article is to provide an analysis of the notion of the right to be forgotten developed by the CJEU in the ruling Google v. AEPD & Gonzalez and by the General Data Protection Regulation, within the context of the processing of personal data on the Internet. The analysis compares the approach to the notion in European and American jurisprudence and doctrine, in order to demonstrate the scale of difficulty in applying the concept in practice.


2017, Vol. 19 (5), pp. 765–779
Author(s):  
Milda Macenaite

The new European Union (EU) General Data Protection Regulation aims to adapt children's right to privacy to the 'digital age'. It explicitly recognizes that children deserve specific protection of their personal data, and it introduces additional rights and safeguards for children. This article explores the dilemmas that the introduction of the child-tailored online privacy protection regime creates – the 'empowerment versus protection' and the 'individualized versus average child' dilemmas. It concludes that, by favouring protection over the empowerment of children, the Regulation risks limiting children's online opportunities, and that, by relying on an average-child criterion, it fails to consider the evolving capacities and best interests of the child.


Author(s):  
Giovanni Sartor

This chapter explores the connection between host providers' liability and data protection, particularly the right to be forgotten. A conceptual analysis introduces the basic ideas involved, including privacy, publicity, and neutrality. Subsequently, host providers' immunities in EU law are compared with the safe harbour provisions of US law. Data protection exceptionalism, namely the view that providers' immunities do not apply to violations of data protection, is critically considered. Knowledge of the illegality of hosted content as a condition for providers' liability is examined, focusing on how different understandings of this requirement may affect providers' behaviour. The EU General Data Protection Regulation is then considered, addressing the way it defines the interface between data protection and the role and liabilities of providers. Finally, an analysis of the right to be forgotten is proposed, focusing on how the passage of time affects the legally relevant interests involved and on how sanctions are likely to affect the actions of host providers and users.


Author(s):  
Lilian Edwards
Michael Veale

Cite as Lilian Edwards and Michael Veale, 'Slave to the Algorithm? Why a "right to an explanation" is probably not the remedy you are looking for' (2017) 16 Duke Law and Technology Review 18–84. (First posted on SSRN 24 May 2017.)

Algorithms, particularly machine learning (ML) algorithms, are increasingly important to individuals' lives, but have caused a range of concerns revolving mainly around unfairness, discrimination and opacity. Transparency in the form of a "right to an explanation" has emerged as a compellingly attractive remedy, since it intuitively promises to "open the black box" to promote challenge, redress and, hopefully, heightened accountability. Amidst the general furore over algorithmic bias we describe, any remedy in a storm has looked attractive.

However, we argue that a right to an explanation in the EU General Data Protection Regulation (GDPR) is unlikely to present a complete remedy to algorithmic harms, particularly in some of the core "algorithmic war stories" that have shaped recent attitudes in this domain. Firstly, the law is restrictive, unclear, or even paradoxical concerning when any explanation-related right can be triggered. Secondly, even navigating this, the legal conception of explanations as "meaningful information about the logic of processing" may not be provided by the kind of ML "explanations" computer scientists have developed, partially in response. ML explanations are restricted by the type of explanation sought, the dimensionality of the domain, and the type of user seeking an explanation. However, "subject-centric" explanations (SCEs), which focus on particular regions of a model around a query, show promise for interactive exploration, as do explanation systems based on learning a model from outside rather than taking it apart (pedagogical versus decompositional explanations), in dodging developers' worries of IP or trade-secret disclosure.

Based on our analysis, we fear that the search for a "right to an explanation" in the GDPR may be at best distracting, and at worst nurture a new kind of "transparency fallacy". But all is not lost. We argue that other parts of the GDPR, relating (i) to the right to erasure (the "right to be forgotten") and the right to data portability, and (ii) to privacy by design, Data Protection Impact Assessments, and certification and privacy seals, may hold the seeds we can use to make algorithms more responsible, explicable, and human-centred.
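As an illustration of the "subject-centric", pedagogical style of explanation mentioned in the abstract above, the sketch below queries a trained model as a black box around a single data subject's record and fits a proximity-weighted linear surrogate to approximate the model's behaviour in that neighbourhood. This is a minimal, hypothetical Python example; the data, model, and function names are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of a subject-centric (local, pedagogical) explanation:
# approximate a black-box model around one query point with a linear surrogate.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical training data: 4 features, binary decision (e.g. a credit decision).
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] - X[:, 2] > 0).astype(int)
black_box = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

def subject_centric_explanation(model, x_query, n_samples=1000, kernel_width=1.0):
    """Fit a proximity-weighted linear surrogate to the model near x_query."""
    # Probe the black box with perturbations around the data subject's record.
    perturbed = x_query + rng.normal(scale=0.5, size=(n_samples, x_query.size))
    preds = model.predict_proba(perturbed)[:, 1]
    # Weight samples by closeness to the query, so the surrogate stays local.
    distances = np.linalg.norm(perturbed - x_query, axis=1)
    weights = np.exp(-(distances ** 2) / (kernel_width ** 2))
    surrogate = Ridge(alpha=1.0).fit(perturbed, preds, sample_weight=weights)
    # The surrogate's coefficients act as per-feature importances for this subject.
    return surrogate.coef_

x_subject = X[0]
print(subject_centric_explanation(black_box, x_subject))
```

Because the surrogate is learned only from the model's inputs and outputs, it exemplifies the "pedagogical" approach noted in the abstract: nothing about the model's internal structure (and hence no trade secret or IP) needs to be disclosed to the person asking for an explanation.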


AJIL Unbound, 2020, Vol. 114, pp. 31–34
Author(s):  
Nani Jansen Reventlow

The General Data Protection Regulation (GDPR) imposes important transparency and accountability requirements on different actors who process personal data. This is great news for the protection of individual data privacy. However, given that “personal information and human stories are the raw material of journalism,” what does the GDPR mean for freedom of expression and especially for journalistic activity? This essay argues that, although EU states seem to have taken their data protection obligations under the GDPR seriously, efforts to balance this against the right to freedom of expression have been more uneven. The essay concludes that it is of key importance to ensure that the GDPR's safeguards for data privacy do not compromise a free press.


AJIL Unbound, 2020, Vol. 114, pp. 10–14
Author(s):  
Svetlana Yakovleva
Kristina Irion

The European Union's (EU) negotiating position on cross-border data flows, which the EU has recently included in its proposal for the World Trade Organization (WTO) talks on e-commerce, not only enshrines the protection of privacy and personal data as fundamental rights, but also creates a broad exception for a Member's restrictions on cross-border transfers of personal data. This essay argues that maintaining such a strong position in trade negotiations is essential for the EU to preserve the internal compatibility of its legal system when it comes to the right to protection of personal data under the EU Charter of Fundamental Rights (EU Charter) and the recently adopted General Data Protection Regulation (GDPR).


2020, Vol. 3 (1), p. 17
Author(s):  
Kajcsa Andrea

The changes brought about by the General Data Protection Regulation since May 2018 are complex and ambitious. The General Data Protection Regulation is one of the most wide-ranging pieces of legislation passed by the EU in recent years, and it introduces many concepts that have yet to be fully explored in practice, such as the right to be forgotten, data portability, and data breach notification. This paper analyzes the main obligations that public bodies in particular have now that the GDPR has entered into force, and evaluates the impact this legislative act has on the routine activities carried out by public authorities in Romania. To reach this goal, we refer to the obligations that are specific to public administration authorities as well as to those from which public bodies are exempted. We also analyze the national legislative measures adopted in Romania after the GDPR entered into force, and the degree to which these have particularized the way public bodies are permitted and obliged to process personal data in Romania.

