Competing Jurisdictions: Data Privacy Across the Borders

Author(s):  
Edoardo Celeste
Federico Fabbrini

Abstract
Borderless cloud computing technologies are exacerbating tensions between the European and other existing regulatory models for data privacy. On the one hand, in the European Union (EU), a series of data localisation initiatives are emerging with the objective of preserving Europe’s digital sovereignty, guaranteeing respect for EU fundamental rights, and preventing foreign law enforcement and intelligence agencies from accessing personal data. On the other hand, foreign countries are unilaterally adopting legislation requiring national corporations to disclose data stored in Europe, in this way bypassing jurisdictional boundaries grounded in physical data location. The chapter investigates this twofold dynamic, focusing particularly on the current friction between the EU data protection approach and the data privacy model of the United States (US) in the field of cloud computing.

2021
Vol 60 (1)
pp. 53–98
Author(s):  
Michael S. Aktipis
Ron B. Katwan

On July 16, 2020, the Court of Justice of the European Union (CJEU) issued its ruling in Data Protection Commissioner v. Facebook Ireland Limited and Maximillian Schrems, commonly known as Schrems II, invalidating the EU–U.S. Privacy Shield as a transfer mechanism under the EU's General Data Protection Regulation (GDPR) and creating significant legal uncertainty about the continued availability of another widely used transfer mechanism, Standard Contractual Clauses (SCCs), for transfers of EU personal data from commercial entities in the EU to the United States. The widely anticipated ruling marked the second time in five years that the CJEU had invalidated the legal foundation for such data transfers, which in both cases had been the result of a carefully negotiated compromise balancing European data privacy concerns with statutory and constitutional limitations of the U.S. system (see Schrems I).


2018
Vol 9 (3)
pp. 386–401
Author(s):  
Giulia Lasagni

With the advent of digital technologies, most people constantly carry in their pockets or personal belongings an increasing amount of information stored on mobile electronic devices (such as smartphones or smartwatches, to mention just a few). Most of these ‘multifunctional computers that just happen to have telephone capabilities’ can store tens of gigabytes of private information, a circumstance simply unthinkable only a few decades ago. The consequences of this situation heavily affect criminal investigations and appear especially evident in searches incident to arrest. Indeed, while in a predigital era searching a person meant searching a physical body and, potentially, carried physical items, applying the same rules to smartphones or other equivalent devices drastically changes the impact of this investigative technique and gives law enforcement and prosecutors access to an enormous amount of personal data. Search incident to arrest, however, represents only the tip of the iceberg of the revolution brought to criminal justice systems by digital technology, for which most legal frameworks remain utterly unprepared. Against this background, this article compares the state of play on procedural safeguards concerning searches of digital devices such as smartphones in the United States, after the landmark decision in Riley v. California, with the Italian legal system. From this specific comparison, general considerations are drawn on the need to rethink the foundational basis of the fundamental rights and freedoms established by the European Convention on Human Rights and by the Charter of Fundamental Rights of the European Union in light of the advent of digital technology, seeking to delineate guidelines from which to derive procedural rules capable of guaranteeing an adequate level of safeguards in the digital era.


2017
Vol 107
pp. 181–193
Author(s):  
Sylwia Majkowska-Szulc

EU–U.S. PRIVACY SHIELD AFTER A COLLISION IN THE “SAFE HARBOUR”: THE SCOPE OF PRIVACY PROTECTION AFTER THE JUDGEMENT IN THE C-362/14 SCHREMS CASE
Transfer of personal data is an essential element of the transatlantic trade relationship, because the EU and the United States are each other's most important trading partners. Data transfers increasingly form an integral part of their commercial exchanges. The Court of Justice of the European Union ruling of 6 October 2015 in case C-362/14 Schrems reaffirmed the importance of the fundamental right to the protection of personal data, as enshrined in the Charter of Fundamental Rights of the EU, including when such data are transferred outside the EU. In the wake of this judgement, transatlantic data transfer has been regulated anew: the European Commission launched the EU–U.S. Privacy Shield in order to ensure stronger protection for transatlantic data flows. This article aims to analyse the importance and results of the above-mentioned judgement.


Author(s):  
Anastasia Kozyreva
Philipp Lorenz-Spreen
Ralph Hertwig
Stephan Lewandowsky
Stefan M. Herzog

Abstract
People rely on data-driven AI technologies nearly every time they go online, whether they are shopping, scrolling through news feeds, or looking for entertainment. Yet despite their ubiquity, personalization algorithms and the associated large-scale collection of personal data have largely escaped public scrutiny. Policy makers who wish to introduce regulations that respect people’s attitudes towards privacy and algorithmic personalization on the Internet would greatly benefit from knowing how people perceive personalization and personal data collection. To contribute to an empirical foundation for this knowledge, we surveyed public attitudes towards key aspects of algorithmic personalization and people’s data privacy concerns and behavior using representative online samples in Germany (N = 1065), Great Britain (N = 1092), and the United States (N = 1059). Our findings show that people object to the collection and use of sensitive personal information and to the personalization of political campaigning and, in Germany and Great Britain, to the personalization of news sources. Encouragingly, attitudes are independent of political preferences: People across the political spectrum share the same concerns about their data privacy and show similar levels of acceptance regarding personalized digital services and the use of private data for personalization. We also found an acceptability gap: People are more accepting of personalized services than of the collection of personal data and information required for these services. A large majority of respondents rated, on average, personalized services as more acceptable than the collection of personal information or data. The acceptability gap can be observed at both the aggregate and the individual level. Across countries, between 64% and 75% of respondents showed an acceptability gap.
Our findings suggest a need for transparent algorithmic personalization that minimizes use of personal data, respects people’s preferences on personalization, is easy to adjust, and does not extend to political advertising.


Author(s):  
Francisco García Martínez

The creation of the General Data Protection Regulation (GDPR) constituted an enormous advance in data privacy, empowering online consumers, who had been doomed to the complete loss of control over their personal information. Although it may at first seem that it only affects companies within the European Union, the regulation clearly states that every company that does business in the EU must be compliant with the GDPR. Other non-EU countries, like the United States, have seen the benefits of the GDPR and are already developing their own privacy laws. In this article, the most important changes introduced by the GDPR concerning US corporations are discussed, as well as how American companies can become compliant with the regulation. In addition, a comparison between the GDPR and the state of the art of privacy in the US is presented, highlighting similarities and disparities at the national level and in states of particular interest.


Author(s):  
Peter O’Connor

The Web provides unprecedented opportunities for Web site operators to implicitly and explicitly gather highly detailed personal data about site visitors, resulting in a real and pressing threat to privacy. Approaches to protecting such personal data differ greatly throughout the world. To generalize greatly, most countries follow one of two diametrically opposed philosophies: the self-regulation approach epitomized by the United States, or the comprehensive omnibus legislative approach mandated by the European Union. In practice, of course, the situation is not so black and white, as most countries utilize elements of both approaches. This chapter explains the background and importance of protecting the privacy of personal data, contrasts the two major philosophical approaches to protection mentioned above, performs a comparative analysis of the current situation throughout the world, and highlights how the legislative approach is being adopted as the de facto standard worldwide. The use of trust marks as an alternative to the self-regulation and legislative approaches is also discussed, and the effectiveness of each of these efforts is examined.


2020
Vol 11 (3)
pp. 375–389
Author(s):  
Isadora Neroni Rezende

Since 2019, over 600 law enforcement agencies across the United States have started using a groundbreaking facial recognition app designed by Clearview AI, a tech start-up which now plans to market its technology in Europe as well. While the Clearview app is an expression of the wider phenomenon of the repurposing of privately held data in the law enforcement context, its use in criminal proceedings is likely to encroach on individuals’ rights in unprecedented ways. Indeed, the Clearview app goes far beyond traditional facial recognition tools. While these have historically been limited to matching against government-stored images, Clearview combines its technology with a database of over three billion images published on the Internet. Against this background, this article reviews the use of this new investigative tool in light of the European Union (EU) legal framework on privacy and data protection. The proposed assessment proceeds as follows. Firstly, it briefly assesses the lawfulness of Clearview AI’s data scraping practices under the General Data Protection Regulation. Secondly, it discusses the transfer of scraped data from the company to EU law enforcement agencies under the regime of Directive 2016/680/EU (the Police Directive). Finally, it analyses the compliance of the Clearview app with art 10 of the Police Directive, which lays down the criteria for lawful processing of biometric data. More specifically, this last analysis focuses on the strict necessity test, as defined in the Charter of Fundamental Rights of the European Union and the European Convention on Human Rights. Following this assessment, it is argued that the Clearview app’s use in criminal proceedings is highly problematic in light of the EU legislation on privacy and data protection.


Author(s):  
Araz Poladov

Purpose of research: to define the general characteristics of the protection of personal data; to analyse legislation and case law.
Methods of research: analysis and study of regulatory documents containing provisions on the protection of personal data.
Results: the normative and practical importance of personal data protection provisions in various legal acts has been underscored. The right to privacy strengthened its position in the United States in the late 19th century and is now recognized by most states. Although the right to privacy in the United States was originally a British political legacy, judicial decisions in England were more conservative and cautious than those of U.S. courts. One of the important features of this law in the Anglo-Saxon legal system is that it was previously formed by judicial precedents and legal doctrine. Also, the right to privacy was not among the rights provided for in the Bill of Rights. In general, there is a sector-by-sector approach to data privacy in the United States: there is no specific federal law that would guarantee the confidentiality and protection of personal data. Instead, legislation at the federal level is dispersed and aims to protect data in certain sectors. Judicial practice and court decisions taken at different times play an important role in regulating personal data protection in the United States. It is also worth mentioning that until the 1970s, decisions of U.S. courts did not provide the necessary privacy protection safeguards.
Discussion: a comprehensive and detailed study of this practice is offered for use in other states.


Author(s):  
Pamela Samuelson

For more than two decades, internet service providers (ISPs) in the United States, the European Union (EU), and many other countries have been shielded from copyright liability under “safe harbor” rules. These rules apply to ISPs who did not know about or participate in user-uploaded infringements and who take infringing content down after receiving notice from rights holders. Major copyright industry groups were never satisfied with these safe harbors, and their dissatisfaction has become more strident over time as online infringements have grown to scale. Responding to copyright industry complaints, the EU in 2019 adopted its Directive on Copyright and Related Rights in the Digital Single Market. In particular, the Directive’s Article 17 places much stricter obligations on for-profit ISPs that host large amounts of user content. Article 17 is internally contradictory, deeply ambiguous, and harmful to small and medium-sized companies as well as to user freedoms of expression. Moreover, Article 17 may well violate the European Charter of Fundamental Rights. In the United States, Congress commenced a series of hearings in 2020 on the safe harbor rules now codified as 17 U.S.C. § 512 of the Digital Millennium Copyright Act (DMCA). In May 2020, the U.S. Copyright Office issued its long-awaited study on Section 512, which recommended several significant changes to existing safe harbor rules. The Study’s almost exclusively pro–copyright industry stances on reform of virtually every aspect of the rules notably shortchange other stakeholder interests. Congress should take a balanced approach in considering any changes to the DMCA safe harbor rules. Any meaningful reform of ISP liability rules should consider the interests of a wide range of stakeholders: U.S.-based Internet platforms; smaller and medium-sized ISPs; startups; the hundreds of millions of Internet users who create and enjoy user-generated content (UGC) uploaded to these platforms; and the major copyright industries and individual creators who have been dissatisfied with the DMCA safe harbor rules.

