A Process-based Approach to Informational Privacy and the Case of Big Medical Data

2019 ◽  
Vol 20 (1) ◽  
pp. 257-290 ◽  
Author(s):  
Michael Birnhack

Abstract Data protection law has a linear logic, in that it purports to trace the lifecycle of personal data from creation to collection, processing, transfer, and ultimately its demise, and to regulate each step so as to promote the data subject's control thereof. Big data defies this linear logic, in that it decontextualizes data from its original environment and conducts an algorithmic, nonlinear mix, match, and mine analysis. Applying data protection law to the processing of big data does not work well, to say the least. This Article examines the case of big medical data. A survey of emerging research practices indicates that studies either ignore data protection law altogether or assume an ex post position, namely that because they are conducted after the data has already been created in the course of providing medical care, and because they use de-identified data, they fly under the radar of data protection law. These studies focus on the end-point of the lifecycle of big data: if the data is sufficiently anonymous at publication, the previous steps are overlooked, on the claim that they enjoy immunity. I argue that this answer is too crude. To portray data protection law in its best light, we should view it as a process-based attempt to equip data subjects with some power to control personal data about them, in all phases of data processing. Such control reflects the underlying justification of data protection law as an implementation of human dignity. The process-based approach fits current legal practices and is justified in that it reflects dignitarian conceptions of informational privacy.
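
As a purely illustrative aside, the end-stage de-identification such studies rely on can be sketched in a few lines of Python. Everything here is an assumption for illustration: the field names, the coarsening rules, and the premise that dropping direct identifiers suffices. Real regimes (e.g. HIPAA Safe Harbor or k-anonymity) impose far stricter requirements.

```python
# A minimal, hypothetical de-identification pass: drop direct identifiers
# and coarsen quasi-identifiers. Field names are invented for illustration.
def deidentify(record: dict) -> dict:
    direct_identifiers = {"name", "patient_id", "address"}
    out = {k: v for k, v in record.items() if k not in direct_identifiers}
    # Coarsen quasi-identifiers that could re-identify someone in combination.
    if "age" in out:
        out["age_band"] = f"{(out.pop('age') // 10) * 10}s"  # e.g. 47 -> "40s"
    if "zip" in out:
        out["zip3"] = out.pop("zip")[:3]  # keep only a 3-digit prefix
    return out

record = {"name": "A. Patient", "patient_id": "P-77",
          "age": 47, "zip": "90210", "diagnosis": "I10"}
print(deidentify(record))
# -> {'diagnosis': 'I10', 'age_band': '40s', 'zip3': '902'}
```

The sketch also makes the Article's point visible: nothing in this end-stage transformation says anything about how the data was created, collected, or transferred earlier in its lifecycle.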

2020 ◽  
Vol 30 (Supplement 5) ◽
Author(s):  
J Doetsch ◽  
I Lopes ◽  
R Redinha ◽  
H Barros

Abstract The use and exchange of "big data" are at the forefront of the data science agenda, and Record Linkage plays a prominent role in biomedical research. In an era of ubiquitous data exchange and big data, Record Linkage is almost inevitable, but it raises ethical and legal problems, notably around personal data and privacy protection. Record Linkage refers to the merging of data from different sources to consolidate facts about an individual or an event that are not available in any single record. This article provides an overview of ethical challenges and research opportunities in linking routine data on health and education with cohort data from very preterm (VPT) infants in Portugal. Portuguese, European, and international law on data processing, protection, and privacy was reviewed. A three-stage analysis was carried out: i) the interplay of the three legal levels governing Record Linkage; ii) the impact of data protection and privacy rights on data processing; iii) the challenges and opportunities the data linkage process presents for research. A framework to discuss the process and its implications for data protection and privacy was created. The GDPR functions as the principal legal basis for the protection of personal data in Record Linkage, and explicit written consent is considered the appropriate basis for processing sensitive data. In Portugal, retrospective access to routine data is permitted if the data are anonymised; for health data, if the processing requirements declared with explicit consent are met; for education data, if the data processing rules are complied with. Routine health and education data can be linked to cohort data if the rights of the data subject and the requirements and duties of processors and controllers are respected. A strong ethical context, through the application of the GDPR in all phases of research, needs to be established to achieve Record Linkage between cohort data and routinely collected health and education records of VPT infants in Portugal.
Key messages: The GDPR is the most important legal framework for the protection of personal data; however, the discretion it leaves to Member States hampers Record Linkage processes among EU countries. The question remains whether data protection and privacy are adequately balanced across the three legal levels to guarantee freedom of research and the improvement of data subjects' health.
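
As a rough sketch of the linkage mechanism itself (not the authors' protocol), the following Python example joins two datasets through keyed pseudonyms instead of raw identifiers, so that a linkage unit can merge records without either data holder disclosing the identifier. The field names, the shared identifier, and the HMAC key are all hypothetical.

```python
# A hypothetical privacy-preserving linkage: each holder replaces the raw
# identifier with a keyed hash, and only the tokens are used for the join.
import hmac
import hashlib

LINKAGE_KEY = b"held-by-a-trusted-linkage-unit"  # hypothetical secret key

def pseudonymize(identifier: str) -> str:
    """Map a raw identifier to a stable token without revealing it."""
    return hmac.new(LINKAGE_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Invented example records sharing a common identifier.
health = [{"id": "PT-0001", "gestational_age_weeks": 30}]
education = [{"id": "PT-0001", "school_entry_age": 6}]

health_by_token = {pseudonymize(r["id"]): r for r in health}
education_by_token = {pseudonymize(r["id"]): r for r in education}

# Join on tokens present in both datasets; raw ids never leave their holders.
linked = [
    {**{k: v for k, v in health_by_token[t].items() if k != "id"},
     **{k: v for k, v in education_by_token[t].items() if k != "id"},
     "token": t}
    for t in health_by_token.keys() & education_by_token.keys()
]
print(linked)  # one consolidated record per linked individual
```

Note that under the GDPR such keyed pseudonymization still yields personal data, since the key permits re-identification; this is why the questions of legal basis and consent discussed above arise at the linkage stage and not only at publication.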


2021 ◽  
Vol 54 (1) ◽  
pp. 1-35
Author(s):  
Nikolaus Marsch ◽  
Timo Rademacher

German data protection laws all contain provisions that allow public authorities to process personal data whenever this is 'necessary' for the respective authority to fulfil its tasks or, in the case of sensitive data within the meaning of art. 9 GDPR, 'absolutely necessary'. In theory, data protection law thus provides a high degree of administrative flexibility, e.g. to cope with unforeseen situations such as the coronavirus pandemic. However, these provisions, referred to in German doctrine as 'Generalklauseln' (general or 'catch-all' clauses), are hardly used, as legal orthodoxy assumes that they are too vague to form a sufficiently clear legal basis for public-purpose processing under the strict terms of the German fundamental right to informational self-determination (arts. 2(1) and 1(1) of the German Basic Law). As this orthodoxy appears to be supported by case law of the German Constitutional Court, legislators have dutifully reacted by creating a plethora of sector-specific laws and provisions to enable data processing by public authorities. As a consequence, German administrative data protection law has become highly detailed and confusing, even for legal experts, thereby betraying the very purpose of legal clarity and foreseeability that scholars intended to foster by requiring ever more detailed legal bases. In our paper, we examine the reasons that underlie the German 'ban' on using the 'Generalklauseln'. We conclude that these reasons do not justify the ban in general, but only in specific areas and/or processing situations, such as security and criminal law. Finally, we list several arguments that speak in favour of a more 'daring' approach to using the 'Generalklauseln' for public-purpose data processing.


Cyber Crime ◽  
2013 ◽  
pp. 300-309
Author(s):  
Anna Tsiftsoglou

In July 2009, the Greek Data Protection Authority (DPA) was asked to review proposed legislation that would exempt personal data processing via camera installations in public spaces from the scope of the Greek Data Protection Law 2472/1997. Such an exemption was justified on grounds including the protection of public safety and crime prevention. This paper examines the legitimacy of this security measure from two angles: European and Greek law. Furthermore, our analysis focuses on questions of privacy, the concept of public safety and its application, and the DPA's role in safeguarding citizens' privacy even in city streets.


Author(s):  
Jef Ausloos

Chapter 2 lays the groundwork for the rest of the book, clearly delineating the fundamental right to data protection, its relation to the GDPR, and the right to erasure within it. The historical overview demonstrates that the emergence of data protection is inherently tied to technological developments and the way these may amplify power asymmetries. It is also made clear that informational self-determination, or control over personal data, lies at the heart of the fundamental right to data protection as proclaimed in Article 8 of the Charter. This is a clear difference from the GDPR, which has a much wider remit, ie protecting all fundamental rights and freedoms whenever personal data is processed. Put differently, whereas Article 8 of the Charter safeguards a minimum level of control over one's personal data, the GDPR installs a fair balancing framework that safeguards any and all fundamental rights and freedoms as they are affected by the processing of personal data. The substantive provisions of the GDPR can be divided into four categories along the axes of ex ante v ex post and protective v empowerment measures (see the data protection matrix). The chapter ends by positioning the right to erasure within the GDPR's arsenal of ex post empowerment measures, describing its legislative history as well as its main benefits and drawbacks.


2018 ◽  
Vol 19 (5) ◽  
pp. 1269-1290 ◽  
Author(s):  
Anne de Hingh

Abstract As the use of the Internet and online platforms grows, the scale of personal data collection and processing, and the turnover it generates, have increased correspondingly. At the same time, public awareness that the Internet has turned into a genuine profiling and advertisement machine, as well as a powerful surveillance instrument, is growing. More people today are concerned about the ways in which public and private actors store and use private information. Many individuals note that they lose sight of the consequences once they give consent to the collection of their sometimes most intimate personal data. The Snowden revelations and the recent Facebook and Cambridge Analytica scandal have only reinforced this public awareness.
Objections against these data processing practices cannot be explained as breaches of data protection or privacy regulation alone. In this Article, it is argued that recently passed regulations fail to resolve the unease of data subjects because other, more fundamental values are at stake. A different or complementary ethical and legal framework is needed, first, to interpret this generally felt unease vis-à-vis current data practices and, second, to confront future developments on the data market. The concept of human dignity may offer a helpful perspective in this respect. In the context of data processing, human dignity is generally interpreted in a quite specific manner, as contributing to the empowerment and self-determination of autonomous individuals. It can be argued, however, that human dignity, in the context of the commodification and commoditization of online personal data, should be seen in a different, quite opposite, light. In sum, future regulation of privacy and data protection should shift its attention towards the more constraining dimensions of human dignity.


Author(s):  
Sandra Wachter ◽  
Brent Mittelstadt

Big Data analytics and artificial intelligence (AI) draw non-intuitive and unverifiable inferences and predictions about the behaviors, preferences, and private lives of individuals. These inferences draw on highly diverse and feature-rich data of unpredictable value, and create new opportunities for discriminatory, biased, and invasive decision-making. Concerns about algorithmic accountability are often actually concerns about the way in which these technologies draw privacy-invasive and non-verifiable inferences about us that we cannot predict, understand, or refute.
Data protection law is meant to protect people's privacy, identity, reputation, and autonomy, but it is currently failing to protect data subjects from the novel risks of inferential analytics. The broad concept of personal data in Europe could be interpreted to include inferences, predictions, and assumptions that refer to or impact on an individual. If inferences are seen as personal data, individuals are granted numerous rights over them under data protection law. However, the legal status of inferences is heavily disputed in legal scholarship, and marked by inconsistencies and contradictions within and between the views of the Article 29 Working Party and the European Court of Justice.
As we show in this paper, individuals are granted little control and oversight over how their personal data is used to draw inferences about them. Compared to other types of personal data, inferences are effectively 'economy class' personal data under the General Data Protection Regulation (GDPR). Data subjects' rights to know about (Arts 13-15), rectify (Art 16), delete (Art 17), object to (Art 21), or port (Art 20) personal data are significantly curtailed when it comes to inferences, often requiring a greater balance with controllers' interests (e.g. trade secrets, intellectual property) than would otherwise be the case. Similarly, the GDPR provides insufficient protection against sensitive inferences (Art 9) and insufficient remedies to challenge inferences or important decisions based on them (Art 22(3)).
This situation is not accidental. In standing jurisprudence the European Court of Justice (ECJ; Bavarian Lager, YS. and M. and S., and Nowak) and the Advocate General (AG; YS. and M. and S. and Nowak) have consistently restricted the remit of data protection law to assessing the legitimacy of input personal data undergoing processing, and to rectifying, blocking, or erasing it. Critically, the ECJ has likewise made clear that data protection law is not intended to ensure the accuracy of decisions and decision-making processes involving personal data, or to make these processes fully transparent.
Conflict looms on the horizon in Europe that will further weaken the protection afforded to data subjects against inferences. Current policy proposals addressing privacy protection (the ePrivacy Regulation and the EU Digital Content Directive) fail to close the GDPR's accountability gaps concerning inferences. At the same time, the GDPR and Europe's new Copyright Directive aim to facilitate data mining, knowledge discovery, and Big Data analytics by limiting data subjects' rights over personal data. And lastly, the new Trade Secrets Directive provides extensive protection for the commercial interests attached to the outputs of these processes (e.g. models, algorithms, and inferences).
In this paper we argue that a new data protection right, the 'right to reasonable inferences', is needed to help close the accountability gap currently posed by 'high-risk inferences', meaning inferences that are privacy-invasive or damaging to reputation and have low verifiability in the sense of being predictive or opinion-based. In cases where algorithms draw high-risk inferences about individuals, this right would require the data controller to provide an ex-ante justification establishing whether an inference is reasonable. This disclosure would address (1) why certain data are a relevant basis for drawing inferences; (2) why these inferences are relevant to the chosen processing purpose or type of automated decision; and (3) whether the data and methods used to draw the inferences are accurate and statistically reliable. The ex-ante justification is bolstered by an additional ex-post mechanism enabling unreasonable inferences to be challenged. A right to reasonable inferences must, however, be reconciled with EU jurisprudence and counterbalanced with IP and trade secrets law as well as freedom of expression and Article 16 of the EU Charter of Fundamental Rights: the freedom to conduct a business.


Author(s):  
Maja Nisevic

Manipulation through Big Data analytics enables the commercial exploitation of individuals on the basis of unfair commercial practices. Consequently, the concept of consumer protection is essential in the data-driven economy and a central issue for the effective protection of individuals in the Big Data age. Although the fields of consumer protection and data protection in the European Union (EU) have developed separately, there is an unambiguous relationship between them. While the GDPR plays a crucial role in protecting an individual's data when personal data are processed, Directive 2005/29/EC (UCPD) plays an essential role in protecting individuals from unfair commercial practices involving personal data processing. A vital aspect of the UCPD is the enforcement of issues related to consumer privacy. However, a much-debated question is whether the UCPD is fully effective when it comes to personal data processing. This paper examines case law on WhatsApp and Facebook in Italy, Germany, and the United Kingdom, and aims to draw conclusions on the applicability of the rules on unfair commercial practices to personal data processing.


2017 ◽  
Vol 2017 (1) ◽  
pp. 35-44
Author(s):  
Dawid Zadura

Abstract In the review below, the author presents a general overview of selected contemporary legal issues related to the current growth of the aviation industry and the development of aviation technologies. The review focuses on questions at the intersection of aviation law and personal data protection law. Massive processing of passenger data (Passenger Name Records, PNR) in IT systems is a daily activity for the contemporary aviation industry. At the same time, since the mid-1990s we have observed the rapid growth of personal data protection law as a very new branch of the law. The importance of this new branch of the law for the aviation industry remains, however, questionable and unclear. This article summarizes the author's own research conducted between 2011 and 2017, in particular his audits at LOT Polish Airlines (June 2011-April 2013) and Lublin Airport (July-September 2013) and his analyses of public information shared by the International Civil Aviation Organization (ICAO), the International Air Transport Association (IATA), the Association of European Airlines (AEA), the Polish Civil Aviation Authority (ULC), and the Polish Inspector General for Personal Data Protection (GIODO). The purpose of the research was to determine the applicability and implementation of the technical and organizational measures established by personal data protection law in aviation industry entities.

