The Role of the EU Fundamental Right to Data Protection in an Algorithmic and Big Data World

Author(s):  
Yordanka Ivanova
Keyword(s):  
Big Data

2017 ◽  
Vol 2 (Suppl. 1) ◽  
pp. 1-10
Author(s):  
Denis Horgan

In the fast-moving arena of modern healthcare, with its cutting-edge science, it is already vital, and will become more so, that stakeholders collaborate openly and effectively. Transparency, especially on drug pricing, is of paramount importance. There is also a need to ensure that regulations and legislation, covering for example the new, smaller clinical trials required to make personalised medicine work effectively, and the huge practical and ethical issues surrounding Big Data and data protection, are common, understood and enforced across the EU. With more integration, collaboration, dialogue and increased trust among everyone in the field, stakeholders can help mould the right frameworks, in the right place, at the right time. Once achieved, this will allow us all to work more quickly and more effectively towards creating a healthier, and thus wealthier, European Union.


2019 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Matteo La Torre ◽  
Vida Lucia Botes ◽  
John Dumay ◽  
Elza Odendaal

Purpose: Privacy concerns and data security are changing the risks that businesses and organisations face, and with them the accountability of all governance participants. This paper aims to investigate the role of external auditors within data protection practices and how that role is evolving in the current digital ecosystem.

Design/methodology/approach: By surveying the literature, the authors embrace a practice-oriented perspective to explain how data protection practices emerge, exist and occur, and examine the auditors' position within data protection.

Findings: Auditors need to align their tasks to the purpose of data protection practices. Accordingly, in accessing and using data, auditors are required to engage moral judgements and follow ethical principles that go beyond their legal responsibility. Simultaneously, their accountability extends to data protection ends, instilling confidence that security risks are properly managed. Owing to the changing technological conditions under which auditors operate, the traditional auditing tasks of hearing and verifying extend to new phenomena that create risks for businesses. Thus, within data protection practices, auditors are accountable for keeping interested parties informed about data security and privacy risks, continuing to transmit signals to users and instilling confidence in businesses.

Research limitations/implications: The normative level of the study is a research limitation, which calls for future empirical research on how Big Data and data protection are reshaping accounting and auditing practices.

Practical implications: This paper provides auditing standard setters and practitioners with insights into the redefinition of auditing practices in the era of Big Data.

Social implications: Recent privacy concerns at Facebook have sent warning signals across the world, to those charged with the governance of organisations, about the privacy risks posed by Big Data systems. Auditors need to understand these privacy issues to better serve their clients.

Originality/value: This paper contributes to triggering discussions and future research on data protection and privacy in accounting and auditing research, an emerging yet under-researched topic.


2021 ◽  
Vol 24 (6) ◽  
pp. 7-14
Author(s):  
Olga Potemkina

The article presents the EU Commission's legislative initiative to amend the current 2016 Regulation defining the powers and functions of the EU Agency for Law Enforcement Cooperation (Europol). The author cites the arguments used by the EU Commission in its decision to expand the functions and powers of the agency: the successful acquisition of new technologies by criminal gangs, and the challenges that digital threats pose for the law enforcement and judicial agencies of Member States, which find it difficult at the national level to properly process big data for the investigation of cross-border crimes. The article analyses the main thematic blocks of the new Regulation: enabling Europol to cooperate with private parties in the fight against criminal offences; empowering the agency to carry out preliminary processing of big and complex datasets; strengthening the role of Europol in the field of research and innovation; enabling Europol to enter alerts into the Schengen Information System, etc. The author believes that the expansion of Europol's operational powers brings it one step closer to a «European FBI», i.e. an organisation of a supranational nature. At the same time, the author cites the arguments of the reform's opponents, including political groups in the European Parliament and human rights organisations, which can be divided into two groups: a) under the pretext of ensuring security, the Commission legalises the current practice of Europol, which has gone beyond its current mandate; b) the new functions of the agency for processing big data pose a threat to citizens' rights.


2020 ◽  
pp. 161-180
Author(s):  
Aleksandra Pyka

This article deals with the data protection impact assessment (DPIA), a new obligation incumbent on the controller. The article presents the essence of the DPIA, the exclusions from the obligation to carry it out, the prerequisites that make a DPIA mandatory, the role of the data protection officer and the powers of the supervisory authority. The analysis of the legal provisions related to the impact assessment presented here does not refer to specific situations, owing to the wide scope for interpreting particular phrases contained in the General Regulation. Nevertheless, the article discusses conducting a data protection impact assessment as one of the most problematic obligations incumbent on the controller, one which in practice raises many doubts. The DPIA has been imprecisely regulated by the EU legislator, leaving controllers plenty of leeway to interpret the terms used in the General Regulation. In addition, carrying out a DPIA in practice (as a new obligation on entities setting the purposes and means of data processing) can be problematic because of the lack of harmonised methods for conducting a data protection impact assessment. Controllers cannot, however, delegate the DPIA to other entities involved in data processing, such as an entity processing personal data on behalf of another. Entities setting the purposes and means of data processing should take into account not only the provisions of the General Regulation but also the list of data processing operations that are mandatorily subject to a DPIA. Controllers that are obliged to carry out a data protection impact assessment may also be required by the supervisory authority to demonstrate how the assessment was carried out.
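To make the threshold question the article describes concrete, the following is a minimal sketch, assuming a simplified reading of Art. 35 GDPR and the Article 29 Working Party guidance; the class and function names are hypothetical, and the check is no substitute for the case-by-case legal judgement the article stresses.

```python
# Hypothetical sketch: a minimal screening check for whether a DPIA is
# likely mandatory under Art. 35 GDPR. The criteria paraphrase the
# regulation; the data structure and names are illustrative assumptions,
# not an official or complete legal test.
from dataclasses import dataclass


@dataclass
class ProcessingOperation:
    systematic_extensive_profiling: bool   # Art. 35(3)(a): automated decisions with legal or similar effects
    large_scale_special_categories: bool   # Art. 35(3)(b): Art. 9 sensitive data or Art. 10 criminal data
    large_scale_public_monitoring: bool    # Art. 35(3)(c): systematic monitoring of publicly accessible areas
    on_supervisory_authority_list: bool    # Art. 35(4): operation appears on the national authority's list


def dpia_likely_required(op: ProcessingOperation) -> bool:
    """Return True if any mandatory-DPIA trigger applies.

    A False result does not settle the matter: Art. 35(1) still requires
    a DPIA whenever processing is 'likely to result in a high risk' to the
    rights and freedoms of natural persons, which demands legal judgement.
    """
    return any([
        op.systematic_extensive_profiling,
        op.large_scale_special_categories,
        op.large_scale_public_monitoring,
        op.on_supervisory_authority_list,
    ])
```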


Author(s):  
Sandra Wachter ◽  
Brent Mittelstadt

Big Data analytics and artificial intelligence (AI) draw non-intuitive and unverifiable inferences and predictions about the behaviors, preferences, and private lives of individuals. These inferences draw on highly diverse and feature-rich data of unpredictable value, and create new opportunities for discriminatory, biased, and invasive decision-making. Concerns about algorithmic accountability are often actually concerns about the way in which these technologies draw privacy-invasive and non-verifiable inferences about us that we cannot predict, understand, or refute.

Data protection law is meant to protect people's privacy, identity, reputation, and autonomy, but is currently failing to protect data subjects from the novel risks of inferential analytics. The broad concept of personal data in Europe could be interpreted to include inferences, predictions, and assumptions that refer to or impact on an individual. If inferences are seen as personal data, individuals are granted numerous rights over them under data protection law. However, the legal status of inferences is heavily disputed in legal scholarship, and marked by inconsistencies and contradictions within and between the views of the Article 29 Working Party and the European Court of Justice.

As we show in this paper, individuals are granted little control and oversight over how their personal data is used to draw inferences about them. Compared to other types of personal data, inferences are effectively 'economy class' personal data in the General Data Protection Regulation (GDPR). Data subjects' rights to know about (Art 13-15), rectify (Art 16), delete (Art 17), object to (Art 21), or port (Art 20) personal data are significantly curtailed when it comes to inferences, often requiring a greater balance with controllers' interests (e.g. trade secrets, intellectual property) than would otherwise be the case. Similarly, the GDPR provides insufficient protection against sensitive inferences (Art 9) or remedies to challenge inferences or important decisions based on them (Art 22(3)).

This situation is not accidental. In standing jurisprudence the European Court of Justice (ECJ; Bavarian Lager, YS. and M. and S., and Nowak) and the Advocate General (AG; YS. and M. and S. and Nowak) have consistently restricted the remit of data protection law to assessing the legitimacy of input personal data undergoing processing, and to rectifying, blocking, or erasing it. Critically, the ECJ has likewise made clear that data protection law is not intended to ensure the accuracy of decisions and decision-making processes involving personal data, or to make these processes fully transparent.

Conflict looms on the horizon in Europe that will further weaken the protection afforded to data subjects against inferences. Current policy proposals addressing privacy protection (the ePrivacy Regulation and the EU Digital Content Directive) fail to close the GDPR's accountability gaps concerning inferences. At the same time, the GDPR and Europe's new Copyright Directive aim to facilitate data mining, knowledge discovery, and Big Data analytics by limiting data subjects' rights over personal data. And lastly, the new Trade Secrets Directive provides extensive protection of commercial interests attached to the outputs of these processes (e.g. models, algorithms and inferences).

In this paper we argue that a new data protection right, the 'right to reasonable inferences', is needed to help close the accountability gap currently posed by 'high risk inferences', meaning inferences that are privacy-invasive or reputation-damaging and have low verifiability in the sense of being predictive or opinion-based. In cases where algorithms draw 'high risk inferences' about individuals, this right would require an ex-ante justification to be given by the data controller to establish whether an inference is reasonable. This disclosure would address (1) why certain data form a relevant basis from which to draw inferences; (2) why these inferences are relevant to the chosen processing purpose or type of automated decision; and (3) whether the data and methods used to draw the inferences are accurate and statistically reliable. The ex-ante justification is bolstered by an additional ex-post mechanism enabling unreasonable inferences to be challenged. A right to reasonable inferences must, however, be reconciled with EU jurisprudence and counterbalanced with IP and trade secrets law as well as freedom of expression and Article 16 of the EU Charter of Fundamental Rights: the freedom to conduct a business.
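As a rough illustration of the proposal's structure, here is a minimal sketch of how the three-part ex-ante justification and the 'high risk' trigger could be represented as a structured disclosure. The schema and all names are hypothetical assumptions for illustration, not a format proposed by the authors.

```python
# Hypothetical sketch: the three-part ex-ante justification that the
# proposed 'right to reasonable inferences' would require from a data
# controller before drawing high-risk inferences. Field names are
# illustrative assumptions, not a schema from the paper.
from dataclasses import dataclass


@dataclass
class ReasonableInferenceJustification:
    inference: str             # the inference drawn, e.g. a predicted attribute
    data_relevance: str        # (1) why the source data is a relevant basis for the inference
    purpose_relevance: str     # (2) why the inference is relevant to the processing purpose or automated decision
    reliability_evidence: str  # (3) whether the data and methods are accurate and statistically reliable


def is_high_risk(privacy_invasive: bool,
                 reputation_damaging: bool,
                 low_verifiability: bool) -> bool:
    """The paper's trigger, as stated in the abstract: inferences that are
    privacy-invasive or reputation-damaging AND have low verifiability
    (i.e. are predictive or opinion-based)."""
    return (privacy_invasive or reputation_damaging) and low_verifiability
```

On this reading, a controller would publish (or hold ready for review) one such justification per high-risk inference, with the ex-post challenge mechanism assessing whether the three answers actually hold.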

