The Digital Network of Networks: Regulatory Risk and Policy Challenges of Vaccine Passports

2021 ◽  
pp. 1-11
Author(s):  
Sara Helen Wilford ◽  
Neil McBride ◽  
Laurence Brooks ◽  
Damian Okaibedi Eke ◽  
Simisola Akintoye ◽  
...  

The extensive disruption to, and digital transformation of, travel administration across borders, largely due to COVID-19, mean that digital vaccine passports are being developed to resume international travel and kick-start the global economy. Currently, a wide range of actors are using a variety of different approaches and technologies to develop such a system. This paper considers the techno-ethical issues raised by the digital nature of vaccine passports and the application of leading-edge technologies such as blockchain in developing and deploying them. We briefly analyse four of the most advanced systems – IBM’s Digital Health Passport “Common Pass,” the International Air Transport Association’s Travel Pass, the Linux Foundation Public Health’s COVID-19 Credentials Initiative and the Vaccination Credential Initiative (Microsoft and Oracle) – and then consider the approach being taken for the EU Digital COVID Certificate. Each of these raises a range of issues, particularly relating to the General Data Protection Regulation (GDPR) and the need for standards and due diligence in the application of innovative technologies (e.g. blockchain) that will directly challenge policymakers attempting to regulate within the network of networks.

2019 ◽  
Vol 6 (1) ◽  
pp. 205395171986054 ◽  
Author(s):  
Heike Felzmann ◽  
Eduard Fosch Villaronga ◽  
Christoph Lutz ◽  
Aurelia Tamò-Larrieux

Transparency is now a fundamental principle for data processing under the General Data Protection Regulation. We explore what this requirement entails for artificial intelligence and automated decision-making systems. We address the topic of transparency in artificial intelligence by integrating legal, social, and ethical aspects. We first investigate the ratio legis of the transparency requirement in the General Data Protection Regulation and its ethical underpinnings, showing its focus on the provision of information and explanation. We then discuss the pitfalls with respect to this requirement by focusing on the significance of contextual and performative factors in the implementation of transparency. We show that the human–computer interaction and human–robot interaction literature does not provide clear results with respect to the benefits of transparency for users of artificial intelligence technologies due to the impact of a wide range of contextual factors, including performative aspects. We conclude by integrating the information- and explanation-based approach to transparency with the critical contextual approach, proposing that transparency as required by the General Data Protection Regulation in itself may be insufficient to achieve the positive goals associated with transparency. Instead, we propose to understand transparency relationally, where information provision is conceptualized as communication between technology providers and users, and where assessments of trustworthiness based on contextual factors mediate the value of transparency communications. This relational concept of transparency points to future research directions for the study of transparency in artificial intelligence systems and should be taken into account in policymaking.


Info ◽  
2014 ◽  
Vol 16 (3) ◽  
pp. 22-39 ◽  
Author(s):  
Rachel L. Finn ◽  
Kush Wadhwa

Purpose – This paper aims to study the ethics of “smart” advertising and regulatory initiatives in the consumer intelligence industry. Increasingly, online behavioural advertising strategies, especially in the mobile media environment, are being integrated with other existing and emerging technologies to create new techniques based on “smart” surveillance practices. These “smart” surveillance practices have ethical impacts including identifiability, inequality, a chilling effect, the objectification, exploitation and manipulation of consumers, as well as information asymmetries. This article examines three regulatory initiatives – privacy-by-design considerations, the proposed General Data Protection Regulation of the EU and the US Do-Not-Track Online Act of 2013 – that have sought to address the privacy and data protection issues associated with these practices.
Design/methodology/approach – The authors performed a critical literature review of academic, grey and journalistic publications surrounding behavioural advertising to identify the capabilities of existing and emerging advertising practices and their potential ethical impacts. This information was used to explore how well the proposed regulatory mechanisms might address current and emerging ethical and privacy issues in the emerging mobile media environment.
Findings – The article concludes that all three regulatory initiatives fall short of providing adequate consumer and citizen protection in relation to online behavioural advertising as well as “smart” advertising.
Originality/value – The article demonstrates that existing and proposed regulatory initiatives need to be amended to provide adequate citizen protection and describes how a focus on privacy and data protection does not address all of the ethical issues raised.


2020 ◽  
Author(s):  
Stuart McLennan ◽  
Leo Anthony Celi ◽  
Alena Buyx

The coronavirus disease (COVID-19) pandemic is very much a global health issue and requires collaborative, international health research efforts to address it. A valuable source of information for researchers is the large amount of digital health data that are continuously collected by electronic health record systems at health care organizations. The European Union’s General Data Protection Regulation (GDPR) will be the key legal framework with regard to using and sharing European digital health data for research purposes. However, concerns persist that the GDPR has made many organizations very risk-averse in terms of data sharing, even if the regulation permits such sharing. Health care organizations focusing on individual risk minimization threaten to undermine COVID-19 research efforts. In our opinion, there is an ethical obligation to use the research exemption clause of the GDPR during the COVID-19 pandemic to support global collaborative health research efforts. Solidarity is a European value, and here is a chance to exemplify it by using the GDPR regulatory framework in a way that does not hinder but actually fosters solidarity during the COVID-19 pandemic.


2020 ◽  
Vol 6 (4) ◽  
pp. 170-176
Author(s):  
Eric Wierda ◽  
Sebastiaan Blok ◽  
G Aernout Somsen ◽  
Enno T van der Velde ◽  
Igor I Tulevski ◽  
...  

Innovative ways of delivering healthcare such as m-Health, the practice of medicine through mobile and wearable devices, are promising new techniques that may improve quality of care at lower cost. While m-Health development is important, it raises challenges under privacy legislation. We address the legal framework, especially the General Data Protection Regulation, as applied to m-Health and its implications for m-Health developments in Europe. We discuss how these rules are applied using a representative example of an m-Health programme with remote monitoring in the Netherlands. We consider informing patients about the data processing and obtaining their explicit consent to be the main responsibilities of healthcare providers introducing m-Health in their practice.


Global Jurist ◽  
2018 ◽  
Vol 18 (2) ◽  
Author(s):  
Paul Quinn

Abstract Citizen science is an emerging trend with an ever greater number of adherents. It involves the collection and contribution of large amounts of data by private individuals for scientific research. Often such data will concern the individuals themselves and will be collected through processes of self-monitoring. This phenomenon has been greatly influenced by the Internet of Things (IoT) and the connectivity of a wide range of monitoring devices through the internet. Collecting such data often involves the services of various commercial organisations, for example providers of cloud storage. The possibility of data portability is extremely important in citizen science because it allows individuals (or data subjects) to move their data from one source to another (i.e. to new areas of scientific research). This article explores the limits and possibilities that legal rights to data portability offer, in particular the new right outlined in the European Union’s General Data Protection Regulation. In doing so, it examines where this right is able to facilitate the phenomenon of citizen science and how it operates in the international legal context.


Author(s):  
Bocong Yuan ◽  
Jiannan Li

The rapid development of digital health poses a critical challenge to the protection of patients’ personal health data. The European Union General Data Protection Regulation (EU GDPR) works in this context; it was passed in April 2016 and came into force in May 2018 across the European Union. This study is the first attempt to test the effectiveness of this legal reform for personal health data protection. Using the difference-in-difference (DID) approach, it empirically examines the policy influence of the GDPR on the financial performance of hospitals across the European Union. Results show that hospitals offering digital health services suffered financial distress after the GDPR was published in 2016. This reveals that during the transition period (2016–2018), hospitals across the European Union indeed made costly adjustments to meet the personal health data protection requirements introduced by the new regulation, and thus inevitably suffered a short-term policy shock to their financial performance. The implementation of the GDPR may therefore have achieved preliminary success.
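The difference-in-difference logic behind the study can be illustrated with a minimal sketch: the policy effect is the change in outcomes for treated units (here, hospitals with digital health services) minus the change for untreated controls over the same period. All numbers below are synthetic and purely illustrative, not taken from the study.

```python
# Minimal difference-in-difference (DID) estimator sketch.
# Synthetic, hypothetical figures only; not the study's data.

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """DID effect = (treated post - pre change) - (control post - pre change)."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treated_post) - mean(treated_pre)) - (
        mean(control_post) - mean(control_pre)
    )

# Hypothetical profit margins (%) before and after the GDPR's 2016 publication.
treated_pre = [5.0, 6.0, 5.5]    # digital-health hospitals, pre-2016
treated_post = [3.0, 4.0, 3.5]   # same hospitals, post-2016
control_pre = [5.0, 5.5, 6.0]    # hospitals without digital services, pre-2016
control_post = [4.8, 5.3, 5.8]   # same controls, post-2016

effect = did_estimate(treated_pre, treated_post, control_pre, control_post)
print(round(effect, 2))  # → -1.8
```

A negative estimate, as in this toy example, is the pattern the abstract describes: treated hospitals' performance fell more than controls', with the difference attributed to the policy under the usual parallel-trends assumption.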


2020 ◽  
Vol 33 (12) ◽  
pp. 828
Author(s):  
Marta Almada ◽  
Luis Midão ◽  
Diana Portela ◽  
Ines Dias ◽  
Francisco J. Núñez-Benjumea ◽  
...  

The digital era we are living in is transforming health, healthcare models and services, and the role of society in this new reality. We currently have a large amount of stored health data, including clinical, biometric, and scientific research data, yet its potential is not being fully exploited. It is essential to foster the sharing and reuse of these data not only in research but also in the development of health technologies, in order to improve healthcare efficiency as well as products, services and digital health apps, to promote preventive and individualized medicine, and to empower citizens in health literacy and self-management. In this sense, the FAIR concept has emerged, which implies that health data are findable, accessible, interoperable and reusable, facilitating interoperability between systems while ensuring the protection of personal and sensitive data. In this paper we review the FAIR concept, the ‘FAIRification’ process, FAIR data versus open access data, ethical issues and the General Data Protection Regulation, and digital health and citizen science.


Data & Policy ◽  
2020 ◽  
Vol 2 ◽  
Author(s):  
Giorgia Bincoletto

Abstract This study investigates the data protection concerns arising in the context of the cross-border interoperability of Electronic Health Record (EHR) systems in the European Union. The article first introduces the policies on digital health and examines the related interoperability issues. Second, it analyses the latest Recommendation of the European Commission on this topic. The study then discusses the rules and obligations set out by the General Data Protection Regulation that must be taken into account when developing interoperable EHRs. According to the data protection by design and by default provision, EHR systems should be designed ex ante to guarantee compliance with data protection rules.


2020 ◽  
Vol 27 (3) ◽  
pp. 242-258
Author(s):  
Fedele Bonifazi ◽  
Elisabetta Volpe ◽  
Giuseppe Digregorio ◽  
Viviana Giannuzzi ◽  
Adriana Ceci

Abstract The use of machine learning (ML) in medicine is becoming increasingly fundamental for analysing complex problems, discovering associations among different types of information, and generating knowledge for medical decision support. Many regulatory and ethical issues should be considered. Some relevant EU provisions, such as the General Data Protection Regulation, are applicable. However, the regulatory framework for developing and marketing a new health technology implementing ML may be quite complex. Other issues include legal liability and the attribution of negligence in case of errors. Some of the above-mentioned concerns could be at least partially resolved if the ML software is classified as a ‘medical device’, a category covered by EU and national provisions. In conclusion, the challenge is to understand how sustainable the regulatory system is in relation to ML innovation and how legal procedures should be revised to adapt to the current regulatory framework.


2021 ◽  
Vol 8 (1) ◽  
pp. 205395172110187
Author(s):  
Luca Marelli ◽  
Giuseppe Testa ◽  
Ine van Hoyweghen

The emergence of a global industry of digital health platforms operated by Big Tech corporations, and its growing entanglements with academic and pharmaceutical research networks, raise pressing questions about the capacity of current data governance models and regulatory and legal frameworks to safeguard the sustainability of the health research ecosystem. In this article, we direct our attention toward the challenges faced by the European General Data Protection Regulation in regulating the potentially disruptive engagement of Big Tech platforms in health research. The General Data Protection Regulation upholds a rather flexible regime for scientific research through a number of derogations to otherwise stricter data protection requirements, while providing a very broad interpretation of the notion of “scientific research”. It is precisely the breadth of these exemptions, combined with the ample scope of this notion, that could provide unintended leeway to the health data processing activities of Big Tech platforms, which have not been immune from carrying out privacy-infringing and socially disruptive practices in the health domain. We thus discuss further finer-grained demarcations to be traced within the broadly construed notion of scientific research, geared to implementing use-based data governance frameworks that distinguish health research activities that should benefit from a facilitated data protection regime from those that should not. We conclude that a “re-purposing” of big data governance approaches in health research is needed if European nations are to promote research activities within a framework of high safeguards for both individual citizens and society.

