Analysis of the NHSX Contact Tracing App ‘Isle of Wight’ Data Protection Impact Assessment

Author(s):  
Michael Veale

This note examines the published data protection impact assessment (DPIA) released by NHSX in relation to its contact tracing/proximity tracing app. It highlights a range of significant issues which leave the app falling short of data protection legislation, so that these issues can be remedied before the next DPIA is published. The main issues this note focuses on are the following:

Personal data
- The DPIA must not claim this data is anonymous, or that the app preserves anonymity, as under UK law it does not.
- The document (and associated public messaging) must be changed throughout to reflect the fact that it is not the case that personal data about a user is only uploaded with a user’s permission, as other people upload data revealing a user’s social interactions.

User rights
- The lawful basis for a blanket refusal of the right to erasure is unspecified by NHSX in this DPIA.
- The NHSX App unlawfully designs out the right to access when there is a legal obligation to design it in.
- If the controller plans, as with the rights to erasure and access, to refuse all attempts to exercise the right to object, this needs a justification in the DPIA.

Monitoring and automated decision-making
- The DPIA must acknowledge that the NHSX App systematically monitors publicly accessible spaces.
- The DPIA does not set out a valid lawful basis for the solely automated, significant decision-making it correctly identifies as occurring.
- The information describing the logic of automated decisions, contained in the document embedded in the DPIA, must be provided under GDPR, Article 13.

Prior consultation and e-Privacy
- The Information Commissioner must be consulted prior to processing within the meaning of GDPR, Article 36, not just briefed.
- The DPIA should explain how the Privacy and Electronic Communications Regulations are complied with, both in relation to Bluetooth usage and in relation to embedded trackers.

The note does not consider alternative architectures or less intrusive means to achieve the purposes of the NHSX app, although these are critical issues that this DPIA could be argued to have failed to assess. This note cannot assess the risks of the app as set out in the DPIA, as all the risks have been redacted.

Author(s):  
Lee A. Bygrave ◽  
Luca Tosoni

Article 4(1) (Definition of ‘personal data’) (see too recital 26); Article 4(15) (Definition of ‘data concerning health’) (see also recital 35); Article 4(16) (Definition of ‘biometric data’) (see too recital 51); Article 9(1) (Processing of special categories of personal data) (see also recital 53); Article 22(4) (Automated individual decision-making, including profiling) (see also recital 71); Article 35(3)(b) (Data protection impact assessment) (see too recital 91).


2021 ◽  
Vol 16 (2) ◽  
pp. 63-75
Author(s):  
Denitza Toptchiyska

During the COVID-19 pandemic, in April 2020, the Ministry of Health in Bulgaria began administering the Virusafe contact tracing application. With the Law on Emergency Measures and Actions, adopted following the declaration of a state of emergency by decision of the National Assembly of 13 March 2020, amendments to the Electronic Communications Act were adopted. The purpose of the legislative amendments was to provide the competent authorities with access to localization data from the public electronic communication networks of individuals who have refused or do not fulfil the obligatory isolation or treatment under Article 61 of the Health Act. This publication aims to analyze the main features of mobile applications for tracing the contacts of infected persons, as well as the adopted legislative changes, comparing them with the standards of personal data protection provided in the EU General Data Protection Regulation 2016/679 and Directive 2002/58/EC on privacy and electronic communications.


Author(s):  
Lee A. Bygrave ◽  
Luca Tosoni

Article 4(1) (Definition of ‘personal data’) (see too recital 26); Article 4(13) (Definition of ‘genetic data’) (see also recital 34); Article 4(16) (Definition of ‘biometric data’) (see too recital 51); Article 9(1) (Processing of special categories of personal data) (see also recital 53); Article 22(4) (Automated individual decision-making, including profiling) (see also recital 71); Article 35(3)(b) (Data protection impact assessment) (see too recital 91).


Author(s):  
Lee A. Bygrave

Article 3(2)(b) (Monitoring of data subjects’ behaviour); Article 5 (Principles relating to processing of personal data); Article 6 (Legal grounds for processing of personal data); Article 8 (Conditions applicable to children’s consent in relation to information society services) (see also recital 38); Article 13(2)(f) (Information on the existence of automated decision-making, including profiling) (see also recital 60); Article 14(2)(g) (Information on the existence of automated decision-making, including profiling) (see also recital 60); Article 15(1)(h) (Right of access regarding automated decision-making, including profiling) (see also recital 63); Article 21 (Right to object) (see also recital 70); Article 22 (Automated decision-making, including profiling) (see also recital 71); Article 23 (Restrictions) (see also recital 73); Article 35(3)(a) (Data protection impact assessment) (see also recital 91); Article 47(2)(e) (Binding corporate rules); Article 70(1)(f) (EDPB guidelines on automated decisions based on profiling).


2013 ◽  
pp. 1269-1282
Author(s):  
Pedro Pina

Copyright and privacy are two fundamental values for a democratic society, since both enhance the development of each individual’s personality. Nevertheless, in cyberspace, copyright enforcement and the right to informational self-determination have become two clashing realities. In fact, with the arrival of digital technology, especially the Internet, rightholders, facing massive online copyright infringements, mainly by file-sharers on peer-to-peer (P2P) systems, started developing ever more intrusive enforcement strategies in electronic communications as a means to identify the infringers and the infractions committed. The goal of the present paper is to study, in a context where massive unauthorized use of copyrighted works is an undeniable reality, how the boundaries between what is public and what is private become fainter, whether the use of tracking software is consistent with personal data protection legislation, and whether it is possible to reconcile these two human rights.


Author(s):  
Andoni POLO ROCA

The telecommunications sector (or electronic communications sector) constantly puts data protection to the test, and we may well see this sector empty that fundamental right of its content. This occurs in cases such as network interconnections, network operators or instant messaging services, which may conflict with personal data, such as traffic data, and may even violate the secrecy of communications or the right to respect for private and family life. Thus, the Spanish General Telecommunications Law, the Data Retention Law, and European regulation such as the Data Retention Directive, the e-Privacy Directive, the proposal for a new e-Privacy Regulation (ePR), and other national and European rules have attempted to build a regime that protects personal data in this sector, but this may not have been fully achieved. It is therefore necessary to analyse whether the legal regime built is sufficient and, in addition, whether it is compatible with the General Data Protection Regulation (GDPR) and with Organic Law 3/2018, of 5 December, on the Protection of Personal Data and Guarantee of Digital Rights (LOPDGDD).


Author(s):  
Lee A. Bygrave ◽  
Luca Tosoni

Article 4(1) (Definition of ‘personal data’) (see too recital 26); Article 4(13) (Definition of ‘genetic data’) (see too recital 34); Article 4(15) (Definition of ‘data concerning health’) (see also recital 35); Article 9(1) (Special categories of personal data); Article 22(4) (Automated individual decision-making, including profiling) (see also recital 71); Article 35(3)(b) (Data protection impact assessment) (see too recital 91).


Author(s):  
Lee A. Bygrave

Article 3(2)(b) (Monitoring of data subjects’ behaviour); Article 4(4) (Definition of ‘profiling’); Article 5(1)(a) (Fair and transparent processing) (see also recitals 39 and 60); Article 5(2) (Accountability); Article 6 (Legal grounds for processing of personal data); Article 8 (Conditions applicable to children’s consent in relation to information society services); Article 12 (see too recital 58); Article 13(2)(f) (Information on the existence of automated decision-making); Article 14(2)(g) (Information on the existence of automated decision-making); Article 15(1)(h) (Right of access regarding automated decision-making); Article 21 (Right to object) (see also recital 70); Article 23 (Restrictions); Article 35(3)(a) (Data protection impact assessment) (see too recital 84); Article 47(2)(e) (Binding corporate rules); Article 70(1)(f) (EDPB guidelines on automated decisions based on profiling).

