Protecting Citizen Privacy in Digital Government

Author(s):  
R. R. Arnesen

Protecting the privacy of citizens is a critical issue in digital government services. The right to privacy is widely recognized as a fundamental human right, as stated in Article 12 of the Universal Declaration of Human Rights (United Nations, 1948). The first definition of privacy was given by American lawyers Warren and Brandeis (1890), who defined it as "the right to be let alone." However, the right to privacy has been recognized for millennia: the Hippocratic oath (n.d.) dates back to around 400 B.C. and instructs medical doctors to respect the privacy of their patients. During the last three decades, many countries have passed privacy legislation, the Swedish Data Act of 1973 being the first national privacy act in the world. During the 1970s, many countries adopted data protection acts (Fischer-Hübner, 2001). In 1980, the OECD published its privacy guidelines with the purpose of reducing the potential privacy problems incurred by cross-border trade (OECD, 1980). The European Union adopted Directive 95/46/EC in 1995, and all member states are required to implement national privacy legislation in compliance with this directive (European Union (EU) Directive 95/46/EC, 1995).

Privacy is under increasing pressure in the digital age, and the introduction of digital government services may accelerate this development. The way government has been organized until now, with separate departments holding their own "silos" of personal data, has inherently provided some privacy protection: in such a distributed environment, data matching is expensive and resource-intensive. This form of privacy protection is referred to as "practical obscurity" in Crompton (2004, p. 12). Some examples of threats to privacy related to the development of digital government are as follows:

• Data collection capabilities increase as new technology for continuous and automatic data collection is introduced. Examples of such technologies include digital video surveillance, biometric identification, and radio frequency identification (RFID).
• Data processing capabilities are rapidly increasing. The very existence of large amounts of stored personal data, together with the availability of sophisticated tools for analysis, increases the probability of misuse of data.
• There is a trend towards integration of formerly separate governmental services, including physical offices. Providing a single point of contact is more user friendly, but it may also provide an attacker with a single point of attack.
• Outsourcing of services (e.g., customer relationship management) is increasingly popular among both companies and governmental organizations. Those who deliver such services to many customers have a unique opportunity to gather personal information from many different sources. If services are outsourced across country borders, and perhaps in several layers, responsibilities soon become unclear.
• Even if the organization responsible for stored personal information does not have malicious intent, one cannot expect all its employees to be equally trustworthy. Disloyal employees are a severe threat when increasing amounts of information are stored.
• Tax records and other public records made available on the Internet enable efficient searches and aggregation of information about individuals. Identity theft and fraud are common uses of information gathered in this way.

Author(s):  
William Bülow ◽  
Misse Wester

As information technology becomes an integral part of modern society, there is growing concern that too much data containing personal information is stored by different actors in society and that this could potentially be harmful to the individual. The aim of this contribution is to show how the extended use of ICT can affect the individual's right to privacy and how the public perceives risks to privacy. Three points are raised in this chapter. First, if privacy is important from a philosophical perspective, how is this demonstrated by empirical evidence? Do individuals trust the different actors that control their personal information, and is there a consensus that privacy can and should be compromised in order to reach another value? Second, if compromises of privacy are warranted by increased safety, is this increased security supported by empirical evidence? Third, the authors argue that privacy can indeed be a means to increase the safety of citizens and that the moral burden of ensuring and protecting privacy is a matter for policy makers, not individuals. In conclusion, the authors suggest that a more nuanced discussion of the concepts of privacy and safety is needed and that privacy must be treated as an important objective in the development and structure of ICT uses.


2016 ◽  
Vol 44 (1) ◽  
pp. 68-84 ◽  
Author(s):  
Aart C. Hendriks ◽  
Rachèl E. van Hellemondt

The Netherlands does not have any specific legislation pertaining to human biological materials and data collection by biobanks. Instead, these issues are governed by a patchwork of laws, codes of practices, and other ethical instruments, where special emphasis is given to the right to privacy and self-determination. While draft legislation for biobanking was scheduled to enter into force in 2007, as of mid-2015 such legislation was still under consideration, with the intent that it would focus particularly on individual self-determination, the interests of research, the use of bodily materials collected by biobanks for criminal law purposes, and dilemmas around results that are clinically relevant for biobank participants. Under the current framework, the amount of privacy protection afforded to data is linked to its level of identifiability. International sharing of personal data to non-EU/European Economic Area countries is allowed if these countries provide adequate protection.


Author(s):  
Ronggong Song ◽  
Larry Korba ◽  
George Yee

Pseudonym technology is attracting more and more attention as privacy violations become a major issue in various e-services. Current e-service systems make personal data collection very easy and efficient through integration, interconnection, and data mining technologies, since they use the user's real identity. Pseudonym technology with unlinkability, anonymity, and accountability can give users the ability to control the collection, retention, and distribution of their personal information. This chapter explores the challenges, issues, and solutions associated with pseudonym technology for privacy protection in e-services. To provide a better understanding of how pseudonym technology protects privacy in e-services, we describe a general pseudonym system architecture, discuss its relationships with other privacy technologies, and summarize its requirements. Based on these requirements, we review, analyze, and compare a number of existing pseudonym technologies. We then give an example of pseudonym technology in practice, an e-wallet for e-services, and discuss current issues.
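As a rough illustration of the properties discussed above, the sketch below (not taken from the chapter) derives service-specific pseudonyms with a keyed hash: each service sees an unlinkable identifier for the same user, while the issuer holding the key retains accountability because it can re-derive any pseudonym. The key, identifiers, and service names are assumptions made for the example.

```python
# Minimal sketch of keyed-hash pseudonym derivation (illustrative only).
import hmac
import hashlib

SECRET_KEY = b"issuer-held secret"  # would be protected by the pseudonym issuer

def pseudonym(real_id: str, service: str) -> str:
    """Derive a service-specific pseudonym for a user.

    Mixing the service name into the HMAC input gives each service a
    different, unlinkable pseudonym for the same person; only the issuer
    holding SECRET_KEY can map a pseudonym back to a real identity by
    re-deriving it (accountability).
    """
    msg = f"{service}:{real_id}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()[:16]

if __name__ == "__main__":
    user = "citizen-12345"
    print(pseudonym(user, "tax-service"))     # pseudonym seen by one e-service
    print(pseudonym(user, "health-service"))  # different, unlinkable pseudonym at another
```

A full pseudonym system of the kind described in the chapter would add certified pseudonyms, revocation, and controlled re-identification; the keyed-hash derivation here only shows the unlinkability and accountability trade-off in its simplest form.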


Information ◽  
2022 ◽  
Vol 13 (1) ◽  
pp. 27
Author(s):  
Diego Garat ◽  
Dina Wonsever

In order to provide open access to data of public interest, it is often necessary to perform several data curation processes. In some cases, such as biological databases, curation involves quality control to ensure reliable experimental support for biological sequence data. In others, such as medical records or judicial files, publication must not interfere with the right to privacy of the persons involved. There are also interventions in the published data aimed at generating metadata that enable a better querying and navigation experience. In all cases, the curation process constitutes a bottleneck that slows down general access to the data, so automatic or semi-automatic curation processes are of great interest. In this paper, we present a solution aimed at the automatic curation of our National Jurisprudence Database, with special focus on the anonymization of personal information. The anonymization process aims to hide the names of the participants involved in a lawsuit without losing the meaning of the narrative of facts. To achieve this goal, we need not only to recognize person names but also to resolve co-references, so that the same label is assigned to all mentions of the same person. Our corpus has significant variation in the spelling of person names, so it was clear from the beginning that pre-existing tools would not reach good performance. The challenge was to find a good way of injecting specialized knowledge about person name syntax while taking advantage of the capabilities of pre-trained tools. We fine-tuned an NER analyzer and built a clustering algorithm to resolve co-references between named entities. We present our first results, which are promising for both tasks: we obtained an F1-micro score of 90.21% in the NER task (up from 39.99% before retraining the same analyzer on our corpus) and an ARI score of 95.95% in clustering for co-reference resolution.
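The clustering step reported above can be pictured with a small, hypothetical sketch: spelling variants of the same person name are grouped by string similarity, and the grouping is scored with the Adjusted Rand Index, the metric used in the paper. This is not the authors' implementation; the mentions, similarity threshold, and gold labels are invented for illustration.

```python
# Toy co-reference clustering of name mentions, scored with ARI (illustrative only).
from difflib import SequenceMatcher
from sklearn.metrics import adjusted_rand_score

def similar(a: str, b: str, threshold: float = 0.8) -> bool:
    """Rough string similarity between two name mentions."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def cluster_mentions(mentions: list[str]) -> list[int]:
    """Greedy single-link clustering: each mention joins the first existing
    cluster that already contains a sufficiently similar mention."""
    labels: list[int] = []
    clusters: list[list[str]] = []
    for mention in mentions:
        for idx, cluster in enumerate(clusters):
            if any(similar(mention, other) for other in cluster):
                cluster.append(mention)
                labels.append(idx)
                break
        else:
            clusters.append([mention])
            labels.append(len(clusters) - 1)
    return labels

if __name__ == "__main__":
    mentions = ["Juan Pérez", "Juan Perez", "JUAN PEREZ", "María López", "Maria Lopez"]
    gold = [0, 0, 0, 1, 1]                 # hypothetical gold co-reference labels
    predicted = cluster_mentions(mentions)
    print(predicted)                       # e.g., [0, 0, 0, 1, 1]
    print("ARI:", adjusted_rand_score(gold, predicted))
```

In the paper, the clusters are built over mentions labeled by the fine-tuned NER analyzer; the greedy grouping above merely stands in for that stage.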


2018 ◽  
Vol 1 (XVIII) ◽  
pp. 335-353
Author(s):  
Weronika Kupny

The protection of the right to privacy is one of the basic human rights and a fundamental subject in most modern legal systems. Legal systems extend privacy protection instruments to a significant extent, but at the same time they find reasons to interfere strongly in this area. Certainly, the dynamic development of modern technologies does not help the legislator to find a comprehensive solution. The article deals with privacy protection in the employment relationship in the area of innovation and technological development. In this study, the author also compares the impact of the use of modern technologies in the workplace today, in the light of the applicable regulations, and tomorrow, taking into account the enactment of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).


2021 ◽  
pp. 217-226
Author(s):  
Alexandru Țărnă

The protection and storage of personal data are clearly related to the right to respect for private life, as guaranteed by Article 8 of the European Convention on Human Rights. That provision protects a whole range of rights, namely the right to respect for private and family life, home, and correspondence. The principle is that Article 8 protects personal information in respect of which an individual can legitimately expect that it will not be published or used without his or her consent. The study examines the jurisprudence of the European Court of Human Rights, its main objective being to identify decisions that have a fundamental impact on the doctrine and practice of personal data collection. We are aware that multiple rules in the field of personal data collection can also be deduced from the practice of the Court of Justice of the European Union (CJEU). However, given the direct impact of ECtHR decisions on the Republic of Moldova, we found it appropriate to cover only that aspect here; in subsequent studies we will address the protection of personal data by the Court of Justice of the European Union. The basic idea derived from this study is that the Moldovan authorities should adjust their legislation and practices to the standards set out by the ECtHR and thus avoid possible adverse judgments by the European Court.


2020 ◽  
pp. 36-50
Author(s):  
Irina Aseeva

Although it is an inalienable right of the citizen of a democratic state, the right to privacy in the digital age is exposed to constant intrusions and encroachments. Private life has become an object of interest for the public, state intelligence agencies, commercial organizations, and criminals, who through information and communication technologies have gained the ability not only to monitor a person through correspondence and the analysis of personal data, but also to manipulate consumer choice, generate demand, and track movements and contacts. At the same time, as the results of sociological studies show, modern society itself is becoming more open: users of Internet resources give away important personal information and often voluntarily post terabytes of photos and videos, blurring the boundary between the private and the public, between the morally acceptable and the legally prohibited.


Blockchain technologies are becoming more popular for securing sensitive data, such as government-held records of citizens' wealth, health, and personal information. A blockchain is a shared, cryptographically secured ledger of transaction records. As data stored in a blockchain is tamper-proof, it is proposed to implement new Aadhaar enrolments with P2P blockchains and to migrate the existing centralized Aadhaar personal data from conventional RDBMS / big data repositories to distributed ledger technologies by creating private blockchains. In this paper, we discuss how to provide security for Aadhaar card enrolment data using blockchain architectures. A blockchain-based Aadhaar would help UIDAI truly comply with the data protection and privacy stipulations outlined in the Right to Privacy judgment.
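To make the tamper-evidence argument concrete, the sketch below shows the core data structure a private enrolment blockchain relies on: records chained by hashes, so that modifying any stored record invalidates every later block. The field names and enrolment records are illustrative assumptions, not the proposed Aadhaar design.

```python
# Minimal hash-chained ledger illustrating tamper evidence (illustrative only).
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash the block contents (excluding its own hash) deterministically."""
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_block(chain: list[dict], record: dict) -> None:
    """Append a new block that links to the hash of the previous block."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "prev_hash": prev_hash, "record": record}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain: list[dict]) -> bool:
    """Recompute every hash and link; any tampering breaks the chain."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        if block["prev_hash"] != expected_prev:
            return False
    return True

if __name__ == "__main__":
    ledger: list[dict] = []
    append_block(ledger, {"enrolment_id": "E-001", "data_ref": "encrypted-blob-1"})
    append_block(ledger, {"enrolment_id": "E-002", "data_ref": "encrypted-blob-2"})
    print(verify(ledger))                     # True
    ledger[0]["record"]["data_ref"] = "xxx"   # simulate tampering with a stored record
    print(verify(ledger))                     # False
```

A peer-to-peer deployment would add consensus among nodes and access control on top of this chained structure; the sketch only shows why altering stored enrolment data is detectable.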


2021 ◽  
pp. 10-19
Author(s):  
Greta Angjeli ◽  
Besmir Premalaj

One of the fundamental human rights protected by various international conventions is the right to the protection of privacy, or, as defined in the European Convention on Human Rights, the right to respect for private and family life. Affiliated with this right is the right to data protection, which is described by various authors as a modern derivation of the right to privacy. The protection of personal data in the context of privacy was jeopardized by the rapid and widespread adoption of information technology, automated data processing, and the risk of access to this data by unauthorized persons on the network. Regulating the processing of personal data by automated systems so that it does not violate the right to respect for private life has been a challenge for many states, which had to allow the use of artificial intelligence for the benefit of further economic and social development while also ensuring the protection of their citizens' personal data. In this context, the EU has issued another regulation on personal data protection, the General Data Protection Regulation (EU) 2016/679. The purpose of this paper is to highlight the impact of artificial intelligence on the right to respect for private life and the legal protection of personal data against misuse through artificial intelligence.


2013 ◽  
Vol 20 (1) ◽  
pp. 63-78
Author(s):  
Maria Inês de Oliveira Martins

The need of private insurers for information on the candidate's health risks is recognized by the law, which places pre-contractual duties of disclosure upon candidates. When the risks are influenced by health factors, e.g., in the case of life and health insurance, this implies the provision of health information by the candidates, who thus voluntarily limit their right to privacy. This consent, however, often happens in a context of factual coercion to contract. In addition, from a legal standpoint, the collection of personal information must comply with the principle of proportionality. Against this background, this article assesses the compatibility of questionnaire techniques that rely on open-ended health-related questions with the right to privacy, as protected by Portuguese and international law. It then analyses the extent of pre-contractual duties of disclosure as defined by the Portuguese Insurance Act, which requires the candidate to volunteer all relevant information independently of being asked for it. In doing so, the article also refers to some other European countries. It concludes that the relevant Portuguese legislation is incompatible both with Portuguese constitutional law and with international law.

