Valuing Personal Data to Foster Privacy: A Thought Experiment and Opportunities for Research

2016 ◽  
Vol 30 (2) ◽  
pp. 169-181 ◽  
Author(s):  
Juergen Sidgman ◽  
Malcolm Crompton

ABSTRACT Despite the efforts of regulatory bodies and the private sector, effective protection of personal data through legislation and business self-regulation remains elusive. Privacy legislation is difficult because the flow of data is difficult to predict. Businesses tend to be ineffective at data protection because, generally, they misunderstand the value of the data they possess. Businesses, therefore, do not invest enough in protecting this undervalued asset, and data are not managed to reflect their importance to organizations, individuals, and markets. This paper presents the argument that to understand data properly and to improve privacy protection, data must be valued. The paper also elaborates on major impediments to the valuation of data, as well as the advantages of overcoming these impediments. In light of the paucity of both privacy and data valuation studies by accounting scholars, the paper also identifies opportunities for research.

Author(s):  
Ella Gorian

The object of this research is the relations arising in the implementation of artificial intelligence technologies. The subject of the research is the normative documents of Singapore that establish requirements for the development and application of artificial intelligence technologies. The article identifies the distinctive features of the Singaporean approach to regulating relations in this sphere. It characterizes the national initiative and the circle of actors involved in developing and implementing normative provisions on the adoption of digital technologies. The author explores aspects of public-private partnership, defines the role of the government in regulation, and gives special attention to the protection of personal data used by artificial intelligence technologies. Positive practices that could be used in the Russian strategy for the development of artificial intelligence are described. Singapore applies a self-regulation approach to the implementation of artificial intelligence technologies, assigning a backbone role to the government, establishing common goals, and involving representatives of the private sector and the general public. Moreover, the government acts as a guarantor of the interests of the private sector, by creating an attractive investment regime, and of citizens, by setting strict requirements for data use and for control over artificial intelligence technologies. A distinguishing feature of the Singaporean approach is its identification of priority sectors of the economy and of instruments for ensuring a systematic implementation of artificial intelligence. Singapore makes efficient use of its demographic and economic characteristics to spread artificial intelligence technologies across the Asian region; its model of artificial intelligence governance, developed and successfully tested at the national level, has received worldwide recognition and application. Singapore's transformation into an international center of artificial intelligence is also driven by improvements to its legal regime, accompanied by facilitation in the sphere of intellectual property. These specifics should be taken into account by the Russian authors of the national strategy for the development of artificial intelligence.


2014 ◽  
Vol 2 (2) ◽  
pp. 72 ◽  
Author(s):  
Joanna Kulesza

The paper covers the political and legal consequences of the extensive cyber-surveillance program deployed by the US, usually referred to by the codename PRISM. The author identifies the significant transnational legal challenges for privacy protection created by US cybersecurity policy and the steps taken by other states to limit its consequences harmful to individual privacy. The author covers the varying reactions to US-imposed privacy intrusions, from Brazil's plans to withdraw from the global network to some states' suggestions of holding Washington internationally responsible for violating the International Covenant on Civil and Political Rights. The paper's focus, however, is on European personal data protection, which has thus far failed to provide effective transnational protection of privacy, primarily through the strongly criticised and ineffective EU-US Safe Harbor arrangement. The EU personal data reform, approved by the European Parliament in March 2014, seems the most significant consequence of the mass privacy violations committed by the US National Security Agency and its agents. The Data Protection Regulation proposed in 2012, which, together with the new personal data Directive, is to replace the 1995 Data Protection Directive 95/46/EC, puts a strong emphasis on the effectiveness of transboundary privacy protection, although it also covers many other significant changes, such as introducing the right to be forgotten and centralising personal data protection decisions thus far distributed among national Data Protection Authorities, which often vary in their interpretations of community law. The reform is to oblige all companies, regardless of their country of incorporation, to meet EU privacy laws, and it introduces high financial liability for those who fail to do so, making it a trigger for a significant change in the way online markets operate. The European approach seems significant for the entire international community, not only because European citizens are an important element of online markets, but also because personal data protection as a tool for safeguarding individual privacy has been adopted in over 100 of the world's roughly 190 countries. Including an element of transnational data protection in EU law is therefore certain to influence the approach to privacy on other continents.


FIAT JUSTISIA ◽  
2018 ◽  
Vol 12 (3) ◽  
pp. 206
Author(s):  
Rudi Natamiharja

The right to privacy, as a fundamental individual right, should be protected. Ironically, this right is deliberately given up in public on social media. Facebook, the largest social media platform, holds the personal data of more than 2.2 billion users worldwide. In early April 2018, the personal data of one million Indonesian Facebook users were taken by other parties. Mark Zuckerberg, founder and CEO, acknowledged that Facebook data consisting of customers' personal data had been stolen and used by other parties. This is one of the weaknesses and instances of negligence by Facebook that needs to be addressed in the future. The Indonesian government issued a warning letter to Facebook and required a formal explanation concerning these recent cases. However, the government's seriousness about protecting the personal data of its citizens is still in question. How do Indonesian regulations cover the protection of citizens' personal data, and what steps should be taken to protect personal data in Indonesia? Using international instruments and Indonesian legal instruments on the protection of the right to privacy, this article examines what the Indonesian government should do to address this situation. The research found that the existing privacy regulations are sufficient, yet the government has shown no determination to take privacy protection seriously, and no sanctions were imposed on the parties involved. Awareness-raising on the importance of personal data should be carried out across Indonesian society, from the grassroots to the top level. Keywords: Right to Privacy, International Law, Fundamental Rights


2002 ◽  
Vol 1 (4) ◽  
Author(s):  
Nicola Green ◽  
Sean Smith

The growth of mobile digital communication devices has seen a corresponding growth in the data created by users in the course of their mobile communications. The ease with which such data - including sensitive time-dependent location information - can be collected and stored raises clear data protection and privacy concerns. The value such data offers to both law enforcement agencies and the private sector has complicated regulatory responses to these data protection concerns. This has led to the contradictory situation in which mobile data is used by law enforcement agencies and the private sector to identify individual users, yet this same information is not considered to be 'personal data'.


2019 ◽  
pp. 146-164
Author(s):  
Christopher Millard

This chapter brings a legal perspective to bear on data protection on the contemporary Internet, in which personal information is increasingly stored and processed in, and accessed from, “the cloud.” The reliance of ever more apps, websites, and services on cloud providers contrasts with the earlier days of the Internet, in which much more data was stored locally on personal computers. At a time of ever greater use of cloud computing, this chapter illuminates the complexities of determining what information in cloud computing environments is protected as personal data, and who is responsible for it. Will data protection laws, such as those in the EU, protect us, or are there alternative approaches to providing effective protection for personal data in clouds? The chapter also raises the question of whether a greater focus should be placed on localizing personal data, as advocated by the Internet pioneer Tim Berners-Lee.


2021 ◽  
pp. 201-222
Author(s):  
Omri Ben-Shahar ◽  
Ariel Porat

Personalized law requires massive amounts of information, and this chapter examines some of the problems arising from the accumulation of personal data in the hands of the government. It first surveys what kinds of information would be needed and how lawmakers might hope to acquire the necessary data. While much information is already available in government databases, is it realistic to expect commercial databases to share their data with the government? The chapter then shifts to asking how personalized commands would be communicated to actors. It argues, counterintuitively, that in important areas private actors may often find it easier to know their personalized command than to figure out the uniform command. Finally, the chapter examines problems of privacy and data protection arising from the accumulation of data in the hands of governments. It argues that privacy interests vary across people, and thus privacy protection, like other aspects of personalized law, could itself be personalized, allowing people to opt out of some privacy-sensitive personalized treatments.


2018 ◽  
Vol 42 (3) ◽  
pp. 290-303 ◽  
Author(s):  
Montserrat Batet ◽  
David Sánchez

Purpose: To overcome the limitations of purely statistical approaches to data protection, the purpose of this paper is to propose Semantic Disclosure Control (SeDC): an inherently semantic privacy protection paradigm that, by relying on state-of-the-art semantic technologies, rethinks privacy and data protection in terms of the meaning of the data. Design/methodology/approach: The need for data protection mechanisms able to manage data from a semantic perspective is discussed and the limitations of statistical approaches are highlighted. Then, SeDC is presented by detailing how it can be enforced to detect and protect sensitive data. Findings: So far, data privacy has been tackled from a statistical perspective; that is, available solutions focus only on the distribution of the data values. This contrasts with the semantic way in which humans understand and manage (sensitive) data. As a result, current solutions present limitations both in preventing disclosure risks and in preserving the semantics (utility) of the protected data. Practical implications: SeDC captures more general, realistic and intuitive notions of privacy and information disclosure than purely statistical methods. As a result, it is better suited to protecting heterogeneous and unstructured data, which are the most common in current data release scenarios. Moreover, SeDC preserves the semantics of the protected data better than statistical approaches, which is crucial when protected data are used for research. Social implications: Individuals are increasingly aware of the privacy threats that the uncontrolled collection and exploitation of their personal data may produce. In this respect, SeDC offers an intuitive notion of privacy protection that users can easily understand. It also naturally captures the (non-quantitative) privacy notions stated in current legislation on personal data protection. Originality/value: In contrast to statistical approaches to data protection, SeDC assesses disclosure risks and enforces data protection from a semantic perspective. As a result, it offers more general, intuitive, robust and utility-preserving protection of data, regardless of their type and structure.
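To make the statistical-versus-semantic contrast concrete, the minimal Python sketch below is illustrative only: it is not the SeDC implementation described in the paper, and the taxonomy, record values, threshold, and function names are hypothetical. It compares a purely frequency-based masking rule with a generalization step that consults a small concept hierarchy, the kind of meaning-aware treatment a semantic paradigm argues for.

```python
# Illustrative sketch only; NOT the SeDC method from the paper.
# Hypothetical taxonomy and data, chosen to show the contrast between
# frequency-only suppression and semantics-aware generalization.

from collections import Counter

# Hypothetical taxonomy mapping specific diagnoses to broader concepts.
TAXONOMY = {
    "HIV": "immune system disease",
    "lupus": "immune system disease",
    "flu": "respiratory infection",
    "cold": "respiratory infection",
}

records = ["HIV", "flu", "flu", "cold", "lupus", "flu"]

def statistical_mask(values, k=2):
    """Suppress any value appearing fewer than k times (frequency only)."""
    counts = Counter(values)
    return [v if counts[v] >= k else "*" for v in values]

def semantic_generalize(values, k=2):
    """Replace rare values with a broader concept from the taxonomy,
    keeping some meaning (utility) instead of suppressing it outright."""
    counts = Counter(values)
    return [v if counts[v] >= k else TAXONOMY.get(v, "*") for v in values]

print(statistical_mask(records))
# ['*', 'flu', 'flu', '*', '*', 'flu']
print(semantic_generalize(records))
# ['immune system disease', 'flu', 'flu', 'respiratory infection',
#  'immune system disease', 'flu']
```

The semantic variant keeps the released values interpretable (a researcher still learns the broad disease category), which is the kind of utility preservation the abstract attributes to meaning-aware protection.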


Author(s):  
Yue WANG

LANGUAGE NOTE | Document text in Chinese; abstract in English only. At present, the development of AI depends on three core elements: high-quality data, accurate algorithms and sufficient computing power. New technologies represented by big data, cloud computing and AI are exerting a significant impact on traditional data protection. Individuals' control over their personal data is weakening, data protection is becoming more difficult, and traditional measures of privacy protection are at risk of failure. These are the most representative problems in the conflict between the development of new technology and privacy protection. A new legal and ethical framework that values humans' physical safety, health and dignity should be established and deeply integrated into the entire life cycle of the design, production and application of medical AI. On this premise, efforts should be made to promote the development of medical AI for the benefit of mankind.


Author(s):  
Anastazja Gajda

The paper concentrates on the protection of personal data in the European Union. It presents the comprehensive reform of the data protection framework proposed by the European Commission in January 2012, comprising a policy Communication setting out the Commission's objectives and two legislative proposals: a regulation setting out a general EU framework for data protection and a directive on protecting personal data processed for the purposes of prevention, detection, investigation or prosecution of criminal offences and related judicial activities. Both proposals concern the question of ensuring effective protection of fundamental rights. The analysis of the proposed legislation nevertheless shows that, in this shape, the proposals do not lead to consistency and uniformity across the entire system of personal data protection in the EU. The proposals differ significantly in, among other things, their subject matter and material scope, the effective protection of fundamental rights, and the establishment of a hierarchy among the existing legal acts in this area.


Khazanah ◽  
2020 ◽  
Vol 12 (2) ◽  
Author(s):  
Hidayatun Nafi'ah ◽  
Athifah Nur Hasna

Background: The protection of personal data is a fundamental right for everyone, including children. Children are the most vulnerable subjects when it comes to the processing of personal data because they lack awareness and understanding of the risks of its misuse. Regulations on the protection of children's personal data in Indonesia are already contained in the draft personal data protection law, but with very limited guidance. Through this comparative study, the researchers compare the United States' COPPA (Children's Online Privacy Protection Act) with the United Kingdom's Children and GDPR guidance. Both of these regulations are very detailed in regulating the protection of children's personal data. This study provides a clearer picture of children's privacy protection regulations so that it can be used as a reference for Indonesia's draft personal data protection law with regard to children's privacy rights. Method: This comparative research uses a qualitative descriptive method with a library research approach. Result: There are fundamental differences regarding the form of guidance, the definition of a child, the parties processing children's personal data, and what is included in children's personal data. Conclusion: The application of children's personal data protection is adjusted to the values and culture of each country.

