The Structuration of Digital Ecosystem, Privacy, and Big Data Intelligence

2018
Vol 62 (10)
pp. 1319-1337
Author(s):  
Yong Jin Park ◽  
Jae Eun Chung ◽  
Dong Hee Shin

This study presents a conceptual model for understanding algorithmic digital surveillance systems, drawing on Giddens, who proposed the notion of structuration as social practice arising from the interaction between structure and agents. We argue that the status of privacy, or the lack of it, is a product of these interactions, in which personal data practices and the interests attached to them reproduce the data ecosystem. We trace the process of data production and consumption, dissecting the interactive dynamics between digital media producers (personal data users) and users (personal data producers). The inadequacies, limits, and social and policy implications of data surveillance and its algorithmic reproduction of identities are discussed.

2021
Vol 4
Author(s):  
Vibhushinie Bentotahewa ◽  
Chaminda Hewage ◽  
Jason Williams

The growing dependency on digital technologies has become a way of life, and at the same time the use of those technologies to collect data for surveillance operations has raised concerns. Notably, some countries use digital surveillance technologies to track and monitor individuals and populations to prevent the transmission of the new coronavirus. The technology has the capacity to contribute to tackling the pandemic effectively, but that success comes at the expense of privacy rights: regardless of who uses which mechanism, personal privacy will be infringed in one way or another. When considering the use of technologies to combat the pandemic, the focus should therefore also be on the impact of facial recognition cameras, police surveillance drones, and other digital surveillance devices on the privacy rights of those under surveillance. The GDPR was established to ensure that information can be shared without infringing on personal data or businesses; in generating Big Data, it is therefore important to ensure that information is securely collected, processed, transmitted, stored, and accessed in accordance with established rules. This paper focuses on the Big Data challenges associated with surveillance methods used during the COVID-19 pandemic, and its aim is to propose practical solutions to those challenges. To that end, the researchers identify the surveillance measures used by countries in different regions, the sensitivity of the data generated, and the issues associated with collecting large volumes of data, and finally propose feasible solutions to protect people's privacy rights in the post-COVID-19 era.


MUAMALATUNA
2021
Vol 13 (1)
pp. 33
Author(s):  
Mohammad Farid Fad

Abstract In the digital and computing world, an individual must recognize and define the boundaries of his or her preferences in order to reach agreement on the status of his or her privacy in a particular context or space. In other words, each individual should acquire the right to use his or her own personal data, which ensures a more active role in its management. At the same time, personal data has become both highly valuable and, as a commodity, vulnerable, creating a risk of misuse or theft of personal data. It is therefore necessary to analyze risk mitigation from a syar'i perspective, using the sadd dzari'ah method, in order to avoid the crimes of theft and misuse of personal data in cyberspace. This research uses a qualitative approach. Data were collected through literature, documentation, and observation, and the collected data were analyzed descriptively-analytically with an ushuliyah approach. The study finds that, from the perspective of sadd dzari'ah, personal data embodies honor and personal dignity that must not be violated. Misuse of data creates harm (mudharat) in the form of damage to a person's dignity (hifz al-irdh), whereas Islamic law seeks as far as possible to realize benefit for humans. The Government is therefore obliged to draft a Personal Data Protection Law in order to create a protected and secure digital ecosystem. Keywords: Protection, Personal Data, Sadd Dzari’ah.


Author(s):  
Angeliki Tzouganatou ◽  
Jennifer Krueckeberg

With the growing proliferation of digital media into the memory practices of cultural institutions and ordinary people, questions have emerged about a growing dependence on monopolistic technology companies for the creation, access, and preservation of collective memory. Cultural institutions that rely on social media to boost audience engagement lose part of their role as public educators, while ordinary people fear the loss of ownership over their personal memories. This paper proposes equitable approaches to the current digital ecosystem, which is built on the extraction of and profit-making from personal data. Such approaches can be developed by looking beyond the current market and envisioning policies that could enable the re-design of the current memory ecosystem towards social inclusion. The argument is based on ethnographic research into initiatives that foster the openness of knowledge by enabling fair practices in the competitive sphere of the digital economy, building upon work such as MyData and the DECODE project, as well as enquiries into the personal memory practices of youth living in Germany and the UK.


2021
Vol 44 (3)
Author(s):  
Lisa Archbold ◽  
Damian Clifford ◽  
Moira Paterson ◽  
Megan Richardson ◽  
Normann Witzleb

The advertising technology industry, known as ‘adtech’, is a complicated network of organisations and individuals that collect, aggregate and deal with large amounts of personal data. As children engage with digital networks for many aspects of their lives, they are increasingly exposed to adtech practices. Depending on their age, children may have less knowledge of the commercial digital environment and less maturity in their decision-making processes than adults have. Their limited resilience in the face of adtech’s onslaught offers a particularly stark illustration of why it is problematic to look to ‘consent’ as the exclusive or predominant mechanism to control the use of consumer data in the digital ecosystem. This article examines the problems arising from adtech’s data practices and makes recommendations on how to strengthen the agency and control exercised by children and protect their best interests in the context of adtech.


2019
Vol 34 (1)
pp. 59-80
Author(s):  
Roger Clarke

The digitisation of data about the world relevant to business has given rise to a new phase of digitalisation of business itself. The digitisation of data about people has linked with the notions of information society, surveillance society, surveillance state and surveillance capitalism, and given rise to what is referred to in this article as the digital surveillance economy. At the heart of this is a new form of business model that is predicated on the acquisition and consolidation of very large volumes of personal data, and its exploitation to target advertisements, manipulate consumer behaviour, and price goods and services at the highest level that each individual is willing to bear. In the words of the model’s architects, users are ‘bribed’ and ‘induced’ to make their data available at minimal cost to marketers. The digital surveillance economy harbours serious threats to the interests of individuals, societies and polities. That in turn creates risks for corporations. The new economic wave may prove to be a tsunami that swamps the social dimension and washes away the last five centuries’ individualism and humanism. Alternatively, institutional adaptation might occur, overcoming the worst of the negative impacts; or a breaking-point could be reached and consumers might rebel against corporate domination. A research agenda is proposed, to provide a framework within which alternative scenarios can be investigated.


2021
Vol 2021 (2)
pp. 88-110
Author(s):  
Duc Bui ◽  
Kang G. Shin ◽  
Jong-Min Choi ◽  
Junbum Shin

Abstract Privacy policies are documents, required by laws and regulations, that notify users of the collection, use, and sharing of their personal information by services or applications. Extracting personal data objects and their usage is one of the fundamental steps in the automated analysis of these policies, but it remains challenging because policy statements are written in complex, often vague legal language. Prior work is limited by small or generated datasets and manually created rules. We formulate the extraction of fine-grained personal data phrases and the corresponding data collection or sharing practices as a sequence-labeling problem that can be solved by an entity-recognition model. We create a large dataset with 4.1k sentences (97k tokens) and 2.6k annotated fine-grained data practices from 30 real-world privacy policies to train and evaluate neural networks. We present a fully automated system, called PI-Extract, which accurately extracts privacy practices with a neural model and outperforms strong rule-based baselines by a large margin. We conduct a user study on the effects of data-practice annotation, which highlights and describes the data practices extracted by PI-Extract to help users better understand privacy-policy documents. Our experimental results show that the annotation significantly improves users' reading comprehension of policy texts, as indicated by a 26.6% increase in the average total reading score.
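The sequence-labeling formulation described above can be illustrated with a minimal sketch. This is a hypothetical example, not PI-Extract's actual code: in the paper a neural entity-recognition model predicts the tags, whereas here BIO tags (a standard scheme for such models) are given by hand, and the `COLLECT` label and `decode_bio` helper are illustrative names only.

```python
# Hypothetical sketch of the sequence-labeling (BIO) formulation for
# extracting personal data phrases and their practices from policy text.
# In PI-Extract itself, a trained neural model would predict `tags`.

def decode_bio(tokens, tags):
    """Group BIO-tagged tokens into (phrase, practice-label) spans."""
    spans, current, label = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):                  # start of a new span
            if current:
                spans.append((" ".join(current), label))
            current, label = [tok], tag[2:]
        elif tag.startswith("I-") and current and tag[2:] == label:
            current.append(tok)                   # continuation of the span
        else:                                     # "O" tag: outside any span
            if current:
                spans.append((" ".join(current), label))
            current, label = [], None
    if current:
        spans.append((" ".join(current), label))
    return spans

tokens = ["We", "collect", "your", "email", "address", "and", "location", "."]
tags   = ["O", "O", "B-COLLECT", "I-COLLECT", "I-COLLECT", "O", "B-COLLECT", "O"]
print(decode_bio(tokens, tags))
# [('your email address', 'COLLECT'), ('location', 'COLLECT')]
```

The decoding step shown here is what turns per-token model predictions into the fine-grained data-practice spans that the annotation study highlights for readers.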


Temida
2012
Vol 15 (3)
pp. 99-114
Author(s):  
Natasa Rajic

This paper discusses the normative framework regulating the right to protection of personal data relating to patients' biomedical treatment procedures as a human right. The subjects of analysis are the European Convention, the Convention on Human Rights and Biomedicine, and the relevant provisions of the Constitution of the Republic of Serbia. The right to protection of personal data in the field of biomedicine is analyzed comparatively in terms of both the content of this right and the grounds for limiting it. The analysis seeks to answer whether the constitutional framework is consistent with regard to exercising this right, taking into account the constitutional provision on the direct application of human rights guaranteed by international treaties, as well as other provisions that determine the status of international sources of law in the Serbian legal system.


Author(s):  
Yola Georgiadou ◽  
Rolf de By ◽  
Ourania Kounadi

The General Data Protection Regulation (GDPR) protects the personal data of natural persons and at the same time allows the free movement of such data within the European Union (EU). Hailed as majestic by admirers and dismissed as protectionist by critics, the Regulation is expected to have a profound impact around the world, including in the African Union (AU). For European–African consortia conducting research that may affect the privacy of African citizens, the question is ‘how to protect personal data of data subjects while at the same time ensuring a just distribution of the benefits of a global digital ecosystem?’ We use location privacy as a point of departure, because information about an individual’s location is different from other kinds of personally identifiable information. We analyse privacy at two levels, individual and cultural. Our perspective is interdisciplinary: we draw from computer science to describe three scenarios of transformation of volunteered/observed information to inferred information about a natural person and from cultural theory to distinguish four privacy cultures emerging within the EU in the wake of GDPR. We highlight recent data protection legislation in the AU and discuss factors that may accelerate or inhibit the alignment of data protection legislation in the AU with the GDPR.


2021
Vol 12 (1)
pp. 67-80
Author(s):  
Kyong Yoon

Drawing on South Korea’s response to COVID-19, this article examines how the digital measures that were implemented by the nation state during the pandemic intensified the dilemma between public safety and information rights. South Korea’s highly praised handling of COVID-19 raises the question of how far digital technology can infiltrate everyday life for the sake of public safety and how citizens can negotiate the rapid digital transformation of a nation state. The South Korean government’s digital measures during the pandemic involved the extensive use of personal data; however, citizens were not allowed sufficient participation in the flow of information. By critically examining the South Korean case, this article reveals that the government coped with the pandemic through digital surveillance as a way to avoid physical lockdown, and in so doing, projected its desire for transition to a digitally advanced state while facilitating nationalism through a digital utopian discourse.

