Tactics, affects and agencies in digital privacy narratives: a story completion study

2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Ash Watson ◽  
Deborah Lupton

Purpose – The purpose of this paper is to report on the findings from the Digital Privacy Story Completion Project, which investigated Australian participants' understandings of and responses to digital privacy scenarios using a novel method and theoretical approach.
Design/methodology/approach – The story completion method was brought together with de Certeau's concept of tactics and more-than-human theoretical perspectives. Participants were presented with four story stems on an online platform. Each story stem introduced a fictional character confronted with a digital privacy dilemma. Participants were asked to complete the stories by typing in open text boxes, responding to the prompts "How does the character feel? What does she/he do? What happens next?". A total of 29 participants completed the stories, resulting in a corpus of 116 narratives for a theory-driven thematic analysis.
Findings – The stories vividly demonstrate the ways in which tactics are entangled with relational connections and affective intensities. They highlight the micropolitical dimensions of human–nonhuman affordances when people are responding to third-party use of their personal information. The stories identified the tactics used and the boundaries drawn in people's sense-making concerning how they define appropriate and inappropriate use of their data.
Originality/value – This paper demonstrates the value and insights of creatively attending to personal data privacy issues in ways that decentre the autonomous tactical and agential individual and instead consider the more-than-human relationality of privacy.
Peer review – The peer review history for this article is available at: https://publons.com/publon/10.1108/OIR-05-2020-0174

2019 ◽  
Vol 61 (1) ◽  
pp. 170-190 ◽  
Author(s):  
Sheshadri Chatterjee

Purpose – The purpose of this study is to identify how privacy policy can be framed for the protection of personal data and how the latest judgement of the full bench of the Supreme Court of India has dealt with the right to privacy in India.
Design/methodology/approach – The study draws on the latest Supreme Court judgement on the right to privacy and historical right-to-privacy cases in India. It uses the Indian Constitution as a source of information, along with case laws and judgements of different courts in India.
Findings – This paper examines whether personal data privacy is a fundamental right in India. In addition, it provides recommendations to the concerned authorities on protecting personal information on online platforms.
Research limitations/implications – This study deals with privacy issues only so far as Indian citizens are concerned and does not focus on other countries. Moreover, it approaches the issue of fundamental rights from the perspective of the Indian Constitution. The recommendations provided to policymakers and other Indian authorities have wide implications for the formulation of new policy and the management of personal data, so that such data does not fall into the wrong hands and citizens' personal data and privacy are protected.
Practical implications – Millions of people put their personal information on online platforms. In addition, there are government initiatives in India, such as the Aadhaar card, in which biometric information is collected from residents of India, and in many cases personal data are compromised under various circumstances. As the personal data of citizens are in question, the study has direct practical implications for all citizens whose personal data are available on online platforms.
Social implications – This study has social implications as it deals with the personal data of the citizens of India. As the paper discusses the protection of personal data in the context of the right to privacy, it has a direct social impact so far as India's online citizens are concerned.
Originality/value – This paper is timely, original and discusses the contemporary issue of online data privacy and fundamental rights in India. It is a useful resource for researchers, policymakers and online users who deal with personal data, the right to privacy and data privacy policy.


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Heather J. Parker ◽  
Stephen Flowerday

Purpose – Social media has created a new level of interconnected communication. However, the use of online platforms brings about various ways in which a user's personal data can be put at risk. This study aims to investigate what drives the disclosure of personal information online and whether an increase in awareness of the value of personal information motivates users to safeguard their information.
Design/methodology/approach – Fourteen university students participated in a mixed-methods experiment, where responses to Likert-type scale items were combined with responses to interview questions to provide insight into the cost–benefit analysis users conduct when disclosing information online.
Findings – Overall, the findings indicate that users are able to disregard their concerns due to a resigned and apathetic attitude towards privacy. Furthermore, subjective norms enhanced by fear of missing out (FOMO) further allow users to overlook potential risks to their information in order to avoid social isolation and sanction. Conversely, an increased awareness of the personal value of information and having experienced a previous privacy violation encourage the protection of information and limited disclosure.
Originality/value – This study provides insight into privacy and information disclosure on social media in South Africa. To the knowledge of the researchers, this is the first study to combine the theory of planned behaviour and the privacy calculus model with the antecedent factors of personal valuation of information, trust in the social media provider and FOMO.


Author(s):  
Anastasia Kozyreva ◽  
Philipp Lorenz-Spreen ◽  
Ralph Hertwig ◽  
Stephan Lewandowsky ◽  
Stefan M. Herzog

Abstract
People rely on data-driven AI technologies nearly every time they go online, whether they are shopping, scrolling through news feeds, or looking for entertainment. Yet despite their ubiquity, personalization algorithms and the associated large-scale collection of personal data have largely escaped public scrutiny. Policy makers who wish to introduce regulations that respect people's attitudes towards privacy and algorithmic personalization on the Internet would greatly benefit from knowing how people perceive personalization and personal data collection. To contribute to an empirical foundation for this knowledge, we surveyed public attitudes towards key aspects of algorithmic personalization and people's data privacy concerns and behavior using representative online samples in Germany (N = 1065), Great Britain (N = 1092), and the United States (N = 1059). Our findings show that people object to the collection and use of sensitive personal information and to the personalization of political campaigning and, in Germany and Great Britain, to the personalization of news sources. Encouragingly, attitudes are independent of political preferences: People across the political spectrum share the same concerns about their data privacy and show similar levels of acceptance regarding personalized digital services and the use of private data for personalization. We also found an acceptability gap: People are more accepting of personalized services than of the collection of the personal data and information required for these services. This gap can be observed at both the aggregate and the individual level; across countries, between 64% and 75% of respondents showed an acceptability gap.
Our findings suggest a need for transparent algorithmic personalization that minimizes use of personal data, respects people’s preferences on personalization, is easy to adjust, and does not extend to political advertising.
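The individual-level "acceptability gap" described above can be illustrated with a small computation: for each respondent, compare the mean acceptability rating of personalized services with the mean rating of the underlying data collection, then count the share of respondents whose gap is positive. This is a minimal sketch only; the function names, rating scale and mock data below are illustrative assumptions, not the authors' survey instrument or data.

```python
# Illustrative sketch of an individual-level acceptability gap.
# All names and numbers are hypothetical, not taken from the study.

def acceptability_gap(service_ratings, collection_ratings):
    """Per-respondent gap: mean acceptability of personalized services
    minus mean acceptability of the associated data collection.
    A positive gap means services are rated as more acceptable."""
    service_mean = sum(service_ratings) / len(service_ratings)
    collection_mean = sum(collection_ratings) / len(collection_ratings)
    return service_mean - collection_mean

def share_with_gap(respondents):
    """Fraction of respondents whose gap is strictly positive."""
    gaps = [acceptability_gap(s, c) for s, c in respondents]
    return sum(g > 0 for g in gaps) / len(gaps)

# Three mock respondents: (service ratings, collection ratings) on a 1-5 scale.
mock = [
    ([4, 5, 4], [2, 3, 2]),   # rates services higher than collection
    ([3, 3, 3], [3, 3, 3]),   # no gap
    ([5, 4, 4], [1, 2, 2]),   # rates services higher than collection
]
print(share_with_gap(mock))  # share of mock respondents showing a gap
```

On this mock data, two of three respondents show a positive gap; in the study itself the corresponding share was 64–75% across countries.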


Author(s):  
Irene Chen

The story describes how three school systems are grappling with the loss of private information, each through a unique set of circumstances. Pasadena City Public Schools discovered that it had sold as surplus several computers containing the names and Social Security numbers of employees. Stephens Public Schools learned that personal information about students at one of its middle schools was lost when a bag containing a thumb drive was stolen. And Woodlands Public Schools accidentally exposed employee personal data on a public Web site for a short period of time. How should each of these institutions respond?


2020 ◽  
Vol 120 (6) ◽  
pp. 1059-1083 ◽  
Author(s):  
Peiqi Ding ◽  
Zhiying Zhao ◽  
Xiang Li

Purpose – The power battery is the core of a new energy vehicle and plays a vital role in the rise of the new energy vehicle industry. As the number of waste batteries increases, firms involved in the industry need to dispose of them properly, but which party is responsible remains unclear. To reduce environmental impacts, governments introduce two subsidy policies: collection subsidies, which are provided to collecting firms, and dismantling subsidies, which are provided to dismantling firms.
Design/methodology/approach – Based on the different characteristics of the subsidies, we develop a stylized model to examine the collection strategies and the preferences over the subsidies.
Findings – We derive several insights from the analysis. First, the collection strategies depend on the fixed collection cost. Second, the key factor determining a firm's subsidy preference is the efficiency of dismantling. Finally, if the primary target is the collection rate, governments prefer to provide collection subsidies; if the primary concern is the environmental impact, the choice of subsidies depends on the efficiency of dismantling. Moreover, from a social welfare perspective, the raw material cost and the efficiency of dismantling are the core decision indicators.
Originality/value – This work develops the first analytical model to study the two power battery subsidies and investigate the optimal collection strategies and subsidy preferences. The insights are compelling not only for the manufacturer and the third party but also for policymakers.
Peer review – The peer review history for this article is available at: https://publons.com/publon/10.1108/IMDS-08-2019-0450


2020 ◽  
pp. 004728752095164
Author(s):  
Athina Ioannou ◽  
Iis Tussyadiah ◽  
Graham Miller

Against the backdrop of advancements in technology and its deployment by companies and governments to collect sensitive personal information, information privacy has become an issue of great interest for academics, practitioners, and the general public. The travel and tourism industry has been pioneering the collection and use of biometric data for identity verification. Yet, privacy research focusing on the travel context is scarce. This study developed a valid measurement of Travelers' Online Privacy Concerns (TOPC) through a series of empirical studies: pilot (n = 277) and cross-validation (n = 287). TOPC was then assessed for its predictive validity in its relationships with trust, risk, and intention to disclose four types of personal data: biometric, identifiers, biographic, and behavioral data (n = 685). Results highlight the role of trust in mitigating the relationship between travelers' privacy concerns and data disclosure. This study provides a valuable contribution to research and practice on data privacy in travel.


Author(s):  
Fred Stutzman ◽  
Ralph Gross ◽  
Alessandro Acquisti

Over the past decade, social network sites have experienced dramatic growth in popularity, reaching most demographics and providing new opportunities for interaction and socialization. Through this growth, users have been challenged to manage novel privacy concerns and balance nuanced trade-offs between disclosing and withholding personal information. To date, however, no study has documented how privacy and disclosure evolved on social network sites over an extended period of time. In this manuscript we use profile data from a longitudinal panel of 5,076 Facebook users to understand how their privacy and disclosure behavior changed between 2005 (the early days of the network) and 2011. Our analysis highlights three contrasting trends. First, over time Facebook users in our dataset exhibited increasingly privacy-seeking behavior, progressively decreasing the amount of personal data shared publicly with unconnected profiles in the same network. However, and second, changes implemented by Facebook near the end of the period of time under our observation arrested or in some cases inverted that trend. Third, the amount and scope of personal information that Facebook users revealed privately to other connected profiles actually increased over time, and because of that, so did disclosures to "silent listeners" on the network: Facebook itself, third-party apps, and (indirectly) advertisers. These findings highlight the tension between privacy choices as expressions of individual subjective preferences, and the role of the environment in shaping those choices.


2021 ◽  
Author(s):  
Yurong Gao ◽  
Yiping Guo ◽  
Awais Khan Jumani ◽  
Achyut Shankar

Abstract
Data security needs a comprehensive system design approach that combines legal, administrative, and technical protection. These laws generally contain complete rules and principles relevant to the collection, storage, and use of personal information, in line with international standards on privacy and data protection. Personal data should be legally collected for a specified reason and not be used without authorization for unlawful monitoring or profiling by governments or third parties. In advocacy and open data activity, increasing attention has been placed on privacy problems. To secure the protection of this data, Privacy Law (PL) and regulations typically put forth industrial and technical standards for IT systems that hold and handle personal data. Concerns about information privacy are genuine and valid, and they are exacerbated in the Internet of Things (IoT) and Cyber-Physical Systems (CPS). This article suggests that compliance with IoT and CPS Data Privacy (DP) should be addressed at both technical and non-technical levels. The proposed architecture is then coupled with a reference framework for the business architecture to offer a DP-IoT model focused on industry and technology and positioned to comply with the Protection of Personal Information Act (POPI). Methods are therefore necessary to protect data privacy based on both system and organizational reference designs. Ultimately, users should have specific rights to information about them, including the capacity and means to seek recourse to protect such rights and to acquire and amend incorrect details. The DP-IoT model shows a privacy ratio of 92.6%, a scalability ratio of 91.5%, a data management ratio of 94.3%, a data protection ratio of 96.7%, a customer satisfaction rate of 92.2%, an attack prevention ratio of 95.5% and an energy consumption ratio of 25.5% compared to the existing methods.


2021 ◽  
Author(s):  
Arwa Alrawais ◽  
Fatemah Alharbi ◽  
Moteeb Almoteri ◽  
Sara A Aljwair ◽  
Sara S. Aljwair

The COVID-19 pandemic has swept the world, causing an enormous number of cases and leading to high mortality rates across the globe. Internet of Things (IoT)-based social distancing techniques and many current and emerging technologies have contributed to the fight against the spread of pandemics and reduced the number of positive cases. These technologies generate massive data, which pose a significant threat to data owners' privacy by revealing their lifestyle and personal information, since those data are stored and managed by a third party such as a cloud. This paper provides a new privacy-preserving scheme based on anonymization using an improved slicing technique and employing distributed fog computing. Our implementation shows that the proposed approach ensures data privacy against a third party intending to violate it for any purpose. Furthermore, our results illustrate our scheme's efficiency and effectiveness.
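For readers unfamiliar with slicing, the basic idea behind this family of anonymization techniques can be sketched as follows: attributes are partitioned into column groups, records into buckets, and each column group's values are shuffled independently within a bucket, breaking the linkage between quasi-identifiers and sensitive values. The sketch below illustrates only that basic idea; the authors' improved slicing variant and the fog computing deployment are not reproduced here, and all column and record names are hypothetical.

```python
import random

# Minimal sketch of basic slicing anonymization (illustrative only; this is
# not the paper's improved scheme). Within each bucket, each column group is
# permuted independently, so a row in the output no longer links its
# quasi-identifiers to its original sensitive value.

def slice_table(records, column_groups, bucket_size, seed=None):
    """records: list of dicts; column_groups: tuple of attribute tuples,
    e.g. (("age", "zip"), ("disease",)); returns sliced rows."""
    rng = random.Random(seed)
    sliced = []
    for start in range(0, len(records), bucket_size):
        bucket = records[start:start + bucket_size]
        shuffled_groups = []
        for group in column_groups:
            # Collect this group's values for the bucket, then permute them.
            values = [tuple(rec[col] for col in group) for rec in bucket]
            rng.shuffle(values)
            shuffled_groups.append(values)
        # Recombine one permuted value per group into each output row.
        for row_parts in zip(*shuffled_groups):
            row = {}
            for group, part in zip(column_groups, row_parts):
                row.update(dict(zip(group, part)))
            sliced.append(row)
    return sliced

# Hypothetical records: quasi-identifiers (age, zip) plus a sensitive value.
data = [
    {"age": 30, "zip": "100", "disease": "flu"},
    {"age": 40, "zip": "200", "disease": "cold"},
    {"age": 50, "zip": "300", "disease": "flu"},
    {"age": 60, "zip": "400", "disease": "cold"},
]
anonymized = slice_table(data, (("age", "zip"), ("disease",)), bucket_size=2, seed=1)
```

Note that each bucket still contains the same multiset of values per column group, so aggregate statistics within a bucket are preserved even though individual linkages are broken.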


Significance – The move comes after Facebook suspended a UK political consulting firm, Cambridge Analytica, following allegations on March 18 that it improperly obtained personal data on 50 million Facebook users that were subsequently used in political campaigns. The incident has reignited debates in the United States and elsewhere on online privacy, targeted messaging and whether tech firms are now too powerful to be left to regulate themselves.
Impacts – First Amendment considerations will limit any efforts to control online political advertising in the United States. Accusations that Facebook facilitated foreign meddling in elections will dog it more than allegations of improper acquisition of user data. Internal criticism of Facebook's practices by employees, former employees and investors may be greater agents for change than lawmakers.

