“Sharing Is Caring:” Australian Self-Trackers' Concepts and Practices of Personal Data Sharing and Privacy

2021
Vol 3
Author(s):  
Deborah Lupton

Self-tracking technologies and practices offer ways of generating vast reams of personal details, raising questions about how these data are revealed or exposed to others. In this article, I report on findings from an interview-based study of long-term Australian self-trackers who were collecting and reviewing personal information about their bodies and other aspects of their everyday lives. The discussion focuses on the participants' understandings and practices related to sharing their personal data and to data privacy. The contextual elements of self-tracked sharing and privacy concerns were evident in the participants' accounts and were strongly related to ideas about why and how these details should be accessed by others. Sharing personal information from self-tracking was largely viewed as an intimate social experience. The value of self-tracked data in contributing to close face-to-face relationships was recognized, and related aspects of social privacy were identified. However, most participants did not consider the possibility that their personal information could be distributed well beyond these relationships by third parties for commercial purposes (or what has been termed “institutional privacy”). These findings contribute to a more-than-digital approach to personal data sharing and privacy practices that recognizes the interplay between digital and non-digital practices and contexts. They also highlight the relational and social dimensions of self-tracking and concepts of data privacy.

2020
pp. 004728752095164
Author(s):  
Athina Ioannou
Iis Tussyadiah
Graham Miller

Against the backdrop of advancements in technology and its deployment by companies and governments to collect sensitive personal information, information privacy has become an issue of great interest for academics, practitioners, and the general public. The travel and tourism industry has been pioneering the collection and use of biometric data for identity verification. Yet, privacy research focusing on the travel context is scarce. This study developed a valid measurement of Travelers’ Online Privacy Concerns (TOPC) through a series of empirical studies: pilot (n = 277) and cross-validation (n = 287). TOPC was then assessed for its predictive validity in its relationships with trust, risk, and intention to disclose four types of personal data: biometric, identifiers, biographic, and behavioral data (n = 685). Results highlight the role of trust in mitigating the relationship between travelers’ privacy concerns and data disclosure. This study provides a valuable contribution to research and practice on data privacy in travel.


Author(s):  
Sandra C. Henderson
Charles A. Snyder
Terry A. Byrd

Electronic commerce (e-commerce) has had a profound effect on the way we conduct business. It has impacted economies, markets, industry structures, and the flow of products through the supply chain. Despite the phenomenal growth of e-commerce and the potential impact on the revenues of businesses, there are problems with the capabilities of this technology. Organizations are amassing huge quantities of personal data about consumers. As a result, consumers are very concerned about the protection of their personal information and they want something done about the problem. This study examined the relationships between consumer privacy concerns, actual e-commerce activity, the importance of privacy policies, and regulatory preference. Using a model developed from existing literature and theory, an online questionnaire was developed to gauge the concerns of consumers. The results indicated that consumers are concerned about the protection of their personal information and feel that privacy policies are important. Consumers also indicated that they preferred government regulation to industry self-regulation to protect their personal information.


Author(s):  
Anna Rohunen
Jouni Markkula

Personal data is increasingly collected with the support of rapidly advancing information and communication technology, which raises privacy concerns among data subjects. In order to address these concerns and offer the full benefits of personal-data-intensive services to the public, service providers need to understand how to evaluate privacy concerns in evolving service contexts. By analyzing previously used privacy concern evaluation instruments, we can learn how to adapt them to new contexts. In this article, the historical development of the most widely used privacy concern evaluation instruments is presented and analyzed with respect to the dimensions of privacy concerns. The core dimensions of privacy concerns, as well as the types of context-dependent dimensions to be incorporated into evaluation instruments, are identified. Following this, recommendations on how to utilize the existing evaluation instruments are given, along with suggestions for future research on validation and standardization of the instruments.


Author(s):  
Anastasia Kozyreva
Philipp Lorenz-Spreen
Ralph Hertwig
Stephan Lewandowsky
Stefan M. Herzog

People rely on data-driven AI technologies nearly every time they go online, whether they are shopping, scrolling through news feeds, or looking for entertainment. Yet despite their ubiquity, personalization algorithms and the associated large-scale collection of personal data have largely escaped public scrutiny. Policy makers who wish to introduce regulations that respect people’s attitudes towards privacy and algorithmic personalization on the Internet would greatly benefit from knowing how people perceive personalization and personal data collection. To contribute to an empirical foundation for this knowledge, we surveyed public attitudes towards key aspects of algorithmic personalization and people’s data privacy concerns and behavior using representative online samples in Germany (N = 1065), Great Britain (N = 1092), and the United States (N = 1059). Our findings show that people object to the collection and use of sensitive personal information and to the personalization of political campaigning and, in Germany and Great Britain, to the personalization of news sources. Encouragingly, attitudes are independent of political preferences: People across the political spectrum share the same concerns about their data privacy and show similar levels of acceptance regarding personalized digital services and the use of private data for personalization. We also found an acceptability gap: People are more accepting of personalized services than of the collection of personal data and information required for these services. A large majority of respondents rated, on average, personalized services as more acceptable than the collection of personal information or data. The acceptability gap can be observed at both the aggregate and the individual level. Across countries, between 64% and 75% of respondents showed an acceptability gap.
Our findings suggest a need for transparent algorithmic personalization that minimizes use of personal data, respects people’s preferences on personalization, is easy to adjust, and does not extend to political advertising.
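As a rough illustration of the individual-level acceptability gap described above, the sketch below computes, for a handful of invented respondents (not the study's actual survey data), the difference between each person's acceptability rating of personalized services and of the underlying data collection, and the share of respondents whose gap is positive:

```python
# Hypothetical illustration of the "acceptability gap": each respondent
# rates the acceptability of personalized services and of the personal
# data collection those services require. The gap is positive when
# services are rated as more acceptable than collection.
# All numbers are invented for illustration.

respondents = [
    {"services": 4.0, "collection": 2.5},
    {"services": 3.5, "collection": 3.5},
    {"services": 2.0, "collection": 3.0},
    {"services": 4.5, "collection": 2.0},
]

# Per-respondent gap (individual level).
gaps = [r["services"] - r["collection"] for r in respondents]

# Share of respondents exhibiting a positive gap (aggregate level).
share_with_gap = sum(g > 0 for g in gaps) / len(gaps)

print(f"Mean gap: {sum(gaps) / len(gaps):.2f}")                          # 0.75
print(f"Share of respondents with an acceptability gap: {share_with_gap:.0%}")  # 50%
```

In the study itself this share ranged between 64% and 75% across the three country samples.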


Author(s):  
Xun Li
Radhika Santhanam

Individuals are increasingly reluctant to disclose personal data and sometimes even intentionally fabricate information to avoid the risk of having it compromised. In this context, organizations face an acute dilemma: they must obtain accurate job applicant information in order to make good hiring decisions, but potential employees may be reluctant to provide accurate information because they fear it could be used for other purposes. Building on theoretical foundations from social cognition and persuasion theory, we propose that, depending on levels of privacy concerns, organizations could use appropriate strategies to persuade job applicants to provide accurate information. We conducted a laboratory experiment to examine the effects of two different persuasion strategies on prospective employees’ willingness to disclose information, measured as their intentions to disclose or falsify information. Our results show support for our suggestion. As part of this study, we propose the term information sensitivity to identify the types of personal information that potential employees are most reluctant to disclose.


Author(s):  
Irene Chen

The story describes how three school institutes are grappling with the loss of private information, each through a unique set of circumstances. Pasadena City Public Schools discovered that it had sold several computers containing the names and Social Security numbers of employees as surplus. Stephens Public Schools learned that personal information about students at one of its middle schools was lost when a bag containing a thumb drive was stolen. Also, Woodlands Public Schools accidentally exposed employee personal data on a public Web site for a short period of time. How should each of the institutes react?


2000
Vol 19 (1)
pp. 27-41
Author(s):  
Joseph Phelps
Glen Nowak
Elizabeth Ferrell

The authors examine potential relationships among categories of personal information, beliefs about direct marketing, situational characteristics, specific privacy concerns, and consumers’ direct marketing shopping habits. Furthermore, the authors offer an assessment of the trade-offs consumers are willing to make when they exchange personal information for shopping benefits. The findings indicate that public policy and self-regulatory efforts to alleviate consumer privacy concerns should provide consumers with more control over the initial gathering and subsequent dissemination of personal information. Such efforts must also consider the type of information sought, because consumer concern and willingness to provide marketers with personal data vary dramatically by information type.


Cryptography
2019
Vol 3 (1)
pp. 7
Author(s):  
Karuna Pande Joshi
Agniva Banerjee

An essential requirement of any information management system is to protect data and resources against breach or improper modifications, while at the same time ensuring data access to legitimate users. Systems handling personal data are mandated to track its flow to comply with data protection regulations. We have built a novel framework that integrates a semantically rich data privacy knowledge graph with Hyperledger Fabric blockchain technology to develop an automated access-control and audit mechanism that enforces users' data privacy policies while sharing their data with third parties. Our blockchain-based data-sharing solution addresses two of the most critical challenges: transaction verification and permissioned data obfuscation. Our solution ensures accountability for data sharing in the cloud by incorporating a secure and efficient system for end-to-end provenance. In this paper, we describe this framework along with the comprehensive semantically rich knowledge graph that we have developed to capture rules embedded in data privacy policy documents. Our framework can be used by organizations to automate compliance for their cloud datasets.
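A minimal sketch of the kind of automated policy check this framework describes: a third party's data-access request is evaluated against rules extracted from the user's privacy policy, denying by default. The rule structure and field names below are invented for illustration; the actual framework encodes such rules in a semantic knowledge graph and enforces them via Hyperledger Fabric, neither of which is shown here.

```python
# Simplified, hypothetical access-control check: rules extracted from a
# user's privacy policy say which requester may access which data type.
# In the paper's framework these rules live in a knowledge graph and are
# enforced on a Hyperledger Fabric blockchain; here they are plain dicts.

policy_rules = [
    {"requester": "analytics-partner", "data_type": "location", "allow": False},
    {"requester": "analytics-partner", "data_type": "purchase-history", "allow": True},
]

def is_access_allowed(requester: str, data_type: str, rules: list) -> bool:
    """Deny by default; allow only if an explicit rule permits the access."""
    for rule in rules:
        if rule["requester"] == requester and rule["data_type"] == data_type:
            return rule["allow"]
    return False

print(is_access_allowed("analytics-partner", "purchase-history", policy_rules))  # True
print(is_access_allowed("analytics-partner", "location", policy_rules))          # False
print(is_access_allowed("unknown-party", "location", policy_rules))              # False
```

In the full system, each such decision would also be recorded as a blockchain transaction, giving the audit trail and end-to-end provenance the abstract refers to.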


Author(s):  
Fred Stutzman
Ralph Gross
Alessandro Acquisti

Over the past decade, social network sites have experienced dramatic growth in popularity, reaching most demographics and providing new opportunities for interaction and socialization. Through this growth, users have been challenged to manage novel privacy concerns and balance nuanced trade-offs between disclosing and withholding personal information. To date, however, no study has documented how privacy and disclosure evolved on social network sites over an extended period of time. In this manuscript we use profile data from a longitudinal panel of 5,076 Facebook users to understand how their privacy and disclosure behavior changed between 2005 (the early days of the network) and 2011. Our analysis highlights three contrasting trends. First, over time Facebook users in our dataset exhibited increasingly privacy-seeking behavior, progressively decreasing the amount of personal data shared publicly with unconnected profiles in the same network. However, and second, changes implemented by Facebook near the end of the period of time under our observation arrested or in some cases inverted that trend. Third, the amount and scope of personal information that Facebook users revealed privately to other connected profiles actually increased over time, and because of that, so did disclosures to “silent listeners” on the network: Facebook itself, third-party apps, and (indirectly) advertisers. These findings highlight the tension between privacy choices as expressions of individual subjective preferences, and the role of the environment in shaping those choices.

