privacy behaviors
Recently Published Documents

TOTAL DOCUMENTS: 26 (five years: 7)
H-INDEX: 12 (five years: 1)
2021 ◽ Vol 9 (4) ◽ pp. 158-169
Author(s): Johanna Schäwel ◽ Regine Frener ◽ Sabine Trepte

Social media allow political parties to conduct political behavioral targeting in order to address and persuade specific groups of users and potential voters. This has been criticized: Most social media users do not know about these microtargeting strategies, and the majority of people who are aware of targeted political advertising say that it is not acceptable. This intrusion on personal privacy is viewed as problematic by users and activists alike. The overarching goal of this article is to elaborate on social media users’ privacy perceptions and potential regulating behaviors in the face of political microtargeting. This work is theoretical in nature. We first review theoretical and empirical research in the field of political microtargeting and online privacy. We then analyze how privacy is experienced by social media users during political microtargeting. Building on our theoretical analysis, we finally suggest clear-cut propositions for how political microtargeting can be researched while considering users’ privacy needs on the one hand and relevant political outcomes on the other.


Author(s): Dmitry Epstein ◽ Kelly Quinn

The goals of this study are twofold. We extend established models linking attitudes related to privacy concerns and privacy-protecting behavior (PPB) by (a) differentiating between horizontal (social) and vertical (institutional) orientations of PPB as capturing an aspect of privacy's multidimensionality, and (b) introducing additional explanatory factors such as privacy literacy and privacy self-efficacy into the modeling of PPB. We survey a representative sample of 686 US social media users to test relationships between privacy concern, trust, privacy self-efficacy, privacy literacy, and vertical and horizontal PPB. We find that privacy concerns contribute to horizontal and vertical PPB to different degrees, reinforcing the dimensionality of privacy. We also find that privacy literacy and privacy self-efficacy are important factors in explaining dimensional privacy behaviors, and that they moderate the established relationships between privacy concerns and PPB.
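The moderation finding described above can be sketched as a regression with an interaction term. The snippet below is illustrative only: it uses simulated data, not the study's survey, and the variable names and coefficients are hypothetical. The idea is that if privacy literacy moderates the effect of privacy concern on privacy-protecting behavior (PPB), the concern-to-PPB slope changes with literacy, which shows up as a nonzero coefficient on the concern × literacy product term.

```python
import numpy as np

# Hypothetical moderation model (simulated data, not the study's):
#   PPB = b0 + b1*concern + b2*literacy + b3*(concern * literacy) + error
# A nonzero b3 means literacy moderates the concern -> PPB relationship.
rng = np.random.default_rng(0)
n = 200
concern = rng.normal(size=n)
literacy = rng.normal(size=n)

# Simulate responses in which the concern slope grows with literacy (true b3 = 0.5).
ppb = (1.0 + 0.4 * concern + 0.3 * literacy
       + 0.5 * concern * literacy
       + rng.normal(scale=0.1, size=n))

# Design matrix: intercept, main effects, and the interaction term.
X = np.column_stack([np.ones(n), concern, literacy, concern * literacy])

# Ordinary least squares fit; b3 should recover roughly 0.5.
coef, *_ = np.linalg.lstsq(X, ppb, rcond=None)
b0, b1, b2, b3 = coef
```

In practice the interaction coefficient would be tested for significance (e.g., with a t-test on b3) rather than merely estimated, but the design-matrix construction is the core of how a moderation hypothesis is operationalized.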


2021 ◽ Vol 2021 (3) ◽ pp. 334-350
Author(s): Sebastian Linsner ◽ Franz Kuntke ◽ Enno Steinbrink ◽ Jonas Franken ◽ Christian Reuter

Abstract Technological progress can disrupt domains and change the way we work and collaborate. This paper presents a qualitative study with 52 German farmers that investigates the impact of the ongoing digitalization process in agriculture and discusses the implications for privacy research. As in other domains, the introduction of digital tools and services turns the data itself into a resource. Sharing this data along the supply chain is favored by retailers and consumers, who benefit from the traceability that transparency provides. However, transparency can pose a privacy risk: having insight into the business data of others along the supply chain confers an advantage in market position. This is particularly true in agriculture, where there is already a significant imbalance of power between actors. A multitude of small and medium-sized farming businesses face large upstream and downstream players that drive technological innovation. Further weakening the market position of farmers could have severe consequences for the entire sector. We found that, on the one hand, privacy behaviors are affected by the adoption of digitalization and, on the other hand, privacy itself influences the adoption of digital tools. Our study sheds light on the emerging challenges for farmers and the role of privacy in the process of digitalization in agriculture.


2021 ◽ Vol 10 (3) ◽ pp. 283-306
Author(s): Yannic Meier ◽ Johanna Schäwel ◽ Nicole C. Krämer

Using privacy-protecting tools and reducing self-disclosure can decrease the likelihood of experiencing privacy violations. Whereas previous studies found people's online self-disclosure to result from privacy risk and benefit perceptions, the present study extended this so-called privacy calculus approach by additionally focusing on privacy protection by means of a tool. Furthermore, it is important to understand contextual differences in privacy behaviors as well as characteristics of privacy-protecting tools that may affect usage intention. Results of an online experiment (N = 511) supported the basic notion of the privacy calculus and revealed that perceived privacy risks were strongly related to participants' desired privacy protection, which, in turn, was positively related to the willingness to use a privacy-protecting tool. Self-disclosure was found to be context dependent, whereas privacy protection was not. Moreover, participants tended to forgo using a tool that records their data, even though this recording was described as enhancing privacy protection.


2020 ◽ Vol 67 (3) ◽ pp. 697-711
Author(s): Nancy K. Lankton ◽ D. Harrison McKnight ◽ John F. Tripp

Author(s): Muhammad Irtaza Safi ◽ Abhiditya Jha ◽ Makak Eihab Aly ◽ Xinru Page ◽ Sameer Patil ◽ ...

2018 ◽ Vol 2018 (3) ◽ pp. 63-83
Author(s): Irwin Reyes ◽ Primal Wijesekera ◽ Joel Reardon ◽ Amit Elazari Bar On ◽ Abbas Razaghpanah ◽ ...

Abstract We present a scalable dynamic analysis framework that allows for the automatic evaluation of the privacy behaviors of Android apps. We use our system to analyze mobile apps’ compliance with the Children’s Online Privacy Protection Act (COPPA), one of the few stringent privacy laws in the U.S. Based on our automated analysis of 5,855 of the most popular free children’s apps, we found that a majority are potentially in violation of COPPA, mainly due to their use of third-party SDKs. While many of these SDKs offer configuration options to respect COPPA by disabling tracking and behavioral advertising, our data suggest that a majority of apps either do not make use of these options or incorrectly propagate them across mediation SDKs. Worse, we observed that 19% of children’s apps collect identifiers or other personally identifiable information (PII) via SDKs whose terms of service outright prohibit their use in child-directed apps. Finally, we show that efforts by Google to limit tracking through the use of a resettable advertising ID have had little success: of the 3,454 apps that share the resettable ID with advertisers, 66% transmit other, non-resettable, persistent identifiers as well, negating any intended privacy-preserving properties of the advertising ID.
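The last finding rests on a simple observation: a resettable advertising ID only protects privacy if it is not transmitted alongside a persistent identifier that can re-link the reset ID to the same device. A minimal sketch of that check is below. This is not the authors' framework; the app package names and identifier labels are hypothetical, standing in for identifiers one might observe in an app's network traffic.

```python
# Hypothetical example data: app package name -> set of identifier types
# observed in that app's outgoing network traffic.
observed_ids = {
    "com.example.kidsgame": {"aaid", "android_id"},
    "com.example.puzzles": {"aaid"},
    "com.example.coloring": {"aaid", "imei", "android_id"},
}

# Non-resettable, persistent identifiers that can re-link a reset AAID.
PERSISTENT_IDS = {"android_id", "imei", "mac_address", "serial"}

def negates_aaid(ids):
    """True if the app transmits the resettable advertising ID (AAID)
    together with any persistent identifier, defeating resettability."""
    return "aaid" in ids and bool(ids & PERSISTENT_IDS)

flagged = sorted(app for app, ids in observed_ids.items() if negates_aaid(ids))
print(flagged)  # ['com.example.coloring', 'com.example.kidsgame']
```

The set-intersection test mirrors the paper's 66% statistic at the level of a single app: any overlap between the observed identifiers and the persistent-identifier set means resetting the advertising ID gains the user nothing.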

