privacy preference
Recently Published Documents

TOTAL DOCUMENTS: 31 (five years: 6)
H-INDEX: 6 (five years: 1)

2021 ◽ Vol 16 (7) ◽ pp. 2943-2964
Author(s): Xudong Lin, Xiaoli Huang, Shuilin Liu, Yulin Li, Hanyang Luo, ...

With the rapid development of information technology, digital platforms can collect, utilize, and share large amounts of consumer-specific information. These practices, however, may endanger information security and thus raise privacy concerns among consumers. Accounting for information sharing among firms, this paper constructs a two-period duopoly price-competition Hotelling model and examines the impact of three different levels of privacy regulation on industry profit, consumer surplus, and social welfare. The results show that strong privacy protection does not necessarily make consumers better off, and weak privacy protection does not necessarily hurt them. Information sharing among firms leads to strong competitive effects that prompt firms to lower prices for new customers, damaging firms' profits while raising consumer surplus. The level of social welfare under different privacy regulations depends on consumers' product-privacy preference and the cost of information coordination among firms. As that coordination cost increases, social welfare may be optimal under weak regulation only where consumers have stronger privacy preferences.
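For readers unfamiliar with the underlying framework, the standard one-period Hotelling duopoly can be sketched as follows. This is the textbook baseline, not the paper's exact two-period model with information sharing: consumers are uniform on $[0,1]$ with transport cost $t$, valuation $v$, and both firms have marginal cost $c$.

```latex
% Indifferent consumer: v - p_A - t x = v - p_B - t(1 - x)
\[
  x^{*} = \frac{1}{2} + \frac{p_B - p_A}{2t}, \qquad
  D_A = x^{*}, \quad D_B = 1 - x^{*}.
\]
% Solving the firms' best responses yields the symmetric equilibrium:
\[
  p_A^{*} = p_B^{*} = c + t, \qquad \pi_A = \pi_B = \frac{t}{2}.
\]
```

The two-period version studied in the paper extends this by letting firms condition second-period prices on first-period purchase histories, which is where information sharing and privacy regulation enter.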


2021 ◽ Vol 2021 (4) ◽ pp. 249-269
Author(s): Maximilian Hils, Daniel W. Woods, Rainer Böhme

Abstract: Privacy preference signals are digital representations of how users want their personal data to be processed. Such signals must be adopted by both the sender (users) and the intended recipients (data processors). Adoption represents a coordination problem that remains unsolved despite efforts dating back to the 1990s. Browsers implemented standards like the Platform for Privacy Preferences (P3P) and Do Not Track (DNT), but vendors profiting from personal data faced few incentives to receive and respect the expressed wishes of data subjects. In the wake of recent privacy laws, a coalition of AdTech firms published the Transparency and Consent Framework (TCF), which defines an opt-in consent signal. This paper integrates post-GDPR developments into the wider history of privacy preference signals. Our main contribution is a high-frequency longitudinal study describing how the TCF signal gained dominance as of February 2021. We explore which factors correlate with adoption at the website level. Both the number of third parties on a website and the presence of Google Ads are associated with higher adoption of TCF. Further, we show that vendors acted as early adopters of TCF 2.0 and provide two case studies describing how Consent Management Providers shifted existing customers to TCF 2.0. We sketch ways forward for a pro-privacy signal.
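A website-level adoption study like this one needs a way to detect TCF support in crawled pages. A minimal sketch, assuming detection via the presence of the standard entry points (the TCF v2 spec exposes `window.__tcfapi`, and TCF v1 used `__cmp`); the `TCF_MARKERS` heuristic is an illustrative assumption, not the authors' actual crawler logic:

```python
import re

# Heuristic markers of TCF adoption (assumption for illustration):
# TCF v2 pages expose window.__tcfapi, TCF v1 pages exposed __cmp.
TCF_MARKERS = [r"__tcfapi", r"__cmp\b"]

def detects_tcf(html: str) -> bool:
    """Return True if the page source contains a known TCF entry point."""
    return any(re.search(marker, html) for marker in TCF_MARKERS)

page = '<script>window.__tcfapi = function(cmd, ver, cb) {};</script>'
print(detects_tcf(page))  # -> True
```

Running such a check repeatedly over the same set of sites is what turns a one-off snapshot into the high-frequency longitudinal series described above.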


2021 ◽ Vol 3 (2)
Author(s): Hilda Hadan, Laura Calloway, Shakthidhar Gopavaram, Shrirang Mare, L. Jean Camp

The ongoing COVID-19 pandemic has brought surveillance and privacy concerns to the forefront, given that contact tracing has been seen as a very effective tool to prevent the spread of infectious disease and that public authorities and government officials hope to use it to contain the spread of COVID-19. On the other hand, the rejection of contact tracing tools has also been widely reported, partly due to privacy concerns. We conducted an online survey to identify participants' privacy concerns and their risk perceptions during the ongoing COVID-19 pandemic. Our results contradict media claims that people are more willing to share their private information in a public health crisis. We identified significant differences depending on the information recipient, the type of device, and the intended purpose; our findings thus qualify those claims rather than suggest a fundamental shift. We note that participants' privacy preferences are largely shaped by their perceived autonomy and the perceived severity of consequences related to privacy risks. In contrast, even during an ongoing pandemic, health risk perceptions had limited influence on participants' privacy preferences, with only the perceived newness of the risk weakly increasing their comfort level. Finally, our results show that participants' computer expertise has a positive influence on their privacy preference, while their security knowledge makes them less comfortable with sharing.


2018 ◽ Vol 2018 ◽ pp. 1-13
Author(s): Liang Xiao, Fei-Peng Guo, Qi-Bei Lu

Existing mobile personalized services (MPS) give little consideration to users' privacy. To address this issue and some other shortcomings, the paper proposes an MPS recommender model for item recommendation based on sentiment analysis and privacy concern. First, the paper puts forward a sentiment analysis algorithm based on a sentiment vocabulary ontology and clusters users by sentiment tendency. Second, the paper proposes a measurement algorithm that integrates personality traits with privacy preference intensity, and clusters users by personality traits. Third, the paper achieves hybrid collaborative filtering recommendation by combining sentiment analysis with privacy concern. Experiments show that this model can effectively address the data sparsity and cold-start problems of MPS. More importantly, combining subjective privacy concern with objective recommendation technology can reduce the influence of users' privacy concerns on their acceptance of MPS.
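The hybrid idea described above, blending a sentiment-derived user similarity with a privacy-preference similarity before applying collaborative filtering, could be sketched roughly as follows. All names, the blending weight `alpha`, and the `[0, 1]` privacy-intensity scale are assumptions for illustration, not the paper's actual formulation:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two sentiment vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def hybrid_similarity(sent_u, sent_v, priv_u, priv_v, alpha=0.7):
    """Blend sentiment similarity with privacy-intensity closeness.

    priv_u, priv_v are privacy preference intensities in [0, 1];
    alpha weights the sentiment component (an assumed parameter).
    """
    sim_sent = cosine(sent_u, sent_v)
    sim_priv = 1.0 - abs(priv_u - priv_v)
    return alpha * sim_sent + (1 - alpha) * sim_priv

def predict_rating(neighbor_ratings, sims, item):
    """User-based CF: similarity-weighted mean of neighbors' ratings."""
    pairs = [(s, r[item]) for s, r in zip(sims, neighbor_ratings) if item in r]
    den = sum(s for s, _ in pairs)
    return sum(s * v for s, v in pairs) / den if den else None
```

Clustering users on the blended similarity before prediction is one plausible way such a model mitigates cold start: a new user with few ratings can still be placed in a cluster via sentiment and privacy signals.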


2018 ◽ Vol 3 (1) ◽ pp. 40-53
Author(s): Daniela Fernandez Espinosa, Lu Xiao

Abstract
Purpose: In this paper, we describe how gender recognition on Twitter can be used as an intelligent business tool to determine the privacy concerns among users, and ultimately offer a more personalized service for customers who are more likely to respond positively to targeted advertisements.
Design/methodology/approach: We worked with two different data sets to examine whether Twitter users' gender, inferred from the first name of the account and the profile description, correlates with the privacy setting of the account. We also used a set of features, including the inferred gender of Twitter users, to develop classifiers that predict user privacy settings.
Findings: We found that the inferred gender of Twitter users correlates with the account's privacy setting. Specifically, females tend to be more privacy concerned than males. Users whose gender cannot be inferred from their provided first names also tend to be more privacy concerned. In addition, our classification performance suggests that inferred gender can be used as an indicator of the user's privacy preference.
Research limitations: It is known that not all Twitter accounts are real user accounts, and social bots tweet as well. A major limitation of our study is the lack of consideration of social bots in the data. In our study, this implies that at least some percentage of the undefined accounts (accounts whose names do not appear in the name dictionary) may in fact be social bots. It would be interesting to explore the privacy settings of social bots in the Twitter space.
Practical implications: Companies are investing large amounts of money in business intelligence tools that allow them to know the preferences of their consumers. Due to the large number of consumers around the world, it is very difficult for companies to have direct communication with each customer to anticipate market changes. For this reason, the social network Twitter has gained relevance as an ideal tool for information extraction. On the other hand, users' privacy preferences need to be considered when companies leverage their publicly available data. This paper suggests that gender recognition of Twitter users, based on their provided first names and profile descriptions, can be used to infer the users' privacy preference.
Originality/value: This study explored a new way of inferring a Twitter user's gender: recognizing it from the provided first name and the user's profile description. The potential of this information for predicting the user's privacy preference is explored.
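The name-dictionary approach described above can be sketched minimally as follows. The tiny `NAME_GENDER` mapping is purely illustrative (the study used a full name dictionary, and additionally consulted the profile description), and the "undefined" label corresponds to the accounts discussed under Research limitations:

```python
# Illustrative toy dictionary; the actual study used a full name list.
NAME_GENDER = {"laura": "female", "daniel": "male", "maria": "female"}

def infer_gender(display_name: str) -> str:
    """Look up the first token of the display name in the dictionary.

    Returns 'undefined' when the name is absent from the dictionary,
    the category the paper associates with higher privacy concern.
    """
    tokens = display_name.strip().split()
    first = tokens[0].lower() if tokens else ""
    return NAME_GENDER.get(first, "undefined")

print(infer_gender("Laura Smith"))   # -> female
print(infer_gender("Xx_gamer_99"))   # -> undefined
```

The inferred label can then serve as one feature, alongside others, in a classifier predicting whether an account's tweets are protected.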

