Helping Users Managing Context-Based Privacy Preferences

Author(s):  
Md. Zulfikar Alom ◽  
Barbara Carminati ◽  
Elena Ferrari
Keyword(s):  

Author(s):  
Fulpagare Priya K. ◽  
Nitin N. Patil

Social networks are an emerging e-service built around content sharing sites (CSS), offering reliable communication between users. Some users on a CSS threaten others' privacy over their personal content, for instance by repeatedly sending annoying comments and messages and exploiting the inherent trust users place in their relationship network. Integrating the privacy preferences of multiple users is a difficult task because those preferences may conflict, so techniques to resolve such conflicts are essential. Moreover, these techniques need to consider how users would actually reach an agreement about a solution to the conflict, so that they can offer solutions acceptable to all of the concerned users. This work presents the first conflict-resolution mechanism for multi-party privacy management in social media that is able to adapt to different situations by modeling the concessions users are willing to make to reach a resolution. Billions of items uploaded to social media are co-owned by multiple users, yet only the user who uploads an item is allowed to set its privacy settings (i.e., who can access the item). This is a critical problem because users' privacy preferences for co-owned items can conflict. Multi-party privacy management is therefore crucial if users are to adequately preserve their privacy in social media.
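As a toy illustration of why co-owned items are problematic, the following minimal Python sketch flags the users on whom co-owners' access preferences disagree; any such user represents a conflict that a multi-party mechanism must negotiate. The data layout and function name are illustrative assumptions, not the paper's actual method.

# A minimal sketch of multi-party conflict detection, assuming each co-owner
# of a shared item expresses a privacy preference as the set of users they
# would allow to access it. Names and layout are illustrative only.

def detect_conflicts(preferences):
    """preferences: dict mapping co-owner -> set of users they would grant access.

    Returns the users granted access by at least one co-owner but not by all,
    i.e. the audience on which the co-owners disagree.
    """
    granted_by_any = set().union(*preferences.values())
    granted_by_all = set.intersection(*preferences.values())
    return granted_by_any - granted_by_all


if __name__ == "__main__":
    prefs = {
        "alice": {"bob", "carol", "dave"},  # uploader's preference
        "bob": {"carol"},                   # co-owner tagged in the item
    }
    print(detect_conflicts(prefs))          # {'bob', 'dave'} -> needs negotiation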


Author(s):  
Claudio A. Ardagna ◽  
Sabrina De Capitani di Vimercati ◽  
Sara Foresti ◽  
Stefano Paraboschi ◽  
Pierangela Samarati
Keyword(s):  

2020 ◽  
Vol 35 (1) ◽  
Author(s):  
A. Can Kurtan ◽  
Pınar Yolum

Abstract: Image sharing is a service offered by many online social networks. To preserve the privacy of images, users need to think through and specify a privacy setting for each image they upload. This is difficult for two main reasons: first, research shows that users often do not know their own privacy preferences, but only become aware of them over time; second, even when users know their privacy preferences, editing privacy settings is cumbersome and requires too much effort, interfering with the quick sharing behavior expected on an online social network. Accordingly, this paper proposes a privacy recommendation model for images based on tags, together with an agent, pelte, that implements it. Each user agent uses the privacy settings its user has set for previous images to automatically predict the privacy setting for a newly uploaded image. When in doubt, the agent analyzes the sharing behavior of other users in the user's network to recommend to its user what should be considered private. Contrary to existing approaches that assume all images are available to a centralized model, pelte is compatible with distributed environments, since each agent accesses only the privacy settings of the images that its owner has shared or that have been shared with the user. Our simulations on a real-life dataset show that pelte can accurately predict privacy settings even when a user has shared only a few images, the images carry only a few tags, or the user's friends have varying privacy preferences.
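To make the idea concrete, here is a minimal Python sketch of a tag-based prediction rule in the spirit described above: the agent counts how often its user has labeled images carrying each tag as private or public, and falls back to the counts of neighboring agents when its own evidence is sparse. The class, method names, and threshold are illustrative assumptions, not pelte's actual implementation.

# A minimal sketch of tag-based privacy prediction, assuming each shared
# image carries a set of tags and a binary label ("private"/"public").
# Names and the fallback threshold are illustrative only.

from collections import defaultdict

class PrivacyAgent:
    def __init__(self, min_observations=3):
        self.tag_counts = defaultdict(lambda: {"private": 0, "public": 0})
        self.min_observations = min_observations

    def observe(self, tags, label):
        """Record one image the user shared, labeled 'private' or 'public'."""
        for tag in tags:
            self.tag_counts[tag][label] += 1

    def predict(self, tags, neighbor_agents=()):
        """Predict a setting for a new image; consult neighbors when unsure."""
        private = sum(self.tag_counts[t]["private"] for t in tags)
        public = sum(self.tag_counts[t]["public"] for t in tags)
        if private + public < self.min_observations:
            # Too little local evidence: pool counts from agents whose images
            # have been shared with this user.
            for other in neighbor_agents:
                private += sum(other.tag_counts[t]["private"] for t in tags)
                public += sum(other.tag_counts[t]["public"] for t in tags)
        return "private" if private >= public else "public"

For instance, an agent that has seen its user mark several images tagged "family" as private will predict "private" for a new "family" image, while an agent with no history of a tag borrows evidence from its neighbors' counts instead.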


Author(s):  
Sofia Tsagiopoulou ◽  
Georgios Spathoulas ◽  
Athanasios Kakarountas
Keyword(s):  

Author(s):  
Stefania Gnesi ◽  
Ilaria Matteucci ◽  
Corrado Moiso ◽  
Paolo Mori ◽  
Marinella Petrocchi ◽  
...  

2019 ◽  
Author(s):  
Anya Skatova ◽  
Rebecca Louise McDonald ◽  
Sinong Ma ◽  
Carsten Maple

Data is key to the digital economy, underpinning business models and service provision, and many of these valuable datasets are personal in nature. Information about individual behaviour is collected regularly by organisations, and it has value to businesses, the government and third parties. It is not clear, however, what value this personal data has to consumers themselves. Much of the digital economy is predicated on people sharing personal data; yet if individuals value their privacy, they may choose to withhold this data unless the perceived benefits of sharing outweigh the perceived value of keeping the data private. Further, they might be willing to pay for an otherwise free service if paying allowed them to avoid sharing personal data. We used five evaluation techniques to study preferences for protecting personal data online and found that consumers assign a positive value to keeping a variety of types of personal data private. We show that participants are prepared to pay different amounts to protect different types of data, suggesting that there is no simple function for assigning a monetary value to individual privacy in the digital economy. The majority of participants displayed remarkable consistency in their rankings of the importance of different types of data, a finding that indicates the existence of stable individual privacy preferences for protecting personal data. We discuss our findings in the context of research on the value of privacy and privacy preferences, and in terms of implications for future business models and consumer protection.

