Image sharing privacy policy on social networks using A3P

2017 ◽  
Vol 2 (3) ◽  
Author(s):  
S. Saranya ◽  
M. Ranjith Kumar ◽  
K. Madheswaran

Maintaining privacy when users share images on social sites has become a major problem, as demonstrated by a recent wave of publicized incidents where users inadvertently shared personal information. In light of these incidents, the need for tools to help users control access to their shared content is apparent. Toward addressing this need, an Adaptive Privacy Policy Prediction (A3P) system is proposed to help users compose privacy settings for their images. The solution relies on an image classification framework that groups images into categories which may be associated with similar policies, and on a policy prediction algorithm to automatically generate a policy for each newly uploaded image, also according to the user's social features. Image sharing takes place both among previously established groups of known people or social circles and, increasingly, with people outside the user's social circles for purposes of social discovery, to help users identify new peers and learn about peers' interests and social surroundings. Sharing images within online content sharing sites may therefore quickly lead to unwanted disclosure: the aggregated information can result in unexpected exposure of one's social environment and lead to abuse of one's personal information.

Author(s):  
. Yadagiri ◽  
G. Dayakar ◽  
Shaik Abdul Nabi

An Adaptive Privacy Policy Prediction (A3P) system supports users in composing privacy settings for their images. With the increasing volume of images users share through social sites, maintaining privacy has become a major problem, as demonstrated by a recent wave of publicized incidents where users unintentionally shared personal information. In light of these incidents, the need for tools to help users control access to their shared content is apparent. Toward addressing this need, the role of social context, image content, and metadata is examined as possible indicators of users' privacy preferences. A two-level framework determines, according to the user's available history on the site, the best available privacy policy for the user's images being uploaded. The solution relies on an image classification framework for image categories which may be associated with similar policies, and on a policy prediction algorithm to automatically generate a policy for each newly uploaded image, also according to users' social features. Over time, the generated policies will follow the evolution of users' privacy attitudes. An extensive evaluation demonstrates the efficacy of the system in terms of prediction accuracy.
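The two-level idea described above (categorize the image, then predict a policy from the user's history for that category) can be sketched minimally as follows. This is an illustrative reconstruction, not the paper's algorithm: the function names, the majority-vote rule, and the `"private"` fallback are all assumptions.

```python
# Hypothetical sketch of a two-level policy prediction: images are first
# grouped into content categories, then the most frequent policy among the
# user's past uploads in that category is proposed for a new image.
from collections import Counter

def predict_policy(history, category, default_policy="private"):
    """history: list of (category, policy) pairs from the user's past uploads."""
    policies = [p for c, p in history if c == category]
    if not policies:
        return default_policy  # no history for this category yet
    # Majority vote over past policies within the same category.
    return Counter(policies).most_common(1)[0][0]

history = [
    ("family", "friends-only"),
    ("family", "friends-only"),
    ("landscape", "public"),
]
print(predict_policy(history, "family"))  # friends-only
print(predict_policy(history, "selfie"))  # private (fallback)
```

A real system would replace the literal category labels with the output of an image classifier and refine the vote with the user's social features.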


2018 ◽  
Vol 7 (1.7) ◽  
pp. 142
Author(s):  
Hemalatha D ◽  
Almas Begum ◽  
Alex David S

Presently, the growth of social media among users is explosive. Increasingly developed social websites like Flickr, Facebook, Google+, LinkedIn, etc. permit users to create, share, and view posts. Confidentiality is a leading requirement in social networks. Social users upload their photos to social sites intending to gain public interest for social purposes. The exposure of personal information leads to misuse such as identity theft, morphing, etc., which constitute privacy violations. The privacy settings of each user should be defined based on the user's personal characteristics. In this paper, a comparative study of privacy settings in online social networks is presented. It begins with the importance of social networks among social users and their behavior toward online social networks, followed by an exploration of the privacy techniques suggested by other researchers. Finally, an overview of the merits and demerits of privacy designs and schemes for user-uploaded images is presented. The study results in a new privacy system that protects confidential information from being accessed from different devices, including mobile devices and computers.


2022 ◽  
Vol 22 (1) ◽  
pp. 1-32
Author(s):  
Onuralp Ulusoy ◽  
Pinar Yolum

Privacy is the right of individuals to keep personal information to themselves. When individuals use online systems, they should be given the right to decide what information they would like to share and what to keep private. When a piece of information pertains only to a single individual, preserving privacy is possible by providing the right access options to the user. However, when a piece of information pertains to multiple individuals, such as a picture of a group of friends or a collaboratively edited document, deciding how to share this information and with whom is challenging. The problem becomes more difficult when the individuals who are affected by the information have different, possibly conflicting privacy constraints. Resolving this problem requires a mechanism that takes into account the relevant individuals’ concerns to decide on the privacy configuration of information. Because these decisions need to be made frequently (i.e., for each piece of shared content), the mechanism should be automated. This article presents a personal assistant to help end-users with managing the privacy of their content. When some content that belongs to multiple users is about to be shared, the personal assistants of the users employ an auction-based privacy mechanism to regulate the privacy of the content. To do so, each personal assistant learns the preferences of its user over time and produces bids accordingly. Our proposed personal assistant is capable of assisting users with different personas and thus ensures that people benefit from it as they need it. Our evaluations over multiagent simulations with online social network content show that our proposed personal assistant enables privacy-respecting content sharing.
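The auction-based mechanism can be illustrated with a minimal sketch. This is not the article's exact protocol; the bid values, the tallying rule, and the function name are assumptions made for illustration: each co-owner's assistant bids on its preferred privacy setting, and the setting with the highest total bid is applied.

```python
# Illustrative auction over privacy settings for a co-owned item: each
# personal assistant submits (setting, bid) and the setting gathering the
# most bidding support wins.
def run_privacy_auction(bids):
    """bids: list of (setting, amount) pairs, one per co-owner's assistant."""
    totals = {}
    for setting, amount in bids:
        totals[setting] = totals.get(setting, 0) + amount
    # Apply the setting that accumulated the highest total bid.
    return max(totals, key=totals.get)

bids = [("private", 7), ("friends-only", 3), ("private", 2), ("public", 5)]
print(run_privacy_auction(bids))  # private (total bid 9)
```

In the article's setting, the bid amounts would come from each assistant's learned model of how strongly its user cares about the content, rather than from fixed numbers.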


Author(s):  
Fulpagare Priya K. ◽  
Nitin N. Patil

Social networks are an emerging e-service for Content Sharing Sites (CSS), providing reliable communication. Some users on a CSS affect other users' privacy over their personal content, for instance by sending annoying comments and messages, taking advantage of the users' inherent trust in their relationship network. Integrating multiple users' privacy preferences is a very difficult task because the preferences may conflict, so techniques to resolve conflicts are essential. Moreover, these methods need to consider how users would actually reach an agreement about a solution to the conflict in order to offer solutions acceptable to all of the concerned users. This work presents the first mechanism to resolve conflicts for multi-party privacy management in social media that is able to adapt to different situations by modeling the concessions that users make to reach a solution to the conflicts. Billions of items uploaded to social media are co-owned by multiple users, yet only the user who uploads an item is allowed to set its privacy settings (i.e., who can access the item). This is a critical problem, as users' privacy preferences for co-owned items can conflict. Multi-party privacy management is therefore of crucial importance for users to appropriately preserve their privacy in social media.
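One of the simplest conflict-resolution strategies for co-owned items can be sketched as follows. This is an illustrative baseline only, not the adaptive concession mechanism described above: it resolves a conflict conservatively by sharing the item only with people whom every co-owner would allow.

```python
# Conservative multi-party resolution baseline: the final audience is the
# intersection of the audiences each co-owner is willing to allow.
def resolve_audience(preferences):
    """preferences: list of sets, each the audience one co-owner allows."""
    if not preferences:
        return set()
    allowed = set(preferences[0])
    for audience in preferences[1:]:
        allowed &= audience  # keep only commonly accepted viewers
    return allowed

prefs = [{"ann", "bob", "carol"}, {"bob", "carol", "dave"}, {"bob", "carol"}]
print(sorted(resolve_audience(prefs)))  # ['bob', 'carol']
```

A concession-based mechanism generalizes this by letting users relax their individual audience sets when the strict intersection would be unacceptably restrictive.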


2020 ◽  
Vol 35 (1) ◽  
Author(s):  
A. Can Kurtan ◽  
Pınar Yolum

Abstract: Image sharing is a service offered by many online social networks. In order to preserve the privacy of images, users need to think through and specify a privacy setting for each image that they upload. This is difficult for two main reasons: first, research shows that many times users do not know their own privacy preferences, but only become aware of them over time. Second, even when users know their privacy preferences, editing these privacy settings is cumbersome and requires too much effort, interfering with the quick sharing behavior expected on an online social network. Accordingly, this paper proposes a privacy recommendation model for images using tags and an agent that implements this, namely pelte. Each user agent makes use of the privacy settings that its user has set for previous images to predict automatically the privacy setting for an image that is uploaded to be shared. When in doubt, the agent analyzes the sharing behavior of other users in the user's network to be able to recommend to its user what should be considered private. Contrary to existing approaches that assume all the images are available to a centralized model, pelte is compatible with distributed environments, since each agent accesses only the privacy settings of the images that the agent's owner has shared or that have been shared with the user. Our simulations on a real-life dataset show that pelte can accurately predict privacy settings even when a user has shared only a few images with others, the images have only a few tags, or the user's friends have varying privacy preferences.
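A tag-based prediction in the spirit of the approach above can be sketched as follows. This is a hedged reconstruction, not pelte's actual model: the class names, the per-tag counting scheme, and the `"private"` default are illustrative assumptions. The agent counts how often each tag appeared on images its user kept private versus shared publicly, and scores a new image by its tags.

```python
# Illustrative per-tag privacy model: tags seen on private images push a
# new image toward "private", tags seen on public images toward "public".
from collections import defaultdict

class TagPrivacyModel:
    def __init__(self):
        self.counts = defaultdict(lambda: {"private": 0, "public": 0})

    def observe(self, tags, setting):
        """Record the privacy setting the user chose for an image's tags."""
        for tag in tags:
            self.counts[tag][setting] += 1

    def predict(self, tags, default="private"):
        """Score a new image by summing tag evidence from past uploads."""
        private = sum(self.counts[t]["private"] for t in tags)
        public = sum(self.counts[t]["public"] for t in tags)
        if private == public == 0:
            return default  # no evidence; here the agent could consult its network
        return "private" if private >= public else "public"

model = TagPrivacyModel()
model.observe(["family", "home"], "private")
model.observe(["beach", "sunset"], "public")
print(model.predict(["home", "dinner"]))  # private
```

The `default` branch is where a distributed agent would instead query the settings its user's friends chose for similarly tagged images, as the abstract describes.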


2010 ◽  
Vol 25 (2) ◽  
pp. 109-125 ◽  
Author(s):  
Hanna Krasnova ◽  
Sarah Spiekermann ◽  
Ksenia Koroleva ◽  
Thomas Hildebrand

On online social networks such as Facebook, massive self-disclosure by users has attracted the attention of industry players and policymakers worldwide. Despite the impressive scope of this phenomenon, very little is understood about what motivates users to disclose personal information. Integrating focus group results into a theoretical privacy calculus framework, we develop and empirically test a structural equation model of self-disclosure with 259 subjects. We find that users are primarily motivated to disclose information because of the convenience of maintaining and developing relationships and platform enjoyment. Countervailing these benefits, privacy risks represent a critical barrier to information disclosure. However, users’ perception of risk can be mitigated by their trust in the network provider and availability of control options. Based on these findings, we offer recommendations for network providers.


Author(s):  
Bailing Liu ◽  
Paul A. Pavlou ◽  
Xiufeng Cheng

Companies face a trade-off between creating stronger privacy protection policies for consumers and employing more sophisticated data collection methods. Justice-driven privacy protection outlines a method to manage this trade-off. We built on the theoretical lens of justice theory to integrate justice provision with two key privacy protection features, negotiation and active-recommendation, and proposed an information technology (IT) solution to balance the trade-off between privacy protection and consumer data collection. In the context of mobile banking applications, we prototyped a theory-driven IT solution, referred to as negotiation, active-recommendation privacy policy application, which enables customer service agents to interact with and actively recommend personalized privacy policies to consumers. We benchmarked our solution through a field experiment relative to two conventional applications: an online privacy statement and a privacy policy with only a simple negotiation feature. The results showed that the proposed IT solution improved consumers’ perceived procedural justice, interactive justice, and distributive justice and increased their psychological comfort in using our application design and in turn reduced their privacy concerns, enhanced their privacy awareness, and increased their information disclosure intentions and actual disclosure behavior in practice. Our proposed design can provide consumers better privacy protection while ensuring that consumers voluntarily disclose personal information desirable for companies.

