Privacy in targeted advertising: A survey

2020 ◽  
Author(s):  
Imdad Ullah ◽  
Roksana Boreli ◽  
Salil S. Kanhere

Targeted advertising has transformed marketing by creating new opportunities for advertisers to reach prospective customers with personalised ads, delivered through an infrastructure of intermediary entities and technologies. Advertising and analytics companies collect, aggregate, process and trade a wealth of users' personal data, which has prompted serious privacy concerns among individuals and organisations. This article presents a detailed survey of privacy risks, including the information flow between advertising platforms and ad/analytics networks, the profiling process, the advertising sources and criteria, the measurement and analysis of targeting based on users' interests and profiling context, and the ad delivery process for both in-app and in-browser targeted ads. We provide a detailed discussion of the challenges in preserving user privacy, covering the privacy threats posed by advertising and analytics companies, how private information is extracted and exchanged among advertising entities, threats from third-party tracking, re-identification of private information and the associated privacy risks, as well as an overview of data-sharing and tracking technologies. We then present various techniques for preserving user privacy, provide a comprehensive analysis of proposals founded on those techniques, and compare them based on their underlying architectures, privacy mechanisms and deployment scenarios. Finally, we discuss potential research challenges and open research issues.


2014 ◽  
Vol 2014 ◽  
pp. 1-10 ◽  
Author(s):  
Shin-Yan Chiou ◽  
Yi-Cheng Chen

Rates of vehicle ownership have risen globally in recent years, exacerbating problems including air pollution, lack of parking, and traffic congestion. While many solutions to these problems have been proposed, carpooling remains one of the most effective approaches. Several carpooling platforms have recently been built on cloud computing systems, with originators posting online lists of departure/arrival points and schedules from which participants can search for rides that match their needs. However, matches can be difficult to make quickly, and the systems raise privacy concerns because they may disclose private information such as names, registration data, and departure/arrival schedules. This paper proposes a dynamic matching method for car/taxi pools for use on mobile devices over ad hoc Wi-Fi networks. The proposed method also preserves user privacy, including names and departure/arrival schedules. Moreover, the system does not require the user to register any personal data, so such data cannot be leaked. The system was implemented on the Android mobile platform, allowing users to access it immediately and securely from their smartphones.


Author(s):  
Eko Wahyu Tyas Darmaningrat ◽  
Hanim Maria Astuti ◽  
Fadhila Alfi

Background: Teenagers in Indonesia have an open nature and satisfy their desire for online presence by uploading photos or videos and writing posts on Instagram. The habit of uploading photos, videos, or posts containing personal information can be dangerous and can lead to user privacy problems; several criminal cases caused by information misuse have occurred in Indonesia. Objective: This paper investigates information privacy concerns among Instagram users in Indonesia, more specifically among college students, the largest group of Instagram users in Indonesia. Methods: The study applied the Internet Users' Information Privacy Concerns (IUIPC) framework, collecting data through online questionnaires and analysing it using Structural Equation Modelling (SEM). Results: The findings show that even though students are mindful of the potential danger of information misuse on Instagram, this does not affect their intention to use Instagram. Other factors that influence Indonesian college students' trust are Instagram's reputation, the number of Instagram users, the ease of using Instagram, students' skills and knowledge of Instagram, and the privacy settings Instagram provides. Conclusion: Indonesian college students' awareness of and concern for information privacy significantly increase their awareness of information privacy risks. However, this increased risk awareness does not directly affect their behaviour of posting private information on Instagram.


Author(s):  
Fred Stutzman ◽  
Ralph Gross ◽  
Alessandro Acquisti

Over the past decade, social network sites have experienced dramatic growth in popularity, reaching most demographics and providing new opportunities for interaction and socialization. Through this growth, users have been challenged to manage novel privacy concerns and balance nuanced trade-offs between disclosing and withholding personal information. To date, however, no study has documented how privacy and disclosure evolved on social network sites over an extended period of time. In this manuscript we use profile data from a longitudinal panel of 5,076 Facebook users to understand how their privacy and disclosure behavior changed between 2005 (the early days of the network) and 2011. Our analysis highlights three contrasting trends. First, over time Facebook users in our dataset exhibited increasingly privacy-seeking behavior, progressively decreasing the amount of personal data shared publicly with unconnected profiles in the same network. However, and second, changes implemented by Facebook near the end of the period under our observation arrested or in some cases inverted that trend. Third, the amount and scope of personal information that Facebook users revealed privately to other connected profiles actually increased over time, and because of that, so did disclosures to "silent listeners" on the network: Facebook itself, third-party apps, and (indirectly) advertisers. These findings highlight the tension between privacy choices as expressions of individual subjective preferences, and the role of the environment in shaping those choices.


2021 ◽  
Author(s):  
Daria Ilkina

This thesis investigates the privacy risks that m-learning app users face by identifying the personal information that m-learning apps collect from their users, and by examining the privacy policies of these apps. It reveals that most m-learning applications have similar privacy policies, which appear to protect the interests of the providers rather than the users. The Privacy by Design framework is reviewed to determine whether it can help developers address user privacy in their practices. Results from a sample of 260 participants suggest that users are less concerned about the collection of personal information that is non-identifiable. The survey also revealed that users are more concerned when an app shares their personal information with third parties for commercial purposes than when it is shared with the government.


2019 ◽  
Vol 8 (2) ◽  
pp. 2947-2951

The rapid development of cloud computing for smart healthcare systems has significantly improved the quality of healthcare. However, data security and user privacy remain major concerns for smart healthcare systems, as almost any kind of data can be exploited for malicious purposes and many harmful entities constantly try to gain access to the personal data of internet users. This data includes sensitive patient information that doctors store, often with third-party cloud providers that are not very secure. To address this issue, this paper utilizes a Symmetric Balanced Incomplete Block Design (SBIBD) for key security so that unauthorized clients cannot easily gain access to the data. It also allows patients immediate and easy access to the data using a unique user ID. The system uses double encryption with the Blowfish algorithm to ensure maximum data security, and data is stored at the block level across multiple blocks.
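The abstract names double encryption with Blowfish but gives no implementation details; below is a minimal sketch, not the paper's actual scheme, of encrypting a record twice with two independent Blowfish keys using PyCryptodome. The key handling, CBC mode, and padding choices here are assumptions for illustration only.

```python
# Minimal sketch of double encryption with Blowfish (PyCryptodome).
# Not the paper's scheme; key handling, mode, and padding are illustrative assumptions.
from Crypto.Cipher import Blowfish
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

BLOCK = Blowfish.block_size  # 8 bytes

def encrypt_once(key: bytes, data: bytes) -> bytes:
    iv = get_random_bytes(BLOCK)
    cipher = Blowfish.new(key, Blowfish.MODE_CBC, iv)
    return iv + cipher.encrypt(pad(data, BLOCK))

def decrypt_once(key: bytes, blob: bytes) -> bytes:
    iv, body = blob[:BLOCK], blob[BLOCK:]
    cipher = Blowfish.new(key, Blowfish.MODE_CBC, iv)
    return unpad(cipher.decrypt(body), BLOCK)

def double_encrypt(key1: bytes, key2: bytes, record: bytes) -> bytes:
    # Encrypt with the first key, then wrap the result with the second key.
    return encrypt_once(key2, encrypt_once(key1, record))

def double_decrypt(key1: bytes, key2: bytes, blob: bytes) -> bytes:
    return decrypt_once(key1, decrypt_once(key2, blob))

if __name__ == "__main__":
    k1, k2 = get_random_bytes(16), get_random_bytes(16)
    ct = double_encrypt(k1, k2, b"patient record: blood pressure 120/80")
    assert double_decrypt(k1, k2, ct) == b"patient record: blood pressure 120/80"
```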


2019 ◽  
Vol 32 (6) ◽  
pp. 1679-1703 ◽  
Author(s):  
Le Wang ◽  
Zao Sun ◽  
Xiaoyong Dai ◽  
Yixin Zhang ◽  
Hai-hua Hu

Purpose: The purpose of this paper is to facilitate understanding of how to mitigate the privacy concerns of users who have experienced privacy invasions. Design/methodology/approach: Drawing on communication privacy management theory, the authors developed a model suggesting that privacy concerns form through a cognitive process involving threat-coping appraisals, institutional privacy assurances and privacy experiences. The model was tested using data from an empirical survey with 913 randomly selected social media users. Findings: Privacy concerns are jointly determined by perceived privacy risks and privacy self-efficacy. The perceived effectiveness of institutional privacy assurances, in terms of established privacy policies and privacy protection technology, influences perceptions of privacy risks and privacy self-efficacy. More specifically, privacy invasion experiences are negatively associated with the perceived effectiveness of institutional privacy assurances. Research limitations/implications: Privacy concerns are conceptualized as general concerns that reflect an individual's worry about the possible loss of private information; the specific types of private information were not differentiated. Originality/value: This paper is among the first to clarify the specific mechanisms through which privacy invasion experiences influence privacy concerns. Privacy concerns have long been viewed as resulting from individual actions. The study contributes to the literature by linking privacy concerns with institutional privacy practice.


Cyber Crime ◽  
2013 ◽  
pp. 534-556
Author(s):  
Amr Ali Eldin

Despite the expected benefits of context-awareness and the need to develop more and more context-aware applications, we contend that privacy represents a major challenge for the success and widespread adoption of these services. This is due to the collection of huge amounts of users' contextual information, which strongly heightens their privacy concerns. Giving users control over the collection of their information is a logical way to let them become more comfortable with these context-aware services. However, this control requires users to make consent decisions under a high degree of uncertainty, owing to the nature of the environment and users' lack of experience with information collectors' privacy policies. Intelligent techniques are therefore required to deal with this uncertainty. In this chapter, the authors propose a consent decision-making mechanism, ShEM, which allows users to exert automatic and manual control over their private information. An enhanced fuzzy logic approach was developed for the automatic decision-making process. The proposed mechanism has been prototyped and integrated into a UMTS location-based services testbed on a university campus, where users experienced the services in real time. A survey of users' responses to the privacy functionality was carried out and analyzed; the responses were positive. The results also showed that a combination of manual and automatic privacy control modes in one approach is more likely to be accepted than a completely automatic or completely manual privacy control.
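The ShEM mechanism itself is not specified in the abstract; the sketch below only illustrates, under assumed membership functions, rules, and thresholds, how a simple fuzzy evaluation of data sensitivity and collector trust could drive an automatic consent decision. All names and numbers here are hypothetical and are not the authors' design.

```python
# Illustrative fuzzy-style consent decision; not the ShEM design from the chapter.
# Membership functions, rules, and the 0.5 threshold are assumptions.

def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def consent_score(sensitivity: float, collector_trust: float) -> float:
    """Both inputs in [0, 1]; returns an aggregated 'share' degree in [0, 1]."""
    low_sens = triangular(sensitivity, -0.5, 0.0, 0.5)
    high_sens = triangular(sensitivity, 0.5, 1.0, 1.5)
    high_trust = triangular(collector_trust, 0.5, 1.0, 1.5)
    # Rule 1: low sensitivity -> share. Rule 2: high sensitivity AND high trust -> share.
    return max(low_sens, min(high_sens, high_trust))

def automatic_consent(sensitivity: float, collector_trust: float) -> bool:
    # Defuzzified decision: share only if the aggregated score clears the threshold.
    return consent_score(sensitivity, collector_trust) >= 0.5

if __name__ == "__main__":
    print(automatic_consent(sensitivity=0.2, collector_trust=0.3))  # True: low sensitivity
    print(automatic_consent(sensitivity=0.9, collector_trust=0.2))  # False: sensitive data, low trust
```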


Sensors ◽  
2018 ◽  
Vol 18 (12) ◽  
pp. 4383 ◽  
Author(s):  
Hongchen Wu ◽  
Mingyang Li ◽  
Huaxiang Zhang

Privacy intrusion has become a major bottleneck for trust-aware social sensing, since online social media and the proliferation of the Internet of Things (IoT) allow anybody to disclose large amounts of personal information. State-of-the-art social sensing still suffers from severe privacy threats because it collects users' personal data and disclosure behaviors, and integrating these data for personalization can raise user privacy concerns. In this paper, we propose a trust-aware model, called the User and Item Similarity Model with Trust in Diverse Kinds (UISTD), to enhance the personalization of social sensing while reducing users' privacy concerns. UISTD utilizes user-to-user and item-to-item similarities to generate multiple kinds of personalized items with common tags. UISTD also applies a modified k-means clustering algorithm to select the core users among trust relationships, and the core users' preferences and disclosure behaviors are regarded as the predicted disclosure pattern. Experimental results on three real-world data sets demonstrate that target users are more likely to: (1) follow the core users' interests in diverse kinds of items and their disclosure behaviors, with the proposed model outperforming the compared methods; and (2) disclose more information with lower intrusion awareness and privacy concern.
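The abstract mentions a modified k-means step for selecting core users but does not detail it; the following is a minimal sketch of the general idea using scikit-learn. The feature construction and the "nearest-to-centroid" selection rule are assumptions for illustration and do not reproduce the UISTD modification.

```python
# Sketch of selecting "core users" via k-means; not the UISTD paper's modified algorithm.
# User feature vectors and the nearest-to-centroid rule are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

def select_core_users(user_features: np.ndarray, n_clusters: int = 5, seed: int = 0) -> list[int]:
    """Cluster users by their preference/disclosure features and return one
    representative per cluster: the user closest to that cluster's centroid."""
    km = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10)
    labels = km.fit_predict(user_features)
    core = []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        dists = np.linalg.norm(user_features[members] - km.cluster_centers_[c], axis=1)
        core.append(int(members[np.argmin(dists)]))
    return core

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    features = rng.random((200, 16))  # e.g., tag-preference and disclosure-rate vectors
    print(select_core_users(features, n_clusters=4))
```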

