Privacy-Preserving Trajectory Data Publishing via Differential Privacy

2017 ◽  
Author(s):  
Ishita Dwivedi

2017 ◽  
Vol 26 (2) ◽  
pp. 285-291 ◽  
Author(s):  
Qiwei Lu ◽  
Caimei Wang ◽  
Yan Xiong ◽  
Huihua Xia ◽  
Wenchao Huang ◽  
...  

2019 ◽  
Vol 90 ◽  
pp. 158-174 ◽  
Author(s):  
Chunhui Piao ◽  
Yajuan Shi ◽  
Jiaqi Yan ◽  
Changyou Zhang ◽  
Liping Liu

2021 ◽  
Author(s):  
Wenqing Cheng ◽  
Ruxue Wen ◽  
Haojun Huang ◽  
Wang Miao ◽  
Chen Wang

2019 ◽  
Vol 76 (7) ◽  
pp. 5276-5300 ◽  
Author(s):  
Songyuan Li ◽  
Hong Shen ◽  
Yingpeng Sang ◽  
Hui Tian

2017 ◽  
Vol 400-401 ◽  
pp. 1-13 ◽  
Author(s):  
Meng Li ◽  
Liehuang Zhu ◽  
Zijian Zhang ◽  
Rixin Xu

2020 ◽  
Author(s):  
Fatima Zahra Errounda ◽  
Yan Liu

Abstract Location and trajectory data are routinely collected to generate valuable knowledge about users' behavior patterns. However, releasing location data may jeopardize the privacy of the involved individuals. Differential privacy is a powerful technique that prevents an adversary from inferring the presence or absence of an individual in the original data solely based on the observed data. The first challenge in applying differential privacy to location data is that it usually involves a single user. This shifts the adversary's target to the user's locations instead of presence or absence in the original data. The second challenge is that the inherent correlation between location data points, due to people's movement regularity and predictability, gives the adversary an advantage in inferring information about individuals. In this paper, we review the differentially private approaches that tackle these challenges. Our goal is to help newcomers to the field better understand the state of the art by providing a research map that highlights the different challenges in designing differentially private frameworks suited to the characteristics of location data. We find that in protecting an individual's location privacy, the attention of differential privacy mechanisms shifts to preventing the adversary from inferring the original location based on the observed one. Moreover, we find that privacy-preserving mechanisms exploit the predictability and regularity of users' movements to design mechanisms that protect users' privacy in trajectory data. Finally, we explore how well the presented frameworks succeed in protecting users' locations and trajectories against well-known privacy attacks.
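The noise-adding perturbation the abstract alludes to can be sketched with a generic one-dimensional Laplace mechanism applied independently to each coordinate. This is a minimal illustration, not any specific framework from the surveyed papers; `perturb_location` and its parameters are assumptions for the sketch, and real location-privacy mechanisms (e.g. geo-indistinguishability) use a planar Laplace distribution instead.

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sampling of a zero-mean Laplace variate with the given scale.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(max(1e-12, 1.0 - 2.0 * abs(u)))

def perturb_location(lat, lon, epsilon, sensitivity=1.0):
    # Add independent Laplace noise to each coordinate; a smaller privacy
    # budget epsilon means a larger noise scale, i.e. stronger privacy.
    scale = sensitivity / epsilon
    return lat + laplace_noise(scale), lon + laplace_noise(scale)
```

The observed (perturbed) location is what gets released, so the adversary can only try to infer the original location from it, which matches the shift in adversarial target described above.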


2021 ◽  
Vol 10 (2) ◽  
pp. 78
Author(s):  
Songyuan Li ◽  
Hui Tian ◽  
Hong Shen ◽  
Yingpeng Sang

Publication of trajectory data, which contain rich information about vehicles in the dimensions of time and space (location), enables online monitoring and supervision of vehicles in motion as well as offline traffic analysis for various management tasks. However, it also opens security holes for privacy breaches, as exposing individuals' private information to the public may result in attacks threatening their safety. Increasing attention has therefore recently been paid to privacy protection in trajectory data publishing. However, existing methods, such as generalization via anonymization and suppression via randomization, achieve protection by modifying the original trajectory to form a publishable trajectory, which results in significant data distortion and hence low data utility. In this work, we propose a trajectory privacy-preserving method called dynamic anonymization with bounded distortion. In our method, individual trajectories in the original trajectory set are mixed in a localized manner to form a synthetic trajectory data set with a bounded distortion for publishing, which protects the privacy of location information associated with individuals in the trajectory data set and ensures a guaranteed utility of the published data both individually and collectively. Through experiments conducted on real trajectory data from Guangzhou City taxi statistics, we evaluate the performance of our proposed method and compare it with existing mainstream methods in terms of privacy preservation against attacks and trajectory data utility. The results show that our proposed method achieves better data utility than existing methods using globally static anonymization, without trading off data security against attacks.
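The localized mixing with bounded distortion can be illustrated with a toy rule: swap aligned points between two trajectories only when the swap's displacement stays within a distortion bound, so every published point lies within that bound of some original point. This is an illustrative simplification; `mix_trajectories` and its per-point swap rule are assumptions for the sketch, not the authors' actual algorithm.

```python
import math

def mix_trajectories(t1, t2, bound):
    # Swap aligned (same-index) points between two trajectories whenever
    # the swap keeps per-point distortion within `bound`; otherwise keep
    # the original points. Points are (x, y) tuples.
    out1, out2 = [], []
    for p, q in zip(t1, t2):
        if math.dist(p, q) <= bound:
            out1.append(q)
            out2.append(p)
        else:
            out1.append(p)
            out2.append(q)
    return out1, out2
```

Because points are only exchanged between trajectories, aggregate statistics over the whole data set (e.g. point counts per region) are preserved, while each individual trajectory is altered within the distortion bound.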

