Protecting Locations with Differential Privacy under Temporal Correlations

Author(s): Yonghui Xiao, Li Xiong


2021, Vol 2021, pp. 1-12
Author(s): Weiqi Zhang, Guisheng Yin, Yuhai Sha, Jishen Yang

The rapid development of Global Positioning System (GPS) devices and location-based services (LBSs) allows untrusted or unknown LBS providers to collect huge amounts of personal location data, which raises serious privacy concerns. Most existing solutions, however, perturb locations only in static scenes or at a single timestamp and ignore the correlation between location transitions and time as users move, leaving them vulnerable to various inference attacks. Traditional privacy protection methods also rely on trusted third-party service providers, although in practice it is unclear whether a third party can be trusted. In this paper, we propose a systematic solution for protecting location information that provides a rigorous privacy guarantee without assuming the credibility of third parties. The user’s historical trajectory serves as the basis for a hidden Markov model prediction, and the user’s likely future location is output by the model to protect trajectory privacy. To formalize the privacy guarantee, we propose a new definition, the L&A-location region, based on k-anonymity and differential privacy. Building on this definition, we design a novel mechanism that provides a privacy protection guarantee for users’ trajectories. We simulate the proposed mechanism on a real-world dataset, and the results show that the proposed algorithm provides a high level of privacy protection.
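As a rough illustration of the building blocks this abstract describes (not the paper's actual L&A-location-region mechanism), the sketch below fits a first-order Markov transition model over discretized map cells from historical trajectories, predicts the next cell, and perturbs a reported coordinate with planar Laplace noise, the standard geo-indistinguishability mechanism. The cell identifiers, epsilon value, and smoothing scheme are assumptions made for the example.

```python
# Illustrative sketch only: a first-order Markov next-cell predictor over a
# discretized map plus planar-Laplace location perturbation. This is NOT the
# paper's L&A-location-region mechanism; cell ids and epsilon are assumed.
import numpy as np

def fit_transition_matrix(trajectories, n_cells):
    """Estimate P(next cell | current cell) from historical cell sequences."""
    counts = np.ones((n_cells, n_cells))        # add-one smoothing
    for traj in trajectories:
        for a, b in zip(traj[:-1], traj[1:]):
            counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def predict_next_cell(transition, current_cell):
    """Most likely next cell under the fitted Markov model."""
    return int(np.argmax(transition[current_cell]))

def planar_laplace(location, epsilon):
    """Perturb a 2-D point with planar Laplace (geo-indistinguishability) noise."""
    theta = np.random.uniform(0.0, 2.0 * np.pi)
    r = np.random.gamma(shape=2.0, scale=1.0 / epsilon)   # radial marginal
    return location + r * np.array([np.cos(theta), np.sin(theta)])

if __name__ == "__main__":
    histories = [[0, 1, 2, 3], [0, 1, 1, 2], [2, 3, 3, 0]]
    P = fit_transition_matrix(histories, n_cells=4)
    print("predicted next cell:", predict_next_cell(P, current_cell=1))
    print("noisy report:", planar_laplace(np.array([40.75, -73.99]), epsilon=2.0))
```

In the paper's setting the released region would additionally have to satisfy the k-anonymity component of the L&A definition; this sketch only shows the prediction and perturbation steps.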


PLoS ONE, 2021, Vol 16 (8), pp. e0255979
Author(s): Efe Bozkir, Onur Günlü, Wolfgang Fuhl, Rafael F. Schaefer, Enkelejda Kasneci

New-generation head-mounted displays, such as VR and AR glasses, are coming onto the market with integrated eye tracking and are expected to enable novel ways of human-computer interaction in numerous applications. However, since eye movement properties contain biometric information, privacy concerns have to be handled properly. Privacy-preservation techniques such as differential privacy mechanisms have recently been applied to eye movement data obtained from such displays. Standard differential privacy mechanisms, however, are vulnerable to temporal correlations between eye movement observations. In this work, we propose a novel transform-coding-based differential privacy mechanism adapted to the statistics of eye movement feature data and compare various low-complexity methods. We extend the Fourier perturbation algorithm, a differential privacy mechanism, and correct a scaling mistake in its proof. Furthermore, we show significant reductions in both sample correlations and query sensitivities, which yield the best utility-privacy trade-off in the eye tracking literature. Our results provide significantly high privacy without any essential loss in classification accuracy while hiding personal identifiers.
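The Fourier perturbation algorithm mentioned above releases a time series by noising a truncated set of DFT coefficients instead of every sample. A minimal sketch follows, assuming a 1-D real-valued feature sequence; the parameter names (k, epsilon, l2_sensitivity) and the sqrt(k) scale factor follow the commonly cited FPA analysis and are not taken from the paper's corrected derivation.

```python
# Hypothetical sketch of the Fourier perturbation algorithm (FPA) for a
# 1-D time series; parameter names and the noise scale are illustrative.
import numpy as np

def fourier_perturbation(series, k, epsilon, l2_sensitivity):
    """Perturb a real-valued sequence by noising its first k DFT coefficients."""
    n = len(series)
    coeffs = np.fft.rfft(series)               # real-input DFT
    # Laplace scale; the sqrt(k) factor follows the common FPA analysis and the
    # exact constant depends on which sensitivity derivation is used.
    scale = np.sqrt(k) * l2_sensitivity / epsilon
    noise = (np.random.laplace(0.0, scale, k)
             + 1j * np.random.laplace(0.0, scale, k))
    noisy = np.zeros_like(coeffs)
    noisy[:k] = coeffs[:k] + noise             # keep only the k lowest frequencies
    return np.fft.irfft(noisy, n)              # back to the time domain

if __name__ == "__main__":
    # Example: privatize a toy gaze-feature sequence.
    x = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * np.random.randn(256)
    x_priv = fourier_perturbation(x, k=16, epsilon=1.0, l2_sensitivity=1.0)
    print(x_priv[:5])
```

Keeping only the k lowest frequencies reduces the sensitivity of the released query, which is the source of the improved utility-privacy trade-off the abstract refers to.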

