Practical Differential Privacy for Location Data Aggregation using a Hadamard Matrix

Author(s): Patinya Sangiamchit, Jittat Fakcharoenphol
Sensors, 2016, Vol. 16 (9), pp. 1463

Author(s): Hao Ren, Hongwei Li, Xiaohui Liang, Shibo He, Yuanshun Dai, ...
IEEE Access, 2019, Vol. 7, pp. 164962-164974

Author(s): Yan Yan, Lianxiu Zhang, Quan Z. Sheng, Bingqian Wang, Xin Gao, ...
2019, Vol. 17 (4), pp. 450-460

Author(s): Hai Liu, Zhenqiang Wu, Changgen Peng, Feng Tian, Laifeng Lu

In the presence of an untrusted server, differential privacy and local differential privacy have been used to preserve privacy in data aggregation. Our analysis shows that neither differential privacy nor local differential privacy can achieve a Nash equilibrium between privacy and utility for mobile-service-based multiuser collaboration, in which multiple users collaboratively negotiate a desired privacy budget for privacy preservation. To this end, we propose a Privacy-Preserving Data Aggregation Framework (PPDAF) that reaches a Nash equilibrium between privacy and utility. First, we present an adaptive Gaussian mechanism that satisfies this equilibrium by multiplying an expected utility factor with conditional filtering noise under an expected privacy budget. Second, we construct PPDAF using the adaptive Gaussian mechanism, based on negotiating the privacy budget with heuristic obfuscation. Finally, our theoretical analysis and experimental evaluation show that PPDAF achieves a Nash equilibrium between privacy and utility. Furthermore, the framework can be extended to engineering instances in a data aggregation setting.
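
The abstract does not spell out the adaptive Gaussian mechanism, so the following Python sketch only illustrates the standard Gaussian mechanism for releasing a sum aggregate under a negotiated privacy budget; the sensitivity bound and the epsilon and delta values are assumptions chosen for illustration, not the paper's parameters.

import numpy as np

def gaussian_mechanism(true_sum, sensitivity, epsilon, delta):
    """Release a noisy aggregate under (epsilon, delta)-differential privacy.

    Uses the classical calibration sigma = sqrt(2 * ln(1.25/delta)) * sensitivity / epsilon,
    valid for epsilon in (0, 1).
    """
    sigma = np.sqrt(2 * np.log(1.25 / delta)) * sensitivity / epsilon
    return true_sum + np.random.normal(0.0, sigma)

# Hypothetical multiuser setting: each user contributes a value bounded in [0, 1],
# so the sensitivity of the sum is 1. The shared epsilon stands in for the budget
# the users are assumed to have negotiated; the negotiation protocol itself and the
# expected utility factor of the adaptive mechanism are not reproduced here.
values = np.random.rand(100)                 # simulated user contributions
negotiated_epsilon, delta = 0.9, 1e-5
noisy_sum = gaussian_mechanism(values.sum(), sensitivity=1.0,
                               epsilon=negotiated_epsilon, delta=delta)
print(f"true sum = {values.sum():.2f}, noisy sum = {noisy_sum:.2f}")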


2020
Author(s): Fatima Zahra Errounda, Yan Liu

Abstract: Location and trajectory data are routinely collected to generate valuable knowledge about users' behavior patterns. However, releasing location data may jeopardize the privacy of the involved individuals. Differential privacy is a powerful technique that prevents an adversary from inferring the presence or absence of an individual in the original data solely from the observed data. The first challenge in applying differential privacy to location data is that it usually involves a single user, which shifts the adversary's target to the user's locations rather than to their presence or absence in the original data. The second challenge is that the inherent correlation between location data points, due to the regularity and predictability of people's movements, gives the adversary an advantage in inferring information about individuals. In this paper, we review the differentially private approaches that tackle these challenges. Our goal is to help newcomers to the field better understand the state of the art by providing a research map that highlights the different challenges in designing differentially private frameworks suited to the characteristics of location data. We find that, in protecting an individual's location privacy, the attention of differential privacy mechanisms shifts to preventing the adversary from inferring the original location from the observed one. Moreover, we find that privacy-preserving mechanisms exploit the predictability and regularity of users' movements to design and protect users' privacy in trajectory data. Finally, we explore how well the presented frameworks succeed in protecting users' locations and trajectories against well-known privacy attacks.
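
The review surveys mechanisms rather than prescribing one; as a concrete illustration of perturbing a reported location so that the original cannot be inferred from the observed one, the Python sketch below implements the well-known planar Laplace mechanism from the geo-indistinguishability literature (Andrés et al., 2013). The Cartesian coordinates and the epsilon value are illustrative assumptions, not parameters from the surveyed frameworks.

import numpy as np
from scipy.special import lambertw

def planar_laplace(x, y, epsilon):
    """Perturb a 2-D location (coordinates in metres) with the planar Laplace
    mechanism: a uniformly random direction and a radius drawn via the inverse
    CDF r = -(1/epsilon) * (W_{-1}((p - 1)/e) + 1), where W_{-1} is the lower
    branch of the Lambert W function."""
    theta = np.random.uniform(0, 2 * np.pi)      # uniform direction
    p = np.random.uniform(0, 1)                  # uniform quantile for the radius
    r = -(lambertw((p - 1) / np.e, k=-1).real + 1) / epsilon
    return x + r * np.cos(theta), y + r * np.sin(theta)

# Example: report a location with epsilon = 0.01 per metre, i.e. roughly
# epsilon * d indistinguishability for points within distance d.
noisy_x, noisy_y = planar_laplace(1200.0, 850.0, epsilon=0.01)

Smaller epsilon spreads the noisy location further from the true one, trading utility for a larger protection radius, which mirrors the privacy-utility tension the review discusses.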

