Iterative Constructions and Private Data Release

Author(s):  
Anupam Gupta ◽  
Aaron Roth ◽  
Jonathan Ullman

2019 ◽ Vol 2019 (1) ◽ pp. 26-46
Author(s):  
Thee Chanyaswad ◽  
Changchang Liu ◽  
Prateek Mittal

Abstract: A key challenge facing the design of differential privacy in the non-interactive setting is to maintain the utility of the released data. To overcome this challenge, we utilize the Diaconis-Freedman-Meckes (DFM) effect, which states that most projections of high-dimensional data are nearly Gaussian. Hence, we propose the RON-Gauss model that leverages the novel combination of dimensionality reduction via random orthonormal (RON) projection and the Gaussian generative model for synthesizing differentially private data. We analyze how RON-Gauss benefits from the DFM effect, and present multiple algorithms for a range of machine learning applications, including both unsupervised and supervised learning. Furthermore, we rigorously prove that (a) our algorithms satisfy the strong ɛ-differential privacy guarantee, and (b) RON projection can lower the level of perturbation required for differential privacy. Finally, we illustrate the effectiveness of RON-Gauss under three common machine learning applications – clustering, classification, and regression – on three large real-world datasets. Our empirical results show that (a) RON-Gauss outperforms previous approaches by up to an order of magnitude, and (b) the loss in utility compared to the non-private real data is small. Thus, RON-Gauss can serve as a key enabler for real-world deployment of privacy-preserving data release.
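
To make the two-stage pipeline concrete, the Python sketch below mirrors the structure described in the abstract: a random orthonormal (RON) projection for dimensionality reduction, followed by a Gaussian generative model whose mean and covariance are perturbed before synthetic data are sampled. This is a minimal illustration, not the authors' calibrated algorithm: the even privacy-budget split, the Laplace noise scales, and the assumption that rows are pre-normalized to unit L2 norm are simplifications introduced here.

import numpy as np

def ron_gauss_sketch(X, p, epsilon, rng=None):
    # Illustrative RON-Gauss-style release (unsupervised case), assuming
    # rows of X have unit L2 norm. The budget split and Laplace noise
    # scales below are placeholders, not the paper's sensitivity analysis.
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape

    # Stage 1: random orthonormal (RON) projection. Take p orthonormal
    # directions from the QR factorization of a random Gaussian matrix.
    Q, _ = np.linalg.qr(rng.standard_normal((d, p)))
    Z = X @ Q                              # projected data, shape (n, p)

    # Stage 2: differentially private Gaussian generative model.
    eps_mu, eps_cov = epsilon / 2.0, epsilon / 2.0
    mu = Z.mean(axis=0) + rng.laplace(scale=2.0 / (n * eps_mu), size=p)

    Zc = Z - mu
    cov = (Zc.T @ Zc) / n
    cov = cov + rng.laplace(scale=2.0 / (n * eps_cov), size=(p, p))
    cov = (cov + cov.T) / 2.0              # re-symmetrize after noise
    w, V = np.linalg.eigh(cov)             # project onto the PSD cone
    cov = (V * np.clip(w, 1e-12, None)) @ V.T

    # Sample synthetic data from the Gaussian model and map it back to
    # the original d-dimensional feature space.
    Z_syn = rng.multivariate_normal(mu, cov, size=n)
    return Z_syn @ Q.T

In an evaluation like the one described in the abstract, the synthetic output would then stand in for the real data in the downstream clustering, classification, or regression task.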


2021 ◽ Vol 14 (10) ◽ pp. 1730-1742
Author(s):  
Yingtai Xiao ◽  
Zeyu Ding ◽  
Yuxin Wang ◽  
Danfeng Zhang ◽  
Daniel Kifer

In practice, differentially private data releases are designed to support a variety of applications. A data release is fit for use if it meets the target accuracy requirements of each application. In this paper, we consider the problem of answering linear queries under differential privacy subject to per-query accuracy constraints. Existing practical frameworks like the matrix mechanism do not provide such fine-grained control (they optimize total error, which allows some query answers to be more accurate than necessary at the expense of other queries that are no longer useful). Thus, we design a fitness-for-use strategy that adds privacy-preserving Gaussian noise to query answers. The covariance structure of the noise is optimized to meet the fine-grained accuracy requirements while minimizing the cost to privacy.
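
As a rough sketch of the mechanism's shape (in Python, with placeholder names), the snippet below adds multivariate Gaussian noise with covariance Sigma to a vector of linear query answers and checks the per-query variance targets against Sigma's diagonal. The diagonal choice of Sigma at the end is only a stand-in: the paper's contribution is precisely the optimization that searches over covariance structures meeting these per-query constraints at minimum privacy cost, which this sketch does not attempt to reproduce.

import numpy as np

def correlated_gaussian_release(W, x, Sigma, rng=None):
    # Answer the linear queries W @ x with additive Gaussian noise whose
    # covariance is Sigma. Choosing Sigma well is the hard part; here it
    # is simply taken as given.
    rng = np.random.default_rng() if rng is None else rng
    answers = W @ x
    noise = rng.multivariate_normal(np.zeros(W.shape[0]), Sigma)
    return answers + noise

def meets_per_query_targets(Sigma, variance_targets):
    # Per-query accuracy constraint: the variance of each noisy answer
    # (the diagonal of Sigma) must not exceed its target.
    return bool(np.all(np.diag(Sigma) <= variance_targets))

# Toy workload: two counting queries over a 3-bin histogram, with the
# second query required to be twice as accurate as the first.
W = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
x = np.array([10.0, 20.0, 30.0])
variance_targets = np.array([4.0, 2.0])

# Placeholder covariance: independent noise scaled exactly to the
# targets. The fitness-for-use approach would instead optimize a full
# covariance matrix to meet these targets at the lowest privacy cost.
Sigma = np.diag(variance_targets)
assert meets_per_query_targets(Sigma, variance_targets)
print(correlated_gaussian_release(W, x, Sigma))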


2013 ◽ Vol 94 (3) ◽ pp. 401-437
Author(s):  
Amos Beimel ◽  
Hai Brenner ◽  
Shiva Prasad Kasiviswanathan ◽  
Kobbi Nissim

2014 ◽ Vol 11 (1) ◽ pp. 59-71
Author(s):  
Noman Mohammed ◽  
Dima Alhadidi ◽  
Benjamin C.M. Fung ◽  
Mourad Debbabi
