Correlated Differential Privacy Protection for Big Data

Author(s):  
Denglong Lv ◽  
Shibing Zhu


2020 ◽
Author(s):  
Huanhuan Wang ◽  
Xiang Wu ◽  
Yongqi Tan ◽  
Hongsheng Yin ◽  
Xiaochun Cheng ◽  
...  

BACKGROUND: Medical data mining and sharing is an important process for realizing the value of medical big data in e-health applications. However, medical data contain a large amount of patients' personal private information, so there is a risk of privacy disclosure during sharing and mining. How to ensure the security of medical big data during publishing, sharing, and mining has therefore become a focus of current research.

OBJECTIVE: The objective of our study is to design a framework based on a differential privacy protection mechanism that ensures the secure sharing of medical data. We developed a privacy protection query language (PQL) that can integrate multiple mining methods and provide secure sharing functions for medical data.

METHODS: This paper adopts a modular design with three sub-modules: a parsing module, a mining module, and a noising module. Each module encapsulates different computing devices, such as a composite parser and a noise jammer. In the PQL framework, we apply the differential privacy mechanism to the results of the modules' collaborative computation to strengthen the security of the various mining algorithms. The computing devices operate independently, but the mining results depend on their cooperation.

RESULTS: We designed and developed a query language framework that provides medical data mining, sharing, and privacy-preserving functions, and we provided theoretical proofs of the PQL framework's performance. The experimental results showed that the PQL framework ensures the security of each mining result and that the average usefulness of the output results is above 97%.

CONCLUSIONS: We presented a security framework that enables medical data providers to securely share health or treatment data, and we developed a usable query language based on the differential privacy mechanism that enables researchers to securely mine potential information using data mining algorithms.
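The abstract does not describe the internals of the noising module, but a common way to realize such a step is the Laplace mechanism, in which a numeric mining result is perturbed with noise calibrated to its sensitivity before release. The sketch below is a minimal illustration under that assumption; the function and parameter names are not taken from the paper.

```python
# Minimal sketch of a Laplace-mechanism noising step, assuming the "noising
# module" perturbs numeric mining results before they are released.
# Names and values are illustrative, not from the PQL paper.
import numpy as np

def laplace_mechanism(true_result: float, sensitivity: float, epsilon: float) -> float:
    """Release an epsilon-differentially-private version of a numeric result."""
    scale = sensitivity / epsilon              # Laplace scale b = sensitivity / epsilon
    return true_result + np.random.laplace(loc=0.0, scale=scale)

# Example: a count query over patient records has sensitivity 1
# (adding or removing one patient changes the count by at most 1).
noisy_count = laplace_mechanism(true_result=1200.0, sensitivity=1.0, epsilon=0.5)
print(noisy_count)
```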


2016 ◽  
Vol 71 (9-10) ◽  
pp. 465-475 ◽  
Author(s):  
Chi Lin ◽  
Pengyu Wang ◽  
Houbing Song ◽  
Yanhong Zhou ◽  
Qing Liu ◽  
...  

2022 ◽  
Vol 2022 ◽  
pp. 1-9
Author(s):  
Jiawen Du ◽  
Yong Pi

With the advent of the big data era, people's lives have changed profoundly: cumbersome traditional data collection is no longer necessary, because information can be collected and organized directly from people's footprints on social networks. This paper explores and analyzes the privacy issues in current social networks and proposes strategies for protecting users' private data based on data mining algorithms, so that users' privacy in social networks is not illegally infringed in the era of big data. The data mining algorithm proposed in this paper protects users' identities from being re-identified and their private information from being leaked. Applying differential privacy protection methods in social networks can effectively protect users' private information during data publishing and data mining. It is therefore of great significance to study data publishing and data mining methods based on differential privacy protection, and their application in social networks.
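As a concrete illustration of differentially private publishing in this setting, the sketch below perturbs a histogram of a user attribute with Laplace noise before release; removing one user changes one bin by at most 1, so a sensitivity of 1 is assumed. This is a generic example, not the specific algorithm proposed in the paper, and the names and counts are invented for illustration.

```python
# Hedged sketch: differentially private publication of an attribute histogram
# derived from social-network data. Generic Laplace-noise example only.
import numpy as np

def private_histogram(counts, epsilon):
    """Add Laplace(1/epsilon) noise per bin; one user affects one bin by at most 1."""
    noisy = [c + np.random.laplace(0.0, 1.0 / epsilon) for c in counts]
    # Clip negatives and round so the published histogram looks like real counts.
    return [max(0, round(c)) for c in noisy]

# Example: counts of users per age bucket before publishing.
age_bucket_counts = [1520, 2310, 1875, 940, 410]
print(private_histogram(age_bucket_counts, epsilon=1.0))
```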


Author(s):  
Adam Gowri Shankar

Abstract: Body Area Networks (BANs) collect enormous amounts of data through wearable sensors, and these data contain sensitive information such as physical condition and location that needs protection. Preserving privacy in big data has emerged as an absolute prerequisite for exchanging private data for analysis, validation, and publishing. Traditional methods such as k-anonymity and other anonymization techniques have overlooked key privacy protection issues, resulting in privacy infringement. In this work, a differential privacy protection scheme for big data in body area networks is developed. Compared with previous methods, the proposed privacy protection scheme performs better in terms of availability and reliability. Experimental results demonstrate that, even when the attacker has full background knowledge, the proposed scheme still injects enough interference into the sensitive big data to preserve privacy. Keywords: BANs, Privacy, Differential Privacy, Noisy response
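The "Noisy response" keyword suggests that individual sensor answers are perturbed before collection. A classic instance of this idea is binary randomized response, sketched below under the assumption that each wearer holds a sensitive yes/no attribute; the scheme and names here are a standard textbook construction, not necessarily the one used in the paper.

```python
# Hedged sketch of binary randomized response ("noisy response"): each user
# reports the true bit with probability p = e^eps / (1 + e^eps) and flips it
# otherwise, which satisfies eps-local differential privacy. Illustrative only.
import math
import random

def randomized_response(true_bit: int, eps: float) -> int:
    p = math.exp(eps) / (1.0 + math.exp(eps))   # probability of telling the truth
    return true_bit if random.random() < p else 1 - true_bit

def estimate_fraction(reports, eps):
    """Unbiased estimate of the true fraction of 1s from the noisy reports."""
    p = math.exp(eps) / (1.0 + math.exp(eps))
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)

# Example: 10,000 wearers, 30% of whom have the sensitive condition.
true_bits = [1 if random.random() < 0.3 else 0 for _ in range(10000)]
reports = [randomized_response(b, eps=1.0) for b in true_bits]
print(estimate_fraction(reports, eps=1.0))      # close to 0.3 on average
```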


2019 ◽  
Vol 11 (1) ◽  
pp. 168781401882239 ◽  
Author(s):  
Zhimin Li ◽  
Haoze Lv ◽  
Zhaobin Liu

With the development of the Internet of Things, many applications need to use people's location information, producing large amounts of data, so-called big data, that must be processed. In recent years, many methods have been proposed to protect privacy in location-based services. However, existing techniques perform poorly in the big data setting. For instance, sensor devices such as smartphones with location-recording functions may submit location information anytime and anywhere, which can lead to privacy disclosure, and attackers can leverage these huge data sets to extract useful information. In this article, we propose the noise-added selection algorithm, a location privacy protection method that satisfies differential privacy and prevents privacy disclosure even to attackers with arbitrary background knowledge. With the Internet of Things in view, we maximize the availability of both the data and the algorithm while protecting the information. In detail, we filter the real-time location distribution information, use our selection mechanism for comparison and analysis to determine the privacy-protected regions, and then apply differential privacy to them. As shown by the theoretical analysis and the experimental results, the proposed method achieves significant improvements in security and privacy and strikes a good balance between the privacy protection level and data availability.
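The abstract does not spell out the selection mechanism, but the final differential-privacy step on the selected regions can be illustrated with per-region Laplace noise on visit counts, as sketched below. The region names, the sensitivity assumption (one location record changes one region count by at most 1), and the parameter values are illustrative, not from the article.

```python
# Hedged sketch of the final noising step on privacy-protected regions:
# Laplace noise is added to each selected region's visit count.
import numpy as np

def noise_region_counts(region_counts: dict, epsilon: float) -> dict:
    scale = 1.0 / epsilon                       # assumed sensitivity 1 per region count
    return {region: count + np.random.laplace(0.0, scale)
            for region, count in region_counts.items()}

# Example: counts of location reports per candidate region.
selected_regions = {"region_A": 532, "region_B": 1210, "region_C": 87}
print(noise_region_counts(selected_regions, epsilon=0.8))
```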


2020 ◽  
Vol 2020 ◽  
pp. 1-29 ◽  
Author(s):  
Xingxing Xiong ◽  
Shubo Liu ◽  
Dan Li ◽  
Zhaohui Cai ◽  
Xiaoguang Niu

With the advent of the era of big data, privacy issues have become a hot topic in public discussion. Local differential privacy (LDP) is a state-of-the-art privacy preservation technique that allows big data analysis (e.g., statistical estimation, statistical learning, and data mining) to be performed while guaranteeing each individual participant's privacy. In this paper, we present a comprehensive survey of LDP. We first give an overview of the fundamental knowledge of LDP and its frameworks. We then introduce the mainstream privatization mechanisms and methods in detail from the perspective of frequency oracles, and give insights into recent studies on private basic statistical estimation (e.g., frequency estimation and mean estimation) and complex statistical estimation (e.g., multivariate distribution estimation and private estimation over complex data) under LDP. Furthermore, we survey the current state of LDP research, including private statistical learning and inference, private statistical data analysis, privacy amplification techniques for LDP, and application fields of LDP. Finally, we identify future research directions and open challenges for LDP. This survey can serve as a reference for LDP research and for handling the various privacy-related scenarios encountered in practice.
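To make the frequency-oracle perspective concrete, the sketch below implements generalized (k-ary) randomized response, one of the mainstream LDP frequency oracles such a survey covers: each user reports their true value with probability e^eps/(e^eps + k - 1) and a uniformly random other value otherwise, and the aggregator de-biases the observed counts. The variable names and example data are illustrative assumptions.

```python
# Sketch of the k-ary (generalized) randomized response frequency oracle under
# eps-LDP: users perturb their value locally, the aggregator de-biases the
# observed counts to estimate true frequencies. Illustrative names and values.
import math
import random
from collections import Counter

def krr_report(value, domain, eps):
    k = len(domain)
    p = math.exp(eps) / (math.exp(eps) + k - 1)   # prob. of reporting the true value
    if random.random() < p:
        return value
    return random.choice([v for v in domain if v != value])

def krr_estimate(reports, domain, eps):
    """Unbiased frequency estimates recovered from the perturbed reports."""
    n, k = len(reports), len(domain)
    p = math.exp(eps) / (math.exp(eps) + k - 1)
    q = 1.0 / (math.exp(eps) + k - 1)             # prob. of reporting any specific other value
    counts = Counter(reports)
    return {v: (counts[v] - n * q) / (p - q) for v in domain}

# Example: 10,000 users, each holding one of four categories.
domain = ["A", "B", "C", "D"]
true_values = random.choices(domain, weights=[0.4, 0.3, 0.2, 0.1], k=10000)
reports = [krr_report(v, domain, eps=1.0) for v in true_values]
print(krr_estimate(reports, domain, eps=1.0))     # roughly 4000, 3000, 2000, 1000
```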

