DE-IDENTIFICATION TECHNIQUE FOR IOT WIRELESS SENSOR NETWORK PRIVACY PROTECTION

2017 ◽  
Vol 10 (1) ◽  
pp. 1
Author(s):  
Yennun Huang ◽  
Szu-Chuang Li ◽  
Bo-Chen Tai ◽  
Chieh-Ming Chang ◽  
Dmitrii I. Kaplun ◽  
...  

As the IoT ecosystem becomes more and more mature, hardware and software vendors are trying to create new value by connecting all kinds of devices together via IoT. IoT devices are usually equipped with sensors to collect data, and the data collected are transmitted over the air via different kinds of wireless connection. To extract the value of the data collected, the data owner may choose to seek third-party help with data analysis, or even release the data to the public for more insight. In this scenario it is important to protect the released data from privacy leakage. Here we propose that differential privacy, as a de-identification technique, can be a useful approach to add privacy protection to the data released, as well as to prevent the collected data from being intercepted and decoded during over-the-air transmission. A way to increase the accuracy of count queries performed on edge cases in a synthetic database is also presented in this research.
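The abstract does not give the mechanism's details, but the standard way to answer count queries under differential privacy is the Laplace mechanism. The sketch below is only an illustration of that idea, with epsilon, the predicate, and the sample sensor readings chosen arbitrarily.

```python
# Minimal sketch (not the authors' implementation): releasing a differentially
# private count query over sensor readings with the standard Laplace mechanism.
import numpy as np

def dp_count(values, predicate, epsilon=1.0, rng=None):
    """Return a noisy count of readings matching `predicate`.

    A count query has sensitivity 1, so Laplace noise with scale 1/epsilon
    yields epsilon-differential privacy.
    """
    rng = rng or np.random.default_rng()
    true_count = sum(1 for v in values if predicate(v))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Example: count temperature readings above 30 degrees from an IoT sensor batch.
readings = [28.4, 31.2, 29.9, 33.5, 30.1]
print(dp_count(readings, lambda t: t > 30.0, epsilon=0.5))
```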

2021 ◽  
Vol 2 (4) ◽  
pp. 1-23
Author(s):  
Ahmed Aleroud ◽  
Fan Yang ◽  
Sai Chaithanya Pallaprolu ◽  
Zhiyuan Chen ◽  
George Karabatis

Network traces are considered a primary source of information to researchers, who use them to investigate research problems such as identifying user behavior, analyzing network hierarchy, maintaining network security, classifying packet flows, and much more. However, most organizations are reluctant to share their data with a third party or the public due to privacy concerns. Therefore, data anonymization prior to sharing becomes a convenient solution to both organizations and researchers. Although several anonymization algorithms are available, few of them allow sufficient privacy (organization need), acceptable data utility (researcher need), and efficient data analysis at the same time. This article introduces a condensation-based differential privacy anonymization approach that achieves an improved tradeoff between privacy and utility compared to existing techniques and produces anonymized network trace data that can be shared publicly without lowering its utility value. Our solution also does not incur extra computation overhead for the data analyzer. A prototype system has been implemented, and experiments have shown that the proposed approach preserves privacy and allows data analysis without revealing the original data even when injection attacks are launched against it. When anonymized datasets are given as input to graph-based intrusion detection techniques, they yield almost identical intrusion detection rates as the original datasets with only a negligible impact.
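As a rough illustration of the condensation idea (group records, then release per-group statistics rather than raw values), the sketch below clusters a single numeric flow attribute into fixed-size groups and perturbs each group mean with Laplace noise. The group size, noise calibration, and attribute are assumptions, not the paper's parameters.

```python
# Illustrative sketch only: a condensation-style anonymizer for one numeric
# attribute of a network trace (e.g. packet size).
import numpy as np

def condense(records, group_size=5, epsilon=1.0, rng=None):
    rng = rng or np.random.default_rng()
    records = np.sort(np.asarray(records, dtype=float))  # keep similar flows together
    released = []
    for start in range(0, len(records), group_size):
        group = records[start:start + group_size]
        # Release the group mean perturbed with Laplace noise; a real mechanism
        # would calibrate the scale to the sensitivity of a bounded mean.
        noisy_mean = group.mean() + rng.laplace(scale=1.0 / (epsilon * len(group)))
        released.extend([noisy_mean] * len(group))
    return released

# Example: anonymize packet sizes from a small trace.
print(condense([40, 60, 1500, 1480, 52, 64, 1400, 48, 1500, 60]))
```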


2016 ◽  
Vol 2016 ◽  
pp. 1-11
Author(s):  
Nan Feng ◽  
Zhiqi Hao ◽  
Sibo Yang ◽  
Harris Wu

With the pervasive use of wireless sensor networks (WSNs) within commercial environments, business privacy leakage due to the exposure of sensitive information transmitted in a WSN has become a major issue for enterprises. We examine business privacy protection in the application of WSNs. We propose a business privacy-protection system (BPS) that is modeled as a hierarchical profile in order to filter sensitive information with respect to enterprise-specified privacy requirements. The BPS aims at solving a tradeoff between metrics that are defined to estimate the utility of information and the business privacy risk. We design profile, risk assessment, and filtration agents to implement the BPS based on multiagent technology. The effectiveness of our proposed BPS is validated by experiments.
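The BPS itself is described only at the architecture level, so the following is a loose, hypothetical sketch of what a filtration agent's decision could look like: score a message's business-privacy risk against a utility estimate and suppress it when the risk is too high. The term list, scoring functions, and threshold are invented placeholders, not the paper's metrics.

```python
# Hedged sketch of a filtration step weighing utility against business privacy risk.
from typing import Optional

SENSITIVE_TERMS = {"contract", "bid", "price"}   # assumed enterprise-specified terms

def risk_score(message: str) -> float:
    """Fraction of tokens matching enterprise-specified sensitive terms."""
    tokens = message.lower().split()
    return sum(1.0 for t in tokens if t in SENSITIVE_TERMS) / max(len(tokens), 1)

def utility_score(message: str) -> float:
    """Placeholder utility metric: longer messages assumed more informative."""
    return min(len(message) / 100.0, 1.0)

def filtration_agent(message: str, max_risk: float = 0.2) -> Optional[str]:
    """Forward a message only if its estimated risk does not outweigh its utility."""
    if risk_score(message) <= max_risk and utility_score(message) > risk_score(message):
        return message
    return None   # suppress the message: privacy risk too high

print(filtration_agent("sensor 12 reports temperature 23.5C humidity 40%"))
print(filtration_agent("sensor 12 forwards bid price for contract A"))
```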


2021 ◽  
Vol 17 (2) ◽  
pp. 155014772199340
Author(s):  
Xiaohui Li ◽  
Yuliang Bai ◽  
Yajun Wang ◽  
Bo Li

Suppressing the trajectory data to be released can effectively reduce the risk of user privacy leakage. However, globally suppressing the data set to meet traditional privacy models reduces the availability of the trajectory data. Therefore, we propose a trajectory data differential privacy protection algorithm based on local suppression, Trajectory Privacy protection based on Local Suppression (TPLS), to provide the user with the ability and flexibility to protect data through local suppression. The main contributions of this article are as follows: (1) introducing a privacy protection method for trajectory data release, (2) performing effective local suppression judgment on the points in the minimum violation sequences of the trajectory data set, and (3) proposing a differential privacy protection algorithm based on local suppression. In the algorithm, we reduce the maximal frequent sequence (MFS) loss rate in the trajectory data set through effective local suppression judgment and by updating the minimum violation sequence set, and then establish a classification tree and add noise to its leaf nodes to improve the security of the data to be published. Simulation results show that the proposed algorithm is effective: it can reduce the data loss rate and improve data availability while reducing the risk of user privacy leakage.
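A minimal sketch of the two ingredients named in the abstract: suppressing a point only where it falls below a support threshold (standing in for the minimal-violation-sequence test, which is not reproduced here) and adding Laplace noise to the released counts. The threshold k, epsilon, and toy trajectories are assumptions.

```python
# Illustrative local suppression plus Laplace noise on released location counts.
import numpy as np
from collections import Counter

def local_suppress(trajectories, k=2):
    """Drop a location from a trajectory only if it appears in fewer than k trajectories."""
    support = Counter(loc for traj in trajectories for loc in set(traj))
    return [[loc for loc in traj if support[loc] >= k] for traj in trajectories]

def noisy_location_counts(trajectories, epsilon=1.0, rng=None):
    rng = rng or np.random.default_rng()
    counts = Counter(loc for traj in trajectories for loc in traj)
    return {loc: c + rng.laplace(scale=1.0 / epsilon) for loc, c in counts.items()}

trajs = [["A", "B", "C"], ["A", "C", "D"], ["B", "C", "E"]]
print(noisy_location_counts(local_suppress(trajs, k=2)))
```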


2020 ◽  
Vol 17 (4) ◽  
pp. 539-547
Author(s):  
Jun Hong ◽  
Tao Wen ◽  
Quan Guo

Outsourcing a spatial database to a third party is becoming common practice for more and more individuals and companies that want to save the cost of managing and maintaining the database: a data owner delegates its spatial data management tasks to a third party and authorizes it to provide query services. However, the third party is not fully trusted. Thus, authentication information should be provided to the client for query authentication. In this paper, we introduce an efficient spatial authenticated data structure, called the Verifiable Similarity Indexing tree (VSS-tree), to support authenticated spatial queries. We build the VSS-tree on the SS-tree, which employs bounding spheres rather than bounding rectangles for region shapes, and extend it with authentication information. Based on the VSS-tree, the third party finds query results and builds their corresponding verification object. The client performs query authentication using the verification object and the published public key. Finally, we evaluate the performance and validity of our algorithms; the experimental results show that the VSS-tree can efficiently support spatial queries and has better performance than the Merkle R-tree (MR-tree).
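The VSS-tree construction is not reproduced here; the sketch below only illustrates the general Merkle-style verification idea that MR-trees (and, per the abstract, the VSS-tree) build on: the owner publishes a root digest, the server returns results plus sibling digests as the verification object, and the client recomputes the root.

```python
# Generic Merkle-root verification sketch, not the VSS-tree itself.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_root(leaves):
    """Compute the Merkle root over a list of byte-string records."""
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:                      # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

records = [b"point(1,2)", b"point(3,4)", b"point(5,6)", b"point(7,8)"]
root = build_root(records)                      # published by the data owner
# Client-side check: the verification object for records[0] consists of the
# sibling digests needed to rebuild the root.
vo = [h(records[1]), h(h(records[2]) + h(records[3]))]
recomputed = h(h(h(records[0]) + vo[0]) + vo[1])
print(recomputed == root)                       # True => result is authentic
```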


2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Weiqi Zhang ◽  
Guisheng Yin ◽  
Yuhai Sha ◽  
Jishen Yang

The rapid development of Global Positioning System (GPS) devices and location-based services (LBSs) facilitates the collection of huge amounts of personal information by untrusted or unknown LBS providers. This phenomenon raises serious privacy concerns. However, most existing solutions aim at obfuscating locations in static scenes or at a single timestamp, without considering the correlation between location transfers and the times of moving users. As a result, such solutions are vulnerable to various inference attacks. Traditional privacy protection methods rely on trusted third-party service providers, but in reality we cannot be sure whether the third party is trustworthy. In this paper, we propose a systematic solution to preserve location information. The protection provides a rigorous privacy guarantee without assuming the credibility of third parties. The user's historical trajectory information is used as the basis of a hidden Markov model prediction, and the user's likely prospective location is used as the model output to protect the user's trajectory privacy. To formalize the privacy-protection guarantee, we propose a new definition, the L&A-location region, based on k-anonymity and differential privacy. Based on the proposed privacy definition, we design a novel mechanism to provide a privacy protection guarantee for the user's identity trajectory. We simulate the proposed mechanism on a dataset collected in real practice. The simulation results show that the proposed algorithm can provide privacy protection to a high standard.
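The paper's hidden Markov model is not reproduced here; the sketch below uses a plain first-order transition-count model only to illustrate the idea of releasing a predicted next location in place of the true one. The toy trajectory is invented.

```python
# Hedged sketch: predict the next location from historical transitions and
# release the prediction instead of the true position.
from collections import Counter, defaultdict

def fit_transitions(history):
    """Count location-to-location transitions in the historical trajectory."""
    transitions = defaultdict(Counter)
    for prev, nxt in zip(history, history[1:]):
        transitions[prev][nxt] += 1
    return transitions

def predict_next(transitions, current):
    """Return the most likely next location, or the current one if unseen."""
    if current in transitions:
        return transitions[current].most_common(1)[0][0]
    return current

history = ["home", "cafe", "office", "cafe", "office", "gym", "home", "cafe", "office"]
model = fit_transitions(history)
print(predict_next(model, "cafe"))   # released in place of the true location
```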


2021 ◽  
Author(s):  
Panjun Sun

The tradeoff between privacy protection and data utility is a research hotspot in the field of privacy protection. Aiming at this tradeoff in the scenario of differentially private offline data release, the optimal differential privacy mechanism is studied using rate-distortion theory. Firstly, based on Shannon's communication theory, the noise channel model of differential privacy is abstracted, mutual information and a distortion function are used to measure the privacy and utility of data publishing, and an optimization model based on rate-distortion theory is constructed. Secondly, considering the influence of associated auxiliary background knowledge on mutual-information privacy leakage, a mutual-information privacy measure based on joint events is proposed, and a minimum privacy leakage model is obtained by modifying the rate-distortion function. Finally, given the difficulty of solving the problem with the Lagrange multiplier method, an approximate algorithm for the mutual-information privacy-optimal channel mechanism is proposed based on alternating iteration. The effectiveness of the proposed iterative approximation method is verified by simulation. The experimental results also show that the proposed method reduces mutual-information privacy leakage under a distortion constraint and improves data utility under the same privacy tolerance.
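An alternating iteration of the Blahut–Arimoto type is the textbook way to approximate a rate-distortion-optimal channel, so a generic version is sketched below as a stand-in for the paper's algorithm: alternately update the output marginal and the channel, tilting toward low distortion. The prior, distortion matrix, and multiplier beta are illustrative assumptions.

```python
# Generic Blahut-Arimoto-style alternating iteration for a privacy channel Q[y|x]
# that trades mutual information against expected distortion.
import numpy as np

def alternating_channel(p_x, distortion, beta=2.0, iters=200):
    """Return a conditional release distribution Q[x, y] after alternating updates."""
    n_x, n_y = distortion.shape
    Q = np.full((n_x, n_y), 1.0 / n_y)              # start from the uniform channel
    for _ in range(iters):
        q_y = p_x @ Q                               # marginal of the released value
        Q = q_y * np.exp(-beta * distortion)        # tilt toward low distortion
        Q /= Q.sum(axis=1, keepdims=True)           # renormalize each row
    return Q

p_x = np.array([0.5, 0.3, 0.2])                     # assumed prior over the secret
d = np.array([[0.0, 1.0, 1.0],                      # Hamming-style distortion
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
print(np.round(alternating_channel(p_x, d), 3))
```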


Author(s):  
Hui Xiu ◽  
Xuemei Jiang ◽  
Xiaomei Zhang

Cloud manufacturing is a new model for increasing manufacturing and business benefits by sharing manufacturing resources. These resources bring users convenience, but they may also be maliciously analyzed by an attacker, which can result in personal or corporate privacy disclosure. In this paper, we discuss the privacy disclosure problem in cloud manufacturing and propose a method for securely releasing order data that accounts for the complex relationships between enterprises and other vendors. Regarding the risk of privacy leakage during data analysis or data mining, we improve the traditional method of anonymously releasing the original order data and introduce the idea of safe k-anonymization to carry out the process. To meet the need to protect sensitive information in the data, we analyze users' different demands for order data in cloud manufacturing, use a sampling function that satisfies (β, ε, δ)-DPS to increase the uncertainty of differential privacy, improve the k-anonymization method, and apply anonymization with generalization and concealment while reducing data associations across different attributes. The improved method not only preserves the statistical characteristics of the data but also protects the private information in the order data in the cloud manufacturing environment.
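A hedged sketch of the sample-then-generalize route the abstract alludes to: include each order record with probability β, then coarsen quasi-identifiers before release. The field names, generalization rules, and β are invented for illustration and are not the paper's (β, ε, δ)-DPS construction.

```python
# Illustrative sampling plus generalization for order records.
import random

def sample_records(records, beta=0.5, rng=None):
    """Keep each record independently with probability beta."""
    rng = rng or random.Random()
    return [r for r in records if rng.random() < beta]

def generalize(record):
    """Coarsen quasi-identifiers: truncate the postcode and bucket the quantity."""
    return {
        "postcode": record["postcode"][:2] + "***",
        "quantity": "<100" if record["quantity"] < 100 else ">=100",
        "item": record["item"],           # non-identifying attribute kept as-is
    }

orders = [
    {"postcode": "10115", "quantity": 40, "item": "bearing"},
    {"postcode": "10245", "quantity": 250, "item": "gearbox"},
    {"postcode": "20095", "quantity": 80, "item": "bearing"},
]
print([generalize(r) for r in sample_records(orders, beta=0.7)])
```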


2020 ◽  
Vol 2020 ◽  
pp. 1-12
Author(s):  
Kangsoo Jung ◽  
Seog Park

With the proliferation of wireless communication and mobile devices, various location-based services are emerging. For these services to grow, more accurate and more varied types of personal location data are required. However, concerns about privacy violations are a significant obstacle to obtaining personal location data. In this paper, we propose a local differential privacy scheme for environments where there is no trusted third party to implement privacy protection techniques, together with incentive mechanisms that motivate users to provide more accurate location data. The proposed local differential privacy scheme allows a user to set a personalized safe region that he/she is willing to disclose and then perturbs the user's location within that safe region. In this way, the scheme satisfies users' various privacy requirements and improves data utility. The proposed incentive mechanism has two models, and both pay the incentive differently according to the size of the user's safe region, motivating users to set a more precise safe region. We verify through experiments that the proposed local differential privacy algorithm and incentive mechanism satisfy the required privacy protection level while achieving desirable utility.
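A minimal sketch of the "perturb within a personalized safe region" idea: the user picks a disclosure radius, and a point drawn uniformly from that disk is reported instead of the true location. The uniform draw and the metres-to-degrees conversion are illustrative assumptions; the paper's perturbation mechanism and incentive models are not reproduced.

```python
# Report a point sampled inside the user's chosen safe region instead of the
# true coordinates.
import math
import random

def perturb_in_safe_region(lat, lon, radius_m, rng=None):
    """Return a uniformly random point inside a disk of radius_m metres."""
    rng = rng or random.Random()
    r = radius_m * math.sqrt(rng.random())          # sqrt gives a uniform disk
    theta = rng.uniform(0.0, 2.0 * math.pi)
    # Rough metres-to-degrees conversion; adequate for small radii.
    dlat = (r * math.cos(theta)) / 111_320.0
    dlon = (r * math.sin(theta)) / (111_320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

print(perturb_in_safe_region(48.8566, 2.3522, radius_m=500))
```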

