Privacy-Preserving Outsourced Support Vector Machine Design for Secure Drug Discovery

2020 ◽ Vol 8 (2) ◽ pp. 610-622
Author(s): Ximeng Liu, Robert H. Deng, Kim-Kwang Raymond Choo, Yang Yang

2019 ◽ Vol 1 (1) ◽ pp. 483-491
Author(s): Makhamisa Senekane

The ubiquity of data, including multimedia data such as images, makes mining and analysis of such data easy. However, the analysis may involve sensitive data such as medical records (including radiological images) and financial records. Privacy-preserving machine learning aims to analyse such data without compromising privacy. There are various privacy-preserving data analysis approaches, such as k-anonymity, l-diversity, t-closeness and Differential Privacy (DP). DP is currently the gold standard for privacy-preserving data analysis because of its robustness against background-knowledge attacks. In this paper, we report a scheme for privacy-preserving image classification using a Support Vector Machine (SVM) and DP. SVM is chosen as the classification algorithm because, unlike variants of artificial neural networks, it converges to a global optimum. The SVM kernels used are linear and Radial Basis Function (RBF), and ε-differential privacy is the DP framework used. The proposed scheme achieved an accuracy of up to 98%. The results obtained underline the utility of SVM and DP for privacy-preserving image classification.
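One common way to combine an SVM with ε-DP is output perturbation: train the classifier, then release its parameters with noise calibrated to ε. The sketch below illustrates that idea for the linear kernel on scikit-learn's digits images; the abstract does not state which mechanism the authors used, so the dataset, the per-coordinate Laplace noise, and the sensitivity bound 2/(nλ) (which assumes unit-norm feature vectors) are illustrative assumptions, not the paper's exact scheme.

```python
# Sketch: eps-DP image classification with a linear-kernel SVM via output perturbation.
# Illustrative assumptions: scikit-learn digits dataset, rows scaled to unit L2 norm,
# sensitivity bound 2/(n*lambda) for L2-regularized hinge loss, and per-coordinate
# Laplace noise as a simplification of the exact calibration.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import Normalizer
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
eps = 1.0                                    # privacy budget epsilon (assumed)
C = 1.0                                      # sklearn's C; lambda = 1 / (n * C)

X, y = load_digits(return_X_y=True)
y = (y == 0).astype(int)                     # binary task: digit "0" vs. the rest
X = Normalizer().fit_transform(X)            # unit-norm rows, so the bound below applies
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = LinearSVC(C=C, max_iter=10000).fit(X_tr, y_tr)

# Output perturbation: release the weights with Laplace noise scaled to the sensitivity.
sensitivity = 2.0 * C                        # = 2 / (n * lambda) with lambda = 1/(n*C)
clf.coef_ = clf.coef_ + rng.laplace(scale=sensitivity / eps, size=clf.coef_.shape)
clf.intercept_ = clf.intercept_ + rng.laplace(scale=sensitivity / eps, size=clf.intercept_.shape)

print(f"epsilon={eps}: test accuracy = {clf.score(X_te, y_te):.3f}")
```

A rigorous deployment would follow the exact output- or objective-perturbation calibration (e.g. the differentially private ERM analysis of Chaudhuri et al.) and would handle the RBF kernel through an explicit feature map rather than the raw kernel.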


2014 ◽ Vol 11 (5) ◽ pp. 467-479
Author(s): Yogachandran Rahulamathavan, Raphael C.-W. Phan, Suresh Veluru, Kanapathippillai Cumanan, Muttukrishnan Rajarajan

2014 ◽ Vol 2014 ◽ pp. 1-10
Author(s): Haoran Li, Li Xiong, Lucila Ohno-Machado, Xiaoqian Jiang

Data sharing is challenging but important for healthcare research. Methods for privacy-preserving data dissemination based on the rigorous differential privacy standard have been developed, but they do not consider the characteristics of biomedical data or make full use of the available information, which often results in too much noise in the final outputs. We hypothesized that this can be alleviated by leveraging a small portion of open-consented data to improve utility without sacrificing privacy. We developed a hybrid differentially private support vector machine (SVM) model that uses public and private data together. Our model leverages the RBF kernel and can handle nonlinearly separable cases. Experiments showed that this approach outperforms two baselines: (1) SVMs that use only public data, and (2) differentially private SVMs built from private data alone. Our method achieved performance very close to that of nonprivate SVMs trained on the private data.
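One plausible realization of such a hybrid model, sketched below under stated assumptions, treats the open-consented records as RBF landmarks that define an explicit feature map, trains a linear SVM on the private records in that feature space, and privatizes only the parameters learned from private data via output perturbation. The dataset, split sizes, γ, ε, and the simplified Laplace calibration are illustrative choices and not necessarily the paper's construction.

```python
# Sketch: hybrid public/private differentially private SVM with an RBF feature map.
# Illustrative assumptions: breast-cancer dataset standing in for biomedical records,
# 50 "open-consented" records used as public landmarks, arbitrary gamma and eps,
# and simplified Laplace output perturbation on the privately learned weights.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
eps, C, gamma = 1.0, 1.0, 0.05

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)
X_rest, X_pub, y_rest, _ = train_test_split(X, y, test_size=50, random_state=1)
X_priv, X_te, y_priv, y_te = train_test_split(X_rest, y_rest, test_size=0.3, random_state=1)

# Public landmarks define an explicit RBF feature map; dividing by sqrt(m) keeps each
# feature vector inside the unit L2 ball, so the sensitivity bound below applies.
m = X_pub.shape[0]
Phi_priv = rbf_kernel(X_priv, X_pub, gamma=gamma) / np.sqrt(m)
Phi_te = rbf_kernel(X_te, X_pub, gamma=gamma) / np.sqrt(m)

clf = LinearSVC(C=C, max_iter=20000).fit(Phi_priv, y_priv)

# Only the parameters learned from private data are perturbed; the public landmarks
# consume no privacy budget.
sensitivity = 2.0 * C                            # 2 / (n * lambda) with lambda = 1/(n*C)
clf.coef_ = clf.coef_ + rng.laplace(scale=sensitivity / eps, size=clf.coef_.shape)
clf.intercept_ = clf.intercept_ + rng.laplace(scale=sensitivity / eps, size=clf.intercept_.shape)

print(f"hybrid DP-SVM test accuracy: {clf.score(Phi_te, y_te):.3f}")
```

Restricting the noise to the weights learned from private data is what lets the public portion improve utility without spending any of the privacy budget on it.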

