Privacy-preserving machine learning based on secure two-party computations

2021, Vol. 28 (4), pp. 39–51
Author(s): Sergey V. Zapechnikov, Andrey Yu. Shcherbakov
2021, Vol. 13 (4), p. 94
Author(s): Haokun Fang, Quan Qian

Privacy protection has become an important concern alongside the great success of machine learning. This paper proposes PFMLP, a multi-party privacy-preserving machine learning framework based on partially homomorphic encryption and federated learning. The core idea is that the learning parties exchange only gradients encrypted under homomorphic encryption. Experiments show that a model trained with PFMLP achieves almost the same accuracy as plaintext training, with a deviation of less than 1%. To reduce the computational overhead of homomorphic encryption, an improved Paillier algorithm is used, which speeds up training by 25–28%. The paper also discusses in detail the effects of the encryption key length, the learning network structure, and the number of learning clients.
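To illustrate the encrypted-gradient exchange described in the abstract, below is a minimal sketch of federated gradient aggregation over Paillier ciphertexts. It is not the authors' PFMLP implementation: it assumes the third-party `phe` (python-paillier) library, omits the improved Paillier variant and the multi-layer perceptron training loop, and uses toy gradient values.

```python
# Minimal sketch of encrypted-gradient aggregation with additive Paillier HE.
# Assumes the third-party `phe` (python-paillier) library; illustrative only,
# not the PFMLP implementation from the paper.
from phe import paillier

# A key holder generates the keypair; clients receive only the public key.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

def encrypt_gradient(gradient, pub):
    """Each client encrypts its local gradient vector element-wise."""
    return [pub.encrypt(g) for g in gradient]

def aggregate(encrypted_gradients):
    """The aggregator sums encrypted gradients without seeing any plaintext
    (Paillier ciphertexts are additively homomorphic)."""
    total = encrypted_gradients[0]
    for enc in encrypted_gradients[1:]:
        total = [a + b for a, b in zip(total, enc)]
    return total

# Example: three clients with toy local gradients.
local_gradients = [
    [0.12, -0.40, 0.05],
    [0.10, -0.35, 0.07],
    [0.11, -0.42, 0.06],
]
encrypted = [encrypt_gradient(g, public_key) for g in local_gradients]
summed = aggregate(encrypted)

# The key holder decrypts only the aggregate and averages it, so individual
# client gradients are never revealed in the clear.
averaged = [private_key.decrypt(c) / len(local_gradients) for c in summed]
print(averaged)  # roughly [0.11, -0.39, 0.06]
```

In a full round, each client would apply the averaged gradient to its local model and repeat; the 25–28% speedup reported in the paper comes from the improved Paillier algorithm, which this sketch does not include.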


2021
Author(s): Adnesh Dhamangaonkar, Prajwal Adsul, Rohini Sarode, Sunil Mane

Author(s): Imtiyazuddin Shaik, Ajeet Kumar Singh, Harika Narumanchi, Nitesh Emmadi, Rajan Mindigal Alasingara Bhattachar

2020
Author(s): Nathan Martindale, Scott Stewart, Mark Adams, Greg Westphal
