A Light-Weight Crowdsourcing Aggregation in Privacy-Preserving Federated Learning System

Author(s): Ke Zhang, Siu Ming Yiu, Lucas Chi Kwong Hui

2017, Vol 82, pp. 56-64
Author(s): Wenting Shen, Jia Yu, Hui Xia, Hanlin Zhang, Xiuqing Lu, ...

Electronics, 2019, Vol 8 (4), pp. 411
Author(s): Fengyi Tang, Wei Wu, Jian Liu, Huimei Wang, Ming Xian

The flourishing of deep learning on distributed training datasets raises concerns about data privacy. Recent work on privacy-preserving distributed deep learning rests on the assumption that the server does not collude with any learning participant; once they collude, the server can decrypt the data of all learning participants. Moreover, because all learning participants share the same private key, each participant must connect to the server over a distinct TLS/SSL secure channel to avoid leaking data to the other participants. To fix these problems, we propose a privacy-preserving distributed deep learning scheme with the following improvements: (1) no information is leaked to the server even if a learning participant colludes with it; (2) learning participants do not need separate secure channels to communicate with the server; and (3) the deep learning model achieves higher accuracy. We achieve these by introducing a key transform server and applying homomorphic re-encryption to asynchronous stochastic gradient descent in deep learning. We show that our scheme adds a tolerable communication cost to the deep learning system while achieving stronger security properties, and the computational cost for learning participants remains comparable. Overall, our scheme is a more secure and more accurate deep learning scheme for distributed learning participants.
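The building block behind this kind of scheme is additively homomorphic encryption of gradients, so the server can aggregate updates without seeing them. Below is a minimal sketch, assuming the `phe` (python-paillier) library and a simplified single-key setting; the paper's actual protocol additionally uses homomorphic re-encryption through a key transform server, which is not modelled here, and the gradient values are purely illustrative.

```python
# Minimal sketch of additively homomorphic gradient aggregation (not the
# paper's full protocol). Assumes the `phe` (python-paillier) library.
from phe import paillier

# Simplified single-key setting: all participants share one key pair.
# Removing this shared-key assumption is exactly what the key transform
# server and homomorphic re-encryption in the paper are for.
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

def encrypt_gradient(grad):
    """Participant side: encrypt each gradient component before upload."""
    return [public_key.encrypt(g) for g in grad]

def aggregate(encrypted_grads):
    """Server side: sum ciphertexts component-wise without seeing plaintexts."""
    total = encrypted_grads[0]
    for enc in encrypted_grads[1:]:
        total = [a + b for a, b in zip(total, enc)]
    return total

# Hypothetical gradients reported by three participants.
grads = [[0.1, -0.2, 0.05], [0.3, 0.1, -0.15], [-0.05, 0.2, 0.1]]
enc_sum = aggregate([encrypt_gradient(g) for g in grads])

# Only a key holder can recover the aggregated update.
agg = [private_key.decrypt(c) for c in enc_sum]
print([round(v, 4) for v in agg])  # [0.35, 0.1, 0.0]
```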


Author(s): Miao Hu, Di Wu, Run Wu, Zhengkai Shi, Min Chen, ...

Author(s): Harsh Kasyap, Somanath Tripathy

Clinical trials and drug discovery would not be effective without collaboration among institutions, yet such collaboration has historically come at the cost of individual privacy. Several pacts and compliance regimes have been enforced to avoid data breaches. Because collaboration is indispensable for research advances, existing schemes collect participants' data into a central repository for learning predictions. The current COVID pandemic has put a question mark on this setup, as centralized data repositories have proved obsolete; there is a need for contemporary data collection, processing, and learning. The smartphones and devices held by even the last person in society make everyone a potential contributor. This demands a distributed and decentralized collaborative learning system that draws knowledge from every data point. Federated Learning [21], proposed by Google, introduces in-place model training that keeps the data on the device. Although it is privacy-preserving by design, it is susceptible to inference, poisoning, and Sybil attacks. Blockchain is a decentralized programming paradigm that provides broader control of the system, making it attack resistant, but it poses challenges of high computing power, storage, and latency. Together, these emerging technologies can realize the desired learning system, which motivates addressing their security and efficiency issues. This article systematizes the security issues in Federated Learning, their corresponding mitigation strategies, and Blockchain's challenges. Further, a Blockchain-based Federated Learning architecture with two layers of participation is presented, which improves global model accuracy and guarantees participant privacy. It leverages the channel mechanism of Blockchain for parallel model training and distribution, and it establishes decentralized trust between participants and gateways using the Blockchain, which helps ensure that only honest participants take part.
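As a rough illustration of the two-layer participation described above, the following sketch simulates participants sending local updates to their gateway (channel), gateway-level federated averaging, and a second aggregation of the gateway models into the global model. It is a minimal in-process sketch assuming only NumPy; the Blockchain, channel, and decentralized-trust machinery of the proposed architecture are not modelled, and the names (local_update, fedavg, gateway_A, gateway_B) are illustrative, not taken from the article.

```python
# Two-layer federated averaging sketch: participants -> gateways -> global model.
# Gateways and participants are simulated in-process; no Blockchain is involved.
import numpy as np

def local_update(global_w, data_size, rng):
    """Stand-in for on-device training: return perturbed weights and sample count."""
    return global_w + 0.01 * rng.standard_normal(global_w.shape), data_size

def fedavg(updates):
    """Weighted average of (weights, num_samples) pairs."""
    total = sum(n for _, n in updates)
    return sum(w * (n / total) for w, n in updates)

rng = np.random.default_rng(0)
global_w = np.zeros(4)

# Two gateways (channels), each with its own participants' dataset sizes.
channels = {"gateway_A": [120, 80, 200], "gateway_B": [150, 50]}

for round_id in range(3):
    gateway_models = []
    for gateway, participant_sizes in channels.items():
        # First layer: channel-level aggregation of participant updates.
        updates = [local_update(global_w, n, rng) for n in participant_sizes]
        gateway_models.append((fedavg(updates), sum(participant_sizes)))
    # Second layer: gateway models are combined into the global model.
    global_w = fedavg(gateway_models)

print(global_w)
```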

