Parallel Stochastic Gradient Descent
Recently Published Documents


TOTAL DOCUMENTS: 17 (last five years: 11)

H-INDEX: 4 (last five years: 1)

2021 · Vol 2021 · pp. 1-14
Author(s): Yuan Yuan, Zongrui Zou, Dong Li, Li Yan, Dongxiao Yu

Decentralized machine learning plays an essential role in improving training efficiency and has been applied in many real-world scenarios, such as edge computing and the IoT. In practice, however, networks are dynamic, and there is a risk of information leakage during communication. To address this problem, we propose a decentralized parallel stochastic gradient descent algorithm with differential privacy, D-(DP)²SGD, for dynamic networks. With rigorous analysis, we show that D-(DP)²SGD converges at a rate of O(1/√(Kn)) while satisfying ε-DP, almost the same convergence rate as previous works without privacy guarantees. To the best of our knowledge, ours is the first known decentralized parallel SGD algorithm that can be implemented in dynamic networks while taking privacy preservation into consideration.
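The abstract combines three ingredients: local SGD steps, noise added for differential privacy, and gossip averaging over a (possibly time-varying) communication graph. As a rough illustration only, the Python sketch below puts these together for one synchronous round. The function name, the gradient-clipping step, the Laplace noise mechanism, and the mixing matrix W are assumptions made for this sketch; they are not taken from the paper itself.

```python
import numpy as np

# Minimal sketch of one round of decentralized parallel SGD with
# differential privacy, in the spirit of the D-(DP)2SGD abstract above.
# All names and parameter choices here are illustrative assumptions,
# not the paper's actual algorithm.

def dp_decentralized_sgd_round(params, grads, W, lr=0.1, clip=1.0,
                               scale=0.5, rng=None):
    """One synchronous round over n nodes.

    params : (n, d) array of local model copies, one row per node
    grads  : (n, d) array of local stochastic gradients
    W      : (n, n) doubly stochastic mixing matrix of the current
             (possibly time-varying) communication graph
    scale  : Laplace noise scale; larger values mean stronger privacy
    """
    rng = rng if rng is not None else np.random.default_rng()
    n, d = params.shape
    privatized = np.empty_like(params)
    for i in range(n):
        # Clip the local gradient so its norm is bounded by `clip`,
        # which makes the Laplace noise scale meaningful for epsilon-DP.
        g = grads[i] * min(1.0, clip / (np.linalg.norm(grads[i]) + 1e-12))
        # Perturb before sharing: neighbors only ever see a noisy update.
        privatized[i] = params[i] - lr * (g + rng.laplace(0.0, scale, size=d))
    # Gossip step: each node averages the privatized models of its
    # neighbors according to the current topology encoded in W.
    return W @ privatized
```

In a dynamic network, W would be rebuilt every round from the current edge set. Note that the abstract's ε-DP and O(1/√(Kn)) guarantees depend on how the noise scale, clipping bound, and mixing matrices are chosen over time, which this sketch does not attempt to reproduce.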

