Communication cost reduction using sparse ternary compression and encoding for FedAvg

Author(s): Thi Quynh Khanh Dinh, Thanh-Hai Tran, Thi-Lan Le

2004 · Vol 12 (2) · pp. 159
Author(s): Igor Cavrak, Armin Stranjak, Mario Zagar

Electronics · 2021 · Vol 10 (17) · pp. 2081
Author(s): Dongseok Kang, Chang Wook Ahn

Federated learning is a distributed learning algorithm that trains a single server model using many clients and their local data. Improving the server model requires continual communication with the clients, and because the number of clients can be very large, the algorithm must be designed with communication cost in mind. In this paper, we propose a method that distributes to each client a model whose structure differs from that of the server model, matched to the client's data size, and trains the server model from the reconstructed models trained by the clients. The server deploys only a subset of its sequential model, collects the gradient updates, and selectively applies those updates to the corresponding parts of the server model. Delivering smaller models at lower cost to the clients that need only smaller models reduces the communication cost of training the server model compared with the standard method. An image classification model was designed to verify the effectiveness of the proposed method across three data distribution settings and two datasets, and training was accomplished at only 0.229 times the cost of the standard method.
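
The deploy-subset / collect-updates / selectively-apply loop in the abstract can be illustrated in a few lines. The following is a minimal sketch, not the authors' implementation: the numpy layer stack, the client depths, and the make_submodel/client_update helpers are illustrative assumptions standing in for the paper's model reconstruction and selective-update procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Server model: a sequential stack of weight matrices (biases omitted).
server_layers = [rng.normal(size=(8, 8)) for _ in range(4)]

def make_submodel(depth):
    """Deploy only the first `depth` layers to a client (lower send cost)."""
    return [w.copy() for w in server_layers[:depth]]

def client_update(submodel, lr=0.1):
    """Stand-in for local training: returns one update per received layer.
    A real client would compute gradients on its local data."""
    return [lr * rng.normal(size=w.shape) for w in submodel]

# Clients with smaller datasets receive shallower submodels.
client_depths = [2, 3, 4]
updates_per_layer = [[] for _ in server_layers]

for depth in client_depths:
    sub = make_submodel(depth)
    for i, delta in enumerate(client_update(sub)):
        updates_per_layer[i].append(delta)

# Selective application: only layers some client actually trained change,
# averaged FedAvg-style over the clients that held that layer.
for i, deltas in enumerate(updates_per_layer):
    if deltas:
        server_layers[i] -= np.mean(deltas, axis=0)
```

In this sketch the communication saving comes from the shallower clients never receiving (or returning) the deeper layers; the server still advances every layer that at least one client trained.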


2016 · Vol 15 (2) · pp. 843-856
Author(s): Yanqiu Huang, Wanli Yu, Christof Osewold, Alberto Garcia-Ortiz
