Author(s): Shaoqi Wang, Aidi Pi, Xiaobo Zhou

The scalability of distributed deep learning (DL) training with the parameter server (PS) architecture is often communication-constrained in large clusters. Recent efforts use a layer-by-layer strategy to overlap gradient communication with backward computation, thereby reducing the impact of the communication bottleneck on scalability. However, these approaches cannot be effectively applied to overlapping parameter communication with forward computation. In this paper, we propose and design iBatch, a novel communication approach that batches parameter communication and forward computation so that the two overlap each other. We formulate the batching decision as an optimization problem and solve it with a greedy algorithm to derive the communication and computation batches. We implement iBatch in the open-source DL framework BigDL and evaluate it with various DL workloads. Experimental results show that iBatch improves the scalability of a 72-node cluster by up to 73% over the default PS approach and by 41% over the layer-by-layer strategy.
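The abstract does not give the batching formulation or the greedy solver, so the following Python sketch only illustrates the core idea under a simplified cost model: layers are greedily grouped into communication batches so that pulling the parameters of one batch hides behind the forward computation of the previous batch. The Layer fields, timing numbers, and the "fit within the previous batch's compute time" rule are all assumptions made for this example, not iBatch's actual algorithm.

```python
# Toy sketch of the batching idea, NOT iBatch's actual algorithm: the cost
# model, timings, and layer names below are hypothetical.
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    comm_time: float     # time to pull this layer's parameters from the PS
    compute_time: float  # forward computation time of this layer

def greedy_batches(layers):
    """Grow each batch while its parameter-pull time still fits within the
    forward-computation time of the previous batch."""
    batches = [[layers[0]]]  # the first pull has nothing to hide behind
    current, comm = [], 0.0
    for layer in layers[1:]:
        budget = sum(l.compute_time for l in batches[-1])
        if current and comm + layer.comm_time > budget:
            batches.append(current)      # batch full: close it
            current, comm = [], 0.0
        current.append(layer)
        comm += layer.comm_time
    if current:
        batches.append(current)
    return batches

def makespan(batches):
    """Finish time when communication (one serial channel) overlaps with
    computation: a batch starts computing once its parameters have arrived
    and the previous batch's computation is done."""
    comm_end = compute_end = 0.0
    for batch in batches:
        comm_end += sum(l.comm_time for l in batch)
        compute_end = max(comm_end, compute_end) + sum(
            l.compute_time for l in batch)
    return compute_end

layers = [Layer(f"layer{i}", c, f)
          for i, (c, f) in enumerate([(4, 2), (1, 3), (2, 5), (3, 2), (2, 4)])]
batches = greedy_batches(layers)
print([[l.name for l in b] for b in batches])
# [['layer0'], ['layer1'], ['layer2'], ['layer3', 'layer4']]
serial = sum(l.comm_time + l.compute_time for l in layers)
print(f"serial: {serial}, overlapped: {makespan(batches)}")
# serial: 28, overlapped: 20.0
```

On these toy timings, the batched schedule finishes in 20 time units versus 28 when communication and computation run strictly serially, which is the kind of overlap gain the batching approach targets.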

