DenseStream: A Novel Data Representation for Gradient Sparsification in Distributed Synchronous SGD Algorithms
Guangyao Li; Mingxue Liao; Yongyue Chao; Pin Lv
2023-05
Conference dates: May 18-23, 2023
Conference venue: Queensland, Australia
Abstract

Distributed training is widely used to train large-scale deep learning models, and data parallelism is one of the dominant approaches. Data-parallel training incurs additional communication overhead, which greatly slows training at low bandwidth. Gradient sparsification is a promising technique for reducing the communication volume: it keeps a small number of important gradient values and sets the rest to zero. However, the communication of sparsified gradients suffers from scalability issues because (1) the communication volume of the AllGather algorithm, which is commonly used to accumulate sparse gradients, increases linearly with the number of nodes, and (2) sparse local gradients may become dense again after gradient accumulation. These issues hinder the application of gradient sparsification. We observe that the distribution of sparse gradient values exhibits strong locality, and we therefore propose DenseStream, a novel data representation for sparse gradients in data-parallel training that alleviates these issues. DenseStream integrates an efficient sparse AllReduce algorithm with synchronous SGD (S-SGD). Evaluations are conducted on real-world applications. Experimental results show that DenseStream achieves a better compression ratio at higher densities and can represent sparse vectors over a wider range of densities. Compared with dense AllReduce, our method is more scalable and achieves a 3.1-12.1x improvement.
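As a rough illustration of the sparsification step the abstract describes, below is a minimal top-k sketch in PyTorch (the framework is an assumption; the name topk_sparsify and its density parameter are hypothetical, not the paper's API). It shows the (values, indices) payload a worker would exchange via AllGather instead of the dense gradient, and why the received volume grows linearly with the number of workers.

    import torch

    def topk_sparsify(grad: torch.Tensor, density: float = 0.01):
        # Keep the top `density` fraction of entries by magnitude; the rest
        # are implicitly zero. Returns the sparse payload (values, indices)
        # a worker would communicate instead of the dense gradient.
        flat = grad.flatten()
        k = max(1, int(flat.numel() * density))
        _, indices = torch.topk(flat.abs(), k)
        values = flat[indices]  # recover the signed values at those positions
        return values, indices

    # With P workers, AllGather delivers P * k (value, index) pairs to every
    # node, so the received volume grows linearly with P -- the first
    # scalability issue the abstract points out.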

Content type: Conference paper
Source URL: http://ir.ia.ac.cn/handle/173211/52216
Collection: Integrated Innovation Center (融合创新中心)
Corresponding author: Mingxue Liao
Author affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended citation
GB/T 7714
Guangyao Li, Mingxue Liao, Yongyue Chao, et al. DenseStream: A Novel Data Representation for Gradient Sparsification in Distributed Synchronous SGD Algorithms[C]. Queensland, Australia. May 18-23, 2023.