ShadowSync: Performing Synchronization in the Background for Highly Scalable Distributed Training

7 Mar 2020 · Qinqing Zheng, Bor-Yiing Su, Jiyan Yang, Alisson Azzolini, Qiang Wu, Ou Jin, Shri Karandikar, Hagay Lupesko, Liang Xiong, Eric Zhou

Ads recommendation systems are often trained on a tremendous amount of data, and distributed training is the workhorse for shortening the training time. Meanwhile, a commonly used technique to prevent overfitting in ads recommendation is one-pass training...
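The core idea named in the title, performing synchronization in the background, can be illustrated with a minimal sketch: each trainer applies updates to a local copy of the parameters while a separate "shadow" thread periodically averages that copy with a shared store, so the training loop never blocks on communication. All class and method names below are illustrative assumptions, not the paper's actual API, and the shared in-process store stands in for real cross-host communication.

```python
import threading
import time

class SharedStore:
    """Shared parameters that workers average against (a stand-in for peers)."""
    def __init__(self, params):
        self.params = dict(params)
        self.lock = threading.Lock()

    def average_with(self, local):
        # Blend the caller's parameters into the store and return the blend.
        with self.lock:
            for k in self.params:
                self.params[k] = 0.5 * (self.params[k] + local[k])
            return dict(self.params)

class Trainer:
    def __init__(self, store, params, sync_interval=0.001):
        self.store = store
        self.params = dict(params)       # local copy used by the training loop
        self.lock = threading.Lock()
        self.sync_interval = sync_interval
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._sync_loop, daemon=True)

    def start(self):
        self._thread.start()

    def stop(self):
        self._stop.set()
        self._thread.join()

    def step(self, grads, lr=0.05):
        # Foreground training step: touches only the local copy, never
        # waits on communication.
        with self.lock:
            for k, g in grads.items():
                self.params[k] -= lr * g

    def _sync_loop(self):
        # Background "shadow" thread: exchange parameters with the store
        # without stalling step(). The write-back may race with a concurrent
        # step, which is tolerated in this loosely consistent scheme.
        while not self._stop.is_set():
            with self.lock:
                snapshot = dict(self.params)
            blended = self.store.average_with(snapshot)
            with self.lock:
                self.params = blended
            time.sleep(self.sync_interval)

# Usage: two trainers minimize (w - 3)^2 on their local copies while the
# background threads keep the copies loosely in sync; both drift toward 3.
store = SharedStore({"w": 0.0})
trainers = [Trainer(store, {"w": 0.0}) for _ in range(2)]
for t in trainers:
    t.start()
for _ in range(200):
    for t in trainers:
        w = t.params["w"]
        t.step({"w": 2.0 * (w - 3.0)})  # gradient of (w - 3)^2
for t in trainers:
    t.stop()
```

Because synchronization is asynchronous, the local copies are only loosely consistent, which is exactly the trade-off this line of work accepts in exchange for keeping the training loop free of communication stalls.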
