Search Results for author: T. S. Eugene Ng

Found 3 papers, 1 paper with code

Zen: Near-Optimal Sparse Tensor Synchronization for Distributed DNN Training

no code implementations · 23 Sep 2023 · Zhuang Wang, Zhaozhuo Xu, Anshumali Shrivastava, T. S. Eugene Ng

We then systematically explore the design space of communication schemes for sparse tensors and find the optimal one.
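The snippet above refers to communication schemes for sparse gradient tensors. As background, a common way to obtain such sparse tensors is top-k sparsification, where each worker sends only the largest-magnitude gradient entries as (index, value) pairs instead of the dense tensor. The sketch below is illustrative only and is not the paper's scheme; the function names are hypothetical.

```python
import heapq

def topk_sparsify(grad, k):
    """Keep the k largest-magnitude entries of a flat gradient vector.

    Returns a sparse representation as sorted (index, value) pairs --
    this is what would be communicated instead of the dense tensor.
    """
    idx = heapq.nlargest(k, range(len(grad)), key=lambda i: abs(grad[i]))
    return [(i, grad[i]) for i in sorted(idx)]

def densify(pairs, n):
    """Rebuild a dense length-n vector from (index, value) pairs."""
    out = [0.0] * n
    for i, v in pairs:
        out[i] = v
    return out

grad = [0.1, -2.0, 0.05, 3.0, -0.2, 0.7]
sparse = topk_sparsify(grad, 2)   # [(1, -2.0), (3, 3.0)]
dense = densify(sparse, len(grad))  # [0.0, -2.0, 0.0, 3.0, 0.0, 0.0]
```

The design-space question the paper alludes to is how workers exchange such (index, value) pairs, since sparse tensors from different workers generally have non-overlapping indices and cannot be combined with a plain all-reduce.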

ByteComp: Revisiting Gradient Compression in Distributed Training

no code implementations · 28 May 2022 · Zhuang Wang, Haibin Lin, Yibo Zhu, T. S. Eugene Ng

It first designs a decision tree abstraction to express all the compression strategies, then develops empirical models of the timelines of tensor computation, communication, and compression, enabling ByteComp to derive the intricate interactions among tensors.

MergeComp: A Compression Scheduler for Scalable Communication-Efficient Distributed Training

1 code implementation · 28 Mar 2021 · Zhuang Wang, Xinyu Wu, T. S. Eugene Ng

It can even achieve a scaling factor of up to 99% for distributed training over high-speed networks.
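For context on the 99% figure: the scaling factor of distributed training is conventionally the measured aggregate throughput divided by the ideal linear speedup (single-worker throughput times the number of workers). A minimal illustration, with made-up numbers rather than the paper's measurements:

```python
def scaling_factor(single_worker_throughput, n_workers, measured_throughput):
    """Fraction of ideal linear speedup actually achieved."""
    ideal = single_worker_throughput * n_workers
    return measured_throughput / ideal

# Hypothetical example: 8 workers, each capable of 100 samples/s alone,
# jointly sustaining 792 samples/s.
print(scaling_factor(100, 8, 792))  # 0.99, i.e. a 99% scaling factor
```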
