Search Results for author: Guangfeng Yan

Found 7 papers, 1 paper with code

Truncated Non-Uniform Quantization for Distributed SGD

no code implementations • 2 Feb 2024 • Guangfeng Yan, Tan Li, Yuanzhang Xiao, Congduan Li, Linqi Song

To address the communication bottleneck challenge in distributed learning, our work introduces a novel two-stage quantization strategy designed to enhance the communication efficiency of distributed Stochastic Gradient Descent (SGD).

Quantization
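
For intuition, here is a minimal NumPy sketch of a two-stage truncate-then-quantize pipeline like the one described above: gradients are first clipped to a truncation threshold and then stochastically rounded onto a non-uniform (log-spaced) grid. The threshold value and the log-spaced level placement are illustrative assumptions, not the exact scheme from the paper.

```python
import numpy as np

def truncated_nonuniform_quantize(grad, threshold, num_levels=16, rng=None):
    """Two-stage sketch: (1) truncate the gradient to [-threshold, threshold],
    (2) stochastically round magnitudes onto a non-uniform (logarithmic) grid.
    The log-spaced grid is an illustrative choice, not the paper's exact scheme."""
    rng = np.random.default_rng() if rng is None else rng
    clipped = np.clip(grad, -threshold, threshold)   # stage 1: truncation
    sign, mag = np.sign(clipped), np.abs(clipped)
    # stage 2: non-uniform levels, denser near zero where most coordinates lie
    levels = np.concatenate(([0.0], np.geomspace(threshold / num_levels, threshold, num_levels)))
    idx = np.searchsorted(levels, mag, side="right") - 1
    lo, hi = levels[idx], levels[np.minimum(idx + 1, len(levels) - 1)]
    # stochastic rounding keeps the quantizer unbiased on [lo, hi]
    p_up = np.where(hi > lo, (mag - lo) / np.maximum(hi - lo, 1e-12), 0.0)
    quant_mag = np.where(rng.random(mag.shape) < p_up, hi, lo)
    return sign * quant_mag

# usage: quantize a simulated heavy-tailed gradient before communication
g = np.random.standard_t(df=3, size=1000)
g_hat = truncated_nonuniform_quantize(g, threshold=3.0)
```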

Improved Quantization Strategies for Managing Heavy-tailed Gradients in Distributed Learning

no code implementations • 2 Feb 2024 • Guangfeng Yan, Tan Li, Yuanzhang Xiao, Hanxu Hou, Linqi Song

We consider a general family of heavy-tailed gradients that follow a power-law distribution and aim to minimize the error resulting from quantization, thereby determining optimal values for two critical parameters: the truncation threshold and the quantization density.

Quantization
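
The trade-off behind choosing the truncation threshold can be illustrated numerically: truncating more aggressively shrinks the rounding step but increases clipping bias. The sketch below searches for a threshold that minimizes a simple empirical error proxy on Pareto-distributed (power-law) samples; the error expression, the Pareto exponent, and the fixed number of levels are assumptions for illustration, not the paper's closed-form optimum (which also optimizes the quantization density).

```python
import numpy as np

def quantization_mse(threshold, num_levels, samples):
    """Empirical MSE proxy for truncate-then-uniformly-quantize on sampled
    gradient coordinates: clipping bias outside [-T, T] plus rounding error
    inside (step^2 / 12 for a uniform grid). Illustrative, not the paper's bound."""
    clip_err = np.mean((np.abs(samples) - threshold).clip(min=0.0) ** 2)
    step = 2.0 * threshold / num_levels
    return clip_err + step ** 2 / 12.0

# draw coordinates from a heavy-tailed (Pareto / power-law) model of the gradient
rng = np.random.default_rng(0)
samples = (rng.pareto(a=2.5, size=100_000) + 1.0) * rng.choice([-1.0, 1.0], size=100_000)

# grid search over the truncation threshold for a fixed bit budget (num_levels)
thresholds = np.linspace(0.5, 20.0, 200)
best_T = min(thresholds, key=lambda T: quantization_mse(T, num_levels=16, samples=samples))
print(f"chosen truncation threshold: {best_T:.2f}")
```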

Killing Two Birds with One Stone: Quantization Achieves Privacy in Distributed Learning

no code implementations • 26 Apr 2023 • Guangfeng Yan, Tan Li, Kui Wu, Linqi Song

Communication efficiency and privacy protection are two critical issues in distributed machine learning.

Quantization

Adaptive Top-K in SGD for Communication-Efficient Distributed Learning

no code implementations • 24 Oct 2022 • Mengzhe Ruan, Guangfeng Yan, Yuanzhang Xiao, Linqi Song, Weitao Xu

This paper proposes a novel adaptive Top-K SGD framework that adapts the degree of sparsification at each gradient descent step, optimizing convergence performance by balancing the trade-off between communication cost and convergence error.
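
A rough sketch of per-step Top-K sparsification with an adaptive K is given below; the rule that picks K (retain a target fraction of the gradient's squared norm, clamped to a range) is a hypothetical placeholder, not the adaptation rule derived in the paper.

```python
import numpy as np

def top_k_sparsify(grad, k):
    """Keep the k largest-magnitude coordinates; zero out the rest."""
    sparse = np.zeros_like(grad)
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    sparse[idx] = grad[idx]
    return sparse

def adaptive_k(grad, k_min, k_max, target_ratio=0.9):
    """Illustrative adaptation rule (not the paper's): pick the smallest k that
    retains target_ratio of the gradient's squared norm, clamped to [k_min, k_max]."""
    sq_desc = np.sort(np.abs(grad))[::-1] ** 2     # squared magnitudes, descending
    cum = np.cumsum(sq_desc)                       # energy captured by the top-k entries
    k = int(np.searchsorted(cum, target_ratio * cum[-1]) + 1)
    return int(np.clip(k, k_min, k_max))

# usage: per-step sparsification with a step-dependent degree of compression
g = np.random.standard_normal(10_000)
k = adaptive_k(g, k_min=50, k_max=2_000)
g_sparse = top_k_sparsify(g, k)
```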

DQ-SGD: Dynamic Quantization in SGD for Communication-Efficient Distributed Learning

no code implementations • 30 Jul 2021 • Guangfeng Yan, Shao-Lun Huang, Tian Lan, Linqi Song

Gradient quantization is an emerging technique in reducing communication costs in distributed learning.

Quantization

DQSGD: Dynamic Quantized Stochastic Gradient Descent for Communication-Efficient Distributed Learning

no code implementations • 1 Jan 2021 • Guangfeng Yan, Shao-Lun Huang, Tian Lan, Linqi Song

This paper addresses this issue by proposing a novel dynamic quantized SGD (DQSGD) framework, which enables us to optimize the quantization strategy for each gradient descent step by exploring the trade-off between communication cost and modeling error.

Quantization
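
As a rough illustration of per-step dynamic quantization, the sketch below varies the bit budget with the current gradient norm before applying a norm-scaled stochastic uniform quantizer; the bit-allocation rule and the reference norm are hypothetical, not the schedule derived in DQSGD.

```python
import numpy as np

def stochastic_uniform_quantize(grad, bits, rng=None):
    """Norm-scaled stochastic uniform quantization with 2**bits - 1 positive levels."""
    rng = np.random.default_rng() if rng is None else rng
    scale = np.max(np.abs(grad)) + 1e-12
    levels = 2 ** bits - 1
    scaled = np.abs(grad) / scale * levels
    low = np.floor(scaled)
    quant = low + (rng.random(grad.shape) < (scaled - low))   # unbiased rounding
    return np.sign(grad) * quant / levels * scale

def pick_bits(grad_norm, ref_norm, b_min=2, b_max=8):
    """Illustrative per-step bit allocation (not the paper's exact rule):
    spend more bits when the current gradient norm is large relative to a
    reference norm, so the added quantization error stays controlled."""
    ratio = max(grad_norm / (ref_norm + 1e-12), 1e-12)
    bits = b_min + 0.5 * np.log2(1.0 + ratio) * (b_max - b_min)
    return int(np.clip(round(bits), b_min, b_max))

# usage: per-step quantization with a dynamically chosen bit budget
g = np.random.standard_normal(10_000)
bits = pick_bits(np.linalg.norm(g), ref_norm=100.0)
g_hat = stochastic_uniform_quantize(g, bits)
```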
