Search Results for author: Shiqi He

Found 5 papers, 3 papers with code

NL4Opt Competition: Formulating Optimization Problems Based on Their Natural Language Descriptions

1 code implementation • 14 Mar 2023 • Rindranirina Ramamonjison, Timothy T. Yu, Raymond Li, Haley Li, Giuseppe Carenini, Bissan Ghaddar, Shiqi He, Mahdi Mostajabdaveh, Amin Banitalebi-Dehkordi, Zirui Zhou, Yong Zhang

The Natural Language for Optimization (NL4Opt) Competition was created to investigate methods of extracting the meaning and formulation of an optimization problem based on its text description.

Language Modelling • Large Language Model

GlueFL: Reconciling Client Sampling and Model Masking for Bandwidth Efficient Federated Learning

no code implementations • 3 Dec 2022 • Shiqi He, Qifan Yan, Feijie Wu, Lanjun Wang, Mathias Lécuyer, Ivan Beschastnikh

Federated learning (FL) is an effective technique to directly involve edge devices in machine learning training while preserving client privacy.

Federated Learning • Model Compression
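The GlueFL abstract pairs client sampling with model masking to save bandwidth. As a rough illustration of that setting (not GlueFL's actual algorithm — the helper names, the toy local step, and all constants here are invented for the example), the sketch below runs federated averaging where each round samples a subset of clients and each client uploads only a top-k-masked update:

```python
import numpy as np

rng = np.random.default_rng(0)

def top_k_mask(update, k):
    """Model masking: keep only the k largest-magnitude entries of an update."""
    keep = np.argsort(np.abs(update))[-k:]
    mask = np.zeros_like(update, dtype=bool)
    mask[keep] = True
    return update * mask

def fl_round(global_model, client_data, sample_size, k):
    """One round: sample clients, collect masked local updates, average them."""
    sampled = rng.choice(len(client_data), size=sample_size, replace=False)
    updates = [
        top_k_mask(client_data[c].mean(axis=0) - global_model, k)  # toy local step
        for c in sampled
    ]
    return global_model + np.mean(updates, axis=0)

# Toy setup: 10 clients whose data all cluster around 1.0.
clients = [rng.normal(loc=1.0, scale=0.1, size=(20, 8)) for _ in range(10)]
model = np.zeros(8)
for _ in range(30):
    model = fl_round(model, clients, sample_size=4, k=4)
```

With only half the coordinates transmitted per client per round, the model still drifts toward the clients' shared optimum, since different rounds mask different coordinates.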

Anchor Sampling for Federated Learning with Partial Client Participation

1 code implementation • 13 Jun 2022 • Feijie Wu, Song Guo, Zhihao Qu, Shiqi He, Ziming Liu, Jing Gao

Under partial client participation, the absence of updates from inactive clients makes the model aggregation more likely to deviate from the aggregation obtained under full client participation.

Federated Learning
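The deviation this abstract describes is easy to see numerically. The snippet below is a generic illustration, not the paper's anchor-sampling method: it compares the aggregate update from a random client subset against the full-participation average for heterogeneous clients, where smaller samples should deviate more on average.

```python
import numpy as np

rng = np.random.default_rng(1)
num_clients, dim = 100, 16

# Heterogeneous clients: each holds its own update direction.
client_updates = rng.normal(size=(num_clients, dim))
full_avg = client_updates.mean(axis=0)  # full-participation aggregation

def partial_avg(sample_size):
    """Aggregate over a random subset of clients (partial participation)."""
    idx = rng.choice(num_clients, size=sample_size, replace=False)
    return client_updates[idx].mean(axis=0)

def mean_deviation(sample_size, trials=200):
    """Average distance from the full-participation aggregate."""
    return float(np.mean([
        np.linalg.norm(partial_avg(sample_size) - full_avg)
        for _ in range(trials)
    ]))

dev_small = mean_deviation(10)   # few participating clients
dev_large = mean_deviation(80)   # most clients participate
```

Averaged over many draws, the 10-client aggregate sits noticeably farther from the full average than the 80-client one, which is the gap that methods for partial participation aim to close.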

Sign Bit is Enough: A Learning Synchronization Framework for Multi-hop All-reduce with Ultimate Compression

no code implementations • 14 Apr 2022 • Feijie Wu, Shiqi He, Song Guo, Zhihao Qu, Haozhao Wang, Weihua Zhuang, Jie Zhang

Traditional one-bit compressed stochastic gradient descent cannot be directly employed in multi-hop all-reduce, a widely adopted distributed training paradigm in network-intensive high-performance computing systems such as public clouds.
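One way to see the obstacle the abstract points to (a toy illustration, not the paper's framework): in multi-hop all-reduce, partial results are re-compressed at every hop, and the sign of a sum of signs generally disagrees with the sign of the sum of the raw gradients.

```python
import numpy as np

rng = np.random.default_rng(2)
a, b = rng.normal(size=1000), rng.normal(size=1000)  # gradients at two nodes

# All-reduce first, compress once: the sign of the true sum.
true_sign = np.sign(a + b)

# Naive hop-by-hop 1-bit compression: each node sends sign(g), and the
# intermediate hop reduces (sums) the already-compressed signs.
hop_sign = np.sign(np.sign(a) + np.sign(b))

# Wherever a and b disagree in sign, the hop yields 0 instead of ±1.
mismatches = int((true_sign != hop_sign).sum())
```

Roughly half the coordinates disagree between the two nodes' signs, so the naive hop-by-hop scheme loses their contributions entirely, which is why sign-bit methods need a dedicated synchronization framework in this setting.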
