Search Results for author: Hongchang Gao

Found 10 papers, 0 papers with code

Can Stochastic Zeroth-Order Frank-Wolfe Method Converge Faster for Non-Convex Problems?

no code implementations ICML 2020 Hongchang Gao, Heng Huang

To address the lack of gradient information in many applications, we propose two new stochastic zeroth-order Frank-Wolfe algorithms and theoretically prove that they achieve a faster convergence rate than existing methods for non-convex problems.
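The two algorithms themselves are not shown in this snippet. As a rough illustration of the building blocks only, here is a minimal sketch (function names and the l1-ball constraint set are my own choices, not the paper's) of a two-point zeroth-order gradient estimate plugged into a Frank-Wolfe update:

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, num_dirs=20, rng=None):
    """Two-point zeroth-order estimate of the gradient of f at x,
    averaged over random Gaussian directions (no analytic gradient used)."""
    rng = np.random.default_rng(rng)
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(num_dirs):
        u = rng.standard_normal(d)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / num_dirs

def frank_wolfe_step(x, grad, radius, eta):
    """One Frank-Wolfe step over an l1-ball of the given radius: the linear
    minimization oracle puts all mass on the largest-|gradient| coordinate."""
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(x)
    s[i] = -radius * np.sign(grad[i])
    return x + eta * (s - x)
```

Combining the two gives a projection-free method that needs only function evaluations, which is the setting the abstract describes.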

Fast Training Method for Stochastic Compositional Optimization Problems

no code implementations NeurIPS 2021 Hongchang Gao, Heng Huang

The stochastic compositional optimization problem covers a wide range of machine learning models, such as sparse additive models and model-agnostic meta-learning.

Additive models Meta-Learning
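For context on the compositional structure minimized here, F(x) = f(E_ξ[g_ξ(x)]): a standard trick (not necessarily this paper's method) is to track the inner expectation with a running average, since evaluating f' at a single sample g_ξ(x) gives a biased gradient. A minimal sketch with illustrative names and step sizes:

```python
import numpy as np

def scgd(grad_f, g_sample, grad_g_sample, x0, steps=300,
         alpha=0.05, beta=0.5, rng=None):
    """Sketch of stochastic compositional gradient descent for
    min_x f(E[g_xi(x)]): maintain a running estimate y of the inner
    expectation, then apply the chain rule at y instead of at a sample."""
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    y = g_sample(x, rng)                               # initial inner estimate
    for _ in range(steps):
        y = (1 - beta) * y + beta * g_sample(x, rng)   # track E[g(x)]
        grad = grad_g_sample(x, rng).T @ grad_f(y)     # chain rule at y
        x -= alpha * grad
    return x
```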

Accelerated Gradient-Free Method for Heavily Constrained Nonconvex Optimization

no code implementations 29 Sep 2021 Wanli Shi, Hongchang Gao, Bin Gu

In this paper, to solve nonconvex problems with a large number of white/black-box constraints, we propose a doubly stochastic zeroth-order gradient method (DSZOG).
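DSZOG's details are not in this snippet. The sketch below only illustrates the "doubly stochastic" idea under an assumed quadratic-penalty reformulation: each step samples one data term and one constraint, and coordinate-wise finite differences stand in for analytic gradients. All names and constants are illustrative:

```python
import numpy as np

def dszog_sketch(loss_i, constraints, n_data, x0, rho=10.0,
                 mu=1e-4, lr=0.02, steps=600, rng=0):
    """Doubly stochastic zeroth-order sketch: each step samples one data
    term AND one constraint, then estimates the gradient of the sampled
    quadratic-penalty term by coordinate-wise finite differences."""
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    d = x.shape[0]
    for _ in range(steps):
        i = int(rng.integers(n_data))                        # random data term
        c = constraints[int(rng.integers(len(constraints)))] # random constraint
        phi = lambda z: loss_i(z, i) + rho * max(c(z), 0.0) ** 2
        g = np.array([(phi(x + mu * e) - phi(x - mu * e)) / (2 * mu)
                      for e in np.eye(d)])                   # zeroth-order gradient
        x -= lr * g
    return x
```

The penalty weight rho trades constraint violation against objective decrease; the true method's treatment of the penalty is not shown here.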

Delay-Tolerant Local SGD for Efficient Distributed Training

no code implementations 1 Jan 2021 An Xu, Xiao Yan, Hongchang Gao, Heng Huang

Heavy communication for model synchronization is a major bottleneck when scaling distributed deep neural network training to many workers.

Federated Learning
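For context on why local steps cut communication, here is a generic local SGD sketch (not this paper's delay-tolerant algorithm): each worker takes several gradient steps on its own data, and models are averaged only once per round:

```python
import numpy as np

def local_sgd(grad_fns, x0, rounds=20, local_steps=10, lr=0.1, rng=0):
    """Local SGD sketch: each worker runs `local_steps` gradient steps on
    its own data between synchronizations, cutting communication by a
    factor of `local_steps` compared to averaging after every step."""
    rng = np.random.default_rng(rng)
    xs = [np.array(x0, dtype=float) for _ in grad_fns]
    for _ in range(rounds):
        for w, grad_fn in enumerate(grad_fns):
            for _ in range(local_steps):
                xs[w] -= lr * grad_fn(xs[w], rng)
        avg = np.mean(xs, axis=0)          # one synchronization per round
        xs = [avg.copy() for _ in grad_fns]
    return xs[0]
```

The delay-tolerant variant in the paper additionally addresses stale synchronization, which this sketch does not model.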

Adaptive Serverless Learning

no code implementations 24 Aug 2020 Hongchang Gao, Heng Huang

To the best of our knowledge, this is the first adaptive decentralized training approach.

Periodic Stochastic Gradient Descent with Momentum for Decentralized Training

no code implementations 24 Aug 2020 Hongchang Gao, Heng Huang

The condition for achieving linear speedup is also provided for this variant.
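As a generic illustration (not the paper's exact variant), periodic momentum SGD can be sketched as workers running momentum SGD locally and periodically averaging both the model and the momentum buffer:

```python
import numpy as np

def periodic_momentum_sgd(grad_fns, x0, rounds=30, period=5, lr=0.05,
                          beta=0.9, rng=0):
    """Sketch of periodic momentum SGD: workers run momentum SGD locally
    and, once per period, average BOTH the model and the momentum buffer,
    keeping the workers' momentum directions from drifting apart."""
    rng = np.random.default_rng(rng)
    n = len(grad_fns)
    xs = [np.array(x0, dtype=float) for _ in range(n)]
    ms = [np.zeros_like(xs[0]) for _ in range(n)]
    for _ in range(rounds):
        for _ in range(period):
            for w in range(n):
                ms[w] = beta * ms[w] + grad_fns[w](xs[w], rng)
                xs[w] -= lr * ms[w]
        x_avg, m_avg = np.mean(xs, axis=0), np.mean(ms, axis=0)
        xs = [x_avg.copy() for _ in range(n)]
        ms = [m_avg.copy() for _ in range(n)]
    return xs[0]
```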

Deep Relational Factorization Machines

no code implementations 25 Sep 2019 Hongchang Gao, Gang Wu, Ryan Rossi, Viswanathan Swaminathan, Heng Huang

Factorization Machines (FMs) are an important supervised learning approach due to their unique ability to capture feature interactions when dealing with high-dimensional sparse data.
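The standard second-order FM predictor and its O(nk) evaluation trick can be sketched as follows (this is the textbook construction, not code from this paper):

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order factorization machine prediction:
        y = w0 + <w, x> + sum_{i<j} <V_i, V_j> x_i x_j,
    computed in O(nk) instead of O(n^2 k) via the identity
        sum_{i<j} <V_i, V_j> x_i x_j
            = 0.5 * (||V^T x||^2 - sum_i x_i^2 ||V_i||^2)."""
    linear = w0 + w @ x
    s = V.T @ x                                        # shape (k,)
    pair = 0.5 * (s @ s - np.sum((x ** 2)[:, None] * V ** 2))
    return linear + pair
```

The pairwise term is what lets FMs model feature interactions on sparse data: each feature i gets a k-dimensional embedding V[i], so interactions between features never observed together can still be estimated.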
