no code implementations • ICCV 2015 • Hongchang Gao, Feiping Nie, Xuelong Li, Heng Huang
In this paper, we propose a novel multi-view subspace clustering method.
no code implementations • 25 Sep 2019 • Hongchang Gao, Gang Wu, Ryan Rossi, Viswanathan Swaminathan, Heng Huang
Factorization Machines (FMs) are an important supervised learning approach due to their unique ability to capture feature interactions when dealing with high-dimensional sparse data.
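As a minimal illustration of the feature-interaction ability mentioned above, here is a generic sketch of a second-order FM's prediction rule (not this paper's method; the function name and shapes are illustrative), using the standard O(dk) identity for the pairwise term:

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order Factorization Machine prediction (generic sketch).

    x  : (d,) feature vector (typically sparse)
    w0 : global bias
    w  : (d,) linear weights
    V  : (d, k) factor matrix; the interaction weight for features
         i and j is the inner product <V[i], V[j]>.
    """
    linear = w0 + w @ x
    # O(d*k) identity for sum_{i<j} <V[i],V[j]> x_i x_j:
    #   0.5 * sum_f [ (sum_i V[i,f] x_i)^2 - sum_i V[i,f]^2 x_i^2 ]
    s = V.T @ x                  # (k,)
    s2 = (V ** 2).T @ (x ** 2)   # (k,)
    interactions = 0.5 * np.sum(s ** 2 - s2)
    return linear + interactions
```

Factorizing the interaction weights lets the model estimate interactions between feature pairs that never co-occur in the training data, which is why FMs work well on sparse inputs.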
no code implementations • 24 Aug 2020 • Hongchang Gao, Heng Huang
To the best of our knowledge, this is the first adaptive decentralized training approach.
no code implementations • 24 Aug 2020 • Hongchang Gao, Heng Huang
The condition for achieving the linear speedup is also provided for this variant.
no code implementations • 1 Jan 2021 • An Xu, Xiao Yan, Hongchang Gao, Heng Huang
The heavy communication for model synchronization is a major bottleneck for scaling up the distributed deep neural network training to many workers.
no code implementations • NeurIPS 2021 • Hongchang Gao, Heng Huang
The stochastic compositional optimization problem covers a wide range of machine learning models, such as sparse additive models and model-agnostic meta-learning.
no code implementations • 29 Sep 2021 • Wanli Shi, Hongchang Gao, Bin Gu
In this paper, to solve the nonconvex problem with a large number of white/black-box constraints, we propose a doubly stochastic zeroth-order gradient method (DSZOG).
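The zeroth-order setting means the gradient is approximated from function values alone. A common generic estimator (a sketch of Gaussian-smoothing finite differences, not DSZOG itself) looks like this:

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, n_samples=20, rng=None):
    """Two-point zeroth-order gradient estimate via Gaussian smoothing.

    Approximates grad f(x) using only function evaluations:
        g = (1/n) * sum_i [f(x + mu*u_i) - f(x)] / mu * u_i,
    with u_i ~ N(0, I). mu is the smoothing radius.
    """
    rng = rng or np.random.default_rng()
    d = x.shape[0]
    g = np.zeros(d)
    fx = f(x)
    for _ in range(n_samples):
        u = rng.standard_normal(d)
        g += (f(x + mu * u) - fx) / mu * u
    return g / n_samples
```

Estimators of this form are useful precisely when constraints or objectives are black boxes, at the cost of variance that grows with the dimension.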
no code implementations • 28 Apr 2022 • Hongchang Gao
In this paper, we study the federated bilevel optimization problem, which has widespread applications in machine learning.
no code implementations • 30 Jun 2022 • Hongchang Gao, Bin Gu, My T. Thai
Bilevel optimization has been applied to a wide variety of machine learning models, and numerous stochastic bilevel optimization algorithms have been developed in recent years.
no code implementations • 6 Dec 2022 • Hongchang Gao
Minimax optimization problems have attracted significant attention in recent years due to their widespread application in numerous machine learning models.
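For readers unfamiliar with the problem class, a minimax problem min_x max_y f(x, y) is often attacked with simultaneous gradient descent-ascent; a generic sketch (not this paper's method; the oracle names are illustrative):

```python
import numpy as np

def gda(grad_x, grad_y, x0, y0, eta_x=0.05, eta_y=0.05, n_iters=2000):
    """Simultaneous gradient descent-ascent for min_x max_y f(x, y)."""
    x, y = x0.copy(), y0.copy()
    for _ in range(n_iters):
        gx, gy = grad_x(x, y), grad_y(x, y)
        x = x - eta_x * gx  # descend in the minimization variable
        y = y + eta_y * gy  # ascend in the maximization variable
    return x, y
```

Plain GDA can cycle or diverge on hard (e.g. bilinear or nonconvex-nonconcave) instances, which is exactly why specialized minimax algorithms such as those studied in these papers are needed.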
no code implementations • 24 Apr 2023 • Yihan Zhang, Wenhao Jiang, Feng Zheng, Chiu C. Tan, Xinghua Shi, Hongchang Gao
This motivates us to study decentralized minimax optimization algorithms for the nonconvex-nonconcave problem.
no code implementations • 5 Jun 2023 • Hongchang Gao, My T. Thai, Jie Wu
Federated learning is a new learning paradigm for extracting knowledge from distributed data.
no code implementations • 6 Jun 2023 • Hongchang Gao
To the best of our knowledge, this is the first work that achieves the level-independent convergence rate under the decentralized setting.
no code implementations • 25 Jul 2023 • Hongchang Gao
In particular, our study shows that the standard gossip communication strategy cannot achieve linear speedup for decentralized compositional minimax problems due to the large consensus error of the inner-level function.
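The gossip strategy referred to above can be sketched in one line: each node averages its local variable with its neighbors' through a doubly stochastic mixing matrix, and the consensus error is the deviation of the local copies from their mean (a generic illustration, not this paper's communication scheme):

```python
import numpy as np

def gossip_round(X, W):
    """One gossip averaging round.

    X : (n_nodes, d) stacked local parameter vectors
    W : (n_nodes, n_nodes) doubly stochastic mixing matrix; node i
        replaces its vector with the W-weighted average of neighbors.
    """
    return W @ X

def consensus_error(X):
    """Deviation of the local copies from their network-wide mean."""
    return np.linalg.norm(X - X.mean(axis=0), ord="fro")
```

Repeated gossip rounds shrink the consensus error geometrically at a rate set by the second-largest eigenvalue of W; the point of the snippet above is that a compositional inner-level estimate must also reach consensus, which is the extra error source the paper analyzes.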
1 code implementation • ICCV 2023 • Dong Lu, Zhiqiang Wang, Teng Wang, Weili Guan, Hongchang Gao, Feng Zheng
Vision-language pre-training (VLP) models have shown vulnerability to adversarial examples in multimodal tasks.
no code implementations • 19 Nov 2023 • Yihan Zhang, My T. Thai, Jie Wu, Hongchang Gao
To the best of our knowledge, this is the first stochastic algorithm achieving these theoretical results under the heterogeneous setting.
no code implementations • ICML 2020 • Hongchang Gao, Heng Huang
To address the lack of gradient information in many applications, we propose two new stochastic zeroth-order Frank-Wolfe algorithms and theoretically prove that they achieve faster convergence rates than existing methods for non-convex problems.
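A zeroth-order Frank-Wolfe method combines a function-value-only gradient estimate with a projection-free linear minimization step. The following is a generic sketch over an l1 ball (not the paper's algorithms; constants and the constraint set are assumptions for illustration):

```python
import numpy as np

def zo_frank_wolfe(f, x0, radius=1.0, mu=1e-4, n_samples=50,
                   n_iters=500, rng=None):
    """Zeroth-order Frank-Wolfe sketch over an l1 ball of given radius.

    Each iteration estimates the gradient from function values only,
    then calls the l1 ball's linear minimization oracle, whose solution
    is a signed vertex on the coordinate of largest gradient magnitude.
    """
    rng = rng or np.random.default_rng()
    x = x0.copy()
    d = x.shape[0]
    for t in range(n_iters):
        # Gaussian-smoothing gradient estimate from function values only
        g = np.zeros(d)
        fx = f(x)
        for _ in range(n_samples):
            u = rng.standard_normal(d)
            g += (f(x + mu * u) - fx) / mu * u
        g /= n_samples
        # Linear minimization oracle for the l1 ball
        i = int(np.argmax(np.abs(g)))
        s = np.zeros(d)
        s[i] = -radius * np.sign(g[i])
        # Frank-Wolfe convex-combination step keeps the iterate feasible
        gamma = 2.0 / (t + 2)
        x = (1 - gamma) * x + gamma * s
    return x
```

Because the update is a convex combination of feasible points, no projection is ever needed, which is the appeal of Frank-Wolfe when the constraint set has a cheap linear oracle but an expensive projection.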