no code implementations • NeurIPS 2016 • Conghui Tan, Shiqian Ma, Yu-Hong Dai, Yuqiu Qian
One of the major issues in stochastic gradient descent (SGD) methods is how to choose an appropriate step size while running the algorithm.
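A classical ingredient in adaptive step-size schemes of this kind is the Barzilai-Borwein (BB) step size. The following is a minimal illustrative sketch of the BB1 step on plain (deterministic) gradient descent for a toy least-squares problem, not the paper's stochastic algorithm; the problem data and tolerances are made up for the example.

```python
import numpy as np

# Toy least-squares problem: minimize 0.5 * ||Ax - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)

def grad(x):
    return A.T @ (A @ x - b)  # gradient of 0.5 * ||Ax - b||^2

x_prev = np.zeros(10)
g_prev = grad(x_prev)
x = x_prev - 1e-3 * g_prev           # one small fixed step to initialize
for _ in range(200):
    g = grad(x)
    s, y = x - x_prev, g - g_prev    # BB quantities: iterate and gradient differences
    denom = s @ y
    if denom <= 1e-30:               # converged to machine precision
        break
    eta = (s @ s) / denom            # BB1 step size ||s||^2 / (s^T y)
    x_prev, g_prev = x, g
    x = x - eta * g

print(np.linalg.norm(grad(x)))      # gradient norm after BB iterations
```

The appeal of the BB step is that it adapts to local curvature (for a quadratic, `eta` always lies between the reciprocals of the largest and smallest Hessian eigenvalues) without any line search.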
no code implementations • 24 Feb 2019 • Wei Peng, Yu-Hong Dai, Hui Zhang, Li-Zhi Cheng
Training generative adversarial networks (GANs) often suffers from cyclic behaviors of iterates.
no code implementations • 2 Oct 2020 • Tengteng Yu, Xin-Wei Liu, Yu-Hong Dai, Jie Sun
In this paper, we propose a variable metric mini-batch proximal stochastic recursive gradient algorithm VM-mSRGBB, which updates the metric using a new diagonal BB stepsize.
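One common way to build a *diagonal* BB metric is to pick a diagonal matrix D minimizing ||D s - y||^2 over the iterate difference s and gradient difference y, with positivity safeguards. The sketch below illustrates that generic idea only; the exact VM-mSRGBB update rule is the paper's, and the safeguard bounds here are assumptions.

```python
import numpy as np

def diagonal_bb(s, y, lo=1e-4, hi=1e4):
    """Diagonal matrix (as a vector) approximating the secant condition D s = y.

    Minimizing ||D s - y||^2 over diagonal D gives d_i = y_i / s_i componentwise;
    clipping keeps the resulting metric positive definite and bounded.
    """
    d = np.where(np.abs(s) > 1e-12, y / s, 1.0)  # componentwise secant ratio
    return np.clip(d, lo, hi)

s = np.array([0.5, -0.2, 1.0])
y = np.array([1.0, -0.1, 3.0])
print(diagonal_bb(s, y))  # the ratios y_i / s_i, here [2.0, 0.5, 3.0]
```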
no code implementations • 21 Dec 2020 • Yu-Hong Dai, Jiani Wang, Liwei Zhang
While there have been many numerical algorithms for solving smooth convex-concave minimax problems, numerical algorithms for nonsmooth convex-concave minimax problems are very rare.
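For context, the generic convex-concave minimax problem referred to here has the form

```latex
\min_{x \in \mathcal{X}} \ \max_{y \in \mathcal{Y}} \ f(x, y),
```

where $f(\cdot, y)$ is convex for each fixed $y$ and $f(x, \cdot)$ is concave for each fixed $x$; "nonsmooth" means $f$ need not be differentiable in either variable. (This is the standard formulation; the paper's precise assumptions may differ.)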
Optimization and Control 90C30
no code implementations • 4 Feb 2021 • Wei-Kun Chen, Ya-Feng Liu, Yu-Hong Dai, Zhi-Quan Luo
In this paper, we consider the network slicing problem, which attempts to map multiple customized virtual network requests (also called services) onto a common shared network infrastructure and to allocate network resources to meet diverse service requirements. We propose an efficient two-stage algorithm for solving this NP-hard problem.
Networking and Internet Architecture • Information Theory • Signal Processing • Optimization and Control
no code implementations • 24 Nov 2022 • Zi Xu, Zi-Qi Wang, Jun-Lin Wang, Yu-Hong Dai
In this paper, we consider a class of nonconvex-nonconcave minimax problems, i.e., NC-PL minimax problems, whose objective functions satisfy the Polyak-Łojasiewicz (PL) condition with respect to the inner variable.
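The Polyak-Łojasiewicz (PL) condition with respect to the inner variable $y$ is usually stated as

```latex
\frac{1}{2}\,\bigl\| \nabla_y f(x, y) \bigr\|^2 \;\ge\; \mu \Bigl( \max_{y'} f(x, y') - f(x, y) \Bigr)
\quad \text{for some } \mu > 0,
```

i.e., the gradient norm in $y$ controls the suboptimality of the inner maximization. This is the standard form of the condition; the paper's exact assumptions may differ in detail.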
no code implementations • 9 Dec 2022 • Huiling Zhang, Junlin Wang, Zi Xu, Yu-Hong Dai
$\mathcal{O}\left( \varepsilon^{-4} \right)$) under nonconvex-strongly concave (resp.