Search Results for author: Guangzeng Xie

Found 14 papers, 0 papers with code

Lower Complexity Bounds for Finite-Sum Convex-Concave Minimax Optimization Problems

no code implementations · ICML 2020 · Guangzeng Xie, Luo Luo, Yijiang Lian, Zhihua Zhang

This paper studies lower complexity bounds for the minimax optimization problem whose objective function is the average of $n$ individual smooth convex-concave functions.

Near Optimal Stochastic Algorithms for Finite-Sum Unbalanced Convex-Concave Minimax Optimization

no code implementations · 3 Jun 2021 · Luo Luo, Guangzeng Xie, Tong Zhang, Zhihua Zhang

This paper considers stochastic first-order algorithms for convex-concave minimax problems of the form $\min_{\bf x}\max_{\bf y} f({\bf x}, {\bf y})$, where $f$ can be represented as the average of $n$ individual components that are $L$-average smooth.
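As a toy illustration of this finite-sum minimax setting (not the paper's near-optimal algorithm), plain stochastic gradient descent-ascent on a strongly convex-concave example looks like this; the objective and all constants below are made up for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 10, 5
A = rng.standard_normal((n, d, d)) * 0.1   # bilinear coupling of each component

# Made-up finite-sum objective:
# f(x, y) = (1/n) * sum_i [ 0.5*||x||^2 + x^T A_i y - 0.5*||y||^2 ]
def grad_i(i, x, y):
    gx = x + A[i] @ y        # gradient of component i in x
    gy = A[i].T @ x - y      # gradient of component i in y
    return gx, gy

x, y = np.ones(d), np.ones(d)
eta = 0.1
for _ in range(2000):
    i = rng.integers(n)      # sample one component per step
    gx, gy = grad_i(i, x, y)
    x = x - eta * gx         # descent in x
    y = y + eta * gy         # ascent in y

# The unique saddle point of this f is (0, 0), so both norms shrink toward 0.
print(np.linalg.norm(x), np.linalg.norm(y))
```

Here only one of the $n$ components is sampled per step, which is what makes the method stochastic; the noise vanishes at the saddle point, so a constant step size suffices in this toy case.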

Meta-Regularization: An Approach to Adaptive Choice of the Learning Rate in Gradient Descent

no code implementations · 12 Apr 2021 · Guangzeng Xie, Hao Jin, Dachao Lin, Zhihua Zhang

We propose \textit{Meta-Regularization}, a novel approach for the adaptive choice of the learning rate in first-order gradient descent methods.

DIPPA: An Improved Method for Bilinear Saddle Point Problems

no code implementations · 15 Mar 2021 · Guangzeng Xie, Yuze Han, Zhihua Zhang

This paper studies bilinear saddle point problems $\min_{\bf{x}} \max_{\bf{y}} g(\bf{x}) + \bf{x}^{\top} \bf{A} \bf{y} - h(\bf{y})$, where the functions $g, h$ are smooth and strongly-convex.
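For intuition, this problem structure can be instantiated with simple quadratic choices of $g$ and $h$ and attacked with plain gradient descent-ascent; the baseline below (which is not DIPPA itself) uses made-up constants:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
A = rng.standard_normal((d, d)) * 0.2    # bilinear coupling matrix
b, c = rng.standard_normal(d), rng.standard_normal(d)

# Made-up strongly convex choices:
# g(x) = 0.5*||x||^2 + b.x  and  h(y) = 0.5*||y||^2 + c.y
def grads(x, y):
    gx = x + b + A @ y       # gradient of g(x) + x^T A y in x
    gy = A.T @ x - y - c     # gradient of x^T A y - h(y) in y
    return gx, gy

x, y = np.zeros(d), np.zeros(d)
eta = 0.2
for _ in range(3000):
    gx, gy = grads(x, y)
    x, y = x - eta * gx, y + eta * gy

gx, gy = grads(x, y)
print(np.linalg.norm(gx) + np.linalg.norm(gy))  # ~0 at the saddle point
```

Because $g$ and $h$ are strongly convex and the coupling is weak here, even this naive scheme converges linearly; methods like the one in the paper aim at better rates when the strong-convexity and coupling constants are unbalanced.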

Optimal Quantization for Batch Normalization in Neural Network Deployments and Beyond

no code implementations · 30 Aug 2020 · Dachao Lin, Peiqin Sun, Guangzeng Xie, Shuchang Zhou, Zhihua Zhang

Quantized Neural Networks (QNNs) represent weight parameters and activations with low bit-width fixed-point numbers, and are widely used in real-world applications because they save computation resources and make results reproducible.

Quantization
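A minimal sketch of the batch-normalization folding and uniform quantization that such deployments rely on; the values, bit-width, and quantizer here are illustrative, not the paper's exact scheme:

```python
import numpy as np

# Fold batch normalization into the preceding layer's affine parameters:
# BN(w*x + b) = scale*(w*x + b - mean) + beta, with scale = gamma/sqrt(var+eps)
def fold_bn(w, b, gamma, beta, mean, var, eps=1e-5):
    scale = gamma / np.sqrt(var + eps)
    return w * scale, (b - mean) * scale + beta

# Symmetric uniform quantizer to `bits`-bit signed fixed point
def quantize(x, bits=8):
    qmax = 2 ** (bits - 1) - 1
    step = np.max(np.abs(x)) / qmax
    return np.round(x / step) * step

# Toy per-channel parameters (made up for the sketch)
w, b = np.array([1.7, -0.3]), np.array([0.05, 0.2])
gamma, beta = np.array([1.0, 0.5]), np.array([0.0, 0.1])
mean, var = np.array([0.1, -0.2]), np.array([1.0, 4.0])

wf, bf = fold_bn(w, b, gamma, beta, mean, var)
wq = quantize(wf)
print(np.max(np.abs(wq - wf)))  # error is bounded by half a quantization step
```

The design question the paper studies is how to choose the quantizer so that errors like `wq - wf` perturb the network's output as little as possible.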

A Novel Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization

no code implementations · 25 Sep 2019 · Guangzeng Xie, Luo Luo, Zhihua Zhang

This paper studies lower complexity bounds for optimization problems whose objective function is the average of $n$ individual smooth convex functions.

A Stochastic Proximal Point Algorithm for Saddle-Point Problems

no code implementations · 13 Sep 2019 · Luo Luo, Cheng Chen, Yu-Jun Li, Guangzeng Xie, Zhihua Zhang

We consider saddle point problems whose objective function is the average of $n$ strongly convex-concave individual components.
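For the special bilinear case $f({\bf x}, {\bf y}) = {\bf x}^{\top} {\bf A} {\bf y}$, a deterministic proximal point step has a closed form, which gives a compact picture of why proximal point methods suit saddle problems where plain descent-ascent fails. This is a simplified sketch with a fixed made-up matrix, not the paper's stochastic algorithm:

```python
import numpy as np

d = 4
A = np.diag([0.5, 1.0, 1.5, 2.0])   # fixed nonsingular matrix, for reproducibility

# One proximal point step for f(x, y) = x^T A y: solve
#   min_x max_y  x^T A y + (1/2η)||x - x_k||^2 - (1/2η)||y - y_k||^2
# whose optimality conditions give the closed form below.
def prox_step(x, y, eta):
    x_new = np.linalg.solve(np.eye(d) + eta**2 * A @ A.T, x - eta * A @ y)
    y_new = y + eta * A.T @ x_new
    return x_new, y_new

x, y = np.ones(d), np.ones(d)
for _ in range(200):
    x, y = prox_step(x, y, eta=2.0)

print(np.linalg.norm(x), np.linalg.norm(y))  # converges to the saddle point (0, 0)
```

On this purely bilinear $f$, explicit gradient descent-ascent spirals outward, while the implicit (proximal) step contracts every mode by $1/\sqrt{1+\eta^2\sigma^2}$ per iteration, where $\sigma$ ranges over the singular values of ${\bf A}$.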

A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization

no code implementations · 22 Aug 2019 · Guangzeng Xie, Luo Luo, Zhihua Zhang

This paper studies lower complexity bounds for optimization problems whose objective function is the average of $n$ individual smooth convex functions.

Hyper-Regularization: An Adaptive Choice for the Learning Rate in Gradient Descent

no code implementations · ICLR 2019 · Guangzeng Xie, Hao Jin, Dachao Lin, Zhihua Zhang

Specifically, we impose a regularization term on the learning rate via a generalized distance, and cast the joint update of the parameter and the learning rate as a max-min problem.
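As a loose analogue of jointly updating the parameter and the learning rate, here is a hypergradient-style rule on a toy quadratic; this is illustrative only and is not the paper's regularization scheme:

```python
import numpy as np

def grad(x):                       # gradient of f(x) = 0.5*||x||^2
    return x

x = np.full(3, 5.0)
eta, beta = 0.01, 0.001            # learning rate and its (made-up) meta step size
g_prev = np.zeros_like(x)
for _ in range(500):
    g = grad(x)
    eta += beta * (g @ g_prev)     # adapt the step size from successive gradients
    x = x - eta * g
    g_prev = g

print(np.linalg.norm(x))           # near the minimizer x* = 0
```

The step size grows while successive gradients agree and shrinks when they oppose each other, which is the same intuition as treating the learning rate itself as a variable to be updated alongside the parameters.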

Accelerated Value Iteration via Anderson Mixing

no code implementations · 27 Sep 2018 · YuJun Li, Chengzhuo Ni, Guangzeng Xie, Wenhao Yang, Shuchang Zhou, Zhihua Zhang

A2VI is more efficient than modified policy iteration, a classical approximate method for policy evaluation.

Atari Games · Q-Learning · +2
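The idea of Anderson mixing applied to the Bellman fixed point can be sketched on a small random policy-evaluation problem; the memory size, regularization, and MDP below are all made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
S, gamma = 6, 0.9
P = rng.random((S, S))
P /= P.sum(axis=1, keepdims=True)     # row-stochastic transition matrix
r = rng.random(S)                      # per-state rewards

def bellman(v):
    # policy-evaluation Bellman operator T(v) = r + gamma * P v
    return r + gamma * P @ v

def anderson_vi(v0, m=3, iters=50):
    # Anderson mixing: combine the last m+1 iterates with least-squares
    # coefficients chosen from the residuals T(v_k) - v_k
    vs, gs = [v0], [bellman(v0)]
    for _ in range(iters):
        mk = min(m, len(vs) - 1)
        F = np.column_stack([gs[-j - 1] - vs[-j - 1] for j in range(mk + 1)])
        ones = np.ones(mk + 1)
        # minimize ||F a|| subject to sum(a) = 1 via regularized normal equations
        a = np.linalg.solve(F.T @ F + 1e-10 * np.eye(mk + 1), ones)
        a /= a.sum()
        v = sum(a[j] * gs[-j - 1] for j in range(mk + 1))
        vs.append(v)
        gs.append(bellman(v))
    return vs[-1]

v_star = np.linalg.solve(np.eye(S) - gamma * P, r)  # exact fixed point
v = anderson_vi(np.zeros(S))
print(np.linalg.norm(v - v_star))
```

Plain value iteration contracts at rate $\gamma$ per step; the mixing step extrapolates through several past iterates at once, which is where the speedup over naive iteration comes from.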

Interpolatron: Interpolation or Extrapolation Schemes to Accelerate Optimization for Deep Neural Networks

no code implementations · 17 May 2018 · Guangzeng Xie, Yitan Wang, Shuchang Zhou, Zhihua Zhang

In this paper we explore acceleration techniques for large-scale nonconvex optimization problems, with a special focus on deep neural networks.
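A classic example of the extrapolation schemes such methods build on is Nesterov-style momentum, sketched here on a toy quadratic (this is not the Interpolatron algorithm itself):

```python
import numpy as np

D = np.diag([1.0, 10.0, 100.0])      # ill-conditioned quadratic f(x) = 0.5 x^T D x

def grad(x):
    return D @ x

x = np.array([1.0, 1.0, 1.0])
x_prev = x.copy()
eta, mom = 1.0 / 100.0, 0.9          # step size 1/L and momentum coefficient
for _ in range(500):
    z = x + mom * (x - x_prev)       # extrapolation (look-ahead) step
    x_prev, x = x, z - eta * grad(z)

print(np.linalg.norm(x))             # near the minimizer x* = 0
```

The extrapolation point `z` lies beyond the current iterate along the previous direction of travel; taking the gradient step from `z` rather than `x` is what accelerates convergence on ill-conditioned problems.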
