Search Results for author: Haochuan Li

Found 10 papers, 1 paper with code

Variance-reduced Clipping for Non-convex Optimization

1 code implementation • 2 Mar 2023 • Amirhossein Reisizadeh, Haochuan Li, Subhro Das, Ali Jadbabaie

This is in clear contrast to the well-established assumption in folklore non-convex optimization, a.k.a.

Language Modelling
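
The entry above concerns gradient clipping in non-convex optimization. For orientation, here is a minimal sketch of a generic clipped gradient descent step; it is an illustrative baseline only, not the variance-reduced algorithm from the paper, and the function names and hyperparameters are assumptions.

```python
import numpy as np

def clipped_gd_step(x, grad_fn, lr=0.1, clip_threshold=1.0):
    """One generic clipped-gradient step: if the gradient norm exceeds
    clip_threshold, rescale the gradient before the update."""
    g = grad_fn(x)
    norm = np.linalg.norm(g)
    if norm > clip_threshold:
        g = g * (clip_threshold / norm)
    return x - lr * g

# Toy usage on f(x) = 0.5 * ||x||^2, whose gradient is x.
x = np.array([10.0, -4.0])
for _ in range(100):
    x = clipped_gd_step(x, grad_fn=lambda z: z)
```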

Tight Analysis of Extra-gradient and Optimistic Gradient Methods For Nonconvex Minimax Problems

no code implementations • 17 Oct 2022 • Pouria Mahdavinia, Yuyang Deng, Haochuan Li, Mehrdad Mahdavi

Despite the established convergence theory of Optimistic Gradient Descent Ascent (OGDA) and Extragradient (EG) methods for convex-concave minimax problems, little is known about the theoretical guarantees of these methods in nonconvex settings.
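
For context on the methods named in the abstract, the sketch below runs plain (deterministic) extragradient iterations for min_x max_y f(x, y) on a toy bilinear objective; the objective and step size are illustrative assumptions, not the paper's setting.

```python
import numpy as np

def extragradient(grad_x, grad_y, x, y, lr=0.1, steps=1000):
    """Generic extragradient (EG) iterations for min_x max_y f(x, y):
    take a lookahead step, then update using gradients at the lookahead point."""
    for _ in range(steps):
        x_half = x - lr * grad_x(x, y)        # extrapolation (lookahead) step
        y_half = y + lr * grad_y(x, y)
        x = x - lr * grad_x(x_half, y_half)   # update from lookahead gradients
        y = y + lr * grad_y(x_half, y_half)
    return x, y

# Toy bilinear saddle problem f(x, y) = x * y: plain GDA cycles here,
# while EG converges to the saddle point (0, 0).
gx = lambda x, y: y   # df/dx
gy = lambda x, y: x   # df/dy
x_star, y_star = extragradient(gx, gy, x=1.0, y=1.0)
```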

On Convergence of Gradient Descent Ascent: A Tight Local Analysis

no code implementations • 3 Jul 2022 • Haochuan Li, Farzan Farnia, Subhro Das, Ali Jadbabaie

In this paper, we aim to bridge this gap by analyzing the local convergence of general nonconvex-nonconcave minimax problems.
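
For reference, gradient descent ascent (GDA), the algorithm analyzed in this line of work, updates both players simultaneously, possibly with different step sizes. The sketch below is a generic two-time-scale GDA loop on an assumed toy quadratic objective, not the paper's local-convergence analysis.

```python
import numpy as np

def gda(grad_x, grad_y, x, y, lr_x=0.05, lr_y=0.5, steps=2000):
    """Generic (two-time-scale) gradient descent ascent for min_x max_y f(x, y):
    descend in x and ascend in y at every iteration."""
    for _ in range(steps):
        gx, gy = grad_x(x, y), grad_y(x, y)
        x, y = x - lr_x * gx, y + lr_y * gy
    return x, y

# Toy objective f(x, y) = x**2 / 2 + x * y - y**2 / 2
# (convex in x, concave in y, saddle point at the origin).
grad_x = lambda x, y: x + y
grad_y = lambda x, y: x - y
x_star, y_star = gda(grad_x, grad_y, x=2.0, y=-1.0)
```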

Byzantine-Robust Federated Linear Bandits

no code implementations • 3 Apr 2022 • Ali Jadbabaie, Haochuan Li, Jian Qian, Yi Tian

In this paper, we study a linear bandit optimization problem in a federated setting where a large collection of distributed agents collaboratively learn a common linear bandit model.

Federated Learning
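
As background for the linear bandit model mentioned above, the sketch below is a standard single-agent LinUCB-style learner (ridge-regression estimate plus an optimistic confidence bonus); it is a generic baseline, not the Byzantine-robust federated protocol of the paper, and all names and parameters are assumptions.

```python
import numpy as np

class LinUCB:
    """Generic LinUCB agent for a linear bandit: maintain a ridge-regression
    estimate of the unknown reward parameter and pick arms optimistically."""

    def __init__(self, dim, reg=1.0, beta=1.0):
        self.A = reg * np.eye(dim)   # regularized Gram matrix
        self.b = np.zeros(dim)       # feature-weighted reward sum
        self.beta = beta             # width of the confidence bonus

    def choose(self, arms):
        """arms: array of shape (num_arms, dim); returns the optimistic arm index."""
        A_inv = np.linalg.inv(self.A)
        theta_hat = A_inv @ self.b
        bonus = np.sqrt(np.einsum("ai,ij,aj->a", arms, A_inv, arms))
        return int(np.argmax(arms @ theta_hat + self.beta * bonus))

    def update(self, arm, reward):
        self.A += np.outer(arm, arm)
        self.b += reward * arm
```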

Neural Network Weights Do Not Converge to Stationary Points: An Invariant Measure Perspective

no code implementations • 12 Oct 2021 • Jingzhao Zhang, Haochuan Li, Suvrit Sra, Ali Jadbabaie

This work examines the deep disconnect between existing theoretical analyses of gradient-based algorithms and the practice of training deep neural networks.

Complexity Lower Bounds for Nonconvex-Strongly-Concave Min-Max Optimization

no code implementations • NeurIPS 2021 • Haochuan Li, Yi Tian, Jingzhao Zhang, Ali Jadbabaie

We provide a first-order oracle complexity lower bound for finding stationary points of min-max optimization problems where the objective function is smooth, nonconvex in the minimization variable, and strongly concave in the maximization variable.
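
In this setting, stationarity is usually measured through the primal function; the standard convention (not necessarily the exact metric used in the paper) is

$$\Phi(x) := \max_{y} f(x, y), \qquad \|\nabla \Phi(x)\| \le \epsilon \quad \text{($x$ is an $\epsilon$-stationary point)},$$

where $\Phi$ is differentiable because $f$ is smooth and strongly concave in $y$.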

The Stellar Distribution Function and Local Vertical Potential from Gaia DR2

no code implementations • 18 Jan 2021 • Haochuan Li, Lawrence M. Widrow

We develop a novel method to simultaneously determine the vertical potential, force and stellar $z-v_z$ phase space distribution function (DF) in our local patch of the Galaxy.

Astrophysics of Galaxies

Convergence of Adversarial Training in Overparametrized Neural Networks

no code implementations • NeurIPS 2019 • Ruiqi Gao, Tianle Cai, Haochuan Li, Li-Wei Wang, Cho-Jui Hsieh, Jason D. Lee

Neural networks are vulnerable to adversarial examples, i.e., inputs that are imperceptibly perturbed from natural data and yet incorrectly classified by the network.
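
For background on what adversarial training refers to here, the sketch below is the common PGD-based adversarial training recipe (perturb each batch with projected gradient ascent on the loss, then train on the perturbed inputs). It assumes PyTorch, inputs scaled to [0, 1], and illustrative hyperparameters; it is not the overparametrized-network analysis of the paper.

```python
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8 / 255, alpha=2 / 255, steps=10):
    """L-infinity PGD attack: ascend the loss in small signed-gradient steps,
    projecting back into the eps-ball around the clean input after each step."""
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0.0, 1.0)
    for _ in range(steps):
        x_adv = x_adv.detach().requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        (grad,) = torch.autograd.grad(loss, x_adv)
        x_adv = x_adv + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0.0, 1.0)
    return x_adv.detach()

def adversarial_training_step(model, optimizer, x, y):
    """One adversarial training step: fit the model on PGD examples instead of clean ones."""
    x_adv = pgd_attack(model, x, y)
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```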

Gradient Descent Finds Global Minima of Deep Neural Networks

no code implementations • 9 Nov 2018 • Simon S. Du, Jason D. Lee, Haochuan Li, Li-Wei Wang, Xiyu Zhai

Gradient descent finds a global minimum in training deep neural networks despite the objective function being non-convex.

Randomness in Deconvolutional Networks for Visual Representation

no code implementations • 2 Apr 2017 • Kun He, Jingbo Wang, Haochuan Li, Yao Shu, Mengxiao Zhang, Man Zhu, Li-Wei Wang, John E. Hopcroft

Toward a deeper understanding of the inner workings of deep neural networks, we investigate CNNs (convolutional neural networks) using DCNs (deconvolutional networks) and randomization techniques, and gain new insights into the intrinsic properties of this network architecture.

General Classification
Image Reconstruction
