Search Results for author: Botao Li

Found 5 papers, 1 paper with code

The Probabilistic Stability of Stochastic Gradient Descent

no code implementations • 23 Mar 2023 • Liu Ziyin, Botao Li, Tomer Galanti, Masahito Ueda

A fundamental open problem in deep learning theory is how to define and understand the stability of stochastic gradient descent (SGD) close to a fixed point.

Learning Theory
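
The paper's question can be made concrete with a toy calculation (a sketch of my own, not the authors' code). Linearize SGD near a fixed point so that x_{t+1} = m_t · x_t with a random per-batch multiplier m_t; whether |x_t| → 0 with probability one is decided by the sign of the Lyapunov exponent E[log|m|], which can disagree with what the mean multiplier E[m] predicts:

```python
# Toy sketch (my own construction, not the authors' code): linearized SGD
# near a fixed point, x_{t+1} = m_t * x_t with a random batch multiplier.
# The fixed point attracts trajectories in probability iff E[log|m|] < 0,
# even when E[m] > 1 says the *average* dynamics escapes.
import math
import random

random.seed(0)

# Hypothetical two-valued multiplier: strong contraction with prob. 0.3,
# mild expansion otherwise.
VALUES, PROBS = (0.1, 1.5), (0.3, 0.7)

mean_m = sum(p * v for p, v in zip(PROBS, VALUES))                    # = 1.08 > 1
lyapunov = sum(p * math.log(abs(v)) for p, v in zip(PROBS, VALUES))   # ~ -0.41 < 0

x = 1.0
for _ in range(5_000):
    x *= VALUES[0] if random.random() < PROBS[0] else VALUES[1]

print(f"E[m] = {mean_m:.2f}, Lyapunov = {lyapunov:.2f}, |x_T| = {abs(x):.3g}")
# The mean dynamics expands, yet the negative Lyapunov exponent means the
# sampled trajectory collapses onto the fixed point.
```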

Exact Solutions of a Deep Linear Network

no code implementations • 10 Feb 2022 • Liu Ziyin, Botao Li, Xiangming Meng

This work derives an analytical expression for the global minima of a deep linear network with weight decay and stochastic neurons, a fundamental model for understanding the landscape of neural networks.
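
A minimal sketch of the model class in question (shapes, constants, and the dropout-style noise below are my own assumptions, not the paper's exact setup):

```python
# Minimal sketch of a deep linear network with weight decay and
# multiplicative (dropout-style) stochastic neurons; all constants
# here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
WIDTH, DEPTH = 4, 3
GAMMA, KEEP = 1e-3, 0.9          # weight-decay strength, neuron keep prob.

Ws = [rng.standard_normal((WIDTH, WIDTH)) / np.sqrt(WIDTH) for _ in range(DEPTH)]

def forward(x, train=True):
    """Compose the linear layers; in training, each neuron is kept with
    probability KEEP and rescaled (inverted dropout)."""
    h = x
    for W in Ws:
        h = W @ h
        if train:
            h *= rng.binomial(1, KEEP, size=h.shape) / KEEP
    return h

def objective(x, y):
    """One stochastic sample of the loss: squared error plus L2 decay."""
    err = forward(x) - y
    return 0.5 * float(err @ err) + 0.5 * GAMMA * sum(float(np.sum(W * W)) for W in Ws)

x, y = rng.standard_normal(WIDTH), rng.standard_normal(WIDTH)
print(objective(x, y))
```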

SGD Can Converge to Local Maxima

no code implementations • ICLR 2022 • Liu Ziyin, Botao Li, James B. Simon, Masahito Ueda

Stochastic gradient descent (SGD) is widely used for the nonlinear, nonconvex problem of training deep neural networks, but its behavior remains poorly understood.

SGD with a Constant Large Learning Rate Can Converge to Local Maxima

no code implementations • 25 Jul 2021 • Liu Ziyin, Botao Li, James B. Simon, Masahito Ueda

Previous works on stochastic gradient descent (SGD) often focus on its success.

Multithreaded event-chain Monte Carlo with local times

1 code implementation • 23 Apr 2020 • Botao Li, Synge Todo, A. C. Maggs, Werner Krauth

We present a multithreaded event-chain Monte Carlo (ECMC) algorithm for hard spheres.

Computational Physics • Soft Condensed Matter
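
The flavor of ECMC can be conveyed by a stripped-down, single-threaded sketch for hard rods on a one-dimensional ring (a toy of my own, not the paper's multithreaded hard-sphere code): a rod slides in one direction until it touches its neighbour, at which point the remaining displacement is "lifted" to that neighbour, until a fixed chain length is used up.

```python
# Single-threaded ECMC sketch for hard rods on a 1D ring (an illustrative
# toy, not the paper's multithreaded hard-sphere algorithm).
import random

random.seed(0)
N, SIGMA, L_RING = 10, 0.5, 20.0   # number of rods, rod length, ring length
CHAIN_LENGTH = 3.0                  # total displacement carried by one chain

# Left edges of the rods, evenly spaced to start (no overlaps).
x = [i * (L_RING / N) for i in range(N)]

def event_chain(x):
    """One event chain: rod i slides right until it touches rod i+1,
    then the leftover displacement is lifted to that neighbour."""
    i = random.randrange(N)
    budget = CHAIN_LENGTH
    while budget > 0.0:
        j = (i + 1) % N
        gap = (x[j] - x[i] - SIGMA) % L_RING   # free space ahead of rod i
        step = min(gap, budget)
        x[i] = (x[i] + step) % L_RING
        budget -= step
        i = j                                   # lifting move

for _ in range(10_000):
    event_chain(x)
print(sorted(round(xi, 2) for xi in x))
```

Every displacement is purely forward, so individual moves are irreversible; the chain construction nonetheless satisfies global balance, which is what allows ECMC to sample hard-sphere configurations without rejections.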
