Search Results for author: ChaeHwan Song

Found 3 papers, 0 papers with code

Subquadratic Overparameterization for Shallow Neural Networks

no code implementations · NeurIPS 2021 · ChaeHwan Song, Ali Ramezani-Kebrya, Thomas Pethick, Armin Eftekhari, Volkan Cevher

Overparameterization refers to the regime in which a neural network's width is chosen large enough that learning algorithms can provably attain zero loss despite the nonconvexity of training.
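For intuition only, here is a minimal sketch of what "overparameterized" means for a one-hidden-layer network, counting parameters against training samples. This is not the paper's construction or its subquadratic width bound; all sizes below are made-up illustrative values.

```python
n, d = 1000, 20   # training samples and input dimension (illustrative)
m = 256           # hidden width, the quantity such width bounds control

# A one-hidden-layer network x -> v^T sigma(W x) has m*d first-layer
# weights plus m output weights.
num_params = m * d + m
print(f"parameters: {num_params}, samples: {n}, "
      f"overparameterized: {num_params >= n}")
```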

Linear Convergence of SGD on Overparametrized Shallow Neural Networks

no code implementations · 29 Sep 2021 · Paul Rolland, Ali Ramezani-Kebrya, ChaeHwan Song, Fabian Latorre, Volkan Cevher

Despite the non-convex landscape, first-order methods can be shown to reach global minima when training overparameterized neural networks, where the number of parameters far exceeds the number of training samples.
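As a rough illustration of this setting (a sketch under assumed data and hyperparameters, not the paper's algorithm or its linear-rate guarantee), the following NumPy script trains an overparameterized one-hidden-layer ReLU network with plain mini-batch SGD on a small random regression problem; with width well above the sample count, the training loss typically drives toward zero.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, m = 100, 10, 2000            # samples, input dim, hidden width (m >> n)
lr, epochs, batch = 0.05, 500, 10  # assumed hyperparameters

X = rng.standard_normal((n, d)) / np.sqrt(d)
y = rng.standard_normal(n)         # arbitrary labels: an interpolation target

W = rng.standard_normal((m, d)) / np.sqrt(d)       # trainable first layer
v = rng.choice([-1.0, 1.0], size=m) / np.sqrt(m)   # fixed output layer

def forward(Xb):
    H = np.maximum(Xb @ W.T, 0.0)   # ReLU features, shape (b, m)
    return H, H @ v                 # predictions, shape (b,)

for epoch in range(epochs):
    perm = rng.permutation(n)
    for i in range(0, n, batch):
        idx = perm[i:i + batch]
        Xb, yb = X[idx], y[idx]
        H, pred = forward(Xb)
        resid = pred - yb
        # Gradient of 0.5 * mean squared error w.r.t. W only
        # (output layer v is kept fixed in this sketch).
        mask = (H > 0).astype(X.dtype)              # ReLU derivative
        G = ((resid[:, None] * v[None, :]) * mask).T @ Xb / len(idx)
        W -= lr * G
    if epoch % 100 == 0:
        _, pred_all = forward(X)
        print(epoch, 0.5 * np.mean((pred_all - y) ** 2))
```

Training only the first layer while freezing the output weights mirrors a common simplification in this line of work, but the exact setup here is an assumption for illustration rather than the paper's.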

Nearly Minimal Over-Parametrization of Shallow Neural Networks

no code implementations · 9 Oct 2019 · Armin Eftekhari, ChaeHwan Song, Volkan Cevher

A recent line of work has shown that an overparametrized neural network can perfectly fit its training data, even though doing so is otherwise an often intractable nonconvex optimization problem.
