Search Results for author: Charles G. Frye

Found 3 papers, 0 papers with code

Critical Point-Finding Methods Reveal Gradient-Flat Regions of Deep Network Losses

no code implementations · 23 Mar 2020 · Charles G. Frye, James Simon, Neha S. Wadia, Andrew Ligeralde, Michael R. DeWeese, Kristofer E. Bouchard

Despite the fact that the loss functions of deep neural networks are highly non-convex, gradient-based optimization algorithms converge to approximately the same performance from many random initial points.

Second-order methods

Critical Point Finding with Newton-MR by Analogy to Computing Square Roots

no code implementations · 12 Jun 2019 · Charles G. Frye

Understanding the behavior of algorithms for the optimization problem (hereafter shortened to OP) of minimizing a differentiable loss function (OP1) is enhanced by knowledge of the critical points of that loss function, i.e. the points where the gradient is 0.
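The analogy in the title can be sketched in a few lines: the same Newton iteration that computes a square root by root-finding on g(x) = x² − a can be applied to the gradient of a loss to locate a point where the gradient is 0. This is an illustrative 1-D sketch only, not the paper's Newton-MR implementation, and the example loss f(w) = w⁴ − 2w² is hypothetical.

```python
def newton_sqrt(a, x=1.0, steps=20):
    # Newton's method on g(x) = x**2 - a: x <- x - g(x)/g'(x).
    # The fixed point satisfies x**2 = a, i.e. x = sqrt(a).
    for _ in range(steps):
        x = x - (x**2 - a) / (2 * x)
    return x


def newton_critical_point(grad, hess, w=1.5, steps=20):
    # The same iteration applied to the gradient of a loss:
    # w <- w - grad(w)/hess(w). The fixed point satisfies grad(w) = 0,
    # i.e. it is a critical point (minimum, maximum, or saddle).
    for _ in range(steps):
        w = w - grad(w) / hess(w)
    return w


# Hypothetical example loss f(w) = w**4 - 2*w**2, for illustration only.
grad = lambda w: 4 * w**3 - 4 * w   # f'(w)
hess = lambda w: 12 * w**2 - 4      # f''(w)

print(newton_sqrt(2.0))                   # converges to sqrt(2) ~ 1.41421356
print(newton_critical_point(grad, hess))  # converges to w = 1.0, where f'(1) = 0
```

The full-scale method in the paper replaces the scalar division by the Hessian with a minimum-residual inner solve (Newton-MR), but the fixed-point logic is the same.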

Numerically Recovering the Critical Points of a Deep Linear Autoencoder

no code implementations · 29 Jan 2019 · Charles G. Frye, Neha S. Wadia, Michael R. DeWeese, Kristofer E. Bouchard

Numerically locating the critical points of non-convex surfaces is a long-standing problem central to many fields.
