1 code implementation • 13 Nov 2021 • S. Ilker Birbil, Ozgur Martin, Gonenc Onay, Figen Oztoprak
The stochastic gradient descent method and its variants constitute the core optimization algorithms that achieve good convergence rates for solving machine learning problems.
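The plain method behind these variants can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: the function name `sgd`, the least-squares example, and all parameter values are assumptions for demonstration.

```python
import numpy as np

def sgd(grad, w0, n, lr=0.1, epochs=200, seed=0):
    """Plain stochastic gradient descent: one sampled gradient per step.
    grad(w, i) returns the gradient of the i-th per-sample loss at w."""
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float).copy()
    for _ in range(epochs):
        for i in rng.permutation(n):  # one pass over shuffled samples
            w -= lr * grad(w, i)
    return w

# Illustrative least-squares problem: minimize sum_i (x_i^T w - y_i)^2 / 2,
# whose exact minimizer is w = (1, 2).
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 2.0, 3.0])
grad = lambda w, i: (X[i] @ w - y[i]) * X[i]
w = sgd(grad, np.zeros(2), n=len(X))
```

Because the per-sample gradients all vanish at the minimizer in this interpolation example, a constant stepsize already drives the iterates to the exact solution; in general, SGD with a constant stepsize only reaches a noise-dominated neighborhood.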
no code implementations • NeurIPS 2012 • Figen Oztoprak, Jorge Nocedal, Steven Rennie, Peder A. Olsen
The second approach, which we call the Orthant-Based Newton method, is a two-phase algorithm that first identifies an orthant face and then minimizes a smooth quadratic approximation of the objective function using the conjugate gradient method.
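The two phases described above can be sketched as follows. This is a hedged illustration under simplifying assumptions, not the paper's implementation: the smooth part is taken to be a quadratic (1/2) w^T A w - b^T w with an L1 term lam*||w||_1, and the names `orthant_newton`, `pseudo_grad`, and `cg` are hypothetical.

```python
import numpy as np

def pseudo_grad(w, g, lam):
    """Minimum-norm subgradient of L(w) + lam*||w||_1, with g = grad L(w)."""
    return np.where(w > 0, g + lam,
           np.where(w < 0, g - lam,
                    np.sign(g) * np.maximum(np.abs(g) - lam, 0.0)))

def cg(A, rhs, x0, iters=100, tol=1e-12):
    """Conjugate gradient for the SPD system A x = rhs."""
    x, r = x0.copy(), rhs - A @ x0
    p, rs = r.copy(), r @ r
    for _ in range(iters):
        if np.sqrt(rs) < tol:
            break
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def orthant_newton(A, b, lam, iters=20):
    """Two-phase sketch: pick an orthant face from the pseudo-gradient,
    then minimize the smooth quadratic on that face with CG and project."""
    w = np.zeros(len(b))
    for _ in range(iters):
        g = A @ w - b                      # gradient of the smooth part
        pg = pseudo_grad(w, g, lam)
        if np.linalg.norm(pg) < 1e-10:     # first-order optimality
            break
        # Phase 1: orthant signs -- keep sign of nonzero w; on zeros,
        # move against the pseudo-gradient.
        s = np.where(w != 0, np.sign(w), -np.sign(pg))
        active = s != 0
        # Phase 2: on the orthant, the model is (1/2) v^T A v - (b - lam*s)^T v,
        # so solve A v = b - lam*s over the active variables via CG.
        rhs = b - lam * s
        v = np.zeros(len(b))
        v[active] = cg(A[np.ix_(active, active)], rhs[active], w[active])
        # Project back onto the chosen orthant (zero out sign violations).
        w = np.where(s * v >= 0, v, 0.0)
    return w

# Illustrative separable problem: minimize w^T w - b^T w + ||w||_1,
# whose minimizer is the soft-thresholded b/2, i.e. (1, 0) here.
A = np.array([[2.0, 0.0], [0.0, 2.0]])
b = np.array([3.0, 0.5])
w = orthant_newton(A, b, lam=1.0)
```

For a general smooth loss, the quadratic would be a local model built from the Hessian (or a Hessian approximation) at the current iterate rather than a fixed matrix A.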