Search Results for author: Figen Oztoprak

Found 2 papers, 1 paper with code

Bolstering Stochastic Gradient Descent with Model Building

1 code implementation • 13 Nov 2021 • S. Ilker Birbil, Ozgur Martin, Gonenc Onay, Figen Oztoprak

The stochastic gradient descent method and its variants constitute the core optimization algorithms that achieve good convergence rates for solving machine learning problems.
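
The snippet below is a minimal, self-contained illustration of the plain SGD update this abstract refers to, applied to a synthetic least-squares problem; the loss, step size, and batch size are arbitrary choices for the example and are not details of the paper.

```python
# Minimal sketch of a vanilla SGD loop on a synthetic least-squares problem.
# The step size, batch size, and data are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))           # synthetic features
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=1000)

w = np.zeros(5)                          # parameters to learn
lr, batch_size = 0.05, 32

for step in range(500):
    idx = rng.integers(0, len(X), size=batch_size)            # sample a mini-batch
    grad = 2.0 / batch_size * X[idx].T @ (X[idx] @ w - y[idx])  # mini-batch gradient
    w -= lr * grad                                            # SGD update

print("parameter error:", np.linalg.norm(w - w_true))
```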

Newton-Like Methods for Sparse Inverse Covariance Estimation

no code implementations • NeurIPS 2012 • Figen Oztoprak, Jorge Nocedal, Steven Rennie, Peder A. Olsen

The second approach, which we call the Orthant-Based Newton method, is a two-phase algorithm that first identifies an orthant face and then minimizes a smooth quadratic approximation of the objective function using the conjugate gradient method.
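
The excerpt above describes a two-phase pattern; the sketch below illustrates that pattern on a generic l1-regularized quadratic (choose an orthant face from a minimum-norm subgradient, then apply conjugate gradient to the reduced smooth quadratic). The function name, test problem, and absence of a line search are assumptions for illustration only; this is not the authors' algorithm for sparse inverse covariance estimation.

```python
# Hedged sketch of a two-phase, orthant-based step for
#     min_x  0.5 * x^T A x - b^T x + lam * ||x||_1,
# illustrating the pattern in the excerpt above (identify an orthant face,
# then run conjugate gradient on a smooth quadratic model on that face).
# No line search or convergence safeguards; illustrative only.
import numpy as np
from scipy.sparse.linalg import cg

def orthant_newton_step(A, b, x, lam):
    g = A @ x - b                                    # gradient of the smooth part

    # Phase 1: minimum-norm subgradient picks the working orthant.
    pg = np.where(x > 0, g + lam, np.where(x < 0, g - lam, 0.0))
    z = x == 0
    pg[z & (g + lam < 0)] = (g + lam)[z & (g + lam < 0)]
    pg[z & (g - lam > 0)] = (g - lam)[z & (g - lam > 0)]
    sign = np.sign(x)
    sign[z] = -np.sign(pg[z])                        # move only where descent is possible
    free = sign != 0
    if not free.any():
        return x                                     # subgradient optimality reached

    # Phase 2: conjugate gradient on the reduced quadratic model,
    # then project back onto the chosen orthant face.
    rhs = -(g[free] + lam * sign[free])
    d, _ = cg(A[np.ix_(free, free)], rhs)
    x_new = x.copy()
    x_new[free] += d
    x_new[np.sign(x_new) != sign] = 0.0              # stay on the orthant face
    return x_new

# Tiny usage example on a random positive-definite quadratic.
rng = np.random.default_rng(1)
M = rng.normal(size=(8, 8))
A = M @ M.T + np.eye(8)
b = rng.normal(size=8)
x = np.zeros(8)
for _ in range(20):
    x = orthant_newton_step(A, b, x, lam=0.5)
print("sparse minimizer:", np.round(x, 3))
```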
