no code implementations • 19 Jul 2022 • Dewei Zhang, Sam Davanloo Tajbakhsh
For two-level composition optimization, we present a Riemannian Stochastic Composition Gradient Descent (R-SCGD) method that finds an approximate stationary point, with the expected squared norm of the Riemannian gradient smaller than $\epsilon$, in $O(\epsilon^{-2})$ calls to the stochastic gradient oracle of the outer function and the stochastic function and gradient oracles of the inner function.
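To illustrate the two-level composition setting, the sketch below runs a stochastic composition gradient loop on the unit sphere, a simple Riemannian manifold. It minimizes $F(x) = f(\mathbb{E}[g_\xi(x)])$ with toy choices $f(u) = \tfrac12\|u\|^2$ and a noisy linear inner map; the running average `u` tracking the inner function value, the step-size schedules, and the retraction are illustrative assumptions, not the authors' actual R-SCGD algorithm or parameter choices.

```python
import numpy as np

# Toy sketch of two-level stochastic composition optimization on the
# unit sphere. Minimizes F(x) = f(E[g_xi(x)]) with illustrative choices
# f(u) = 0.5 ||u||^2 and g_xi(x) = A x + noise; step sizes, averaging
# weights, and the retraction are assumptions for this demo only.

rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))

def g_sample(x):        # stochastic oracle for the inner function
    return A @ x + 0.01 * rng.standard_normal(d)

def grad_g_sample(x):   # stochastic Jacobian of the inner function
    return A            # noise omitted for simplicity

def grad_f(u):          # gradient of the outer f(u) = 0.5 ||u||^2
    return u

def retract(x):         # retraction onto the unit sphere
    return x / np.linalg.norm(x)

def proj_tangent(x, v): # project v onto the tangent space at x
    return v - (x @ v) * x

def obj(x):             # deterministic objective, for monitoring only
    return 0.5 * np.linalg.norm(A @ x) ** 2

x = retract(rng.standard_normal(d))
obj0 = obj(x)
u = g_sample(x)         # running estimate of the inner value g(x)
for t in range(1, 2001):
    beta = 1.0 / t**0.5     # averaging weight for the inner estimate
    eta = 0.5 / t**0.75     # decaying step size
    u = (1 - beta) * u + beta * g_sample(x)
    euclid_grad = grad_g_sample(x).T @ grad_f(u)  # chain rule at u
    riem_grad = proj_tangent(x, euclid_grad)      # Riemannian gradient
    x = retract(x - eta * riem_grad)              # step + retraction
```

The inner running average is the key composition-optimization ingredient: a single sample of $g_\xi(x)$ plugged directly into $\nabla f$ would give a biased gradient of the composition, so the estimate `u` is smoothed across iterations instead.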
no code implementations • 11 May 2016 • Sam Davanloo Tajbakhsh, Necdet Serhat Aybat, Enrique del Castillo
We present a new method for estimating multivariate, second-order stationary Gaussian Random Field (GRF) models based on the Sparse Precision matrix Selection (SPS) algorithm, proposed by Davanloo et al. (2015) for estimating scalar GRF models.
1 code implementation • 21 May 2014 • Sam Davanloo Tajbakhsh, Necdet Serhat Aybat, Enrique del Castillo
Iterative methods for fitting a Gaussian Random Field (GRF) model via maximum likelihood (ML) estimation require solving a nonconvex optimization problem.