Search Results for author: Necdet Serhat Aybat

Found 10 papers, 1 paper with code

A Variance-Reduced Stochastic Accelerated Primal Dual Algorithm

no code implementations • 19 Feb 2022 • Bugra Can, Mert Gurbuzbalaban, Necdet Serhat Aybat

In this work, we consider strongly convex strongly concave (SCSC) saddle point (SP) problems $\min_{x\in\mathbb{R}^{d_x}}\max_{y\in\mathbb{R}^{d_y}}f(x, y)$ where $f$ is $L$-smooth, $f(\cdot, y)$ is $\mu$-strongly convex for every $y$, and $f(x, \cdot)$ is $\mu$-strongly concave for every $x$.
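The SCSC setting above can be illustrated on a toy quadratic saddle-point problem. The sketch below uses plain stochastic gradient descent-ascent with a noisy first-order oracle, not the paper's variance-reduced accelerated primal-dual algorithm; the function $f$, noise level, and step size are all invented for illustration:

```python
import numpy as np

# Hypothetical toy SCSC instance (plain stochastic gradient descent-ascent,
# NOT the paper's variance-reduced accelerated method):
#   f(x, y) = (mu/2)||x||^2 + x^T B y - (mu/2)||y||^2,
# which is mu-strongly convex in x and mu-strongly concave in y,
# with unique saddle point (x*, y*) = (0, 0).
rng = np.random.default_rng(0)
mu, d = 1.0, 5
B = rng.standard_normal((d, d))

x, y = rng.standard_normal(d), rng.standard_normal(d)
eta = 0.05  # fixed step size, small relative to mu and ||B||

for _ in range(2000):
    # Noisy first-order oracle: exact gradients plus Gaussian noise.
    gx = mu * x + B @ y + 0.01 * rng.standard_normal(d)
    gy = B.T @ x - mu * y + 0.01 * rng.standard_normal(d)
    x = x - eta * gx   # descent step on the minimization variable
    y = y + eta * gy   # ascent step on the maximization variable

print(np.linalg.norm(x), np.linalg.norm(y))  # both near 0
```

With a sufficiently small step size, the iterates spiral into a small noise ball around the saddle point $(0, 0)$.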

A Universally Optimal Multistage Accelerated Stochastic Gradient Method

no code implementations • NeurIPS 2019 • Necdet Serhat Aybat, Alireza Fallah, Mert Gurbuzbalaban, Asuman Ozdaglar

We study the problem of minimizing a strongly convex, smooth function when we have noisy estimates of its gradient.
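A minimal sketch of this noisy-gradient setting, using plain SGD with stagewise step-size halving rather than the paper's universally optimal multistage method; the quadratic objective, noise level, and schedule are invented for illustration:

```python
import numpy as np

# Minimize the strongly convex quadratic f(x) = 0.5 x^T A x using only
# noisy gradient estimates g = A x + noise (NOT the paper's algorithm).
rng = np.random.default_rng(1)
d = 10
A = np.diag(np.linspace(1.0, 10.0, d))  # strongly convex: mu = 1, L = 10

x = rng.standard_normal(d)
eta, sigma = 0.05, 0.1
for stage in range(5):
    for _ in range(400):
        g = A @ x + sigma * rng.standard_normal(d)  # noisy gradient oracle
        x = x - eta * g
    eta *= 0.5  # halving the step each stage shrinks the noise ball

print(np.linalg.norm(x))  # near the minimizer x* = 0
```

The constant step size drives fast initial progress; halving it each stage trades that speed for a smaller stationary error caused by gradient noise.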

Efficient Optimization Algorithms for Robust Principal Component Analysis and Its Variants

no code implementations • 9 Jun 2018 • Shiqian Ma, Necdet Serhat Aybat

Robust PCA has drawn significant attention in the last decade due to its success in numerous application domains, ranging from bioinformatics, statistics, and machine learning to image and video processing in computer vision.
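The basic robust PCA model splits an observed matrix into a low-rank part plus a sparse part. A minimal sketch of that decomposition via alternating proximal steps on a penalized formulation; the weights `tau` and `lam` and the synthetic data are invented, and this is not one of the algorithms surveyed in the paper:

```python
import numpy as np

# Decompose M ~ L + S (low rank + sparse) by alternating exact proximal
# updates on (1/2)||M - L - S||_F^2 + tau*||L||_* + lam*||S||_1.
def svt(X, tau):
    """Singular value thresholding: prox of tau*||.||_* (nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(X, lam):
    """Entrywise soft thresholding: prox of lam*||.||_1."""
    return np.sign(X) * np.maximum(np.abs(X) - lam, 0.0)

rng = np.random.default_rng(2)
n, r = 50, 3
L_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank 3
S_true = 5.0 * rng.standard_normal((n, n)) * (rng.random((n, n)) < 0.05)
M = L_true + S_true

L = np.zeros_like(M)
S = np.zeros_like(M)
tau, lam = 1.0, 0.5
for _ in range(100):
    L = svt(M - S, tau)   # update the low-rank component
    S = soft(M - L, lam)  # update the sparse component

print(np.linalg.norm(M - L - S) / np.linalg.norm(M))  # small residual
```

Each block update is an exact minimization of the convex objective in one variable, so the objective decreases monotonically.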

Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions

no code implementations • 27 May 2018 • Necdet Serhat Aybat, Alireza Fallah, Mert Gurbuzbalaban, Asuman Ozdaglar

We study the trade-offs between convergence rate and robustness to gradient errors in designing a first-order algorithm.

A primal-dual method for conic constrained distributed optimization problems

no code implementations • NeurIPS 2016 • Necdet Serhat Aybat, Erfan Yazdandoost Hamedani

We consider cooperative multi-agent consensus optimization problems over an undirected network of agents, where only those agents connected by an edge can directly communicate.

Distributed Optimization
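The consensus setting above can be sketched with a generic decentralized scheme: gradient tracking over a ring graph, where each agent holds a private quadratic and exchanges information only with its two neighbors. This is a standard illustration with invented data, not the paper's conic-constrained primal-dual method:

```python
import numpy as np

# n agents on a ring minimize sum_i (x - a_i)^2; the consensus solution
# is mean(a). Each agent mixes only with its two graph neighbors.
n = 8
a = np.arange(n, dtype=float)      # agent i's private parameter a_i
x = a.copy()                       # each agent starts at its own minimizer

# Doubly stochastic mixing matrix for a ring graph.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

eta = 0.005
g = 2.0 * (x - a)                  # local gradients of (x_i - a_i)^2
y = g.copy()                       # tracker of the network-average gradient
for _ in range(4000):
    x_new = W @ x - eta * y        # mix with neighbors, step along tracker
    g_new = 2.0 * (x_new - a)
    y = W @ y + g_new - g          # update the gradient tracker
    x, g = x_new, g_new

print(x)  # every agent's iterate is near mean(a) = 3.5
```

The tracker `y` preserves the network-average gradient at every iteration, which is what lets a constant step size reach exact consensus instead of a biased neighborhood.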

Generalized Sparse Precision Matrix Selection for Fitting Multivariate Gaussian Random Fields to Large Data Sets

no code implementations • 11 May 2016 • Sam Davanloo Tajbakhsh, Necdet Serhat Aybat, Enrique del Castillo

We present a new method for estimating multivariate, second-order stationary Gaussian Random Field (GRF) models based on the Sparse Precision matrix Selection (SPS) algorithm, proposed by Davanloo et al. (2015) for estimating scalar GRF models.

An Asynchronous Distributed Proximal Gradient Method for Composite Convex Optimization

no code implementations • 30 Sep 2014 • Necdet Serhat Aybat, Garud Iyengar, Zi Wang

We propose a distributed first-order augmented Lagrangian (DFAL) algorithm to minimize the sum of composite convex functions, where each term in the sum is a private cost function belonging to a node, and only nodes connected by an edge can directly communicate with each other.

Optimization and Control
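The "composite convex" structure above (a smooth term plus a nonsmooth term) is the setting handled by proximal gradient methods. A minimal centralized sketch, ISTA on a least-squares-plus-$\ell_1$ problem with invented data; it illustrates the composite structure only and is not the DFAL algorithm:

```python
import numpy as np

# ISTA on the composite problem 0.5*||Ax - b||^2 + lam*||x||_1:
# a gradient step on the smooth term, then the prox of the l1 term.
rng = np.random.default_rng(3)
m, n = 40, 20
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:3] = [2.0, -1.5, 1.0]                 # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(m)

lam = 0.5
eta = 1.0 / np.linalg.norm(A, 2) ** 2         # step 1/L for the smooth part

x = np.zeros(n)
for _ in range(500):
    g = A.T @ (A @ x - b)                     # gradient of the smooth term
    z = x - eta * g
    x = np.sign(z) * np.maximum(np.abs(z) - eta * lam, 0.0)  # l1 prox

print(x[:3])  # approximately recovers the sparse coefficients
```

Each node in a distributed variant would apply steps of this form to its private term while coordinating with neighbors; here everything is kept on one machine for brevity.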

On the Theoretical Guarantees for Parameter Estimation of Gaussian Random Field Models: A Sparse Precision Matrix Approach

1 code implementation • 21 May 2014 • Sam Davanloo Tajbakhsh, Necdet Serhat Aybat, Enrique del Castillo

Iterative methods for fitting a Gaussian Random Field (GRF) model via maximum likelihood (ML) estimation require solving a nonconvex optimization problem.

Efficient Algorithms for Robust and Stable Principal Component Pursuit Problems

no code implementations • 26 Sep 2013 • Necdet Serhat Aybat, Donald Goldfarb, Shiqian Ma

Moreover, if the observed data matrix has also been corrupted by a dense noise matrix in addition to gross sparse errors, then the stable principal component pursuit (SPCP) problem is solved to recover the low-rank matrix.

Optimization and Control

Fast First-Order Methods for Stable Principal Component Pursuit

no code implementations • 11 May 2011 • Necdet Serhat Aybat, Donald Goldfarb, Garud Iyengar

The stable principal component pursuit (SPCP) problem is a non-smooth convex optimization problem whose solution has been shown, both in theory and in practice, to recover the low-rank and sparse components of a matrix whose elements have been corrupted by Gaussian noise.

Optimization and Control
