Search Results for author: Bugra Can

Found 5 papers, 1 paper with code

A Variance-Reduced Stochastic Accelerated Primal Dual Algorithm

no code implementations • 19 Feb 2022 • Bugra Can, Mert Gurbuzbalaban, Necdet Serhat Aybat

In this work, we consider strongly convex strongly concave (SCSC) saddle point (SP) problems $\min_{x\in\mathbb{R}^{d_x}}\max_{y\in\mathbb{R}^{d_y}}f(x, y)$ where $f$ is $L$-smooth, $f(\cdot, y)$ is $\mu$-strongly convex for every $y$, and $f(x,\cdot)$ is $\mu$-strongly concave for every $x$.
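
For illustration only, below is a minimal stochastic gradient descent-ascent sketch on a toy SCSC problem of this form; the quadratic-plus-bilinear objective, noise level, and step size are assumptions made for the example, and this is not the variance-reduced accelerated primal-dual algorithm proposed in the paper.

```python
import numpy as np

# Toy SCSC saddle point: f(x, y) = (mu/2)||x||^2 + x^T A y - (mu/2)||y||^2.
# Plain stochastic gradient descent-ascent, as an illustrative baseline only.
rng = np.random.default_rng(0)
d_x, d_y, mu, eta = 5, 3, 1.0, 0.05
A = rng.standard_normal((d_x, d_y))
x, y = rng.standard_normal(d_x), rng.standard_normal(d_y)

for _ in range(2000):
    noise_x = 0.01 * rng.standard_normal(d_x)   # stochastic gradient noise
    noise_y = 0.01 * rng.standard_normal(d_y)
    grad_x = mu * x + A @ y + noise_x           # grad_x f(x, y)
    grad_y = A.T @ x - mu * y + noise_y         # grad_y f(x, y)
    x -= eta * grad_x                           # descent step in x
    y += eta * grad_y                           # ascent step in y

print(np.linalg.norm(x), np.linalg.norm(y))     # both shrink toward the saddle at 0, up to noise
```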

TENGraD: Time-Efficient Natural Gradient Descent with Exact Fisher-Block Inversion

1 code implementation • 7 Jun 2021 • Saeed Soori, Bugra Can, Baourun Mu, Mert Gürbüzbalaban, Maryam Mehri Dehnavi

This work proposes a time-efficient Natural Gradient Descent method, called TENGraD, with linear convergence guarantees.

Image Classification
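
As a rough sketch of the general idea behind natural gradient descent (not TENGraD's exact Fisher-block inversion), the following applies a damped empirical-Fisher update to a tiny logistic regression; the data, damping, and learning rate are assumptions made for the example.

```python
import numpy as np

# Damped natural gradient descent on a small logistic regression, using an
# empirical Fisher approximation; illustrative only, not TENGraD's block-wise
# exact Fisher inversion.
rng = np.random.default_rng(0)
n, d, lr, damping = 200, 4, 0.5, 1e-3
X = rng.standard_normal((n, d))
w_true = np.array([1.0, -2.0, 0.5, 0.0])
y = ((X @ w_true + rng.standard_normal(n)) > 0).astype(float)  # noisy labels
w = np.zeros(d)

for _ in range(50):
    p = 1.0 / (1.0 + np.exp(-X @ w))          # predicted probabilities
    grad = X.T @ (p - y) / n                  # average gradient of the log loss
    G = X * (p - y)[:, None]                  # per-example gradients, one per row
    F = G.T @ G / n + damping * np.eye(d)     # damped empirical Fisher
    w -= lr * np.linalg.solve(F, grad)        # natural gradient update

print(w)
```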

IDEAL: Inexact DEcentralized Accelerated Augmented Lagrangian Method

no code implementations NeurIPS 2020 Yossi Arjevani, Joan Bruna, Bugra Can, Mert Gürbüzbalaban, Stefanie Jegelka, Hongzhou Lin

We introduce a framework for designing primal methods under the decentralized optimization setting where local functions are smooth and strongly convex.

ASYNC: A Cloud Engine with Asynchrony and History for Distributed Machine Learning

no code implementations • 19 Jul 2019 • Saeed Soori, Bugra Can, Mert Gurbuzbalaban, Maryam Mehri Dehnavi

ASYNC is a framework that supports the implementation of asynchrony and history for optimization methods on distributed computing platforms.

BIG-bench Machine Learning • Distributed Computing

Accelerated Linear Convergence of Stochastic Momentum Methods in Wasserstein Distances

no code implementations • 22 Jan 2019 • Bugra Can, Mert Gurbuzbalaban, Lingjiong Zhu

In the special case of strongly convex quadratic objectives, we can show accelerated linear rates in the $p$-Wasserstein metric for any $p\geq 1$ with improved sensitivity to noise for both AG and HB through a non-asymptotic analysis under some additional assumptions on the noise structure.
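
A small sketch of the heavy-ball (HB) recursion on a noisy strongly convex quadratic, just to show the iteration being analyzed; the step size, momentum parameter, and noise model below are illustrative textbook choices, not the tuned values or noise structure from the paper.

```python
import numpy as np

# Heavy-ball iterates x_{k+1} = x_k - alpha * grad + beta * (x_k - x_{k-1})
# on f(x) = 0.5 x^T Q x with additive gradient noise (illustration only).
rng = np.random.default_rng(0)
Q = np.diag([1.0, 10.0])                              # mu = 1, L = 10
mu, L = 1.0, 10.0
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2         # standard HB step size for quadratics
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2  # momentum

x_prev = x = np.array([5.0, -3.0])
for _ in range(200):
    grad = Q @ x + 0.01 * rng.standard_normal(2)      # noisy gradient oracle
    x, x_prev = x - alpha * grad + beta * (x - x_prev), x

print(np.linalg.norm(x))                              # stays near the minimizer at 0, up to noise
```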
