no code implementations • 19 Apr 2013 • Rachael Tappenden, Peter Richtárik, Jacek Gondzio
In this paper we consider the problem of minimizing a convex function using a randomized block coordinate descent method.
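A minimal numerical sketch of the general idea (an illustrative randomized block coordinate descent loop on a simple convex quadratic, not the paper's specific method; the matrix, block size, and step rule below are assumptions for demonstration):

```python
import numpy as np

# Illustrative sketch: randomized block coordinate descent on the convex
# quadratic f(x) = 0.5 * x^T A x - b^T x. At each iteration a block of
# coordinates is chosen uniformly at random and updated by a gradient step
# restricted to that block.
rng = np.random.default_rng(0)
n, block = 8, 2                      # 8 variables split into blocks of 2 (assumed sizes)
A = 2.0 * np.eye(n)                  # simple SPD matrix so f is strongly convex
b = rng.standard_normal(n)
x = np.zeros(n)

L = 2.0                              # per-block Lipschitz constant of the gradient
for _ in range(200):
    i = rng.integers(n // block)     # pick one block uniformly at random
    idx = slice(i * block, (i + 1) * block)
    grad_block = (A @ x - b)[idx]    # partial gradient for the chosen block only
    x[idx] -= grad_block / L         # block gradient step with step size 1/L

x_star = np.linalg.solve(A, b)       # closed-form minimizer, for comparison
print(np.allclose(x, x_star, atol=1e-6))
```

Because each iteration touches only one block's partial gradient, the per-iteration cost is a fraction of a full gradient step, which is the usual motivation for block coordinate methods on large problems.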
no code implementations • 30 Aug 2013 • Rachael Tappenden, Peter Richtárik, Burak Büke
In this paper we study decomposition methods based on separable approximations for minimizing the augmented Lagrangian.
no code implementations • 8 Jun 2015 • Chenxin Ma, Rachael Tappenden, Martin Takáč
We show that the well-known SDCA algorithm for optimizing the SVM dual problem, as well as the stochastic coordinate descent method for the LASSO problem, fits into the framework of RC-FDM.
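As a concrete instance of one of the two methods named above, here is a hedged sketch of stochastic (randomized) coordinate descent for the LASSO problem, min_x 0.5‖Ax − b‖² + λ‖x‖₁ (the data, λ, and iteration count are illustrative assumptions; this is the textbook coordinate update, not a description of RC-FDM itself):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * |.| (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Randomized coordinate descent for min_x 0.5*||A x - b||^2 + lam*||x||_1.
rng = np.random.default_rng(1)
m, n, lam = 20, 5, 0.1
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
col_norms = (A ** 2).sum(axis=0)     # per-coordinate Lipschitz constants ||A_j||^2

x = np.zeros(n)
r = A @ x - b                        # maintain the residual A x - b incrementally
for _ in range(2000):
    j = rng.integers(n)              # pick a coordinate uniformly at random
    g = A[:, j] @ r                  # partial derivative of the smooth part
    x_new = soft_threshold(x[j] - g / col_norms[j], lam / col_norms[j])
    r += A[:, j] * (x_new - x[j])    # cheap residual update for one coordinate
    x[j] = x_new
```

Because the ℓ₁ term is separable across coordinates, each update is an exact minimization along the chosen coordinate, which is what makes LASSO a natural fit for coordinate-descent-type frameworks.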
no code implementations • 10 Oct 2017 • Majid Jahani, Naga Venkata C. Gudapati, Chenxin Ma, Rachael Tappenden, Martin Takáč
In this work we introduce the concept of an Underestimate Sequence (UES), which is motivated by Nesterov's estimate sequence.
no code implementations • 6 Jun 2020 • Majid Jahani, MohammadReza Nazari, Rachael Tappenden, Albert S. Berahas, Martin Takáč
This work presents a new algorithm for empirical risk minimization.
no code implementations • 1 Jun 2022 • Abdurakhmon Sadiev, Aleksandr Beznosikov, Abdulla Jasem Almansoori, Dmitry Kamzolov, Rachael Tappenden, Martin Takáč
Several algorithms exist for such problems, but they often perform poorly when the problem is badly scaled and/or ill-conditioned; a primary goal of this work is to introduce methods that alleviate this issue.
no code implementations • 2 Nov 2022 • Rachael Tappenden, Martin Takáč
This work shows that applying Gradient Descent (GD) with a fixed step size to minimize a (possibly nonconvex) quadratic function is equivalent to running the Power Method (PM) on the gradients.
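The equivalence is easy to verify numerically: for f(x) = ½xᵀAx − bᵀx, the GD iterate x_{k+1} = x_k − αg_k gives g_{k+1} = Ax_{k+1} − b = (I − αA)g_k, so the gradients undergo power iteration with the matrix I − αA. A small check (the spectrum and step size below are illustrative assumptions):

```python
import numpy as np

# Verify: gradients of GD on a quadratic satisfy g_{k+1} = (I - alpha*A) g_k,
# i.e. they are Power Method iterates for the matrix M = I - alpha*A.
rng = np.random.default_rng(2)
n, alpha = 5, 0.1
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag([1.0, 2.0, 3.0, 4.0, 9.0]) @ Q.T   # SPD with known eigenvalues
b = rng.standard_normal(n)

x = rng.standard_normal(n)
g = A @ x - b
grads = [g]
for _ in range(100):
    x = x - alpha * g                # plain gradient descent with fixed step
    g = A @ x - b
    grads.append(g)

M = np.eye(n) - alpha * A
print(np.allclose(grads[1], M @ grads[0]))         # the one-step recursion holds

# As in the Power Method, the normalized gradient aligns with the eigenvector
# of M with the largest |eigenvalue|, here 1 - 0.1*1.0 = 0.9, i.e. the
# eigenvector of A for lambda = 1.0.
v = grads[-1] / np.linalg.norm(grads[-1])
print(np.allclose(A @ v, 1.0 * v, atol=1e-4))
```

The one-step identity holds exactly at every iteration; the alignment with an extreme eigenvector emerges asymptotically, exactly as in classical power iteration.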