1 code implementation • NeurIPS 2023 • Mohammad Mozaffari, Sikan Li, Zhao Zhang, Maryam Mehri Dehnavi
This work proposes a Momentum-Enabled Kronecker-Factor-Based Optimizer Using Rank-1 updates, called MKOR, that improves the training time and convergence properties of deep neural networks (DNNs).
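The abstract mentions maintaining Kronecker factors through rank-1 updates. A standard way to keep an inverse matrix current under a rank-1 change, at O(n^2) cost instead of O(n^3) re-inversion, is the Sherman-Morrison formula; the sketch below illustrates that generic identity (the function name and setup are illustrative, not MKOR's actual code):

```python
import numpy as np

def sherman_morrison_update(A_inv, u, v):
    """Return (A + u v^T)^{-1} given A^{-1}, via Sherman-Morrison.

    Costs O(n^2) per update, versus O(n^3) to re-invert from scratch,
    which is the kind of saving rank-1-update optimizers rely on.
    """
    Au = A_inv @ u                      # A^{-1} u
    vA = v @ A_inv                      # v^T A^{-1}
    denom = 1.0 + v @ Au                # 1 + v^T A^{-1} u
    return A_inv - np.outer(Au, vA) / denom
```

A quick check is to compare the updated inverse against directly inverting the perturbed matrix; the two agree to numerical precision.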
1 code implementation • 7 Jun 2021 • Saeed Soori, Bugra Can, Baourun Mu, Mert Gürbüzbalaban, Maryam Mehri Dehnavi
This work proposes a time-efficient Natural Gradient Descent method, called TENGraD, with linear convergence guarantees.
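Natural gradient descent preconditions the gradient with the inverse Fisher information matrix, w ← w − η F⁻¹∇L(w). The toy sketch below shows that step on a quadratic loss where the curvature matrix stands in for the Fisher matrix; it is a generic illustration of NGD, not TENGraD's block-wise Fisher inversion:

```python
import numpy as np

# Quadratic loss L(w) = 0.5 * w^T H w with ill-conditioned curvature H.
# Here H stands in for the Fisher matrix; TENGraD approximates and
# inverts Fisher blocks far more cheaply than this dense inverse.
H = np.array([[4.0, 0.0], [0.0, 0.25]])
F_inv = np.linalg.inv(H)        # preconditioner: inverse "Fisher"

w = np.array([1.0, 1.0])
lr = 1.0
grad = H @ w                    # gradient of the quadratic loss
w = w - lr * (F_inv @ grad)     # natural gradient step
```

With exact curvature information, a single unit-step natural gradient update reaches the minimizer w = 0, whereas plain gradient descent on this ill-conditioned problem would need many iterations.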
no code implementations • 19 Jul 2019 • Saeed Soori, Bugra Can, Mert Gürbüzbalaban, Maryam Mehri Dehnavi
ASYNC is a framework that supports the implementation of asynchrony and history for optimization methods on distributed computing platforms.
no code implementations • 24 Oct 2017 • Saeed Soori, Aditya Devarakonda, James Demmel, Mert Gurbuzbalaban, Maryam Mehri Dehnavi
We formulate the algorithm for two different optimization methods on the Lasso problem and show that the latency cost is reduced by a factor of k while bandwidth and floating-point operation costs remain the same.
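The latency claim follows from a simple count: a communication-avoiding scheme exchanges one (k-times larger) message per k local iterations instead of one message per iteration, so the number of messages (the latency cost) drops by a factor of k while the total words moved (bandwidth) and flop count are unchanged. A minimal arithmetic sketch, with illustrative numbers:

```python
# Illustrative message counts for T iterations with unrolling factor k.
# Values are hypothetical; only the factor-of-k relationship matters.
T, k = 120, 8
words_per_msg = 1000

baseline_msgs = T                     # one message per iteration
ca_msgs = T // k                      # one message per k iterations

baseline_words = baseline_msgs * words_per_msg
ca_words = ca_msgs * (k * words_per_msg)   # messages are k times larger

assert baseline_msgs == k * ca_msgs        # latency reduced by factor k
assert baseline_words == ca_words          # bandwidth cost unchanged
```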