no code implementations • 12 Feb 2024 • Yuepeng Yang, Antares Chen, Lorenzo Orecchia, Cong Ma
On the analytical front, we provide a refined $\ell_\infty$ error analysis of the weighted MLE that is more explicit and tighter than existing analyses.
no code implementations • 29 Jun 2019 • Jelena Diakonikolas, Lorenzo Orecchia
This note provides a novel, simple analysis of the method of conjugate gradients for the minimization of convex quadratic functions.
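The method analyzed here is the classical conjugate gradient iteration; as a point of reference (this is the textbook algorithm, not the note's new analysis), a minimal sketch for minimizing $f(x) = \frac{1}{2}x^\top A x - b^\top x$ with symmetric positive-definite $A$:

```python
import numpy as np

def conjugate_gradients(A, b, tol=1e-10, max_iter=None):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive-definite A,
    equivalently solve A x = b, by the method of conjugate gradients."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x              # residual = -gradient of f at x
    p = r.copy()               # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)  # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        # next direction is A-conjugate to all previous ones
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

In exact arithmetic the iterates reach the minimizer in at most $n$ steps, which is the behavior the refined analyses of this method explain.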
no code implementations • ICML 2018 • Jelena Diakonikolas, Lorenzo Orecchia
While various block-coordinate-descent-type methods have been studied extensively, only alternating minimization, which applies only to the setting of two blocks, is known to have a convergence time that scales independently of the smoothness of the least smooth block.
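For concreteness, a minimal sketch of two-block alternating minimization on an illustrative jointly convex problem, $\min_{x,y} \|Ax + By - c\|^2$, where each step exactly minimizes over one block with the other held fixed (the problem and variable names are chosen for illustration, not taken from the paper):

```python
import numpy as np

def alternating_minimization(A, B, c, iters=500):
    """Minimize f(x, y) = ||A x + B y - c||^2 over two blocks by
    exact minimization over each block in turn (alternating least squares)."""
    x = np.zeros(A.shape[1])
    y = np.zeros(B.shape[1])
    for _ in range(iters):
        # exact minimization over x with y fixed
        x, *_ = np.linalg.lstsq(A, c - B @ y, rcond=None)
        # exact minimization over y with x fixed
        y, *_ = np.linalg.lstsq(B, c - A @ x, rcond=None)
    return x, y
```

Because each block update is an exact minimization, the step size never depends on the smoothness constant of the other block, which is the property singled out above.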
no code implementations • ICML 2017 • Cem Aksoylar, Lorenzo Orecchia, Venkatesh Saligrama
We propose a novel, computationally efficient mirror-descent based optimization framework for subgraph detection in graph-structured data.
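As background for the mirror-descent machinery (a generic sketch, not the paper's subgraph-detection algorithm), mirror descent over the probability simplex with the entropy mirror map reduces to the multiplicative-weights update:

```python
import numpy as np

def entropic_mirror_descent(grad, n, steps=500, eta=0.1):
    """Mirror descent on the probability simplex with the entropy mirror map.
    The update x <- x * exp(-eta * grad(x)), renormalized, is the
    multiplicative-weights rule; `grad` returns a (sub)gradient at x."""
    x = np.full(n, 1.0 / n)        # start at the uniform distribution
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))
        x /= x.sum()               # Bregman projection back onto the simplex
    return x
```

For a linear objective $\langle c, x\rangle$ the iterates concentrate on the coordinate minimizing $c$; the appeal of such mirror-map updates is that they stay inside the constraint set cheaply, which is what makes them attractive for structured domains like graphs.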
no code implementations • 16 Jun 2015 • Zeyuan Allen-Zhu, Zhenyu Liao, Lorenzo Orecchia
In this paper, we provide a novel construction of the linear-sized spectral sparsifiers of Batson, Spielman and Srivastava [BSS14].
no code implementations • 6 Jul 2014 • Zeyuan Allen-Zhu, Lorenzo Orecchia
First-order methods play a central role in large-scale machine learning.
no code implementations • 10 Jul 2013 • Lorenzo Orecchia, Zeyuan Allen Zhu
A very elegant algorithm for this problem was given by Andersen and Lang [AL08]; it requires solving a small number of single-commodity maximum-flow computations over the whole graph G. In this paper, we introduce LocalImprove, the first cut-improvement algorithm that is local, i.e., one whose running time depends on the size of the input set A rather than on the size of the entire graph.