no code implementations • 20 Oct 2023 • Etienne Bamas, Sai Ganesh Nagarajan, Ola Svensson
For any $\alpha>2$, we show that $D^\alpha$ seeding guarantees in expectation an approximation factor of $$ O_\alpha \left((g_\alpha)^{2/\alpha}\cdot \left(\frac{\sigma_{\mathrm{max}}}{\sigma_{\mathrm{min}}}\right)^{2-4/\alpha}\cdot (\min\{\ell,\log k\})^{2/\alpha}\right)$$ with respect to the standard $k$-means cost of any underlying clustering; where $g_\alpha$ is a parameter capturing the concentration of the points in each cluster, $\sigma_{\mathrm{max}}$ and $\sigma_{\mathrm{min}}$ are the maximum and minimum standard deviation of the clusters around their means, and $\ell$ is the number of distinct mixing weights in the underlying clustering (after rounding them to the nearest power of $2$).
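In code, $D^\alpha$ seeding is a small tweak to $k$-means++ seeding: each new center is sampled with probability proportional to the $\alpha$-th power of its distance to the nearest center chosen so far (taking $\alpha = 2$ recovers standard $D^2$ sampling). A minimal sketch, with illustrative names, assuming distinct input points:

```python
import random

def d_alpha_seeding(points, k, alpha):
    """Pick k initial centers: the first uniformly at random, each
    subsequent one with probability proportional to its distance to
    the nearest chosen center raised to the power alpha."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))

    centers = [random.choice(points)]
    for _ in range(k - 1):
        # weight each point by dist(point, nearest center)^alpha
        weights = [min(dist2(p, c) for c in centers) ** (alpha / 2)
                   for p in points]
        centers.append(random.choices(points, weights=weights)[0])
    return centers
```

Larger $\alpha$ biases the sampling more aggressively toward far-away points, which is the trade-off the guarantee above quantifies.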
no code implementations • NeurIPS 2021 • Vincent Cohen-Addad, Silvio Lattanzi, Ashkan Norouzi-Fard, Christian Sohler, Ola Svensson
In this paper we introduce a new parallel algorithm for the Euclidean hierarchical $k$-median problem that, when using machines with memory $s$ (for $s\in \Omega(\log^2 (n+\Delta+d))$), outputs a hierarchical clustering such that for every fixed value of $k$ the cost of the solution is at most an $O(\min\{d, \log n\} \log \Delta)$ factor larger in expectation than that of an optimal solution.
no code implementations • NeurIPS 2021 • Buddhima Gamlath, Xinrui Jia, Adam Polak, Ola Svensson
We give an algorithm that outputs an explainable clustering that loses at most a factor of $O(\log^2 k)$ compared to an optimal (not necessarily explainable) clustering for the $k$-medians objective, and a factor of $O(k \log^2 k)$ for the $k$-means objective.
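To make the notion concrete: an explainable clustering is one induced by an axis-aligned threshold tree with $k$ leaves, one per cluster, so every assignment is explained by a short list of single-coordinate comparisons. The sketch below builds such a tree over given reference centers and routes points through it; the cut-selection rule here is a naive placeholder, not the rule that achieves the guarantees above, and it assumes the centers are distinct:

```python
def build_tree(centers):
    """Recursively split centers with axis-aligned threshold cuts
    until each leaf holds exactly one center."""
    if len(centers) == 1:
        return ("leaf", centers[0])
    for dim in range(len(centers[0])):
        vals = sorted(set(c[dim] for c in centers))
        if len(vals) > 1:
            # naive cut: separate the smallest coordinate value
            theta = (vals[0] + vals[1]) / 2
            left = [c for c in centers if c[dim] <= theta]
            right = [c for c in centers if c[dim] > theta]
            return ("split", dim, theta, build_tree(left), build_tree(right))

def assign(tree, p):
    """Route a point down the threshold tree to its cluster's center."""
    while tree[0] == "split":
        _, dim, theta, left, right = tree
        tree = left if p[dim] <= theta else right
    return tree[1]
```

The approximation question is then how much $k$-medians or $k$-means cost such a tree must lose relative to unconstrained centers, which is exactly what the $O(\log^2 k)$ and $O(k \log^2 k)$ factors bound.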
no code implementations • 8 Feb 2021 • Paritosh Garg, Linus Jordan, Ola Svensson
Their approach is based on the versatile local ratio technique and also applies to generalizations such as weighted hypergraph matchings.
no code implementations • NeurIPS 2020 • Vincent Cohen-Addad, Silvio Lattanzi, Ashkan Norouzi-Fard, Christian Sohler, Ola Svensson
$k$-means++ (Arthur and Vassilvitskii, 2007) is a widely used clustering algorithm that is easy to implement, has nice theoretical guarantees and strong empirical performance.
1 code implementation • NeurIPS 2020 • Étienne Bamas, Andreas Maggiori, Ola Svensson
The study of classical online algorithms augmented with predictions is a new and active research area.
1 code implementation • NeurIPS 2020 • Étienne Bamas, Andreas Maggiori, Lars Rohwedder, Ola Svensson
As power management has become a primary concern in modern data centers, computing resources are being scaled dynamically to minimize energy consumption.
no code implementations • 6 Aug 2018 • Ashkan Norouzi-Fard, Jakub Tarnawski, Slobodan Mitrović, Amir Zandieh, Aida Mousavifar, Ola Svensson
It is the first low-memory, single-pass algorithm that improves on the factor $0.5$, under the natural assumption that elements arrive in a random order.
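The $0.5$ barrier is set by single-pass thresholding rules of the following flavor: keep an arriving element iff its marginal value clears a fixed threshold and the budget is not exhausted. A hedged sketch of that baseline (not the paper's algorithm), using a set-coverage function as the monotone submodular objective; the threshold $\tau$ is taken as given here, while threshold-based analyses typically guess it on the order of $\mathrm{OPT}/(2k)$:

```python
def threshold_stream(stream, k, tau, f):
    """Single-pass thresholding baseline for cardinality-constrained
    monotone submodular maximization: keep an element iff its
    marginal value f(S + e) - f(S) is at least tau and |S| < k."""
    S = []
    for e in stream:
        if len(S) < k and f(S + [e]) - f(S) >= tau:
            S.append(e)
    return S

def coverage(S):
    """Coverage function: number of distinct items covered by the
    chosen sets (monotone and submodular)."""
    return len(set().union(*S)) if S else 0
```

Under random arrival order, the paper shows one can provably beat the approximation factor this style of rule is stuck at in the worst case.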
no code implementations • ICML 2018 • Ashkan Norouzi-Fard, Jakub Tarnawski, Slobodan Mitrovic, Amir Zandieh, Aidasadat Mousavifar, Ola Svensson
It is the first low-memory, single-pass algorithm that improves on the factor $0.5$, under the natural assumption that elements arrive in a random order.
no code implementations • NeurIPS 2016 • Aditya Bhaskara, Mehrdad Ghadiri, Vahab Mirrokni, Ola Svensson
We first study the approximation quality of the algorithm by comparing its output with the LP objective.