Non-monotone Submodular Maximization in Exponentially Fewer Iterations

NeurIPS 2018  ·  Eric Balkanski, Adam Breuer, Yaron Singer

In this paper we consider parallelization for applications whose objective can be expressed as maximizing a non-monotone submodular function under a cardinality constraint. Our main result is an algorithm whose approximation is arbitrarily close to $1/2e$ in $O(\log^2 n)$ adaptive rounds, where $n$ is the size of the ground set. This is an exponential speedup in parallel running time over any previously studied algorithm for constrained non-monotone submodular maximization. Beyond its provable guarantees, the algorithm performs well in practice. Specifically, experiments on traffic monitoring and personalized data summarization applications show that the algorithm finds solutions whose values are competitive with state-of-the-art algorithms while running in exponentially fewer parallel iterations.
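As a point of reference for the problem setting, the sketch below is a minimal Python illustration of cardinality-constrained non-monotone submodular maximization: the objective is a graph-cut function (a standard non-monotone submodular function) and the solver is a generic sequential random-greedy baseline. This is not the paper's algorithm; the baseline needs $k$ adaptive rounds of value queries, whereas the paper's algorithm needs only $O(\log^2 n)$, and all names here (`cut_value`, `random_greedy`) are illustrative.

```python
import random


def cut_value(adj, S):
    """Weight of edges crossing the cut (S, V \\ S) in an undirected graph.
    The cut function is a classic example of a non-monotone submodular function."""
    S = set(S)
    return sum(w for u, nbrs in adj.items() if u in S
               for v, w in nbrs.items() if v not in S)


def random_greedy(adj, elements, k, seed=0):
    """Sequential random-greedy baseline for max f(S) s.t. |S| <= k.
    Each step picks uniformly among the k candidates with the largest marginal
    gain and keeps the pick only if its gain is positive.
    NOTE: this is *not* the paper's algorithm -- it takes k adaptive rounds,
    while the paper's algorithm runs in O(log^2 n) adaptive rounds."""
    rng = random.Random(seed)
    S = set()
    for _ in range(k):
        candidates = [e for e in elements if e not in S]
        if not candidates:
            break
        base = cut_value(adj, S)
        gains = sorted(((cut_value(adj, S | {e}) - base, e) for e in candidates),
                       reverse=True)[:k]
        gain, e = rng.choice(gains)
        if gain > 0:
            S.add(e)
    return S


if __name__ == "__main__":
    # Tiny 4-cycle with unit weights; the best size-2 set is an antipodal pair.
    adj = {0: {1: 1, 3: 1}, 1: {0: 1, 2: 1}, 2: {1: 1, 3: 1}, 3: {0: 1, 2: 1}}
    S = random_greedy(adj, list(adj), k=2)
    print(S, cut_value(adj, S))
```

Running the script prints the selected set and its cut value for the toy 4-cycle instance; the point of contrast with the paper is the number of adaptive rounds, not the objective or constraint, which are the same as in the abstract above.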

Categories

Data Structures and Algorithms
