no code implementations • ICML 2020 • Ran Haba, Ehsan Kazemi, Moran Feldman, Amin Karbasi
Moreover, we propose the first streaming algorithms for monotone submodular maximization subject to $k$-extendible and $k$-system constraints.
no code implementations • 26 May 2023 • Loay Mualem, Ethan R. Elenberg, Moran Feldman, Amin Karbasi
Despite the rich existing literature about minimax optimization in continuous settings, only very partial results of this kind have been obtained for combinatorial settings.
no code implementations • 12 Oct 2022 • Loay Mualem, Moran Feldman
We also present an inapproximability result showing that our online algorithm and Du's (2022) offline algorithm are both optimal in a strong sense.
no code implementations • 7 Feb 2022 • Loay Mualem, Moran Feldman
Over the last two decades, submodular function maximization has been the workhorse of many discrete optimization problems in machine learning applications.
no code implementations • NeurIPS 2021 • Siddharth Mitra, Moran Feldman, Amin Karbasi
It is well established that first-order optimization methods can converge to the maximal objective value of concave functions and provide constant-factor approximation guarantees for (non-convex/non-concave) continuous submodular functions.
no code implementations • 6 Apr 2021 • Christopher Harshaw, Ehsan Kazemi, Moran Feldman, Amin Karbasi
We propose subsampling as a unified algorithmic technique for submodular maximization in centralized and online settings.
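To give the flavor of the subsampling idea, here is a simplified sketch (not the paper's exact algorithm): the ground set is thinned by keeping each element with probability $p$, and plain greedy is run on the survivors. The `coverage` objective, the toy ground set, and the parameter names are illustrative assumptions.

```python
import random

def coverage(chosen):
    """Toy monotone submodular objective: number of points covered."""
    return len(set().union(*chosen)) if chosen else 0

def subsample_greedy(ground_set, f, k, p, seed=0):
    """Greedy on a p-subsampled ground set (illustrative sketch of the
    subsampling technique, not the paper's exact algorithm)."""
    rng = random.Random(seed)
    sampled = [e for e in ground_set if rng.random() < p]
    chosen = []
    for _ in range(min(k, len(sampled))):
        rest = [e for e in sampled if e not in chosen]
        if not rest:
            break
        # pick the element with the largest marginal gain
        best = max(rest, key=lambda e: f(chosen + [e]) - f(chosen))
        chosen.append(best)
    return chosen

ground = [frozenset({1, 2}), frozenset({2, 3}), frozenset({4}), frozenset({1, 4, 5})]
solution = subsample_greedy(ground, coverage, k=2, p=1.0)  # p=1 recovers plain greedy
```

With `p < 1` the expected number of function evaluations drops proportionally, which is the source of the speedups that subsampling-based analyses quantify.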
1 code implementation • 29 Sep 2020 • Moran Feldman, Christopher Harshaw, Amin Karbasi
We also present SubmodularGreedy.jl, a Julia package which implements these algorithms and may be downloaded at https://github.com/crharshaw/SubmodularGreedy.jl.
no code implementations • NeurIPS 2020 • Moran Feldman, Amin Karbasi
We first prove that a simple variant of the vanilla coordinate ascent, called Coordinate-Ascent+, achieves a $(\frac{e-1}{2e-1}-\varepsilon)$-approximation guarantee while performing $O(n/\varepsilon)$ iterations, where the computational complexity of each iteration is roughly $O(n/\sqrt{\varepsilon}+n\log n)$ (here, $n$ denotes the dimension of the optimization problem).
no code implementations • 16 Jun 2020 • Wenxin Li, Moran Feldman, Ehsan Kazemi, Amin Karbasi
In this paper, we provide the first deterministic algorithm that achieves the tight $1-1/e$ approximation guarantee for submodular maximization under a cardinality (size) constraint while making a number of queries that scales only linearly with the size of the ground set $n$.
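For context on why query complexity is the bottleneck: the classical greedy of Nemhauser, Wolsey, and Fisher already achieves $1-1/e$ for monotone objectives but spends $O(nk)$ queries, and the standard practical remedy is Minoux's lazy greedy, sketched below on an assumed toy coverage objective. It saves queries heuristically, whereas the paper's deterministic algorithm provably needs only linearly many.

```python
import heapq

def coverage(chosen):
    """Toy monotone submodular objective: number of points covered."""
    return len(set().union(*chosen)) if chosen else 0

def lazy_greedy(ground_set, f, k):
    """Minoux's lazy greedy: by submodularity, a marginal gain computed in an
    earlier round upper-bounds the current one, so an element is re-evaluated
    only when its stale bound reaches the top of the max-heap."""
    chosen, val = [], f([])
    heap = [(-(f([e]) - val), i) for i, e in enumerate(ground_set)]
    heapq.heapify(heap)
    while len(chosen) < k and heap:
        _, i = heapq.heappop(heap)
        gain = f(chosen + [ground_set[i]]) - val  # refresh the stale bound
        if heap and gain < -heap[0][0]:
            heapq.heappush(heap, (-gain, i))  # still not best; reinsert and retry
            continue
        chosen.append(ground_set[i])
        val += gain
    return chosen

ground = [frozenset({1, 2}), frozenset({2, 3}), frozenset({4}), frozenset({1, 4, 5})]
solution = lazy_greedy(ground, coverage, k=2)
```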
no code implementations • 10 Feb 2020 • Ehsan Kazemi, Shervin Minaee, Moran Feldman, Amin Karbasi
In this paper, we propose scalable methods for maximizing a regularized submodular function $f = g - \ell$ expressed as the difference between a monotone submodular function $g$ and a modular function $\ell$.
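One known algorithm in this regularized setting is the distorted greedy of Harshaw et al. (2019), which discounts the marginal gain of $g$ by a factor that grows across rounds and adds an element only when its distorted gain is positive. The sketch below runs it on assumed toy data; it illustrates the $f = g - \ell$ objective, not the scalable methods this paper proposes.

```python
def distorted_greedy(ground_set, g, ell, k):
    """Distorted greedy for max f = g - ell under a cardinality constraint,
    where g is monotone submodular and ell is a nonnegative modular cost.
    Sketch of the technique of Harshaw et al. (2019) on toy data."""
    S = []
    for i in range(k):
        w = (1.0 - 1.0 / k) ** (k - i - 1)  # round-dependent distortion weight
        best, best_val = None, 0.0
        for e in ground_set:
            if e in S:
                continue
            val = w * (g(S + [e]) - g(S)) - ell(e)
            if val > best_val:
                best, best_val = e, val
        if best is not None:  # skip the round if no positive distorted gain
            S.append(best)
    return S

# Toy instance: g is a coverage function, ell a per-element cost (assumed data).
covers = {'a': {1, 2}, 'b': {2, 3}, 'c': {9}}
cost = {'a': 0.1, 'b': 0.1, 'c': 5.0}
g = lambda S: len(set().union(*(covers[e] for e in S))) if S else 0
solution = distorted_greedy(['a', 'b', 'c'], g, cost.get, k=2)
```

Note that a naive greedy applied directly to $g - \ell$ has no approximation guarantee, which is what motivates distorting the gains.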
1 code implementation • 9 Feb 2020 • Ran Haba, Ehsan Kazemi, Moran Feldman, Amin Karbasi
In this paper, we propose a novel framework that converts streaming algorithms for monotone submodular maximization into streaming algorithms for non-monotone submodular maximization.
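To fix ideas, the kind of monotone streaming algorithm such a conversion framework takes as input is a single-pass threshold rule like the sketch below (illustrative only; in practice the threshold `tau` is set from a guess of the optimum, and the objective shown is an assumed toy coverage function).

```python
def coverage(chosen):
    """Toy monotone submodular objective: number of points covered."""
    return len(set().union(*chosen)) if chosen else 0

def threshold_stream(stream, f, k, tau):
    """Single-pass threshold rule for monotone submodular maximization under
    a cardinality constraint: keep an arriving element iff its marginal gain
    is at least tau and fewer than k elements are stored."""
    S = []
    for e in stream:
        if len(S) < k and f(S + [e]) - f(S) >= tau:
            S.append(e)
    return S

stream = [frozenset({1}), frozenset({1, 2, 3}), frozenset({4, 5})]
solution = threshold_stream(stream, coverage, k=2, tau=2)
```

Such rules store only the current solution, which is what makes them attractive as building blocks for streaming frameworks.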
no code implementations • 11 Jun 2019 • Moran Feldman, Ran Haba
In this paper we consider the problem of finding a maximum weight set subject to a $k$-extendible constraint in the data stream model.
1 code implementation • 19 Apr 2019 • Christopher Harshaw, Moran Feldman, Justin Ward, Amin Karbasi
It is generally believed that submodular functions -- and the more general class of $\gamma$-weakly submodular functions -- may only be optimized under the non-negativity assumption $f(S) \geq 0$.
1 code implementation • NeurIPS 2019 • Marko Mitrovic, Ehsan Kazemi, Moran Feldman, Andreas Krause, Amin Karbasi
In many machine learning applications, one needs to interactively select a sequence of items (e.g., recommending movies based on a user's feedback) or make sequential decisions in a certain order (e.g., guiding an agent through a series of states).
no code implementations • 15 Nov 2018 • Lin Chen, Moran Feldman, Amin Karbasi
In this paper, we consider the unconstrained submodular maximization problem.
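As background on this problem, the classical deterministic double greedy of Buchbinder et al. (2012) gives a $1/3$-approximation (its randomized variant achieves the optimal $1/2$). The sketch below runs it on an assumed toy cut function of a 4-cycle; it is the classical baseline, not this paper's algorithm.

```python
def cut(S, edges=((0, 1), (1, 2), (2, 3), (3, 0))):
    """Cut function of a 4-cycle: a classic non-monotone submodular function."""
    return sum(1 for u, v in edges if (u in S) != (v in S))

def double_greedy(ground_set, f):
    """Deterministic double greedy (Buchbinder et al., 2012) for unconstrained
    submodular maximization: grow X from the empty set and shrink Y from the
    full ground set, resolving each element by comparing the two gains."""
    X, Y = set(), set(ground_set)
    for e in ground_set:
        a = f(X | {e}) - f(X)   # gain of adding e to X
        b = f(Y - {e}) - f(Y)   # gain of removing e from Y
        if a >= b:
            X.add(e)
        else:
            Y.discard(e)
    return X  # X == Y at termination

solution = double_greedy([0, 1, 2, 3], cut)
```

On the 4-cycle this recovers a maximum cut, though in general only the approximation guarantee is promised.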
no code implementations • NeurIPS 2018 • Moran Feldman, Amin Karbasi, Ehsan Kazemi
In this paper, we develop the first one-pass streaming algorithm for submodular maximization that does not evaluate the entire stream even once.
no code implementations • ICML 2018 • Lin Chen, Moran Feldman, Amin Karbasi
In this paper, we prove that a randomized version of the greedy algorithm (previously used by Buchbinder et al. (2014) for a different problem) achieves an approximation ratio of $(1 + 1/\gamma)^{-2}$ for the maximization of a weakly submodular function subject to a general matroid constraint, where $\gamma$ is a parameter measuring the distance of the function from submodularity.
no code implementations • 5 Apr 2017 • Moran Feldman, Christopher Harshaw, Amin Karbasi
Sample Greedy achieves a $(k + 3)$-approximation with only $O(nr/k)$ function evaluations.
1 code implementation • NeurIPS 2017 • Ethan R. Elenberg, Alexandros G. Dimakis, Moran Feldman, Amin Karbasi
In many machine learning applications, it is important to explain the predictions of a black-box classifier.