
Differentiable Greedy Submodular Maximization: Guarantees, Gradient Estimators, and Applications

Motivated by applications such as sensitivity analysis and end-to-end learning, demand for differentiable optimization algorithms has grown significantly. In this paper, we establish a theoretically guaranteed, versatile framework that makes the greedy algorithm for monotone submodular function maximization differentiable...
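The abstract is truncated here, but the core idea it names, differentiating through greedy selection, can be illustrated with a minimal sketch: replace the hard argmax item choice in each greedy step with a temperature-controlled softmax over marginal gains, so every step is smooth. The coverage objective, the softmax relaxation, and all names below (`coverage`, `soft_greedy`, `temp`) are illustrative assumptions for exposition, not the paper's actual construction or gradient estimator.

```python
import numpy as np

def coverage(x, A):
    """Soft weighted-coverage objective (assumed example, not from the paper).
    x: soft membership vector in [0, 1]^n; A[i, j]: weight with which item i
    covers element j. Returns sum_j (1 - prod_i (1 - x_i * A[i, j])),
    which is monotone in x.
    """
    return float(np.sum(1.0 - np.prod(1.0 - x[:, None] * A, axis=0)))

def soft_greedy(A, k, temp=0.1):
    """Softmax-relaxed greedy: a smooth stand-in for argmax selection.

    As temp -> 0 this recovers the ordinary greedy argmax choice; for
    temp > 0 each step is differentiable, so gradients w.r.t. A would flow
    if np were swapped for an autodiff library (e.g. jax.numpy).
    """
    n = A.shape[0]
    x = np.zeros(n)
    for _ in range(k):
        base = coverage(x, A)
        # Marginal gain of (softly) adding each item to the current solution.
        gains = np.array([coverage(np.minimum(x + np.eye(n)[i], 1.0), A) - base
                          for i in range(n)])
        z = gains / temp
        p = np.exp(z - z.max())        # numerically stable softmax
        p /= p.sum()
        x = np.minimum(x + p, 1.0)     # smooth "pick one item" update
    return x, coverage(x, A)
```

With `temp` near zero the relaxation concentrates on one item per step and matches discrete greedy; larger temperatures spread the selection, trading solution fidelity for smoother gradients.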
