Search Results for author: Eric Balkanski

Found 14 papers, 1 paper with code

Energy-Efficient Scheduling with Predictions

no code implementations NeurIPS 2023 Eric Balkanski, Noemie Perivier, Clifford Stein, Hao-Ting Wei

We show that, when the prediction error is small, this framework gives improved competitive ratios for many different energy-efficient scheduling problems, including energy minimization with deadlines, while also maintaining a bounded competitive ratio regardless of the prediction error.

Scheduling

Scheduling with Speed Predictions

no code implementations 2 May 2022 Eric Balkanski, Tingting Ou, Clifford Stein, Hao-Ting Wei

In the context of scheduling, very recent work has leveraged machine-learned predictions to design algorithms that achieve improved approximation ratios in settings where the processing times of the jobs are initially unknown.

Scheduling
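As a toy illustration of the setting above (not the paper's algorithm), the sketch below schedules jobs on a single machine in shortest-predicted-processing-time order and evaluates the resulting total completion time under the true processing times; the job names and times are invented for illustration.

```python
# Toy prediction-aided scheduling: order jobs by predicted processing time
# (plain SPT on predictions), then measure the schedule on the true times.

def total_completion_time(order, true_times):
    """Sum of job completion times when jobs run in the given order."""
    clock, total = 0, 0
    for job in order:
        clock += true_times[job]
        total += clock
    return total

# Hypothetical jobs: true processing times and (here, perfect) predictions.
true_times = {"j1": 3, "j2": 1, "j3": 2}
predicted = {"j1": 3, "j2": 1, "j3": 2}

# Schedule in shortest-predicted-processing-time order.
order = sorted(predicted, key=predicted.get)
print(order, total_completion_time(order, true_times))
```

With accurate predictions this recovers the optimal SPT schedule; when predictions err, the schedule degrades gracefully with the prediction error, which is the trade-off the paper studies.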

Learning Low Degree Hypergraphs

no code implementations 21 Feb 2022 Eric Balkanski, Oussama Hanguir, Shatian Wang

To the best of our knowledge, these are the first algorithms with poly$(n, m)$ query complexity for learning non-trivial families of hypergraphs that have a super-constant number of edges of super-constant size.

Instance Specific Approximations for Submodular Maximization

no code implementations 23 Feb 2021 Eric Balkanski, Sharon Qian, Yaron Singer

A major question is therefore how to measure the performance of an algorithm in comparison to an optimal solution on instances we encounter in practice.

The Adaptive Complexity of Maximizing a Gross Substitutes Valuation

no code implementations NeurIPS 2020 Ron Kupfer, Sharon Qian, Eric Balkanski, Yaron Singer

Both the upper and lower bounds are under the assumption that queries are only on feasible sets (i.e., of size at most $k$).

Adversarial Attacks on Binary Image Recognition Systems

no code implementations 22 Oct 2020 Eric Balkanski, Harrison Chase, Kojin Oshiba, Alexander Rilee, Yaron Singer, Richard Wang

Nevertheless, we generalize SCAR to design attacks that fool state-of-the-art check processing systems using unnoticeable perturbations that lead to misclassification of deposit amounts.

Image Classification, License Plate Recognition

The FAST Algorithm for Submodular Maximization

2 code implementations ICML 2020 Adam Breuer, Eric Balkanski, Yaron Singer

Recent algorithms have comparable guarantees in terms of asymptotic worst case analysis, but their actual number of rounds and query complexity depend on very large constants and polynomials in terms of precision and confidence, making them impractical for large data sets.
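For context, here is a minimal sketch of the underlying problem: the classic greedy algorithm for monotone submodular maximization under a cardinality constraint, shown on a coverage objective. This is the slow sequential baseline that FAST aims to speed up, not FAST itself; the instance is made up for illustration.

```python
# Classic greedy for monotone submodular maximization under a cardinality
# constraint, on a coverage objective (baseline sketch, not FAST itself).

def coverage(sets, chosen):
    """Submodular objective: number of ground elements covered."""
    covered = set()
    for i in chosen:
        covered |= sets[i]
    return len(covered)

def greedy(sets, k):
    """Repeatedly add the set with the largest marginal coverage gain."""
    chosen = []
    for _ in range(k):
        best = max((i for i in range(len(sets)) if i not in chosen),
                   key=lambda i: coverage(sets, chosen + [i]))
        chosen.append(best)
    return chosen

# Hypothetical instance: four sets over the ground elements 1..6.
sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
solution = greedy(sets, k=2)
print(solution, coverage(sets, solution))
```

Each of the k iterations here is one adaptive round; low-adaptivity algorithms such as FAST evaluate many candidate gains in parallel to cut the number of sequential rounds.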

Parallelization does not Accelerate Convex Optimization: Adaptivity Lower Bounds for Non-smooth Convex Minimization

no code implementations 12 Aug 2018 Eric Balkanski, Yaron Singer

For the problem of minimizing a non-smooth convex function $f:[0, 1]^n\to \mathbb{R}$ over the unit Euclidean ball, we give a tight lower bound showing that even when $\texttt{poly}(n)$ queries can be executed in parallel, no randomized algorithm with $\tilde{o}(n^{1/3})$ rounds of adaptivity achieves a convergence rate better than that achievable with a one-query-per-round algorithm.

Combinatorial Optimization

Approximation Guarantees for Adaptive Sampling

no code implementations ICML 2018 Eric Balkanski, Yaron Singer

In particular, we show that under very mild conditions of curvature of a function, adaptive sampling techniques achieve an approximation arbitrarily close to 1/2 while maintaining their low adaptivity.

Minimizing a Submodular Function from Samples

no code implementations NeurIPS 2017 Eric Balkanski, Yaron Singer

In this paper we consider the problem of minimizing a submodular function from training data.

Statistical Cost Sharing

no code implementations NeurIPS 2017 Eric Balkanski, Umar Syed, Sergei Vassilvitskii

We first show that when cost functions come from the family of submodular functions with bounded curvature, $\kappa$, the Shapley value can be approximated from samples up to a $\sqrt{1 - \kappa}$ factor, and that the bound is tight.
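To ground the statement, the brute-force computation below gives exact Shapley values for a tiny cost-sharing game by averaging marginal contributions over all player orderings; the paper's question is how closely these values can be recovered from sampled coalition costs alone. The cost function here is an illustrative toy, not one from the paper.

```python
# Exact Shapley values via marginal contributions over all orderings
# (brute force; feasible only for tiny games).
from itertools import permutations

def shapley(players, cost):
    values = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = []
        for p in order:
            before = cost(frozenset(coalition))
            coalition.append(p)
            values[p] += cost(frozenset(coalition)) - before
    return {p: v / len(orders) for p, v in values.items()}

# Toy coverage-style cost with diminishing marginal costs (submodular).
demand = {"a": {1, 2}, "b": {2, 3}, "c": {3}}
cost = lambda S: len(set().union(*(demand[p] for p in S))) if S else 0

print(shapley(["a", "b", "c"], cost))
```

The returned values sum to the grand-coalition cost (the efficiency property of the Shapley value), which makes them a natural target for the sample-based approximation studied in the paper.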

The Power of Optimization from Samples

no code implementations NeurIPS 2016 Eric Balkanski, Aviad Rubinstein, Yaron Singer

In this paper we show that for any monotone submodular function with curvature c there is a (1 - c)/(1 + c - c^2) approximation algorithm for maximization under cardinality constraints when polynomially-many samples are drawn from the uniform distribution over feasible sets.
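The stated guarantee can be evaluated numerically as a function of the curvature c: it equals 1 for linear functions (c = 0) and degrades to 0 as c approaches 1.

```python
# The approximation guarantee from the result above as a function of the
# curvature c: (1 - c) / (1 + c - c**2).

def guarantee(c):
    return (1 - c) / (1 + c - c**2)

for c in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(f"c = {c:.2f} -> guarantee = {guarantee(c):.3f}")
```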

The Limitations of Optimization from Samples

no code implementations 19 Dec 2015 Eric Balkanski, Aviad Rubinstein, Yaron Singer

In particular, our main result shows that there is no constant factor approximation for maximizing coverage functions under a cardinality constraint using polynomially-many samples drawn from any distribution.
