Search Results for author: Yaron Singer

Found 29 papers, 5 papers with code

Tree of Attacks: Jailbreaking Black-Box LLMs Automatically

1 code implementation · 4 Dec 2023 · Anay Mehrotra, Manolis Zampetakis, Paul Kassianik, Blaine Nelson, Hyrum Anderson, Yaron Singer, Amin Karbasi

In this work, we present Tree of Attacks with Pruning (TAP), an automated method for generating jailbreaks that only requires black-box access to the target LLM.
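
At a high level, TAP has an attacker LLM iteratively refine candidate prompts in a tree, an evaluator prune branches that drift off topic, and a judge score the target's responses until one is assessed as jailbroken. A minimal sketch of that loop, where attacker, on_topic, target_respond, and judge_score are hypothetical callables standing in for the LLM components:

    # A TAP-style loop; the four callables are hypothetical stand-ins
    # for the attacker, evaluator, target, and judge LLM calls.
    def tree_of_attacks(goal, attacker, on_topic, target_respond, judge_score,
                        branching=4, width=10, depth=10, threshold=10):
        frontier = [goal]
        for _ in range(depth):
            # attacker proposes refinements of each surviving prompt
            children = [attacker(goal, p) for p in frontier for _ in range(branching)]
            children = [p for p in children if on_topic(goal, p)]  # phase-1 pruning
            scored = [(judge_score(goal, p, target_respond(p)), p) for p in children]
            for score, prompt in scored:
                if score >= threshold:  # judged jailbroken
                    return prompt
            scored.sort(reverse=True)
            frontier = [p for _, p in scored[:width]]  # phase-2 pruning: keep best
        return None

The threshold of 10 mirrors a 1-10 judge scale; the exact scoring scheme is an assumption of this sketch.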

Instance Specific Approximations for Submodular Maximization

no code implementations · 23 Feb 2021 · Eric Balkanski, Sharon Qian, Yaron Singer

A major question is therefore how to measure the performance of an algorithm in comparison to an optimal solution on instances we encounter in practice.

The Adaptive Complexity of Maximizing a Gross Substitutes Valuation

no code implementations · NeurIPS 2020 · Ron Kupfer, Sharon Qian, Eric Balkanski, Yaron Singer

Both the upper and lower bounds are under the assumption that queries are only on feasible sets (i.e., of size at most $k$).

Adversarial Attacks on Binary Image Recognition Systems

no code implementations · 22 Oct 2020 · Eric Balkanski, Harrison Chase, Kojin Oshiba, Alexander Rilee, Yaron Singer, Richard Wang

Nevertheless, we generalize SCAR to design attacks that fool state-of-the-art check processing systems using unnoticeable perturbations that lead to misclassification of deposit amounts.

Image Classification · License Plate Recognition

An Optimal Elimination Algorithm for Learning a Best Arm

no code implementations · NeurIPS 2020 · Avinatan Hassidim, Ron Kupfer, Yaron Singer

We consider the classic problem of $(\epsilon,\delta)$-PAC learning a best arm where the goal is to identify with confidence $1-\delta$ an arm whose mean is an $\epsilon$-approximation to that of the highest mean arm in a multi-armed bandit setting.

Learning Theory · PAC learning
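
For context, elimination algorithms for this problem repeatedly sample all surviving arms and discard those whose empirical mean falls well below the current leader's. A minimal sketch of generic successive elimination, not the paper's exact (optimal) procedure; pull(i) is a hypothetical sampler for arm i's reward:

    import math

    # Generic successive elimination for (epsilon, delta)-PAC best-arm
    # identification; pull(i) is a hypothetical reward sampler for arm i.
    def successive_elimination(arms, pull, epsilon, delta):
        active = list(arms)
        means = {i: 0.0 for i in active}
        t = 0
        while True:
            t += 1
            for i in active:
                means[i] += (pull(i) - means[i]) / t  # running average
            # Hoeffding-style confidence radius after t samples per arm
            radius = math.sqrt(math.log(4 * len(arms) * t * t / delta) / (2 * t))
            if len(active) == 1 or 2 * radius <= epsilon:
                return max(active, key=lambda i: means[i])
            best = max(means[i] for i in active)
            active = [i for i in active if means[i] >= best - 2 * radius]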

Causal Mediation Analysis for Interpreting Neural NLP: The Case of Gender Bias

1 code implementation · 26 Apr 2020 · Jesse Vig, Sebastian Gehrmann, Yonatan Belinkov, Sharon Qian, Daniel Nevo, Simas Sakenis, Jason Huang, Yaron Singer, Stuart Shieber

Common methods for interpreting neural models in natural language processing typically examine either their structure or their behavior, but not both.

Robustness from Simple Classifiers

no code implementations · 21 Feb 2020 · Sharon Qian, Dimitris Kalimeris, Gal Kaplun, Yaron Singer

Despite the vast success of deep neural networks in numerous application domains, it has been shown that such models are not robust, i.e., they are vulnerable to small adversarial perturbations of the input.

The FAST Algorithm for Submodular Maximization

2 code implementations · ICML 2020 · Adam Breuer, Eric Balkanski, Yaron Singer

Recent algorithms have comparable guarantees in terms of asymptotic worst-case analysis, but their actual number of rounds and query complexity depend on very large constants and on polynomials in the precision and confidence parameters, making them impractical for large data sets.

Predicting Choice with Set-Dependent Aggregation

no code implementations · ICML 2020 · Nir Rosenfeld, Kojin Oshiba, Yaron Singer

Providing users with alternatives to choose from is an essential component in many online platforms, making the accurate prediction of choice vital to their success.

Robust Attacks against Multiple Classifiers

1 code implementation · 6 Jun 2019 · Juan C. Perdomo, Yaron Singer

We address the challenge of designing optimal adversarial noise algorithms for settings where a learner has access to multiple classifiers.

General Classification · Image Classification

Optimal Attacks against Multiple Classifiers

no code implementations · ICLR 2019 · Juan C. Perdomo, Yaron Singer

The main technical challenge we consider is the design of best response oracles that can be implemented in a Multiplicative Weight Updates framework to find equilibrium strategies in the zero-sum game.

Image Classification
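
That zero-sum game can be solved approximately with a multiplicative-weights loop: the learner maintains a distribution over the classifiers, and each round the attacker plays a best-response perturbation against the current mixture. A rough sketch under assumed best_response_noise and loss oracles (both hypothetical stand-ins, not the paper's exact construction):

    import numpy as np

    # MWU over the classifiers: weights track the learner's mixed strategy;
    # best_response_noise and loss are hypothetical oracles for the
    # attacker's best response and the game's payoff, respectively.
    def mwu_equilibrium(classifiers, x, best_response_noise, loss,
                        rounds=100, eta=0.1):
        weights = np.ones(len(classifiers)) / len(classifiers)
        noises = []
        for _ in range(rounds):
            v = best_response_noise(classifiers, weights, x)
            noises.append(v)
            losses = np.array([loss(c, x + v) for c in classifiers])
            weights = weights * np.exp(-eta * losses)  # penalize fooled classifiers
            weights /= weights.sum()
        # Averages over rounds approximate the two players' equilibrium strategies.
        return weights, np.mean(noises, axis=0)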

Robust Influence Maximization for Hyperparametric Models

no code implementations · 9 Mar 2019 · Dimitris Kalimeris, Gal Kaplun, Yaron Singer

A surge of recent research on influence maximization focuses on the case where the edge probabilities of the graph are not arbitrary but are generated as a function of user features and a global hyperparameter.
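
Concretely, in the hyperparametric model each edge probability is computed from that edge's features via one shared global parameter rather than being a free per-edge parameter. A toy illustration; the logistic link is an assumption chosen for concreteness, not necessarily the paper's:

    import numpy as np

    # Hyperparametric edge model: every edge probability is derived from
    # the edge's feature vector x and one global parameter theta
    # (the logistic link below is illustrative only).
    def edge_probabilities(edge_features, theta):
        return {e: 1.0 / (1.0 + np.exp(-np.dot(theta, x)))
                for e, x in edge_features.items()}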

Fast Parallel Algorithms for Statistical Subset Selection Problems

1 code implementation · NeurIPS 2019 · Sharon Qian, Yaron Singer

Recently, there has been a surge of interest in a parallel optimization technique called adaptive sampling, which produces solutions with desirable approximation guarantees for submodular maximization in exponentially faster parallel runtime.

Combinatorial Optimization · Experimental Design · +1
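
The adaptive-sampling primitive draws many random feasible sets per round and filters the ground set by estimated marginal contribution; since all function evaluations within a round are independent, they can run in parallel, keeping the number of sequential rounds low. A simplified single-threshold filtering step (an illustrative sketch, not the paper's exact algorithm):

    import random

    # Simplified one-threshold adaptive-sampling step for maximizing a
    # submodular f under |S| <= k. All f-evaluations within the round are
    # independent, so in practice they run in parallel.
    def adaptive_filter(f, ground, solution, k, threshold, samples=100):
        pool = [x for x in ground if x not in solution]
        size = min(k - len(solution), len(pool))
        batch = [set(random.sample(pool, size)) for _ in range(samples)]

        def avg_marginal(x):
            # estimated marginal contribution of x on top of random sets
            return sum(f(solution | (R - {x}) | {x}) - f(solution | (R - {x}))
                       for R in batch) / samples

        # keep only elements whose estimated marginal value clears the threshold
        return {x for x in pool if avg_marginal(x) >= threshold}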

Optimization for Approximate Submodularity

no code implementations · NeurIPS 2018 · Yaron Singer, Avinatan Hassidim

We consider the problem of maximizing a submodular function when given access to its approximate version.

Robust Classification of Financial Risk

no code implementations · 27 Nov 2018 · Suproteem K. Sarkar, Kojin Oshiba, Daniel Giebisch, Yaron Singer

To the best of our knowledge, this is the first study of adversarial attacks and defenses for deep learning in financial services.

BIG-bench Machine Learning · Classification · +3

Parallelization does not Accelerate Convex Optimization: Adaptivity Lower Bounds for Non-smooth Convex Minimization

no code implementations · 12 Aug 2018 · Eric Balkanski, Yaron Singer

For the problem of minimizing a non-smooth convex function $f:[0, 1]^n\to \mathbb{R}$ over the unit Euclidean ball, we give a tight lower bound showing that even when $\texttt{poly}(n)$ queries can be executed in parallel, no randomized algorithm with $\tilde{o}(n^{1/3})$ rounds of adaptivity achieves a convergence rate better than that of a one-query-per-round algorithm.

Combinatorial Optimization

Approximation Guarantees for Adaptive Sampling

no code implementations · ICML 2018 · Eric Balkanski, Yaron Singer

In particular, we show that under very mild conditions on the curvature of a function, adaptive sampling techniques achieve an approximation arbitrarily close to 1/2 while maintaining their low adaptivity.

Learning Diffusion using Hyperparameters

no code implementations · ICML 2018 · Dimitris Kalimeris, Yaron Singer, Karthik Subbian, Udi Weinsberg

Despite this obstacle, we can shrink the best-known sample complexity bound for learning IC by a factor of $|E|/d$, where $|E|$ is the number of edges in the graph and $d$ is the dimension of the hyperparameter.

Minimizing a Submodular Function from Samples

no code implementations · NeurIPS 2017 · Eric Balkanski, Yaron Singer

In this paper we consider the problem of minimizing a submodular function from training data.

Robust Guarantees of Stochastic Greedy Algorithms

no code implementations · ICML 2017 · Avinatan Hassidim, Yaron Singer

In this paper we analyze the robustness of stochastic variants of the greedy algorithm for submodular maximization.
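
The stochastic variants in question replace classic greedy's full marginal-gain scan with a scan over a small random subsample at each step. A minimal sketch of the scheme in the style of Mirzasoleiman et al.'s stochastic greedy, hedged as illustrative rather than the paper's exact object of study:

    import math
    import random

    # Stochastic greedy for maximizing f under |S| <= k (assumes k <= n):
    # each step scans only a random subsample of about (n/k) * log(1/eps)
    # elements instead of the whole ground set.
    def stochastic_greedy(f, ground, k, eps=0.1):
        n = len(ground)
        s = min(n, max(1, math.ceil((n / k) * math.log(1 / eps))))
        S = set()
        for _ in range(k):
            rest = [x for x in ground if x not in S]
            candidates = random.sample(rest, min(s, len(rest)))
            S.add(max(candidates, key=lambda x: f(S | {x}) - f(S)))
        return S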

Robust Optimization for Non-Convex Objectives

no code implementations · NeurIPS 2017 · Robert Chen, Brendan Lucier, Yaron Singer, Vasilis Syrgkanis

We consider robust optimization problems, where the goal is to optimize in the worst case over a class of objective functions.

Bayesian Optimization · General Classification

Maximization of Approximately Submodular Functions

no code implementations · NeurIPS 2016 · Thibaut Horel, Yaron Singer

We study the problem of maximizing a function that is approximately submodular under a cardinality constraint.
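
One standard way to make "approximately submodular" precise, stated here as an assumption about the paper's exact definition: $f$ is $\epsilon$-approximately submodular if there exists a submodular function $F$ with $(1-\epsilon)F(S) \le f(S) \le (1+\epsilon)F(S)$ for every set $S$.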

The Power of Optimization from Samples

no code implementations · NeurIPS 2016 · Eric Balkanski, Aviad Rubinstein, Yaron Singer

In this paper we show that for any monotone submodular function with curvature $c$ there is a $(1 - c)/(1 + c - c^2)$ approximation algorithm for maximization under cardinality constraints when polynomially-many samples are drawn from the uniform distribution over feasible sets.
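
For instance, at curvature $c = 1/2$ the guarantee evaluates to $(1/2)/(1 + 1/2 - 1/4) = 0.4$; it equals $1$ at $c = 0$ (modular functions) and vanishes at $c = 1$.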

Submodular Optimization under Noise

no code implementations · 12 Jan 2016 · Avinatan Hassidim, Yaron Singer

We provide initial answers, by focusing on the question of maximizing a monotone submodular function under a cardinality constraint when given access to a noisy oracle of the function.

The Limitations of Optimization from Samples

no code implementations · 19 Dec 2015 · Eric Balkanski, Aviad Rubinstein, Yaron Singer

In particular, our main result shows that there is no constant factor approximation for maximizing coverage functions under a cardinality constraint using polynomially-many samples drawn from any distribution.

Information-theoretic lower bounds for convex optimization with erroneous oracles

no code implementations · NeurIPS 2015 · Yaron Singer, Jan Vondrak

We consider the problem of optimizing convex and concave functions with access to an erroneous zeroth-order oracle.

Learnability of Influence in Networks

no code implementations · NeurIPS 2015 · Harikrishna Narasimhan, David C. Parkes, Yaron Singer

We establish PAC learnability of influence functions for three common influence models, namely, the Linear Threshold (LT), Independent Cascade (IC) and Voter models, and present concrete sample complexity results in each case.
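
Of these, the Independent Cascade model is the simplest to state operationally: each newly activated node gets a single chance to activate each inactive out-neighbor, independently, with the corresponding edge probability. A small simulation sketch:

    import random

    # One Independent Cascade simulation: graph maps each node to its
    # out-neighbors, p[(u, v)] is the activation probability of edge (u, v);
    # returns the final set of activated nodes.
    def independent_cascade(graph, p, seeds):
        active, frontier = set(seeds), list(seeds)
        while frontier:
            u = frontier.pop()
            for v in graph.get(u, []):
                if v not in active and random.random() < p[(u, v)]:
                    active.add(v)
                    frontier.append(v)
        return active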
