Search Results for author: Hossein Esfandiari

Found 20 papers, 4 papers with code

Robust and differentially private stochastic linear bandits

no code implementations 23 Apr 2023 Vasileios Charisopoulos, Hossein Esfandiari, Vahab Mirrokni

In this paper, we study the stochastic linear bandit problem under the additional requirements of differential privacy, robustness and batched observations.

Adversarial Robustness
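
Since the entry above combines batching with privatized statistics, a minimal sketch may help fix ideas: a batched linear bandit that, at the start of each batch, fits a ridge estimate from Gaussian-noise-perturbed sufficient statistics and commits to the greedy arm for the whole batch. This is an illustration only, not the paper's algorithm; the function name `batched_private_linear_bandit` and the noise scale `sigma_priv` are hypothetical, and the noise is not calibrated to any particular $(\varepsilon, \delta)$.

```python
# Toy batched linear bandit (not the paper's algorithm): within each batch we
# act greedily on a ridge estimate whose sufficient statistics are perturbed
# with Gaussian noise, a common ingredient of differentially private bandits.
# `sigma_priv` is a placeholder noise scale, not a calibrated privacy level.
import numpy as np

def batched_private_linear_bandit(arms, theta_true, n_rounds=2000, n_batches=10,
                                  sigma_priv=1.0, reward_noise=0.1, seed=0):
    rng = np.random.default_rng(seed)
    d = arms.shape[1]
    V = np.eye(d)                 # regularized Gram matrix: sum_t x_t x_t^T + I
    b = np.zeros(d)               # sum_t r_t x_t
    rewards = []
    for _ in range(n_batches):
        V_noisy = V + sigma_priv * rng.standard_normal((d, d))
        V_noisy = (V_noisy + V_noisy.T) / 2 + d * np.eye(d)   # keep it well conditioned
        b_noisy = b + sigma_priv * rng.standard_normal(d)
        theta_hat = np.linalg.solve(V_noisy, b_noisy)
        arm = arms[np.argmax(arms @ theta_hat)]               # fixed for the whole batch
        for _ in range(n_rounds // n_batches):
            r = arm @ theta_true + reward_noise * rng.standard_normal()
            V += np.outer(arm, arm)
            b += r * arm
            rewards.append(r)
    return np.array(rewards)

arms = np.eye(5)                  # five standard-basis arms in R^5
print(batched_private_linear_bandit(arms, np.array([0.1, 0.9, 0.2, 0.3, 0.5])).mean())
```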

Anonymous Bandits for Multi-User Systems

no code implementations 21 Oct 2022 Hossein Esfandiari, Vahab Mirrokni, Jon Schneider

In this work, we present and study a new framework for online learning in systems with multiple users that provide user anonymity.

Clustering

Replicable Bandits

no code implementations 4 Oct 2022 Hossein Esfandiari, Alkis Kalavasis, Amin Karbasi, Andreas Krause, Vahab Mirrokni, Grigoris Velegkas

Similarly, for stochastic linear bandits (with finitely and infinitely many arms) we develop replicable policies that achieve the best-known problem-independent regret bounds with an optimal dependency on the replicability parameter.

Multi-Armed Bandits

Smooth Anonymity for Sparse Binary Matrices

no code implementations 13 Jul 2022 Hossein Esfandiari, Alessandro Epasto, Vahab Mirrokni, Andres Munoz Medina, Sergei Vassilvitskii

When working with user data, providing well-defined privacy guarantees is paramount.

Tackling Provably Hard Representative Selection via Graph Neural Networks

1 code implementation 20 May 2022 Mehran Kazemi, Anton Tsitsulin, Hossein Esfandiari, Mohammadhossein Bateni, Deepak Ramachandran, Bryan Perozzi, Vahab Mirrokni

Representative Selection (RS) is the problem of finding a small subset of exemplars from a dataset that is representative of the dataset.

Active Learning · Data Compression +1
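
For readers who want a concrete baseline for the RS objective before reaching for the paper's GNN-based approach, a farthest-point (k-center style) greedy selection is a standard point of comparison. The sketch below assumes numpy and a Euclidean notion of coverage; the helper name `greedy_k_center` is hypothetical and this is not the paper's method.

```python
# Classical farthest-point (k-center style) baseline for picking exemplars.
# The paper instead attacks representative selection with graph neural
# networks, so treat this only as a simple point of reference.
import numpy as np

def greedy_k_center(X, k, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    selected = [int(rng.integers(n))]            # arbitrary first exemplar
    dist = np.linalg.norm(X - X[selected[0]], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(dist))               # point farthest from current exemplars
        selected.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(X - X[nxt], axis=1))
    return selected

X = np.random.default_rng(1).normal(size=(500, 16))
print(greedy_k_center(X, k=10))
```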

Improved Approximations for Euclidean $k$-means and $k$-median, via Nested Quasi-Independent Sets

no code implementations 11 Apr 2022 Vincent Cohen-Addad, Hossein Esfandiari, Vahab Mirrokni, Shyam Narayanan

Motivated by data analysis and machine learning applications, we consider the popular high-dimensional Euclidean $k$-median and $k$-means problems.

Tight and Robust Private Mean Estimation with Few Users

no code implementations 22 Oct 2021 Hossein Esfandiari, Vahab Mirrokni, Shyam Narayanan

In particular, we provide a nearly optimal trade-off between the number of users and the number of samples per user required for private mean estimation, even when the number of users is as low as $O(\frac{1}{\varepsilon}\log\frac{1}{\delta})$.
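
As a point of reference for the user/sample trade-off discussed above, the naive user-level estimator is easy to state: average each user's samples, clip the per-user means, average across users, and add Gaussian noise scaled to the user-level sensitivity. The sketch below shows that baseline only; `user_level_private_mean`, `clip`, `eps`, and `delta` are illustrative names and values, and this simple scheme does not achieve the near-optimal trade-off or the robustness guarantees of the paper.

```python
# Bare-bones user-level DP mean estimation: per-user means are clipped to a
# ball of radius `clip`, averaged across users, and perturbed with Gaussian
# noise scaled to the user-level sensitivity 2*clip/m (replace-one-user).
import numpy as np

def user_level_private_mean(user_samples, clip=1.0, eps=1.0, delta=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    m = len(user_samples)
    user_means = np.array([np.mean(s, axis=0) for s in user_samples])
    norms = np.linalg.norm(user_means, axis=1, keepdims=True)
    user_means = user_means * np.minimum(1.0, clip / np.maximum(norms, 1e-12))
    sensitivity = 2.0 * clip / m
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return user_means.mean(axis=0) + sigma * rng.standard_normal(user_means.shape[1])

rng = np.random.default_rng(2)
users = [rng.normal(loc=0.3, scale=1.0, size=(50, 4)) for _ in range(200)]
print(user_level_private_mean(users))
```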

Label differential privacy via clustering

no code implementations 5 Oct 2021 Hossein Esfandiari, Vahab Mirrokni, Umar Syed, Sergei Vassilvitskii

We present new mechanisms for \emph{label differential privacy}, a relaxation of differentially private machine learning that only protects the privacy of the labels in the training set.

Clustering
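
To make the relaxed guarantee concrete, the textbook label-DP baseline is k-ary randomized response applied to the labels only, leaving the features untouched. The sketch below illustrates that baseline, not the paper's clustering-based mechanisms; `randomized_response_labels` is a hypothetical helper name.

```python
# k-ary randomized response on labels: keep the true label with probability
# e^eps / (e^eps + k - 1), otherwise output a uniformly random *other* label.
# Features are never touched, so only label privacy is provided.
import numpy as np

def randomized_response_labels(labels, num_classes, eps=1.0, seed=0):
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    p_keep = np.exp(eps) / (np.exp(eps) + num_classes - 1)
    keep = rng.random(len(labels)) < p_keep
    shift = rng.integers(1, num_classes, size=len(labels))   # pick a different label
    flipped = (labels + shift) % num_classes
    return np.where(keep, labels, flipped)

y = np.random.default_rng(3).integers(10, size=20)
print(y)
print(randomized_response_labels(y, num_classes=10, eps=2.0))
```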

Feature Cross Search via Submodular Optimization

no code implementations 5 Jul 2021 Lin Chen, Hossein Esfandiari, Gang Fu, Vahab S. Mirrokni, Qian Yu

First, we show that it is not possible to provide an $n^{1/\log\log n}$-approximation algorithm for this problem unless the exponential time hypothesis fails.

Feature Engineering

Almost Tight Approximation Algorithms for Explainable Clustering

no code implementations 1 Jul 2021 Hossein Esfandiari, Vahab Mirrokni, Shyam Narayanan

Next, we study the $k$-means problem in this context and provide an $O(k \log k)$-approximation algorithm for explainable $k$-means, improving over the $O(k^2)$ bound of Dasgupta et al. and the $O(d k \log k)$ bound of Laber and Murtinho.

Clustering
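
The "explainable" model referenced here is the threshold-tree setting of Dasgupta et al.: each internal node compares one coordinate to a threshold, and each leaf is one of the $k$ centers. The toy construction below separates a set of reference centers with a naive widest-coordinate midpoint split; it only illustrates the setting and is not the near-optimal algorithm analyzed in the paper.

```python
# Toy threshold tree for explainable clustering: internal nodes test one
# coordinate against a threshold; every leaf holds exactly one reference
# center. The split rule is a naive midpoint heuristic, for illustration only.
import numpy as np

def build_tree(centers, ids):
    if len(ids) == 1:
        return ids[0]                                  # leaf: a single center
    sub = centers[ids]
    dim = int(np.argmax(sub.max(0) - sub.min(0)))      # widest coordinate among centers
    vals = np.sort(sub[:, dim])
    gap = int(np.argmax(np.diff(vals)))
    cut = (vals[gap] + vals[gap + 1]) / 2
    left = [i for i in ids if centers[i, dim] <= cut]
    right = [i for i in ids if centers[i, dim] > cut]
    return (dim, cut, build_tree(centers, left), build_tree(centers, right))

def assign(tree, x):
    while not isinstance(tree, (int, np.integer)):
        dim, cut, left, right = tree
        tree = left if x[dim] <= cut else right
    return int(tree)

rng = np.random.default_rng(4)
centers = 3 * rng.normal(size=(5, 3))                  # stand-in k-means centers
tree = build_tree(centers, list(range(5)))
points = rng.normal(size=(10, 3))
print([assign(tree, p) for p in points])
```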

Contextual Reserve Price Optimization in Auctions via Mixed Integer Programming

1 code implementation NeurIPS 2020 Joey Huchette, Haihao Lu, Hossein Esfandiari, Vahab Mirrokni

Moreover, we show that this MIP formulation is ideal (i.e., the strongest possible formulation) for the revenue function of a single impression.

Adaptivity in Adaptive Submodularity

no code implementations 9 Nov 2019 Hossein Esfandiari, Amin Karbasi, Vahab Mirrokni

We propose an efficient semi-adaptive policy that, with $O(\log n \times \log k)$ adaptive rounds of observations, can achieve an almost tight $1-1/e-\epsilon$ approximation guarantee with respect to an optimal policy that carries out $k$ actions in a fully sequential manner.

Active Learning · Decision Making +1

Locality-Sensitive Hashing for f-Divergences: Mutual Information Loss and Beyond

no code implementations NeurIPS 2019 Lin Chen, Hossein Esfandiari, Thomas Fu, Vahab S. Mirrokni

In this paper, we aim to develop LSH schemes for distance functions that measure the distance between two probability distributions, particularly for f-divergences as well as a generalization to capture mutual information loss.

Model Compression

Regret Bounds for Batched Bandits

no code implementations 11 Oct 2019 Hossein Esfandiari, Amin Karbasi, Abbas Mehrabian, Vahab Mirrokni

We present simple and efficient algorithms for the batched stochastic multi-armed bandit and batched stochastic linear bandit problems.

Multi-Armed Bandits
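
A common template behind batched bandit algorithms is successive elimination with equal allocation inside each batch, so a short sketch may help: arms are pulled the same number of times within a batch, and arms whose upper confidence bound falls below the leader's lower confidence bound are dropped between batches. The constants and the function name `batched_successive_elimination` below are illustrative, not the ones analyzed in the paper.

```python
# Minimal batched successive elimination for stochastic multi-armed bandits:
# equal allocation within each batch, elimination only between batches.
import numpy as np

def batched_successive_elimination(means, n_rounds=10000, n_batches=8, seed=0):
    rng = np.random.default_rng(seed)
    k = len(means)
    active = list(range(k))
    pulls = np.zeros(k)
    sums = np.zeros(k)
    total_reward = 0.0
    for _ in range(n_batches):
        m = (n_rounds // n_batches) // len(active)     # equal allocation in the batch
        for a in active:
            r = rng.normal(means[a], 1.0, size=m)
            pulls[a] += m
            sums[a] += r.sum()
            total_reward += r.sum()
        mu = sums[active] / pulls[active]
        width = np.sqrt(2 * np.log(n_rounds) / pulls[active])
        keep = mu + width >= (mu - width).max()        # drop clearly suboptimal arms
        active = [a for a, ok in zip(active, keep) if ok]
    return total_reward, active

print(batched_successive_elimination(np.array([0.1, 0.5, 0.45, 0.9])))
```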

Seeding with Costly Network Information

1 code implementation 10 May 2019 Dean Eckles, Hossein Esfandiari, Elchanan Mossel, M. Amin Rahimian

We study the task of selecting $k$ nodes, in a social network of size $n$, to seed a diffusion with maximum expected spread size, under the independent cascade model with cascade probability $p$.

Social and Information Networks · Computational Complexity · Probability · Physics and Society
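
The underlying objective is classical influence maximization: pick $k$ seeds to maximize the expected spread under independent cascades with probability $p$. The sketch below estimates spread by Monte Carlo simulation and selects seeds greedily, assuming the whole edge list is known; the paper's question of how much of the network must be queried is exactly what this toy version ignores. `simulate_ic` and `greedy_seeds` are hypothetical helper names.

```python
# Greedy seed selection under the independent cascade model, with spread
# estimated by Monte Carlo simulation over a known adjacency list.
import random

def simulate_ic(adj, seeds, p, rng):
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj.get(u, []):
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(active)

def greedy_seeds(adj, k, p=0.1, sims=200, seed=0):
    rng = random.Random(seed)
    nodes = set(adj) | {v for vs in adj.values() for v in vs}
    chosen = []
    for _ in range(k):
        best, best_spread = None, -1.0
        for u in nodes - set(chosen):
            spread = sum(simulate_ic(adj, chosen + [u], p, rng) for _ in range(sims)) / sims
            if spread > best_spread:
                best, best_spread = u, spread
        chosen.append(best)
    return chosen

adj = {0: [1, 2], 1: [2, 3], 2: [4], 3: [4, 5], 4: [5], 5: []}
print(greedy_seeds(adj, k=2))
```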

Categorical Feature Compression via Submodular Optimization

no code implementations 30 Apr 2019 Mohammadhossein Bateni, Lin Chen, Hossein Esfandiari, Thomas Fu, Vahab S. Mirrokni, Afshin Rostamizadeh

To achieve this, we introduce a novel re-parametrization of the mutual information objective, which we prove is submodular, and design a data structure to query the submodular function in amortized $O(\log n)$ time (where $n$ is the input vocabulary size).

Feature Compression
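
Stripped of the paper's fast data structure, the objective can be sketched as follows: keep a budget of category values in their own buckets, merge the rest into a single "other" bucket, and grow the kept set greedily by mutual-information gain with the label. The brute-force sketch below recomputes the mutual information from a count table at every step; names, parameters, and the synthetic data are illustrative, and this is not the amortized $O(\log n)$ construction of the paper.

```python
# Brute-force greedy vocabulary compression: keep `budget` categories in their
# own buckets, lump the rest into "other", and grow the kept set by MI gain.
import numpy as np

def mutual_info(joint):
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def compress_vocab(x, y, vocab_size, num_labels, budget):
    counts = np.zeros((vocab_size, num_labels))
    np.add.at(counts, (x, y), 1.0)
    kept = []
    for _ in range(budget):
        best, best_mi = None, -1.0
        for v in range(vocab_size):
            if v in kept:
                continue
            cand = kept + [v]
            other = counts[[u for u in range(vocab_size) if u not in cand]].sum(0)
            joint = np.vstack([counts[cand], other])     # kept buckets plus "other"
            mi = mutual_info(joint)
            if mi > best_mi:
                best, best_mi = v, mi
        kept.append(best)
    return kept

rng = np.random.default_rng(5)
x = rng.integers(20, size=5000)
y = ((x % 3 == 0).astype(int) ^ (rng.random(5000) < 0.1)).astype(int)
print(compress_vocab(x, y, vocab_size=20, num_labels=2, budget=5))
```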

Parallel and Streaming Algorithms for K-Core Decomposition

no code implementations ICML 2018 Hossein Esfandiari, Silvio Lattanzi, Vahab Mirrokni

The $k$-core decomposition is a fundamental primitive in many machine learning and data mining applications.
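
For context, the sequential baseline that parallel and streaming variants compete with is the classical peeling algorithm: repeatedly remove a minimum-degree vertex, and record as a vertex's core number the largest degree threshold in force when it is removed. Below is a minimal sketch of that standard baseline (the helper name is hypothetical), not the paper's parallel or streaming algorithms.

```python
# Standard sequential peeling for k-core decomposition on an undirected graph
# given as a symmetric adjacency dict. Stale heap entries are skipped lazily.
import heapq

def core_numbers(adj):
    degree = {u: len(vs) for u, vs in adj.items()}
    heap = [(d, u) for u, d in degree.items()]
    heapq.heapify(heap)
    removed, core, k = set(), {}, 0
    while heap:
        d, u = heapq.heappop(heap)
        if u in removed or d != degree[u]:
            continue                          # stale entry
        k = max(k, d)
        core[u] = k
        removed.add(u)
        for v in adj[u]:
            if v not in removed:
                degree[v] -= 1
                heapq.heappush(heap, (degree[v], v))
    return core

# A triangle with a pendant vertex: the triangle is the 2-core, the pendant has core number 1.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
print(core_numbers(adj))
```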

Bi-Objective Online Matching and Submodular Allocations

no code implementations NeurIPS 2016 Hossein Esfandiari, Nitish Korula, Vahab Mirrokni

In particular, in online advertising it is fairly common to optimize multiple metrics, such as clicks, conversions, and impressions, as well as other metrics which may be largely uncorrelated, such as ‘share of voice’ and ‘buyer surplus’.
