no code implementations • 27 Feb 2023 • Haim Kaplan, Yishay Mansour, Shay Moran, Kobbi Nissim, Uri Stemmer
In this work we introduce an interactive variant of joint differential privacy to handle online processes for which existing privacy definitions seem too restrictive.
no code implementations • 29 Mar 2021 • Kobbi Nissim, Chao Yan
We provide a lower bound on the sample complexity of distribution-free parity learning in the realizable case in the shuffle model of differential privacy.
no code implementations • 26 Jan 2021 • Haim Kaplan, Yishay Mansour, Kobbi Nissim, Uri Stemmer
We present a streaming problem for which every adversarially-robust streaming algorithm must use polynomial space, while there exists a classical (oblivious) streaming algorithm that uses only polylogarithmic space.
Data Structures and Algorithms
no code implementations • 28 Sep 2020 • Amos Beimel, Iftach Haitner, Kobbi Nissim, Uri Stemmer
Combining this primitive with the two-round semi-honest protocol of Applebaum et al. [TCC 2018], we obtain that every randomized functionality can be computed in the shuffle model with an honest majority, in merely two rounds.
no code implementations • 18 Dec 2019 • Amos Beimel, Aleksandra Korolova, Kobbi Nissim, Or Sheffet, Uri Stemmer
Motivated by the desire to bridge the utility gap between local and trusted curator models of differential privacy for practical applications, we initiate the theoretical study of a hybrid model introduced by "Blender" [Avent et al.,\ USENIX Security '17], in which differentially private protocols of n agents that work in the local model are assisted by a differentially private curator that has access to the data of m additional users.
1 code implementation • 20 Jun 2019 • Borja Balle, James Bell, Adria Gascon, Kobbi Nissim
In recent work, Cheu et al. (Eurocrypt 2019) proposed a protocol for $n$-party real summation in the shuffle model of differential privacy with $O_{\epsilon, \delta}(1)$ error and $\Theta(\epsilon\sqrt{n})$ one-bit messages per party.
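The one-bit-messages idea can be illustrated with a toy simulation: each party encodes its real value in $[0,1]$ as a batch of random bits, a shuffler permutes all bits so the analyzer cannot attribute any bit to a sender, and the analyzer debiases the total. This is only a sketch of the flavor of such protocols, not the actual Cheu et al. construction; the flip probability `rho` here is an arbitrary stand-in for noise that a real protocol would calibrate to $(\epsilon, \delta)$.

```python
import random

def party_messages(x, m, rho=0.05, rng=random):
    """Encode a value x in [0,1] as m one-bit messages.
    Each bit is Bernoulli(x), then flipped with probability rho
    (illustrative noise; a real protocol calibrates this to (eps, delta))."""
    bits = []
    for _ in range(m):
        b = 1 if rng.random() < x else 0
        if rng.random() < rho:   # randomized-response style flip
            b = 1 - b
        bits.append(b)
    return bits

def shuffle_and_estimate(all_bits, n, m, rho=0.05, rng=random):
    """The shuffler permutes the bits, hiding which party sent which;
    the analyzer sees only the multiset, so the total suffices."""
    rng.shuffle(all_bits)
    total = sum(all_bits)
    # E[total] = m * (n*rho + (1 - 2*rho) * sum_i x_i); invert to debias.
    return (total / m - n * rho) / (1 - 2 * rho)
```

With $n$ parties and $m$ bits per party, the estimate concentrates around the true sum, which is the sense in which many one-bit messages per party can yield constant-error real summation.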
1 code implementation • 7 Mar 2019 • Borja Balle, James Bell, Adria Gascon, Kobbi Nissim
Additionally, Erlingsson et al. (SODA 2019) provide a privacy amplification bound quantifying the level of curator differential privacy achieved by the shuffle model in terms of the local differential privacy of the randomizer used by each user.
no code implementations • 27 Feb 2019 • Amos Beimel, Shay Moran, Kobbi Nissim, Uri Stemmer
The building block for this learner is a differentially private algorithm for locating an approximate center point of $m>\mathrm{poly}(d, 2^{\log^*|X|})$ points -- a high dimensional generalization of the median function.
no code implementations • NeurIPS 2018 • Kobbi Nissim, Adam Smith, Thomas Steinke, Uri Stemmer, Jonathan Ullman
While statistics and machine learning offer numerous methods for ensuring generalization, these methods often fail in the presence of adaptivity: the common practice in which the choice of analysis depends on previous interactions with the same dataset.
1 code implementation • 5 Jun 2017 • Georgios Kellaris, George Kollios, Kobbi Nissim, Adam O'Neill
In this work we present a model for a differentially private outsourced database system and a concrete construction, $\mathcal{E}\text{psolute}$, that provably conceals the aforementioned leakages, while remaining efficient and scalable.
Cryptography and Security
no code implementations • 6 Mar 2017 • Kobbi Nissim, Uri Stemmer
We show that differential privacy can be used to prove concentration bounds for such functions in the non-adaptive setting.
no code implementations • 4 Jan 2017 • Shiva Prasad Kasiviswanathan, Kobbi Nissim, Hongxia Jin
Our first contribution is a generic transformation of private batch ERM mechanisms into private incremental ERM mechanisms, based on the simple idea of invoking the private batch ERM procedure at regular time intervals.
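The shape of such a transformation can be sketched as a thin wrapper: buffer arriving points and re-invoke the batch mechanism on the buffer every fixed number of arrivals. The interval schedule and the privacy accounting across re-invocations below are purely illustrative (the paper's analysis determines the actual parameters), and the noiseless mean used in the usage example is a stand-in for a genuinely private batch ERM mechanism.

```python
class IncrementalERM:
    """Wrap a private *batch* ERM mechanism so it handles a stream of
    points, by re-invoking the batch mechanism at regular intervals.
    Sketch of the generic batch-to-incremental idea; the interval and
    privacy budget split are illustrative, not the paper's parameters."""

    def __init__(self, batch_erm, interval=64):
        self.batch_erm = batch_erm   # callable: list of points -> model
        self.interval = interval     # re-train every `interval` arrivals
        self.buffer = []
        self.model = None

    def add(self, point):
        """Ingest one point; return the current model (None until the
        first batch invocation has happened)."""
        self.buffer.append(point)
        if len(self.buffer) % self.interval == 0:
            self.model = self.batch_erm(self.buffer)
        return self.model
```

Between invocations the wrapper simply serves the last trained model, which is what keeps the per-point cost low compared to retraining on every arrival.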
3 code implementations • 14 Sep 2016 • Marco Gaboardi, James Honaker, Gary King, Jack Murtagh, Kobbi Nissim, Jonathan Ullman, Salil Vadhan
We provide an overview of PSI ("a Private data Sharing Interface"), a system we are developing to enable researchers in the social sciences and other fields to share and explore privacy-sensitive datasets with the strong privacy protections of differential privacy.
Cryptography and Security • Computers and Society • Methodology
no code implementations • 19 Apr 2016 • Kobbi Nissim, Uri Stemmer, Salil Vadhan
We present a new algorithm for locating a small cluster of points with differential privacy [Dwork, McSherry, Nissim, and Smith, 2006].
no code implementations • 24 Feb 2016 • Rachel Cummings, Katrina Ligett, Kobbi Nissim, Aaron Roth, Zhiwei Steven Wu
We also show that perfect generalization is a strictly stronger guarantee than differential privacy, but that, nevertheless, many learning tasks can be carried out subject to the guarantees of perfect generalization.
no code implementations • 27 Nov 2015 • Mark Bun, Kobbi Nissim, Uri Stemmer
We investigate the direct-sum problem in the context of differentially private PAC learning: What is the sample complexity of solving $k$ learning tasks simultaneously under differential privacy, and how does this cost compare to that of solving $k$ learning tasks without privacy?
no code implementations • 8 Nov 2015 • Raef Bassily, Kobbi Nissim, Adam Smith, Thomas Steinke, Uri Stemmer, Jonathan Ullman
Specifically, suppose there is an unknown distribution $\mathbf{P}$ and a set $\mathbf{x}$ of $n$ independent samples is drawn from $\mathbf{P}$.
no code implementations • 28 Apr 2015 • Mark Bun, Kobbi Nissim, Uri Stemmer, Salil Vadhan
Our sample complexity upper and lower bounds also apply to the tasks of learning distributions with respect to Kolmogorov distance and of properly PAC learning thresholds with differential privacy.
no code implementations • 22 Apr 2015 • Kobbi Nissim, Uri Stemmer
Very recently, Bassily et al. presented an improved bound and showed that (a variant of) the private multiplicative weights algorithm can answer $k$ adaptively chosen statistical queries using sample complexity that grows logarithmically in $k$.
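The private multiplicative weights framework the snippet refers to (due to Hardt and Rothblum) can be sketched as follows: maintain a synthetic distribution over the data domain, answer each query from the synthetic data when it is already accurate, and otherwise pay a noisy query to the true data and use it for a multiplicative-weights update. The noise scale, threshold, and learning rate below are illustrative and not calibrated to a formal $(\epsilon, \delta)$ guarantee.

```python
import math
import random

def private_mw(data_hist, queries, eps_per_round=0.5,
               threshold=0.1, eta=0.5, rng=random):
    """Toy sketch of the private multiplicative weights framework.
    data_hist: counts over a finite domain; queries: 0/1 vectors.
    Parameters are illustrative, not privacy-calibrated."""
    X = len(data_hist)
    n = sum(data_hist)
    true = [c / n for c in data_hist]   # true data distribution
    p = [1.0 / X] * X                   # synthetic distribution (uniform start)
    answers = []
    for q in queries:
        est = sum(p[i] * q[i] for i in range(X))
        # Noisy answer on the true data (Laplace noise, inverse-CDF sample).
        u = rng.random() - 0.5
        noise = -math.copysign(1.0, u) * (1 / (eps_per_round * n)) \
                * math.log(1 - 2 * abs(u))
        noisy = sum(true[i] * q[i] for i in range(X)) + noise
        if abs(est - noisy) <= threshold:
            answers.append(est)         # synthetic data already accurate
        else:
            sign = 1.0 if noisy > est else -1.0
            # Multiplicative-weights update toward the noisy answer.
            p = [p[i] * math.exp(eta * sign * q[i]) for i in range(X)]
            s = sum(p)
            p = [v / s for v in p]
            answers.append(noisy)
    return answers
```

Because an update only happens when the synthetic data is badly wrong, the number of "expensive" rounds is bounded, which is the structural reason the sample complexity can grow only logarithmically in the number of queries $k$.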
no code implementations • 10 Jul 2014 • Amos Beimel, Kobbi Nissim, Uri Stemmer
We show that the sample complexity of these tasks under approximate differential privacy can be significantly lower than that under pure differential privacy.
no code implementations • 10 Jul 2014 • Amos Beimel, Kobbi Nissim, Uri Stemmer
Kasiviswanathan et al. (FOCS 2008) gave a generic construction of private learners, in which the sample complexity is (generally) higher than what is needed for non-private learners.
no code implementations • 10 Feb 2014 • Amos Beimel, Kobbi Nissim, Uri Stemmer
Kasiviswanathan et al. gave a generic construction of private learners for (finite) concept classes, with sample complexity logarithmic in the size of the concept class.
no code implementations • 6 Mar 2008 • Shiva Prasad Kasiviswanathan, Homin K. Lee, Kobbi Nissim, Sofya Raskhodnikova, Adam Smith
Therefore, almost anything learnable is learnable privately: specifically, if a concept class is learnable by a (non-private) algorithm with polynomial sample complexity and output size, then it can be learned privately using a polynomial number of samples.