no code implementations • NeurIPS 2021 • Sofya Raskhodnikova, Satchit Sivakumar, Adam Smith, Marika Swanberg
We demonstrate that, in some parameter regimes, private sampling requires asymptotically fewer observations than learning a description of $P$ nonprivately; in other regimes, however, private sampling proves to be as difficult as private learning.
no code implementations • 10 Aug 2012 • Sofya Raskhodnikova, Grigory Yaroslavtsev
We show that an analog of Håstad's switching lemma holds for pseudo-Boolean k-DNFs if all constants associated with the terms of the formula are bounded.
no code implementations • 6 Mar 2008 • Shiva Prasad Kasiviswanathan, Homin K. Lee, Kobbi Nissim, Sofya Raskhodnikova, Adam Smith
Therefore, almost anything learnable is learnable privately: specifically, if a concept class is learnable by a (non-private) algorithm with polynomial sample complexity and output size, then it can be learned privately using a polynomial number of samples.
1 code implementation • 8 Jun 2007 • Sofya Raskhodnikova, Dana Ron, Ronitt Rubinfeld, Adam Smith
We raise the question of approximating the compressibility of a string with respect to a fixed compression scheme, in sublinear time.
Data Structures and Algorithms
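As a hedged illustration of what sublinear-time compressibility estimation can look like (this is our own example for run-length encoding, not necessarily the algorithm from the paper above): the number of runs in a string, which determines its run-length-encoded size, can be estimated by sampling a few random positions and counting how often a position starts a new run, reading only O(num_samples) characters rather than the whole string.

```python
import random

def estimate_num_runs(s, num_samples=2000, seed=0):
    """Estimate the number of runs (maximal blocks of equal characters)
    in s by sampling positions and checking for run boundaries.

    Sublinear: reads only O(num_samples) characters of s.
    Hypothetical illustration -- not the paper's actual algorithm.
    """
    n = len(s)
    if n <= 1:
        return n  # empty string has 0 runs, single char has 1
    rng = random.Random(seed)
    boundaries = 0
    for _ in range(num_samples):
        i = rng.randrange(1, n)      # sample a position in 1..n-1
        if s[i] != s[i - 1]:         # a boundary between two runs
            boundaries += 1
    # #runs = 1 + #boundaries; scale the sampled fraction to all n-1 positions
    return 1 + round((n - 1) * boundaries / num_samples)
```

With O(1/eps^2) samples this standard sampling argument gives an estimate accurate to within an additive eps*n with high probability, independent of the string length.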