no code implementations • 23 Aug 2023 • Pascale Gourdeau
We then focus on learning problems in which the input distribution satisfies a Lipschitz condition, and show that robustly learning monotone conjunctions requires sample complexity at least exponential in the adversary's budget (the maximum number of bits it may perturb in each input).
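For intuition about the budget notion, the minimal perturbation needed to change a monotone conjunction's output has a simple closed form. Below is a minimal sketch, not the paper's construction; the function names `min_flips_to_change` and `is_robust_at` are illustrative.

```python
from typing import FrozenSet, Sequence

def min_flips_to_change(relevant: FrozenSet[int], x: Sequence[int]) -> int:
    """Exact number of bit flips an adversary needs to change the prediction
    of the monotone conjunction c(x) = AND_{i in relevant} x_i at point x.
    Assumes `relevant` is non-empty.

    If c(x) = 1 (every relevant bit is 1), flipping any single relevant bit
    falsifies the conjunction, so one flip suffices.  If c(x) = 0, the
    adversary must set every relevant 0-bit to 1.
    """
    zeros = sum(1 for i in relevant if x[i] == 0)
    return 1 if zeros == 0 else zeros

def is_robust_at(relevant: FrozenSet[int], x: Sequence[int], budget: int) -> bool:
    """True iff no perturbation of at most `budget` bits changes c's prediction."""
    return min_flips_to_change(relevant, x) > budget
```

Note that a point satisfying the conjunction is never robust against a budget of even one bit, which is one source of the asymmetry exploited in lower-bound arguments.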
no code implementations • 12 Oct 2022 • Pascale Gourdeau, Varun Kanade, Marta Kwiatkowska, James Worrell
We conclude by giving robust learning algorithms for halfspaces on $\{0, 1\}^n$ and then deriving robustness guarantees for halfspaces in $\mathbb{R}^n$ against precision-bounded adversaries.
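On the Boolean cube, the minimal adversarial perturbation of a halfspace can be computed exactly by a greedy argument: each bit flip changes the score by an independent additive amount, so taking the largest helpful changes first is optimal. A minimal sketch under assumed conventions (prediction 1 iff the score is at least the threshold; the function name is illustrative):

```python
from typing import Sequence

def min_flips_halfspace(w: Sequence[float], theta: float, x: Sequence[int]) -> int:
    """Exact minimum number of bit flips needed to change the prediction of
    the halfspace h(x) = 1 iff sum_i w[i]*x[i] >= theta, on {0,1}^n.

    Flipping bit i changes the score by -w[i] (if x[i] == 1) or +w[i]
    (if x[i] == 0); contributions are additive and independent, so a
    greedy choice of the largest helpful magnitudes is optimal.
    Returns len(x) + 1 if the prediction cannot be changed at all.
    """
    score = sum(wi * xi for wi, xi in zip(w, x))
    pred = score >= theta
    # Collect score changes that move toward the decision boundary.
    deltas = []
    for wi, xi in zip(w, x):
        delta = -wi if xi == 1 else wi  # score change from flipping this bit
        if (pred and delta < 0) or (not pred and delta > 0):
            deltas.append(abs(delta))
    deltas.sort(reverse=True)
    flips = 0
    for d in deltas:
        if (pred and score < theta) or (not pred and score >= theta):
            return flips
        score += -d if pred else d
        flips += 1
    if (pred and score < theta) or (not pred and score >= theta):
        return flips
    return len(x) + 1  # no perturbation changes the prediction
```

For example, with $w = (1, 1)$ and $\theta = 1.5$, the point $(1, 1)$ can be misclassified with one flip, while $(0, 0)$ requires two.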
no code implementations • 12 May 2022 • Pascale Gourdeau, Varun Kanade, Marta Kwiatkowska, James Worrell
A fundamental problem in adversarial machine learning is to quantify how much training data is needed in the presence of evasion attacks.
no code implementations • NeurIPS 2019 • Pascale Gourdeau, Varun Kanade, Marta Kwiatkowska, James Worrell
However, if the adversary is restricted to perturbing $O(\log n)$ bits, then the class of monotone conjunctions can be robustly learned with respect to a general class of distributions (which includes the uniform distribution).
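The quantity underlying such distribution-specific guarantees is the robust risk: the probability mass of points where a bounded adversary can force an error. Under the uniform distribution this is easy to estimate by sampling; the sketch below (names and parameters are illustrative assumptions) estimates, for a monotone conjunction over the first $l$ of $n$ variables, the fraction of points vulnerable to a budget-$k$ adversary.

```python
import random

def mc_vulnerable_fraction(n: int, l: int, k: int,
                           trials: int = 100_000, seed: int = 0) -> float:
    """Monte Carlo estimate, under the uniform distribution on {0,1}^n, of
    the probability that an adversary flipping at most k bits can change the
    output of a monotone conjunction over the first l variables (l >= 1).

    A point is vulnerable iff the flips needed to change the output are at
    most k: one flip when the conjunction is satisfied, otherwise one flip
    per relevant 0-bit.  Only the l relevant coordinates matter, so we
    sample those directly.
    """
    rng = random.Random(seed)
    vulnerable = 0
    for _ in range(trials):
        zeros = sum(1 for _ in range(l) if rng.random() < 0.5)  # relevant 0-bits
        needed = 1 if zeros == 0 else zeros
        if needed <= k:
            vulnerable += 1
    return vulnerable / trials
```

With $k \geq l$ every point is vulnerable, while with $k = 0$ none is; between these extremes the estimate illustrates how the vulnerable mass grows with the budget.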