Search Results for author: Jeremy M Cohen

Found 1 paper, 1 paper with code

Certified Adversarial Robustness via Randomized Smoothing

10 code implementations · 8 Feb 2019 · Jeremy M Cohen, Elan Rosenfeld, J. Zico Kolter

We show how to turn any classifier that classifies well under Gaussian noise into a new classifier that is certifiably robust to adversarial perturbations under the $\ell_2$ norm.

Adversarial Defense · Adversarial Robustness · +1
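
A minimal sketch of the randomized smoothing idea described in the abstract: classify many Gaussian-noised copies of the input and take the majority vote. Names such as `smoothed_predict`, `base_classifier`, and the chosen `sigma`/`n_samples` values are illustrative assumptions; the paper's actual implementation (including the Monte Carlo certification procedure) is in the linked code repositories.

```python
import numpy as np
import torch

def smoothed_predict(base_classifier, x, sigma=0.25, n_samples=100, num_classes=10):
    """Sketch of a smoothed classifier: g(x) = argmax_c P(f(x + noise) = c).

    base_classifier: any model mapping an input tensor to class logits (assumed given).
    sigma: standard deviation of the isotropic Gaussian noise.
    """
    counts = np.zeros(num_classes, dtype=int)
    with torch.no_grad():
        for _ in range(n_samples):
            noise = torch.randn_like(x) * sigma          # sample noise ~ N(0, sigma^2 I)
            pred = base_classifier(x + noise).argmax(dim=-1).item()
            counts[pred] += 1                             # tally the base classifier's vote
    return int(counts.argmax())                           # majority class over noisy copies
```

The certified $\ell_2$ radius in the paper additionally requires confidence bounds on these vote counts rather than a plain argmax; this sketch shows only the prediction step.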
