On Evaluating Adversarial Robustness

18 Feb 2019 · Nicholas Carlini, Anish Athalye, Nicolas Papernot, Wieland Brendel, Jonas Rauber, Dimitris Tsipras, Ian Goodfellow, Aleksander Madry, Alexey Kurakin

Correctly evaluating defenses against adversarial examples has proven to be extremely difficult. Despite the significant amount of recent work attempting to design defenses that withstand adaptive attacks, few have succeeded; most papers that propose defenses are quickly shown to be incorrect.
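For context on what is being evaluated: an adversarial example is an input perturbed slightly so that a model misclassifies it. A minimal sketch of the classic single-step FGSM attack (Goodfellow et al., 2015) — standard background, not this paper's contribution — on a toy NumPy logistic-regression model (the model, weights, and epsilon below are all illustrative assumptions):

```python
import numpy as np

def fgsm(x, y, w, b, eps):
    """Fast Gradient Sign Method against a logistic-regression model
    p = sigmoid(w.x + b). For cross-entropy loss, the input gradient is
    (p - y) * w, so the attack steps eps in the sign of that gradient,
    maximizing the loss within an L-infinity ball of radius eps."""
    p = 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))  # predicted P(class 1)
    grad_x = (p - y) * w                           # d(loss)/dx
    return x + eps * np.sign(grad_x)               # one-step L-inf attack

# Toy model and a clean input that it classifies correctly as class 1.
w = np.array([1.0, -2.0])
b = 0.0
x = np.array([2.0, 0.5])           # w.x + b = 1.0 > 0 -> class 1
x_adv = fgsm(x, y=1.0, w=w, b=b, eps=0.6)

print(np.dot(w, x) + b)            # positive: clean input correct
print(np.dot(w, x_adv) + b)        # negative: perturbed input misclassified
```

An evaluation of a defense asks whether such attacks, adapted to the defense, still flip predictions; the paper's point is that this adaptive step is where most evaluations go wrong.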

