no code implementations • 7 Dec 2017 • Jiajun Lu, Hussein Sibai, Evan Fabry
An adversarial example is an input that has been deliberately perturbed so that a system assigns it the wrong label at test time.
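This definition can be illustrated with a minimal fast-gradient-sign (FGSM-style) sketch; the two-class linear "classifier" and its weights below are hypothetical stand-ins, not the paper's model:

```python
import numpy as np

# Hypothetical linear classifier: row i gives the score for class i.
W = np.array([[1.0, -1.0],   # class 0 scores
              [-1.0, 1.0]])  # class 1 scores

def predict(x):
    return int(np.argmax(W @ x))

def fgsm(x, true_label, eps=0.3):
    """Perturb x in the sign of the loss gradient (margin loss here)."""
    other = 1 - true_label
    grad = W[other] - W[true_label]   # d(score_other - score_true)/dx
    return x + eps * np.sign(grad)

x = np.array([0.6, 0.4])          # classified as class 0
x_adv = fgsm(x, true_label=0)     # small perturbation flips the label
```

Running `predict` on `x` and `x_adv` shows the perturbed copy receiving the wrong label even though it differs from the original by at most `eps` per coordinate.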
no code implementations • 9 Oct 2017 • Jiajun Lu, Hussein Sibai, Evan Fabry, David Forsyth
Finally, an adversarial pattern on a physical object that could fool a detector would have to remain adversarial under a wide family of parametric distortions (scale, viewing angle, box shift inside the detector, illumination, and so on).
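One way to make this requirement concrete is a Monte Carlo check of how often a pattern survives random draws from such a distortion family. The patch, the threshold "detector", and the distortion ranges below are all toy assumptions for illustration, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: an 8x8 "adversarial patch" and a threshold detector.
patch = np.full((8, 8), 0.8)

def detector_fires(img):
    # Hypothetical detector: responds when mean intensity exceeds 0.5.
    return img.mean() > 0.5

def random_distortion(img):
    # Sample one member of a parametric distortion family:
    # illumination gain, box shift inside the detector, sensor noise.
    gain = rng.uniform(0.6, 1.4)           # illumination change
    dy, dx = rng.integers(-2, 3, size=2)   # spatial shift in pixels
    out = np.roll(img * gain, (dy, dx), axis=(0, 1))
    return out + rng.normal(0.0, 0.02, img.shape)

# Fraction of sampled distortions under which the patch still works.
survival = np.mean([detector_fires(random_distortion(patch))
                    for _ in range(200)])
```

A pattern that is only adversarial for one exact pose and lighting scores near zero here; a physically usable one needs a survival rate close to one across the whole family.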
no code implementations • 12 Jul 2017 • Jiajun Lu, Hussein Sibai, Evan Fabry, David Forsyth
Instead, a trained neural network correctly classifies most pictures of a perturbed image when they are taken from different distances and angles.