Search Results for author: Evan Fabry

Found 3 papers, 0 papers with code

Adversarial Examples that Fool Detectors

no code implementations · 7 Dec 2017 · Jiajun Lu, Hussein Sibai, Evan Fabry

An adversarial example is an example that has been adjusted to produce a wrong label when presented to a system at test time.
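The adjustment described above is usually a small, deliberately crafted perturbation. As a hedged illustration (not the method of the listed papers), the Fast Gradient Sign Method can be sketched on a hypothetical toy linear scorer, where the gradient of the score with respect to the input is just the weight vector:

```python
import numpy as np

# Toy stand-in for a trained model: score(x) = w . x, so the gradient of the
# score with respect to the input x is simply w. All names here (w, x, eps)
# are illustrative assumptions, not from the papers above.
rng = np.random.default_rng(0)
w = rng.normal(size=16)   # toy "model" weights
x = rng.normal(size=16)   # clean input
eps = 0.1                 # L-infinity perturbation budget

# FGSM-style step: nudge each input coordinate by +/- eps in the direction
# that raises the (wrong) score, keeping the change small per coordinate.
x_adv = x + eps * np.sign(w)

print(float(np.max(np.abs(x_adv - x))))  # perturbation size, bounded by eps
print(bool(w @ x_adv > w @ x))           # the score did increase
```

The same one-step idea carries over to deep networks by replacing `w` with the gradient of the loss with respect to the input.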

Standard detectors aren't (currently) fooled by physical adversarial stop signs

no code implementations · 9 Oct 2017 · Jiajun Lu, Hussein Sibai, Evan Fabry, David Forsyth

Finally, an adversarial pattern on a physical object that could fool a detector would have to be adversarial in the face of a wide family of parametric distortions (scale; view angle; box shift inside the detector; illumination; and so on).
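A hypothetical sketch of sampling from such a parametric distortion family (scale, box shift, illumination) is shown below; in a real evaluation each distorted view would be fed to the object detector, and the pattern would only count as adversarial if it fooled the detector across the whole family. The function names and parameter ranges are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
patch = rng.random((8, 8))  # stand-in for an adversarial pattern on an object

def distort(img, rng):
    """Sample one random member of a toy distortion family (assumed ranges)."""
    # Illumination: multiplicative brightness change, clipped to valid range.
    img = np.clip(img * rng.uniform(0.5, 1.5), 0.0, 1.0)
    # Box shift: translate the pattern inside the frame.
    dy, dx = rng.integers(-2, 3, size=2)
    img = np.roll(img, (dy, dx), axis=(0, 1))
    # Scale: nearest-neighbour upsample by an integer factor, cropped back.
    k = int(rng.integers(1, 3))
    img = np.kron(img, np.ones((k, k)))[:8, :8]
    return img

# A robustness test would check the detector on many such sampled views.
views = [distort(patch, rng) for _ in range(5)]
print(all(v.shape == (8, 8) for v in views))
```

Averaging an attack objective over draws from a family like this is the usual way to make a physical pattern robust to viewing conditions.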

Adversarial Attack

NO Need to Worry about Adversarial Examples in Object Detection in Autonomous Vehicles

no code implementations · 12 Jul 2017 · Jiajun Lu, Hussein Sibai, Evan Fabry, David Forsyth

Instead, a trained neural network classifies most of the pictures taken from different distances and angles of a perturbed image correctly.

Autonomous Vehicles · object-detection +1
