Generating Realistic Physical Adversarial Examples by Patch Transformer Network

29 Sep 2021  ·  Quanfu Fan, Kaidi Xu, Chun-Fu Chen, Sijia Liu, Gaoyuan Zhang, David Daniel Cox, Xue Lin

Physical adversarial attacks apply carefully crafted perturbations to real objects to maliciously alter the predictions of object classifiers or detectors. The current standard method for designing physical adversarial patches, Expectation over Transformations (EoT), simulates real-world environments with random physical transformations, yielding adversarial examples that are far from satisfactory. To tackle this issue, we propose and develop a novel network that learns real-world physical transformations from data, including geometric transformation, printer color transformation, and illumination adaptation. Our approach produces realistic-looking adversarial examples and can be integrated into existing attack-generation frameworks to generate adversarial patches effectively. We apply our approach to design adversarial T-shirts worn by moving people, one of the most challenging settings for physical attacks. Experiments show that our approach significantly outperforms the state of the art when attacking DL-based object detectors in real life. Moreover, we build a first-of-its-kind adversarial T-shirt dataset to enable effective training of our approach and to facilitate fair comparison of physical-world attacks under a standard patch size, environmental changes, and object variances. Our code will be made publicly available.
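The EoT baseline the paper improves on can be sketched in a few lines: optimize the patch so that its expected attack objective over randomly sampled physical transformations (here, toy contrast, brightness, and translation) increases. This is a minimal illustration with a toy linear scorer standing in for the detector, not the paper's method or its learned transformer network; all names and parameters below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 16                        # flattened patch size (toy)
w = rng.normal(size=D)        # toy linear "detector" score weights (stand-in)

def transform(p, rng):
    """Sample one random physical transformation: contrast scaling,
    brightness shift, and a circular shift standing in for translation."""
    a = rng.uniform(0.7, 1.3)           # contrast
    b = rng.uniform(-0.1, 0.1)          # brightness
    k = int(rng.integers(0, D))         # translation (circular shift)
    return np.roll(a * p + b, k), (a, k)

def eot_step(p, rng, n=64, lr=0.05):
    """One EoT ascent step: average the analytic gradient of the attack
    objective w . t(p) over n sampled transformations t."""
    g = np.zeros(D)
    for _ in range(n):
        _, (a, k) = transform(p, rng)
        # d/dp [ w . roll(a*p + b, k) ] = a * roll(w, -k)
        g += a * np.roll(w, -k)
    p = p + lr * g / n
    return np.clip(p, 0.0, 1.0)         # keep pixel values printable

p = np.full(D, 0.5)                     # start from a gray patch
for _ in range(100):
    p = eot_step(p, rng)
```

Because the expectation is only over *random* transformations, EoT cannot capture systematic real-world effects such as printer color distortion, which is the gap the paper's learned transformer network targets.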


