Search Results for author: Chengyin Hu

Found 12 papers, 1 paper with code

Adversarial Infrared Curves: An Attack on Infrared Pedestrian Detectors in the Physical World

no code implementations21 Dec 2023 Chengyin Hu, Weiwen Shi

Using Particle Swarm Optimization, we optimize two Bezier curves and employ cold patches in the physical realm to introduce perturbations, creating infrared curve patterns for physical sample generation.
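The abstract describes optimizing Bezier control points with Particle Swarm Optimization. Below is a minimal, self-contained sketch of that idea: a quadratic Bezier evaluator plus a plain PSO loop over flattened control-point coordinates. The fitness function, hyperparameters (inertia 0.7, acceleration 1.5), and all function names here are illustrative assumptions; in the paper the fitness would score the infrared detector's output on the perturbed image, which is not reproduced here.

```python
import random

def bezier(p0, p1, p2, t):
    # Point on a quadratic Bezier curve at parameter t, given three control points.
    x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
    y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
    return x, y

def curve_points(ctrl, n=20):
    # Sample n points along one curve; ctrl is a flat list of 6 coordinates.
    p0, p1, p2 = (ctrl[0], ctrl[1]), (ctrl[2], ctrl[3]), (ctrl[4], ctrl[5])
    return [bezier(p0, p1, p2, i / (n - 1)) for i in range(n)]

def pso_optimize(fitness, dim=6, swarm=20, iters=50, lo=0.0, hi=63.0, seed=0):
    # Plain global-best PSO; each particle is one candidate set of control points.
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f
```

For a physical attack, the optimized curve would then be realized with cold patches traced along `curve_points(gbest)`; two such curves are optimized jointly in the paper.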

Adversarial Defense Neural Network Security

Two-stage optimized unified adversarial patch for attacking visible-infrared cross-modal detectors in the physical world

no code implementations4 Dec 2023 Chengyin Hu, Weiwen Shi

The TOUAP employs a two-stage optimization process: firstly, PSO optimizes an irregular polygonal infrared patch to attack the infrared detector; secondly, the color QR code is optimized, and the shape information of the infrared patch from the first stage is used as a mask.
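The key coupling between the two stages is that the stage-1 infrared patch's shape acts as a mask for the stage-2 color QR code. A minimal sketch of that mask-transfer step, assuming images and patches are nested lists of pixel values and using a hypothetical function name:

```python
def apply_masked_patch(image, qr_patch, mask, top, left):
    # Paste qr_patch onto image only where the stage-1 mask is set, so the
    # visible-band color patch inherits the optimized infrared-patch shape.
    out = [row[:] for row in image]  # leave the input image untouched
    for i, mrow in enumerate(mask):
        for j, m in enumerate(mrow):
            if m:
                out[top + i][left + j] = qr_patch[i][j]
    return out
```

In the actual pipeline the QR code's colors would themselves be optimized against the visible-band detector; this sketch only shows how the stage-1 shape constrains the stage-2 patch.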

Impact of Light and Shadow on Robustness of Deep Neural Networks

no code implementations23 May 2023 Chengyin Hu, Weiwen Shi, Chao Li, Jialiang Sun, Donghua Wang, Junqi Wu, Guijian Tang

Deep neural networks (DNNs) have made remarkable strides in various computer vision tasks, including image classification, segmentation, and object detection.

Image Classification Object Detection +1

Adversarial Infrared Blocks: A Multi-view Black-box Attack to Thermal Infrared Detectors in Physical World

no code implementations21 Apr 2023 Chengyin Hu, Weiwen Shi, Tingsong Jiang, Wen Yao, Ling Tian, Xiaoqian Chen

Infrared imaging systems have a vast array of potential applications in pedestrian detection and autonomous driving, and their safety performance is of great concern.

Autonomous Driving Pedestrian Detection

Adversarial Catoptric Light: An Effective, Stealthy and Robust Physical-World Attack to DNNs

no code implementations19 Sep 2022 Chengyin Hu, Weiwen Shi

We evaluate the proposed method in three aspects: effectiveness, stealthiness, and robustness.

Adversarial Color Projection: A Projector-based Physical Attack to DNNs

no code implementations19 Sep 2022 Chengyin Hu, Weiwen Shi, Ling Tian

In the digital environment, we achieve an attack success rate of 97.60% on a subset of ImageNet, while in the physical environment, we attain an attack success rate of 100% in the indoor test and 82.14% in the outdoor test.

Adversarial Attack

Impact of Scaled Image on Robustness of Deep Neural Networks

no code implementations2 Sep 2022 Chengyin Hu, Weiwen Shi

The accuracy of these networks is strongly influenced by the data distribution of their training dataset.

Adversarial Attack Image Classification +2

Adversarial Color Film: Effective Physical-World Attack to DNNs

no code implementations2 Sep 2022 Chengyin Hu, Weiwen Shi

It is well known that the performance of deep neural networks (DNNs) is susceptible to subtle interference.

Impact of Colour Variation on Robustness of Deep Neural Networks

no code implementations2 Sep 2022 Chengyin Hu, Weiwen Shi

Recent advances, however, have shown their vulnerability to manually crafted digital perturbations of the input data, namely adversarial attacks.

Image Classification Object Detection +1

Adversarial Zoom Lens: A Novel Physical-World Attack to DNNs

no code implementations23 Jun 2022 Chengyin Hu, Weiwen Shi

In a digital environment, we construct a dataset based on AdvZL to verify the adversarial effect of equal-scale enlarged images on DNNs.
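Equal-scale enlargement can be read as cropping the central region of an image and resizing it back to the original resolution, which is what a zoom lens does optically. A minimal pure-Python sketch under that assumption, using nearest-neighbour resampling on a nested-list image (the function name and interface are illustrative, not the paper's):

```python
def zoom(image, factor):
    # Equal-scale enlargement: crop the central 1/factor region and
    # resize it back to the original size via nearest-neighbour sampling.
    h, w = len(image), len(image[0])
    ch, cw = max(1, round(h / factor)), max(1, round(w / factor))
    top, left = (h - ch) // 2, (w - cw) // 2
    return [[image[top + int(i * ch / h)][left + int(j * cw / w)]
             for j in range(w)] for i in range(h)]
```

A dataset in the spirit of AdvZL would then pair each clean image with `zoom(image, f)` for a range of zoom factors `f` and measure how the DNN's predictions change.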

Adversarial Attack Autonomous Driving

Adversarial Neon Beam: A Light-based Physical Attack to DNNs

no code implementations2 Apr 2022 Chengyin Hu, Weiwen Shi, Wen Li

Moreover, we validate the robustness of our approach by successfully attacking advanced DNNs with a success rate of over 75% in all cases.

Adversarial Attack
