no code implementations • 1 Jul 2024 • Kalibinuer Tiliwalidi, Chengyin Hu, Weiwen Shi
These perturbations are cyclically applied to various parts of a pedestrian's clothing to facilitate multi-view black-box physical attacks on infrared pedestrian detectors.
no code implementations • 21 Dec 2023 • Chengyin Hu, Weiwen Shi
Using Particle Swarm Optimization, we optimize two Bézier curves and, in the physical world, employ cold patches to introduce perturbations, creating infrared curve patterns for physical sample generation.
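The optimization step above can be sketched as a basic particle swarm searching over Bézier control points. This is a minimal illustration, not the paper's implementation: `detector_confidence` is a hypothetical stand-in for the black-box infrared detector's score, and the swarm hyperparameters are common textbook defaults.

```python
import numpy as np

def bezier(control_points, n_samples=50):
    """Evaluate a cubic Bezier curve from four (x, y) control points."""
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    p0, p1, p2, p3 = control_points
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

def pso_optimize(score_fn, dim, n_particles=20, iters=30,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize score_fn over [0, 1]^dim with a basic particle swarm."""
    rng = np.random.default_rng(seed)
    pos = rng.random((n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([score_fn(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        vals = np.array([score_fn(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Hypothetical black-box objective: in the real attack this would be the
# detector's confidence for the pedestrian with the curve pattern applied;
# here a toy score stands in so the sketch is runnable.
def detector_confidence(flat_params):
    curve = bezier(flat_params.reshape(4, 2))
    return float(np.abs(curve - 0.5).mean())

best_params, best_score = pso_optimize(detector_confidence, dim=8)
```

In the physical attack, evaluating `score_fn` would mean rendering the optimized curve as a cold-patch pattern and querying the detector, which is why a query-efficient black-box optimizer like PSO is the natural choice over gradient methods.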
no code implementations • 4 Dec 2023 • Chengyin Hu, Weiwen Shi
The TOUAP employs a two-stage optimization process: first, PSO optimizes an irregular polygonal infrared patch to attack the infrared detector; second, the color QR code is optimized, using the shape of the stage-one infrared patch as a mask.
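The stage-two masking step can be illustrated with a small composition routine. This is a hedged sketch under assumed shapes, not the paper's code: `composite_patch`, the random "pattern" standing in for the optimized QR code, and the rectangular stand-in for the stage-one polygon mask are all hypothetical.

```python
import numpy as np

def composite_patch(image, pattern, mask):
    """Paste the optimized color pattern (e.g. a QR code) onto the image
    only where the stage-one infrared patch mask is set; pixels outside
    the mask are left untouched."""
    out = image.copy()
    out[mask] = pattern[mask]
    return out

rng = np.random.default_rng(0)
image = rng.random((8, 8, 3))     # stand-in for the input scene
pattern = rng.random((8, 8, 3))   # stand-in for the optimized QR pattern
mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:6] = True             # stand-in for the irregular stage-one polygon
adv = composite_patch(image, pattern, mask)
```

Reusing the stage-one shape as the mask keeps the visible (color) and infrared perturbations spatially aligned, so a single physical patch can attack both modalities.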
no code implementations • 23 May 2023 • Chengyin Hu, Weiwen Shi, Chao Li, Jialiang Sun, Donghua Wang, Junqi Wu, Guijian Tang
Deep neural networks (DNNs) have made remarkable strides in various computer vision tasks, including image classification, segmentation, and object detection.
no code implementations • 21 Apr 2023 • Chengyin Hu, Weiwen Shi, Tingsong Jiang, Wen Yao, Ling Tian, Xiaoqian Chen
Infrared imaging systems have a vast array of potential applications in pedestrian detection and autonomous driving, and their safety performance is of great concern.
no code implementations • 19 Sep 2022 • Chengyin Hu, Weiwen Shi
We evaluate the proposed method in three aspects: effectiveness, stealthiness, and robustness.
no code implementations • 19 Sep 2022 • Chengyin Hu, Weiwen Shi, Ling Tian
In the digital environment, we achieve an attack success rate of 97.60% on a subset of ImageNet, while in the physical environment, we attain an attack success rate of 100% in the indoor test and 82.14% in the outdoor test.
no code implementations • 2 Sep 2022 • Chengyin Hu, Weiwen Shi
The accuracy of the networks is remarkably influenced by the data distribution of their training dataset.
no code implementations • 2 Sep 2022 • Chengyin Hu, Weiwen Shi
It is well known that the performance of deep neural networks (DNNs) is susceptible to subtle interference.
no code implementations • 2 Sep 2022 • Chengyin Hu, Weiwen Shi
However, recent advances have revealed their vulnerability to deliberate digital perturbations of the input data, known as adversarial attacks.
no code implementations • 23 Jun 2022 • Chengyin Hu, Weiwen Shi
In a digital environment, we construct a dataset based on AdvZL to verify the adversarial effect of equal-scale enlarged images on DNNs.
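An equal-scale enlargement can be sketched as an integer upscale followed by a center crop back to the original resolution, so the semantic content is preserved while the apparent scale changes. This is a minimal illustration under assumed semantics; the function name `zoom_in` and the nearest-neighbour upscaling choice are assumptions, not taken from AdvZL.

```python
import numpy as np

def zoom_in(image, scale=2):
    """Enlarge an image by an integer factor (nearest neighbour), then
    center-crop back to the original size, simulating a zoomed-in view."""
    h, w = image.shape[:2]
    big = image.repeat(scale, axis=0).repeat(scale, axis=1)
    top = (big.shape[0] - h) // 2
    left = (big.shape[1] - w) // 2
    return big[top:top + h, left:left + w]

img = np.arange(16, dtype=np.float32).reshape(4, 4)
adv = zoom_in(img, scale=2)  # same shape as img, zoomed on the center
```

Because the transform involves no pixel-wise noise, any drop in classifier accuracy on such images isolates sensitivity to scale rather than to crafted perturbations.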
3 code implementations • 2 Jun 2022 • Chengyin Hu, Yilong Wang, Kalibinuer Tiliwalidi, Wen Li
It achieves robust and covert physical attacks using low-cost laser equipment.
no code implementations • 2 Apr 2022 • Chengyin Hu, Weiwen Shi, Wen Li
Moreover, we validate the robustness of our approach by successfully attacking advanced DNNs with a success rate of over 75% in all cases.