Adversarial Defense
176 papers with code • 10 benchmarks • 5 datasets
Latest papers with no code
Ensemble Adversarial Defense via Integration of Multiple Dispersed Low Curvature Models
In this work, we aim to enhance ensemble diversity by reducing attack transferability.
Subspace Defense: Discarding Adversarial Perturbations by Learning a Subspace for Clean Signals
We first empirically show that the features of clean signals and of adversarial perturbations are redundant and respectively span low-dimensional linear subspaces with minimal overlap, so that classical low-dimensional subspace projection can suppress perturbation features lying outside the subspace of clean signals.
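The projection idea in this snippet can be illustrated with a minimal numpy sketch. All names and dimensions below are hypothetical, and PCA is used here as a stand-in for however the paper actually learns the clean-signal subspace: features of clean samples span a low-dimensional subspace, and projecting perturbed features onto it discards most off-subspace perturbation energy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: clean features lie in a k-dimensional linear subspace
# of a d-dimensional feature space; perturbations have energy in all d dims.
d, k, n = 64, 8, 500
basis = np.linalg.qr(rng.standard_normal((d, k)))[0]  # orthonormal clean basis
clean = rng.standard_normal((n, k)) @ basis.T         # clean features (n, d)
noise = 0.5 * rng.standard_normal((n, d))             # stand-in "perturbation"

# Estimate the clean subspace from clean features (top-k right singular vectors).
_, _, vt = np.linalg.svd(clean, full_matrices=False)
P = vt[:k].T @ vt[:k]                                 # projector onto that subspace

perturbed = clean + noise
purified = perturbed @ P                              # drop off-subspace components

err_before = np.linalg.norm(perturbed - clean)
err_after = np.linalg.norm(purified - clean)
print(err_after < err_before)
```

Since only k of the d dimensions of the noise survive the projection, the residual error shrinks by roughly a factor of sqrt(k/d) in this toy setting.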
Adversarial Defense Teacher for Cross-Domain Object Detection under Poor Visibility Conditions
Existing object detectors encounter challenges in handling domain shifts between training and real-world data, particularly under poor visibility conditions like fog and night.
ADAPT to Robustify Prompt Tuning Vision Transformers
The performance of deep models, including Vision Transformers, is known to be vulnerable to adversarial attacks.
Robust Overfitting Does Matter: Test-Time Adversarial Purification With FGSM
Current defense strategies usually train DNNs against a specific adversarial attack method and can achieve good robustness against that type of attack.
Adversarial Infrared Geometry: Using Geometry to Perform Adversarial Attack against Infrared Pedestrian Detectors
Physical attack experiments are conducted to assess the attack success rate of AdvIG at different distances.
Enhancing the "Immunity" of Mixture-of-Experts Networks for Adversarial Defense
Recent studies have revealed the vulnerability of Deep Neural Networks (DNNs) to adversarial examples, which can easily fool DNNs into making incorrect predictions.
Enhancing Tracking Robustness with Auxiliary Adversarial Defense Networks
Moreover, it can be seamlessly integrated with other visual trackers as a plug-and-play module without requiring any parameter adjustments.
MGE: A Training-Free and Efficient Model Generation and Enhancement Scheme
To provide a foundation for research on deep learning models, constructing a model pool is an essential step.
Two Heads Are Better Than One: Boosting Graph Sparse Training via Semantic and Topological Awareness
Specifically, GST initially constructs a topology & semantic anchor at a low training cost, followed by performing dynamic sparse training to align the sparse graph with the anchor.