no code implementations • 22 Dec 2023 • Xiaoyue Duan, Shuhao Cui, Guoliang Kang, Baochang Zhang, Zhengcong Fei, Mingyuan Fan, Junshi Huang
Consistent editing of real images is a challenging task, as it requires performing non-rigid edits (e.g., changing postures) to the main objects in the input image without changing their identity or attributes.
1 code implementation • CVPR 2023 • Runqi Wang, Hao Zheng, Xiaoyue Duan, Jianzhuang Liu, Yuning Lu, Tian Wang, Songcen Xu, Baochang Zhang
However, with only a few training images, there exist two crucial problems: (1) the visual feature distributions are easily distracted by class-irrelevant information in images, and (2) the alignment between the visual and language feature distributions is difficult.
no code implementations • CVPR 2023 • Runqi Wang, Xiaoyue Duan, Guoliang Kang, Jianzhuang Liu, Shaohui Lin, Songcen Xu, Jinhu Lv, Baochang Zhang
The text consists of a category name and a fixed number of learnable parameters, which are selected from our designed attribute word bank and serve as attributes.
no code implementations • ICCV 2023 • Wenkai Dong, Song Xue, Xiaoyue Duan, Shumin Han
This technique ensures that our method achieves a superior trade-off between editability and high fidelity to the input image.
no code implementations • 28 Nov 2022 • Xiaoyue Duan, Guoliang Kang, Runqi Wang, Shumin Han, Song Xue, Tian Wang, Baochang Zhang
Based on this observation, we propose a simple strategy, i.e., increasing the number of training shots, to mitigate the loss of intrinsic dimension caused by robustness-promoting regularization.
no code implementations • 28 Dec 2021 • Runqi Wang, Xiaoyue Duan, Baochang Zhang, Song Xue, Wentao Zhu, David Doermann, Guodong Guo
We show that our method improves the recognition accuracy of adversarial training on ImageNet by 8.32% compared with the baseline.
no code implementations • 8 Mar 2021 • Zhenhuan Huang, Xiaoyue Duan, Bo Zhao, Jinhu Lü, Baochang Zhang
We propose an Interpretable Attention Guided Network (IAGN) for fine-grained visual classification.