Search Results for author: Siao Liu

Found 6 papers, 1 paper with code

De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts

no code implementations · 28 Mar 2024 · Yuzheng Wang, Dingkang Yang, Zhaoyu Chen, Yang Liu, Siao Liu, Wenqiang Zhang, Lihua Zhang, Lizhe Qi

Data-Free Knowledge Distillation (DFKD) is a promising task that trains high-performance small models for practical deployment without relying on the original training data.

Causal Inference · Data-free Knowledge Distillation
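The DFKD entry above describes the task only at a high level; the underlying distillation objective is typically the standard temperature-scaled KL loss between teacher and student predictions. A minimal sketch, assuming a plain softmax/KL formulation (function names and the default temperature are illustrative, not taken from the paper):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_kl_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in standard knowledge distillation."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (temperature ** 2) * kl
```

In the data-free setting the teacher logits come from synthesized or surrogate inputs rather than the original training set, but the loss itself is unchanged.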

Improving Generalization in Visual Reinforcement Learning via Conflict-aware Gradient Agreement Augmentation

no code implementations · ICCV 2023 · Siao Liu, Zhaoyu Chen, Yang Liu, Yuzheng Wang, Dingkang Yang, Zhile Zhao, Ziqing Zhou, Xie Yi, Wei Li, Wenqiang Zhang, Zhongxue Gan

In particular, CG2A develops a Gradient Agreement Solver to adaptively balance varying gradient magnitudes and introduces a Soft Gradient Surgery strategy to alleviate gradient conflicts.

reinforcement-learning
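The snippet above names a Soft Gradient Surgery strategy without specifying it. As a generic illustration of softened gradient surgery, not the paper's method, one can take a PCGrad-style conflict projection and remove only a fraction of the conflicting component (the function name and the blending factor `alpha` are hypothetical):

```python
def dot(u, v):
    """Inner product of two gradient vectors given as lists."""
    return sum(a * b for a, b in zip(u, v))

def soft_gradient_surgery(g1, g2, alpha=0.5):
    """Illustrative 'soft' gradient surgery (a PCGrad-style sketch,
    not the method from the paper): when two gradients conflict
    (negative inner product), subtract a fraction alpha of g1's
    component along g2 instead of the full projection."""
    conflict = dot(g1, g2)
    if conflict >= 0:
        return list(g1)  # no conflict: leave the gradient untouched
    scale = alpha * conflict / dot(g2, g2)
    return [a - scale * b for a, b in zip(g1, g2)]
```

With `alpha=1.0` this reduces to a full PCGrad projection; smaller values keep part of the conflicting direction, which is one plausible reading of "soft".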

Sampling to Distill: Knowledge Transfer from Open-World Data

no code implementations · 31 Jul 2023 · Yuzheng Wang, Zhaoyu Chen, Jie Zhang, Dingkang Yang, Zuhao Ge, Yang Liu, Siao Liu, Yunquan Sun, Wenqiang Zhang, Lizhe Qi

Then, we introduce a low-noise representation to alleviate domain shifts and build structured relationships among multiple data examples to exploit data knowledge.

Data-free Knowledge Distillation · Transfer Learning

Context De-confounded Emotion Recognition

1 code implementation · CVPR 2023 · Dingkang Yang, Zhaoyu Chen, Yuzheng Wang, Shunli Wang, Mingcheng Li, Siao Liu, Xiao Zhao, Shuai Huang, Zhiyan Dong, Peng Zhai, Lihua Zhang

However, a long-overlooked issue is that a context bias in existing datasets leads to a significantly unbalanced distribution of emotional states among different context scenarios.

Emotion Recognition

Adversarial Contrastive Distillation with Adaptive Denoising

no code implementations · 17 Feb 2023 · Yuzheng Wang, Zhaoyu Chen, Dingkang Yang, Yang Liu, Siao Liu, Wenqiang Zhang, Lizhe Qi

To this end, we propose a novel structured ARD method called Contrastive Relationship DeNoise Distillation (CRDND).

Adversarial Robustness · Denoising · +1

Efficient universal shuffle attack for visual object tracking

no code implementations · 14 Mar 2022 · Siao Liu, Zhaoyu Chen, Wei Li, Jiwei Zhu, Jiafeng Wang, Wenqiang Zhang, Zhongxue Gan

Recently, adversarial attacks have been applied in visual object tracking to deceive deep trackers by injecting imperceptible perturbations into video frames.

Adversarial Attack · Computational Efficiency · +2
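The snippet above describes injecting imperceptible perturbations into video frames; the paper's universal shuffle attack itself is not detailed here. As a generic illustration of the basic attack primitive, a hedged FGSM-style sketch (not the paper's method; the function name and epsilon budget are hypothetical, pixels assumed flattened to [0, 1]):

```python
def fgsm_perturb(frame, grad, epsilon=8 / 255):
    """Illustrative FGSM-style perturbation: shift each pixel by
    epsilon in the sign of the loss gradient, then clip back to the
    valid [0, 1] range so the change stays bounded (and, for small
    epsilon, visually imperceptible)."""
    def sign(x):
        return (x > 0) - (x < 0)
    return [min(1.0, max(0.0, p + epsilon * sign(g)))
            for p, g in zip(frame, grad)]
```

A universal attack would optimize one such perturbation to transfer across many frames and sequences rather than computing it per frame.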
