Search Results for author: Lizhe Qi

Found 7 papers, 1 paper with code

De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts

no code implementations • 28 Mar 2024 • Yuzheng Wang, Dingkang Yang, Zhaoyu Chen, Yang Liu, Siao Liu, Wenqiang Zhang, Lihua Zhang, Lizhe Qi

Data-Free Knowledge Distillation (DFKD) is a promising task that trains high-performance small models for practical deployment without relying on the original training data.

Causal Inference · Data-free Knowledge Distillation
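For context on the task itself (not this paper's de-confounded method), the sketch below shows a generic DFKD loop under toy assumptions: a generator synthesizes inputs, a frozen teacher labels them, and the student distills from those soft labels. The tiny linear models, noise dimension, temperature, and learning rates are all illustrative placeholders.

```python
# Minimal, generic data-free knowledge distillation sketch (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 10)).eval()  # stand-in for a pretrained teacher
student = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 10))
generator = nn.Sequential(nn.Linear(100, 32 * 32 * 3), nn.Tanh())         # maps noise to fake "images"

opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
T = 4.0  # softening temperature for distillation

for step in range(100):
    z = torch.randn(64, 100)

    # Generator step: synthesize samples on which student and teacher disagree
    # (maximize their KL divergence by minimizing its negative).
    fake = generator(z).view(-1, 3, 32, 32)
    with torch.no_grad():
        t_logits = teacher(fake)
    g_loss = -F.kl_div(F.log_softmax(student(fake) / T, dim=1),
                       F.softmax(t_logits / T, dim=1), reduction="batchmean")
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    # Student step: match the teacher's soft labels on the synthetic batch.
    fake = generator(z).view(-1, 3, 32, 32).detach()
    with torch.no_grad():
        t_logits = teacher(fake)
    s_loss = F.kl_div(F.log_softmax(student(fake) / T, dim=1),
                      F.softmax(t_logits / T, dim=1), reduction="batchmean")
    opt_s.zero_grad(); s_loss.backward(); opt_s.step()
```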

On the Importance of Spatial Relations for Few-shot Action Recognition

no code implementations • 14 Aug 2023 • Yilun Zhang, Yuqian Fu, Xingjun Ma, Lizhe Qi, Jingjing Chen, Zuxuan Wu, Yu-Gang Jiang

We are thus motivated to investigate the importance of spatial relations and propose a more accurate few-shot action recognition method that leverages both spatial and temporal information.

Few-Shot Action Recognition +1

Sampling to Distill: Knowledge Transfer from Open-World Data

no code implementations • 31 Jul 2023 • Yuzheng Wang, Zhaoyu Chen, Jie Zhang, Dingkang Yang, Zuhao Ge, Yang Liu, Siao Liu, Yunquan Sun, Wenqiang Zhang, Lizhe Qi

Then, we introduce a low-noise representation to alleviate domain shifts and build structured relationships among multiple data examples to exploit data knowledge.

Data-free Knowledge Distillation · Transfer Learning

Out of Thin Air: Exploring Data-Free Adversarial Robustness Distillation

no code implementations • 21 Mar 2023 • Yuzheng Wang, Zhaoyu Chen, Dingkang Yang, Pinxue Guo, Kaixun Jiang, Wenqiang Zhang, Lizhe Qi

Adversarial Robustness Distillation (ARD) is a promising task that addresses the limited adversarial robustness of small-capacity models while reducing the expensive computational cost of Adversarial Training (AT).

Adversarial Robustness · Knowledge Distillation +1
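As background on the ARD objective (the paper's data-free setting is more involved), here is a minimal sketch under stated assumptions: the student is attacked with one-step FGSM and trained to match a robust teacher's predictions on the clean batch. The toy models, epsilon, and temperature are assumptions, and the random batch stands in for real training data.

```python
# Minimal, generic adversarial robustness distillation sketch (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10)).eval()  # stand-in for a robust teacher
student = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
opt = torch.optim.SGD(student.parameters(), lr=0.01)
eps, T = 8 / 255, 4.0

def fgsm(model, x, target_logits):
    """One-step attack: perturb x to push the student away from the teacher."""
    x = x.clone().requires_grad_(True)
    loss = F.kl_div(F.log_softmax(model(x) / T, dim=1),
                    F.softmax(target_logits / T, dim=1), reduction="batchmean")
    loss.backward()
    return (x + eps * x.grad.sign()).clamp(0, 1).detach()

x = torch.rand(64, 1, 28, 28)  # a clean batch would come from the training set
with torch.no_grad():
    t_clean = teacher(x)
x_adv = fgsm(student, x, t_clean)

# Distill on the adversarial inputs against the teacher's clean predictions.
loss = F.kl_div(F.log_softmax(student(x_adv) / T, dim=1),
                F.softmax(t_clean / T, dim=1), reduction="batchmean")
opt.zero_grad(); loss.backward(); opt.step()
```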

Explicit and Implicit Knowledge Distillation via Unlabeled Data

no code implementations • 17 Feb 2023 • Yuzheng Wang, Zuhao Ge, Zhaoyu Chen, Xian Liu, Chuangjia Ma, Yunquan Sun, Lizhe Qi

Data-free knowledge distillation is a challenging model compression task for scenarios in which the original dataset is not available.

Data-free Knowledge Distillation

Adversarial Contrastive Distillation with Adaptive Denoising

no code implementations • 17 Feb 2023 • Yuzheng Wang, Zhaoyu Chen, Dingkang Yang, Yang Liu, Siao Liu, Wenqiang Zhang, Lizhe Qi

To this end, we propose a novel structured ARD method called Contrastive Relationship DeNoise Distillation (CRDND).

Adversarial Robustness · Denoising +1

An Experimental-based Review of Image Enhancement and Image Restoration Methods for Underwater Imaging

1 code implementation • 7 Jul 2019 • Yan Wang, Wei Song, Giancarlo Fortino, Lizhe Qi, Wenqiang Zhang, Antonio Liotta

Underwater images play a key role in ocean exploration, but they often suffer from severe quality degradation due to light absorption and scattering in the water medium.

Image Enhancement · Image Restoration
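For a flavor of the classical enhancement baselines such reviews benchmark (not this paper's specific evaluated pipeline), below is a minimal sketch combining gray-world white balance with CLAHE; the file path and parameter values are placeholders.

```python
# Minimal classical underwater-enhancement baseline sketch (illustrative only).
import cv2
import numpy as np

def enhance_underwater(bgr: np.ndarray) -> np.ndarray:
    # Gray-world white balance: scale each channel toward the global mean,
    # countering the blue-green cast from wavelength-dependent absorption.
    means = bgr.reshape(-1, 3).mean(axis=0)
    balanced = np.clip(bgr * (means.mean() / means), 0, 255).astype(np.uint8)

    # CLAHE on the L channel restores local contrast lost to scattering.
    lab = cv2.cvtColor(balanced, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    l = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(l)
    return cv2.cvtColor(cv2.merge((l, a, b)), cv2.COLOR_LAB2BGR)

image = cv2.imread("underwater.jpg")  # placeholder path
cv2.imwrite("enhanced.jpg", enhance_underwater(image))
```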
