Search Results for author: Yichen Lu

Found 6 papers, 2 papers with code

Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty

1 code implementation • 4 May 2023 • Yuan Zhang, Weihua Chen, Yichen Lu, Tao Huang, Xiuyu Sun, Jian Cao

Knowledge distillation is an effective paradigm for boosting the performance of a pocket-size model; especially when multiple teacher models are available, the student can break its upper limit again.

Knowledge Distillation • object-detection +3
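
For readers unfamiliar with the setting of the paper above, the following is a minimal, generic multi-teacher distillation sketch in PyTorch. It is a hypothetical illustration only: averaging the teachers' soft targets is an assumption for clarity, not the Avatar Knowledge Distillation method itself.

# Generic multi-teacher knowledge-distillation loss (illustrative sketch).
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, temperature=4.0):
    """KL divergence between the student and the averaged soft targets
    of several teacher models (temperature-scaled)."""
    # Average the teachers' softened class distributions.
    teacher_probs = torch.stack(
        [F.softmax(t / temperature, dim=-1) for t in teacher_logits_list]
    ).mean(dim=0)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # Standard temperature-scaled KD objective.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2

# Usage (combined with the task loss):
#   loss = ce_loss + alpha * multi_teacher_kd_loss(s_out, [t1_out, t2_out])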

Dive into the Resolution Augmentations and Metrics in Low Resolution Face Recognition: A Plain yet Effective New Baseline

1 code implementation • 11 Feb 2023 • Xu Ling, Yichen Lu, Wenqi Xu, Weihong Deng, Yingjie Zhang, Xingchen Cui, Hongzhi Shi, Dongchao Wen

Although deep learning has significantly improved Face Recognition (FR), dramatic performance deterioration may occur when processing Low Resolution (LR) faces.

Face Recognition • General Knowledge

Augmented Geometric Distillation for Data-Free Incremental Person ReID

no code implementations • CVPR 2022 • Yichen Lu, Mei Wang, Weihong Deng

On this basis, we reveal a "noisy distillation" problem stemming from the noise in dreaming memory, and further propose to augment distillation in a pairwise and cross-wise pattern over different views of memory to mitigate it.

Incremental Learning • Person Re-Identification +1
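
As a rough illustration of distilling over different views of synthesized ("dreamed") memory, here is a hedged PyTorch sketch of a pairwise plus cross-wise pattern. The model names, loss form, and weighting are assumptions for exposition; the actual Augmented Geometric Distillation objective may differ.

# Illustrative pairwise + cross-wise distillation over two views of a
# dreamed exemplar (sketch only, not the paper's exact formulation).
import torch
import torch.nn.functional as F

def view_consistent_distillation(old_model, new_model, view_a, view_b, T=2.0):
    """Distill the frozen old model into the new model on two augmented views
    of the same synthesized exemplar, both per-view (pairwise) and across
    views (cross-wise), so that per-view noise tends to average out."""
    with torch.no_grad():
        old_a = F.softmax(old_model(view_a) / T, dim=-1)
        old_b = F.softmax(old_model(view_b) / T, dim=-1)
    new_a = F.log_softmax(new_model(view_a) / T, dim=-1)
    new_b = F.log_softmax(new_model(view_b) / T, dim=-1)
    kl = lambda p_log, q: F.kl_div(p_log, q, reduction="batchmean")
    pairwise = kl(new_a, old_a) + kl(new_b, old_b)    # same view
    crosswise = kl(new_a, old_b) + kl(new_b, old_a)   # across views
    return (pairwise + crosswise) / 4 * T ** 2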

Noisy Positive-Unlabeled Learning with Self-Training for Speculative Knowledge Graph Reasoning

no code implementations • 13 Jun 2023 • Ruijie Wang, Baoyu Li, Yichen Lu, Dachun Sun, Jinning Li, Yuchen Yan, Shengzhong Liu, Hanghang Tong, Tarek F. Abdelzaher

State-of-the-art methods fall short in speculative reasoning ability, as they assume that the correctness of a fact is determined solely by its presence in the KG, making them vulnerable to false-negative and false-positive issues.

Knowledge Graphs • World Knowledge
