2 code implementations • CVPR 2022 • Yin-Yin He, Peizhen Zhang, Xiu-Shen Wei, Xiangyu Zhang, Jian Sun
In this paper, we explore excavating the confusion matrix, which carries fine-grained misclassification details, to relieve pairwise biases, generalizing the coarse class-level bias.
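The entry only names the idea, so here is a minimal sketch of one plausible reading: build a confusion matrix and read its off-diagonal, row-normalized entries as pairwise misclassification rates. The function names and the exact notion of "pairwise bias" are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def confusion_matrix(y_true, y_pred, num_classes):
    """Row i, column j counts samples of true class i predicted as class j."""
    cm = np.zeros((num_classes, num_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

def pairwise_bias(cm):
    """Off-diagonal, row-normalized rates: how often class i is mistaken
    for class j (a hypothetical reading of 'pairwise bias')."""
    rates = cm / np.maximum(cm.sum(axis=1, keepdims=True), 1)
    np.fill_diagonal(rates, 0.0)  # ignore correct predictions
    return rates
```

Such per-pair rates could then weight a rebalancing loss, rather than the single per-class frequency a coarse scheme would use.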
1 code implementation • ICCV 2021 • Yin-Yin He, Jianxin Wu, Xiu-Shen Wei
We tackle the long-tailed visual recognition problem from the knowledge distillation perspective by proposing the Distill the Virtual Examples (DiVE) method.
Ranked #19 on Long-tail Learning on iNaturalist 2018
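As a rough illustration of distilling from soft targets, the sketch below treats temperature-flattened teacher predictions as soft "virtual example" labels and trains the student against them with a standard distillation cross-entropy. The temperature value and function names are assumptions; this is not DiVE's exact formulation.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; larger T flattens the distribution."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def virtual_example_targets(teacher_logits, T=3.0):
    """Flattened teacher predictions used as soft targets
    (hypothetical stand-in for DiVE's virtual examples)."""
    return softmax(teacher_logits, T=T)

def kd_loss(student_logits, teacher_logits, T=3.0):
    """Cross-entropy between student softmax and soft targets,
    scaled by T^2 as in standard knowledge distillation."""
    p = virtual_example_targets(teacher_logits, T)
    log_q = np.log(softmax(student_logits, T))
    return float(-(p * log_q).sum(axis=-1).mean() * T * T)
```

Flattening with a higher temperature moves probability mass toward rarer classes, which is one intuition for why soft targets help under long-tailed label distributions.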
no code implementations • 7 Jun 2023 • Ke Zhu, Yin-Yin He, Jianxin Wu
That is, coarse crops benefit SSL on scene images.
no code implementations • 20 Jul 2023 • Ke Zhu, Yin-Yin He, Jianxin Wu
QFD first trains a quantized (or binarized) representation as the teacher, then quantizes the network using knowledge distillation (KD).
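The two-stage idea above can be sketched minimally: quantize the teacher's features (here a simple sign binarization, an assumed stand-in for the learned quantizer) and train the student's features toward that quantized target with an MSE distillation loss. Names and the loss choice are illustrative, not QFD's actual design.

```python
import numpy as np

def binarize(features):
    """Binarize features to +/-1 (simple stand-in for the learned quantizer)."""
    return np.where(np.asarray(features, dtype=float) >= 0, 1.0, -1.0)

def feature_distillation_loss(student_feat, teacher_feat):
    """MSE between student features and the quantized teacher features:
    a minimal sketch of distilling toward a quantized representation."""
    target = binarize(teacher_feat)
    return float(np.mean((np.asarray(student_feat, dtype=float) - target) ** 2))
```

In this framing the quantized representation, not the full-precision one, defines the distillation target, so the student is optimized directly in the regime it will be deployed in.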