Search Results for author: Borui Zhao

Found 10 papers, 9 papers with code

DOT: A Distillation-Oriented Trainer

1 code implementation • ICCV 2023 • Borui Zhao, Quan Cui, RenJie Song, Jiajun Liang

In this paper, we observe a trade-off between task and distillation losses, i.e., introducing the distillation loss limits the convergence of the task loss.

Knowledge Distillation
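
A minimal sketch of how such a trade-off can be handled at the optimizer level, in the spirit of DOT: the task and distillation gradients get separate momentum buffers, with a larger momentum on the distillation side so it dominates early optimization instead of competing with the task gradient. The momentum values, learning rate, and the dot_step helper below are illustrative assumptions, not the paper's exact recipe.

import torch

def dot_step(params, task_grads, kd_grads, state, lr=0.1, mu_task=0.9, mu_kd=0.99):
    # One SGD-with-momentum step that treats the two losses separately.
    # task_grads / kd_grads come from two separate backward passes, e.g.
    # torch.autograd.grad(task_loss, params, retain_graph=True).
    for p, g_task, g_kd in zip(params, task_grads, kd_grads):
        buf = state.setdefault(p, {"task": torch.zeros_like(p),
                                   "kd": torch.zeros_like(p)})
        buf["task"].mul_(mu_task).add_(g_task)   # task-loss momentum buffer
        buf["kd"].mul_(mu_kd).add_(g_kd)         # distillation momentum buffer
        with torch.no_grad():
            p.add_(buf["task"] + buf["kd"], alpha=-lr)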

Cumulative Spatial Knowledge Distillation for Vision Transformers

1 code implementation • ICCV 2023 • Borui Zhao, RenJie Song, Jiajun Liang

Distilling knowledge from a CNN limits the network's convergence in the later training period, since the ViT's ability to integrate global information is suppressed by the CNN's local-inductive-bias supervision.

Inductive Bias, Knowledge Distillation +1
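
One way to read this observation is that the CNN's supervision should fade as training progresses. Below is a sketch of a decaying distillation weight in that spirit; the linear schedule and the plain logit-level KL term are assumptions, not necessarily the paper's cumulative mechanism.

import torch.nn.functional as F

def cskd_style_loss(vit_logits, cnn_logits, targets, step, total_steps):
    # Distillation weight decays from 1 to 0 so the CNN's local supervision
    # guides early training but fades before it can suppress the ViT's
    # global modelling in the later period.
    alpha = max(0.0, 1.0 - step / total_steps)
    ce = F.cross_entropy(vit_logits, targets)
    kd = F.kl_div(F.log_softmax(vit_logits, dim=1),
                  F.softmax(cnn_logits, dim=1), reduction="batchmean")
    return ce + alpha * kd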

Boosting Semi-Supervised Learning by Exploiting All Unlabeled Data

2 code implementations • CVPR 2023 • Yuhao Chen, Xin Tan, Borui Zhao, Zhaowei Chen, RenJie Song, Jiajun Liang, Xuequan Lu

Adaptive Negative Learning (ANL) introduces an additional negative pseudo-label for all unlabeled data, so that even low-confidence examples are leveraged.

Pseudo Label
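
A minimal sketch of the negative pseudo-labelling idea: classes whose predicted probability is very low are marked as ruled-out and suppressed with a binary loss, so even low-confidence unlabeled examples contribute a training signal. The fixed threshold below is an illustrative assumption; the paper's ANL adapts it.

import torch

def negative_pseudo_label_loss(probs, neg_threshold=0.05):
    # probs: (N, C) softmax outputs on unlabeled examples.
    # Classes with probability below the threshold become negative
    # pseudo-labels; we minimize -log(1 - p) for them.
    neg_mask = (probs < neg_threshold).float()        # 1 where class is ruled out
    loss = -(torch.log(1.0 - probs + 1e-8) * neg_mask).sum(dim=1)
    return loss.mean()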

Curriculum Temperature for Knowledge Distillation

1 code implementation • 29 Nov 2022 • Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, RenJie Song, Lei Luo, Jun Li, Jian Yang

In this paper, we propose a simple curriculum-based technique, termed Curriculum Temperature for Knowledge Distillation (CTKD), which controls the task difficulty level during the student's learning career through a dynamic and learnable temperature.

Image Classification, Knowledge Distillation
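
A sketch of the mechanism as described: the temperature is a learnable parameter trained adversarially through a gradient-reversal layer, and a curriculum scale ramps the difficulty up over training. The ramp schedule and initial temperature below are assumptions.

import torch
import torch.nn.functional as F

class GradReverse(torch.autograd.Function):
    # Identity in the forward pass; flips (and scales) the gradient backward,
    # so the temperature is trained to *increase* the distillation loss.
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)
    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lambd * grad_out, None

temperature = torch.nn.Parameter(torch.tensor(4.0))   # learned, not fixed

def ctkd_loss(student_logits, teacher_logits, epoch, total_epochs):
    lambd = min(1.0, epoch / (0.2 * total_epochs))     # curriculum ramp (assumed)
    t = GradReverse.apply(temperature, lambd)          # adversarial temperature
    return F.kl_div(F.log_softmax(student_logits / t, dim=1),
                    F.softmax(teacher_logits / t, dim=1),
                    reduction="batchmean") * t * t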

Efficient One Pass Self-distillation with Zipf's Label Smoothing

1 code implementation • 26 Jul 2022 • Jiajun Liang, Linze Li, Zhaodong Bing, Borui Zhao, Yao Tang, Bo Lin, Haoqiang Fan

This paper proposes an efficient self-distillation method named Zipf's Label Smoothing (Zipf's LS), which uses a network's on-the-fly predictions to generate soft supervision that conforms to a Zipf distribution, without using any contrastive samples or auxiliary parameters.
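
A sketch of the core step: rank the classes by the network's own logits and turn the ranks into a Zipf-distributed soft target (probability proportional to 1/rank) over the non-target classes. The exponent, the handling of the target class, and the kd_weight below are simplifying assumptions.

import torch
import torch.nn.functional as F

def zipf_soft_targets(logits, targets, power=1.0):
    # Rank every class by the model's own logit; probability ~ 1 / rank.
    order = logits.argsort(dim=1, descending=True)
    ranks = order.argsort(dim=1).float() + 1.0     # rank 1 = largest logit
    zipf = ranks.pow(-power)
    zipf.scatter_(1, targets.unsqueeze(1), 0.0)    # soften non-target classes only
    return zipf / zipf.sum(dim=1, keepdim=True)

def zipf_ls_loss(logits, targets, kd_weight=0.1):
    soft = zipf_soft_targets(logits, targets).detach()   # on-the-fly supervision
    log_p = F.log_softmax(logits, dim=1)
    soft_ce = -(soft * log_p).sum(dim=1).mean()
    return F.cross_entropy(logits, targets) + kd_weight * soft_ce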

Decoupled Knowledge Distillation

1 code implementation • CVPR 2022 • Borui Zhao, Quan Cui, RenJie Song, Yiyu Qiu, Jiajun Liang

To provide a novel viewpoint to study logit distillation, we reformulate the classical KD loss into two parts, i.e., target-class knowledge distillation (TCKD) and non-target-class knowledge distillation (NCKD).

Image Classification, Knowledge Distillation +1
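
A sketch of the decomposition: TCKD is a binary KL over the (target, non-target) probability mass, and NCKD is a KL over the distribution among the non-target classes only, with the two terms reweighted independently. The alpha/beta defaults below are common choices, not universal.

import torch
import torch.nn.functional as F

def dkd_loss(s_logits, t_logits, target, alpha=1.0, beta=8.0, T=4.0):
    mask = F.one_hot(target, s_logits.size(1)).bool()

    p_s = F.softmax(s_logits / T, dim=1)
    p_t = F.softmax(t_logits / T, dim=1)

    # TCKD: binary KL over (target, all-non-target) probability mass.
    b_s = torch.stack([p_s[mask], 1 - p_s[mask]], dim=1)
    b_t = torch.stack([p_t[mask], 1 - p_t[mask]], dim=1)
    tckd = F.kl_div(b_s.log(), b_t, reduction="batchmean")

    # NCKD: KL over the distribution among non-target classes only
    # (large negative offset pushes the target class out of the softmax).
    nt_s = F.log_softmax(s_logits / T - 1000.0 * mask.float(), dim=1)
    nt_t = F.softmax(t_logits / T - 1000.0 * mask.float(), dim=1)
    nckd = F.kl_div(nt_s, nt_t, reduction="batchmean")

    return (alpha * tckd + beta * nckd) * T * T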

RecursiveMix: Mixed Learning with History

1 code implementation • 14 Mar 2022 • Lingfeng Yang, Xiang Li, Borui Zhao, RenJie Song, Jian Yang

In semantic segmentation, RM also surpasses the baseline and CutMix by 1.9 and 1.1 mIoU points under UperNet on ADE20K, respectively.

Object Detection +1

Discriminability-Transferability Trade-Off: An Information-Theoretic Perspective

1 code implementation • 8 Mar 2022 • Quan Cui, Bingchen Zhao, Zhao-Min Chen, Borui Zhao, RenJie Song, Jiajun Liang, Boyan Zhou, Osamu Yoshie

This work simultaneously considers the discriminability and transferability properties of deep representations in the typical supervised learning task, i.e., image classification.

Image Classification, Transfer Learning

Dynamic MLP for Fine-Grained Image Classification by Leveraging Geographical and Temporal Information

1 code implementation • CVPR 2022 • Lingfeng Yang, Xiang Li, RenJie Song, Borui Zhao, Juntian Tao, Shihao Zhou, Jiajun Liang, Jian Yang

Therefore, it is helpful to leverage additional information, e.g., the location and date at which a photo was taken, which is easily accessible but rarely exploited.

Fine-Grained Image Classification
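
A loose sketch of one way to realize this: an MLP over the metadata embedding generates per-sample weights that dynamically project the image feature, rather than merely concatenating the two streams. All dimensions and the single interaction block below are assumptions.

import torch
import torch.nn as nn

class DynamicMLPBlock(nn.Module):
    # Image feature projected by weights generated from metadata.
    def __init__(self, img_dim=256, meta_dim=64, hidden=32):
        super().__init__()
        self.reduce = nn.Linear(img_dim, hidden)
        self.expand = nn.Linear(hidden, img_dim)
        # Metadata generates a (hidden x hidden) projection matrix per sample.
        self.weight_gen = nn.Linear(meta_dim, hidden * hidden)
        self.hidden = hidden

    def forward(self, img_feat, meta_feat):
        x = self.reduce(img_feat)                    # (N, hidden)
        w = self.weight_gen(meta_feat)               # (N, hidden*hidden)
        w = w.view(-1, self.hidden, self.hidden)     # per-sample weights
        x = torch.bmm(x.unsqueeze(1), w).squeeze(1)  # dynamic projection
        return img_feat + self.expand(torch.relu(x)) # residual fusion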

Hierarchical Context Embedding for Region-based Object Detection

no code implementations • ECCV 2020 • Zhao-Min Chen, Xin Jin, Borui Zhao, Xiu-Shen Wei, Yanwen Guo

To address this issue, we present a simple but effective Hierarchical Context Embedding (HCE) framework that can be applied as a plug-and-play component to improve the classification ability of a series of region-based detectors by mining contextual cues.

Object Detection +1
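
A sketch of the plug-and-play idea: pool a whole-image context embedding from the backbone and fuse it into each RoI feature before the region classification head, so region heads see cues beyond their own boxes. Additive fusion and the single linear layer are simplifying assumptions.

import torch
import torch.nn as nn

class HierarchicalContextEmbedding(nn.Module):
    # Inject image-level context into per-region features.
    def __init__(self, feat_dim=256):
        super().__init__()
        self.context_fc = nn.Linear(feat_dim, feat_dim)

    def forward(self, roi_feats, backbone_feat, rois_per_image):
        # roi_feats: (sum(rois_per_image), C); backbone_feat: (B, C, H, W).
        ctx = backbone_feat.mean(dim=(2, 3))          # global average pool -> (B, C)
        ctx = torch.relu(self.context_fc(ctx))
        # Broadcast each image's context vector to its own RoIs.
        ctx = ctx.repeat_interleave(rois_per_image, dim=0)
        return roi_feats + ctx                        # context-enriched RoI feats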
