Search Results for author: Shuoxi Zhang

Found 3 papers, 0 papers with code

Knowledge Distillation via Token-level Relationship Graph

no code implementations • 20 Jun 2023 • Shuoxi Zhang, Hanpeng Liu, Kun He

To address these limitations, we propose a novel method, Knowledge Distillation with Token-level Relationship Graph (TRG), that leverages token-wise relational knowledge to enhance the performance of knowledge distillation.

Knowledge Distillation • Transfer Learning
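
The snippet above names the method but not its mechanics. Relational distillation in general matches pairwise relations between representations rather than the representations themselves, so a plausible reading of TRG is: build a similarity graph over the student's and the teacher's token embeddings and align the two graphs. The PyTorch sketch below illustrates that general idea; the cosine-similarity graph and the MSE objective are assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def token_relation_graph(tokens):
    # tokens: (batch, num_tokens, dim), e.g. ViT patch or word token embeddings.
    # Cosine-similarity graph: normalize, then take pairwise inner products.
    tokens = F.normalize(tokens, dim=-1)
    return tokens @ tokens.transpose(-2, -1)  # (batch, num_tokens, num_tokens)

def trg_style_loss(student_tokens, teacher_tokens):
    # Align the student's token-relation graph with the teacher's.
    # Graph construction and MSE matching are illustrative assumptions here.
    g_student = token_relation_graph(student_tokens)
    g_teacher = token_relation_graph(teacher_tokens).detach()
    return F.mse_loss(g_student, g_teacher)
```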

Class-aware Information for Logit-based Knowledge Distillation

no code implementations • 27 Nov 2022 • Shuoxi Zhang, Hanpeng Liu, John E. Hopcroft, Kun He

Knowledge distillation aims to transfer knowledge to a student model by utilizing the teacher model's predictions or features; feature-based distillation has recently shown superiority over logit-based distillation.

Knowledge Distillation
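
For context, the logit-based approach referenced here is classic Hinton-style distillation, sketched below in PyTorch. This is only the standard baseline; the snippet does not say how the paper's class-aware information enters the loss, so that part is deliberately not modeled.

```python
import torch.nn.functional as F

def logit_kd_loss(student_logits, teacher_logits, T=4.0):
    # Vanilla logit distillation (Hinton et al., 2015): KL divergence between
    # temperature-softened teacher and student class distributions.
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    # Scaling by T*T compensates for the 1/T^2 shrinkage of gradients
    # introduced by softening the logits with temperature T.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)
```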

Generating Pseudo-labels Adaptively for Few-shot Model-Agnostic Meta-Learning

no code implementations • 9 Jul 2022 • Guodong Liu, Tongling Wang, Shuoxi Zhang, Kun He

Model-Agnostic Meta-Learning (MAML) is a well-known few-shot learning method that has inspired many follow-up efforts, such as ANIL and BOIL.

Few-Shot Learning • Pseudo Label
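
MAML's core loop is compact enough to sketch: adapt the parameters with a gradient step on a support set, then meta-optimize the adapted model's loss on a query set. The sketch below uses torch.func.functional_call (PyTorch 2.x). The idea of replacing query labels with adaptively generated pseudo-labels, per the paper's title, appears only as a comment marked as an assumption, since the snippet gives no details of the paper's mechanism.

```python
import torch

def maml_inner_step(model, loss_fn, sx, sy, inner_lr=0.01):
    # One inner-loop adaptation step: functional forward pass on the support
    # set, then a gradient step that returns adapted parameters.
    params = dict(model.named_parameters())
    loss = loss_fn(torch.func.functional_call(model, params, (sx,)), sy)
    # create_graph=True keeps the graph so the meta-gradient can flow
    # through this inner update (second-order MAML).
    grads = torch.autograd.grad(loss, list(params.values()), create_graph=True)
    return {name: p - inner_lr * g
            for (name, p), g in zip(params.items(), grads)}

def maml_outer_loss(model, loss_fn, support, query, inner_lr=0.01):
    # Meta-objective: loss of the adapted parameters on the query set.
    sx, sy = support
    qx, qy = query  # in this paper's setting, qy might be an adaptively
                    # generated pseudo-label -- an assumption, not shown here
    adapted = maml_inner_step(model, loss_fn, sx, sy, inner_lr)
    return loss_fn(torch.func.functional_call(model, adapted, (qx,)), qy)
```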
