Search Results for author: Linrui Gong

Found 3 papers, 0 papers with code

Precise Knowledge Transfer via Flow Matching

no code implementations • 3 Feb 2024 • Shitong Shao, Zhiqiang Shen, Linrui Gong, Huanran Chen, Xu Dai

We name this framework Knowledge Transfer with Flow Matching (FM-KT), which can be integrated with any metric-based distillation method (e.g., vanilla KD, DKD, PKD and DIST) and a meta-encoder of any available architecture (e.g., CNN, MLP and Transformer).

Transfer Learning
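No code accompanies this paper. For orientation, a minimal sketch of the vanilla KD objective (Hinton et al.) that the abstract names as one pluggable metric-based loss might look as follows in PyTorch; the function name and temperature value are illustrative assumptions, and this is not FM-KT itself.

```python
import torch.nn.functional as F

def vanilla_kd_loss(student_logits, teacher_logits, T=4.0):
    """Vanilla KD (Hinton et al.): KL divergence between
    temperature-softened teacher and student distributions,
    scaled by T^2 to keep gradient magnitudes comparable."""
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T ** 2)
```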

Rethinking Centered Kernel Alignment in Knowledge Distillation

no code implementations • 22 Jan 2024 • Zikai Zhou, Yunhang Shen, Shitong Shao, Linrui Gong, Shaohui Lin

Knowledge distillation has emerged as a highly effective method for bridging the representation discrepancy between large-scale models and lightweight models.

Image Classification • Knowledge Distillation +2
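This paper also has no released code. As background, linear CKA between two feature matrices is conventionally computed as below (Kornblith et al.); this is the standard formulation the title refers to, not the paper's proposed rethinking of it.

```python
import torch

def linear_cka(X, Y):
    """Linear Centered Kernel Alignment between feature matrices
    X (n x d1) and Y (n x d2) over the same n examples; returns a
    similarity score in [0, 1]."""
    X = X - X.mean(dim=0, keepdim=True)  # center each feature dimension
    Y = Y - Y.mean(dim=0, keepdim=True)
    cross = torch.linalg.norm(Y.t() @ X) ** 2  # ||Y^T X||_F^2
    norm_x = torch.linalg.norm(X.t() @ X)      # ||X^T X||_F
    norm_y = torch.linalg.norm(Y.t() @ Y)      # ||Y^T Y||_F
    return cross / (norm_x * norm_y)
```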

Teaching What You Should Teach: A Data-Based Distillation Method

no code implementations • 11 Dec 2022 • Shitong Shao, Huanran Chen, Zhen Huang, Linrui Gong, Shuai Wang, Xinxiao Wu

Specifically, we design a neural-network-based data augmentation module with a priori bias, which learns augmentation magnitudes and probabilities to generate data samples that play to the teacher's strengths but expose the student's weaknesses.

Data Augmentation • Knowledge Distillation +1
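No implementation is available for this paper either. Purely as a hypothetical sketch of an augmentation op with a learnable magnitude and application probability, in the spirit of the module described in the abstract (the class name, the noise-based op, and the gating scheme are all assumptions, not the paper's method):

```python
import torch
import torch.nn as nn

class LearnableAugment(nn.Module):
    """Hypothetical augmentation op: additive noise whose scale
    (magnitude) and per-sample application probability are learned."""
    def __init__(self):
        super().__init__()
        self.magnitude = nn.Parameter(torch.tensor(0.1))   # noise scale
        self.prob_logit = nn.Parameter(torch.tensor(0.0))  # logit of apply-probability

    def forward(self, x):
        p = torch.sigmoid(self.prob_logit)
        # Hard Bernoulli gate per sample; gradients reach `magnitude`
        # through the noise term, while `prob_logit` would need a
        # relaxation (e.g. straight-through) to be trained in practice.
        gate = (torch.rand(x.size(0), device=x.device) < p).float()
        gate = gate.view(-1, *([1] * (x.dim() - 1)))
        return x + gate * self.magnitude * torch.randn_like(x)
```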
