Search Results for author: Muzhou Yu

Found 2 papers, 0 papers with code

Revisiting Data Augmentation in Model Compression: An Empirical and Comprehensive Study

no code implementations • 22 May 2023 • Muzhou Yu, Linfeng Zhang, Kaisheng Ma

In this paper, we revisit the use of data augmentation in model compression and present a comprehensive study of the relation between model size and the optimal data augmentation policy.

Data Augmentation • Knowledge Distillation • +2
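To make the abstract's question concrete, the sketch below sweeps augmentation strength across compressed models of different sizes, which is the kind of grid such a study implies. It is a hypothetical setup, not the paper's protocol: RandAugment is assumed as the policy, the width values are arbitrary, and `train_compressed_model` is a placeholder for the actual compression pipeline.

```python
import torchvision.transforms as T

# Hypothetical sweep: for each (model size, augmentation magnitude) pair,
# train a compressed model and record accuracy. The study asks which
# magnitude is optimal for each size; nothing here presumes the answer.
AUG_MAGNITUDES = [1, 5, 9]       # RandAugment-style strength levels
MODEL_WIDTHS = [0.25, 0.5, 1.0]  # relative widths of the compressed model

def make_transform(magnitude):
    # RandAugment is one common policy whose strength is a single knob.
    return T.Compose([
        T.RandAugment(num_ops=2, magnitude=magnitude),
        T.ToTensor(),
    ])

for width in MODEL_WIDTHS:
    for mag in AUG_MAGNITUDES:
        transform = make_transform(mag)
        # train_compressed_model is a placeholder for the usual pipeline
        # (e.g. distilling into a width-scaled student with this transform):
        # acc = train_compressed_model(width, transform)
        print(f"width={width}, magnitude={mag}: train and record accuracy")
```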

CORSD: Class-Oriented Relational Self Distillation

no code implementations • 28 Apr 2023 • Muzhou Yu, Sia Huat Tan, Kailu Wu, Runpei Dong, Linfeng Zhang, Kaisheng Ma

Knowledge distillation is an effective model compression method, but existing approaches have some limitations: (1) feature-based distillation methods focus only on distilling the feature map and lack the ability to transfer relations between data examples; (2) relational distillation methods are either limited to handcrafted functions for relation extraction, such as the L2 norm, or weak in modeling inter- and intra-class relations (a minimal sketch contrasting these two families follows below).

Knowledge Distillation • Model Compression • +2
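For readers unfamiliar with the distinction the abstract draws, here is a minimal PyTorch sketch of the two baseline families being critiqued: a feature-based loss that matches feature maps per example, and a relational loss built from a handcrafted L2 pairwise-distance relation (RKD-style). This illustrates the baselines only, not CORSD itself; the feature shapes and loss choices are assumptions.

```python
import torch
import torch.nn.functional as F

def feature_distillation_loss(student_feat, teacher_feat):
    # Feature-based distillation: match each example's feature map directly.
    # Transfers per-example features but no relations between examples.
    return F.mse_loss(student_feat, teacher_feat)

def relational_distillation_loss(student_feat, teacher_feat):
    # Relational distillation with a handcrafted relation (L2 pairwise
    # distances within the batch): match the student's distance matrix
    # to the teacher's.
    s = student_feat.flatten(1)  # (batch, features)
    t = teacher_feat.flatten(1)
    s_rel = torch.cdist(s, s, p=2)
    t_rel = torch.cdist(t, t, p=2)
    return F.smooth_l1_loss(s_rel, t_rel)

# Toy usage: random features standing in for penultimate-layer activations.
student_feat = torch.randn(8, 64)
teacher_feat = torch.randn(8, 64)
print(feature_distillation_loss(student_feat, teacher_feat).item())
print(relational_distillation_loss(student_feat, teacher_feat).item())
```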
