Search Results for author: Nithin Anchuri

Found 2 papers, 0 papers with code

RAIL-KD: RAndom Intermediate Layer Mapping for Knowledge Distillation

no code implementations · Findings (NAACL) 2022 · Md Akmal Haidar, Nithin Anchuri, Mehdi Rezagholizadeh, Abbas Ghaddar, Philippe Langlais, Pascal Poupart

To address these problems, we propose a RAndom Intermediate Layer Knowledge Distillation (RAIL-KD) approach, in which intermediate layers of the teacher model are randomly selected and distilled into the intermediate layers of the student model.

Knowledge Distillation
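The random layer-mapping idea from the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the uniform sampling without replacement, and the per-layer MSE objective are all illustrative assumptions based only on the abstract's description.

```python
import numpy as np

def select_teacher_layers(n_teacher, n_student, rng):
    # Assumption: at each training step, randomly pick one distinct
    # teacher layer per student layer, kept in ascending order.
    idx = rng.choice(n_teacher, size=n_student, replace=False)
    return sorted(idx.tolist())

def rail_kd_loss(teacher_states, student_states, layer_map):
    # Assumption: distillation loss is the mean squared error between
    # each student layer and its randomly mapped teacher layer.
    total = 0.0
    for s_i, t_i in enumerate(layer_map):
        diff = teacher_states[t_i] - student_states[s_i]
        total += float(np.mean(diff ** 2))
    return total / len(layer_map)

rng = np.random.default_rng(0)
# Toy setup: 12-layer teacher, 4-layer student, hidden size 8.
teacher = [rng.normal(size=(8,)) for _ in range(12)]
student = [rng.normal(size=(8,)) for _ in range(4)]
mapping = select_teacher_layers(12, 4, rng)
loss = rail_kd_loss(teacher, student, mapping)
```

Because a fresh mapping can be drawn every step, the student is regularized by many different teacher-layer views instead of a single fixed mapping.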
