Search Results for author: Caglar Kilcioglu

Found 1 paper, 0 papers with code

PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation

No code implementations · 26 Feb 2021 · Reyhan Kevser Keser, Aydin Ayanzadeh, Omid Abdollahi Aghdam, Caglar Kilcioglu, Behcet Ugur Toreyin, Nazim Kemal Ure

One of the most efficient methods for model compression is hint distillation, where the student model is injected with information (hints) from several different layers of the teacher model.
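The hint-distillation idea described above (injecting a student model with intermediate-layer features from a teacher) can be sketched as a minimal loss computation. This is an illustrative FitNets-style sketch with hypothetical feature shapes and a fixed random projection standing in for a learned regressor, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Intermediate activations from a chosen teacher "hint" layer and the
# corresponding student "guided" layer (batch of 4; hypothetical widths).
teacher_hint = rng.normal(size=(4, 16))
student_guided = rng.normal(size=(4, 8))

# A regressor normally learns to map student features to the teacher's
# width; here a fixed random projection stands in for it.
regressor = rng.normal(size=(8, 16)) / np.sqrt(8)

def hint_loss(student_feat, teacher_feat, projection):
    """Mean-squared error between projected student and teacher features."""
    projected = student_feat @ projection
    return float(np.mean((projected - teacher_feat) ** 2))

loss = hint_loss(student_guided, teacher_hint, regressor)
print(loss >= 0.0)
```

In training, this hint loss would be added to the usual distillation objective, pulling the student's intermediate representations toward the teacher's at the selected hint layers.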

Tasks: Clustering · Knowledge Distillation · +1
