Search Results for author: Omid Abdollahi Aghdam

Found 2 papers, 0 papers with code

PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation

no code implementations26 Feb 2021 Reyhan Kevser Keser, Aydin Ayanzadeh, Omid Abdollahi Aghdam, Caglar Kilcioglu, Behcet Ugur Toreyin, Nazim Kemal Ure

One of the most efficient methods for model compression is hint distillation, where the student model is injected with information (hints) from several different layers of the teacher model.

Tasks: Clustering, Knowledge Distillation, +1
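The abstract above describes hint distillation: intermediate features from chosen teacher layers supervise corresponding student layers. A minimal sketch of that idea (a generic FitNets-style hint loss, not the paper's PURSUhInT selection method; the shapes and the fixed random regressor are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical intermediate activations for one mini-batch
# (batch of 4; teacher layer width 16, student layer width 8).
teacher_hint = rng.normal(size=(4, 16))   # hint from a chosen teacher layer
student_feat = rng.normal(size=(4, 8))    # guided student layer output

# A regressor maps student features to the teacher's width; in training
# this would be a learnable layer, here a fixed random matrix stands in.
regressor = rng.normal(size=(8, 16))

def hint_loss(student, teacher, W):
    """Mean-squared error between regressed student features and hints."""
    return float(np.mean((student @ W - teacher) ** 2))

loss = hint_loss(student_feat, teacher_hint, regressor)
print(loss)
```

In practice this loss is summed over the selected hint points and minimized alongside the usual distillation objective; which layers to select is exactly the question the paper addresses via layer clustering.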

Exploring Factors for Improving Low Resolution Face Recognition

no code implementations23 Jul 2019 Omid Abdollahi Aghdam, Behzad Bozorgtabar, Hazim Kemal Ekenel, Jean-Philippe Thiran

By leveraging this information, we have utilized deep face models trained on MS-Celeb-1M and fine-tuned on VGGFace2 dataset and achieved state-of-the-art accuracies on the SCFace and ICB-RW benchmarks, even without using any training data from the datasets of these benchmarks.

Tasks: Face Recognition
