Search Results for author: Amin Parchami-Araghi

Found 3 papers, 3 papers with code

Good Teachers Explain: Explanation-Enhanced Knowledge Distillation

1 code implementation • 5 Feb 2024 • Amin Parchami-Araghi, Moritz Böhle, Sukrut Rao, Bernt Schiele

Knowledge Distillation (KD) has proven effective for compressing large teacher models into smaller student models.

Knowledge Distillation
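
To make the listed abstract concrete, below is a minimal PyTorch sketch of the standard knowledge-distillation objective (softened teacher/student KL divergence plus label cross-entropy). This is the generic KD baseline the abstract refers to, not the explanation-enhanced loss this paper proposes; the `temperature` and `alpha` values are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.5):
    # Generic KD sketch (Hinton et al.-style); NOT this paper's
    # explanation-enhanced variant. Hyperparameters are illustrative.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL between the softened distributions; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures.
    distill = F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature ** 2
    # Ordinary cross-entropy on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * ce
```

In a training step, teacher and student see the same batch and only the student receives gradients; the teacher's logits are treated as fixed targets.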

Using Explanations to Guide Models

1 code implementation • 21 Mar 2023 • Sukrut Rao, Moritz Böhle, Amin Parchami-Araghi, Bernt Schiele

To gain a better understanding of which model-guiding approaches actually transfer to more challenging real-world datasets, we conduct an in-depth evaluation across various loss functions, attribution methods, models, and 'guidance depths' on the PASCAL VOC 2007 and MS COCO 2014 datasets, and show that model guidance can sometimes even improve model performance.
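
As a rough illustration of what a model-guidance loss can look like, the sketch below implements one energy-style localization objective: it measures how much positively attributed evidence falls inside an annotated region (e.g., a bounding-box mask) and penalizes the remainder. The helper name `energy_guidance_loss` and the tensor shapes are assumptions for illustration; the paper compares several such loss functions rather than prescribing this one.

```python
import torch

def energy_guidance_loss(attribution, mask, eps=1e-8):
    # Hypothetical sketch of an energy-style guidance loss.
    # attribution, mask: (B, H, W); mask is 1 inside annotated regions.
    pos = attribution.clamp(min=0)                # count positive evidence only
    inside = (pos * mask).flatten(1).sum(dim=1)   # evidence inside the region
    total = pos.flatten(1).sum(dim=1) + eps       # all positive evidence
    # Loss is the fraction of evidence that leaks outside the region.
    return (1.0 - inside / total).mean()
```

Added to the usual classification loss with a small weight, a term like this nudges the model's evidence toward the annotated objects.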

Studying How to Efficiently and Effectively Guide Models with Explanations

1 code implementation • ICCV 2023 • Sukrut Rao, Moritz Böhle, Amin Parchami-Araghi, Bernt Schiele

To better understand the effectiveness of the various design choices explored in the context of model guidance, we conduct an in-depth evaluation across various loss functions, attribution methods, models, and 'guidance depths' on the PASCAL VOC 2007 and MS COCO 2014 datasets.
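
Any guidance loss of the kind sketched above needs an attribution map to act on. As a minimal, generic example (an assumption for illustration, not necessarily one of the attribution methods the paper evaluates), the plain input gradient for a target class can serve:

```python
import torch

def input_gradient_attribution(model, images, class_idx):
    # Generic input-gradient attribution; the helper name is hypothetical.
    # images: (B, C, H, W); class_idx: (B,) target class per sample.
    images = images.clone().requires_grad_(True)
    logits = model(images)                                     # (B, num_classes)
    target = logits[torch.arange(images.shape[0]), class_idx].sum()
    grads, = torch.autograd.grad(target, images)
    return grads.abs().sum(dim=1)                              # (B, H, W) map
```

The resulting map can be fed directly to a localization loss like the one sketched earlier.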
