Search Results for author: Pradipto Mondal

Found 1 paper, 1 paper with code

DeiT-LT: Distillation Strikes Back for Vision Transformer Training on Long-Tailed Datasets

1 code implementation • 3 Apr 2024 • Harsh Rangwani, Pradipto Mondal, Mayank Mishra, Ashish Ramayee Asokan, R. Venkatesh Babu

In DeiT-LT, we introduce an efficient and effective way of distillation from CNN via distillation DIST token by using out-of-distribution images and re-weighting the distillation loss to enhance focus on tail classes.
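As a rough illustration of the re-weighted distillation loss described in the abstract, here is a minimal PyTorch sketch: a KL-divergence distillation term between the student's DIST-token predictions and a CNN teacher, scaled per sample so that tail-class examples contribute more. The inverse-frequency weighting, function name, and hyperparameters are illustrative assumptions, not the paper's exact formulation.

    import torch
    import torch.nn.functional as F

    def reweighted_distillation_loss(student_dist_logits, teacher_logits,
                                     targets, class_counts, tau=1.0):
        # Hypothetical sketch: KD loss between the student's DIST-token logits
        # and a CNN teacher, re-weighted toward rare (tail) classes.
        # Inverse-frequency weights, normalized to mean 1 (one plausible choice).
        weights = class_counts.sum() / (len(class_counts) * class_counts.float())
        sample_weights = weights[targets]

        # Standard soft-target KL divergence, computed per sample.
        log_p_student = F.log_softmax(student_dist_logits / tau, dim=-1)
        p_teacher = F.softmax(teacher_logits / tau, dim=-1)
        kl_per_sample = F.kl_div(log_p_student, p_teacher,
                                 reduction="none").sum(dim=-1)

        # Re-weight and average; tau**2 keeps the gradient scale comparable
        # to a cross-entropy term.
        return (sample_weights * kl_per_sample).mean() * tau ** 2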

 Ranked #1 on Image Classification on iNaturalist (Overall metric)

Tasks: Image Classification, Inductive Bias, +1
