no code implementations • 1 Sep 2020 • Sajjad Abbasi, Mohsen Hajabdollahi, Pejman Khadivi, Nader Karimi, Roshanak Roshandel, Shahram Shirani, Shadrokh Samavi
Knowledge distillation addresses some of the shortcomings of transfer learning by transferring the generalization ability of a complex (teacher) model to a lighter (student) model.
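As a minimal sketch (not the paper's own code), the standard temperature-scaled distillation loss, which matches the student's softened output distribution to the teacher's, can be written as follows; the function names and the choice of temperature are illustrative assumptions:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between teacher and student distributions,
    both softened with temperature T and scaled by T^2 so gradient
    magnitudes stay comparable across temperatures (an assumed,
    commonly used convention)."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

When the student's logits match the teacher's exactly, the loss is zero; any mismatch yields a positive penalty, so minimizing it pushes the lighter model toward the heavier model's output behavior.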