no code implementations • 1 Sep 2020 • Sajjad Abbasi, Mohsen Hajabdollahi, Pejman Khadivi, Nader Karimi, Roshanak Roshandel, Shahram Shirani, Shadrokh Samavi
Knowledge distillation addresses some of the shortcomings of transfer learning by compressing the knowledge of a complex model into a lighter model.
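As context for the distillation papers listed here: below is a minimal sketch of a generic temperature-scaled distillation loss in the style of Hinton et al., assuming PyTorch; the function name distillation_loss and the hyperparameters T and alpha are illustrative choices, not taken from these papers.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Generic KD loss: soft-target KL term plus hard-label cross-entropy.

    T and alpha are illustrative hyperparameters, not values from the papers.
    """
    # Soften both distributions with temperature T and match them via KL divergence.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so the soft term's gradients match the hard term's magnitude
    # Ordinary cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Example: a large "teacher" guides a smaller "student" on one batch.
teacher_logits = torch.randn(8, 10)                       # stand-in teacher outputs
student_logits = torch.randn(8, 10, requires_grad=True)   # stand-in student outputs
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

The temperature softens both output distributions so the student can learn from the teacher's relative class probabilities, and the T*T factor keeps the soft term's gradient scale comparable to the hard cross-entropy term.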
no code implementations • 9 Feb 2020 • Sajjad Abbasi, Mohsen Hajabdollahi, Nader Karimi, Shadrokh Samavi, Shahram Shirani
Knowledge distillation was recently proposed to transfer the knowledge of one model to another and can help cover the shortcomings of transfer learning.
no code implementations • 31 Dec 2019 • Sajjad Abbasi, Mohsen Hajabdollahi, Nader Karimi, Shadrokh Samavi
Utilizing the proposed model, different knowledge distillation (KD) methods can be investigated and explored more effectively.