Search Results for author: Sharan Seshadri

Found 1 paper, 0 papers with code

Beyond Classification: Knowledge Distillation using Multi-Object Impressions

no code implementations · 27 Oct 2021 · Gaurav Kumar Nayak, Monish Keswani, Sharan Seshadri, Anirban Chakraborty

Knowledge Distillation (KD) utilizes training data as a transfer set to transfer knowledge from a complex network (Teacher) to a smaller network (Student).
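The abstract snippet above describes the standard KD setup the paper builds on. As a point of reference, a minimal sketch of the classic distillation objective (temperature-softened KL divergence between Teacher and Student outputs, per Hinton et al.) is shown below; this illustrates generic KD only, not the paper's multi-object-impression method, and the function names and example logits are illustrative assumptions:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as in standard knowledge distillation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((T ** 2) * np.sum(p * (np.log(p) - np.log(q))))

# Illustrative logits (assumed values, not from the paper):
teacher = [4.0, 1.0, 0.5]
matched = [4.0, 1.0, 0.5]      # student agrees with teacher
mismatched = [0.5, 1.0, 4.0]   # student disagrees

loss_match = distillation_loss(teacher, matched)
loss_mismatch = distillation_loss(teacher, mismatched)
# Matching logits yield zero loss; divergent logits yield a larger loss.
```

The temperature T softens both distributions so the Student also learns from the Teacher's relative confidences on non-target classes, which is the "dark knowledge" KD is meant to transfer.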

Tasks: Classification · Knowledge Distillation · +3
