Search Results for author: Vishwajith Kumar

Found 1 paper, 0 papers with code

Variational Student: Learning Compact and Sparser Networks in Knowledge Distillation Framework

no code implementations · 26 Oct 2019 · Srinidhi Hegde, Ranjitha Prasad, Ramya Hebbalaguppe, Vishwajith Kumar

We demonstrate that combining KD with VI techniques inherits the compression properties of the KD framework and enhances the sparsity obtained from the VI approach, with minimal compromise in model accuracy.

Knowledge Distillation · Variational Inference
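
The abstract above describes coupling a knowledge-distillation objective with a variational-inference regularizer that promotes sparsity in the student. The following is a minimal, hypothetical PyTorch sketch of that general idea, not the authors' implementation: the mean-field Gaussian posterior, the zero-mean prior, and every hyperparameter (temperature, alpha, beta, prior_std) are illustrative assumptions.

# Hypothetical sketch: distillation loss plus a VI (KL-to-prior) sparsity penalty.
# Not the paper's code; all names and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VariationalLinear(nn.Module):
    """Linear layer with a factorized Gaussian posterior over its weights."""

    def __init__(self, in_features, out_features, prior_std=0.1):
        super().__init__()
        self.w_mu = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.w_logvar = nn.Parameter(torch.full((out_features, in_features), -6.0))
        self.bias = nn.Parameter(torch.zeros(out_features))
        self.prior_std = prior_std

    def forward(self, x):
        # Reparameterization trick: sample weights from the posterior.
        std = torch.exp(0.5 * self.w_logvar)
        w = self.w_mu + std * torch.randn_like(std)
        return F.linear(x, w, self.bias)

    def kl(self):
        # KL(q(w) || N(0, prior_std^2)); a narrow zero-mean prior pushes
        # posterior means toward zero, i.e. toward prunable weights.
        var = torch.exp(self.w_logvar)
        prior_var = self.prior_std ** 2
        return 0.5 * ((var + self.w_mu ** 2) / prior_var
                      - 1.0 - self.w_logvar
                      + torch.log(torch.tensor(prior_var))).sum()


def student_loss(student_logits, teacher_logits, targets, kl_term,
                 temperature=4.0, alpha=0.7, beta=1e-4):
    """Cross-entropy + softened-logit distillation + VI KL penalty."""
    ce = F.cross_entropy(student_logits, targets)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return (1 - alpha) * ce + alpha * kd + beta * kl_term


# Toy usage with random data and a frozen "teacher" network.
teacher = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 5)).eval()
student = VariationalLinear(20, 5)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(32, 20)
y = torch.randint(0, 5, (32,))
with torch.no_grad():
    t_logits = teacher(x)

s_logits = student(x)
loss = student_loss(s_logits, t_logits, y, student.kl())
loss.backward()
optimizer.step()

In this sketch the KD term transfers the teacher's softened predictions to the compact student, while the KL-to-prior term drives many posterior weight means toward zero, which is one plausible way to realize the compression-plus-sparsity combination the abstract claims.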
