1 Oct 2022 • Jash Rathod, Nauman Dawalatabad, Shatrughan Singh, Dhananjaya Gowda
Knowledge distillation (KD) is a popular model compression approach that has been shown to achieve a smaller model size with relatively little degradation in model performance.
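As context for the abstract, a minimal sketch of the standard soft-target distillation loss (the temperature-scaled KL divergence from Hinton et al.'s formulation, not necessarily the exact loss used in this paper) looks like:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T gives a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 so gradients keep comparable magnitude across T.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl
```

In practice this term is combined with the usual cross-entropy on ground-truth labels, weighted by a mixing coefficient; the smaller student model is trained to match the teacher's softened output distribution.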