Search Results for author: Khizir Siddiqui

Found 1 paper, 1 paper with code

KD-Lib: A PyTorch library for Knowledge Distillation, Pruning and Quantization

1 code implementation • 30 Nov 2020 • Het Shah, Avishree Khare, Neelay Shah, Khizir Siddiqui

In recent years, the growing size of neural networks has led to extensive research on compression techniques that mitigate the drawbacks of such large models.

Knowledge Distillation • Model Compression • +1
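
For context on the paper's central technique, below is a minimal, library-agnostic sketch of the vanilla knowledge distillation loss in plain PyTorch. The function name `distillation_loss` and the hyperparameter defaults are illustrative assumptions, not KD-Lib's actual API.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Vanilla knowledge distillation loss: a weighted sum of the
    hard-label cross-entropy and the temperature-scaled KL divergence
    between the student's and teacher's softened output distributions."""
    # Standard supervised loss against the ground-truth labels
    hard_loss = F.cross_entropy(student_logits, labels)
    # Soft-target loss: match the teacher's temperature-softened distribution;
    # the T^2 factor keeps gradient magnitudes comparable across temperatures
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    return alpha * hard_loss + (1 - alpha) * soft_loss

# Toy usage with random tensors standing in for model outputs
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
```

In practice the teacher logits come from a frozen, pretrained model run in `torch.no_grad()`, and only the student's parameters are updated with this loss.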
