Search Results for author: Subby Olubeko

Found 1 paper, 1 paper with code

SlimNets: An Exploration of Deep Model Compression and Acceleration

1 code implementation · 1 Aug 2018 · Ini Oguntola, Subby Olubeko, Christopher Sweeney

We show that by combining pruning and knowledge distillation methods we can create a compressed network 85 times smaller than the original, all while retaining 96% of the original model's accuracy.
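The listing shows only the abstract, but the general recipe it describes, pruning a network and then training it against a larger teacher via knowledge distillation, can be sketched briefly. The snippet below is a minimal illustration in PyTorch, not the paper's actual code: the layer sizes, pruning ratio, temperature, and loss weighting are all hypothetical choices made for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.nn.utils.prune as prune

# Hypothetical teacher (large) and student (small) networks for illustration.
teacher = nn.Sequential(nn.Linear(784, 1200), nn.ReLU(), nn.Linear(1200, 10)).eval()
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

# Magnitude pruning: zero out the smallest 80% of weights in each Linear layer.
for module in student.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.8)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend the soft-target KL term (teacher) with hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
x = torch.randn(32, 784)               # dummy batch of flattened inputs
labels = torch.randint(0, 10, (32,))   # dummy labels

with torch.no_grad():
    teacher_logits = teacher(x)
loss = distillation_loss(student(x), teacher_logits, labels)
optimizer.zero_grad()
loss.backward()
loss.item()
optimizer.step()
```

In a full pipeline this step would be repeated over the training set, with the pruned-and-distilled student being the small model whose size and accuracy are compared against the original.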

Knowledge Distillation · Model Compression