Evaluation Metrics for DNNs Compression
There is significant ongoing research effort into developing techniques for neural network compression. However, the community lacks standardised evaluation metrics, which are key to identifying the most suitable compression technique for different applications. This paper reviews existing neural network compression evaluation metrics and implements them in a standardisation framework called NetZIP. We introduce two novel metrics to cover existing evaluation gaps in the literature: 1) Compression and Hardware Agnostic Theoretical Speed (CHATS) and 2) Overall Compression Success (OCS). We demonstrate the use of NetZIP through two case studies on two different hardware platforms (a PC and a Raspberry Pi 4), focusing on object classification and object detection.