ZeroQ: A Novel Zero Shot Quantization Framework

Quantization is a promising approach for reducing the inference time and memory footprint of neural networks. However, most existing quantization methods require access to the original training dataset for retraining during quantization, which is often impossible for applications with sensitive or proprietary data, e.g., due to privacy and security concerns. ZeroQ addresses this with a zero-shot quantization framework that needs no access to training or validation data: it optimizes a Distilled Dataset, engineered to match the batch normalization statistics across the layers of the pretrained network, and uses it to calibrate quantization. ZeroQ supports both uniform and mixed-precision quantization; for the latter, a Pareto frontier based method automatically determines the bit setting for each layer with no manual search.
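The Distilled Dataset idea can be illustrated with a short sketch. The following is a minimal sketch assuming PyTorch and a pretrained CNN containing BatchNorm2d layers; the function name distill_data and all hyperparameters (learning rate, step count, image size) are illustrative assumptions, not the authors' released API. It optimizes random inputs so that the activation statistics they induce at each batch normalization layer match that layer's stored running mean and variance, as described above.

```python
import torch
import torch.nn as nn

def distill_data(model: nn.Module, batch_size: int = 32,
                 image_size: int = 224, steps: int = 500) -> torch.Tensor:
    """Optimize random images so the statistics of each BN layer's input
    match the running statistics stored in the pretrained model."""
    model.eval()
    x = torch.randn(batch_size, 3, image_size, image_size, requires_grad=True)
    optimizer = torch.optim.Adam([x], lr=0.5)

    stats = []  # (batch_mean, batch_var, running_mean, running_var) per BN layer

    def bn_hook(module, inputs, output):
        inp = inputs[0]
        stats.append((inp.mean(dim=(0, 2, 3)),
                      inp.var(dim=(0, 2, 3), unbiased=False),
                      module.running_mean, module.running_var))

    handles = [m.register_forward_hook(bn_hook)
               for m in model.modules() if isinstance(m, nn.BatchNorm2d)]

    for _ in range(steps):
        stats.clear()
        optimizer.zero_grad()
        model(x)
        # Squared distance between induced and stored BN statistics.
        loss = sum((m - rm).pow(2).sum() + (v - rv).pow(2).sum()
                   for m, v, rm, rv in stats)
        loss.backward()
        optimizer.step()

    for h in handles:
        h.remove()
    return x.detach()
```

With such a routine, data-free quantization amounts to calibrating quantization ranges on distill_data(model) instead of on real training images.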

Datasets

ImageNet, MS-COCO

Methods used in the Paper


METHOD                             TYPE
Label Smoothing                    Regularization
RMSProp                            Stochastic Optimization
Auxiliary Classifier               Miscellaneous Components
Inception-v3 Module                Image Model Blocks
Dropout                            Regularization
Inception-v3                       Convolutional Neural Networks
Grouped Convolution                Convolutions
Depthwise Separable Convolution    Convolutions
1x1 Convolution                    Convolutions
ReLU                               Activation Functions
Inverted Residual Block            Skip Connection Blocks
Depthwise Convolution              Convolutions
Pointwise Convolution              Convolutions
Residual Connection                Skip Connections
Convolution                        Convolutions
Average Pooling                    Pooling Operations
Channel Shuffle                    Miscellaneous Components
Groupwise Point Convolution        Convolutions
ShuffleNet Block                   Image Model Blocks
Global Average Pooling             Pooling Operations
MobileNetV2                        Image Models
Dense Connections                  Feedforward Networks
Max Pooling                        Pooling Operations
Softmax                            Output Functions
ShuffleNet                         Convolutional Neural Networks
Batch Normalization                Normalization
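
The blocks above (convolutions, batch normalization, pooling) are the layers that get quantized in the evaluated models. For background, the sketch below shows plain symmetric per-tensor uniform quantization, one of the schemes the paper supports; it is a generic illustration with hypothetical helper names (uniform_quantize, dequantize), not the paper's exact implementation.

```python
import torch

def uniform_quantize(x: torch.Tensor, num_bits: int = 8):
    """Symmetric per-tensor uniform quantization (illustrative only)."""
    qmax = 2 ** (num_bits - 1) - 1                    # e.g. 127 for 8 bits
    scale = x.abs().max().clamp(min=1e-8) / qmax      # map max magnitude to qmax
    q = torch.clamp(torch.round(x / scale), -qmax - 1, qmax)
    return q.to(torch.int8 if num_bits == 8 else torch.int32), scale

def dequantize(q: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    """Map integer codes back to floating point."""
    return q.float() * scale

w = torch.randn(64, 64)
q, s = uniform_quantize(w, num_bits=4)
w_hat = dequantize(q, s)
print((w - w_hat).abs().max())  # quantization error shrinks as num_bits grows
```

Mixed-precision quantization then varies num_bits per layer; ZeroQ's contribution is choosing those per-layer bit widths automatically, via a Pareto frontier over model size and the sensitivity measured on the Distilled Dataset.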