
Neural Network Compression

23 papers with code · Methodology
Subtask of Model Compression


Latest papers with code

REST: Robust and Efficient Neural Networks for Sleep Monitoring in the Wild

29 Jan 2020 · duggalrahul/REST

By deploying these models to an Android application on a smartphone, we quantitatively observe that REST allows models to achieve up to 17x energy reduction and 9x faster inference.

EEG · NEURAL NETWORK COMPRESSION · SLEEP STAGE DETECTION

★ 8

Quantisation and Pruning for Neural Network Compression and Regularisation

14 Jan 2020 · kpaupamah/compression-and-regularisation

Deep neural networks are typically too computationally expensive to run in real-time on consumer-grade hardware and low-powered devices.

NETWORK PRUNING · NEURAL NETWORK COMPRESSION

★ 2
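The pruning side of the entry above can be illustrated with unstructured magnitude pruning, where the smallest-magnitude weights are zeroed out. This is a minimal NumPy sketch of the general technique, not the paper's exact procedure; the function name and threshold scheme are illustrative:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries until roughly
    `sparsity` fraction of the tensor is zero (unstructured pruning)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

w = np.array([[0.1, -0.9], [0.5, -0.05]])
pruned = magnitude_prune(w, 0.5)  # the two smallest entries are zeroed
```

In practice the surviving weights are fine-tuned afterwards, which is where the regularisation effect studied in the paper comes in.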

Data-Free Learning of Student Networks

ICCV 2019 · huawei-noah/DAFL

Learning portable neural networks is essential in computer vision, so that pre-trained heavy deep models can be deployed on edge devices such as mobile phones and micro sensors.

NEURAL NETWORK COMPRESSION

★ 274 · 01 Oct 2019

IR-Net: Forward and Backward Information Retention for Highly Accurate Binary Neural Networks

24 Sep 2019 · JDAI-CV/dabnn

Weight and activation binarization is an effective approach to deep neural network compression and can accelerate the inference by leveraging bitwise operations.

NEURAL NETWORK COMPRESSION · QUANTIZATION

★ 582
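As a minimal sketch of the binarization idea described above (not IR-Net's specific information-retention scheme), the forward pass replaces each weight or activation with its sign, and the backward pass uses a straight-through estimator that passes gradients through the non-differentiable sign while clipping them outside a window:

```python
import numpy as np

def binarize_forward(w):
    """Binarize values to {-1, +1}; inference can then be
    implemented with XNOR/popcount bitwise operations."""
    return np.where(w >= 0, 1.0, -1.0)

def ste_backward(w, grad_out, clip=1.0):
    """Straight-through estimator: copy the incoming gradient,
    zeroing it where |w| exceeds the clipping window."""
    return grad_out * (np.abs(w) <= clip)

w = np.array([0.3, -0.7, 1.5])
b = binarize_forward(w)          # signs of the weights
g = ste_backward(w, np.ones(3))  # gradient blocked for |w| > 1
```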

COP: Customized Deep Model Compression via Regularized Correlation-Based Filter-Level Pruning

25 Jun 2019 · ZJULearning/COP

Cross-layer filter comparison is unachievable since filter importance is defined locally within each layer.

NEURAL NETWORK COMPRESSION

★ 35

Learning Sparse Networks Using Targeted Dropout

31 May 2019for-ai/TD

Before computing the gradients for each weight update, targeted dropout stochastically selects a set of units or weights to be dropped using a simple self-reinforcing sparsity criterion and then computes the gradients for the remaining weights.

NETWORK PRUNING · NEURAL NETWORK COMPRESSION

★ 237
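The selection step described above can be sketched in NumPy: the lowest-magnitude weights form the targeted candidate set, and each candidate is dropped independently before the gradient step. The rates, seed, and function name here are illustrative, not the paper's exact hyperparameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def targeted_dropout(w, targ_rate=0.5, drop_rate=0.5):
    """Targeted dropout sketch: the `targ_rate` fraction of
    lowest-magnitude weights are candidates; each candidate is
    dropped with probability `drop_rate` before the update."""
    flat_idx = np.argsort(np.abs(w), axis=None)
    n_targ = int(targ_rate * w.size)
    candidates = flat_idx[:n_targ]          # smallest-magnitude weights
    out = w.copy().ravel()
    drop = rng.random(n_targ) < drop_rate   # stochastic drop decisions
    out[candidates[drop]] = 0.0
    return out.reshape(w.shape)

w = np.array([0.01, -0.02, 0.8, -0.9])
out = targeted_dropout(w)  # large weights are never candidates
```

Because only small weights can ever be dropped, training reinforces a sparsity pattern that makes post-hoc pruning nearly free.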


MUSCO: Multi-Stage Compression of neural networks

24 Mar 2019 · juliagusak/musco

Low-rank tensor approximation is a promising approach to compressing deep neural networks.

NEURAL NETWORK COMPRESSION

★ 30
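A minimal sketch of the underlying idea, using a plain truncated SVD of a single weight matrix (MUSCO itself applies multi-stage tensor decompositions; the rank and shapes here are illustrative):

```python
import numpy as np

def low_rank_factorize(w, rank):
    """Replace a dense layer W (m x n) with factors A (m x r) and
    B (r x n) via truncated SVD; the parameter count drops from
    m*n to r*(m + n) when r is small."""
    u, s, vt = np.linalg.svd(w, full_matrices=False)
    a = u[:, :rank] * s[:rank]   # m x r, singular values folded in
    b = vt[:rank, :]             # r x n
    return a, b

w = np.random.default_rng(1).standard_normal((64, 64))
a, b = low_rank_factorize(w, rank=8)
# a @ b approximates w with 8*(64+64) = 1024 parameters vs 4096
```

The factored layer is then fine-tuned to recover accuracy, and the compress/fine-tune cycle can be repeated, which is the multi-stage aspect of the paper.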

Focused Quantization for Sparse CNNs

NeurIPS 2019 · deep-fry/mayo

On ResNet-50, we achieved an 18.08x compression ratio with only a 0.24% loss in top-5 accuracy, outperforming existing compression methods.

NEURAL NETWORK COMPRESSION · QUANTIZATION

★ 76 · 07 Mar 2019