BranchyNet: Fast Inference via Early Exiting from Deep Neural Networks

Deep neural networks are state-of-the-art methods for many learning tasks due to their ability to extract increasingly better features at each network layer. However, the improved performance of additional layers in a deep network comes at the cost of added latency and energy usage in feedforward inference...
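The core idea behind BranchyNet is to attach side-branch classifiers to early layers and return a prediction as soon as a branch is sufficiently confident, measured by the entropy of its softmax output. The sketch below is a minimal, framework-free illustration of that control flow, not the paper's implementation: `branches` (callables producing class logits) and the per-branch `thresholds` are hypothetical stand-ins for trained exit branches and tuned entropy thresholds.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def entropy(probs):
    # Shannon entropy in nats; low entropy = confident prediction.
    return -sum(p * math.log(p) for p in probs if p > 0)

def branchy_infer(x, branches, thresholds):
    """Run exit branches in order and return the first predicted class
    whose softmax entropy falls below that branch's threshold.
    `thresholds` covers the early branches; the final branch in
    `branches` always answers if no early exit fires."""
    for branch, tau in zip(branches, thresholds):
        probs = softmax(branch(x))
        if entropy(probs) < tau:
            return probs.index(max(probs))  # confident: exit early
    # No early branch was confident enough: use the final classifier.
    probs = softmax(branches[-1](x))
    return probs.index(max(probs))
```

For example, a sharply peaked early branch (logits `[5.0, 0.0]`) has entropy well under a threshold of 0.3 and exits immediately, while a near-uniform one (logits `[0.1, 0.0]`) falls through to the final classifier, which is how the scheme saves latency on easy inputs.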


Methods used in the Paper

METHOD                       | TYPE
Early exiting                | Loss Functions
1x1 Convolution              | Convolutions
Convolution                  | Convolutions
Local Response Normalization | Normalization
Grouped Convolution          | Convolutions
ReLU                         | Activation Functions
Dropout                      | Regularization
Dense Connections            | Feedforward Networks
Max Pooling                  | Pooling Operations
Softmax                      | Output Functions
AlexNet                      | Convolutional Neural Networks