Skip Connection Blocks

Dense Block

Introduced by Huang et al. in Densely Connected Convolutional Networks

A Dense Block is a module used in convolutional neural networks that connects all layers (with matching feature-map sizes) directly with each other. It was originally proposed as part of the DenseNet architecture. To preserve the feed-forward nature, each layer obtains additional inputs from all preceding layers and passes its own feature-maps on to all subsequent layers. In contrast to ResNets, features are never combined through summation before they are passed into a layer; instead, they are combined by concatenation. Hence, the $\ell^{th}$ layer has $\ell$ inputs, consisting of the feature-maps of all preceding convolutional blocks, and computes $x_{\ell} = H_{\ell}([x_{0}, x_{1}, \ldots, x_{\ell-1}])$, where $[\cdot]$ denotes channel-wise concatenation and $H_{\ell}$ is a composite function such as BN-ReLU-Conv. Its own feature-maps are passed on to all $L-\ell$ subsequent layers. This introduces $\frac{L(L+1)}{2}$ connections in an $L$-layer network, instead of just $L$ as in traditional architectures: "dense connectivity".

Source: Densely Connected Convolutional Networks
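A minimal PyTorch sketch of this connectivity pattern, assuming the paper's basic BN-ReLU-Conv composite function with 3x3 convolutions and a fixed growth rate (the number of feature-maps each layer adds); the names `DenseLayer` and `DenseBlock` are illustrative, and the paper's bottleneck (1x1 convolution) variant and transition layers are omitted:

```python
import torch
import torch.nn as nn


class DenseLayer(nn.Module):
    """One composite function H_l: BN -> ReLU -> 3x3 Conv (the paper's basic layer)."""

    def __init__(self, in_channels: int, growth_rate: int):
        super().__init__()
        self.norm = nn.BatchNorm2d(in_channels)
        self.relu = nn.ReLU(inplace=True)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, padding=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.conv(self.relu(self.norm(x)))


class DenseBlock(nn.Module):
    """Each layer receives the concatenation of all preceding feature-maps."""

    def __init__(self, num_layers: int, in_channels: int, growth_rate: int):
        super().__init__()
        # The l-th layer sees in_channels + l * growth_rate input channels,
        # since every earlier layer has contributed growth_rate feature-maps.
        self.layers = nn.ModuleList(
            DenseLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(num_layers)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            # x_l = H_l([x_0, x_1, ..., x_{l-1}]), concatenated along channels.
            out = layer(torch.cat(features, dim=1))
            features.append(out)
        return torch.cat(features, dim=1)


# Usage: a 4-layer block on 64 input channels with growth rate 32
# yields 64 + 4 * 32 = 192 output channels.
block = DenseBlock(num_layers=4, in_channels=64, growth_rate=32)
y = block(torch.randn(1, 64, 56, 56))
print(y.shape)  # torch.Size([1, 192, 56, 56])
```

Note how concatenation (rather than summation, as in ResNets) makes the channel count grow linearly with depth, which is why DenseNets interleave dense blocks with transition layers that compress the feature-maps.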

Tasks

Task                     Papers   Share
Image Classification         81   9.51%
Classification               59   6.92%
General Classification       59   6.92%
Deep Learning                28   3.29%
Semantic Segmentation        24   2.82%
Super-Resolution             17   2.00%
Image Super-Resolution       15   1.76%
Object Detection             15   1.76%
Decoder                      13   1.53%
