Image Model Blocks

Dense Block

Introduced by Huang et al. in Densely Connected Convolutional Networks

A Dense Block is a module used in convolutional neural networks that connects all layers (with matching feature-map sizes) directly with each other. It was originally proposed as part of the DenseNet architecture. To preserve the feed-forward nature, each layer obtains additional inputs from all preceding layers and passes on its own feature-maps to all subsequent layers. In contrast to ResNets, features are never combined through summation before they are passed into a layer; instead, they are combined by concatenation. Hence, the $\ell^{th}$ layer has $\ell$ inputs, consisting of the feature-maps of all preceding convolutional blocks, and its own feature-maps are passed on to all $L-\ell$ subsequent layers. This introduces $\frac{L(L+1)}{2}$ connections in an $L$-layer network, instead of just $L$ as in traditional architectures: "dense connectivity".

Source: Densely Connected Convolutional Networks
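As an illustration, below is a minimal sketch of a Dense Block in PyTorch (the framework choice, and the names DenseLayer, DenseBlock, growth_rate, and num_layers, are assumptions for this example; bottleneck 1x1 convolutions and dropout from the paper are omitted). It shows the concatenation-based connectivity: layer $i$ receives in_channels + i * growth_rate input channels, and the block outputs in_channels + num_layers * growth_rate channels.

    # Minimal Dense Block sketch (assumed PyTorch implementation, not the
    # reference code). Each layer is BN -> ReLU -> 3x3 Conv, and inputs are
    # concatenated rather than summed.
    import torch
    import torch.nn as nn


    class DenseLayer(nn.Module):
        """One layer of a Dense Block: BN -> ReLU -> 3x3 Conv."""

        def __init__(self, in_channels: int, growth_rate: int):
            super().__init__()
            self.norm = nn.BatchNorm2d(in_channels)
            self.conv = nn.Conv2d(in_channels, growth_rate,
                                  kernel_size=3, padding=1, bias=False)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.conv(torch.relu(self.norm(x)))


    class DenseBlock(nn.Module):
        """Each layer receives the concatenation of all preceding
        feature-maps and contributes growth_rate new channels."""

        def __init__(self, in_channels: int, growth_rate: int, num_layers: int):
            super().__init__()
            self.layers = nn.ModuleList(
                DenseLayer(in_channels + i * growth_rate, growth_rate)
                for i in range(num_layers)
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            features = [x]
            for layer in self.layers:
                # Concatenate (never sum) all previous feature-maps
                # along the channel dimension.
                out = layer(torch.cat(features, dim=1))
                features.append(out)
            return torch.cat(features, dim=1)


    # Example: a 4-layer block with growth rate 12 on a 64-channel input.
    # Output has 64 + 4 * 12 = 112 channels.
    block = DenseBlock(in_channels=64, growth_rate=12, num_layers=4)
    y = block(torch.randn(1, 64, 32, 32))
    print(y.shape)  # torch.Size([1, 112, 32, 32])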
