A Dense Block is a module used in convolutional neural networks that connects all layers (with matching feature-map sizes) directly with each other. It was originally proposed as part of the DenseNet architecture. To preserve the feed-forward nature, each layer obtains additional inputs from all preceding layers and passes on its own feature-maps to all subsequent layers. In contrast to ResNets, features are never combined through summation before they are passed into a layer; instead, they are combined by concatenation. Hence, the $\ell^{\text{th}}$ layer has $\ell$ inputs, consisting of the feature-maps of all preceding convolutional blocks, and its own feature-maps are passed on to all $L-\ell$ subsequent layers. This introduces $\frac{L(L+1)}{2}$ connections in an $L$-layer network, instead of just $L$ as in traditional architectures, hence the term "dense connectivity".
Source: Densely Connected Convolutional Networks
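Concretely, the $\ell^{\text{th}}$ layer computes $x_\ell = H_\ell([x_0, x_1, \ldots, x_{\ell-1}])$, where $[\cdot]$ denotes channel-wise concatenation and $H_\ell$ is a composite of the components listed below (batch normalization, ReLU, convolution). The following is a minimal PyTorch sketch of this connectivity; the class names, the growth rate of 32, and the single 3×3 convolution per layer are illustrative assumptions (the full DenseNet composite function also uses 1×1 bottleneck convolutions, omitted here).

```python
import torch
import torch.nn as nn


class DenseLayer(nn.Module):
    """One BN -> ReLU -> Conv composite function H_l (illustrative, no bottleneck)."""

    def __init__(self, in_channels: int, growth_rate: int):
        super().__init__()
        self.norm = nn.BatchNorm2d(in_channels)
        self.relu = nn.ReLU(inplace=True)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, padding=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.conv(self.relu(self.norm(x)))


class DenseBlock(nn.Module):
    """Each layer receives the concatenation of all preceding feature-maps."""

    def __init__(self, num_layers: int, in_channels: int, growth_rate: int):
        super().__init__()
        # The i-th layer sees the block input plus i earlier outputs,
        # i.e. in_channels + i * growth_rate input channels.
        self.layers = nn.ModuleList(
            DenseLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(num_layers)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            # Concatenated skip connection: the l-th layer has l inputs.
            out = layer(torch.cat(features, dim=1))
            features.append(out)
        return torch.cat(features, dim=1)


# Usage (hypothetical sizes): a 4-layer block whose output has
# 64 + 4 * 32 = 192 channels.
block = DenseBlock(num_layers=4, in_channels=64, growth_rate=32)
y = block(torch.randn(1, 64, 56, 56))
assert y.shape == (1, 192, 56, 56)
```

Because features are concatenated rather than summed, every layer's output remains directly accessible to all later layers, which is what produces the $\frac{L(L+1)}{2}$ connections described above.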
| Task | Papers | Share |
|---|---|---|
| Image Classification | 74 | 10.10% |
| General Classification | 59 | 8.05% |
| Classification | 56 | 7.64% |
| Semantic Segmentation | 24 | 3.27% |
| Super-Resolution | 17 | 2.32% |
| Image Super-Resolution | 15 | 2.05% |
| Object Detection | 15 | 2.05% |
| Decoder | 13 | 1.77% |
| Image Segmentation | 11 | 1.50% |
| Component | Type |
|---|---|
| Batch Normalization | Normalization |
| Concatenated Skip Connection | Skip Connections |
| Convolution | Convolutions |
| ReLU | Activation Functions |