A Dense Block is a module used in convolutional neural networks that connects all layers (with matching feature-map sizes) directly with each other. It was originally proposed as part of the DenseNet architecture. To preserve the feed-forward nature, each layer obtains additional inputs from all preceding layers and passes on its own feature-maps to all subsequent layers. In contrast to ResNets, features are never combined through summation before being passed into a layer; instead, they are combined by concatenation. Hence, the $\ell^{\text{th}}$ layer has $\ell$ inputs, consisting of the feature-maps of all preceding convolutional blocks, and its own feature-maps are passed on to all $L-\ell$ subsequent layers. This introduces $\frac{L(L+1)}{2}$ connections in an $L$-layer network, instead of just $L$ as in traditional architectures, hence the term "dense connectivity".
Source: Densely Connected Convolutional Networks
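As a concrete illustration, the following is a minimal PyTorch sketch of this connectivity pattern. The class names `DenseLayer` and `DenseBlock` and the hyperparameter values are chosen for illustration only; the composite function follows the paper's BN-ReLU-Conv(3×3) ordering with a growth rate of $k$ feature-maps per layer, and the sketch omits DenseNet's bottleneck (1×1 convolution) and transition layers.

```python
import torch
import torch.nn as nn


class DenseLayer(nn.Module):
    """One composite function H_l: BN -> ReLU -> 3x3 Conv, as in the paper."""

    def __init__(self, in_channels: int, growth_rate: int):
        super().__init__()
        self.norm = nn.BatchNorm2d(in_channels)
        self.relu = nn.ReLU(inplace=True)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, padding=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.conv(self.relu(self.norm(x)))


class DenseBlock(nn.Module):
    """Feeds each layer the concatenation of all preceding feature-maps."""

    def __init__(self, num_layers: int, in_channels: int, growth_rate: int):
        super().__init__()
        # The l-th layer receives in_channels + l * growth_rate input channels,
        # since every preceding layer contributes growth_rate feature-maps.
        self.layers = nn.ModuleList(
            DenseLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(num_layers)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            # Concatenation (not summation) of all preceding feature-maps.
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)


# Usage: 4 layers with growth rate 12 on a 16-channel input.
# Output channels = 16 + 4 * 12 = 64.
block = DenseBlock(num_layers=4, in_channels=16, growth_rate=12)
y = block(torch.randn(1, 16, 32, 32))
print(y.shape)  # torch.Size([1, 64, 32, 32])
```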
| Task | Papers | Share |
|---|---|---|
| Image Classification | 71 | 10.82% |
| General Classification | 59 | 8.99% |
| Classification | 52 | 7.93% |
| Semantic Segmentation | 22 | 3.35% |
| Test | 21 | 3.20% |
| Super-Resolution | 16 | 2.44% |
| Image Super-Resolution | 15 | 2.29% |
| Object Detection | 15 | 2.29% |
| Image Segmentation | 9 | 1.37% |
| Component | Type |
|---|---|
| Batch Normalization | Normalization |
| Concatenated Skip Connection | Skip Connections |
| Convolution | Convolutions |
| ReLU | Activation Functions |