CReLU, or Concatenated Rectified Linear Units, is an activation function that preserves both positive and negative phase information while enforcing a non-saturated non-linearity. Given a layer output $h$, it applies ReLU to both $h$ and $-h$ and concatenates the results:
$$ \left[\text{ReLU}\left(h\right), \text{ReLU}\left(-h\right)\right] $$
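As a minimal sketch (assuming PyTorch, with concatenation along the channel dimension so the number of channels doubles), CReLU can be written as:

```python
import torch

def crelu(h: torch.Tensor, dim: int = 1) -> torch.Tensor:
    # Concatenate ReLU(h) and ReLU(-h) along the given dimension,
    # keeping both the positive and the negative phase of the activation.
    return torch.cat([torch.relu(h), torch.relu(-h)], dim=dim)

# Example: 16 input channels become 32 output channels.
x = torch.randn(8, 16, 32, 32)
y = crelu(x)  # shape: (8, 32, 32, 32)
```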
Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units
| Task | Papers | Share |
|---|---|---|
| Image Classification | 2 | 11.11% |
| Image Generation | 2 | 11.11% |
| Classification | 1 | 5.56% |
| Denoising | 1 | 5.56% |
| Fine-Grained Image Classification | 1 | 5.56% |
| Super-Resolution | 1 | 5.56% |
| Image Enhancement | 1 | 5.56% |
| Low-Light Image Enhancement | 1 | 5.56% |
| Object | 1 | 5.56% |