Activation Functions

CReLU, or Concatenated Rectified Linear Units, is an activation function that preserves both the positive and negative phase information of a signal while enforcing a non-saturating non-linearity. Given a layer pre-activation $h$, CReLU concatenates the ReLU of $h$ and of $-h$, doubling the output dimension:

$$ \left[\text{ReLU}\left(h\right), \text{ReLU}\left(-h\right)\right] $$

Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units
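The formula above can be sketched as a small NumPy function (a minimal illustration, not the reference implementation from the paper; the function name `crelu` and the choice of concatenating along the last axis are assumptions):

```python
import numpy as np

def crelu(h):
    """Concatenated ReLU: stacks ReLU(h) and ReLU(-h) along the last axis,
    doubling the feature dimension while keeping both signal phases."""
    return np.concatenate([np.maximum(h, 0.0), np.maximum(-h, 0.0)], axis=-1)

# Each input feature yields two outputs: its positive part and its negative part.
h = np.array([[1.5, -2.0, 0.0]])
print(crelu(h))  # [[1.5 0.  0.  0.  2.  0. ]]
```

Note that because the output dimension doubles, the following layer must be sized accordingly (e.g. a convolution after CReLU sees twice as many input channels).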

Components

| Component | Type |
| --- | --- |
| ReLU | Activation Functions |