
# modReLU

Introduced by Arjovsky et al. in *Unitary Evolution Recurrent Neural Networks*.

modReLU is an activation function that modifies the ReLU for complex-valued inputs. It is a pointwise nonlinearity, $\sigma_{\text{modReLU}}\left(z\right) : \mathbb{C} \rightarrow \mathbb{C}$, which affects only the absolute value of a complex number while preserving its phase. It is defined as:

$$\sigma_{\text{modReLU}}\left(z\right) = \begin{cases} \left(|z| + b\right)\dfrac{z}{|z|} & \text{if } |z| + b \geq 0 \\ 0 & \text{if } |z| + b < 0 \end{cases}$$

where $b \in \mathbb{R}$ is a bias parameter of the nonlinearity. For an $n_{h}$-dimensional hidden space, we learn $n_{h}$ nonlinearity bias parameters, one per dimension.
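The definition above can be sketched as a short NumPy function. This is a minimal illustration, not a reference implementation; the `eps` term guarding against division by zero at $z = 0$ is an implementation choice, not part of the original definition.

```python
import numpy as np

def modrelu(z, b, eps=1e-8):
    """modReLU: rescales |z| by ReLU(|z| + b), preserving the phase of z.

    z: complex array; b: real bias (scalar or one value per dimension).
    eps avoids division by zero when z = 0 (implementation choice).
    """
    mag = np.abs(z)
    # ReLU applied to the shifted magnitude; phase z/|z| is untouched.
    scale = np.maximum(mag + b, 0.0) / (mag + eps)
    return scale * z

# With a negative bias, small-magnitude inputs are zeroed out while
# large-magnitude inputs keep their phase and lose |b| in magnitude.
z = np.array([1 + 1j, 0.1 + 0.1j])
out = modrelu(z, b=-0.5)
```

Note that the scaling factor is real and non-negative, so `np.angle(out)` equals `np.angle(z)` wherever the output is nonzero.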
