modReLU is a modification of the ReLU activation. It is a pointwise nonlinearity, $\sigma_{modReLU}\left(z\right) : \mathbb{C} \rightarrow \mathbb{C}$, which affects only the absolute value of a complex number, defined as:
$$ \sigma_{modReLU}\left(z\right) = \begin{cases} \left(\left|z\right| + b\right)\frac{z}{\left|z\right|} & \text{if } \left|z\right| + b \geq 0 \\ 0 & \text{otherwise} \end{cases} $$
where $b \in \mathbb{R}$ is a bias parameter of the nonlinearity. For an $n_{h}$-dimensional hidden space we learn $n_{h}$ nonlinearity bias parameters, one per dimension.
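The definition above can be sketched as follows in NumPy; `modrelu` is a hypothetical helper name, and the guard against division by zero at $z = 0$ is an implementation choice, not part of the original definition:

```python
import numpy as np

def modrelu(z, b):
    """modReLU: shifts the magnitude |z| by b, clips it at zero,
    and preserves the phase z/|z| of each complex entry.

    z: complex ndarray; b: real bias (scalar or per-dimension array).
    """
    mag = np.abs(z)
    # Where z == 0 the output is 0 regardless, so substitute a safe
    # denominator to avoid 0/0 when computing the phase factor z/|z|.
    phase = z / np.where(mag == 0, 1.0, mag)
    return np.where(mag + b >= 0, (mag + b) * phase, 0.0 + 0.0j)
```

For example, with $z = 3 + 4i$ (so $|z| = 5$) and $b = -1$, the magnitude shrinks to $4$ while the phase is unchanged; with $b = -6$ the shifted magnitude is negative and the output is $0$.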
Source: Unitary Evolution Recurrent Neural Networks

| Task | Papers | Share |
|------|--------|-------|
| Marketing | 1 | 20.00% |
| Multimodal Sentiment Analysis | 1 | 20.00% |
| Recommendation Systems | 1 | 20.00% |
| Sentiment Analysis | 1 | 20.00% |
| Sequential Image Classification | 1 | 20.00% |