CoLU is an activation function with properties similar to Swish and Mish. It is defined as: $$f(x)=\frac{x}{1-xe^{-(x+e^x)}}$$ It is smooth, continuously differentiable, unbounded above, bounded below, non-saturating, and non-monotonic. In experiments comparing CoLU against other activation functions, CoLU usually performs better on deeper neural networks.
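The definition above can be sketched directly in NumPy; this is a minimal illustration of the formula, not the paper's reference implementation (the function name `colu` is chosen here for clarity):

```python
import numpy as np

def colu(x):
    # CoLU: f(x) = x / (1 - x * exp(-(x + e^x)))
    # For large positive x the exponential term vanishes, so f(x) -> x
    # (unbounded above); for negative x the denominator grows, so the
    # output approaches 0 from below (bounded below).
    x = np.asarray(x, dtype=float)
    return x / (1.0 - x * np.exp(-(x + np.exp(x))))
```

For example, `colu(0.0)` is `0.0`, and for large inputs the function behaves like the identity, matching the "unbounded above" property.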
Source: Deeper Learning with CoLU Activation