Activation Functions

Collapsing Linear Unit

Introduced by Vagerwal in the paper Deeper Learning with CoLU Activation

CoLU (Collapsing Linear Unit) is an activation function similar in properties to Swish and Mish. It is defined as: $$f(x)=\frac{x}{1-xe^{-(x+e^x)}}$$ It is smooth, continuously differentiable, unbounded above, bounded below, non-saturating, and non-monotonic. In experiments comparing CoLU with other activation functions, CoLU usually performs better than the alternatives on deeper neural networks.
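
Below is a minimal sketch of CoLU implemented directly from the formula above as a PyTorch module; the class name CoLU and the surrounding example are illustrative, not a reference implementation from the paper.

```python
import torch
import torch.nn as nn

class CoLU(nn.Module):
    """Collapsing Linear Unit: f(x) = x / (1 - x * exp(-(x + exp(x))))."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x / (1.0 - x * torch.exp(-(x + torch.exp(x))))

# Example: for large positive x the denominator approaches 1, so CoLU
# behaves like the identity; for negative x the output stays bounded below.
x = torch.linspace(-3.0, 3.0, steps=7)
print(CoLU()(x))
```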

Source: Deeper Learning with CoLU Activation

