Activation Functions

Softsign Activation

Softsign is a smooth, zero-centered activation function for neural networks that maps inputs to the open interval (-1, 1); unlike tanh, it approaches its asymptotes polynomially rather than exponentially:

$$ f(x) = \frac{x}{1 + |x|} $$

Image Source: Sefik Ilkin Serengil
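The definition above translates directly into code. A minimal NumPy sketch (the function name and sample inputs are illustrative, not from the original page):

```python
import numpy as np

def softsign(x):
    # Softsign: f(x) = x / (1 + |x|), squashes inputs into (-1, 1)
    return x / (1.0 + np.abs(x))

# Sample inputs: large magnitudes approach ±1 only slowly (polynomially)
x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(softsign(x))
```

In frameworks such as PyTorch this is available as a built-in module (`torch.nn.Softsign`), so hand-rolling it is only needed for illustration.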


Tasks


Task                            Papers  Share
Speech Synthesis                4       33.33%
Domain Adaptation               2       16.67%
Unsupervised Domain Adaptation  2       16.67%
Image Classification            1       8.33%
Melody Extraction               1       8.33%
GPR                             1       8.33%
Text-To-Speech Synthesis        1       8.33%

