Activation Functions

Tanh Exponential Activation Function

Introduced by Liu et al. in TanhExp: A Smooth Activation Function with High Convergence Speed for Lightweight Neural Networks

Lightweight or mobile neural networks used for real-time computer vision tasks contain fewer parameters than normal networks, which leads to constrained performance. In this work, we propose a novel activation function named the Tanh Exponential Activation Function (TanhExp), which can significantly improve the performance of these networks on image classification tasks. The definition of TanhExp is $f(x) = x\tanh(e^x)$. We demonstrate the simplicity, efficiency, and robustness of TanhExp on various datasets and network models, and TanhExp outperforms its counterparts in both convergence speed and accuracy. Its behaviour also remains stable even when noise is added or the dataset is altered. We show that, without increasing the size of the network, the capacity of lightweight neural networks can be enhanced by TanhExp with only a few training epochs and no extra parameters added.

Source: TanhExp: A Smooth Activation Function with High Convergence Speed for Lightweight Neural Networks
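The definition $f(x) = x\tanh(e^x)$ translates directly into code. A minimal NumPy sketch (the function name `tanhexp` is illustrative, not taken from the paper's released code):

```python
import numpy as np

def tanhexp(x):
    # TanhExp activation: f(x) = x * tanh(exp(x))
    return x * np.tanh(np.exp(x))

# tanh(exp(x)) approaches 1 quickly as x grows, so TanhExp behaves
# like the identity for moderately large positive inputs, while
# remaining smooth and non-monotonic near zero.
```

In a deep-learning framework, the same expression can be wrapped as a parameter-free layer, which matches the paper's claim that no extra parameters are added.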

Tasks


Task Papers Share
Image Classification 2 66.67%
General Classification 1 33.33%
