# Sigmoid Linear Unit

Introduced by Elfwing et al. in Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning

Sigmoid Linear Units, or SiLUs, are activation functions for neural networks. The SiLU activation is computed as the input multiplied by the sigmoid of that input:

$$\text{SiLU}(x) = x\sigma(x),$$

where $\sigma(x) = \frac{1}{1 + e^{-x}}$ is the logistic sigmoid.
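
As a minimal illustration (a NumPy sketch, not drawn from any of the papers referenced here), the activation can be implemented directly from this formula:

```python
import numpy as np

def silu(x):
    """SiLU activation: x * sigmoid(x)."""
    return x * (1.0 / (1.0 + np.exp(-x)))

# SiLU approaches the identity for large positive inputs, approaches
# zero for large negative inputs, and dips slightly below zero in
# between (minimum of about -0.28 near x = -1.28).
x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(silu(x))  # approx. [-0.0719, -0.2689, 0.0, 0.7311, 3.9281]
```

Most deep learning frameworks also ship this activation built in; PyTorch, for example, exposes it as `torch.nn.SiLU`.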

See Gaussian Error Linear Units (GELUs), where the SiLU was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function, where the SiLU was later experimented with.
