Activation Functions

General • 38 methods

Activation functions are non-linear functions applied in neural networks, typically after an affine transformation that combines weights and input features. The rectified linear unit (ReLU) has been the most popular choice over the past decade, although the best choice is architecture-dependent and many alternatives have emerged in recent years. In this section, you will find a continually updated list of activation functions.
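As a concrete illustration, here is a minimal NumPy sketch (not taken from the original list) of the pattern the paragraph describes: an affine transformation followed by a non-linear activation. The function names and the GELU tanh approximation are standard, but the specific layer shapes are illustrative assumptions.

```python
import numpy as np

def affine(x, W, b):
    """Affine transformation combining weights and input features."""
    return x @ W + b

def relu(z):
    """Rectified linear unit: max(0, z)."""
    return np.maximum(0.0, z)

def gelu(z):
    """GELU, tanh approximation (Hendrycks & Gimpel, 2016)."""
    return 0.5 * z * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (z + 0.044715 * z**3)))

# Example usage: a batch of 4 inputs with 3 features, projected to 2 units.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))   # input features
W = rng.normal(size=(3, 2))   # weight matrix
b = np.zeros(2)               # bias
h = relu(affine(x, W, b))     # hidden activations after the non-linearity
```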

Method • Year • Papers — the original table listed the 38 methods with their year of introduction and the number of papers using each, sorted by paper count (years range from 1998 to 2020; paper counts from 0 to 5,638). The method-name column was lost in extraction, so the per-method rows are not reproduced here.