Activation Functions

General • 39 methods

Activation functions are typically non-linear functions applied in a neural network after an affine transformation that combines weights and input features. The rectified linear unit (ReLU) has been the most popular choice over the past decade, although the best choice is architecture-dependent and many alternatives have emerged in recent years. In this section, you will find a continually updated list of activation functions, with a short illustrative sketch below.
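As an illustration (not part of the original listing), here is a minimal NumPy sketch of the pattern described above: an affine transformation followed by two common activations, ReLU and a tanh-approximated GELU. The weights, bias, and input are made-up values.

```python
import numpy as np

def relu(x):
    # Rectified linear unit: max(0, x), applied element-wise.
    return np.maximum(0.0, x)

def gelu(x):
    # GELU, tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3))).
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

# Hypothetical layer: combine weights and input features, then activate.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))   # weight matrix (illustrative values)
b = np.zeros(4)               # bias vector
x = rng.normal(size=3)        # input features

z = W @ x + b                 # affine transformation
print(relu(z))                # non-linearity applied after the affine step
print(gelu(z))
```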

[Table: Method | Year | Papers — 39 activation functions introduced between 1998 and 2020, ranked by the number of papers using each (from 5,784 down to 0); the method names were not recoverable.]