Activation Functions


Activation functions are applied in neural networks, typically after an affine transformation that combines weights and input features, and they are usually non-linear. The rectified linear unit (ReLU) has been the most popular choice of the past decade, although the best choice is architecture-dependent and many alternatives have emerged in recent years. Below is a constantly updated list of activation functions.
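To make the definition concrete, here is a minimal NumPy sketch of a single dense layer: an affine transformation followed by an element-wise ReLU. The layer sizes and the `relu` helper are illustrative choices for this example, not taken from any particular library.

```python
import numpy as np

def relu(z):
    """Rectified linear unit: max(0, z), applied element-wise."""
    return np.maximum(0.0, z)

rng = np.random.default_rng(0)
x = rng.normal(size=3)       # input features
W = rng.normal(size=(4, 3))  # weights of a 4-unit dense layer (illustrative sizes)
b = np.zeros(4)              # biases

z = W @ x + b   # affine transformation combining weights and inputs
a = relu(z)     # non-linearity applied after the affine transformation
print(a)        # negative pre-activations are clamped to zero
```

Swapping `relu` for any other activation (sigmoid, tanh, GELU, and so on) leaves the affine step unchanged; only the non-linearity differs.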

[Table: Method · Year · Papers — the 51 activation functions in this list, each with its year of introduction and the number of papers using it]