Activation Functions

General • 72 methods

Activation functions are (typically non-linear) functions applied in a neural network after an affine transformation, that is, after the layer's weights are combined with its input features. The rectified linear unit (ReLU) has been the most popular choice over the past decade, although the best choice is architecture-dependent and many alternatives have emerged in recent years. This section contains a continually updated list of activation functions.
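
As a quick illustration of where an activation fits, here is a minimal sketch (assuming Python with NumPy; the names relu and gelu are ours for illustration, not tied to any particular library) that applies an affine transformation and then a non-linearity:

import numpy as np

def relu(x):
    # Rectified linear unit: element-wise max(0, x).
    return np.maximum(0.0, x)

def gelu(x):
    # Tanh approximation of the Gaussian Error Linear Unit (GELU).
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

# One illustrative layer: affine transformation, then activation.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))   # weights: 3 input features to 4 units
b = np.zeros(4)               # bias vector
x = rng.normal(size=3)        # input feature vector

z = W @ x + b                 # affine transformation (weights plus bias)
h = relu(z)                   # non-linearity applied after the affine step
print(h)

Swapping relu for gelu (or any other function in this list) changes only the non-linearity; the surrounding affine computation stays the same.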

[Methods table: 72 activation functions with publication year and paper count; the method names did not survive extraction and are not reproduced here.]