17 Jan 2019 • Thomas Villmann, John Ravichandran, Andrea Villmann, David Nebel, Marika Kaden
An appropriate choice of activation function (such as ReLU, sigmoid, or swish) plays an important role in the performance of (deep) multilayer perceptrons (MLPs) for classification and regression learning.
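For context, a minimal NumPy sketch (not taken from the paper) of the three activation functions named above, using their standard definitions:

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x), applied elementwise
    return np.maximum(0.0, x)

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x); reduces to SiLU when beta = 1
    return x * sigmoid(beta * x)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))      # [0. 0. 2.]
print(sigmoid(0.0)) # 0.5
print(swish(0.0))   # 0.0
```

Unlike ReLU, sigmoid and swish are smooth everywhere, and swish is non-monotonic for negative inputs, which is one reason these choices behave differently during training.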