10 Feb 2020 • Behnam Asadi, Hui Jiang
In this paper, we extend the well-established universal approximation theory to neural networks that use the unbounded ReLU activation function together with a nonlinear softmax output layer.
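A minimal sketch of the architecture the result concerns: a one-hidden-layer network with (unbounded) ReLU activations feeding a softmax output layer. The layer sizes, weights, and helper names below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def relu(x):
    # unbounded ReLU activation: max(0, x) elementwise
    return np.maximum(0.0, x)

def softmax(z):
    # nonlinear softmax output; subtract the max for numerical stability
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def relu_softmax_net(x, W1, b1, W2, b2):
    # one hidden ReLU layer followed by a softmax output layer
    h = relu(x @ W1 + b1)
    return softmax(h @ W2 + b2)

# illustrative random weights: 2 inputs, 16 hidden units, 3 output classes
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 16)), rng.normal(size=16)
W2, b2 = rng.normal(size=(16, 3)), rng.normal(size=3)

p = relu_softmax_net(rng.normal(size=(5, 2)), W1, b1, W2, b2)
# each output row is a valid probability distribution over the 3 classes
assert np.allclose(p.sum(axis=1), 1.0) and (p >= 0).all()
```

The softmax guarantees the outputs form a probability distribution, which is what makes extending classical (bounded-activation, linear-output) approximation results to this setting nontrivial.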