ICLR 2021 • Martin Trimmel, Henning Petzka, Cristian Sminchisescu
Deep neural networks with rectified linear (ReLU) activations are piecewise linear functions: the hyperplanes induced by the hidden units partition the input space into an astronomically large number of linear regions.
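To make the idea concrete, below is a minimal sketch (not code from the paper) illustrating piecewise linearity for a one-hidden-layer ReLU network with random weights. Each hidden unit defines a hyperplane, and the sign pattern of the pre-activations identifies which linear region an input falls into; counting distinct patterns over a grid gives a lower bound on the number of regions the grid intersects. The names `W1`, `b1`, `n_hidden`, and `activation_pattern` are illustrative choices, not identifiers from the paper.

```python
import numpy as np

# Sketch: a one-hidden-layer ReLU network f(x) = W2 @ relu(W1 @ x + b1) + b2
# is piecewise linear. Each hidden unit i contributes a hyperplane
# {x : W1[i] . x + b1[i] = 0}; within a region where no sign flips,
# the network is an affine function of x. Random weights, for illustration.
rng = np.random.default_rng(0)
n_hidden = 8                       # hidden units = hyperplanes in 2-D input space
W1 = rng.standard_normal((n_hidden, 2))
b1 = rng.standard_normal(n_hidden)

def activation_pattern(x):
    """Binary sign pattern of the hidden pre-activations for input x."""
    return tuple((W1 @ x + b1 > 0).astype(int))

# Sample a dense grid over [-1, 1]^2 and count the distinct patterns:
# each pattern corresponds to one linear region that the grid touches.
xs = np.linspace(-1.0, 1.0, 200)
patterns = {activation_pattern(np.array([x, y])) for x in xs for y in xs}
print(f"{len(patterns)} linear regions found "
      f"(trivial upper bound 2^{n_hidden} = {2 ** n_hidden} sign patterns)")
```

Even with only 8 hidden units, dozens of distinct regions appear in 2-D; the number of regions grows rapidly with width and depth, which is what makes the count "astronomical" for practical networks.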