# ReLU nets adapt to intrinsic dimensionality beyond the target domain

6 Aug 2020 · Alexander Cloninger, Timo Klock

We study the approximation of two-layer compositions $f(x) = g(\phi(x))$ by deep ReLU networks, where $\phi$ is a nonlinear, geometrically intuitive, dimensionality-reducing feature map. We focus on two complementary choices for $\phi$ that are intuitive and appear frequently in the statistical literature...
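As a minimal illustration of the compositional structure $f = g \circ \phi$ described above, the sketch below builds a one-hidden-layer ReLU network that interpolates a one-dimensional link function $g$ and composes it with a feature map $\phi$. The specific choices here ($\phi$ the Euclidean norm, $g = \sin$, the knot count, and the helper `relu_interpolant`) are hypothetical stand-ins, not the paper's construction; the point is only that the network's size tracks the one-dimensional feature, not the ambient dimension $d$.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def relu_interpolant(g, knots):
    """One-hidden-layer ReLU net h(t) = b + sum_i a_i * relu(t - t_i)
    that agrees with g at every knot (piecewise-linear interpolation)."""
    vals = g(knots)
    slopes = np.diff(vals) / np.diff(knots)
    # Each hidden unit's weight is the change in slope at its knot.
    a = np.concatenate(([slopes[0]], np.diff(slopes)))
    b = vals[0]
    t = knots[:-1]
    return lambda s: b + relu(np.subtract.outer(s, t)) @ a

# Hypothetical setting: phi maps R^d to one intrinsic coordinate.
d = 10
phi = lambda x: np.linalg.norm(x, axis=-1)
g = np.sin
f = lambda x: g(phi(x))

# The hidden width (number of knots) depends on the 1-D range of phi,
# not on the ambient dimension d.
knots = np.linspace(0.0, np.sqrt(d), 200)
g_net = relu_interpolant(g, knots)

x = rng.uniform(0.0, 1.0, size=(1000, d))
err = np.max(np.abs(g_net(phi(x)) - f(x)))
print(f"max approximation error: {err:.2e}")
```

Because `phi(x)` for `x` in the unit cube lies in `[0, sqrt(d)]`, the piecewise-linear interpolant of `sin` on 200 knots over that interval already approximates `f` to roughly `1e-4` on random inputs.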
