ReLU nets adapt to intrinsic dimensionality beyond the target domain

6 Aug 2020 · Alexander Cloninger, Timo Klock

We study the approximation of two-layer compositions $f(x) = g(\phi(x))$ via deep ReLU networks, where $\phi$ is a nonlinear, geometrically intuitive, dimensionality-reducing feature map. We focus on two complementary choices for $\phi$ that are intuitive and appear frequently in the statistical literature...
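To make the compositional setup concrete, here is a minimal numerical sketch. The choices below are illustrative assumptions, not the paper's construction: $\phi$ is taken to be a linear projection from ambient dimension $D$ down to intrinsic dimension $d = 1$, $g$ is a smooth link function (`sin`), and the ReLU approximant uses random hidden weights with outer weights fit by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative instance of f(x) = g(phi(x)):
# phi projects from ambient dimension D = 10 to intrinsic dimension d = 1
# (a hypothetical choice; the paper considers richer nonlinear phi),
# and g is a smooth scalar link function.
D, d, n = 10, 1, 2000
A = rng.standard_normal((d, D)) / np.sqrt(D)   # linear feature map phi(x) = Ax
g = np.sin                                      # link function

X = rng.standard_normal((n, D))
y = g(X @ A.T).ravel()                          # targets f(x) = g(phi(x))

# One-hidden-layer ReLU approximant: random inner weights, outer
# weights fit by least squares (a random-features sketch, not the
# network construction analyzed in the paper).
m = 200
W = rng.standard_normal((D, m))
b = rng.standard_normal(m)
H = np.maximum(X @ W + b, 0.0)                  # ReLU feature matrix
c, *_ = np.linalg.lstsq(H, y, rcond=None)

# Held-out error of the fitted ReLU network.
X_test = rng.standard_normal((500, D))
y_test = g(X_test @ A.T).ravel()
H_test = np.maximum(X_test @ W + b, 0.0)
err = np.sqrt(np.mean((H_test @ c - y_test) ** 2))
print(f"test RMSE: {err:.3f}")
```

Because $f$ varies only along the $d$-dimensional range of $\phi$, a moderate number of ReLU units already fits it well in the $D$-dimensional ambient space, which is the intuition behind rates that depend on the intrinsic rather than the ambient dimension.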




