no code implementations • 31 Jul 2023 • Javed Lindner, David Dahmen, Michael Krämer, Moritz Helias
Applying our formalism to a synthetic task and to MNIST, we obtain a homogeneous-kernel-matrix approximation for the learning curve, together with corrections due to data variability; these allow us to estimate the generalization properties and yield exact results for the bounds of the learning curves in the limit of infinitely many training data points.
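As a rough illustration of what such a learning curve measures, here is a minimal numerical sketch (not the paper's formalism): the mean generalization error of kernel/GP regression on a synthetic task, averaged over draws of the training set, as a function of the number of training points. The squared-exponential kernel, target function, and noise level are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, Y, length=1.0):
    # Squared-exponential kernel; the specific kernel choice is an assumption.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

def gp_generalization_error(n_train, n_test=500, dim=5, noise=0.05):
    """Empirical generalization error of GP regression with n_train samples."""
    f = lambda X: np.sin(X.sum(axis=1))          # synthetic target (assumption)
    X = rng.normal(size=(n_train, dim))
    Xs = rng.normal(size=(n_test, dim))
    y = f(X) + noise * rng.normal(size=n_train)
    K = rbf_kernel(X, X) + noise**2 * np.eye(n_train)
    k_star = rbf_kernel(Xs, X)
    mean = k_star @ np.linalg.solve(K, y)        # GP posterior mean
    return np.mean((mean - f(Xs)) ** 2)

# Empirical learning curve: average error over several draws of the training set.
for n in (10, 50, 100, 200):
    err = np.mean([gp_generalization_error(n) for _ in range(20)])
    print(f"n_train={n:4d}  mean generalization error={err:.4f}")
```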
no code implementations • 12 May 2023 • Kirsten Fischer, David Dahmen, Moritz Helias
Here we derive a systematic finite-size theory for ResNets to study signal propagation and its dependence on the scaling of the residual branch.
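A minimal sketch of the underlying question, assuming a plain ReLU residual block x_{l+1} = x_l + alpha * phi(W_l x_l): how the activation variance evolves with depth for different residual-branch scalings alpha. The architecture, nonlinearity, and scalings are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def residual_signal_propagation(width=500, depth=50, alpha=1.0, sigma_w=1.0):
    """Track activation variance through x_{l+1} = x_l + alpha * relu(W_l x_l)."""
    x = rng.normal(size=width)
    variances = [np.var(x)]
    for _ in range(depth):
        W = rng.normal(scale=sigma_w / np.sqrt(width), size=(width, width))
        x = x + alpha * np.maximum(W @ x, 0.0)   # scaled ReLU residual branch
        variances.append(np.var(x))
    return variances

# Compare an unscaled residual branch with a depth-scaled one (alpha ~ 1/sqrt(depth)).
for alpha in (1.0, 1.0 / np.sqrt(50)):
    v = residual_signal_propagation(alpha=alpha)
    print(f"alpha={alpha:.3f}  activation variance after 50 layers: {v[-1]:.2f}")
```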
no code implementations • 10 Feb 2022 • Kirsten Fischer, Alexandre René, Christian Keup, Moritz Layer, David Dahmen, Moritz Helias
Understanding the functional principles of information processing in deep neural networks continues to be a challenge, in particular for networks with trained and thus non-random weights.
no code implementations • 10 Dec 2021 • Kai Segadlo, Bastian Epping, Alexander van Meegen, David Dahmen, Michael Krämer, Moritz Helias
Bayesian inference on Gaussian processes has proven to be a viable approach for studying recurrent and deep networks in the limit of infinite layer width, $n\to\infty$.
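For context, a minimal sketch of this infinite-width correspondence, assuming a deep ReLU network whose limiting kernel is the degree-1 arc-cosine recursion: the network prior becomes a Gaussian process, and Bayesian prediction reduces to GP regression with that kernel. The depth, weight/bias variances, nonlinearity, and toy data below are assumptions for illustration.

```python
import numpy as np

def relu_nngp_kernel(X, Y, depth=3, sigma_w=1.0, sigma_b=0.0):
    """Kernel of an infinite-width ReLU network (arc-cosine recursion over layers)."""
    # Layer-0 kernel: rescaled inner products of the inputs.
    Kxy = sigma_b**2 + sigma_w**2 * X @ Y.T / X.shape[1]
    Kxx = sigma_b**2 + sigma_w**2 * (X * X).sum(1) / X.shape[1]
    Kyy = sigma_b**2 + sigma_w**2 * (Y * Y).sum(1) / Y.shape[1]
    for _ in range(depth):
        norms = np.sqrt(Kxx[:, None] * Kyy[None, :])
        cos_t = np.clip(Kxy / norms, -1.0, 1.0)
        theta = np.arccos(cos_t)
        # Degree-1 arc-cosine kernel (ReLU expectation under the Gaussian prior).
        Kxy = sigma_b**2 + sigma_w**2 / (2 * np.pi) * norms * (
            np.sin(theta) + (np.pi - theta) * cos_t)
        Kxx = sigma_b**2 + sigma_w**2 * Kxx / 2
        Kyy = sigma_b**2 + sigma_w**2 * Kyy / 2
    return Kxy

# Bayesian posterior mean on toy data with the infinite-width network kernel.
rng = np.random.default_rng(2)
Xtr, Xte = rng.normal(size=(40, 10)), rng.normal(size=(5, 10))
ytr = np.sign(Xtr[:, 0])
Ktr = relu_nngp_kernel(Xtr, Xtr)
Kte = relu_nngp_kernel(Xte, Xtr)
post_mean = Kte @ np.linalg.solve(Ktr + 1e-3 * np.eye(len(ytr)), ytr)
print(post_mean)
```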
no code implementations • NeurIPS 2020 • Sandra Nestler, Christian Keup, David Dahmen, Matthieu Gilson, Holger Rauhut, Moritz Helias
Cortical networks are strongly recurrent, and neurons have intrinsic temporal dynamics.
no code implementations • 2 Dec 2019 • David Dahmen, Matthieu Gilson, Moritz Helias
Closed-form expressions reveal a superior pattern capacity in the binary classification task compared to the classical perceptron for high-dimensional inputs and low-dimensional outputs.
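For reference, the classical perceptron baseline can be estimated empirically with a minimal sketch: the fraction of random dichotomies of Gaussian patterns that the perceptron learning rule can separate, as a function of the load P/N. The epoch cap and pattern statistics are assumptions, so this gives only a rough empirical estimate of the classical capacity, not the paper's closed-form result.

```python
import numpy as np

rng = np.random.default_rng(3)

def is_separable(X, y, epochs=200, lr=0.1):
    """Perceptron learning rule; True if all patterns are classified correctly."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:
                w += lr * yi * xi
                errors += 1
        if errors == 0:
            return True
    return False  # not separated within the epoch budget

# Classical capacity: fraction of separable random dichotomies versus load P/N.
N = 50
for load in (1.0, 1.5, 2.0, 2.5, 3.0):
    P = int(load * N)
    frac = np.mean([
        is_separable(rng.normal(size=(P, N)), rng.choice([-1.0, 1.0], size=P))
        for _ in range(20)])
    print(f"P/N={load:.1f}  fraction separable ~ {frac:.2f}")
```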