1 code implementation • 26 Oct 2023 • Dániel Rácz, Mihály Petreczky, András Csertán, Bálint Daróczy
Recent advances in deep learning have given us some very promising results on the generalization ability of deep neural networks; however, the literature still lacks a comprehensive theory explaining why heavily over-parametrized models generalize well while fitting the training data.
no code implementations • 7 Jul 2023 • Dániel Rácz, Mihály Petreczky, Bálint Daróczy
We consider the problem of learning Neural Ordinary Differential Equations (neural ODEs) within the context of Linear Parameter-Varying (LPV) systems in continuous-time.
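A neural ODE replaces a discrete stack of layers with a continuous-time state equation dx/dt = f(x; θ), where f is a small network. The sketch below is illustrative only, not the paper's method: it integrates such a system with forward Euler, and the weight shapes, the tanh activation, and the step count are all assumptions.

```python
import numpy as np

def neural_ode_euler(x0, W1, W2, t_span=(0.0, 1.0), n_steps=100):
    """Integrate dx/dt = W2 @ tanh(W1 @ x) with forward Euler.

    A minimal neural ODE: the right-hand side is a two-layer
    network acting on the state x. (Illustrative sketch; the
    LPV viewpoint would instead rewrite these dynamics as a
    state-dependent linear system dx/dt = A(p(t)) x.)
    """
    t0, t1 = t_span
    dt = (t1 - t0) / n_steps
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x + dt * (W2 @ np.tanh(W1 @ x))
    return x

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(4, 2))  # hypothetical weights
W2 = rng.normal(scale=0.5, size=(2, 4))
x1 = neural_ode_euler(np.array([1.0, 0.0]), W1, W2)
```

In practice an adaptive solver (e.g. Runge–Kutta) would replace the fixed-step Euler loop; the fixed step keeps the sketch self-contained.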
1 code implementation • 26 Oct 2021 • Dániel Rácz, Bálint Daróczy
Feed-forward networks can be interpreted as mappings with linear decision surfaces at the level of the last layer.
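The claim above can be illustrated with a two-layer network: the output is an affine function of the hidden features, so the decision surface is a hyperplane in feature space (though generally nonlinear in the input). The architecture, sizes, and ReLU activation below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def forward(x, W1, b1, w2, b2):
    """Two-layer feed-forward net. The last layer is affine in the
    hidden features h, so the decision surface w2 @ h + b2 = 0 is a
    linear hyperplane at the level of the last layer."""
    h = np.maximum(0.0, W1 @ x + b1)  # hidden features (nonlinear in x)
    return w2 @ h + b2                # linear read-out over h

rng = np.random.default_rng(1)
W1 = rng.normal(size=(8, 3))  # hypothetical weights
b1 = rng.normal(size=8)
w2 = rng.normal(size=8)
b2 = 0.0
score = forward(np.array([0.2, -0.5, 1.0]), W1, b1, w2, b2)
label = score > 0  # which side of the hyperplane h lands on
```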