Search Results for author: Manuel Nonnenmacher

Found 4 papers, 2 papers with code

Utilizing Expert Features for Contrastive Learning of Time-Series Representations

1 code implementation • 23 Jun 2022 • Manuel Nonnenmacher, Lukas Oldenburg, Ingo Steinwart, David Reeb

We therefore devise ExpCLR, a novel contrastive learning approach built on an objective that utilizes expert features to encourage both properties for the learned representation.

Contrastive Learning • Representation Learning • +2
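
Below is a minimal sketch of the general idea behind expert-guided contrastive learning, not the paper's exact ExpCLR objective: pairwise similarities of learned time-series representations are encouraged to follow similarities between hand-crafted expert features. The encoder, the loss formulation, and all names here are illustrative assumptions.

```python
# Hypothetical sketch (not ExpCLR's actual objective): a contrastive-style loss
# that pulls representation-space similarities toward expert-feature similarities.
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Toy time-series encoder: (batch, channels, length) -> (batch, dim)."""
    def __init__(self, channels=3, dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(32, dim),
        )
    def forward(self, x):
        return self.net(x)

def expert_guided_contrastive_loss(z, expert_feats, temperature=0.1):
    """Cross-entropy between softmax similarities in representation space and
    softmax similarities in expert-feature space (one plausible formulation)."""
    z = nn.functional.normalize(z, dim=1)
    e = nn.functional.normalize(expert_feats, dim=1)
    sim_z = z @ z.t() / temperature            # (batch, batch) similarities
    sim_e = e @ e.t() / temperature
    self_mask = torch.eye(len(z), dtype=torch.bool)
    neg = -1e9                                 # exclude self-pairs from both softmaxes
    logits = sim_z.masked_fill(self_mask, neg)
    targets = nn.functional.softmax(sim_e.masked_fill(self_mask, neg), dim=1)
    return -(targets * nn.functional.log_softmax(logits, dim=1)).sum(dim=1).mean()

encoder = TinyEncoder()
x = torch.randn(8, 3, 128)          # batch of toy multivariate time series
expert = torch.randn(8, 10)         # hand-crafted expert features per series
loss = expert_guided_contrastive_loss(encoder(x), expert)
loss.backward()
```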

SOSP: Efficiently Capturing Global Correlations by Second-Order Structured Pruning

1 code implementation • NeurIPS 2021 • Manuel Nonnenmacher, Thomas Pfeil, Ingo Steinwart, David Reeb

We validate SOSP-H by comparing it to our second method SOSP-I that uses a well-established Hessian approximation, and to numerous state-of-the-art methods.
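
As a hedged illustration of the kind of quantity second-order structured pruning builds on (not the SOSP-H or SOSP-I algorithm itself), the sketch below scores each hidden neuron of a toy network by a second-order Taylor term evaluated via a Hessian-vector product. The model, data, and saliency formula are assumptions made for this example.

```python
# Illustrative only: rank structures (here, hidden neurons of a linear layer)
# by a generic second-order saliency |g^T v + 0.5 * v^T H v|, where v removes
# the structure's parameters and H v is computed by double backprop.
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 1))
x, y = torch.randn(64, 20), torch.randn(64, 1)
loss = nn.functional.mse_loss(model(x), y)

params = [p for p in model.parameters() if p.requires_grad]
grads = torch.autograd.grad(loss, params, create_graph=True)

def hvp(vec_list):
    """Hessian-vector product H v via a second backward pass."""
    dot = sum((g * v).sum() for g, v in zip(grads, vec_list))
    return torch.autograd.grad(dot, params, retain_graph=True)

W1, b1 = model[0].weight, model[0].bias
scores = []
for i in range(W1.shape[0]):
    v = [torch.zeros_like(p) for p in params]
    v[0][i] = -W1[i].detach()          # zero out the neuron's incoming weights
    v[1][i] = -b1[i].detach()          # and its bias
    Hv = hvp(v)
    first = sum((g * vi).sum() for g, vi in zip(grads, v))
    second = 0.5 * sum((hvi * vi).sum() for hvi, vi in zip(Hv, v))
    scores.append((first + second).abs().item())

print("least important hidden neurons:", sorted(range(32), key=lambda i: scores[i])[:5])
```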

Which Minimizer Does My Neural Network Converge To?

no code implementations • 4 Nov 2020 • Manuel Nonnenmacher, David Reeb, Ingo Steinwart

The loss surface of an overparameterized neural network (NN) possesses many global minima of zero training error.
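
The question of which of the many zero-loss minima gradient descent actually selects is already visible in an overparameterized linear model: started from zero, gradient descent converges to the minimum-norm interpolating solution. The toy NumPy example below is purely illustrative of this implicit bias and is not the paper's analysis.

```python
# Toy demonstration: overparameterized least squares (more parameters than
# samples) has infinitely many zero-loss solutions; gradient descent from
# w = 0 picks out the minimum-norm interpolant.
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 100                        # fewer samples than parameters
X, y = rng.standard_normal((n, d)), rng.standard_normal(n)

w = np.zeros(d)                       # initialize at zero
lr = 1e-2
for _ in range(20000):
    w -= lr * X.T @ (X @ w - y) / n   # gradient of 0.5/n * ||Xw - y||^2

w_min_norm = X.T @ np.linalg.solve(X @ X.T, y)   # minimum-norm interpolant
print("training loss:", np.mean((X @ w - y) ** 2))
print("distance to min-norm solution:", np.linalg.norm(w - w_min_norm))
```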

Wide Neural Networks are Interpolating Kernel Methods: Impact of Initialization on Generalization

no code implementations • 25 Sep 2019 • Manuel Nonnenmacher, David Reeb, Ingo Steinwart

The recently developed link between strongly overparametrized neural networks (NNs) and kernel methods has opened a new way to understand puzzling features of NNs, such as their convergence and generalization behaviors.

Generalization Bounds
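
To make the NN-to-kernel correspondence referenced in this abstract concrete, here is a hedged sketch (not the paper's construction): it computes the empirical neural tangent kernel of a small finite-width network from per-sample parameter gradients and uses it for kernel regression. The architecture, ridge term, and data are illustrative assumptions.

```python
# Empirical NTK: K(x, x') = <grad_theta f(x), grad_theta f(x')>, then kernel
# regression with it. In the infinite-width limit this kernel predictor
# matches the trained network, which is the link the paper builds on.
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(5, 512), nn.ReLU(), nn.Linear(512, 1))
params = list(net.parameters())

def param_grad(x):
    """Flattened gradient of the scalar network output w.r.t. all parameters."""
    out = net(x.unsqueeze(0)).squeeze()
    grads = torch.autograd.grad(out, params)
    return torch.cat([g.reshape(-1) for g in grads])

X_train, y_train = torch.randn(30, 5), torch.randn(30)
X_test = torch.randn(5, 5)

J_train = torch.stack([param_grad(x) for x in X_train])   # (30, n_params)
J_test = torch.stack([param_grad(x) for x in X_test])     # (5,  n_params)

K = J_train @ J_train.t()                                  # train/train NTK
K_star = J_test @ J_train.t()                              # test/train NTK
alpha = torch.linalg.solve(K + 1e-6 * torch.eye(30), y_train)  # small jitter for stability
pred = K_star @ alpha                                      # kernel regression prediction
print(pred)
```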
