Search Results for author: David Reeb

Found 10 papers, 4 papers with code

Tighter Confidence Bounds for Sequential Kernel Regression

no code implementations · 19 Mar 2024 · Hamish Flynn, David Reeb

Confidence bounds rigorously quantify the uncertainty of a model's predictions; in this capacity, they can inform the exploration-exploitation trade-off and form a core component of many sequential learning and decision-making algorithms.

Decision Making · regression
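For context on the snippet above, here is a minimal sketch of how confidence bounds around a kernel regressor typically drive exploration, in the style of a GP-UCB rule. The RBF kernel, the bound width `beta`, the regularizer, and the data are illustrative assumptions, not the paper's tighter bounds.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=0.5):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * lengthscale ** 2))

def kernel_ucb(X, y, X_cand, beta=2.0, reg=1e-2):
    """Posterior mean +/- beta * std of kernelized ridge regression.

    A standard GP-style confidence band; `beta` controls the width and
    is where tighter bounds, as studied in the paper, would enter.
    """
    K_inv = np.linalg.inv(rbf_kernel(X, X) + reg * np.eye(len(X)))
    k_star = rbf_kernel(X_cand, X)                      # (m, n)
    mean = k_star @ K_inv @ y
    var = 1.0 - np.einsum("ij,jk,ik->i", k_star, K_inv, k_star)
    std = np.sqrt(np.maximum(var, 0.0))
    return mean, mean - beta * std, mean + beta * std

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(8, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(8)
X_cand = np.linspace(-1, 1, 50)[:, None]
mean, lower, upper = kernel_ucb(X, y, X_cand)
print("next query (max UCB):", X_cand[np.argmax(upper), 0])
```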

PAC-Bayes Bounds for Bandit Problems: A Survey and Experimental Comparison

no code implementations · 29 Nov 2022 · Hamish Flynn, David Reeb, Melih Kandemir, Jan Peters

We found that PAC-Bayes bounds are a useful tool for designing offline bandit algorithms with performance guarantees.

Decision Making
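As background, here is a minimal sketch of evaluating a standard PAC-Bayes-kl bound of the kind such surveys compare: with probability at least 1 - delta, kl(empirical risk || true risk) <= (KL(Q||P) + ln(2*sqrt(n)/delta)) / n, which is inverted by bisection to get an upper bound on the true risk. The empirical risk, KL term, n, and delta below are placeholder numbers, not results from the paper.

```python
import math

def binary_kl(q, p):
    """kl(q || p) between Bernoulli distributions with means q and p."""
    eps = 1e-12
    q, p = min(max(q, eps), 1 - eps), min(max(p, eps), 1 - eps)
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

def pac_bayes_kl_bound(emp_risk, kl_qp, n, delta):
    """Invert kl(emp_risk || r) <= (KL(Q||P) + ln(2*sqrt(n)/delta)) / n
    by bisection over r; kl is increasing in r for r >= emp_risk."""
    rhs = (kl_qp + math.log(2 * math.sqrt(n) / delta)) / n
    lo, hi = emp_risk, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if binary_kl(emp_risk, mid) > rhs:
            hi = mid
        else:
            lo = mid
    return hi

# Illustrative numbers: empirical loss 0.25 on n = 10_000 logged bandit
# rounds, KL(Q||P) = 5 nats, confidence level 1 - delta = 0.95.
print(pac_bayes_kl_bound(emp_risk=0.25, kl_qp=5.0, n=10_000, delta=0.05))
```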

Validation of Composite Systems by Discrepancy Propagation

no code implementations · 21 Oct 2022 · David Reeb, Kanil Patel, Karim Barsim, Martin Schiegg, Sebastian Gerwinn

Assessing the validity of a real-world system with respect to given quality criteria is a common yet costly task in industrial applications due to the vast number of required real-world tests.

Experimental Design · valid

Utilizing Expert Features for Contrastive Learning of Time-Series Representations

1 code implementation · 23 Jun 2022 · Manuel Nonnenmacher, Lukas Oldenburg, Ingo Steinwart, David Reeb

We therefore devise ExpCLR, a novel contrastive learning approach built on an objective that utilizes expert features to encourage two desirable properties in the learned representation.

Contrastive Learning · Representation Learning · +2
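To make the idea concrete, here is one plausible way to fold expert features into a contrastive objective: pick each sample's positive as its nearest neighbour in expert-feature space inside an InfoNCE-style loss. This is an illustrative stand-in, not the ExpCLR objective itself; `expert_guided_contrastive_loss` and all data below are hypothetical.

```python
import numpy as np

def expert_guided_contrastive_loss(z, expert, temperature=0.1):
    """InfoNCE-style loss where each sample's positive is its nearest
    neighbour in expert-feature space (illustrative stand-in only).

    z:      (n, d) learned representations, L2-normalised below
    expert: (n, k) expert features per sample
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / temperature                   # (n, n) similarity logits
    np.fill_diagonal(sim, -np.inf)                # exclude self-pairs

    # Positives: nearest neighbour under expert-feature distance.
    d = ((expert[:, None, :] - expert[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)
    pos = d.argmin(axis=1)

    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(len(z)), pos].mean()

rng = np.random.default_rng(0)
z = rng.standard_normal((32, 16))       # e.g. time-series encoder outputs
expert = rng.standard_normal((32, 4))   # e.g. hand-crafted expert features
print(expert_guided_contrastive_loss(z, expert))
```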

SOSP: Efficiently Capturing Global Correlations by Second-Order Structured Pruning

1 code implementation · NeurIPS 2021 · Manuel Nonnenmacher, Thomas Pfeil, Ingo Steinwart, David Reeb

We validate SOSP-H by comparing it to our second method, SOSP-I, which uses a well-established Hessian approximation, and to numerous state-of-the-art methods.
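As background on the mechanism, second-order structured-pruning scores are typically built from Hessian-vector products: the loss change from zeroing a structure is approximated by g^T dw + 0.5 dw^T H dw. The sketch below scores hidden units of a toy network this way; the model, batch, and scoring details are illustrative, not the authors' SOSP-H algorithm.

```python
import torch
import torch.nn as nn

# Toy model and batch (illustrative; SOSP scores structures of a trained net).
torch.manual_seed(0)
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
x, y = torch.randn(64, 10), torch.randn(64, 1)
loss = nn.functional.mse_loss(model(x), y)

params = [p for p in model.parameters() if p.requires_grad]
grads = torch.autograd.grad(loss, params, create_graph=True)

def hvp(vec):
    """Hessian-vector product H @ vec via double backprop."""
    dot = sum((g * v).sum() for g, v in zip(grads, vec))
    return torch.autograd.grad(dot, params, retain_graph=True)

# Score hidden units of the first layer: dw sets a unit's weights and bias
# to zero; the loss change is ~ g^T dw + 0.5 * dw^T H dw.
W, b = model[0].weight, model[0].bias
for unit in range(3):  # first few units, for brevity
    dw = [torch.zeros_like(p) for p in params]
    dw[0][unit] = -W[unit].detach()
    dw[1][unit] = -b[unit].detach()
    Hdw = hvp(dw)
    first = sum((g * d).sum() for g, d in zip(grads, dw))
    second = 0.5 * sum((h * d).sum() for h, d in zip(Hdw, dw))
    print(f"unit {unit}: saliency ~ {(first + second).item():.4f}")
```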

Which Minimizer Does My Neural Network Converge To?

no code implementations · 4 Nov 2020 · Manuel Nonnenmacher, David Reeb, Ingo Steinwart

The loss surface of an overparameterized neural network (NN) possesses many global minima of zero training error.
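A tiny numerical illustration of the phenomenon (not the paper's analysis): gradient descent on an overparameterized linear model reaches zero training error from any starting point, but which zero-error minimizer it reaches, and hence for instance its norm, depends on the initialization.

```python
import numpy as np

# Overparameterized least squares: 5 data points, 20 parameters, so the
# zero-training-error solutions form a 15-dimensional affine subspace.
rng = np.random.default_rng(0)
X, y = rng.standard_normal((5, 20)), rng.standard_normal(5)

def run_gd(w0, lr=0.01, steps=20_000):
    w = w0.copy()
    for _ in range(steps):
        w -= lr * X.T @ (X @ w - y)
    return w

for seed in (1, 2):
    w0 = np.random.default_rng(seed).standard_normal(20)
    w = run_gd(w0)
    # GD only moves within the row space of X, so the minimizer it finds
    # is the projection of w0 onto the solution set: initialization-dependent.
    print(f"init {seed}: train err {np.abs(X @ w - y).max():.2e}, "
          f"||w|| = {np.linalg.norm(w):.3f}")
```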

Wide Neural Networks are Interpolating Kernel Methods: Impact of Initialization on Generalization

no code implementations · 25 Sep 2019 · Manuel Nonnenmacher, David Reeb, Ingo Steinwart

The recently developed link between strongly overparametrized neural networks (NNs) and kernel methods has opened a new way to understand puzzling features of NNs, such as their convergence and generalization behaviors.

Generalization Bounds
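To illustrate the kernel-method side of this link: under the neural-tangent-kernel correspondence, a trained wide network behaves like kernel regression that interpolates the training data, i.e. kernel ridge regression with the regularizer taken to zero. The sketch below uses an RBF kernel as a stand-in for a network's NTK; kernel choice and data are illustrative assumptions.

```python
import numpy as np

def rbf(A, B, ls=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ls ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(10, 1))
y = np.sin(3 * X[:, 0])

# Interpolating kernel regression: regularization -> 0 (tiny jitter for
# numerical stability), so the predictor fits the training targets exactly,
# as a wide trained NN does in the NTK regime.
alpha = np.linalg.solve(rbf(X, X) + 1e-10 * np.eye(10), y)
predict = lambda Z: rbf(Z, X) @ alpha
print("max train residual:", np.abs(predict(X) - y).max())
```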
