Search Results for author: Peter D. Grünwald

Found 4 papers, 0 papers with code

The no-free-lunch theorems of supervised learning

no code implementations • 9 Feb 2022 • Tom F. Sterkenburg, Peter D. Grünwald

The no-free-lunch theorems promote a skeptical conclusion that all possible machine learning algorithms equally lack justification.

Inductive Bias · Learning Theory · +1

Fast Rates for General Unbounded Loss Functions: from ERM to Generalized Bayes

no code implementations • 1 May 2016 • Peter D. Grünwald, Nishant A. Mehta

For general loss functions, our bounds rely on two separate conditions: the $v$-GRIP (generalized reversed information projection) conditions, which control the lower tail of the excess loss; and the newly introduced witness condition, which controls the upper tail.

Bayesian Inference
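
For orientation, a hedged sketch of the flavor of a witness-style condition: the excess loss of any predictor $f$ over the risk minimizer $f^*$ must be witnessed, up to a constant, by its bounded part. The notation below ($u$, $c$, $\ell_f$, $P$) is an illustrative paraphrase, not quoted from the paper; the precise definition is given there.

\[
\mathbb{E}_{Z \sim P}\big[(\ell_f(Z) - \ell_{f^*}(Z))\,\mathbf{1}\{\ell_f(Z) - \ell_{f^*}(Z) \le u\}\big] \;\ge\; c \cdot \mathbb{E}_{Z \sim P}\big[\ell_f(Z) - \ell_{f^*}(Z)\big] \quad \text{for all } f
\]

Requiring this for fixed constants $u > 0$ and $c \in (0, 1]$ is one way to rule out excess-loss distributions whose expectation is carried almost entirely by a heavy upper tail.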

Fast rates in statistical and online learning

no code implementations • 9 Jul 2015 • Tim van Erven, Peter D. Grünwald, Nishant A. Mehta, Mark D. Reid, Robert C. Williamson

For bounded losses, we show how the central condition enables a direct proof of fast rates and we prove its equivalence to the Bernstein condition, itself a generalization of the Tsybakov margin condition, both of which have played a central role in obtaining fast rates in statistical learning.

Density Estimation · Learning Theory
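
For context on the conditions named in the abstract, here is the Bernstein condition in its standard form from the statistical learning literature (with constant $B > 0$, exponent $\beta \in (0, 1]$, loss $\ell_f$ of predictor $f$, and risk minimizer $f^*$); the paper's central condition and the exact equivalence statement should be taken from the paper itself, so treat this only as background notation.

\[
\mathbb{E}\big[(\ell_f - \ell_{f^*})^2\big] \;\le\; B \,\big(\mathbb{E}[\ell_f - \ell_{f^*}]\big)^{\beta} \quad \text{for all } f \text{ in the class}
\]

The case $\beta = 1$ is the strongest form and is the one typically associated with rates of order $1/n$ for bounded losses; the Tsybakov margin condition for classification implies a Bernstein condition with an exponent determined by the margin parameter.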
