Search Results for author: Rafael Müller

Found 2 papers, 1 paper with code

Subclass Distillation

no code implementations · 10 Feb 2020 · Rafael Müller, Simon Kornblith, Geoffrey Hinton

By training a small "student" model to match the class probabilities produced by a large "teacher" model, it is possible to transfer most of the teacher's generalization ability to the student, often producing a much better small model than training the student directly on the training data.
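
The core training objective behind this idea can be sketched in a few lines. The following is a minimal, generic distillation loss in PyTorch, not the paper's exact subclass method; the temperature T, mixing weight alpha, and the toy tensors are illustrative assumptions.

```python
# Minimal sketch of knowledge distillation: the student is trained to match
# the teacher's temperature-softened class probabilities, plus a standard
# cross-entropy term on the hard labels. T and alpha are assumed values.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Weighted sum of a soft KL term (match the teacher) and a hard CE term."""
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        soft_targets,
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude is comparable across temperatures
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Toy usage with random logits for a 10-class problem.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```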

When Does Label Smoothing Help?

3 code implementations · NeurIPS 2019 · Rafael Müller, Simon Kornblith, Geoffrey Hinton

The generalization and learning speed of a multi-class neural network can often be significantly improved by using soft targets that are a weighted average of the hard targets and the uniform distribution over labels.
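
These soft targets have a simple closed form: with smoothing weight ε and K classes, the smoothed target is (1 − ε) · one_hot + ε/K. A minimal sketch in PyTorch follows; the function name smooth_labels and the value ε = 0.1 are illustrative assumptions, not drawn from the paper.

```python
# Minimal sketch of label smoothing: a weighted average of the one-hot (hard)
# targets and the uniform distribution over the K labels. epsilon is assumed.
import torch

def smooth_labels(labels, num_classes, epsilon=0.1):
    """Return (1 - epsilon) * one_hot + epsilon * uniform."""
    one_hot = torch.nn.functional.one_hot(labels, num_classes).float()
    uniform = torch.full_like(one_hot, 1.0 / num_classes)
    return (1.0 - epsilon) * one_hot + epsilon * uniform

# Toy usage: three examples, five classes. Each row sums to 1; the true class
# gets 1 - epsilon + epsilon/K, every other class gets epsilon/K.
targets = smooth_labels(torch.tensor([0, 2, 4]), num_classes=5)
```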

Tasks: Image Classification · Knowledge Distillation · +3
