2 code implementations • 26 Mar 2020 • Geoff French, Avital Oliver, Tim Salimans
Using it to provide perturbations for semi-supervised consistency regularization, we achieve a state-of-the-art result on ImageNet with 10% labeled data, with a top-5 error of 8.76% and top-1 error of 26.06%.
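The core idea named here — semi-supervised consistency regularization — penalizes the model when its predictions on an unlabeled input and a perturbed copy of that input disagree. A minimal sketch of that loss, using plain NumPy, a toy softmax classifier, and additive Gaussian noise as a stand-in for the paper's perturbation scheme (all names and the noise choice are illustrative, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x, w):
    # Toy linear classifier with a numerically stable softmax output.
    logits = x @ w
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def consistency_loss(x_unlabeled, w, noise_scale=0.1):
    # Perturb each unlabeled input (here: additive Gaussian noise,
    # standing in for a structured perturbation such as a mask) and
    # penalize the squared difference between the model's predictions
    # on the clean and perturbed versions. No labels are needed.
    x_perturbed = x_unlabeled + noise_scale * rng.standard_normal(x_unlabeled.shape)
    p_clean = model(x_unlabeled, w)
    p_perturbed = model(x_perturbed, w)
    return np.mean((p_clean - p_perturbed) ** 2)

x = rng.standard_normal((8, 4))   # 8 unlabeled examples, 4 features
w = rng.standard_normal((4, 3))   # 3 classes
loss = consistency_loss(x, w)
```

In practice this unlabeled-data term is added, with a weighting coefficient, to the usual supervised cross-entropy on the labeled subset.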
2 code implementations • 24 Mar 2020 • Casper Kaae Sønderby, Lasse Espeholt, Jonathan Heek, Mostafa Dehghani, Avital Oliver, Tim Salimans, Shreya Agrawal, Jason Hickey, Nal Kalchbrenner
Weather forecasting is a long-standing scientific challenge with direct social and economic impact.
1 code implementation • ICCV 2019 • Xiaohua Zhai, Avital Oliver, Alexander Kolesnikov, Lucas Beyer
This work tackles the problem of semi-supervised learning of image classifiers.
Ranked #11 on Semi-Supervised Image Classification on ImageNet - 10% labeled data (Top 5 Accuracy metric)
28 code implementations • NeurIPS 2019 • David Berthelot, Nicholas Carlini, Ian Goodfellow, Nicolas Papernot, Avital Oliver, Colin Raffel
Semi-supervised learning has proven to be a powerful paradigm for leveraging unlabeled data to mitigate the reliance on large labeled datasets.
no code implementations • 13 Dec 2018 • Hong-Yu Zhou, Avital Oliver, Jianxin Wu, Yefeng Zheng
While practitioners have had an intuitive understanding of these observations, we perform a comprehensive empirical analysis and demonstrate that: (1) the gains from SSL techniques over a fully-supervised baseline are smaller when training from a pre-trained model than when training from random initialization, (2) when the domain of the source data used to train the pre-trained model differs significantly from the domain of the target task, the gains from SSL are significantly higher, and (3) some SSL methods (like Pseudo-Label) are able to improve upon fully-supervised baselines.
7 code implementations • NeurIPS 2018 • Avital Oliver, Augustus Odena, Colin Raffel, Ekin D. Cubuk, Ian J. Goodfellow
However, we argue that these benchmarks fail to address many issues that these algorithms would face in real-world applications.
3 code implementations • 1 Jul 2017 • Tambet Matiisen, Avital Oliver, Taco Cohen, John Schulman
We propose Teacher-Student Curriculum Learning (TSCL), a framework for automatic curriculum learning, where the Student tries to learn a complex task and the Teacher automatically chooses subtasks from a given set for the Student to train on.
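The mechanism described — a Teacher that automatically picks which subtask the Student trains on next — can be sketched as a bandit over subtasks that favors whichever subtask currently shows the largest learning progress, which is the spirit of TSCL. This is a minimal illustrative sketch, not the paper's algorithm: the epsilon-greedy policy, the smoothing constant, and the toy Student are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class Teacher:
    """Pick the subtask whose score is changing fastest (absolute
    learning progress), with epsilon-greedy exploration. Parameter
    names and defaults are illustrative, not the paper's."""

    def __init__(self, n_tasks, epsilon=0.1, alpha=0.3):
        self.q = np.zeros(n_tasks)          # smoothed progress estimate per subtask
        self.last_score = np.zeros(n_tasks) # most recent score per subtask
        self.epsilon = epsilon
        self.alpha = alpha

    def choose_task(self):
        if rng.random() < self.epsilon:
            return int(rng.integers(len(self.q)))
        return int(np.argmax(np.abs(self.q)))

    def update(self, task, score):
        # Learning progress = change in score since the last attempt;
        # exponentially smoothed so noisy scores don't dominate.
        progress = score - self.last_score[task]
        self.q[task] = (1 - self.alpha) * self.q[task] + self.alpha * progress
        self.last_score[task] = score

# Toy Student: skill on subtask i rises with diminishing returns,
# more slowly for harder (higher-index) subtasks.
teacher = Teacher(n_tasks=3)
skill = np.zeros(3)
for step in range(200):
    t = teacher.choose_task()
    skill[t] += 0.1 * (1.0 - skill[t]) / (t + 1)
    teacher.update(t, skill[t])
```

Because progress on a mastered subtask shrinks toward zero, the Teacher's attention naturally migrates to subtasks where the Student is still improving.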