1 code implementation • 21 Aug 2024 • Xingyou Song, Qiuyi Zhang, Chansoo Lee, Emily Fertig, Tzu-Kuo Huang, Lior Belenki, Greg Kochanski, Setareh Ariafar, Srinivas Vasudevan, Sagi Perel, Daniel Golovin
Google Vizier has performed millions of optimizations and accelerated numerous research and production systems at Google, demonstrating the success of Bayesian optimization as a large-scale service.
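As an illustration of the underlying technique only (a generic sketch, not Vizier's actual client API), the loop below runs Bayesian optimization with a Gaussian-process surrogate and expected-improvement acquisition; the toy objective `f`, the lengthscale, and the jitter are all placeholder assumptions.

```python
import numpy as np
from scipy.stats import norm

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel on 1-D inputs.
    return np.exp(-0.5 * np.subtract.outer(a, b) ** 2 / ls ** 2)

def gp_posterior(x_train, y_train, x_test, jitter=1e-4):
    # Standard GP regression posterior mean and stddev at x_test.
    K = rbf(x_train, x_train) + jitter * np.eye(len(x_train))
    Ks, Kss = rbf(x_train, x_test), rbf(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v ** 2, axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # EI for minimization: expected gain below the incumbent.
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

f = lambda x: np.sin(3 * x) + 0.5 * x        # toy objective to minimize
xs = np.random.uniform(0, 2, size=3)          # initial random trials
ys = f(xs)
grid = np.linspace(0, 2, 200)
for _ in range(10):                           # suggest / evaluate loop
    mu, sigma = gp_posterior(xs, ys, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, ys.min()))]
    xs, ys = np.append(xs, x_next), np.append(ys, f(x_next))
print("best x:", xs[np.argmin(ys)], "best y:", ys.min())
```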
1 code implementation • ICLR 2022 • Gianluigi Silvestri, Emily Fertig, Dave Moore, Luca Ambrogioni
We also introduce gated structured layers, which allow bypassing the parts of the models that fail to capture the statistics of the data.
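A minimal sketch of the gating idea, assuming the standard formulation of a learnable convex combination between a structured transform and the identity; `structured_fn` and the gate parameterization here are illustrative stand-ins, not the paper's exact invertible flow layers.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_structured_layer(x, structured_fn, gate_logit):
    # Convex combination of a structured transform and the identity.
    # If the learned gate saturates near 0, the layer reduces to the
    # identity, bypassing a structured component that mismatches the data.
    g = sigmoid(gate_logit)          # gate in (0, 1), learned in practice
    return g * structured_fn(x) + (1.0 - g) * x

x = np.random.randn(5)
y = gated_structured_layer(x, np.tanh, gate_logit=-4.0)  # gate ~ 0.018
print(np.allclose(y, x, atol=0.1))   # nearly the identity: bypassed
```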
2 code implementations • 3 Feb 2020 • Luca Ambrogioni, Kate Lin, Emily Fertig, Sharad Vikram, Max Hinne, Dave Moore, Marcel van Gerven
However, the performance of the variational approach depends on the choice of an appropriate variational family.
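A small worked example of that dependence, assuming Gaussian targets so the KL divergence is available in closed form: a mean-field (diagonal) family cannot drive the KL to zero on a correlated posterior, while a full-covariance family can.

```python
import numpy as np

def kl_gauss(mu0, S0, mu1, S1):
    # Closed-form KL( N(mu0, S0) || N(mu1, S1) ).
    d = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Correlated 2-D Gaussian "posterior".
target_cov = np.array([[1.0, 0.9], [0.9, 1.0]])
mu = np.zeros(2)

# Optimal mean-field Gaussian under reverse KL matches the diagonal of the
# target precision, so its variances are 1 / Lambda_ii.
meanfield_cov = np.diag(1.0 / np.diag(np.linalg.inv(target_cov)))
print("mean-field KL:", kl_gauss(mu, meanfield_cov, mu, target_cov))  # > 0
print("full-cov  KL:", kl_gauss(mu, target_cov, mu, target_cov))      # 0
```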
4 code implementations • NeurIPS 2019 • Jie Ren, Peter J. Liu, Emily Fertig, Jasper Snoek, Ryan Poplin, Mark A. DePristo, Joshua V. Dillon, Balaji Lakshminarayanan
We propose a likelihood ratio method for deep generative models which effectively corrects for these confounding background statistics.
Out-of-Distribution (OOD) Detection
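A toy sketch of the likelihood-ratio idea on made-up Bernoulli data: a background model fit to input-perturbed data absorbs population-level statistics, so the ratio of log-likelihoods scores the semantic content that distinguishes in-distribution inputs. The data-generating probabilities and perturbation rate below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary data: semantic bits in the first half; background bits in the
# second half are shared between in-distribution and OOD data.
def sample(n, semantic_p):
    sem = rng.random((n, 8)) < semantic_p
    bg = rng.random((n, 8)) < 0.7
    return np.concatenate([sem, bg], axis=1).astype(float)

train = sample(5000, semantic_p=0.9)
ind = sample(100, semantic_p=0.9)    # in-distribution test set
ood = sample(100, semantic_p=0.3)    # OOD: semantics differ, background same

def fit_bernoulli(x):
    # Per-dimension Bernoulli MLE; returns a per-row log-likelihood function.
    p = x.mean(axis=0).clip(1e-3, 1 - 1e-3)
    return lambda z: (z * np.log(p) + (1 - z) * np.log(1 - p)).sum(axis=1)

log_p = fit_bernoulli(train)
# Background model: fit on randomly perturbed inputs so it captures only
# population-level statistics.
perturbed = np.where(rng.random(train.shape) < 0.5,
                     rng.integers(0, 2, train.shape), train)
log_p0 = fit_bernoulli(perturbed)

llr = lambda x: log_p(x) - log_p0(x)   # likelihood-ratio OOD score
print("mean LLR in-dist:", llr(ind).mean())
print("mean LLR OOD    :", llr(ood).mean())   # markedly lower for OOD
```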
2 code implementations • NeurIPS 2019 • Yaniv Ovadia, Emily Fertig, Jie Ren, Zachary Nado, D. Sculley, Sebastian Nowozin, Joshua V. Dillon, Balaji Lakshminarayanan, Jasper Snoek
Modern machine learning methods including deep learning have achieved great success in predictive accuracy for supervised learning tasks, but may still fall short in giving useful estimates of their predictive uncertainty.
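One of the baselines benchmarked in this line of work, deep ensembles, reduces to averaging member predictions and reading off the predictive entropy as an uncertainty signal; a minimal sketch with made-up member probabilities:

```python
import numpy as np

def ensemble_predict(member_probs):
    # Average class probabilities across ensemble members.
    return np.mean(member_probs, axis=0)

def predictive_entropy(p):
    # Entropy of the averaged predictive distribution, per example.
    return -(p * np.log(p.clip(1e-12))).sum(axis=-1)

# Three members that agree on the first input and disagree on a shifted one.
members = np.array([
    [[0.9, 0.1], [0.8, 0.2]],
    [[0.9, 0.1], [0.2, 0.8]],
    [[0.8, 0.2], [0.5, 0.5]],
])
p = ensemble_predict(members)
print(predictive_entropy(p))  # lower where members agree, higher under disagreement
```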
no code implementations • 17 May 2019 • Bryan Seybold, Emily Fertig, Alex Alemi, Ian Fischer
Variational autoencoders learn unsupervised data representations, but these models frequently converge to minima that fail to preserve meaningful semantic information.
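One common diagnostic for such minima (an interpretation offered here, not the paper's method) is the ELBO's KL term: when the approximate posterior collapses to the prior, the KL is zero and the latent code carries no information about the input.

```python
import numpy as np

def gaussian_kl(mu, logvar):
    # KL( N(mu, exp(logvar)) || N(0, I) ) per example: the ELBO's rate term.
    return 0.5 * np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar, axis=-1)

# Collapsed encoder: posterior equals the prior, so the rate is zero and z
# is uninformative about x -- the kind of semantics-free minimum described.
collapsed = gaussian_kl(np.zeros((4, 8)), np.zeros((4, 8)))
informative = gaussian_kl(np.full((4, 8), 1.5), np.full((4, 8), -2.0))
print(collapsed)     # all zeros: no information in the code
print(informative)   # positive rate: the code transmits information
```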
no code implementations • 6 Dec 2018 • Emily Fertig, Aryan Arbabi, Alexander A. Alemi
In this paper, we investigate the degree to which the encoding of a $\beta$-VAE captures label information across multiple architectures on Binary Static MNIST and Omniglot.
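A simple stand-in for "label information in the encoding" is the accuracy of a linear probe trained on latent codes; the sketch below uses synthetic latents and scikit-learn's LogisticRegression, and is a methodological assumption for illustration, not the paper's exact metric.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def label_info_probe(z_train, y_train, z_test, y_test):
    # Held-out accuracy of a linear classifier on latent codes, a common
    # proxy for how much label information the encoding retains.
    clf = LogisticRegression(max_iter=1000).fit(z_train, y_train)
    return clf.score(z_test, y_test)

# Toy latents with class-dependent means, so the probe can score well
# above chance.
rng = np.random.default_rng(0)
y = rng.integers(0, 10, 2000)
z = rng.normal(size=(2000, 16)) + y[:, None] * 0.5
print(label_info_probe(z[:1500], y[:1500], z[1500:], y[1500:]))
```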