Search Results for author: Ghassen Jerfel

Found 15 papers, 7 papers with code

A Simple Approach to Improve Single-Model Deep Uncertainty via Distance-Awareness

2 code implementations • 1 May 2022 • Jeremiah Zhe Liu, Shreyas Padhy, Jie Ren, Zi Lin, Yeming Wen, Ghassen Jerfel, Zack Nado, Jasper Snoek, Dustin Tran, Balaji Lakshminarayanan

The most popular approaches to estimate predictive uncertainty in deep learning are methods that combine predictions from multiple neural networks, such as Bayesian neural networks (BNNs) and deep ensembles.

Data Augmentation • Probabilistic Deep Learning • +1
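
The premise here is that a single network can estimate uncertainty well if its output reflects the distance between a test input and the training data. Below is a minimal numpy sketch of that distance-awareness principle only (not the paper's SNGP method); the function names and the RBF kernel choice are illustrative.

    import numpy as np

    def rbf_similarity(x, X_train, length_scale=1.0):
        # RBF kernel similarity between a test point and each training point.
        d2 = np.sum((X_train - x) ** 2, axis=1)
        return np.exp(-d2 / (2.0 * length_scale ** 2))

    def distance_aware_uncertainty(x, X_train, length_scale=1.0):
        # Uncertainty is high when the test point is far from all training data.
        return 1.0 - rbf_similarity(x, X_train, length_scale).max()

    X_train = np.random.randn(100, 2)  # toy training inputs
    print(distance_aware_uncertainty(np.zeros(2), X_train))      # near data -> low
    print(distance_aware_uncertainty(10 * np.ones(2), X_train))  # far away -> high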

Sparse MoEs meet Efficient Ensembles

1 code implementation • 7 Oct 2021 • James Urquhart Allingham, Florian Wenzel, Zelda E Mariet, Basil Mustafa, Joan Puigcerver, Neil Houlsby, Ghassen Jerfel, Vincent Fortuin, Balaji Lakshminarayanan, Jasper Snoek, Dustin Tran, Carlos Riquelme Ruiz, Rodolphe Jenatton

Machine learning models based on the aggregated outputs of submodels, either at the activation or prediction levels, often exhibit strong performance compared to individual models.

Few-Shot Learning
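
The distinction between aggregating at the activation level and at the prediction level can be made concrete with a small numpy sketch; random logits stand in for the submodels, so this is illustrative only, not the paper's MoE routing.

    import numpy as np

    def softmax(z, axis=-1):
        z = z - z.max(axis=axis, keepdims=True)  # for numerical stability
        e = np.exp(z)
        return e / e.sum(axis=axis, keepdims=True)

    logits = np.random.randn(4, 3)  # 4 submodels, 3 classes

    # Activation-level aggregation: average logits, then apply softmax.
    p_activation = softmax(logits.mean(axis=0))

    # Prediction-level aggregation: apply softmax per submodel, then average.
    p_prediction = softmax(logits).mean(axis=0)

    print(p_activation, p_prediction)  # generally not equal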

Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence

no code implementations • 30 Jun 2021 • Ghassen Jerfel, Serena Wang, Clara Fannjiang, Katherine A. Heller, Yian Ma, Michael I. Jordan

We thus propose a novel combination of optimization and sampling techniques for approximate Bayesian inference by constructing an IS proposal distribution through the minimization of a forward KL (FKL) divergence.

Bayesian Inference • Variational Inference
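
A minimal sketch of the sampling half of this method: given a proposal q (the paper fits q by minimizing the mass-covering forward KL, a step omitted here), expectations under the target p are estimated with self-normalized importance weights. The densities below are illustrative stand-ins.

    import numpy as np
    from scipy.stats import norm

    # Target p (knowing it only up to a constant is fine for SNIS) and a
    # proposal q; an FKL-fitted q tends to cover p's mass, stabilizing weights.
    log_p = lambda x: norm.logpdf(x, loc=2.0, scale=0.5)
    q = norm(loc=1.5, scale=1.0)  # stand-in for the FKL-refined proposal

    x = q.rvs(size=10_000, random_state=0)
    log_w = log_p(x) - q.logpdf(x)
    w = np.exp(log_w - log_w.max())  # stabilized importance weights
    w /= w.sum()                     # self-normalize

    print(np.sum(w * x))  # SNIS estimate of E_p[X], approximately 2.0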

Combining Ensembles and Data Augmentation can Harm your Calibration

no code implementations • ICLR 2021 • Yeming Wen, Ghassen Jerfel, Rafael Muller, Michael W. Dusenberry, Jasper Snoek, Balaji Lakshminarayanan, Dustin Tran

Ensemble methods which average over multiple neural network predictions are a simple approach to improve a model's calibration and robustness.

Data Augmentation
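
A minimal sketch of the prediction averaging described above, with random logits standing in for trained member networks: the ensemble's predictive distribution is the mean of the members' distributions, which typically tempers overconfident members.

    import numpy as np

    def softmax(z):
        z = z - z.max(axis=-1, keepdims=True)
        return np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)

    # Stand-in for M trained networks' logits on one input (M=5, 3 classes).
    member_logits = 3.0 * np.random.randn(5, 3)
    member_probs = softmax(member_logits)

    ensemble_probs = member_probs.mean(axis=0)  # average of predictions
    print(member_probs.max(axis=-1))  # each member's confidence
    print(ensemble_probs.max())       # usually less extreme than the members'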

Efficient and Scalable Bayesian Neural Nets with Rank-1 Factors

1 code implementation • ICML 2020 • Michael W. Dusenberry, Ghassen Jerfel, Yeming Wen, Yi-An Ma, Jasper Snoek, Katherine Heller, Balaji Lakshminarayanan, Dustin Tran

Bayesian neural networks (BNNs) demonstrate promising success in improving the robustness and uncertainty quantification of modern deep learning.

Uncertainty Quantification
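
The "rank-1 factors" of the title refer to placing distributions only on a rank-1 multiplicative perturbation of each weight matrix, keeping the full matrix deterministic. A rough numpy sketch of that idea follows; the Gaussian scales and shapes are illustrative, not the paper's exact parameterization or priors.

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.standard_normal((4, 3))  # shared deterministic weight matrix

    def sample_rank1_weights(W, rng):
        # Sample only the rank-1 factors r (outputs) and s (inputs); the full
        # matrix stays fixed, so the posterior has far fewer random parameters.
        r = 1.0 + 0.1 * rng.standard_normal(W.shape[0])
        s = 1.0 + 0.1 * rng.standard_normal(W.shape[1])
        return W * np.outer(r, s)  # elementwise rank-1 perturbation

    x = rng.standard_normal(3)
    # Monte Carlo prediction: average outputs over weight samples.
    outputs = np.stack([sample_rank1_weights(W, rng) @ x for _ in range(8)])
    print(outputs.mean(axis=0))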

Analyzing the Role of Model Uncertainty for Electronic Health Records

1 code implementation • 10 Jun 2019 • Michael W. Dusenberry, Dustin Tran, Edward Choi, Jonas Kemp, Jeremy Nixon, Ghassen Jerfel, Katherine Heller, Andrew M. Dai

We further show that RNNs with only Bayesian embeddings can be a more efficient way to capture model uncertainty compared to ensembles, and we analyze how model uncertainty is impacted across individual input features and patient subgroups.
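
A rough sketch of the "Bayesian embeddings only" idea: place a Gaussian posterior on the embedding table, keep the rest of the model deterministic, and average predictions over embedding samples, reading off their spread as model uncertainty. The toy encoder and shapes below are illustrative, not the paper's RNN.

    import numpy as np

    rng = np.random.default_rng(0)
    vocab, dim = 50, 8
    emb_mean = rng.standard_normal((vocab, dim))
    emb_std = 0.1 * np.ones((vocab, dim))  # Gaussian posterior scales
    w_out = rng.standard_normal(dim)       # deterministic output head

    def predict(token_ids, n_samples=16):
        probs = []
        for _ in range(n_samples):
            # Sample an embedding table from its posterior.
            emb = emb_mean + emb_std * rng.standard_normal(emb_mean.shape)
            h = emb[token_ids].mean(axis=0)  # toy sequence encoder: mean-pool
            probs.append(1.0 / (1.0 + np.exp(-h @ w_out)))  # sigmoid output
        probs = np.array(probs)
        return probs.mean(), probs.std()  # prediction and its uncertainty

    print(predict(np.array([3, 17, 42])))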

Modulating transfer between tasks in gradient-based meta-learning

no code implementations • ICLR 2019 • Erin Grant, Ghassen Jerfel, Katherine Heller, Thomas L. Griffiths

Learning-to-learn, or meta-learning, leverages data-driven inductive bias to increase the efficiency of learning on a novel task.

Inductive Bias • Meta-Learning

AdaNet: A Scalable and Flexible Framework for Automatically Learning Ensembles

1 code implementation • 30 Apr 2019 • Charles Weill, Javier Gonzalvo, Vitaly Kuznetsov, Scott Yang, Scott Yak, Hanna Mazzawi, Eugen Hotaj, Ghassen Jerfel, Vladimir Macko, Ben Adlam, Mehryar Mohri, Corinna Cortes

AdaNet is a lightweight framework, built on TensorFlow (Abadi et al., 2015), for automatically learning high-quality ensembles with minimal expert intervention.

Neural Architecture Search

Measuring Calibration in Deep Learning

3 code implementations • 2 Apr 2019 • Jeremy Nixon, Mike Dusenberry, Ghassen Jerfel, Timothy Nguyen, Jeremiah Liu, Linchuan Zhang, Dustin Tran

In this paper, we perform a comprehensive empirical study of choices in calibration measures, including: measuring all probabilities rather than just the maximum prediction; thresholding probability values; class conditionality; the number of bins; bins that are adaptive to the datapoint density; and the norm used to compare accuracies to confidences.
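
Two of those choices, what to measure and where to put the bins, are easy to make concrete. Below is a minimal numpy sketch of expected calibration error computed on the maximum prediction, comparing equal-width bins against bins adaptive to the datapoint density (equal-mass quantile bins); the synthetic data and names are illustrative.

    import numpy as np

    def ece(confidences, correct, bin_edges):
        # Weighted average gap between mean confidence and accuracy per bin.
        total, n = 0.0, len(confidences)
        for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
            in_bin = (confidences > lo) & (confidences <= hi)
            if in_bin.any():
                gap = abs(confidences[in_bin].mean() - correct[in_bin].mean())
                total += in_bin.sum() / n * gap
        return total

    rng = np.random.default_rng(0)
    conf = rng.uniform(0.5, 1.0, size=2000)                   # max-probability scores
    correct = (rng.uniform(size=2000) < conf).astype(float)   # calibrated toy data

    equal_width = np.linspace(0.5, 1.0, 16)              # 15 equal-width bins
    adaptive = np.quantile(conf, np.linspace(0, 1, 16))  # 15 equal-mass bins
    print(ece(conf, correct, equal_width), ece(conf, correct, adaptive))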

Dynamic Collaborative Filtering with Compound Poisson Factorization

no code implementations • 17 Aug 2016 • Ghassen Jerfel, Mehmet E. Basbug, Barbara E. Engelhardt

Model-based collaborative filtering analyzes user-item interactions to infer latent factors that represent user preferences and item characteristics in order to predict future interactions.

Collaborative Filtering • Variational Inference
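
A minimal sketch of the latent-factor view described above, using plain Poisson factorization rather than the paper's compound Poisson variant: non-negative user and item factors whose inner product gives the expected interaction count.

    import numpy as np

    rng = np.random.default_rng(0)
    n_users, n_items, k = 20, 30, 5

    # Non-negative latent factors (Gamma draws, as in Poisson factorization).
    theta = rng.gamma(shape=0.3, scale=1.0, size=(n_users, k))  # user preferences
    beta = rng.gamma(shape=0.3, scale=1.0, size=(n_items, k))   # item attributes

    rate = theta @ beta.T       # expected interaction counts per user-item pair
    counts = rng.poisson(rate)  # simulated observed interactions

    # Predict future interactions for user 0: rank items by expected rate.
    print(counts[0, :5], np.argsort(-rate[0])[:5])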
