Search Results for author: Polina Kirichenko

Found 11 papers, 7 papers with code

Modeling Caption Diversity in Contrastive Vision-Language Pretraining

no code implementations30 Apr 2024 Samuel Lavoie, Polina Kirichenko, Mark Ibrahim, Mahmoud Assran, Andrew Gordon Wilson, Aaron Courville, Nicolas Ballas

Contrastive Language-Image Pretraining (CLIP), on the other hand, works by mapping an image and its caption to a single vector -- limiting how well CLIP-like models can represent the diverse ways to describe an image.

Zero-Shot Learning
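To make the single-vector bottleneck the excerpt describes concrete, here is a minimal NumPy sketch of the symmetric contrastive (InfoNCE-style) objective used by CLIP-like models; the function name, batch shapes, and temperature value are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def clip_contrastive_loss(image_embs, text_embs, temperature=0.07):
    """Symmetric contrastive loss over a batch of (image, caption) pairs.

    Each image and each caption is reduced to a single vector -- exactly
    the limitation highlighted above: one caption vector cannot capture
    the many valid ways to describe the same image.
    """
    # L2-normalize so the dot product is a cosine similarity
    img = image_embs / np.linalg.norm(image_embs, axis=1, keepdims=True)
    txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)

    logits = img @ txt.T / temperature   # (batch, batch) similarity matrix
    labels = np.arange(len(logits))      # matching pairs lie on the diagonal

    def cross_entropy(l, y):
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        log_probs = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -log_probs[np.arange(len(y)), y].mean()

    # average the image-to-text and text-to-image directions
    return 0.5 * (cross_entropy(logits, labels) + cross_entropy(logits.T, labels))
```

Matched image/caption pairs receive a low loss, while permuting the captions against the images drives the loss up, since the objective only rewards the one vector paired with each image.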

Does Progress On Object Recognition Benchmarks Improve Real-World Generalization?

no code implementations24 Jul 2023 Megan Richards, Polina Kirichenko, Diane Bouchacourt, Mark Ibrahim

Second, we study model generalization across geographies by measuring the disparities in performance across regions, a more fine-grained measure of real-world generalization.

Object Recognition

Chroma-VAE: Mitigating Shortcut Learning with Generative Classifiers

no code implementations28 Nov 2022 Wanqian Yang, Polina Kirichenko, Micah Goldblum, Andrew Gordon Wilson

Deep neural networks are susceptible to shortcut learning, using simple features to achieve low training loss without discovering essential semantic structure.

On Feature Learning in the Presence of Spurious Correlations

1 code implementation20 Oct 2022 Pavel Izmailov, Polina Kirichenko, Nate Gruver, Andrew Gordon Wilson

Deep classifiers are known to rely on spurious features -- patterns which are correlated with the target on the training data but not inherently relevant to the learning problem, such as the image backgrounds when classifying the foregrounds.
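A hedged toy construction of the setup the excerpt describes -- a core feature that always predicts the label and a spurious feature (think image background) that only correlates with it on training data -- might look like this; the function and feature names are invented for illustration:

```python
import numpy as np

def spurious_dataset(n, correlation, rng):
    """Toy data with one core and one spurious feature.

    The core feature tracks the label exactly (plus noise); the spurious
    feature agrees with the label only with probability `correlation`,
    so a classifier leaning on it fails when that correlation shifts.
    """
    y = rng.integers(0, 2, n)                       # binary labels
    core = y + 0.1 * rng.normal(size=n)             # always informative
    agree = rng.random(n) < correlation             # does background match label?
    spurious = np.where(agree, y, 1 - y) + 0.1 * rng.normal(size=n)
    return np.stack([core, spurious], axis=1), y
```

Training with `correlation` near 1 and evaluating with it near 0.5 reproduces the train/test mismatch that makes spurious features harmful.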

Task-agnostic Continual Learning with Hybrid Probabilistic Models

no code implementations ICML Workshop INNF 2021 Polina Kirichenko, Mehrdad Farajtabar, Dushyant Rao, Balaji Lakshminarayanan, Nir Levine, Ang Li, Huiyi Hu, Andrew Gordon Wilson, Razvan Pascanu

Learning new tasks continuously without forgetting on a constantly changing data distribution is essential for real-world problems but extremely challenging for modern deep learning.

Anomaly Detection Continual Learning +1

Does Knowledge Distillation Really Work?

2 code implementations NeurIPS 2021 Samuel Stanton, Pavel Izmailov, Polina Kirichenko, Alexander A. Alemi, Andrew Gordon Wilson

Knowledge distillation is a popular technique for training a small student network to emulate a larger teacher model, such as an ensemble of networks.

Knowledge Distillation
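The student-matches-teacher objective mentioned in the excerpt can be sketched in a few lines of NumPy; this follows the common temperature-softened KL formulation (Hinton et al.'s convention), and the exact form studied in the paper may differ:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between temperature-softened teacher and student
    distributions: the small student is trained to match the larger
    teacher's (possibly ensembled) outputs."""
    t = softmax(teacher_logits / temperature)
    s = softmax(student_logits / temperature)
    # KL(teacher || student), scaled by T^2 so gradient magnitudes stay
    # comparable across temperatures
    return (t * (np.log(t) - np.log(s))).sum(axis=-1).mean() * temperature**2
```

The loss is zero exactly when the student reproduces the teacher's distribution, which is the notion of "fidelity" whose limits the paper examines.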

Semi-Supervised Learning with Normalizing Flows

2 code implementations ICML 2020 Pavel Izmailov, Polina Kirichenko, Marc Finzi, Andrew Gordon Wilson

Normalizing flows transform a latent distribution through an invertible neural network for a flexible and pleasingly simple approach to generative modelling, while preserving an exact likelihood.

Semi-Supervised Image Classification Semi-Supervised Text Classification
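The "exact likelihood" property the excerpt mentions comes from the change-of-variables formula; a minimal single-layer sketch (a toy affine flow, not the paper's architecture) shows the mechanics:

```python
import numpy as np

def flow_log_likelihood(x, scale, shift):
    """Exact log-likelihood under a toy normalizing flow.

    The flow is one invertible affine map x = z * scale + shift with a
    standard-normal latent; real flows stack many invertible layers, but
    the change-of-variables formula is the same:
        log p(x) = log p_z(f_inv(x)) + log |det d f_inv / d x|
    """
    z = (x - shift) / scale                                    # invert the map
    log_pz = -0.5 * (z**2 + np.log(2 * np.pi)).sum(axis=-1)    # N(0, I) log-density
    log_det = -np.log(np.abs(scale)).sum()                     # Jacobian of the inverse
    return log_pz + log_det
```

Because the likelihood is exact, labeled and unlabeled data can both be scored under the same density, which is what makes flows a natural fit for semi-supervised learning.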

Subspace Inference for Bayesian Deep Learning

1 code implementation17 Jul 2019 Pavel Izmailov, Wesley J. Maddox, Polina Kirichenko, Timur Garipov, Dmitry Vetrov, Andrew Gordon Wilson

Bayesian inference was once a gold standard for learning with neural networks, providing accurate full predictive distributions and well calibrated uncertainty.

Bayesian Inference Image Classification +2

SWALP: Stochastic Weight Averaging in Low-Precision Training

3 code implementations26 Apr 2019 Guandao Yang, Tianyi Zhang, Polina Kirichenko, Junwen Bai, Andrew Gordon Wilson, Christopher De Sa

Low precision operations can provide scalability, memory savings, portability, and energy efficiency.
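Two ingredients behind SWALP-style training -- stochastic rounding to a low-precision grid and a higher-precision running average of the iterates -- can be sketched as follows; the fixed-point grid and function names are simplified assumptions, not the paper's exact quantizer:

```python
import numpy as np

def stochastic_round(x, num_bits=8, rng=None):
    """Quantize to a toy fixed-point grid in [-1, 1) with stochastic
    rounding: rounding up or down at random, proportionally to proximity,
    keeps the quantizer unbiased in expectation."""
    rng = rng or np.random.default_rng()
    scale = 2.0 ** (num_bits - 1)
    scaled = np.clip(x, -1, 1 - 1 / scale) * scale
    floor = np.floor(scaled)
    prob_up = scaled - floor                 # distance past the lower grid point
    return (floor + (rng.random(x.shape) < prob_up)) / scale

def swa_average(weight_iterates):
    """Stochastic weight averaging: a running higher-precision average of
    the low-precision iterates, which can recover much of the accuracy
    lost to quantization."""
    return np.mean(weight_iterates, axis=0)
```

Averaging in higher precision is the key design choice: individual iterates bounce around the quantization grid, but their mean can land between grid points, close to the full-precision solution.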
