Methods > General > Self-Supervised Learning

Swapping Assignments between Views

Introduced by Caron et al. in Unsupervised Learning of Visual Features by Contrasting Cluster Assignments

SwAV, or Swapping Assignments Between Views, is a self-supervised learning approach that takes advantage of contrastive methods without requiring pairwise comparisons to be computed. Specifically, it simultaneously clusters the data while enforcing consistency between the cluster assignments produced for different augmentations (or views) of the same image, instead of comparing features directly as in contrastive learning. Simply put, SwAV uses a swapped prediction mechanism: the cluster assignment (code) of one view is predicted from the representation of another view.

Source: Unsupervised Learning of Visual Features by Contrasting Cluster Assignments
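The swapped prediction mechanism can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the function names, dimensions, temperature values, and the simplified (non-batched, non-distributed) Sinkhorn-Knopp normalization are assumptions made for clarity.

```python
import numpy as np

def sinkhorn(scores, n_iters=3, eps=0.05):
    """Simplified Sinkhorn-Knopp iteration: turn feature-prototype
    similarity scores (B, K) into soft cluster assignments ("codes")
    whose cluster sizes are roughly balanced across the batch."""
    Q = np.exp(scores / eps).T            # (K, B)
    Q /= Q.sum()
    K, B = Q.shape
    for _ in range(n_iters):
        Q /= Q.sum(axis=1, keepdims=True)  # normalize over batch
        Q /= K
        Q /= Q.sum(axis=0, keepdims=True)  # normalize over prototypes
        Q /= B
    return (Q * B).T                       # (B, K); each row sums to 1

def swapped_prediction_loss(z1, z2, prototypes, temp=0.1):
    """Swapped prediction: the code of one view supervises the
    softmax prediction made from the *other* view's features.
    z1, z2: L2-normalized features (B, D); prototypes: (K, D)."""
    s1, s2 = z1 @ prototypes.T, z2 @ prototypes.T   # similarity scores
    q1, q2 = sinkhorn(s1), sinkhorn(s2)             # codes (treated as targets)

    def cross_entropy(q, s):
        # log-softmax of scores at temperature `temp`
        logp = s / temp - np.log(np.exp(s / temp).sum(axis=1, keepdims=True))
        return -(q * logp).sum(axis=1).mean()

    # Swap: code of view 1 paired with prediction from view 2, and vice versa.
    return 0.5 * (cross_entropy(q1, s2) + cross_entropy(q2, s1))
```

In the actual method the codes are computed without gradient flow and the features come from two augmented crops passed through the same network; this sketch only shows the shape of the loss.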

Latest Papers

PAPER | AUTHORS | DATE
Semi-Supervised Learning of Visual Features by Non-Parametrically Predicting View Assignments with Support Samples
Mahmoud Assran, Mathilde Caron, Ishan Misra, Piotr Bojanowski, Armand Joulin, Nicolas Ballas, Michael Rabbat
2021-04-28
Self-Supervised Training Enhances Online Continual Learning
Jhair Gallardo, Tyler L. Hayes, Christopher Kanan
2021-03-25
Self-supervised Pretraining of Visual Features in the Wild
Priya Goyal, Mathilde Caron, Benjamin Lefaudeux, Min Xu, Pengchao Wang, Vivek Pai, Mannat Singh, Vitaliy Liptchinsky, Ishan Misra, Armand Joulin, Piotr Bojanowski
2021-03-02
Fast Training of Contrastive Learning with Intermediate Contrastive Loss
Anonymous
2021-01-01
How Well Do Self-Supervised Models Transfer?
Linus Ericsson, Henry Gouk, Timothy M. Hospedales
2020-11-26
Self-Supervised Ranking for Representation Learning
Ali Varamesh, Ali Diba, Tinne Tuytelaars, Luc van Gool
2020-10-14
Unsupervised Learning of Visual Features by Contrasting Cluster Assignments
Mathilde Caron, Ishan Misra, Julien Mairal, Priya Goyal, Piotr Bojanowski, Armand Joulin
2020-06-17

Components

COMPONENT | TYPE
LARS | Large Batch Optimization

Categories