Improved Baselines with Momentum Contrastive Learning

9 Mar 2020 · Xinlei Chen, Haoqi Fan, Ross Girshick, Kaiming He

Contrastive unsupervised learning has recently shown encouraging progress, e.g., in Momentum Contrast (MoCo) and SimCLR. In this note, we verify the effectiveness of two of SimCLR's design improvements by implementing them in the MoCo framework.
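Both MoCo and SimCLR train with the InfoNCE contrastive loss: a query embedding should score high against the key from another augmented view of the same image and low against many negatives (in MoCo, negatives come from a memory queue). Below is a minimal numpy sketch of that loss for a single query; the function name, dimensions, and queue size are illustrative, not taken from the paper's code.

```python
import numpy as np

def info_nce_loss(q, k_pos, queue, tau=0.2):
    """InfoNCE loss for one query (hypothetical helper, not the paper's code).

    q      : (d,)   L2-normalized query embedding
    k_pos  : (d,)   L2-normalized positive key (the other view of the same image)
    queue  : (K, d) L2-normalized negative keys from the memory queue
    tau    : temperature (MoCo v2 reports tau = 0.2)
    """
    l_pos = q @ k_pos                              # scalar: positive logit
    l_neg = queue @ q                              # (K,): negative logits
    logits = np.concatenate(([l_pos], l_neg)) / tau
    # Cross-entropy with the positive placed at index 0,
    # computed via log-sum-exp for numerical stability.
    return np.logaddexp.reduce(logits) - logits[0]
```

In the full MoCo framework, the keys are produced by a momentum-averaged copy of the query encoder, and each batch of keys is enqueued while the oldest batch is dequeued; this sketch only shows the loss itself.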


Datasets


| TASK | DATASET | MODEL | METRIC NAME | METRIC VALUE | GLOBAL RANK |
|---|---|---|---|---|---|
| Self-Supervised Image Classification | ImageNet | MoCo v2 (ResNet-50) | Top 1 Accuracy | 71.1% | #35 |
| | | | Top 5 Accuracy | 90.1% | #15 |
| | | | Number of Params | 24M | #27 |

Methods used in the Paper


| METHOD | TYPE |
|---|---|
| Dense Connections | Feedforward Networks |
| InfoNCE | Loss Functions |
| Random Gaussian Blur | Image Data Augmentation |
| Feedforward Network | Feedforward Networks |
| SGD with Momentum | Stochastic Optimization |
| Random Horizontal Flip | Image Data Augmentation |
| Random Resized Crop | Image Data Augmentation |
| Cosine Annealing | Learning Rate Schedules |
| MoCo v2 | Self-Supervised Learning |
| MoCo | Self-Supervised Learning |
| Average Pooling | Pooling Operations |
| Residual Connection | Skip Connections |
| ReLU | Activation Functions |
| 1x1 Convolution | Convolutions |
| Batch Normalization | Normalization |
| Bottleneck Residual Block | Skip Connection Blocks |
| Global Average Pooling | Pooling Operations |
| Residual Block | Skip Connection Blocks |
| Kaiming Initialization | Initialization |
| Max Pooling | Pooling Operations |
| Convolution | Convolutions |
| ResNet | Convolutional Neural Networks |
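One of the simplest components in this list is cosine annealing: the learning rate is decayed from its base value toward zero following half a cosine cycle over training. A minimal sketch (the base learning rate of 0.03 and 200-epoch horizon are the values commonly used with MoCo v2; the function name is ours):

```python
import math

def cosine_lr(base_lr, epoch, total_epochs):
    """Cosine annealing schedule:
    lr(t) = 0.5 * base_lr * (1 + cos(pi * t / T)),
    which starts at base_lr and decays smoothly to 0 at t = T.
    """
    return 0.5 * base_lr * (1 + math.cos(math.pi * epoch / total_epochs))

# Example: halfway through a 200-epoch run at base_lr = 0.03,
# the learning rate has decayed to half the base value.
lr_mid = cosine_lr(0.03, 100, 200)
```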