Supervised Contrastive Learning

Contrastive learning applied to self-supervised representation learning has seen a resurgence in recent years, leading to state-of-the-art performance in the unsupervised training of deep image models. Modern batch contrastive approaches subsume or significantly outperform traditional contrastive losses such as the triplet, max-margin, and N-pairs losses...

NeurIPS 2020
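
As a rough illustration of the supervised contrastive objective the abstract refers to, below is a minimal PyTorch-style sketch: each anchor is pulled toward all other samples sharing its label and pushed away from the rest of the batch. The function name `supcon_loss`, the temperature default, and the masking/stability details are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F

def supcon_loss(features, labels, temperature=0.07):
    """Sketch of a supervised contrastive loss over a batch of embeddings.

    features: (N, D) projection-head outputs (normalised inside).
    labels:   (N,)   integer class labels.
    """
    device = features.device
    features = F.normalize(features, dim=1)
    # Pairwise cosine similarities scaled by temperature.
    sim = features @ features.T / temperature
    # Numerical stability: subtract the per-row max before exponentiating.
    sim = sim - sim.max(dim=1, keepdim=True).values.detach()

    # Positives are other samples with the same label (exclude self-contrast).
    labels = labels.view(-1, 1)
    self_mask = torch.eye(labels.size(0), device=device)
    pos_mask = (labels == labels.T).float() * (1.0 - self_mask)

    # Denominator sums over all samples except the anchor itself.
    exp_sim = torch.exp(sim) * (1.0 - self_mask)
    log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True) + 1e-12)

    # Average log-likelihood over positives, then over anchors that have positives.
    pos_counts = pos_mask.sum(dim=1)
    valid = pos_counts > 0
    mean_log_prob_pos = (pos_mask * log_prob).sum(dim=1)[valid] / pos_counts[valid]
    return -mean_log_prob_pos.mean()

if __name__ == "__main__":
    # Toy usage: 8 embeddings of dimension 128 with 3 classes.
    feats = torch.randn(8, 128)
    labels = torch.randint(0, 3, (8,))
    print(supcon_loss(feats, labels))
```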

Results from the Paper


TASK                   DATASET    MODEL                                  METRIC NAME      METRIC VALUE   GLOBAL RANK
Image Classification   ImageNet   ResNet-200 (Supervised Contrastive)    Top 1 Accuracy   80.8%          # 183
Image Classification   ImageNet   ResNet-200 (Supervised Contrastive)    Top 5 Accuracy   95.6%          # 77

Methods used in the Paper


METHOD                        TYPE
Sigmoid Activation            Activation Functions
Tanh Activation               Activation Functions
Average Pooling               Pooling Operations
ReLU                          Activation Functions
1x1 Convolution               Convolutions
Batch Normalization           Normalization
Exponential Decay             Learning Rate Schedules
Cosine Annealing              Learning Rate Schedules
SGD with Momentum             Stochastic Optimization
RMSProp                       Stochastic Optimization
LARS                          Large Batch Optimization
Supervised Contrastive Loss   Loss Functions
Bottleneck Residual Block     Skip Connection Blocks
Global Average Pooling        Pooling Operations
Residual Block                Skip Connection Blocks
Kaiming Initialization        Initialization
Max Pooling                   Pooling Operations
LSTM                          Recurrent Neural Networks
AutoAugment                   Image Data Augmentation
Residual Connection           Skip Connections
Convolution                   Convolutions
ResNet                        Convolutional Neural Networks