SimCLR

Last updated on Feb 27, 2021

SimCLR ResNet-101 (1000 epochs)

Parameters: 45 Million
FLOPs: 4 Billion
File Size: 358.99 MB
Training Data: ImageNet
Training Resources: 8x NVIDIA V100 GPUs
Training Techniques: NT-Xent, SimCLR, Weight Decay, SGD with Momentum, Cosine Annealing
Architecture: 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax
ID: rn101_in1k_simclr_1000ep
LR: 0.3
Epochs: 1000
Layers: 101
Momentum: 0.9
Weight Decay: 0.0
Width Multiplier: 1
SimCLR Loss Temperature: 0.1
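Every run in this zoo uses SGD with momentum 0.9, a base LR of 0.3, and cosine annealing. As a minimal sketch (ignoring any warmup VISSL may apply), the cosine schedule decays the learning rate from the base value to zero over the run:

```python
import math

def cosine_lr(step, total_steps, base_lr=0.3):
    """Cosine-annealed learning rate: base_lr at step 0, decaying to 0
    at total_steps. This mirrors the standard cosine annealing formula,
    not VISSL's exact implementation."""
    return 0.5 * base_lr * (1 + math.cos(math.pi * step / total_steps))
```

For the 1000-epoch runs, `cosine_lr(500, 1000)` gives 0.15, the halfway point of the decay.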
SimCLR ResNet-101 (100 epochs)

Parameters: 45 Million
FLOPs: 4 Billion
File Size: 358.99 MB
Training Data: ImageNet
Training Resources: 8x NVIDIA V100 GPUs
Training Techniques: NT-Xent, SimCLR, Weight Decay, SGD with Momentum, Cosine Annealing
Architecture: 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax
ID: rn101_in1k_simclr_100ep
LR: 0.3
Epochs: 100
Layers: 101
Momentum: 0.9
Weight Decay: 0.0
Width Multiplier: 1
SimCLR Loss Temperature: 0.1
SimCLR ResNet-50 (1000 epochs)

Parameters: 26 Million
FLOPs: 4 Billion
File Size: 213.74 MB
Training Data: ImageNet
Training Resources: 8x NVIDIA V100 GPUs
Training Techniques: NT-Xent, SimCLR, Weight Decay, SGD with Momentum, Cosine Annealing
Architecture: 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax
ID: rn50_in1k_simclr_1000ep
LR: 0.3
Epochs: 1000
Layers: 50
Momentum: 0.9
Weight Decay: 0.0
Width Multiplier: 1
SimCLR Loss Temperature: 0.1
SimCLR ResNet-50 (100 epochs)

Parameters: 26 Million
FLOPs: 4 Billion
File Size: 213.74 MB
Training Data: ImageNet
Training Resources: 8x NVIDIA V100 GPUs
Training Techniques: NT-Xent, SimCLR, Weight Decay, SGD with Momentum, Cosine Annealing
Architecture: 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax
ID: rn50_in1k_simclr_100ep
LR: 0.3
Epochs: 100
Layers: 50
Momentum: 0.9
Weight Decay: 0.0
Width Multiplier: 1
SimCLR Loss Temperature: 0.1
SimCLR ResNet-50 (200 epochs)

Parameters: 26 Million
FLOPs: 4 Billion
File Size: 213.74 MB
Training Data: ImageNet
Training Resources: 8x NVIDIA V100 GPUs
Training Techniques: NT-Xent, SimCLR, Weight Decay, SGD with Momentum, Cosine Annealing
Architecture: 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax
ID: rn50_in1k_simclr_200ep
LR: 0.3
Epochs: 200
Layers: 50
Momentum: 0.9
Weight Decay: 0.0
Width Multiplier: 1
SimCLR Loss Temperature: 0.1
SimCLR ResNet-50 (400 epochs)

Parameters: 26 Million
FLOPs: 4 Billion
File Size: 213.74 MB
Training Data: ImageNet
Training Resources: 8x NVIDIA V100 GPUs
Training Techniques: NT-Xent, SimCLR, Weight Decay, SGD with Momentum, Cosine Annealing
Architecture: 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax
ID: rn50_in1k_simclr_400ep
LR: 0.3
Epochs: 400
Layers: 50
Momentum: 0.9
Weight Decay: 0.0
Width Multiplier: 1
SimCLR Loss Temperature: 0.1
SimCLR ResNet-50 (800 epochs)

Parameters: 26 Million
FLOPs: 4 Billion
File Size: 213.74 MB
Training Data: ImageNet
Training Resources: 8x NVIDIA V100 GPUs
Training Techniques: NT-Xent, SimCLR, Weight Decay, SGD with Momentum, Cosine Annealing
Architecture: 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax
ID: rn50_in1k_simclr_800ep
LR: 0.3
Epochs: 800
Layers: 50
Momentum: 0.9
Weight Decay: 0.0
Width Multiplier: 1
SimCLR Loss Temperature: 0.1
SimCLR ResNet-50-w2 (1000 epochs)

Parameters: 94 Million
File Size: 849.06 MB
Training Data: ImageNet
Training Resources: 8x NVIDIA V100 GPUs
Training Techniques: NT-Xent, SimCLR, Weight Decay, SGD with Momentum, Cosine Annealing
Architecture: 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax
ID: rn50_w2_in1k_simclr_1000ep
LR: 0.3
Epochs: 1000
Layers: 50
Momentum: 0.9
Weight Decay: 0.0
Width Multiplier: 2
SimCLR Loss Temperature: 0.1
SimCLR ResNet-50-w2 (100 epochs)

Parameters: 94 Million
File Size: 849.06 MB
Training Data: ImageNet
Training Resources: 8x NVIDIA V100 GPUs
Training Techniques: NT-Xent, SimCLR, Weight Decay, SGD with Momentum, Cosine Annealing
Architecture: 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax
ID: rn50_w2_in1k_simclr_100ep
LR: 0.3
Epochs: 100
Layers: 50
Momentum: 0.9
Weight Decay: 0.0
Width Multiplier: 2
SimCLR Loss Temperature: 0.1
SimCLR ResNet-50-w4 (1000 epochs)

Parameters: 375 Million
File Size: 3.38 GB
Training Data: ImageNet
Training Resources: 8x NVIDIA V100 GPUs
Training Techniques: NT-Xent, SimCLR, Weight Decay, SGD with Momentum, Cosine Annealing
Architecture: 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax
ID: rn50_w4_in1k_simclr_1000ep
LR: 0.3
Epochs: 1000
Layers: 50
Momentum: 0.9
Weight Decay: 0.0
Width Multiplier: 4
SimCLR Loss Temperature: 0.1

Summary

SimCLR is a framework for contrastive learning of visual representations. It learns representations by maximizing agreement between differently augmented views of the same data example via a contrastive loss in the latent space.
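The NT-Xent (normalized temperature-scaled cross-entropy) loss listed under Training Techniques drives this agreement: each embedding is pulled toward its partner view and pushed away from every other example in the batch. A minimal NumPy sketch (a reference implementation for illustration, not VISSL's code):

```python
import numpy as np

def nt_xent_loss(z, temperature=0.1):
    """NT-Xent loss over 2N embeddings, where rows 2k and 2k+1 are the
    two augmented views of image k (temperature 0.1 as in the runs above)."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # cosine similarity via dot product
    n = z.shape[0]
    sim = z @ z.T / temperature
    np.fill_diagonal(sim, -np.inf)                    # exclude self-similarity
    pos = np.arange(n) ^ 1                            # partner view: 0<->1, 2<->3, ...
    logits = sim - sim.max(axis=1, keepdims=True)     # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(n), pos].mean()
```

When the two views of each image map to identical embeddings and different images map to orthogonal ones, the loss approaches zero; mismatched pairs drive it up sharply at this temperature.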

How do I train this model?

Get started with VISSL by trying one of the Colab tutorial notebooks.
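VISSL drives training from YAML configs via its distributed launcher. As a rough sketch only (the config path and override below follow VISSL's documented pattern but are assumptions; look up the exact config matching each model ID above in the VISSL model zoo):

```shell
# Illustrative invocation; verify the config name in the VISSL repo
# before running. Overrides use VISSL's dotted config syntax.
python tools/run_distributed_engines.py \
    config=pretrain/simclr/simclr_8node_resnet \
    config.OPTIMIZER.num_epochs=100
```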

Citation

@misc{chen2020simple,
      title={A Simple Framework for Contrastive Learning of Visual Representations}, 
      author={Ting Chen and Simon Kornblith and Mohammad Norouzi and Geoffrey Hinton},
      year={2020},
      eprint={2002.05709},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

Results

Image Classification on ImageNet

| Model | Top-1 Accuracy |
|-------|----------------|
| SimCLR ResNet-50-w2 (1000 epochs) | 73.84% |
| SimCLR ResNet-50-w4 (1000 epochs) | 71.61% |
| SimCLR ResNet-101 (1000 epochs) | 71.56% |
| SimCLR ResNet-50-w2 (100 epochs) | 69.82% |
| SimCLR ResNet-50 (800 epochs) | 69.68% |
| SimCLR ResNet-50 (1000 epochs) | 68.8% |
| SimCLR ResNet-50 (400 epochs) | 67.71% |
| SimCLR ResNet-50 (200 epochs) | 66.61% |
| SimCLR ResNet-50 (100 epochs) | 64.4% |
| SimCLR ResNet-101 (100 epochs) | 62.76% |