All ten pretrained models below share the same training techniques and architecture components; they differ only in backbone and pretraining schedule, as encoded in their IDs.

| Training Techniques | NT-Xent, SimCLR, Weight Decay, SGD with Momentum, Cosine Annealing |
|---|---|
| Architecture | 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax |

| ID | Backbone | Pretraining Epochs |
|---|---|---|
| rn101_in1k_simclr_1000ep | ResNet-101 | 1000 |
| rn101_in1k_simclr_100ep | ResNet-101 | 100 |
| rn50_in1k_simclr_1000ep | ResNet-50 | 1000 |
| rn50_in1k_simclr_100ep | ResNet-50 | 100 |
| rn50_in1k_simclr_200ep | ResNet-50 | 200 |
| rn50_in1k_simclr_400ep | ResNet-50 | 400 |
| rn50_in1k_simclr_800ep | ResNet-50 | 800 |
| rn50_w2_in1k_simclr_1000ep | ResNet-50-w2 | 1000 |
| rn50_w2_in1k_simclr_100ep | ResNet-50-w2 | 100 |
| rn50_w4_in1k_simclr_1000ep | ResNet-50-w4 | 1000 |

All models are pretrained on ImageNet-1k (the "in1k" in each ID).
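As a rough illustration of the shared training setup listed above (SGD with momentum, weight decay, and a cosine-annealed learning rate), here is a minimal PyTorch sketch. The hyperparameter values and the stand-in model are illustrative assumptions, not the exact settings of these runs.

```python
import torch

model = torch.nn.Linear(2048, 128)  # stand-in for the ResNet trunk + projection head
optimizer = torch.optim.SGD(model.parameters(), lr=0.3,      # lr is a placeholder
                            momentum=0.9, weight_decay=1e-6)  # values assumed
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=1000)

for epoch in range(1000):  # matches the longest schedule in the table
    ...                    # one pretraining epoch would go here
    scheduler.step()       # cosine-anneal the learning rate once per epoch
```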
SimCLR is a framework for contrastive learning of visual representations. It learns representations by maximizing agreement between differently augmented views of the same data example via a contrastive loss in the latent space.
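The contrastive loss in question is NT-Xent (normalized temperature-scaled cross-entropy): each augmented view must identify its partner view among all other examples in the batch. A minimal sketch in PyTorch, assuming projection vectors are already computed; the function name and temperature value are illustrative, not VISSL's API.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.1):
    """z1, z2: [N, D] projections of two augmented views of the same N images."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # [2N, D], unit norm
    sim = z @ z.t() / temperature                       # scaled cosine similarities
    sim.fill_diagonal_(float('-inf'))                   # an example is never its own negative
    # The positive for row i is its other view: i+N for the first half, i-N for the second.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Example: random projections for a batch of 8 images.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(nt_xent_loss(z1, z2))
```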
Get started with VISSL by trying one of the Colab tutorial notebooks.
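For reference, a hypothetical sketch of loading one of the checkpoints above into a torchvision ResNet-50 for feature extraction. The file name and state-dict layout are assumptions; consult the VISSL documentation for the actual checkpoint format.

```python
import torch
import torchvision

trunk = torchvision.models.resnet50()
ckpt = torch.load("rn50_in1k_simclr_800ep.torch", map_location="cpu")  # assumed file name
state = ckpt.get("model_state_dict", ckpt)   # assumed key; VISSL checkpoint layouts vary
trunk.load_state_dict(state, strict=False)   # projection-head keys won't match the trunk
trunk.fc = torch.nn.Identity()               # drop the classifier; use pooled features
```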
@misc{chen2020simple,
    title={A Simple Framework for Contrastive Learning of Visual Representations},
    author={Ting Chen and Simon Kornblith and Mohammad Norouzi and Geoffrey Hinton},
    year={2020},
    eprint={2002.05709},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
| Model | Top-1 Accuracy |
|---|---|
| SimCLR ResNet-50-w2 (1000 epochs) | 73.84% |
| SimCLR ResNet-50-w4 (1000 epochs) | 71.61% |
| SimCLR ResNet-101 (1000 epochs) | 71.56% |
| SimCLR ResNet-50-w2 (100 epochs) | 69.82% |
| SimCLR ResNet-50 (800 epochs) | 69.68% |
| SimCLR ResNet-50 (1000 epochs) | 68.8% |
| SimCLR ResNet-50 (400 epochs) | 67.71% |
| SimCLR ResNet-50 (200 epochs) | 66.61% |
| SimCLR ResNet-50 (100 epochs) | 64.4% |
| SimCLR ResNet-101 (100 epochs) | 62.76% |