| Training Techniques | NPID, Weight Decay, SGD with Momentum |
|---|---|
| Architecture | 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax |
| ID | rn50_in1k_npid_200ep_4kneg |
| Training Techniques | NPID, Weight Decay, SGD with Momentum |
|---|---|
| Architecture | 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax |
| ID | rn50_in1k_npid_oss |
NPID (Non-Parametric Instance Discrimination) is a self-supervised learning approach that treats each image instance as its own class and performs non-parametric classification: instead of learning per-class weight vectors, distances (similarities) between instances are computed directly from the features themselves. Noise-contrastive estimation makes this instance-level classification tractable, and the learned features transfer to downstream recognition tasks.
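The non-parametric classifier above can be sketched in a few lines: the probability that a feature belongs to instance i is a temperature-scaled softmax over cosine similarities to a memory bank of stored instance embeddings. This is a minimal illustrative sketch, not VISSL's implementation; the function name `npid_probs` and the toy data are assumptions, and the temperature 0.07 follows the paper's default.

```python
import numpy as np

def npid_probs(feature, memory_bank, temperature=0.07):
    """Non-parametric instance classification: softmax over cosine
    similarities to every stored instance embedding, with no learned
    per-class weight vectors (illustrative sketch, not VISSL code)."""
    # L2-normalize so dot products are cosine similarities.
    v = feature / np.linalg.norm(feature)
    bank = memory_bank / np.linalg.norm(memory_bank, axis=1, keepdims=True)
    logits = bank @ v / temperature  # one logit per stored instance
    logits -= logits.max()           # subtract max for numerical stability
    p = np.exp(logits)
    return p / p.sum()

# Toy memory bank: 5 instance embeddings of dimension 8.
rng = np.random.default_rng(0)
bank = rng.normal(size=(5, 8))
probs = npid_probs(bank[2], bank)  # query with instance 2's own embedding
assert np.argmax(probs) == 2       # an instance is most similar to itself
```

In the full method, the memory bank holds one embedding per training image (1.28M for ImageNet-1K), so the softmax is approximated with noise-contrastive estimation over a sample of negatives (e.g. the 4k negatives in the model ID above) rather than computed exactly.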
Get started with VISSL by trying one of the Colab tutorial notebooks.
@article{DBLP:journals/corr/abs-1805-01978,
author = {Zhirong Wu and
Yuanjun Xiong and
Stella X. Yu and
Dahua Lin},
title = {Unsupervised Feature Learning via Non-Parametric Instance-level Discrimination},
journal = {CoRR},
volume = {abs/1805.01978},
year = {2018},
url = {http://arxiv.org/abs/1805.01978},
archivePrefix = {arXiv},
eprint = {1805.01978},
timestamp = {Wed, 12 Aug 2020 11:07:47 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1805-01978.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
@misc{goyal2021vissl,
author = {Priya Goyal and Benjamin Lefaudeux and Mannat Singh and Jeremy Reizenstein and Vinicius Reis and
Min Xu and Matthew Leavitt and Mathilde Caron and Piotr Bojanowski and Armand Joulin and
Ishan Misra},
title = {VISSL},
howpublished = {\url{https://github.com/facebookresearch/vissl}},
year = {2021}
}
| BENCHMARK | MODEL | METRIC NAME | METRIC VALUE | GLOBAL RANK |
|---|---|---|---|---|
| ImageNet | NPID ResNet-50 (ImageNet-1K, official) | Top 1 Accuracy | 54.99% | #307 |
| ImageNet | NPID ResNet-50 (4k negatives, 200 epochs) | Top 1 Accuracy | 52.73% | #311 |