Unsupervised Pre-training

103 papers with code • 2 benchmarks • 7 datasets

Pre-training a neural network on unlabeled data using unsupervised (self-supervised) auxiliary tasks, before fine-tuning on the downstream supervised task.
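To make the two-stage recipe concrete, here is a minimal, self-contained sketch in numpy (not taken from any listed paper): a tied-weight linear autoencoder is pre-trained on unlabeled data as the unsupervised auxiliary task, then a small logistic-regression head is fine-tuned on the frozen encoder using a much smaller labeled set. All variable names and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: plenty of unlabeled samples, only a few labeled ones.
X_unlab = rng.normal(size=(500, 8))              # unlabeled data
X_lab = rng.normal(size=(40, 8))                 # small labeled set
y_lab = (X_lab[:, 0] + X_lab[:, 1] > 0).astype(float)

# --- Stage 1: unsupervised pre-training ---------------------------------
# Auxiliary task: reconstruct the input with a tied-weight linear
# autoencoder (encoder W, decoder W.T), trained by gradient descent.
W = rng.normal(scale=0.1, size=(8, 4))           # encoder weights
losses = []
for _ in range(300):
    H = X_unlab @ W                              # encode
    E = H @ W.T - X_unlab                        # reconstruction error
    losses.append(0.5 * np.mean(E ** 2))
    # Gradient of 0.5*||X W W^T - X||^2 w.r.t. W (tied weights)
    grad = (X_unlab.T @ E @ W + E.T @ X_unlab @ W) / len(X_unlab)
    W -= 0.05 * grad

# --- Stage 2: supervised fine-tuning ------------------------------------
# Train a logistic head on top of the frozen pre-trained encoder.
w, b = np.zeros(4), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X_lab @ W @ w + b)))
    w -= 0.1 * (X_lab @ W).T @ (p - y_lab) / len(X_lab)
    b -= 0.1 * np.mean(p - y_lab)

print(f"reconstruction loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The same pattern generalizes to the papers below: swap the autoencoder for a contrastive, clustering, or bag-of-words objective, and the logistic head for a task-specific decoder.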


Most implemented papers

Seasonal Contrast: Unsupervised Pre-Training from Uncurated Remote Sensing Data

ElementAI/seasonal-contrast ICCV 2021

Transfer learning approaches can reduce the data requirements of deep learning algorithms.

An Analysis of Unsupervised Pre-training in Light of Recent Advances

ifp-uiuc/anna 20 Dec 2014

We find that unsupervised pre-training helps, as expected, when the ratio of unsupervised to supervised samples is high and, surprisingly, hurts when the ratio is low.

Data-dependent Initializations of Convolutional Neural Networks

philkr/magic_init 21 Nov 2015

Convolutional Neural Networks spread through computer vision like wildfire, impacting almost every visual task imaginable.

BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning

AsaCooperStickland/Bert-n-Pals 7 Feb 2019

Multi-task learning shares information between related tasks, sometimes reducing the number of parameters required.

Unsupervised Pre-Training of Image Features on Non-Curated Data

facebookresearch/deepcluster ICCV 2019

Our goal is to bridge the performance gap between unsupervised methods trained on curated data, which are costly to obtain, and massive raw datasets that are easily available.

Rolling-Unrolling LSTMs for Action Anticipation from First-Person Video

fpv-iplab/rulstm 4 May 2020

The experiments show that the proposed architecture is state-of-the-art in the domain of egocentric videos, achieving top performances in the 2019 EPIC-Kitchens egocentric action anticipation challenge.

PointContrast: Unsupervised Pre-training for 3D Point Cloud Understanding

facebookresearch/PointContrast ECCV 2020

To this end, we select a suite of diverse datasets and tasks to measure the effect of unsupervised pre-training on a large source set of 3D scenes.

UP-DETR: Unsupervised Pre-training for Object Detection with Transformers

dddzg/up-detr CVPR 2021

DEtection TRansformer (DETR) for object detection reaches competitive performance compared with Faster R-CNN via a transformer encoder-decoder architecture.

OBoW: Online Bag-of-Visual-Words Generation for Self-Supervised Learning

valeoai/obow CVPR 2021

With this in mind, we propose a teacher-student scheme to learn representations by training a convolutional net to reconstruct a bag-of-visual-words (BoW) representation of an image, given as input a perturbed version of that same image.

End-to-End Training of Neural Retrievers for Open-Domain Question Answering

NVIDIA/Megatron-LM ACL 2021

We also explore two approaches for end-to-end supervised training of the reader and retriever components in OpenQA models.