
Unsupervised Representation Learning

17 papers with code · Methodology

State-of-the-art leaderboards

You can find evaluation results in the subtasks. You can also submit evaluation metrics for this task.

Greatest papers with code

Meta-Learning Update Rules for Unsupervised Representation Learning

ICLR 2019 tensorflow/models

Specifically, we target semi-supervised classification performance, and we meta-learn an algorithm -- an unsupervised weight update rule -- that produces representations useful for this task.

META-LEARNING UNSUPERVISED REPRESENTATION LEARNING
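A rough, heavily simplified sketch of the idea: an inner loop applies a learned, label-free update rule to the weights of a base model, and an outer loop tunes the rule's parameters so the resulting features support a linear probe trained on a few labels. The toy data, the per-weight statistics, the closed-form ridge probe, and all sizes below are illustrative assumptions, not the architecture from the paper.

import torch
import torch.nn.functional as F

d_in, d_hid, n_classes = 8, 16, 3

# Meta-parameters: a tiny network mapping per-weight statistics
# (mean input activation, mean hidden activation) to an update direction.
rule_net = torch.nn.Sequential(
    torch.nn.Linear(2, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1))
meta_opt = torch.optim.Adam(rule_net.parameters(), lr=1e-3)

def unsupervised_step(W, x):
    """One application of the learned update rule; uses no labels."""
    h = torch.tanh(x @ W)
    pre = x.mean(0)                                   # (d_in,)
    post = h.mean(0)                                  # (d_hid,)
    stats = torch.stack([pre[:, None].expand(-1, W.shape[1]),
                         post[None, :].expand(W.shape[0], -1)], dim=-1)
    return W + 1e-2 * rule_net(stats).squeeze(-1)

for meta_step in range(200):
    x_unlab = torch.randn(64, d_in)                   # unlabeled data
    x_lab = torch.randn(32, d_in)                     # small labeled set
    y_lab = x_lab[:, :n_classes].argmax(dim=1)        # toy labels

    W = 0.1 * torch.randn(d_in, d_hid)                # fresh base weights
    for _ in range(5):                                # unrolled inner loop
        W = unsupervised_step(W, x_unlab)

    # Meta-objective: fit a closed-form ridge linear probe on the learned
    # features and measure its classification loss on the labeled set.
    feats = torch.tanh(x_lab @ W)
    y_onehot = F.one_hot(y_lab, n_classes).float()
    gram = feats.T @ feats + 1e-2 * torch.eye(d_hid)
    probe = torch.linalg.solve(gram, feats.T @ y_onehot)
    meta_loss = F.cross_entropy(feats @ probe, y_lab)

    meta_opt.zero_grad()
    meta_loss.backward()                              # backprop through the unroll
    meta_opt.step()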

Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks

19 Nov 2015 tensorflow/models

In recent years, supervised learning with convolutional networks (CNNs) has seen huge adoption in computer vision applications.

CONDITIONAL IMAGE GENERATION UNSUPERVISED REPRESENTATION LEARNING

Semi-Supervised Sequence Modeling with Cross-View Training

EMNLP 2018 tensorflow/models

We therefore propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.

CCG SUPERTAGGING DEPENDENCY PARSING MACHINE TRANSLATION MULTI-TASK LEARNING NAMED ENTITY RECOGNITION (NER) UNSUPERVISED REPRESENTATION LEARNING
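A minimal sketch of the cross-view idea for token-level tagging: the primary module sees the full Bi-LSTM states, while auxiliary modules see restricted views (here, forward-only and backward-only states, an assumption; the paper uses several task-specific views) and are trained on unlabeled data to match the primary module's detached prediction.

import torch
import torch.nn.functional as F

vocab, d_emb, d_hid, n_tags = 1000, 32, 64, 5
emb = torch.nn.Embedding(vocab, d_emb)
bilstm = torch.nn.LSTM(d_emb, d_hid, bidirectional=True, batch_first=True)
primary = torch.nn.Linear(2 * d_hid, n_tags)          # sees both directions
aux_fwd = torch.nn.Linear(d_hid, n_tags)              # forward states only
aux_bwd = torch.nn.Linear(d_hid, n_tags)              # backward states only
modules = torch.nn.ModuleList([emb, bilstm, primary, aux_fwd, aux_bwd])
opt = torch.optim.Adam(modules.parameters(), lr=1e-3)

def views(tokens):
    h, _ = bilstm(emb(tokens))                        # (B, T, 2*d_hid)
    return h, h[..., :d_hid], h[..., d_hid:]          # full, fwd-only, bwd-only

for step in range(100):
    # Supervised step on labeled data: train the primary module as usual.
    x_lab = torch.randint(0, vocab, (8, 12))
    y_lab = torch.randint(0, n_tags, (8, 12))
    full, _, _ = views(x_lab)
    sup_loss = F.cross_entropy(primary(full).reshape(-1, n_tags),
                               y_lab.reshape(-1))

    # Cross-view step on unlabeled data: auxiliary modules with restricted
    # views are trained to match the primary module's (detached) prediction,
    # which also improves the shared Bi-LSTM encoder.
    x_unlab = torch.randint(0, vocab, (8, 12))
    full, fwd, bwd = views(x_unlab)
    target = F.softmax(primary(full), dim=-1).detach().reshape(-1, n_tags)
    cvt_loss = sum(
        F.kl_div(F.log_softmax(head(view), dim=-1).reshape(-1, n_tags),
                 target, reduction='batchmean')
        for head, view in [(aux_fwd, fwd), (aux_bwd, bwd)])

    opt.zero_grad()
    (sup_loss + cvt_loss).backward()
    opt.step()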

Visual Reinforcement Learning with Imagined Goals

NeurIPS 2018 vitchyr/rlkit

For an autonomous agent to fulfill a wide range of user-specified goals at test time, it must be able to learn broadly applicable and general-purpose skill repertoires.

UNSUPERVISED REPRESENTATION LEARNING

Unsupervised Representation Learning by Predicting Image Rotations

ICLR 2018 gidariss/FeatureLearningRotNet

However, in order to successfully learn those features, they usually require massive amounts of manually labeled data, which is both expensive and impractical to scale.

UNSUPERVISED REPRESENTATION LEARNING
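The pretext task itself is simple to sketch: rotate each unlabeled image by 0, 90, 180, or 270 degrees and train a classifier to predict which rotation was applied; the learned convolutional features are then reused downstream. The tiny CNN and random data below are illustrative, not the paper's architecture.

import torch
import torch.nn.functional as F

convnet = torch.nn.Sequential(
    torch.nn.Conv2d(3, 32, 3, padding=1), torch.nn.ReLU(),
    torch.nn.MaxPool2d(2),
    torch.nn.Conv2d(32, 64, 3, padding=1), torch.nn.ReLU(),
    torch.nn.AdaptiveAvgPool2d(1), torch.nn.Flatten(),
    torch.nn.Linear(64, 4))                           # 4 classes: 0/90/180/270 degrees
opt = torch.optim.Adam(convnet.parameters(), lr=1e-3)

def rotated_batch(images):
    """Build the self-supervised batch: every image in all four rotations."""
    rots = [torch.rot90(images, k, dims=(2, 3)) for k in range(4)]
    x = torch.cat(rots, dim=0)
    y = torch.arange(4).repeat_interleave(images.size(0))
    return x, y

for step in range(100):
    images = torch.rand(16, 3, 32, 32)                # stand-in for unlabeled images
    x, y = rotated_batch(images)
    loss = F.cross_entropy(convnet(x), y)             # predict the applied rotation
    opt.zero_grad()
    loss.backward()
    opt.step()
# After pretraining, everything before the final linear layer serves as the
# unsupervised feature extractor for downstream tasks.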

Spectral Inference Networks: Unifying Deep and Spectral Learning

ICLR 2019 deepmind/spectral_inference_networks

We present Spectral Inference Networks, a framework for learning eigenfunctions of linear operators by stochastic optimization.

ATARI GAMES STOCHASTIC OPTIMIZATION UNSUPERVISED REPRESENTATION LEARNING
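A toy version of the underlying objective: a network f: R^d -> R^k is trained so that its outputs span the top eigenfunctions of a kernel integral operator, by maximizing a generalized Rayleigh-quotient trace over minibatches. The Gaussian kernel, network size, and the naive per-batch estimate below are simplifying assumptions; the paper additionally corrects for bias in the stochastic gradients and handles operators defined through derivatives.

import torch

d, k = 2, 4
net = torch.nn.Sequential(
    torch.nn.Linear(d, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, k))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def gaussian_kernel(x):
    """Toy positive-definite operator: a Gaussian kernel on the batch."""
    return torch.exp(-torch.cdist(x, x) ** 2 / 2.0)

for step in range(1000):
    x = torch.rand(256, d)                            # samples from the input distribution
    f = net(x)                                        # (n, k) candidate eigenfunctions
    n = x.size(0)
    sigma = f.T @ f / n + 1e-3 * torch.eye(k)         # feature covariance E[f f^T]
    pi = f.T @ gaussian_kernel(x) @ f / n ** 2        # E[f(x) K(x, x') f(x')^T]
    # Maximizing Tr(sigma^-1 pi) pushes f toward the span of the operator's
    # top-k eigenfunctions under the data distribution.
    loss = -torch.trace(torch.linalg.solve(sigma, pi))
    opt.zero_grad()
    loss.backward()
    opt.step()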

AET vs. AED: Unsupervised Representation Learning by Auto-Encoding Transformations rather than Data

14 Jan 2019 maple-research-lab/AET

The success of deep neural networks often relies on a large amount of labeled examples, which can be difficult to obtain in many real scenarios.

UNSUPERVISED REPRESENTATION LEARNING
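A minimal sketch of auto-encoding transformations: sample a random transformation, apply it to an unlabeled image, and train a network to reconstruct the transformation (not the image) from the original/transformed pair. Restricting to small affine transforms, the shared encoder below, and the MSE loss on the 2x3 affine matrix are simplifying assumptions for illustration.

import torch
import torch.nn.functional as F

encoder = torch.nn.Sequential(
    torch.nn.Conv2d(3, 32, 3, stride=2, padding=1), torch.nn.ReLU(),
    torch.nn.Conv2d(32, 64, 3, stride=2, padding=1), torch.nn.ReLU(),
    torch.nn.AdaptiveAvgPool2d(1), torch.nn.Flatten())
decoder = torch.nn.Linear(2 * 64, 6)                  # predicts the 2x3 affine matrix
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

def random_affine(batch):
    """Sample small random affine transforms (rotation/scale/shift)."""
    theta = torch.eye(2, 3).repeat(batch, 1, 1)
    return theta + 0.2 * torch.randn(batch, 2, 3)

for step in range(100):
    x = torch.rand(16, 3, 32, 32)                     # unlabeled images
    theta = random_affine(x.size(0))
    grid = F.affine_grid(theta, x.size(), align_corners=False)
    x_t = F.grid_sample(x, grid, align_corners=False)

    z = torch.cat([encoder(x), encoder(x_t)], dim=1)  # encode both views
    theta_hat = decoder(z).view(-1, 2, 3)
    loss = F.mse_loss(theta_hat, theta)               # auto-encode the transformation
    opt.zero_grad()
    loss.backward()
    opt.step()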

Learning Distributed Representations of Sentences from Unlabelled Data

HLT 2016 jihunchoi/sequential-denoising-autoencoder-tf

Unsupervised methods for learning distributed representations of words are ubiquitous in today's NLP research, but far less is known about the best ways to learn distributed phrase or sentence representations from unlabelled data.

UNSUPERVISED REPRESENTATION LEARNING

textTOvec: Deep Contextualized Neural Autoregressive Topic Models of Language with Distributed Compositional Prior

ICLR 2019 pgcool/textTOvec

We address two challenges of probabilistic topic modelling in order to better estimate the probability of a word in a given context, i.e., P(word|context): (1) No Language Structure in Context: probabilistic topic models ignore word order by summarizing a given context as a "bag-of-words", and consequently the semantics of the words in the context are lost.

INFORMATION EXTRACTION INFORMATION RETRIEVAL LANGUAGE MODELLING TOPIC MODELS UNSUPERVISED REPRESENTATION LEARNING WORD EMBEDDINGS