
Omniglot

24 papers with code · Computer Vision

Leaderboards

No evaluation results yet. Help compare methods by submitting evaluation metrics.

Greatest papers with code

Continual Unsupervised Representation Learning

NeurIPS 2019 deepmind/deepmind-research

Continual learning aims to improve the ability of modern learning systems to deal with non-stationary distributions, typically by attempting to learn a series of tasks sequentially.

CONTINUAL LEARNING OMNIGLOT UNSUPERVISED REPRESENTATION LEARNING

The Omniglot challenge: a 3-year progress report

9 Feb 2019 brendenlake/omniglot

Three years ago, we released the Omniglot dataset for one-shot learning, along with five challenge tasks and a computational model that addresses these tasks.

OMNIGLOT ONE-SHOT LEARNING

Matching Networks for One Shot Learning

NeurIPS 2016 oscarknagg/few-shot

Our algorithm improves one-shot accuracy on ImageNet from 87.6% to 93.2% and from 88.0% to 93.8% on Omniglot compared to competing approaches.

FEW-SHOT IMAGE CLASSIFICATION LANGUAGE MODELLING METRIC LEARNING OMNIGLOT ONE-SHOT LEARNING
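
The excerpt above reports accuracy only; as background, Matching Networks classify a query by a softmax attention over similarities between its embedding and the embeddings of the labelled support set. A minimal NumPy sketch of that prediction step, assuming embeddings are already computed (the function and variable names are illustrative, not the authors' code):

```python
import numpy as np

def matching_net_predict(support_emb, support_labels, query_emb, n_classes):
    """Attention-based label prediction over a support set.

    support_emb:    (k, d) embeddings of the labelled support examples
    support_labels: (k,)   integer class labels
    query_emb:      (d,)   embedding of the query example
    """
    # Cosine similarity between the query and every support embedding.
    s = support_emb / np.linalg.norm(support_emb, axis=1, keepdims=True)
    q = query_emb / np.linalg.norm(query_emb)
    sims = s @ q                                  # (k,)

    # Softmax attention over the support set.
    a = np.exp(sims - sims.max())
    a /= a.sum()

    # Predicted distribution = attention-weighted sum of one-hot labels.
    onehot = np.eye(n_classes)[support_labels]    # (k, n_classes)
    return a @ onehot                             # (n_classes,)

# Toy 5-way 1-shot episode with random 64-d embeddings.
rng = np.random.default_rng(0)
support = rng.normal(size=(5, 64))
labels = np.arange(5)
query = support[2] + 0.1 * rng.normal(size=64)
print(matching_net_predict(support, labels, query, n_classes=5).argmax())
```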

Meta-Learning for Semi-Supervised Few-Shot Classification

ICLR 2018 renmengye/few-shot-ssl-public

To address this paradigm, we propose novel extensions of Prototypical Networks (Snell et al., 2017) that are augmented with the ability to use unlabeled examples when producing prototypes.

META-LEARNING OMNIGLOT
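
The excerpt describes prototypes that are refined with unlabeled examples. A minimal sketch of one such refinement, assuming a single soft-assignment step in embedding space (names are illustrative; the paper's full method also handles distractor classes):

```python
import numpy as np

def refined_prototypes(labeled_emb, labels, unlabeled_emb, n_classes):
    """Class prototypes refined by soft-assigning unlabeled embeddings.

    labeled_emb:   (n, d) embeddings of labelled support examples
    labels:        (n,)   integer class labels
    unlabeled_emb: (m, d) embeddings of unlabeled examples
    """
    # Initial prototypes: per-class means of the labelled embeddings.
    protos = np.stack([labeled_emb[labels == c].mean(axis=0)
                       for c in range(n_classes)])            # (n_classes, d)

    # Soft-assign each unlabeled point to the prototypes
    # (softmax over negative squared Euclidean distances).
    dist = ((unlabeled_emb[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    logits = -dist
    logits -= logits.max(axis=1, keepdims=True)
    w = np.exp(logits)
    w /= w.sum(axis=1, keepdims=True)                         # (m, n_classes)

    # Refined prototypes: weighted mean of labelled and soft-assigned
    # unlabeled embeddings.
    counts = np.bincount(labels, minlength=n_classes)[:, None]
    num = protos * counts + w.T @ unlabeled_emb
    den = counts + w.sum(axis=0)[:, None]
    return num / den

# Toy usage: 5 classes, 2 labelled shots each, 20 unlabeled examples.
rng = np.random.default_rng(0)
lab = rng.normal(size=(10, 32)); y = np.repeat(np.arange(5), 2)
unlab = rng.normal(size=(20, 32))
print(refined_prototypes(lab, y, unlab, n_classes=5).shape)   # (5, 32)
```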

Data Augmentation Generative Adversarial Networks

ICLR 2018 AntreasAntoniou/DAGAN

The model, based on image conditional Generative Adversarial Networks, takes data from a source domain and learns to take any data item and generalise it to generate other within-class data items.

DATA AUGMENTATION FEW-SHOT LEARNING OMNIGLOT
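
The excerpt describes a generator conditioned on an existing image that, together with noise, produces a new within-class example. A minimal PyTorch sketch of that interface only, assuming flattened images and a small encoder/decoder (module names and shapes are illustrative stand-ins, not the DAGAN architecture):

```python
import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    """G(x, z): maps a real example x and noise z to a new same-class example."""
    def __init__(self, x_dim=784, z_dim=64, hidden=256):
        super().__init__()
        self.encode = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU())
        self.decode = nn.Sequential(
            nn.Linear(hidden + z_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, x_dim), nn.Tanh(),
        )

    def forward(self, x, z):
        h = self.encode(x)                    # condition on the source image
        return self.decode(torch.cat([h, z], dim=-1))

x = torch.rand(8, 784) * 2 - 1                # batch of source images in [-1, 1]
z = torch.randn(8, 64)                        # per-sample noise
x_aug = ConditionalGenerator()(x, z)          # candidate within-class samples
print(x_aug.shape)                            # torch.Size([8, 784])
```

In adversarial training the discriminator would compare the generated image against real images of the same class as the source, which is what pushes the generator toward plausible within-class variation rather than arbitrary samples.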

Learning to cluster in order to transfer across domains and tasks

ICLR 2018 GT-RIPL/L2C

The key insight is that, in addition to features, we can transfer similarity information and this is sufficient to learn a similarity function and clustering network to perform both domain adaptation and cross-task transfer learning.

OMNIGLOT TRANSFER LEARNING UNSUPERVISED DOMAIN ADAPTATION
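
The excerpt argues that transferred pairwise similarity predictions are enough to train a clustering network. A minimal sketch of that idea, assuming soft cluster assignments trained to agree with a predicted similarity matrix (the loss and names here are illustrative, not the exact criterion used in the paper):

```python
import torch
import torch.nn.functional as F

def pairwise_cluster_loss(cluster_logits, sim_target):
    """Train cluster assignments so that pairs predicted 'similar' land in
    the same cluster and 'dissimilar' pairs do not.

    cluster_logits: (n, k) per-example cluster logits
    sim_target:     (n, n) pairwise similarity in [0, 1], e.g. from a
                    similarity network transferred from a labelled source task
    """
    p = F.softmax(cluster_logits, dim=1)          # (n, k) soft assignments
    # Probability that two examples fall in the same cluster.
    same_cluster = p @ p.t()                      # (n, n)
    return F.binary_cross_entropy(same_cluster.clamp(1e-6, 1 - 1e-6), sim_target)

# Toy usage: 16 examples, 5 clusters, random symmetric similarity targets.
logits = torch.randn(16, 5, requires_grad=True)
sim = torch.rand(16, 16); sim = (sim + sim.t()) / 2
loss = pairwise_cluster_loss(logits, sim)
loss.backward()
print(float(loss))
```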

VAE with a VampPrior

19 May 2017 jmtomczak/vae_vampprior

In this paper, we propose to extend the variational auto-encoder (VAE) framework with a new type of prior which we call "Variational Mixture of Posteriors" prior, or VampPrior for short.

OMNIGLOT
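
The excerpt names the prior but not its form: the VampPrior replaces the standard normal prior with a mixture of the encoder's variational posteriors evaluated at K learnable pseudo-inputs, p(z) = (1/K) Σ_k q(z | u_k). A minimal NumPy sketch of that density, assuming a diagonal-Gaussian encoder (the toy encoder below is a stand-in, not the paper's network):

```python
import numpy as np

def vamp_prior_logpdf(z, pseudo_inputs, encoder):
    """log p(z) under a VampPrior: log of the average of the encoder's
    diagonal-Gaussian posteriors evaluated at K learnable pseudo-inputs.

    z:             (d,)      latent point
    pseudo_inputs: (K, x_dim) learnable pseudo-inputs u_k
    encoder:       maps a batch of inputs to (mu, log_var), each (K, d)
    """
    mu, log_var = encoder(pseudo_inputs)                      # (K, d) each
    # log N(z; mu_k, diag(exp(log_var_k))) for every mixture component k.
    log_comp = -0.5 * (np.log(2 * np.pi) + log_var
                       + (z - mu) ** 2 / np.exp(log_var)).sum(axis=1)  # (K,)
    # Numerically stable log-mean-exp over the K components.
    m = log_comp.max()
    return m + np.log(np.mean(np.exp(log_comp - m)))

# Toy usage with a fake linear "encoder" and K = 10 pseudo-inputs.
rng = np.random.default_rng(0)
W = rng.normal(size=(784, 2), scale=0.05)
encoder = lambda u: (u @ W, np.zeros((u.shape[0], 2)))        # (mu, log_var)
u = rng.random((10, 784))                                     # pseudo-inputs
print(vamp_prior_logpdf(np.zeros(2), u, encoder))
```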

FIGR: Few-shot Image Generation with Reptile

8 Jan 2019 marcdemers/FIGR-8

Generative Adversarial Networks (GAN) boast impressive capacity to generate realistic images.

FEW-SHOT LEARNING IMAGE GENERATION META-LEARNING OMNIGLOT

LGM-Net: Learning to Generate Matching Networks for Few-Shot Learning

15 May 2019 likesiwell/LGM-Net

The TargetNet module is a neural network for solving a specific task and the MetaNet module aims at learning to generate functional weights for TargetNet by observing training samples.

FEW-SHOT LEARNING META-LEARNING OMNIGLOT
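
The excerpt describes a MetaNet that generates the functional weights of a TargetNet from the training samples of a task. A minimal sketch of that weight-generation idea, assuming the task is summarised by the mean support embedding and the TargetNet is a single linear classifier (all names and shapes are illustrative simplifications, not the paper's architecture):

```python
import torch
import torch.nn as nn

class MetaNet(nn.Module):
    """Generates the weights of a linear TargetNet from a task summary."""
    def __init__(self, emb_dim=64, n_way=5):
        super().__init__()
        self.emb_dim, self.n_way = emb_dim, n_way
        # Maps a task context vector to the TargetNet's weights and biases.
        self.generator = nn.Linear(emb_dim, n_way * emb_dim + n_way)

    def forward(self, support_emb, query_emb):
        # Task context: mean of the support embeddings.
        context = support_emb.mean(dim=0)                       # (emb_dim,)
        params = self.generator(context)
        w = params[: self.n_way * self.emb_dim].view(self.n_way, self.emb_dim)
        b = params[self.n_way * self.emb_dim:]
        # TargetNet: a linear classifier whose weights were just generated.
        return query_emb @ w.t() + b                            # (n_query, n_way)

# Toy 5-way episode with 25 support and 10 query embeddings.
support = torch.randn(25, 64)
query = torch.randn(10, 64)
print(MetaNet()(support, query).shape)                          # torch.Size([10, 5])
```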

DVAE#: Discrete Variational Autoencoders with Relaxed Boltzmann Priors

NeurIPS 2018 QuadrantAI/dvae

Experiments on the MNIST and OMNIGLOT datasets show that these relaxations outperform previous discrete VAEs with Boltzmann priors.

OMNIGLOT