Rotated MNIST

18 papers with code • 1 benchmark • 1 dataset

Rotated MNIST is a variant of the MNIST handwritten-digit dataset in which each image is rotated by a (typically random) angle. It is commonly used to evaluate rotation-invariant and rotation-equivariant models, and as a domain generalization benchmark where different rotation angles are treated as different domains.
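A minimal sketch of how a rotated-MNIST-style dataset is typically built, assuming torchvision is available; the uniform rotation range and the batch size are illustrative choices here, and individual papers use different angle sets and splits.

```python
# Build a rotated-MNIST-style dataset by applying random rotations to MNIST.
# The [-180, 180] degree range is an illustrative assumption, not a fixed standard.
import torch
from torchvision import datasets, transforms

rotate = transforms.Compose([
    transforms.RandomRotation(degrees=180),  # sample an angle uniformly in [-180, 180]
    transforms.ToTensor(),
])

train_set = datasets.MNIST(root="./data", train=True, download=True, transform=rotate)
loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)

images, labels = next(iter(loader))
print(images.shape)  # torch.Size([64, 1, 28, 28])
```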

Datasets

Rotated MNIST


Latest papers with no code

Co-Attentive Equivariant Neural Networks: Focusing Equivariance On Transformations Co-Occurring In Data

no code yet • ICLR 2020

Equivariance is a desirable property: it yields more parameter-efficient neural architectures and preserves the structure of the input through the feature mapping.
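Concretely, a map f is equivariant to a transformation T when applying T to the input and then f gives the same result as applying f and then T to the output. The snippet below checks this numerically for an isotropic Gaussian blur under 90° rotations; it only illustrates the definition and is not the architecture studied in the paper.

```python
# Numerical check of the equivariance property f(T x) = T f(x),
# with f an isotropic Gaussian blur and T a 90-degree rotation.
import numpy as np
from scipy.ndimage import gaussian_filter

x = np.random.rand(28, 28)
f = lambda img: gaussian_filter(img, sigma=1.0)

lhs = f(np.rot90(x))          # transform the input, then apply the map
rhs = np.rot90(f(x))          # apply the map, then transform the output
print(np.allclose(lhs, rhs))  # True: an isotropic filter commutes with 90° rotations
```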

ICNN: Input-Conditioned Feature Representation Learning for Transformation-Invariant Neural Network

no code yet • 25 Sep 2019

Our proposed decoder network reduces the transformation present in the input image by learning to construct a representative image of the input image class.
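One minimal reading of this is an encoder-decoder whose reconstruction target is a canonical, untransformed image of the input's class. The PyTorch sketch below uses placeholder canonical targets and an assumed architecture purely for illustration; it is not the paper's actual network.

```python
# Toy encoder-decoder trained to map a (possibly rotated) input to a canonical
# class image. The canonical targets are placeholders for illustration only.
import torch
import torch.nn as nn

class EncoderDecoder(nn.Module):
    def __init__(self, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Flatten(), nn.Linear(28 * 28, 256), nn.ReLU(), nn.Linear(256, latent_dim)
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, 28 * 28), nn.Sigmoid()
        )

    def forward(self, x):
        return self.decoder(self.encoder(x)).view(-1, 1, 28, 28)

model = EncoderDecoder()
x = torch.rand(8, 1, 28, 28)          # transformed inputs
canonical = torch.rand(8, 1, 28, 28)  # canonical class images (placeholder targets)
loss = nn.functional.mse_loss(model(x), canonical)
loss.backward()
```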

Visual Context-aware Convolution Filters for Transformation-invariant Neural Network

no code yet • 15 Jun 2019

We propose a novel visual context-aware filter generation module which incorporates contextual information present in images into Convolutional Neural Networks (CNNs).
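A common way to realize such input-conditioned filters is a small generator network that predicts convolution kernels from a summary of the image and applies them per sample. The sketch below, with an assumed global-average-pooled "context" and a linear filter generator, is only an illustrative simplification, not the proposed module.

```python
# Simplified input-conditioned ("dynamic") convolution in PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContextConditionedConv(nn.Module):
    def __init__(self, in_ch=1, out_ch=8, k=3):
        super().__init__()
        self.in_ch, self.out_ch, self.k = in_ch, out_ch, k
        # Predict a full kernel bank from a global average-pooled context vector.
        self.filter_gen = nn.Linear(in_ch, out_ch * in_ch * k * k)

    def forward(self, x):
        b = x.size(0)
        context = x.mean(dim=(2, 3))                       # (b, in_ch) global context
        kernels = self.filter_gen(context)                 # (b, out_ch*in_ch*k*k)
        kernels = kernels.view(b * self.out_ch, self.in_ch, self.k, self.k)
        # Grouped convolution applies each sample's own kernels to that sample.
        out = F.conv2d(x.view(1, b * self.in_ch, *x.shape[2:]),
                       kernels, groups=b, padding=self.k // 2)
        return out.view(b, self.out_ch, *x.shape[2:])

x = torch.rand(4, 1, 28, 28)
print(ContextConditionedConv()(x).shape)  # torch.Size([4, 8, 28, 28])
```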

Fast Inference in Capsule Networks Using Accumulated Routing Coefficients

no code yet • 15 Apr 2019

Afterward, the routing coefficients associated with the training examples are accumulated offline and used to create a set of "master" routing coefficients.
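A sketch of the accumulate-and-reuse pattern this describes, assuming a toy routing-by-agreement step and placeholder shapes; how the class-wise master coefficients are selected or combined at inference is an assumption here, not the paper's procedure.

```python
# Accumulate routing coefficients offline into per-class "master" coefficients,
# then reuse them at test time instead of running routing iterations.
import numpy as np

num_classes, num_primary, num_out, dim = 10, 1152, 10, 16

def dynamic_routing(u_hat, iterations=3):
    """Toy routing-by-agreement; returns coefficients c of shape (num_primary, num_out)."""
    b = np.zeros((num_primary, num_out))
    for _ in range(iterations):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)      # softmax over output capsules
        s = (c[..., None] * u_hat).sum(axis=0)                     # weighted predictions, (num_out, dim)
        v = s / (1.0 + np.linalg.norm(s, axis=-1, keepdims=True))  # squash-like normalization
        b = b + (u_hat * v[None]).sum(axis=-1)                     # agreement update
    return c

# Offline pass over (a stand-in for) the training set: accumulate per-class averages.
master = np.zeros((num_classes, num_primary, num_out))
counts = np.zeros(num_classes)
for i in range(100):
    label = i % num_classes
    u_hat = np.random.randn(num_primary, num_out, dim)  # predicted output-capsule vectors
    master[label] += dynamic_routing(u_hat)
    counts[label] += 1
master /= counts[:, None, None]

# At test time, skip routing and combine predictions with fixed coefficients
# (here: the mean of the class-wise masters, since the true class is unknown).
c_fast = master.mean(axis=0)
u_hat_test = np.random.randn(num_primary, num_out, dim)
v_fast = (c_fast[..., None] * u_hat_test).sum(axis=0)
print(v_fast.shape)  # (10, 16)
```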

DIVA: Domain Invariant Variational Autoencoder

no code yet • ICLR Workshop DeepGenStruct 2019

We consider the problem of domain generalization, namely, how to learn representations from data spanning a set of domains such that they generalize to data from a previously unseen domain.

Deformable Classifiers

no code yet • 18 Dec 2017

In this paper, we design a framework for training deformable classifiers: latent transformation variables are introduced, and a transformation of the object image to a reference instantiation is computed, separately for each class, in terms of the classifier output.
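Read abstractly, one plausible formalization of this per-class alignment (an assumption for illustration, not the paper's exact objective) is: for each class c, pick theta_c = argmax over theta of f_c(T_theta x), then predict the class c maximizing f_c(T_{theta_c} x), where T_theta applies the latent transformation theta to the image and f_c is the classifier output for class c.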

Learning Steerable Filters for Rotation Equivariant CNNs

no code yet • CVPR 2018

In many machine learning tasks it is desirable that a model's prediction transforms in an equivariant way under transformations of its input.

Local Group Invariant Representations via Orbit Embeddings

no code yet • 6 Dec 2016

We consider transformations that form a group and propose an approach based on kernel methods to derive local group invariant representations.
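The orbit idea can be illustrated by pooling a (non-invariant) feature map over the orbit of the input under the group, here the four 90° rotations, which yields an exactly invariant representation; the kernel-based orbit embedding in the paper is more general and targets local invariance. The random linear feature map below is an assumption for illustration.

```python
# Orbit-averaged representation: average features over the orbit of the image
# under 90-degree rotations, yielding an exactly rotation-invariant embedding.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(28 * 28, 8))     # fixed random feature map (not rotation invariant)

def features(img):
    return img.reshape(-1) @ W        # 8-dimensional features of the flattened image

def orbit_embedding(img):
    orbit = [np.rot90(img, k) for k in range(4)]           # orbit under 90° rotations
    return np.mean([features(g) for g in orbit], axis=0)   # average features over the orbit

x = rng.random((28, 28))
print(np.allclose(features(x), features(np.rot90(x))))                # False: raw features change
print(np.allclose(orbit_embedding(x), orbit_embedding(np.rot90(x))))  # True: orbit average is invariant
```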