# Rotated MNIST

18 papers with code • 1 benchmark • 1 dataset
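Rotated MNIST variants are typically built by rotating the standard MNIST digits; protocols differ (the domain-generalization variant, for instance, uses fixed angles from 0° to 75° in 15° steps, while equivariance benchmarks often use uniformly random angles). A minimal sketch of the random-angle construction, assuming `scipy` is available (`make_rotated_mnist` is an illustrative helper, not part of any benchmark code):

```python
import numpy as np
from scipy.ndimage import rotate

def make_rotated_mnist(images, rng=None):
    """Rotate each 28x28 image by a random angle in [0, 360).

    `images`: array of shape (n, 28, 28). Returns (rotated, angles).
    """
    rng = np.random.default_rng() if rng is None else rng
    angles = rng.uniform(0.0, 360.0, size=len(images))
    rotated = np.stack([
        # reshape=False keeps the 28x28 canvas; pixels leaving it are cut off
        rotate(img, angle, reshape=False, order=1, mode="constant", cval=0.0)
        for img, angle in zip(images, angles)
    ])
    return rotated, angles

# Dummy data standing in for MNIST digits:
digits = np.random.default_rng(0).random((4, 28, 28)).astype(np.float32)
rot, angles = make_rotated_mnist(digits, rng=np.random.default_rng(1))
```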

## Most implemented papers

### DIVA: Domain Invariant Variational Autoencoders

We consider the problem of domain generalization, namely, how to learn representations given data from a set of domains that generalize to data from a previously unseen domain.

### PDO-eConvs: Partial Differential Operator Based Equivariant Convolutions

In implementation, we discretize the system of partial differential operators using numerical schemes for PDOs, deriving approximately equivariant convolutions (PDO-eConvs).

### Deep Rotation Equivariant Network

Recently, learning equivariant representations has attracted considerable research attention.

### Efficient Domain Generalization via Common-Specific Low-Rank Decomposition

Model parameters are decomposed into a component common to all training domains and low-rank domain-specific components; the domain-specific components are discarded after training and only the common component is retained.

### Equivariance-bridged SO(2)-Invariant Representation Learning using Graph Convolutional Network

Training a Convolutional Neural Network (CNN) to be robust against rotation has mostly been done with data augmentation.

### Group Equivariant Convolutional Networks

We introduce Group equivariant Convolutional Neural Networks (G-CNNs), a natural generalization of convolutional neural networks that reduces sample complexity by exploiting symmetries.
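The idea behind a G-CNN over the group p4 (translations plus 90° rotations) can be illustrated with a lifting layer that correlates the input with the four rotated copies of a single filter; rotating the input then rotates each feature map and cyclically permutes the orientation channels. A toy numpy/scipy sketch of this property (not the paper's implementation):

```python
import numpy as np
from scipy.signal import correlate2d

def p4_lifting_correlation(image, filt):
    """Correlate `image` with the four 90-degree rotations of `filt`.

    Returns an array of shape (4, H', W'): one 'valid' correlation map
    per filter orientation (the lifting layer of a p4 G-CNN).
    """
    return np.stack([
        correlate2d(image, np.rot90(filt, r), mode="valid")
        for r in range(4)
    ])

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
k = rng.standard_normal((3, 3))

y = p4_lifting_correlation(x, k)
y_rot = p4_lifting_correlation(np.rot90(x), k)

# Equivariance: rotating the input rotates each map and shifts orientations.
for r in range(4):
    assert np.allclose(y_rot[r], np.rot90(y[(r - 1) % 4]))
```

This cyclic-shift structure is exactly what reduces sample complexity: the network never has to learn the four orientations of a feature separately.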

### Harmonic Networks: Deep Translation and Rotation Equivariance

CNNs are equivariant to translation by design; this is not the case for rotations. Harmonic Networks (H-Nets) achieve equivariance to both patch-wise translation and 360-degree rotation by restricting the filters to circular harmonics.

### Polar Transformer Networks

The result is a network invariant to translation and equivariant to both rotation and scale.
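The mechanism is that a log-polar resampling around the object center turns rotation about that center into a cyclic shift along the angle axis (and scaling into a shift along the log-radius axis), so an ordinary translation-equivariant CNN applied to the transformed image becomes rotation- and scale-equivariant. A small numpy/scipy sketch of the log-polar transform, using a fixed image center rather than the paper's predicted origin:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def log_polar(img, n_theta=64, n_r=24, r_min=1.0):
    """Resample a square image onto a (theta, log r) grid around its center.

    Rotation about the center becomes a cyclic shift along axis 0 (theta).
    """
    n = img.shape[0]
    c = (n - 1) / 2.0                      # rotation center of the pixel grid
    thetas = np.arange(n_theta) * (2 * np.pi / n_theta)
    radii = np.exp(np.linspace(np.log(r_min), np.log(c - 0.5), n_r))
    t, r = np.meshgrid(thetas, radii, indexing="ij")
    ys = c + r * np.sin(t)                 # row coordinates of the samples
    xs = c + r * np.cos(t)                 # column coordinates of the samples
    return map_coordinates(img, [ys, xs], order=1)

rng = np.random.default_rng(0)
img = rng.random((32, 32))

lp = log_polar(img)
lp_rot = log_polar(np.rot90(img))

# A 90-degree rotation shifts the theta axis by a quarter turn (64/4 bins).
assert np.allclose(lp_rot, np.roll(lp, -64 // 4, axis=0))
```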

### CapsGAN: Using Dynamic Routing for Generative Adversarial Networks

We show that CapsGAN performs at least as well as traditional CNN-based GANs at generating images under strong geometric transformations, evaluated on rotated MNIST.

### General E(2)-Equivariant Steerable CNNs

Here we give a general description of E(2)-equivariant convolutions in the framework of Steerable CNNs.