We introduce Group equivariant Convolutional Neural Networks (G-CNNs), a natural generalization of convolutional neural networks that reduces sample complexity by exploiting symmetries.
Ranked #5 on Breast Tumour Classification on PCam
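The core G-CNN idea of exploiting symmetries can be illustrated with a minimal sketch (not the paper's implementation): a "lifting" convolution for the rotation group p4 correlates the input with the same filter at 0°, 90°, 180°, and 270°, and pooling over that orientation axis yields rotation-invariant features. All function names here are illustrative.

```python
import numpy as np

def correlate2d_valid(image, kernel):
    """Plain 2-D cross-correlation with 'valid' padding."""
    kh, kw = kernel.shape
    H = image.shape[0] - kh + 1
    W = image.shape[1] - kw + 1
    out = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def p4_lifting_conv(image, kernel):
    """Lifting convolution for p4: correlate with the kernel at the
    four 90-degree rotations and stack the responses (group axis first)."""
    return np.stack([correlate2d_valid(image, np.rot90(kernel, k))
                     for k in range(4)])

def p4_invariant_features(image, kernel):
    """Max-pooling over the group axis gives features that are
    invariant (up to a spatial rotation of the map) to 90-degree
    rotations of the input."""
    return p4_lifting_conv(image, kernel).max(axis=0)
```

A single learned filter thus responds to a pattern at any of the four orientations, which is one way symmetry sharing reduces the number of samples needed per orientation.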
Based on a general causal model for data from multiple domains, we show that prior methods for learning an invariant representation optimize for an incorrect objective.
Ranked #1 on Domain Generalization on Rotated Fashion-MNIST
We show that CapsGAN performs at least as well as traditional CNN-based GANs at generating images under strong geometric transformations, evaluated on rotated MNIST.
The domain-specific components are discarded after training, and only the common component is retained.
Ranked #1 on Domain Generalization on LipitK
Deep Convolutional Neural Networks (CNNs) are empirically known to be invariant to moderate translation but not to rotation in image classification.
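The translation-versus-rotation asymmetry described above can be demonstrated with a small sketch (my own illustration, not code from any of the listed papers): with periodic padding, cross-correlation is exactly translation-equivariant, so a shifted input just shifts the feature map, while rotating the input does not simply rotate the response of a fixed (asymmetric) filter.

```python
import numpy as np

def circular_correlate(image, kernel):
    """2-D cross-correlation with wrap-around (periodic) padding,
    which makes translation equivariance exact."""
    kh, kw = kernel.shape
    out = np.zeros(image.shape)
    for di in range(kh):
        for dj in range(kw):
            out += kernel[di, dj] * np.roll(image, (-di, -dj), axis=(0, 1))
    return out

rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8))
k = np.array([[1.0, 0.0], [0.0, -1.0]])  # asymmetric, edge-like filter

base = circular_correlate(img, k)
# Translation: the response is the same map, merely shifted.
shifted = circular_correlate(np.roll(img, (2, 3), axis=(0, 1)), k)
# Rotation: the response is NOT just a rotated copy of the original map.
rotated = circular_correlate(np.rot90(img), k)
```

After global pooling, the translated input therefore yields identical features, whereas the rotated input generally does not; this is the gap that equivariant architectures aim to close.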
Deep Neural Networks (DNNs) achieve state-of-the-art results on a wide range of image-processing tasks; however, like most AI algorithms, the majority of these solutions are problem-specific.