1 code implementation • 22 Feb 2024 • Maksim Zhdanov, David Ruhe, Maurice Weiler, Ana Lucic, Johannes Brandstetter, Patrick Forré
We present Clifford-Steerable Convolutional Neural Networks (CS-CNNs), a novel class of $\mathrm{E}(p, q)$-equivariant CNNs.
no code implementations • 29 Aug 2023 • Andrii Skliar, Maurice Weiler
However, no prior work has proposed a general approach to applying Hyperbolic Convolutional Neural Networks to structured data, even though such data is among the most common in practice.
1 code implementation • 17 Jul 2023 • Xuan Zhang, Limei Wang, Jacob Helwig, Youzhi Luo, Cong Fu, Yaochen Xie, Meng Liu, Yuchao Lin, Zhao Xu, Keqiang Yan, Keir Adams, Maurice Weiler, Xiner Li, Tianfan Fu, Yucheng Wang, Haiyang Yu, Yuqing Xie, Xiang Fu, Alex Strasser, Shenglong Xu, Yi Liu, Yuanqi Du, Alexandra Saxton, Hongyi Ling, Hannah Lawrence, Hannes Stärk, Shurui Gui, Carl Edwards, Nicholas Gao, Adriana Ladera, Tailin Wu, Elyssa F. Hofgard, Aria Mansouri Tehrani, Rui Wang, Ameya Daigavane, Montgomery Bohde, Jerry Kurtin, Qian Huang, Tuong Phung, Minkai Xu, Chaitanya K. Joshi, Simon V. Mathis, Kamyar Azizzadenesheli, Ada Fang, Alán Aspuru-Guzik, Erik Bekkers, Michael Bronstein, Marinka Zitnik, Anima Anandkumar, Stefano Ermon, Pietro Liò, Rose Yu, Stephan Günnemann, Jure Leskovec, Heng Ji, Jimeng Sun, Regina Barzilay, Tommi Jaakkola, Connor W. Coley, Xiaoning Qian, Xiaofeng Qian, Tess Smidt, Shuiwang Ji
Advances in artificial intelligence (AI) are fueling a new paradigm of discoveries in natural sciences.
1 code implementation • ICLR 2022 • Gabriele Cesa, Leon Lang, Maurice Weiler
This enables us not only to parameterize filters directly in terms of a band-limited basis on the base space, but also to easily implement steerable CNNs equivariant to a large number of groups.
4 code implementations • ICLR 2022 • Erik Jenner, Maurice Weiler
In deep learning, however, these maps are usually defined by convolutions with a kernel, whereas they are partial differential operators (PDOs) in physics.
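The correspondence between the two viewpoints can be seen concretely: a discretized PDO such as the Laplacian is itself a convolution with a fixed stencil kernel. A minimal numpy sketch (illustrative only, not the paper's implementation):

```python
import numpy as np

# The 5-point Laplacian stencil: a discretized partial differential operator
# expressed as a convolution kernel.
lap_kernel = np.array([[0.,  1., 0.],
                       [1., -4., 1.],
                       [0.,  1., 0.]])

def correlate_valid(x, k):
    """Naive 'valid'-mode 2D cross-correlation of x with kernel k."""
    kh, kw = k.shape
    out = np.zeros((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

u = np.random.default_rng(0).normal(size=(6, 6))
# Direct finite differences on the interior grid points...
fd = (u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
      - 4.0 * u[1:-1, 1:-1])
# ...agree exactly with convolution by the stencil.
assert np.allclose(correlate_valid(u, lap_kernel), fd)
```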
1 code implementation • 10 Jun 2021 • Maurice Weiler, Patrick Forré, Erik Verlinde, Max Welling
We argue that the particular choice of coordinatization should not affect a network's inference -- it should be coordinate independent.
no code implementations • ICLR 2021 • Leon Lang, Maurice Weiler
Group equivariant convolutional networks (GCNNs) endow classical convolutional networks with additional symmetry priors, which can lead to a considerably improved performance.
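What such a symmetry prior means can be checked numerically: for a kernel that is invariant under 90-degree rotations, ordinary convolution already commutes with the C4 rotation group, so rotating the input rotates the output. A toy sketch (assumed example, not from the paper):

```python
import numpy as np

def correlate_valid(x, k):
    """Naive 'valid'-mode 2D cross-correlation."""
    kh, kw = k.shape
    out = np.zeros((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

x = np.random.default_rng(1).normal(size=(8, 8))
k_iso = np.array([[0., 1., 0.],
                  [1., 4., 1.],
                  [0., 1., 0.]])  # invariant under 90-degree rotations

# Equivariance under C4: convolving the rotated input equals
# rotating the convolved output.
assert np.allclose(correlate_valid(np.rot90(x), k_iso),
                   np.rot90(correlate_valid(x, k_iso)))
```

A generic (non-symmetric) kernel would break this identity; GCNNs restore it for arbitrary filters by lifting feature maps to the group.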
1 code implementation • ICLR 2021 • Pim de Haan, Maurice Weiler, Taco Cohen, Max Welling
A common approach to define convolutions on meshes is to interpret them as a graph and apply graph convolutional networks (GCNs).
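The graph-convolution step being referred to can be sketched in a few lines of numpy, treating the mesh vertices as graph nodes with an adjacency matrix (Kipf-Welling-style normalized aggregation; names and shapes are illustrative assumptions):

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution layer: normalized neighbor aggregation + ReLU."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # symmetric degree normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

# Toy "mesh": a single triangle, all three vertices connected.
A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])
X = np.random.default_rng(1).normal(size=(3, 4))  # per-vertex features
W = np.random.default_rng(2).normal(size=(4, 2))  # learned weights
H = gcn_layer(A, X, W)
print(H.shape)  # (3, 2)
```

Because the aggregation sums over neighbors without any notion of their ordering, such a GCN discards the local geometry of the mesh, which is the shortcoming the paper addresses.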
1 code implementation • NeurIPS 2019 • Maurice Weiler, Gabriele Cesa
Here we give a general description of E(2)-equivariant convolutions in the framework of Steerable CNNs.
Ranked #2 on Rotated MNIST
7 code implementations • 19 Nov 2019 • Maurice Weiler, Gabriele Cesa
Here we give a general description of $E(2)$-equivariant convolutions in the framework of Steerable CNNs.
Ranked #34 on Image Classification on STL-10
no code implementations • 6 Jun 2019 • Miranda C. N. Cheng, Vassilis Anagiannis, Maurice Weiler, Pim de Haan, Taco S. Cohen, Max Welling
In this proceeding we give an overview of the idea of covariance (or equivariance) featured in the recent development of convolutional neural networks (CNNs).
2 code implementations • 11 Feb 2019 • Taco S. Cohen, Maurice Weiler, Berkay Kicanaoglu, Max Welling
The principle of equivariance to symmetry transformations enables a theoretically grounded approach to neural network architecture design.
Ranked #23 on Semantic Segmentation on Stanford2D3D Panoramic
no code implementations • NeurIPS 2019 • Taco Cohen, Mario Geiger, Maurice Weiler
Feature maps in these networks represent fields on a homogeneous base space, and layers are equivariant maps between spaces of fields.
1 code implementation • 12 Jul 2018 • Luca Falorsi, Pim de Haan, Tim R. Davidson, Nicola De Cao, Maurice Weiler, Patrick Forré, Taco S. Cohen
Our experiments show that choosing manifold-valued latent variables that match the topology of the latent data manifold is crucial to preserve the topological structure and learn a well-behaved latent space.
2 code implementations • NeurIPS 2018 • Maurice Weiler, Mario Geiger, Max Welling, Wouter Boomsma, Taco Cohen
We prove that equivariant convolutions are the most general equivariant linear maps between fields over $\mathbb{R}^3$.
1 code implementation • 28 Mar 2018 • Taco S. Cohen, Mario Geiger, Maurice Weiler
In algebraic terms, the feature spaces in regular G-CNNs transform according to the regular representation of the group G, whereas the feature spaces in Steerable G-CNNs transform according to the more general induced representations of G. To make the network equivariant, each layer of a G-CNN must intertwine the induced representations associated with its input and output spaces.
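The regular representation mentioned above is easy to make concrete for a small group: for the cyclic group Z4, each group element acts on the four "group channels" by a cyclic permutation, and these permutation matrices compose exactly like the group itself. A sketch (illustrative, not the paper's code):

```python
import numpy as np

def regular_rep(g, n=4):
    """Regular representation of Z_n: rho(g) maps basis vector e_h to e_{(g+h) mod n}."""
    P = np.zeros((n, n))
    for h in range(n):
        P[(g + h) % n, h] = 1.0
    return P

# Homomorphism property: rho(g1) @ rho(g2) == rho(g1 + g2 mod 4),
# i.e. the channel permutations compose like the group elements.
for g1 in range(4):
    for g2 in range(4):
        assert np.allclose(regular_rep(g1) @ regular_rep(g2),
                           regular_rep((g1 + g2) % 4))
```

An intertwiner between two such representations is then a linear map W satisfying rho_out(g) W = W rho_in(g) for all g, which is the constraint each equivariant layer must obey.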
no code implementations • CVPR 2018 • Maurice Weiler, Fred A. Hamprecht, Martin Storath
In many machine learning tasks it is desirable that a model's prediction transforms in an equivariant way under transformations of its input.
Ranked #2 on Breast Tumour Classification on PCam
Breast Tumour Classification • Colorectal Gland Segmentation +2