no code implementations • ICML 2020 • Adeel Pervez, Taco Cohen, Efstratios Gavves

Stochastic neural networks with discrete random variables are an important class of models for their expressiveness and interpretability.

no code implementations • 6 May 2024 • Pietro Mazzaglia, Taco Cohen, Daniel Dijkman

Robotic affordances, providing information about what actions can be taken in a given situation, can aid robotic manipulation.

1 code implementation • 7 Feb 2024 • Natasha Butt, Blazej Manczak, Auke Wiggers, Corrado Rainone, David W. Zhang, Michaël Defferrard, Taco Cohen

CodeIt is the first neuro-symbolic approach that scales to the full ARC evaluation dataset.

1 code implementation • 12 Dec 2023 • Alexandre Duval, Simon V. Mathis, Chaitanya K. Joshi, Victor Schmidt, Santiago Miret, Fragkiskos D. Malliaros, Taco Cohen, Pietro Liò, Yoshua Bengio, Michael Bronstein

In these graphs, the geometric attributes transform according to the inherent physical symmetries of 3D atomic systems, including rotations and translations in Euclidean space, as well as node permutations.
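
The symmetries named above (rotations, translations, node permutations) all preserve the set of pairwise interatomic distances. As a minimal illustrative sketch — not code from the paper — the following checks that invariance numerically for 2D points:

```python
import math

def pairwise_dists(points):
    """Sorted pairwise distances: invariant to rotation, translation, and node permutation."""
    n = len(points)
    d = []
    for i in range(n):
        for j in range(i + 1, n):
            d.append(math.dist(points[i], points[j]))
    return sorted(d)

def transform(points, theta, shift):
    """Rotate each 2D point by angle theta, then translate by shift."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + shift[0], s * x + c * y + shift[1]) for x, y in points]

atoms = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.8)]
moved = transform(atoms, theta=0.7, shift=(2.0, -1.0))
moved = [moved[2], moved[0], moved[1]]  # permute node order

a, b = pairwise_dists(atoms), pairwise_dists(moved)
assert all(abs(x - y) < 1e-9 for x, y in zip(a, b))
```

Equivariant architectures build this behaviour into every layer rather than checking it after the fact.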

no code implementations • 6 Dec 2023 • Ekdeep Singh Lubana, Johann Brehmer, Pim de Haan, Taco Cohen

We explore the viability of casting foundation models as generic reward functions for reinforcement learning.

1 code implementation • 8 Nov 2023 • Pim de Haan, Taco Cohen, Johann Brehmer

The Geometric Algebra Transformer (GATr) is a versatile architecture for geometric deep learning based on projective geometric algebra.

1 code implementation • 16 Jun 2023 • Phillip Lippe, Sara Magliacane, Sindy Löwe, Yuki M. Asano, Taco Cohen, Efstratios Gavves

Identifying the causal variables of an environment and how to intervene on them is of core value in applications such as robotics and embodied AI.

1 code implementation • NeurIPS 2023 • Johann Brehmer, Pim de Haan, Sönke Behrends, Taco Cohen

In this paper we introduce the Geometric Algebra Transformer (GATr), a general-purpose architecture for geometric data.

1 code implementation • 23 Jan 2023 • Chaitanya K. Joshi, Cristian Bodnar, Simon V. Mathis, Taco Cohen, Pietro Liò

The expressive power of Graph Neural Networks (GNNs) has been studied extensively through the Weisfeiler-Leman (WL) graph isomorphism test.
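
The 1-WL test that underlies this analysis is short enough to sketch. The snippet below is an illustrative toy implementation (not the paper's code): each node's colour is repeatedly hashed together with the multiset of its neighbours' colours, and two graphs with different final colour multisets are certainly non-isomorphic.

```python
def wl_refine(adj, rounds=3):
    """1-WL colour refinement: iteratively hash each node's colour
    together with the sorted colours of its neighbours."""
    colours = {v: 0 for v in adj}
    for _ in range(rounds):
        colours = {v: hash((colours[v], tuple(sorted(colours[u] for u in adj[v]))))
                   for v in adj}
    return sorted(colours.values())

# Triangle vs. 3-node path: different degree patterns, so 1-WL tells them apart.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
assert wl_refine(triangle) != wl_refine(path)
```

The test is a one-sided check: matching colour multisets do not prove isomorphism (e.g. 1-WL cannot separate two disjoint triangles from a 6-cycle, both being 2-regular).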

no code implementations • 4 Nov 2022 • Risto Vuorio, Johann Brehmer, Hanno Ackermann, Daniel Dijkman, Taco Cohen, Pim de Haan

Standard imitation learning can fail when the expert demonstrators have different sensory inputs than the imitating agent.

no code implementations • 24 Oct 2022 • Arash Behboodi, Gabriele Cesa, Taco Cohen

Equivariant networks capture the inductive bias about the symmetry of the learning task by building those symmetries into the model.

no code implementations • 28 Jun 2022 • Taco Cohen

There exist well-developed frameworks for causal modelling, but these require considerable human domain expertise to define causal variables and perform interventions.


1 code implementation • 13 Jun 2022 • Phillip Lippe, Sara Magliacane, Sindy Löwe, Yuki M. Asano, Taco Cohen, Efstratios Gavves

To address this issue, we propose iCITRIS, a causal representation learning method that allows for instantaneous effects in intervened temporal sequences when intervention targets can be observed, e.g., as actions of an agent.

1 code implementation • 21 May 2022 • Sourya Basu, Jose Gallego-Posada, Francesco Viganò, James Rowbottom, Taco Cohen

Equivariance to symmetries has proven to be a powerful inductive bias in deep learning research.

no code implementations • 30 Mar 2022 • Johann Brehmer, Pim de Haan, Phillip Lippe, Taco Cohen

Learning high-level causal representations together with a causal model from unstructured low-level data such as pixels is impossible from observational data alone.

1 code implementation • 7 Feb 2022 • Phillip Lippe, Sara Magliacane, Sindy Löwe, Yuki M. Asano, Taco Cohen, Efstratios Gavves

Understanding the latent causal factors of a dynamical system from visual observations is considered a crucial step towards agents reasoning in complex environments.

3 code implementations • ICLR 2022 • Yinhao Zhu, Yang Yang, Taco Cohen

Neural data compression based on nonlinear transform coding has made great progress over the last few years, mainly due to improvements in prior models, quantization methods and nonlinear transforms.

2 code implementations • ICLR 2022 • Phillip Lippe, Taco Cohen, Efstratios Gavves

Learning the structure of a causal graphical model using both observational and interventional data is a fundamental problem in many scientific fields.

6 code implementations • 27 Apr 2021 • Michael M. Bronstein, Joan Bruna, Taco Cohen, Petar Veličković

The last decade has witnessed an experimental revolution in data science and machine learning, epitomised by deep learning methods.

no code implementations • 11 Dec 2020 • Dana Kianfar, Auke Wiggers, Amir Said, Reza Pourreza, Taco Cohen

We train two classes of neural networks, a fully-convolutional network and an auto-regressive network, and evaluate each as a post-quantization step designed to refine cheap quantization schemes such as scalar quantization (SQ).
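
Scalar quantization, the baseline being refined here, simply rounds each value independently to a uniform grid. A minimal sketch (illustrative only, not the paper's pipeline):

```python
def scalar_quantize(x, step):
    """Uniform scalar quantization: round each value independently
    to the nearest multiple of step."""
    return [round(v / step) * step for v in x]

signal = [0.12, -0.47, 0.93, 0.30]
rec = scalar_quantize(signal, step=0.25)
print(rec)  # [0.0, -0.5, 1.0, 0.25]

# Each reconstructed value lies within step/2 of the original.
assert all(abs(a - b) <= 0.125 + 1e-12 for a, b in zip(signal, rec))
```

Because each value is treated in isolation, SQ ignores correlations between values — exactly the structure a learned post-quantization network can exploit.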

no code implementations • NeurIPS 2020 • Pim de Haan, Taco Cohen, Max Welling

A key requirement for graph neural networks is that they must process a graph in a way that does not depend on how the graph is described.

no code implementations • 20 Apr 2020 • Vijay Veerabadran, Reza Pourreza, Amirhossein Habibian, Taco Cohen

In this paper, we present a novel adversarial lossy video compression model.

1 code implementation • ICLR 2021 • Pim de Haan, Maurice Weiler, Taco Cohen, Max Welling

A common approach to define convolutions on meshes is to interpret them as a graph and apply graph convolutional networks (GCNs).
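
The GCN approach referred to here aggregates isotropically over neighbours. A toy weight-free version (an illustrative sketch, not the paper's code) makes the limitation visible: mean aggregation treats all neighbours identically, so it discards the orientation information a mesh carries.

```python
def gcn_layer(adj, feats):
    """One isotropic graph-convolution step on a mesh viewed as a graph:
    average each node's scalar feature with its neighbours' features.
    All neighbours are weighted equally, so edge orientation is ignored."""
    out = {}
    for v, nbrs in adj.items():
        vals = [feats[v]] + [feats[u] for u in nbrs]
        out[v] = sum(vals) / len(vals)
    return out

mesh = {0: [1, 2], 1: [0, 2], 2: [0, 1]}  # a single triangle
f = {0: 3.0, 1: 0.0, 2: 0.0}
print(gcn_layer(mesh, f))  # {0: 1.0, 1: 1.0, 2: 1.0}
```

Anisotropic mesh convolutions instead assign different weights to different neighbours, which is where the gauge-equivariance question arises.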

no code implementations • 25 Sep 2019 • Berkay Kicanaoglu, Pim de Haan, Taco Cohen

Spherical CNNs are convolutional neural networks that can process signals on the sphere, such as global climate and weather patterns or omnidirectional images.

no code implementations • 25 Sep 2019 • Adeel Pervez, Taco Cohen, Efstratios Gavves

In this work we focus on stochastic networks with multiple layers of Boolean latent variables.

no code implementations • NeurIPS 2019 • Taco Cohen, Mario Geiger, Maurice Weiler

Feature maps in these networks represent fields on a homogeneous base space, and layers are equivariant maps between spaces of fields.

2 code implementations • NeurIPS 2018 • Maurice Weiler, Mario Geiger, Max Welling, Wouter Boomsma, Taco Cohen

We prove that equivariant convolutions are the most general equivariant linear maps between fields over R^3.

4 code implementations • 8 Jun 2018 • Bastiaan S. Veeling, Jasper Linmans, Jim Winkens, Taco Cohen, Max Welling

We propose a new model for digital pathology segmentation, based on the observation that histopathology images are inherently symmetric under rotation and reflection.

Ranked #7 on Breast Tumour Classification on PCam

2 code implementations • 14 Sep 2017 • Taco Cohen, Mario Geiger, Jonas Köhler, Max Welling

Many areas of science and engineering deal with signals with other symmetries, such as rotation invariant data on the sphere.

3 code implementations • 1 Jul 2017 • Tambet Matiisen, Avital Oliver, Taco Cohen, John Schulman

We propose Teacher-Student Curriculum Learning (TSCL), a framework for automatic curriculum learning, where the Student tries to learn a complex task and the Teacher automatically chooses subtasks from a given set for the Student to train on.
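
The core Teacher heuristic — sample the subtask on which the Student's score is changing fastest — can be sketched in a few lines. This is a simplified illustration of the idea, not the authors' implementation; the epsilon-greedy exploration and slope estimate are assumptions of this sketch.

```python
import random

def choose_subtask(score_history, eps=0.1):
    """Epsilon-greedy teacher: usually pick the subtask whose recent scores
    changed the most (absolute learning progress); occasionally explore."""
    if random.random() < eps:
        return random.choice(list(score_history))

    def progress(scores):
        # Unseen tasks get infinite progress so they are tried at least once.
        return abs(scores[-1] - scores[0]) if len(scores) >= 2 else float("inf")

    return max(score_history, key=lambda t: progress(score_history[t]))

history = {
    "easy":   [0.9, 0.9, 0.9],  # mastered: no progress
    "medium": [0.3, 0.5, 0.7],  # improving: high progress
    "hard":   [0.1, 0.1, 0.1],  # too hard: no progress
}
print(choose_subtask(history, eps=0.0))  # "medium"
```

Using the absolute change also lets the Teacher revisit tasks the Student is forgetting, since a falling score registers as progress to recover.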

no code implementations • 18 Feb 2014 • Taco Cohen, Max Welling

We present a new probabilistic model of compact commutative Lie groups that produces invariant-equivariant and disentangled representations of data.
