
no code implementations • insights (ACL) 2022 • Yue Ding, Karolis Martinkus, Damian Pascual, Simon Clematide, Roger Wattenhofer

Different studies of the embedding space of transformer models suggest that the distribution of contextual representations is highly anisotropic: the embeddings are distributed in a narrow cone.
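The anisotropy described above is directly measurable. As a minimal sketch (not the paper's code), the average pairwise cosine similarity separates isotropic embeddings from embeddings confined to a narrow cone; the synthetic data below simulates the effect with a shared offset.

```python
import numpy as np

def mean_cosine_similarity(emb):
    # Average pairwise cosine similarity over all distinct pairs.
    # Values near 1 mean the vectors occupy a narrow cone (anisotropy);
    # values near 0 mean the directions are roughly isotropic.
    unit = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sims = unit @ unit.T
    n = len(emb)
    return (sims.sum() - n) / (n * (n - 1))  # exclude the diagonal

rng = np.random.default_rng(0)
isotropic = rng.normal(size=(200, 64))   # zero-centered Gaussian directions
anisotropic = isotropic + 5.0            # shared offset pushes all vectors
                                         # into a narrow cone
```

On this toy data the isotropic set scores near 0 while the offset set scores near 1, mirroring the "narrow cone" diagnosis reported for contextual representations.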

no code implementations • 20 Nov 2022 • Jeremia Geiger, Karolis Martinkus, Oliver Richter, Roger Wattenhofer

While rigid origami has shown potential in a wide range of engineering applications, current rigid origami crease pattern designs mostly rely on known tessellations.

1 code implementation • 4 Oct 2022 • Kilian Konstantin Haefeli, Karolis Martinkus, Nathanaël Perraudin, Roger Wattenhofer

Denoising diffusion probabilistic models and score matching models have proven to be very powerful for generative tasks.
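As a rough illustration of the setup named above, the forward step of a generic Gaussian DDPM (not the paper's discrete-graph variant) corrupts a clean sample toward noise according to a schedule parameter:

```python
import numpy as np

def forward_diffuse(x0, alpha_bar_t, rng):
    # Generic DDPM forward process: q(x_t | x_0) = N(sqrt(a)*x0, (1-a)*I),
    # where a = alpha_bar_t decays from 1 (clean) toward 0 (pure noise).
    noise = rng.normal(size=x0.shape)
    return np.sqrt(alpha_bar_t) * x0 + np.sqrt(1.0 - alpha_bar_t) * noise

rng = np.random.default_rng(0)
adj = rng.integers(0, 2, size=(8, 8)).astype(float)        # toy adjacency matrix
x_early = forward_diffuse(adj, alpha_bar_t=0.99, rng=rng)  # mostly signal
x_late = forward_diffuse(adj, alpha_bar_t=0.01, rng=rng)   # mostly noise
```

A generative model is then trained to reverse this corruption step by step; the paper's contribution concerns adapting this machinery to discrete graph data.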

1 code implementation • 22 Jun 2022 • Karolis Martinkus, Pál András Papp, Benedikt Schesch, Roger Wattenhofer

AgentNet is inspired by sublinear algorithms, featuring a computational complexity that is independent of the graph size.
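The sublinearity claim can be illustrated with a toy walk-based agent (the names and the cycle graph below are illustrative, not the paper's API): the agent explores a fixed number of steps, so its cost depends on the walk length rather than the total graph size.

```python
import random

def agent_walk(adj_list, start, steps, rng):
    # A neural agent would update a learned state at each visited node;
    # here we only record the path.  The cost is O(steps), independent
    # of how many nodes the full graph contains.
    path = [start]
    node = start
    for _ in range(steps):
        node = rng.choice(adj_list[node])
        path.append(node)
    return path

# toy graph: a 4-cycle given as an adjacency list
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
rng = random.Random(0)
path = agent_walk(adj, start=0, steps=5, rng=rng)
```

Only the visited neighborhood is ever touched, which is the essence of a sublinear-time access pattern.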

no code implementations • 26 May 2022 • Peter Müller, Lukas Faber, Karolis Martinkus, Roger Wattenhofer

We propose the fully explainable Decision Tree Graph Neural Network (DT+GNN) architecture.

1 code implementation • 4 Apr 2022 • Karolis Martinkus, Andreas Loukas, Nathanaël Perraudin, Roger Wattenhofer

We approach the graph generation problem from a spectral perspective by first generating the dominant parts of the graph Laplacian spectrum and then building a graph matching these eigenvalues and eigenvectors.
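A minimal NumPy sketch of the spectral view: with the full eigendecomposition, the Laplacian, and hence the edge set, is recovered exactly from eigenvalues and eigenvectors. The paper instead generates only the dominant eigenpairs and learns to complete the graph; this toy recovery is an assumption-laden stand-in, not the method itself.

```python
import numpy as np

# toy graph: a path 0-1-2-3
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
L = np.diag(adj.sum(axis=1)) - adj            # unnormalized graph Laplacian
evals, evecs = np.linalg.eigh(L)              # eigenvalues in ascending order

# With all eigenpairs, L = V diag(lambda) V^T exactly, so edges
# (the -1 off-diagonal entries) are recovered by thresholding.
L_rec = evecs @ np.diag(evals) @ evecs.T
adj_rec = (L_rec < -0.5).astype(float)
np.fill_diagonal(adj_rec, 0)
```

Keeping only the low-frequency (dominant) part of the spectrum preserves the coarse structure of the graph, which is why generating that part first is a sensible factorization of the problem.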

1 code implementation • NeurIPS 2021 • Pál András Papp, Karolis Martinkus, Lukas Faber, Roger Wattenhofer

In DropGNNs, we execute multiple runs of a GNN on the input graph, with some of the nodes randomly and independently dropped in each of these runs.
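The multi-run dropout scheme described above can be sketched as follows; a single mean-aggregation message-passing step stands in for the GNN, and all names are illustrative rather than the paper's implementation.

```python
import numpy as np

def run_with_dropout(features, adj, p_drop, rng):
    # One DropGNN run: drop each node independently with probability
    # p_drop, then perform one mean-aggregation message-passing step.
    keep = rng.random(len(features)) > p_drop
    a = adj * np.outer(keep, keep)              # sever dropped nodes' edges
    deg = np.maximum(a.sum(axis=1, keepdims=True), 1)
    return (a @ (features * keep[:, None])) / deg

rng = np.random.default_rng(0)
feats = np.eye(4)                               # one-hot node features
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
runs = [run_with_dropout(feats, adj, p_drop=0.2, rng=rng) for _ in range(10)]
averaged = np.mean(runs, axis=0)                # aggregate over the runs
```

Because each run sees a slightly different graph, aggregating over runs lets the model distinguish neighborhoods that a single deterministic message-passing pass cannot.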

Ranked #8 on Graph Classification on IMDb-B

1 code implementation • 14 Oct 2020 • Karolis Martinkus, Aurelien Lucchi, Nathanaël Perraudin

However, the dynamics of many real-world systems are challenging to learn due to the presence of nonlinear potentials and a number of interactions that scales quadratically with the number of particles $N$, as in the case of the N-body problem.
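The quadratic scaling mentioned above comes from direct pairwise summation: every particle interacts with every other, giving O(N^2) work per step. A minimal sketch (softened Newtonian gravity, illustrative only):

```python
import numpy as np

def pairwise_accel(pos, mass, G=1.0, eps=1e-3):
    # Direct-summation N-body accelerations: the double loop over all
    # ordered pairs is what makes the cost grow as O(N^2) in N.
    n = len(pos)
    acc = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i != j:
                d = pos[j] - pos[i]
                r = np.sqrt((d * d).sum() + eps)  # softened distance
                acc[i] += G * mass[j] * d / r**3
    return acc

pos = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
mass = np.array([1.0, 1.0, 1.0])
acc = pairwise_accel(pos, mass)
```

Because the pairwise forces are antisymmetric, total momentum is conserved, which is a quick sanity check; learned simulators aim to approximate these dynamics at lower cost.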

Papers With Code is a free resource with all data licensed under CC-BY-SA.