no code implementations • 10 Jun 2024 • Antoine Siraudin, Fragkiskos D. Malliaros, Christopher Morris

Discrete-state denoising diffusion models have led to state-of-the-art performance in graph generation, especially in the molecular domain.

1 code implementation • 5 Jun 2024 • Luis Müller, Christopher Morris

While recent works aligning graph transformer architectures with the $k$-WL hierarchy have shown promising empirical results, employing transformers for higher orders of $k$ remains challenging due to the prohibitive runtime and memory complexity of self-attention, as well as impractical architectural assumptions such as an infeasible number of attention heads.

no code implementations • 27 May 2024 • Chendi Qian, Andrei Manolache, Christopher Morris, Mathias Niepert

Message-passing graph neural networks (MPNNs) have emerged as a powerful paradigm for graph-based machine learning.
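The message-passing paradigm the entry refers to can be illustrated with a minimal sketch: each vertex aggregates its neighbors' features with a permutation-invariant operation and combines the result with its own feature. This is an illustrative toy in plain Python (scalar features, sum aggregation), not the paper's implementation:

```python
# Minimal sketch of one message-passing round on an undirected graph.
# Each vertex sums its neighbors' features (a permutation-invariant
# aggregation) and adds the result to its own feature.

def message_passing_step(adj, features):
    """adj: {vertex: list of neighbors}, features: {vertex: float}."""
    updated = {}
    for v, neighbors in adj.items():
        aggregated = sum(features[u] for u in neighbors)  # aggregate
        updated[v] = features[v] + aggregated             # combine
    return updated

# Triangle graph: every vertex sees the other two.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
features = {0: 1.0, 1: 2.0, 2: 3.0}
print(message_passing_step(adj, features))  # {0: 6.0, 1: 6.0, 2: 6.0}
```

Real MPNN layers replace the scalar sum-and-add with learned functions over feature vectors, but the neighborhood-aggregation structure is the same.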

no code implementations • 12 Feb 2024 • Billy J. Franks, Christopher Morris, Ameya Velingker, Floris Geerts

Moreover, we focus on augmenting $1$-WL and MPNNs with subgraph information and employ classical margin theory to investigate the conditions under which an architecture's increased expressivity aligns with improved generalization performance.

no code implementations • 3 Feb 2024 • Christopher Morris, Fabrizio Frasca, Nadav Dym, Haggai Maron, İsmail İlkan Ceylan, Ron Levie, Derek Lim, Michael Bronstein, Martin Grohe, Stefanie Jegelka

Machine learning on graphs, especially using graph neural networks (GNNs), has seen a surge in interest due to the wide availability of graph data across a broad spectrum of disciplines, from life to social and engineering sciences.

1 code implementation • 18 Jan 2024 • Luis Müller, Daniel Kusuma, Blai Bonet, Christopher Morris

Empirically, we demonstrate that the Edge Transformer surpasses other theoretically aligned architectures regarding predictive performance while not relying on positional or structural encodings.

1 code implementation • 16 Oct 2023 • Chendi Qian, Didier Chételat, Christopher Morris

Recently, machine learning, particularly message-passing graph neural networks (MPNNs), has gained traction in enhancing exact optimization algorithms.

1 code implementation • 6 Oct 2023 • Dominique Beaini, Shenyang Huang, Joao Alex Cunha, Zhiyi Li, Gabriela Moisescu-Pareja, Oleksandr Dymov, Samuel Maddrell-Mander, Callum McLean, Frederik Wenkel, Luis Müller, Jama Hussein Mohamud, Ali Parviz, Michael Craig, Michał Koziarski, Jiarui Lu, Zhaocheng Zhu, Cristian Gabellini, Kerstin Klaser, Josef Dean, Cas Wognum, Maciej Sypetkowski, Guillaume Rabusseau, Reihaneh Rabbany, Jian Tang, Christopher Morris, Ioannis Koutis, Mirco Ravanelli, Guy Wolf, Prudencio Tossou, Hadrien Mary, Therence Bois, Andrew Fitzgibbon, Błażej Banaszewski, Chad Martin, Dominic Masters

Recently, pre-trained foundation models have enabled significant advancements in multiple fields.

1 code implementation • 3 Oct 2023 • Chendi Qian, Andrei Manolache, Kareem Ahmed, Zhe Zeng, Guy Van Den Broeck, Mathias Niepert, Christopher Morris

Message-passing graph neural networks (MPNNs) have emerged as powerful tools for processing graph-structured input.

1 code implementation • NeurIPS 2023 • Jan Böker, Ron Levie, Ningyuan Huang, Soledad Villar, Christopher Morris

In particular, we characterize the expressive power of MPNNs in terms of the tree distance, which is a graph distance based on the concept of fractional isomorphisms, and substructure counts via tree homomorphisms, showing that these concepts have the same expressive power as the $1$-WL and MPNNs on graphons.

1 code implementation • 8 Feb 2023 • Luis Müller, Mikhail Galkin, Christopher Morris, Ladislav Rampášek

Recently, transformer architectures for graphs have emerged as an alternative to established techniques for machine learning with graphs, such as (message-passing) graph neural networks.

1 code implementation • 26 Jan 2023 • Christopher Morris, Floris Geerts, Jan Tönshoff, Martin Grohe

Secondly, when an upper bound on the graphs' order is known, we show a tight connection between the number of graphs distinguishable by the $1\text{-}\mathsf{WL}$ and GNNs' VC dimension.

1 code implementation • 30 Nov 2022 • Pablo Barcelo, Mikhail Galkin, Christopher Morris, Miguel Romero Orth

Namely, we investigate the limitations in the expressive power of the well-known Relational GCN and Compositional GCN architectures and shed some light on their practical learning performance.

no code implementations • 22 Jun 2022 • Chendi Qian, Gaurav Rattan, Floris Geerts, Christopher Morris, Mathias Niepert

Numerous subgraph-enhanced graph neural networks (GNNs) have emerged recently, provably boosting the expressive power of standard (message-passing) GNNs.

1 code implementation • 27 May 2022 • Elias B. Khalil, Christopher Morris, Andrea Lodi

Mixed-integer programming (MIP) technology offers a generic way of formulating and solving combinatorial optimization problems.

1 code implementation • 25 Mar 2022 • Christopher Morris, Gaurav Rattan, Sandra Kiefer, Siamak Ravanbakhsh

While (message-passing) graph neural networks have clear limitations in approximating permutation-equivariant functions over graphs or general relational data, more expressive, higher-order graph neural networks do not scale to large graphs.

2 code implementations • 4 Mar 2022 • Maxime Gasse, Quentin Cappart, Jonas Charfreitag, Laurent Charlin, Didier Chételat, Antonia Chmiela, Justin Dumouchelle, Ambros Gleixner, Aleksandr M. Kazachkov, Elias Khalil, Pawel Lichocki, Andrea Lodi, Miles Lubin, Chris J. Maddison, Christopher Morris, Dimitri J. Papageorgiou, Augustin Parjadis, Sebastian Pokutta, Antoine Prouvost, Lara Scavuzzo, Giulia Zarpellon, Linxin Yang, Sha Lai, Akang Wang, Xiaodong Luo, Xiang Zhou, Haohan Huang, Shengcheng Shao, Yuanming Zhu, Dong Zhang, Tao Quan, Zixuan Cao, Yang Xu, Zhewei Huang, Shuchang Zhou, Chen Binbin, He Minggui, Hao Hao, Zhang Zhiyu, An Zhiwu, Mao Kun

Combinatorial optimization is a well-established area in operations research and computer science.

no code implementations • 18 Dec 2021 • Christopher Morris, Yaron Lipman, Haggai Maron, Bastian Rieck, Nils M. Kriege, Martin Grohe, Matthias Fey, Karsten Borgwardt

In recent years, algorithms and neural architectures based on the Weisfeiler-Leman algorithm, a well-known heuristic for the graph isomorphism problem, have emerged as a powerful tool for machine learning with graphs and relational data.

no code implementations • NeurIPS 2021 • Leonardo Cotta, Christopher Morris, Bruno Ribeiro

Empirically, we show how reconstruction can boost a GNN's expressive power, while maintaining its invariance to permutations of the vertices, by solving seven graph property tasks not solvable by the original GNN.

no code implementations • 12 May 2021 • Christopher Morris, Matthias Fey, Nils M. Kriege

In recent years, algorithms and neural architectures based on the Weisfeiler-Leman algorithm, a well-known heuristic for the graph isomorphism problem, have emerged as a powerful tool for (supervised) machine learning with graphs and relational data.

no code implementations • 18 Feb 2021 • Quentin Cappart, Didier Chételat, Elias Khalil, Andrea Lodi, Christopher Morris, Petar Veličković

Combinatorial optimization is a well-established area in operations research and computer science.

2 code implementations • 16 Jul 2020 • Christopher Morris, Nils M. Kriege, Franka Bause, Kristian Kersting, Petra Mutzel, Marion Neumann

We provide Python-based data loaders, kernel and graph neural network baseline implementations, and evaluation tools.

2 code implementations • ICLR 2020 • Matthias Fey, Jan E. Lenssen, Christopher Morris, Jonathan Masci, Nils M. Kriege

This work presents a two-stage neural architecture for learning and refining structural correspondences between graphs.

Ranked #12 on Entity Alignment on DBP15k zh-en (using extra training data)

no code implementations • 14 Oct 2019 • Lutz Oettershagen, Nils M. Kriege, Christopher Morris, Petra Mutzel

Hence, we confirm that taking temporal information into account is crucial for the successful classification of dissemination processes.

1 code implementation • NeurIPS 2020 • Christopher Morris, Gaurav Rattan, Petra Mutzel

Hence, it accounts for the higher-order interactions between vertices.

Ranked #3 on Graph Classification on NCI109

no code implementations • 28 Mar 2019 • Nils M. Kriege, Fredrik D. Johansson, Christopher Morris

Graph kernels have become an established and widely-used technique for solving classification tasks on graphs.

1 code implementation • 4 Oct 2018 • Christopher Morris, Martin Ritzert, Matthias Fey, William L. Hamilton, Jan Eric Lenssen, Gaurav Rattan, Martin Grohe

We show that GNNs have the same expressiveness as the $1$-WL in terms of distinguishing non-isomorphic (sub-)graphs.
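The $1$-WL test mentioned here is color refinement: vertices are iteratively relabeled by their current color together with the multiset of their neighbors' colors, and two graphs with different stable color histograms are certainly non-isomorphic. A minimal illustrative sketch (not the paper's code; the function names are made up for this example):

```python
# Illustrative sketch of 1-WL color refinement. Each round relabels a
# vertex by hashing its color with the sorted multiset of its
# neighbors' colors; differing final histograms prove non-isomorphism.

def wl_colors(adj, rounds=3):
    """adj: {vertex: list of neighbors}; returns the final coloring."""
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(rounds):
        colors = {
            v: hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
            for v in adj
        }
    return colors

def wl_histogram(adj, rounds=3):
    """Sorted multiset of final colors; a 1-WL graph invariant."""
    return sorted(wl_colors(adj, rounds).values())

# A triangle and a path on three vertices are distinguished by 1-WL:
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
print(wl_histogram(triangle) != wl_histogram(path))  # True
```

Note that 1-WL is only a necessary test: some non-isomorphic graphs (e.g., certain regular graphs) receive identical histograms, which is exactly the expressiveness ceiling the paper establishes for GNNs.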

Ranked #4 on Graph Classification on NCI1

14 code implementations • NeurIPS 2018 • Rex Ying, Jiaxuan You, Christopher Morris, Xiang Ren, William L. Hamilton, Jure Leskovec

Recently, graph neural networks (GNNs) have revolutionized the field of graph representation learning through effectively learned node embeddings, and achieved state-of-the-art results in tasks such as node classification and link prediction.

Ranked #1 on Graph Classification on REDDIT-MULTI-12K

1 code implementation • 7 Mar 2017 • Christopher Morris, Kristian Kersting, Petra Mutzel

Specifically, we introduce a novel graph kernel based on the $k$-dimensional Weisfeiler-Lehman algorithm.

no code implementations • 2 Mar 2017 • Nils M. Kriege, Marion Neumann, Christopher Morris, Kristian Kersting, Petra Mutzel

On this basis we propose exact and approximative feature maps for widely used graph kernels based on the kernel trick.
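The contrast between an explicit feature map and pairwise kernel evaluation can be sketched with a toy vertex-label histogram kernel (purely illustrative, not one of the kernels studied in the paper; the fixed label alphabet is an assumption of this example):

```python
from collections import Counter

# Toy explicit feature map: a graph's vertex labels are mapped to a
# fixed-length count vector, so the kernel is just a dot product.

ALPHABET = ["A", "B", "C"]  # assumed fixed label alphabet

def feature_map(labels):
    """Explicit feature vector: counts of each vertex label."""
    counts = Counter(labels)
    return [counts[a] for a in ALPHABET]

def kernel(labels_g, labels_h):
    """Kernel value computed as a dot product of explicit features."""
    return sum(x * y for x, y in zip(feature_map(labels_g),
                                     feature_map(labels_h)))

print(kernel(["A", "A", "B"], ["A", "B", "B"]))  # 2*1 + 1*2 + 0*0 = 4
```

With an explicit map, a linear model can be trained directly on the feature vectors in time linear in the dataset size, instead of computing the quadratic number of pairwise kernel values that the kernel trick requires.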

no code implementations • 1 Oct 2016 • Christopher Morris, Nils M. Kriege, Kristian Kersting, Petra Mutzel

While state-of-the-art kernels for graphs with discrete labels scale well to graphs with thousands of nodes, the few existing kernels for graphs with continuous attributes, unfortunately, do not scale well.
