1 code implementation • 16 May 2025 • Vijay Prakash Dwivedi, Sri Jaladi, Yangyi Shen, Federico López, Charilaos I. Kanatsoulis, Rishi Puri, Matthias Fey, Jure Leskovec
Relational Deep Learning (RDL) is a promising approach for building state-of-the-art predictive models on multi-table relational data by representing it as a heterogeneous temporal graph.
1 code implementation • 29 Nov 2024 • Yiwen Yuan, Zecheng Zhang, Xinwei He, Akihiro Nitta, Weihua Hu, Dong Wang, Manan Shah, Shenyang Huang, Blaž Stojanovič, Alan Krumholz, Jan Eric Lenssen, Jure Leskovec, Matthias Fey
Recommendation systems predominantly utilize two-tower architectures, which evaluate user-item rankings through the inner product of their respective embeddings.
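The two-tower pattern described above can be sketched in a few lines of plain PyTorch; the tower architectures and dimensions below are illustrative assumptions, not the paper's model.

```python
import torch
import torch.nn as nn

class TwoTowerModel(nn.Module):
    """Minimal two-tower recommender: score(u, i) = <f(u), g(i)>."""
    def __init__(self, user_dim, item_dim, emb_dim=64):
        super().__init__()
        self.user_tower = nn.Sequential(nn.Linear(user_dim, 128), nn.ReLU(), nn.Linear(128, emb_dim))
        self.item_tower = nn.Sequential(nn.Linear(item_dim, 128), nn.ReLU(), nn.Linear(128, emb_dim))

    def forward(self, user_feats, item_feats):
        u = self.user_tower(user_feats)   # [num_users, emb_dim]
        v = self.item_tower(item_feats)   # [num_items, emb_dim]
        return u @ v.t()                  # inner-product scores [num_users, num_items]

scores = TwoTowerModel(user_dim=32, item_dim=48)(torch.randn(4, 32), torch.randn(10, 48))
top_items = scores.topk(k=3, dim=-1).indices  # top-3 ranked items per user
```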
3 code implementations • 29 Jul 2024 • Joshua Robinson, Rishabh Ranjan, Weihua Hu, Kexin Huang, Jiaqi Han, Alejandro Dobles, Matthias Fey, Jan E. Lenssen, Yiwen Yuan, Zecheng Zhang, Xinwei He, Jure Leskovec
We use RelBench to conduct the first comprehensive study of Relational Deep Learning (RDL) (Fey et al., 2024), which combines graph neural network predictive models with (deep) tabular models that extract initial entity-level representations from raw tables.
1 code implementation • 31 Mar 2024 • Weihua Hu, Yiwen Yuan, Zecheng Zhang, Akihiro Nitta, Kaidi Cao, Vid Kocijan, Jinu Sunil, Jure Leskovec, Matthias Fey
We present PyTorch Frame, a PyTorch-based framework for deep learning over multi-modal tabular data.
Ranked #1 on Binary Classification on kickstarter
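The general idea behind deep learning over multi-modal tabular data, encoding each column type with its own encoder and fusing the results into a row embedding, can be sketched in plain PyTorch. This is a conceptual sketch, not the PyTorch Frame API; all column counts and dimensions are assumptions.

```python
import torch
import torch.nn as nn

class TabularEncoder(nn.Module):
    """Per-column-type encoders fused into one row embedding (conceptual sketch)."""
    def __init__(self, num_numerical, cat_cardinalities, emb_dim=16):
        super().__init__()
        self.num_encoder = nn.Linear(num_numerical, emb_dim)
        self.cat_encoders = nn.ModuleList(nn.Embedding(card, emb_dim) for card in cat_cardinalities)
        self.fuse = nn.Linear(emb_dim * (1 + len(cat_cardinalities)), emb_dim)

    def forward(self, x_num, x_cat):
        parts = [self.num_encoder(x_num)]
        parts += [enc(x_cat[:, i]) for i, enc in enumerate(self.cat_encoders)]
        return self.fuse(torch.cat(parts, dim=-1))  # one embedding per table row

rows = TabularEncoder(num_numerical=3, cat_cardinalities=[10, 5])(
    torch.randn(8, 3), torch.randint(0, 5, (8, 2)))
```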
1 code implementation • 31 Mar 2024 • Jialin Chen, Jan Eric Lenssen, Aosong Feng, Weihua Hu, Matthias Fey, Leandros Tassiulas, Jure Leskovec, Rex Ying
Motivated by our observation that a time series model's performance gain from channel mixing correlates with the intrinsic similarity between pairs of channels, we developed a novel and adaptable Channel Clustering Module (CCM).
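As an illustration of the clustering idea (not the paper's exact CCM), channels can be grouped by the similarity of their series so that model components are shared within each cluster; the similarity measure and cluster count below are assumptions.

```python
import torch

def cluster_channels(x, num_clusters=2, iters=10):
    """Group time-series channels by cosine similarity using a small k-means.

    x: [num_channels, seq_len] -- one row per channel.
    Returns a cluster id per channel; channels in the same cluster can share weights.
    """
    feats = torch.nn.functional.normalize(x, dim=-1)  # unit-norm channel profiles
    centroids = feats[torch.randperm(len(feats))[:num_clusters]]
    for _ in range(iters):
        assign = (feats @ centroids.t()).argmax(dim=-1)  # nearest centroid (cosine)
        centroids = torch.stack([
            feats[assign == k].mean(dim=0) if (assign == k).any() else centroids[k]
            for k in range(num_clusters)])
    return assign

labels = cluster_channels(torch.randn(7, 96))  # e.g., 7 channels, 96 time steps
```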
1 code implementation • 7 Dec 2023 • Matthias Fey, Weihua Hu, Kexin Huang, Jan Eric Lenssen, Rishabh Ranjan, Joshua Robinson, Rex Ying, Jiaxuan You, Jure Leskovec
The core idea is to view relational databases as a temporal, heterogeneous graph, with a node for each row in each table, and edges specified by primary-foreign key links.
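A minimal sketch of that construction, assuming PyTorch Geometric's HeteroData and two hypothetical tables (users and orders linked by a user_id foreign key):

```python
import pandas as pd
import torch
from torch_geometric.data import HeteroData

# Hypothetical tables: one node per row, edges from the foreign-key column.
users = pd.DataFrame({"user_id": [0, 1, 2], "age": [31, 45, 27]})
orders = pd.DataFrame({"order_id": [0, 1, 2, 3], "user_id": [0, 0, 2, 1],
                       "timestamp": [1, 5, 7, 9]})

data = HeteroData()
data["user"].x = torch.tensor(users[["age"]].values, dtype=torch.float)
data["order"].x = torch.ones(len(orders), 1)                    # placeholder features
data["order"].time = torch.tensor(orders["timestamp"].values)   # temporal information
# Primary-foreign key link: each order row points to its user row.
src = torch.tensor(orders.index.values)
dst = torch.tensor(orders["user_id"].values)
data["order", "placed_by", "user"].edge_index = torch.stack([src, dst])
```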
6 code implementations • NeurIPS 2023 • Shenyang Huang, Farimah Poursafaei, Jacob Danovitch, Matthias Fey, Weihua Hu, Emanuele Rossi, Jure Leskovec, Michael Bronstein, Guillaume Rabusseau, Reihaneh Rabbany
We present the Temporal Graph Benchmark (TGB), a collection of challenging and diverse benchmark datasets for realistic, reproducible, and robust evaluation of machine learning models on temporal graphs.
no code implementations • 18 Dec 2021 • Christopher Morris, Yaron Lipman, Haggai Maron, Bastian Rieck, Nils M. Kriege, Martin Grohe, Matthias Fey, Karsten Borgwardt
In recent years, algorithms and neural architectures based on the Weisfeiler-Leman algorithm, a well-known heuristic for the graph isomorphism problem, have emerged as a powerful tool for machine learning with graphs and relational data.
2 code implementations • 10 Jun 2021 • Matthias Fey, Jan E. Lenssen, Frank Weichert, Jure Leskovec
We present GNNAutoScale (GAS), a framework for scaling arbitrary message-passing GNNs to large graphs.
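GAS keeps mini-batch message passing close to full-graph training by reading out-of-batch neighbor states from a cache of historical embeddings. The sketch below illustrates only that caching idea; all shapes and update rules are assumptions, not the framework's API.

```python
import torch

class History:
    """Cache of node embeddings from previous iterations (illustrative sketch)."""
    def __init__(self, num_nodes, dim):
        self.emb = torch.zeros(num_nodes, dim)

    def pull(self, node_ids):          # read stale embeddings for out-of-batch nodes
        return self.emb[node_ids]

    def push(self, node_ids, values):  # store fresh embeddings for in-batch nodes
        self.emb[node_ids] = values.detach()

history = History(num_nodes=1000, dim=64)
batch = torch.arange(0, 32)                 # in-batch nodes
neighbors = torch.randint(0, 1000, (128,))  # their (possibly out-of-batch) neighbors
h_neigh = history.pull(neighbors)           # approximate neighbor states from the cache
h_batch = torch.randn(32, 64)               # would come from the GNN layer in practice
history.push(batch, h_batch)
```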
no code implementations • 12 May 2021 • Christopher Morris, Matthias Fey, Nils M. Kriege
In recent years, algorithms and neural architectures based on the Weisfeiler-Leman algorithm, a well-known heuristic for the graph isomorphism problem, have emerged as a powerful tool for (supervised) machine learning with graphs and relational data.
6 code implementations • 17 Mar 2021 • Weihua Hu, Matthias Fey, Hongyu Ren, Maho Nakata, Yuxiao Dong, Jure Leskovec
Enabling effective and efficient machine learning (ML) over large-scale graph data (e.g., graphs with billions of edges) can have a great impact on both industrial and scientific applications.
Ranked #1 on Knowledge Graphs on WikiKG90M-LSC
1 code implementation • 22 Jun 2020 • Matthias Fey, Jan-Gin Yuen, Frank Weichert
We present a hierarchical neural message passing architecture for learning on molecular graphs.
Ranked #27 on Graph Property Prediction on ogbg-molhiv
21 code implementations • NeurIPS 2020 • Weihua Hu, Matthias Fey, Marinka Zitnik, Yuxiao Dong, Hongyu Ren, Bowen Liu, Michele Catasta, Jure Leskovec
We present the Open Graph Benchmark (OGB), a diverse set of challenging and realistic benchmark datasets to facilitate scalable, robust, and reproducible graph machine learning (ML) research.
Ranked #1 on Link Property Prediction on ogbl-citation2
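As a usage sketch (assuming the ogb Python package is installed), loading an OGB dataset and running its standardized evaluator looks roughly like this; the random predictions are placeholders for a real model's outputs.

```python
import torch
from ogb.graphproppred import PygGraphPropPredDataset, Evaluator

dataset = PygGraphPropPredDataset(name="ogbg-molhiv")  # downloads and preprocesses the data
split_idx = dataset.get_idx_split()                    # standardized train/valid/test split
test_data = dataset[split_idx["test"]]

evaluator = Evaluator(name="ogbg-molhiv")
y_true = torch.cat([data.y for data in test_data], dim=0)
y_pred = torch.rand(y_true.shape)                      # placeholder for real model outputs
print(evaluator.eval({"y_true": y_true, "y_pred": y_pred}))  # e.g., {'rocauc': ...}
```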
1 code implementation • 2 Feb 2020 • Marian Kleineberg, Matthias Fey, Frank Weichert
This work presents a generative adversarial architecture for generating three-dimensional shapes based on signed distance representations.
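For readers unfamiliar with signed distance representations: a shape is encoded as a function that returns the distance to its surface, negative inside and positive outside, and the generator outputs such values on a 3D grid. A minimal example for a sphere (illustrative only, not the paper's architecture):

```python
import numpy as np

def sphere_sdf(points, radius=0.5):
    """Signed distance to a sphere: negative inside, zero on the surface, positive outside."""
    return np.linalg.norm(points, axis=-1) - radius

grid = np.stack(np.meshgrid(*[np.linspace(-1, 1, 32)] * 3, indexing="ij"), axis=-1)
sdf = sphere_sdf(grid.reshape(-1, 3)).reshape(32, 32, 32)
occupancy = sdf < 0  # the shape's interior; a mesh can be extracted at the zero level set
```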
2 code implementations • ICLR 2020 • Matthias Fey, Jan E. Lenssen, Christopher Morris, Jonathan Masci, Nils M. Kriege
This work presents a two-stage neural architecture for learning and refining structural correspondences between graphs.
Ranked #12 on Entity Alignment on DBP15k zh-en (using extra training data)
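The first of the two stages, turning node embeddings into soft correspondences between two graphs, can be sketched as a row-wise softmax over an embedding similarity matrix; the refinement stage (neighborhood consensus) is omitted and all dimensions are assumptions.

```python
import torch

def soft_correspondence(h_src, h_tgt):
    """Stage one of graph matching: soft assignment from source to target nodes.

    h_src: [num_src, dim], h_tgt: [num_tgt, dim] -- node embeddings from a GNN.
    Returns a [num_src, num_tgt] matrix whose rows are correspondence distributions.
    """
    scores = h_src @ h_tgt.t()            # pairwise similarity
    return torch.softmax(scores, dim=-1)  # row-stochastic soft matching

corr = soft_correspondence(torch.randn(5, 32), torch.randn(6, 32))
matched = corr.argmax(dim=-1)             # hard correspondence per source node
```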
1 code implementation • 9 Apr 2019 • Matthias Fey
We propose a dynamic neighborhood aggregation (DNA) procedure guided by (multi-head) attention for representation learning on graphs.
Ranked #26 on Node Classification on Citeseer
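The aggregation step named above, attending over a node's neighborhood instead of averaging it, can be sketched with single-head dot-product attention; multi-head attention and the paper's dynamic selection over layer-wise representations are omitted, and all shapes are assumptions.

```python
import torch

def attention_aggregate(h_self, h_neighbors):
    """Aggregate neighbor embeddings with dot-product attention (single head, sketch).

    h_self: [dim], h_neighbors: [num_neighbors, dim]
    """
    scores = h_neighbors @ h_self / h_self.numel() ** 0.5  # scaled dot-product scores
    weights = torch.softmax(scores, dim=0)                 # attention over the neighborhood
    return weights @ h_neighbors                           # weighted sum of neighbor states

out = attention_aggregate(torch.randn(64), torch.randn(8, 64))
```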
6 code implementations • 6 Mar 2019 • Matthias Fey, Jan Eric Lenssen
We introduce PyTorch Geometric, a library for deep learning on irregularly structured input data such as graphs, point clouds and manifolds, built upon PyTorch.
Ranked #4 on Graph Classification on REDDIT-B
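A minimal usage sketch of the library, building a tiny graph and applying a single GCN layer; the graph and dimensions are arbitrary examples.

```python
import torch
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# A toy graph with 3 nodes, 4 directed edges, and 16-dimensional node features.
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]], dtype=torch.long)
x = torch.randn(3, 16)
data = Data(x=x, edge_index=edge_index)

conv = GCNConv(in_channels=16, out_channels=32)
out = conv(data.x, data.edge_index)  # message passing yields [3, 32] node embeddings
```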
1 code implementation • 4 Oct 2018 • Christopher Morris, Martin Ritzert, Matthias Fey, William L. Hamilton, Jan Eric Lenssen, Gaurav Rattan, Martin Grohe
We show that GNNs have the same expressiveness as the $1$-WL in terms of distinguishing non-isomorphic (sub-)graphs.
Ranked #5 on Graph Classification on NCI1
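The 1-WL heuristic referenced in this result is simple to state: iteratively recolor each node by injectively relabeling its own color together with the multiset of its neighbors' colors, then compare the color histograms of two graphs. A minimal sketch, including a classic pair of graphs that 1-WL (and hence standard GNNs) cannot distinguish:

```python
def wl_colors(adj, rounds=3):
    """1-WL color refinement on an adjacency list {node: [neighbors]}."""
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(rounds):
        signatures = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v]))) for v in adj}
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}  # injectively relabel signatures
    return sorted(colors.values())  # color histogram (as a sorted list)

hexagon = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}  # one 6-cycle
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}            # two disjoint 3-cycles
print(wl_colors(hexagon) == wl_colors(two_triangles))        # True: 1-WL cannot tell them apart
```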
1 code implementation • NeurIPS 2018 • Jan Eric Lenssen, Matthias Fey, Pascal Libuschewski
We present group equivariant capsule networks, a framework to introduce guaranteed equivariance and invariance properties to the capsule network idea.
no code implementations • 16 Feb 2018 • Nils M. Kriege, Matthias Fey, Denis Fisseler, Petra Mutzel, Frank Weichert
To this end, the distance measure is used to implement a nearest-neighbor classifier, which leads to a high computational cost at prediction time as the training set grows.
5 code implementations • CVPR 2018 • Matthias Fey, Jan Eric Lenssen, Frank Weichert, Heinrich Müller
We present Spline-based Convolutional Neural Networks (SplineCNNs), a variant of deep neural networks for irregularly structured and geometric input, e.g., graphs or meshes.
Ranked #2 on Node Classification on Cora
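A usage sketch with the SplineConv layer that implements this operator in PyTorch Geometric (it requires the torch-spline-conv extension); the graph and the pseudo-coordinates below are random placeholders in [0, 1]^2.

```python
import torch
from torch_geometric.nn import SplineConv

# Toy graph: 4 nodes, 6 directed edges, 8-dimensional input features.
x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                           [1, 0, 2, 1, 3, 2]], dtype=torch.long)
# Pseudo-coordinates in [0, 1]^dim attached to each edge (e.g., relative spatial positions).
pseudo = torch.rand(edge_index.size(1), 2)

conv = SplineConv(in_channels=8, out_channels=16, dim=2, kernel_size=5)
out = conv(x, edge_index, pseudo)  # continuous B-spline kernels weight the messages: [4, 16]
```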