1 code implementation • 29 Nov 2024 • Yiwen Yuan, Zecheng Zhang, Xinwei He, Akihiro Nitta, Weihua Hu, Dong Wang, Manan Shah, Shenyang Huang, Blaž Stojanovič, Alan Krumholz, Jan Eric Lenssen, Jure Leskovec, Matthias Fey
Recommendation systems predominantly utilize two-tower architectures, which evaluate user-item rankings through the inner product of their respective embeddings.
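For concreteness, a minimal sketch of the two-tower scoring pattern the abstract refers to (illustrative dimensions and layer sizes, not the paper's model):

```python
import torch
import torch.nn as nn

class TwoTowerScorer(nn.Module):
    """Minimal two-tower model: user and item features are encoded
    independently, and relevance is the inner product of the embeddings."""

    def __init__(self, user_dim: int, item_dim: int, embed_dim: int = 64):
        super().__init__()
        self.user_tower = nn.Sequential(nn.Linear(user_dim, 128), nn.ReLU(), nn.Linear(128, embed_dim))
        self.item_tower = nn.Sequential(nn.Linear(item_dim, 128), nn.ReLU(), nn.Linear(128, embed_dim))

    def forward(self, user_feats, item_feats):
        u = self.user_tower(user_feats)  # (batch, embed_dim)
        v = self.item_tower(item_feats)  # (num_items, embed_dim)
        return u @ v.T                   # (batch, num_items) score matrix

scores = TwoTowerScorer(user_dim=32, item_dim=48)(torch.randn(4, 32), torch.randn(100, 48))
print(scores.shape)  # torch.Size([4, 100]); rank items per user by score
```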
no code implementations • 17 Jul 2024 • Shenyang Huang, Farimah Poursafaei, Reihaneh Rabbany, Guillaume Rabusseau, Emanuele Rossi
In this paper, we introduce Unified Temporal Graph (UTG), a framework that unifies snapshot-based and event-based machine learning models under a single umbrella, enabling models developed for one representation to be applied effectively to datasets of the other.
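The conversion underpinning such a unification can be illustrated with a short sketch (not the UTG code itself): bucketing an event stream of timestamped edges into fixed-width snapshots, so that snapshot-based models can consume event-based data:

```python
from collections import defaultdict

def events_to_snapshots(events, window):
    """Bucket timestamped edges (u, v, t) into consecutive snapshots of
    length `window` -- the basic conversion a snapshot-based model needs
    in order to consume an event-based dataset."""
    snapshots = defaultdict(set)
    for u, v, t in events:
        snapshots[t // window].add((u, v))
    return [snapshots[k] for k in sorted(snapshots)]

events = [(0, 1, 2), (1, 2, 5), (0, 2, 13), (2, 3, 14)]
print(events_to_snapshots(events, window=10))  # [{(0, 1), (1, 2)}, {(0, 2), (2, 3)}]
```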
3 code implementations • 14 Jun 2024 • Julia Gastinger, Shenyang Huang, Mikhail Galkin, Erfan Loghmani, Ali Parviz, Farimah Poursafaei, Jacob Danovitch, Emanuele Rossi, Ioannis Koutis, Heiner Stuckenschmidt, Reihaneh Rabbany, Guillaume Rabusseau
To address these challenges, we introduce Temporal Graph Benchmark 2.0 (TGB 2.0), a novel benchmarking framework tailored for evaluating methods for predicting future links on Temporal Knowledge Graphs and Temporal Heterogeneous Graphs with a focus on large-scale datasets, extending the Temporal Graph Benchmark.
1 code implementation • 14 Jun 2024 • Razieh Shirzadkhani, Tran Gia Bao Ngo, Kiarash Shamsi, Shenyang Huang, Farimah Poursafaei, Poupak Azad, Reihaneh Rabbany, Baris Coskunuzer, Guillaume Rabusseau, Cuneyt Gurcan Akcora
Next, we evaluate the transferability of Temporal Graph Neural Networks (TGNNs) for the temporal graph property prediction task by pre-training on a collection of up to sixty-four token transaction networks and then evaluating the downstream performance on twenty unseen token networks.
1 code implementation • 4 Jun 2024 • Katarina Petrović, Shenyang Huang, Farimah Poursafaei, Petar Veličković
Evolving relations in real-world networks are often modelled by temporal graphs.
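As a reminder of the underlying data structure, a temporal graph is simply a time-ordered collection of timestamped edges; a minimal illustrative representation (all names hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class TemporalGraph:
    """A temporal graph as a chronologically sorted list of timestamped edges."""
    edges: list = field(default_factory=list)  # (src, dst, timestamp) triples

    def add_edge(self, src, dst, t):
        self.edges.append((src, dst, t))
        self.edges.sort(key=lambda e: e[2])

    def neighbours_before(self, node, t):
        """Temporal neighbourhood: edges incident to `node` strictly before time t."""
        return [(u, v, s) for u, v, s in self.edges if s < t and node in (u, v)]

g = TemporalGraph()
g.add_edge(0, 1, t=3); g.add_edge(1, 2, t=1)
print(g.neighbours_before(1, t=4))  # [(1, 2, 1), (0, 1, 3)]
```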
no code implementations • 23 Apr 2024 • Kerstin Kläser, Błażej Banaszewski, Samuel Maddrell-Mander, Callum McLean, Luis Müller, Ali Parviz, Shenyang Huang, Andrew Fitzgibbon
In this work, we propose MiniMol, a foundational model for molecular learning with 10 million parameters.
3 code implementations • 6 Feb 2024 • Razieh Shirzadkhani, Shenyang Huang, Elahe Kooshafar, Reihaneh Rabbany, Farimah Poursafaei
Bridging this gap, we introduce TGX, a Python package specially designed for analysis of temporal networks that encompasses an automated pipeline for data loading, data processing, and analysis of evolving graphs.
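TGX's own API is not reproduced here; instead, a plain-Python sketch of one evolution statistic such an analysis pipeline typically reports, the fraction of novel edges per snapshot:

```python
def novelty_per_snapshot(snapshots):
    """For each snapshot, the fraction of edges never seen in earlier
    snapshots -- one of the evolution statistics a temporal-graph
    analysis pipeline like TGX automates."""
    seen, rates = set(), []
    for edges in snapshots:
        novel = [e for e in edges if e not in seen]
        rates.append(len(novel) / len(edges) if edges else 0.0)
        seen.update(edges)
    return rates

print(novelty_per_snapshot([{(0, 1), (1, 2)}, {(0, 1), (2, 3)}]))  # [1.0, 0.5]
```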
no code implementations • 2 Dec 2023 • Yashaswi Pupneja, Joseph Zou, Sacha Lévy, Shenyang Huang
In this work, we aim to understand how real world events influence the opinions of individuals towards climate change related topics on social media.
1 code implementation • 6 Oct 2023 • Dominique Beaini, Shenyang Huang, Joao Alex Cunha, Zhiyi Li, Gabriela Moisescu-Pareja, Oleksandr Dymov, Samuel Maddrell-Mander, Callum McLean, Frederik Wenkel, Luis Müller, Jama Hussein Mohamud, Ali Parviz, Michael Craig, Michał Koziarski, Jiarui Lu, Zhaocheng Zhu, Cristian Gabellini, Kerstin Klaser, Josef Dean, Cas Wognum, Maciej Sypetkowski, Guillaume Rabusseau, Reihaneh Rabbany, Jian Tang, Christopher Morris, Ioannis Koutis, Mirco Ravanelli, Guy Wolf, Prudencio Tossou, Hadrien Mary, Therence Bois, Andrew Fitzgibbon, Błażej Banaszewski, Chad Martin, Dominic Masters
Recently, pre-trained foundation models have enabled significant advancements in multiple fields.
1 code implementation • 15 Aug 2023 • Lekang Jiang, Caiqi Zhang, Farimah Poursafaei, Shenyang Huang
In this paper, we explore the application of GNNs to edge regression tasks in both static and dynamic settings, focusing on predicting food and agriculture trade values between nations.
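A common way to set up GNN-based edge regression is to decode pairs of node embeddings into a scalar; a minimal sketch (illustrative sizes, not the paper's architecture):

```python
import torch
import torch.nn as nn

class EdgeRegressor(nn.Module):
    """Edge-regression head: concatenate the two endpoint embeddings
    (e.g. produced by a GNN encoder) and map them to a scalar value."""

    def __init__(self, embed_dim: int):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * embed_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, z, edge_index):
        src, dst = edge_index  # (2, num_edges) node index pairs
        return self.mlp(torch.cat([z[src], z[dst]], dim=-1)).squeeze(-1)

z = torch.randn(10, 16)                      # embeddings for 10 nodes
edge_index = torch.tensor([[0, 1], [2, 3]])  # two edges: 0->2 and 1->3
pred = EdgeRegressor(16)(z, edge_index)      # train with MSE against trade values
print(pred.shape)  # torch.Size([2])
```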
5 code implementations • NeurIPS 2023 • Shenyang Huang, Farimah Poursafaei, Jacob Danovitch, Matthias Fey, Weihua Hu, Emanuele Rossi, Jure Leskovec, Michael Bronstein, Guillaume Rabusseau, Reihaneh Rabbany
We present the Temporal Graph Benchmark (TGB), a collection of challenging and diverse benchmark datasets for realistic, reproducible, and robust evaluation of machine learning models on temporal graphs.
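For link prediction, TGB ranks each true destination against sampled negative destinations and reports Mean Reciprocal Rank (MRR); a small NumPy sketch of that computation (the midpoint handling of ties is an illustrative choice, not necessarily TGB's exact convention):

```python
import numpy as np

def mrr(pos_scores, neg_scores):
    """Ranking-based link prediction metric: for each positive edge, rank
    its score against the scores of its sampled negative destinations and
    average the reciprocal ranks. Ties are resolved by the midpoint rank."""
    pos = np.asarray(pos_scores)[:, None]  # (n, 1)
    neg = np.asarray(neg_scores)           # (n, k) negatives per positive
    higher = (neg > pos).sum(axis=1)
    ties = (neg == pos).sum(axis=1)
    ranks = higher + 0.5 * ties + 1.0
    return float((1.0 / ranks).mean())

print(mrr([0.9, 0.2], [[0.1, 0.5, 0.3], [0.4, 0.1, 0.8]]))  # (1 + 1/3) / 2 ~ 0.667
```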
2 code implementations • 15 May 2023 • Shenyang Huang, Jacob Danovitch, Guillaume Rabusseau, Reihaneh Rabbany
Current solutions do not scale well to large real-world graphs, lack robustness to large amounts of node additions/deletions, and overlook changes in node attributes.
1 code implementation • 6 Feb 2023 • Dominic Masters, Josef Dean, Kerstin Klaser, Zhiyi Li, Sam Maddrell-Mander, Adam Sanders, Hatem Helal, Deniz Beker, Andrew Fitzgibbon, Shenyang Huang, Ladislav Rampášek, Dominique Beaini
We present GPS++, a hybrid Message Passing Neural Network / Graph Transformer model for molecular property prediction.
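The hybrid idea can be sketched as a layer that runs a local message-passing update and a global self-attention update in parallel and merges them; the following is a simplified illustration, not the GPS++ implementation:

```python
import torch
import torch.nn as nn

class HybridLayer(nn.Module):
    """One GPS-style hybrid block (a sketch): a local message-passing
    update and a global self-attention update over all nodes are
    computed in parallel and summed into the residual stream."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.local = nn.Linear(dim, dim)  # stand-in for a full MPNN layer
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x, adj):
        # Local: mean-aggregate neighbour features via the dense adjacency.
        deg = adj.sum(-1, keepdim=True).clamp(min=1)
        local = torch.relu(self.local(adj @ x / deg))
        # Global: all-pairs self-attention, independent of graph structure.
        global_, _ = self.attn(x[None], x[None], x[None])
        return self.norm(x + local + global_[0])

x, adj = torch.randn(6, 32), torch.eye(6)
print(HybridLayer(32)(x, adj).shape)  # torch.Size([6, 32])
```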
2 code implementations • 2 Feb 2023 • Shenyang Huang, Samy Coulombe, Yasmeen Hitti, Reihaneh Rabbany, Guillaume Rabusseau
In this paper, we focus on change point detection in dynamic graphs and address three main challenges: i) how to compare graph snapshots across time, ii) how to capture temporal dependencies, and iii) how to combine different views of a temporal graph.
1 code implementation • 20 Jul 2022 • Farimah Poursafaei, Shenyang Huang, Kellin Pelrine, Reihaneh Rabbany
To evaluate against more difficult negative edges, we introduce two more challenging negative sampling strategies that improve robustness and better match real-world applications.
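One of these strategies, historical negative sampling, draws negatives from edges that appeared earlier in the stream but are absent at the current step; a minimal sketch of the idea (function names are hypothetical):

```python
import random

def historical_negatives(train_edges, current_edges, k):
    """Historical negative sampling (a sketch): negatives are pairs that
    occurred earlier in the edge stream but are absent at the current
    time step -- much harder for memorization-based baselines than
    uniformly random node pairs."""
    pool = list(set(train_edges) - set(current_edges))
    return random.sample(pool, min(k, len(pool)))

train = [(0, 1), (1, 2), (2, 3), (0, 3)]
now = [(0, 1)]
print(historical_negatives(train, now, k=2))  # e.g. [(2, 3), (1, 2)]
```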
1 code implementation • 2 Jul 2020 • Shenyang Huang, Yasmeen Hitti, Guillaume Rabusseau, Reihaneh Rabbany
To solve the above challenges, we propose Laplacian Anomaly Detection (LAD) which uses the spectrum of the Laplacian matrix of the graph structure at each snapshot to obtain low dimensional embeddings.
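A simplified sketch of the LAD recipe described above: embed each snapshot by the leading singular values of its Laplacian, then score a snapshot by its deviation from a short context window (the window-based scoring here is an illustrative simplification of the paper's procedure):

```python
import numpy as np

def snapshot_embedding(adj, k=6):
    """LAD-style snapshot embedding (simplified): the k largest singular
    values of the graph Laplacian, normalized to unit length."""
    lap = np.diag(adj.sum(axis=1)) - adj
    sigma = np.linalg.svd(lap, compute_uv=False)[:k]
    return sigma / (np.linalg.norm(sigma) + 1e-12)

def anomaly_score(embeddings, window=3):
    """Score each snapshot by the cosine distance between its embedding
    and the mean embedding of the preceding `window` snapshots."""
    scores = [0.0] * min(window, len(embeddings))
    for i in range(window, len(embeddings)):
        context = np.mean(embeddings[i - window:i], axis=0)
        context /= np.linalg.norm(context) + 1e-12
        scores.append(1.0 - float(embeddings[i] @ context))
    return scores

snaps = [(np.random.rand(20, 20) < 0.2).astype(float) for _ in range(8)]
embs = [snapshot_embedding(a) for a in snaps]
print(anomaly_score(embs))  # spikes indicate anomalous snapshots
```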
no code implementations • 2 Mar 2020 • Stefano Alletto, Shenyang Huang, Vincent Francois-Lavet, Yohei Nakata, Guillaume Rabusseau
Almost all neural architecture search methods are evaluated in terms of the performance (i.e., test accuracy) of the model structures that they find.
no code implementations • 14 Sep 2019 • Shenyang Huang, Vincent François-Lavet, Guillaume Rabusseau
To understand how to expand a continual learner, we focus on the neural architecture design problem in the context of class-incremental learning: at each time step, the learner must optimize its performance on all classes observed so far by selecting the most competitive neural architecture.