Search Results for author: Shenyang Huang

Found 18 papers, 13 papers with code

ContextGNN: Beyond Two-Tower Recommendation Systems

1 code implementation29 Nov 2024 Yiwen Yuan, Zecheng Zhang, Xinwei He, Akihiro Nitta, Weihua Hu, Dong Wang, Manan Shah, Shenyang Huang, Blaž Stojanovič, Alan Krumholz, Jan Eric Lenssen, Jure Leskovec, Matthias Fey

Recommendation systems predominantly utilize two-tower architectures, which evaluate user-item rankings through the inner product of their respective embeddings.
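A minimal sketch of the two-tower scoring described in that sentence: users and items are embedded by separate towers and ranked by the inner product of their embeddings. This is an illustration, not the ContextGNN code; class names and dimensions are placeholders.

```python
import torch
import torch.nn as nn

class TwoTowerScorer(nn.Module):
    """Minimal two-tower recommender: separate user/item encoders, inner-product scoring."""

    def __init__(self, num_users: int, num_items: int, dim: int = 64):
        super().__init__()
        # In practice each tower is a deep encoder over features; plain embeddings suffice here.
        self.user_tower = nn.Embedding(num_users, dim)
        self.item_tower = nn.Embedding(num_items, dim)

    def forward(self, user_ids: torch.Tensor) -> torch.Tensor:
        user_emb = self.user_tower(user_ids)   # (batch, dim)
        item_emb = self.item_tower.weight      # (num_items, dim)
        return user_emb @ item_emb.T           # (batch, num_items) ranking scores

# Example: top-5 items for two users
scorer = TwoTowerScorer(num_users=100, num_items=500)
top5 = scorer(torch.tensor([0, 1])).topk(5, dim=-1).indices
```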

Link Prediction Recommendation Systems

UTG: Towards a Unified View of Snapshot and Event Based Models for Temporal Graphs

no code implementations17 Jul 2024 Shenyang Huang, Farimah Poursafaei, Reihaneh Rabbany, Guillaume Rabusseau, Emanuele Rossi

In this paper, we introduce Unified Temporal Graph (UTG), a framework that unifies snapshot-based and event-based machine learning models under a single umbrella, enabling models developed for one representation to be applied effectively to datasets of the other.
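The two temporal-graph representations UTG bridges can be illustrated with a short, self-contained sketch (not the UTG code): an event-based stream is a list of timestamped edges, and a snapshot-based view buckets those events into discrete time windows.

```python
from collections import defaultdict

# Event-based representation: a stream of (source, destination, timestamp) edges.
events = [(0, 1, 3), (1, 2, 5), (0, 2, 12), (2, 3, 14), (0, 1, 27)]

def events_to_snapshots(events, window: int):
    """Bucket a timestamped edge stream into discrete snapshots of length `window`."""
    snapshots = defaultdict(list)
    for u, v, t in events:
        snapshots[t // window].append((u, v))
    return dict(snapshots)

# Snapshot-based representation: one edge list per discrete time step.
print(events_to_snapshots(events, window=10))
# {0: [(0, 1), (1, 2)], 1: [(0, 2), (2, 3)], 2: [(0, 1)]}
```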

Link Prediction

TGB 2.0: A Benchmark for Learning on Temporal Knowledge Graphs and Heterogeneous Graphs

3 code implementations14 Jun 2024 Julia Gastinger, Shenyang Huang, Mikhail Galkin, Erfan Loghmani, Ali Parviz, Farimah Poursafaei, Jacob Danovitch, Emanuele Rossi, Ioannis Koutis, Heiner Stuckenschmidt, Reihaneh Rabbany, Guillaume Rabusseau

To address these challenges, we introduce Temporal Graph Benchmark 2.0 (TGB 2.0), a novel benchmarking framework tailored for evaluating methods for predicting future links on Temporal Knowledge Graphs and Temporal Heterogeneous Graphs with a focus on large-scale datasets, extending the Temporal Graph Benchmark.

Benchmarking Knowledge Graphs

Towards Neural Scaling Laws for Foundation Models on Temporal Graphs

1 code implementation14 Jun 2024 Razieh Shirzadkhani, Tran Gia Bao Ngo, Kiarash Shamsi, Shenyang Huang, Farimah Poursafaei, Poupak Azad, Reihaneh Rabbany, Baris Coskunuzer, Guillaume Rabusseau, Cuneyt Gurcan Akcora

Next, we evaluate the transferability of Temporal Graph Neural Networks (TGNNs) for the temporal graph property prediction task by pre-training on a collection of up to sixty-four token transaction networks and then evaluating the downstream performance on twenty unseen token networks.

Graph Learning Graph Property Prediction +1

Temporal Graph Rewiring with Expander Graphs

1 code implementation4 Jun 2024 Katarina Petrović, Shenyang Huang, Farimah Poursafaei, Petar Veličković

Evolving relations in real-world networks are often modelled by temporal graphs.

Temporal Graph Analysis with TGX

3 code implementations6 Feb 2024 Razieh Shirzadkhani, Shenyang Huang, Elahe Kooshafar, Reihaneh Rabbany, Farimah Poursafaei

Bridging this gap, we introduce TGX, a Python package specially designed for the analysis of temporal networks, encompassing an automated pipeline for data loading, data processing, and analysis of evolving graphs.
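The pipeline such a package automates (loading an edge list, discretizing it into snapshots, and computing per-snapshot statistics) can be sketched in plain Python; this illustrates the steps, not the TGX API itself, and the function names are hypothetical.

```python
import csv
from collections import defaultdict

def load_edgelist(path):
    """Read (source, destination, timestamp) rows from a CSV edge list."""
    with open(path) as f:
        return [(int(u), int(v), int(t)) for u, v, t in csv.reader(f)]

def per_snapshot_stats(events, window):
    """Discretize the stream and compute simple statistics for each snapshot."""
    buckets = defaultdict(list)
    for u, v, t in events:
        buckets[t // window].append((u, v))
    seen, stats = set(), {}
    for snap in sorted(buckets):
        edges = buckets[snap]
        novel = sum(1 for e in edges if e not in seen)  # edges never observed before
        seen.update(edges)
        nodes = {n for e in edges for n in e}
        stats[snap] = {"nodes": len(nodes), "edges": len(edges), "novel_edges": novel}
    return stats
```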

Understanding Opinions Towards Climate Change on Social Media

no code implementations2 Dec 2023 Yashaswi Pupneja, Joseph Zou, Sacha Lévy, Shenyang Huang

In this work, we aim to understand how real world events influence the opinions of individuals towards climate change related topics on social media.

Community Detection Misinformation +1

Towards Temporal Edge Regression: A Case Study on Agriculture Trade Between Nations

1 code implementation15 Aug 2023 Lekang Jiang, Caiqi Zhang, Farimah Poursafaei, Shenyang Huang

In this paper, we explore the application of GNNs to edge regression tasks in both static and dynamic settings, focusing on predicting food and agriculture trade values between nations.
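A minimal sketch of the kind of edge-regression head such GNN pipelines use: node embeddings from any encoder are combined per edge and mapped to a scalar value (e.g., a trade volume). The encoder is left abstract and all names here are placeholders, not the paper's released code.

```python
import torch
import torch.nn as nn

class EdgeRegressor(nn.Module):
    """Predict a scalar value for an edge from the embeddings of its endpoints."""

    def __init__(self, dim: int):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, node_emb: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        src, dst = edge_index  # (2, num_edges) source/destination node ids
        pair = torch.cat([node_emb[src], node_emb[dst]], dim=-1)
        return self.mlp(pair).squeeze(-1)

# Training uses a regression loss such as MSE on the predicted edge values.
node_emb = torch.randn(10, 32)                    # embeddings from any (temporal) GNN encoder
edge_index = torch.tensor([[0, 1, 2], [3, 4, 5]])
pred = EdgeRegressor(32)(node_emb, edge_index)
loss = nn.functional.mse_loss(pred, torch.rand(3))
```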

Graph Regression Link Prediction +2

Temporal Graph Benchmark for Machine Learning on Temporal Graphs

5 code implementations NeurIPS 2023 Shenyang Huang, Farimah Poursafaei, Jacob Danovitch, Matthias Fey, Weihua Hu, Emanuele Rossi, Jure Leskovec, Michael Bronstein, Guillaume Rabusseau, Reihaneh Rabbany

We present the Temporal Graph Benchmark (TGB), a collection of challenging and diverse benchmark datasets for realistic, reproducible, and robust evaluation of machine learning models on temporal graphs.
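TGB is distributed as the py-tgb package; a hedged loading sketch is below, assuming the LinkPropPredDataset loader and field names as they appear in the TGB documentation (treat the exact signatures and keys as assumptions, not a verified API reference).

```python
# pip install py-tgb  (package name per the TGB project; exact API details are assumptions)
from tgb.linkproppred.dataset import LinkPropPredDataset

# Download and preprocess one of the dynamic link property prediction datasets.
dataset = LinkPropPredDataset(name="tgbl-wiki", root="datasets", preprocess=True)

data = dataset.full_data          # assumed dict of arrays: sources, destinations, timestamps, ...
train_mask = dataset.train_mask   # chronological train/val/test splits are provided
print(data["sources"].shape, data["timestamps"].min(), data["timestamps"].max())
```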

Node Property Prediction Property Prediction

Fast and Attributed Change Detection on Dynamic Graphs with Density of States

2 code implementations15 May 2023 Shenyang Huang, Jacob Danovitch, Guillaume Rabusseau, Reihaneh Rabbany

Current solutions do not scale well to large real-world graphs, lack robustness to large amounts of node additions/deletions, and overlook changes in node attributes.

Change Detection Change Point Detection

Towards Better Evaluation for Dynamic Link Prediction

1 code implementation20 Jul 2022 Farimah Poursafaei, Shenyang Huang, Kellin Pelrine, Reihaneh Rabbany

To evaluate against more difficult negative edges, we introduce two more challenging negative sampling strategies that improve robustness and better match real-world applications.
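One of these strategies, historical negative sampling, can be sketched generically: rather than drawing negatives uniformly at random, sample from edges that appeared earlier in the stream but are absent at the current time step. This is an illustrative sketch, not the authors' released evaluation code.

```python
import random

def historical_negatives(past_edges, current_edges, k=5):
    """Sample up to k negative edges that occurred earlier but are absent at the current step.

    past_edges / current_edges: sets of (source, destination) pairs.
    """
    candidates = list(past_edges - current_edges)
    return random.sample(candidates, min(k, len(candidates)))

past = {(0, 1), (0, 2), (3, 4)}
now = {(0, 1)}
print(historical_negatives(past, now, k=2))  # e.g. [(0, 2), (3, 4)]
```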

Dynamic Link Prediction Memorization +1

Laplacian Change Point Detection for Dynamic Graphs

1 code implementation2 Jul 2020 Shenyang Huang, Yasmeen Hitti, Guillaume Rabusseau, Reihaneh Rabbany

To solve the above challenges, we propose Laplacian Anomaly Detection (LAD) which uses the spectrum of the Laplacian matrix of the graph structure at each snapshot to obtain low dimensional embeddings.
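The embedding step described in that sentence can be sketched with NumPy, assuming undirected snapshots given as adjacency matrices: each snapshot is summarized by the largest eigenvalues of its Laplacian, and consecutive snapshots that stop agreeing are flagged. The actual method's sliding-window comparison is more involved; this is only a sketch of the idea.

```python
import numpy as np

def laplacian_spectrum_embedding(adj: np.ndarray, k: int) -> np.ndarray:
    """Embed one snapshot as its k largest Laplacian eigenvalues (L = D - A), normalized."""
    lap = np.diag(adj.sum(axis=1)) - adj
    eigvals = np.linalg.eigvalsh(lap)      # ascending eigenvalues of the symmetric Laplacian
    vec = eigvals[-k:]
    return vec / (np.linalg.norm(vec) + 1e-12)

def anomaly_scores(snapshots, k=6):
    """Score each snapshot by 1 - cosine similarity to the previous snapshot's embedding."""
    embs = [laplacian_spectrum_embedding(a, k) for a in snapshots]
    return [0.0] + [1.0 - float(np.dot(embs[i - 1], embs[i])) for i in range(1, len(embs))]
```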

Anomaly Detection Change Point Detection

RandomNet: Towards Fully Automatic Neural Architecture Design for Multimodal Learning

no code implementations2 Mar 2020 Stefano Alletto, Shenyang Huang, Vincent François-Lavet, Yohei Nakata, Guillaume Rabusseau

Almost all neural architecture search methods are evaluated in terms of the performance (i.e., test accuracy) of the model structures they find.

Neural Architecture Search

Neural Architecture Search for Class-incremental Learning

no code implementations14 Sep 2019 Shenyang Huang, Vincent François-Lavet, Guillaume Rabusseau

To understand how to expand a continual learner, we focus on the neural architecture design problem in the context of class-incremental learning: at each time step, the learner must optimize its performance on all classes observed so far by selecting the most competitive neural architecture.

class-incremental learning Class Incremental Learning +3
