Search Results for author: William Shiao

Found 8 papers, 3 papers with code

How Does Message Passing Improve Collaborative Filtering?

no code implementations27 Mar 2024 Mingxuan Ju, William Shiao, Zhichun Guo, Yanfang Ye, Yozen Liu, Neil Shah, Tong Zhao

A branch of research enhances CF methods with the message passing used in graph neural networks, owing to its strong ability to extract knowledge from graph-structured data, such as the user-item bipartite graphs that naturally arise in CF.

Collaborative Filtering Recommendation Systems +1
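The abstract above refers to message passing over user-item bipartite graphs. A minimal sketch of one LightGCN-style propagation step is below; the toy interaction matrix, embedding sizes, and variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy user-item bipartite graph: 2 users, 3 items.
# R[u, i] = 1 if user u interacted with item i (hypothetical data).
R = np.array([[1, 1, 0],
              [0, 1, 1]], dtype=float)

d = 4  # embedding dimension (illustrative)
rng = np.random.default_rng(0)
user_emb = rng.normal(size=(2, d))
item_emb = rng.normal(size=(3, d))

# Symmetric degree normalization, as in LightGCN: 1 / sqrt(deg_u * deg_i).
deg_u = R.sum(axis=1, keepdims=True)   # user degrees, shape (2, 1)
deg_i = R.sum(axis=0, keepdims=True)   # item degrees, shape (1, 3)
norm = R / np.sqrt(deg_u * deg_i)

# One message-passing step: each node aggregates its neighbors' embeddings.
new_user_emb = norm @ item_emb    # users pull from interacted items
new_item_emb = norm.T @ user_emb  # items pull from interacting users

# Recommendation scores are inner products of the propagated embeddings.
scores = new_user_emb @ new_item_emb.T  # shape (2, 3)
```

This illustrates why message passing helps CF: a user's representation absorbs signal from the items it touched, and transitively from other users who touched those items.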

Improving Out-of-Vocabulary Handling in Recommendation Systems

no code implementations27 Mar 2024 William Shiao, Mingxuan Ju, Zhichun Guo, Xin Chen, Evangelos Papalexakis, Tong Zhao, Neil Shah, Yozen Liu

This work focuses on a complementary problem: making recommendations for new users and items that are unseen (out-of-vocabulary, or OOV) at training time.

Recommendation Systems

GPT-generated Text Detection: Benchmark Dataset and Tensor-based Detection Method

1 code implementation12 Mar 2024 Zubair Qazi, William Shiao, Evangelos E. Papalexakis

As natural language models like ChatGPT become increasingly prevalent in applications and services, the need for robust and accurate methods to detect their output is of paramount importance.

Text Detection

Node Duplication Improves Cold-start Link Prediction

no code implementations15 Feb 2024 Zhichun Guo, Tong Zhao, Yozen Liu, Kaiwen Dong, William Shiao, Neil Shah, Nitesh V. Chawla

Graph Neural Networks (GNNs) are prominent in graph machine learning and have shown state-of-the-art performance in Link Prediction (LP) tasks.

Link Prediction Recommendation Systems

CARL-G: Clustering-Accelerated Representation Learning on Graphs

no code implementations12 Jun 2023 William Shiao, Uday Singh Saini, Yozen Liu, Tong Zhao, Neil Shah, Evangelos E. Papalexakis

CARL-G is adaptable to different clustering methods and CVIs, and we show that with the right choice of clustering method and CVI, CARL-G outperforms node classification baselines on 4/5 datasets with up to a 79x training speedup compared to the best-performing baseline.

Clustering Contrastive Learning +4
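The CARL-G abstract above mentions guiding representation learning with a cluster validity index (CVI) rather than a contrastive loss. A hedged sketch of the core signal is below: the CVI here is a simplified Calinski-Harabasz-style dispersion ratio on toy embeddings, chosen for illustration; the paper itself evaluates several clustering methods and CVIs.

```python
import numpy as np

def simple_cvi(emb, labels):
    """Between-cluster vs. within-cluster dispersion (higher is better).

    A simplified Calinski-Harabasz-style cluster validity index,
    used here only to illustrate the kind of signal a CVI provides.
    """
    overall_mean = emb.mean(axis=0)
    between, within = 0.0, 0.0
    for k in np.unique(labels):
        cluster = emb[labels == k]
        centroid = cluster.mean(axis=0)
        between += len(cluster) * np.sum((centroid - overall_mean) ** 2)
        within += np.sum((cluster - centroid) ** 2)
    return between / max(within, 1e-12)

rng = np.random.default_rng(1)
# Two well-separated blobs of toy "node embeddings" (hypothetical data).
emb = np.vstack([rng.normal(0.0, 0.1, size=(10, 2)),
                 rng.normal(5.0, 0.1, size=(10, 2))])

good = simple_cvi(emb, np.array([0] * 10 + [1] * 10))  # matches the blobs
bad = simple_cvi(emb, np.tile([0, 1], 10))             # mixes the blobs
```

Training representations to maximize such a CVI rewards embeddings that form tight, well-separated clusters, without the pairwise comparisons a contrastive loss requires.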

Link Prediction with Non-Contrastive Learning

1 code implementation25 Nov 2022 William Shiao, Zhichun Guo, Tong Zhao, Evangelos E. Papalexakis, Yozen Liu, Neil Shah

In this work, we extensively evaluate the performance of existing non-contrastive methods for link prediction in both transductive and inductive settings.

Contrastive Learning Link Prediction +2

Linkless Link Prediction via Relational Distillation

no code implementations11 Oct 2022 Zhichun Guo, William Shiao, Shichang Zhang, Yozen Liu, Nitesh V. Chawla, Neil Shah, Tong Zhao

In this work, to combine the advantages of GNNs and MLPs, we begin by exploring direct knowledge distillation (KD) methods for link prediction, i.e., predicted logit-based matching and node representation-based matching.

Knowledge Distillation Link Prediction +1
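The abstract above names logit-based matching as one distillation route. A minimal sketch of the idea is below, with the "teacher" and "student" reduced to toy logit vectors and a plain MSE matching loss; these simplifications are assumptions for illustration, not the paper's actual training setup.

```python
import numpy as np

rng = np.random.default_rng(3)
n_edges = 8

# Pretend these are a teacher GNN's logits for 8 candidate links.
teacher_logits = rng.normal(size=n_edges)

# The student (e.g., an MLP over node features) starts with different logits.
student_logits = rng.normal(size=n_edges)

def matching_loss(student, teacher):
    """Mean squared error between student and teacher link logits."""
    return np.mean((student - teacher) ** 2)

# Gradient descent directly on the student's logits, to show the
# matching objective pulls the student toward the teacher.
lr = 0.1
before = matching_loss(student_logits, teacher_logits)
for _ in range(100):
    grad = 2 * (student_logits - teacher_logits) / n_edges
    student_logits = student_logits - lr * grad
after = matching_loss(student_logits, teacher_logits)
```

In a real setup the gradient would flow into the student MLP's weights rather than the logits themselves, but the objective, matching the teacher's predicted logits per candidate link, is the same.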

FRAPPE: $\underline{\text{F}}$ast $\underline{\text{Ra}}$nk $\underline{\text{App}}$roximation with $\underline{\text{E}}$xplainable Features for Tensors

1 code implementation19 Jun 2022 William Shiao, Evangelos E. Papalexakis

In this work, we propose FRAPPE and Self-FRAPPE: a cheaply supervised and a self-supervised method to estimate the canonical rank of a tensor without ever having to compute the CPD.
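For readers unfamiliar with the term, the canonical (CP) rank that FRAPPE estimates is the smallest number of rank-one components whose sum reconstructs the tensor. The sketch below builds a 3-way tensor of known CP rank from random factor matrices; all shapes and names are illustrative, and note that FRAPPE's point is to estimate this rank without computing the CPD.

```python
import numpy as np

R = 2  # canonical rank of the constructed tensor (by construction)
rng = np.random.default_rng(2)
A = rng.normal(size=(4, R))  # factor matrix for mode 1
B = rng.normal(size=(5, R))  # factor matrix for mode 2
C = rng.normal(size=(6, R))  # factor matrix for mode 3

# T = sum_r a_r (outer) b_r (outer) c_r: a sum of R rank-one tensors.
T = np.einsum('ir,jr,kr->ijk', A, B, C)  # shape (4, 5, 6)
```

Estimating R from T alone normally means fitting CP decompositions at many candidate ranks and comparing reconstruction errors, which is the expensive step FRAPPE and Self-FRAPPE avoid.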
