Search Results for author: Shang-Ling Hsu

Found 6 papers, 2 papers with code

Forecasting Unseen Points of Interest Visits Using Context and Proximity Priors

No code implementations • 22 Nov 2024 • Ziyao Li, Shang-Ling Hsu, Cyrus Shahabi

To address this challenge, we propose a model designed to predict a new POI outside the training data, as long as its context aligns with the user's interests.

TrajGPT: Controlled Synthetic Trajectory Generation Using a Multitask Transformer-Based Spatiotemporal Model

1 code implementation • 7 Nov 2024 • Shang-Ling Hsu, Emmanuel Tung, John Krumm, Cyrus Shahabi, Khurram Shafique

TrajGPT integrates the spatial and temporal models in a transformer architecture through a Bayesian probability model that ensures that the gaps in a visit sequence are filled in a spatiotemporally consistent manner.

Tasks: Epidemiology, Text Infilling
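The abstract above describes filling gaps in a visit sequence so that inserted visits are spatiotemporally consistent with the visits on either side. The sketch below illustrates that consistency constraint only: a candidate visit is kept when travel times from the previous visit and to the next visit leave a feasible stay. All names (`Visit`, `fill_gap`, the 15-minute minimum stay) are hypothetical and this is not TrajGPT's actual Bayesian sampling procedure.

```python
from dataclasses import dataclass
import random

@dataclass
class Visit:
    poi: str
    arrival: float    # hours since trajectory start
    departure: float

def fill_gap(prev, nxt, candidate_pois, travel_time, rng=None):
    """Sample one candidate visit for the gap between two known visits and
    keep it only if it is spatiotemporally feasible (illustrative check,
    not the paper's method)."""
    rng = rng or random.Random(0)
    poi = rng.choice(candidate_pois)
    arrive = prev.departure + travel_time(prev.poi, poi)
    depart = nxt.arrival - travel_time(poi, nxt.poi)
    if depart - arrive >= 0.25:   # require at least a 15-minute stay
        return Visit(poi, arrive, depart)
    return None                   # infeasible; in practice, resample

# Hypothetical usage: a gap between leaving home and arriving at the office.
home = Visit("home", 0.0, 8.0)
office = Visit("office", 9.5, 17.0)
filled = fill_gap(home, office, ["cafe", "gym"], lambda a, b: 0.25)
```

In a generative model this feasibility test would gate sampled visits, so every filled-in visit respects travel time to both neighbors.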

Fake News Detection with Heterogeneous Transformer

1 code implementation • 6 May 2022 • Tianle Li, Yushi Sun, Shang-Ling Hsu, Yanjia Li, Raymond Chi-Wing Wong

The heterogeneity in both news content and its relationships with other entities in social networks makes it challenging to design a model that captures both the local multi-modal semantics of entities and the global structural representation of propagation patterns, so as to classify fake news effectively and accurately.

Tasks: Decoder, Fake News Detection

Temporal Relation Extraction with a Graph-Based Deep Biaffine Attention Model

No code implementations • 16 Jan 2022 • Bo-Ying Su, Shang-Ling Hsu, Kuan-Yin Lai, Amarnath Gupta

Moreover, our architecture uses Multilayer Perceptrons (MLPs) with biaffine attention to predict arcs and relation labels separately, improving relation detection accuracy by exploiting the two-sided nature of temporal relationships.

Tasks: Natural Language Understanding, Relation, +2
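Biaffine attention, as described in the abstract above, scores an arc from token i to token j with a bilinear term plus linear terms over head- and dependent-specific MLP outputs. Below is a minimal NumPy sketch of that scoring function; the dimensions and parameter names are illustrative, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: n tokens, d-dimensional MLP outputs.
n, d = 5, 8

# Separate MLP heads would produce head/dependent representations per token;
# random values stand in for them here.
H_head = rng.standard_normal((n, d))   # h_i (as head)
H_dep = rng.standard_normal((n, d))    # h_j (as dependent)

# Biaffine parameters: bilinear weight U, linear weight W, scalar bias b.
U = rng.standard_normal((d, d))
W = rng.standard_normal(2 * d)
b = 0.0

# Biaffine score for an arc i -> j:
#   s_ij = h_i^T U h_j + W [h_i; h_j] + b
scores = (
    H_head @ U @ H_dep.T
    + (H_head @ W[:d])[:, None]
    + (H_dep @ W[d:])[None, :]
    + b
)
```

The same form, with a label-sized weight tensor, scores relation labels for each arc; scoring arcs and labels with separate biaffine modules is what lets the model treat the two sides of a temporal relation asymmetrically.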
