Transferable Graph Structure Learning for Graph-based Traffic Forecasting Across Cities

KDD 2023  ·  Yilun Jin, Kai Chen, Qiang Yang

Graph-based deep learning models are powerful at modeling spatiotemporal graphs for traffic forecasting. In practice, accurate forecasting models rely on sufficient traffic data, which may not be available in real-world applications. To address this problem, transfer learning methods transfer knowledge from a source graph with abundant data to a target graph with limited data. However, existing methods adopt pre-defined graph structures for knowledge extraction and transfer, which may be noisy or biased and thus degrade the quality of knowledge transfer. To this end, we propose TransGTR, a transferable structure learning framework for traffic forecasting that jointly learns and transfers the graph structures and forecasting models across cities. TransGTR consists of a node feature network, a structure generator, and a forecasting model. We train the node feature network with knowledge distillation to extract city-agnostic node features, such that the structure generator, which takes the node features as inputs, can be transferred across cities. Furthermore, we train the structure generator with a temporal decoupled regularization, such that the spatial features learned with the generated graphs share similar distributions across cities and thus facilitate knowledge transfer for the forecasting model. We evaluate TransGTR on real-world traffic speed datasets, where, under a fair comparison, TransGTR outperforms state-of-the-art baselines by up to 5.4%.
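To make the described architecture concrete, below is a minimal PyTorch-style sketch of the two components the abstract highlights: a structure generator that maps city-agnostic node features to a learned soft adjacency matrix, and a knowledge-distillation loss for the node feature network. The class names, the similarity-based generator design, and the L2 distillation objective are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StructureGenerator(nn.Module):
    """Hypothetical sketch: produces a dense, row-normalized adjacency
    matrix from node features via pairwise similarity. The actual
    generator in TransGTR may differ."""

    def __init__(self, feat_dim: int, hidden_dim: int):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, node_feats: torch.Tensor) -> torch.Tensor:
        # node_feats: (num_nodes, feat_dim), assumed city-agnostic
        h = self.proj(node_feats)                # (num_nodes, hidden_dim)
        logits = h @ h.t()                       # pairwise similarity scores
        adj = torch.softmax(logits, dim=-1)      # soft adjacency, rows sum to 1
        return adj


def distillation_loss(student_feats: torch.Tensor,
                      teacher_feats: torch.Tensor) -> torch.Tensor:
    """Simple L2 distillation objective: match the target-city node feature
    network (student) to a teacher trained on the data-rich source city.
    The paper's exact distillation loss may differ."""
    return F.mse_loss(student_feats, teacher_feats)


if __name__ == "__main__":
    # Toy usage with made-up dimensions.
    num_nodes, feat_dim = 8, 16
    feats = torch.randn(num_nodes, feat_dim)
    gen = StructureGenerator(feat_dim, hidden_dim=32)
    adj = gen(feats)
    print(adj.shape)  # torch.Size([8, 8])
```

The generated adjacency would then feed a downstream graph-based forecasting model; because the generator consumes only the distilled, city-agnostic features, the same module can in principle be reused on the target city with little data.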
