Strategies for Pre-training Graph Neural Networks

ICLR 2020
Weihua Hu*, Bowen Liu*, Joseph Gomes, Marinka Zitnik, Percy Liang, Vijay Pande, Jure Leskovec

Many applications of machine learning require a model to make accurate predictions on test examples that are distributionally different from training ones, while task-specific labels are scarce during training. An effective approach to this challenge is to pre-train a model on related tasks where data is abundant, and then fine-tune it on a downstream task of interest.
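The abstract describes the generic pre-train-then-fine-tune recipe: train an encoder on an auxiliary task with abundant labels, then reuse its weights and fit a fresh prediction head on the label-scarce downstream task. Below is a minimal, self-contained PyTorch sketch of that two-stage workflow on toy graphs. The layer, encoder, helper names, and random data are all hypothetical illustrations of the recipe, not the paper's actual architecture or pre-training tasks.

```python
# Minimal sketch (assumptions labeled): generic pre-train -> fine-tune
# workflow for a graph encoder. Everything here is illustrative, not
# the paper's method.
import torch
import torch.nn as nn

class TinyGNNLayer(nn.Module):
    """One round of mean-neighbor message passing (illustrative only)."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(2 * dim, dim)

    def forward(self, x, adj):
        # adj: dense (N, N) adjacency; mean-aggregate neighbor features.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neigh = adj @ x / deg
        return torch.relu(self.lin(torch.cat([x, neigh], dim=1)))

class Encoder(nn.Module):
    """Stack of message-passing layers followed by mean pooling."""
    def __init__(self, in_dim, hid_dim, layers=2):
        super().__init__()
        self.embed = nn.Linear(in_dim, hid_dim)
        self.convs = nn.ModuleList(TinyGNNLayer(hid_dim) for _ in range(layers))

    def forward(self, x, adj):
        h = self.embed(x)
        for conv in self.convs:
            h = conv(h, adj)
        return h.mean(dim=0)  # pool node embeddings into a graph embedding

def run_stage(encoder, head, data, epochs, lr, freeze_encoder=False):
    """Train one stage; optionally freeze the encoder (linear probe)."""
    params = head.parameters() if freeze_encoder else \
        [*encoder.parameters(), *head.parameters()]
    opt = torch.optim.Adam(params, lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        for x, adj, y in data:
            opt.zero_grad()
            loss = loss_fn(head(encoder(x, adj)), y)
            loss.backward()
            opt.step()

def toy_data(n_graphs, n_nodes=8, in_dim=4):
    """Random graphs standing in for real pre-training/downstream sets."""
    out = []
    for _ in range(n_graphs):
        x = torch.randn(n_nodes, in_dim)
        adj = (torch.rand(n_nodes, n_nodes) < 0.3).float()
        out.append((x, adj, torch.rand(1)))  # fake binary-ish label
    return out

encoder = Encoder(in_dim=4, hid_dim=16)
pretrain_head = nn.Linear(16, 1)
run_stage(encoder, pretrain_head, toy_data(64), epochs=3, lr=1e-3)   # pre-train: abundant data
downstream_head = nn.Linear(16, 1)                                    # fresh head for new task
run_stage(encoder, downstream_head, toy_data(8), epochs=3, lr=1e-4)  # fine-tune: scarce labels
```

Two common design choices appear in the sketch: the downstream stage gets a freshly initialized head (the pre-training head is task-specific and discarded), and fine-tuning uses a smaller learning rate so the transferred encoder weights are adjusted gently rather than overwritten.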

