DUET is deployed on a powerful cloud server, requiring only the low cost of forward propagation and low-latency data transmission between the device and the cloud.
In the past few years, transformer-based pre-trained language models have achieved astounding success in both industry and academia.
In this paper, we study the problem of self-supervised node representation learning on non-homophilous graphs.
Meta-learning has emerged as a potent paradigm for rapidly learning few-shot tasks by leveraging the meta-knowledge acquired from meta-training tasks.
Meta Reinforcement Learning (MRL) enables an agent to learn from a limited number of past trajectories and extrapolate to a new task.
While few-shot learning (FSL) aims for rapid generalization to new concepts with little supervision, self-supervised learning (SSL) constructs supervisory signals computed directly from unlabeled data.
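As a minimal sketch of that idea (not taken from any of the works above), the snippet below builds a rotation-prediction pretext task, a common way SSL derives a supervisory signal from unlabeled images alone; the function name, array shapes, and use of NumPy are illustrative assumptions.

```python
# Illustrative sketch: generate pseudo-labels from unlabeled data itself.
import numpy as np

def make_rotation_batch(images: np.ndarray):
    """Given unlabeled square images (N, H, W), return rotated copies and
    pseudo-labels in {0, 1, 2, 3} for 0/90/180/270-degree rotations."""
    rotated, labels = [], []
    for img in images:
        k = np.random.randint(4)          # pick a rotation at random
        rotated.append(np.rot90(img, k))  # the transformed input
        labels.append(k)                  # the self-generated supervisory signal
    return np.stack(rotated), np.array(labels)

# Usage: the labels come "for free" from the data, with no human annotation.
unlabeled = np.random.rand(8, 32, 32)
x, y = make_rotation_batch(unlabeled)
```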
It is important for an agent to learn a widely applicable, general-purpose policy that can achieve diverse goals, including goals specified by images and text descriptions.