Time-aware Relational Graph Attention Network for Temporal Knowledge Graph Embeddings

29 Sep 2021 · Chengjin Xu, Fenglong Su, Jens Lehmann

Embedding-based representation learning approaches for knowledge graphs (KGs) have mostly been designed for static data. However, many KGs involve temporal data, which creates the need for new representation learning approaches that can characterize and reason over time. In this work, we propose a Time-aware Relational Graph ATtention Network (TR-GAT) for temporal knowledge graph (TKG) embeddings, in which the initial feature of each entity is represented by fusing its own embedding with the embeddings of its connected relations and timestamps as well as its neighboring entities. Unlike existing temporal GNN models, which discretize temporal graphs into multiple snapshots, we treat timestamps as properties of the links between entities. To further incorporate relation and time information into the graph structure, we use a self-attention mechanism that assigns different weights to different nodes according to the corresponding link features, i.e., the embeddings of the relevant relations and timestamps within a neighborhood. Experimental results show that our approach achieves state-of-the-art performance on TKG completion and entity alignment tasks across several well-established TKG datasets, owing to the effective and efficient integration of time information.
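The abstract describes attention weights that depend on link features, i.e. the relation and timestamp embeddings attached to each edge. The sketch below illustrates this idea in PyTorch; it is not the authors' released implementation, and all names (`TRGATLayer`, the concatenation scheme, the additive message fusion) are assumptions made for illustration only.

```python
# Minimal sketch of a time-aware relational graph attention layer, assuming
# edges are given as (head, tail, relation, timestamp) index tuples.
# Illustrative only; not the paper's official code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TRGATLayer(nn.Module):
    """One attention layer where each edge carries a relation and a timestamp.

    A neighbor's attention weight depends on the link feature, i.e. the
    embeddings of the relation and timestamp on the connecting edge.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.w = nn.Linear(dim, dim, bias=False)       # shared entity projection
        self.attn = nn.Linear(4 * dim, 1, bias=False)  # score from concatenated features

    def forward(self, ent_emb, rel_emb, time_emb, edges):
        # edges: LongTensor of shape (E, 4) with columns (head, tail, relation, timestamp)
        h, t, r, ts = edges[:, 0], edges[:, 1], edges[:, 2], edges[:, 3]
        x = self.w(ent_emb)

        # Per-edge feature: head entity, neighbor entity, relation, timestamp embeddings.
        edge_feat = torch.cat([x[h], x[t], rel_emb[r], time_emb[ts]], dim=-1)
        score = F.leaky_relu(self.attn(edge_feat)).squeeze(-1)            # (E,)

        # Softmax over each head node's neighborhood (segment softmax over edges).
        exp = (score - score.max()).exp()
        denom = torch.zeros(ent_emb.size(0), device=exp.device).index_add_(0, h, exp)
        alpha = exp / (denom[h] + 1e-16)                                   # (E,)

        # Aggregate neighbor messages weighted by attention; each message fuses
        # neighbor, relation, and timestamp information (additive fusion assumed here).
        msg = alpha.unsqueeze(-1) * (x[t] + rel_emb[r] + time_emb[ts])
        out = torch.zeros_like(x).index_add_(0, h, msg)
        return F.relu(out + x)                                             # residual update
```

The key point mirrored from the abstract is that timestamps are treated as edge properties rather than graph snapshots: the time embedding enters both the attention score and the aggregated message, so no discretization of the graph into per-timestep snapshots is needed.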
