Furthermore, these models are all trained offline and therefore cannot adapt well to changes in evolutionary patterns that emerge later.
Besides entity-centric knowledge, usually organized as a Knowledge Graph (KG), events are also an essential kind of knowledge about the world, which has motivated event-centric knowledge representation forms such as the Event KG (EKG).
In this paper, we propose a Transformer-based model, called MCPredictor, which integrates deep event-level and script-level information for script event prediction.
Specifically, at the clue searching stage, CluSTeR learns a beam search policy via reinforcement learning (RL) to induce multiple clues from historical facts.
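To make the beam-search component concrete, the following sketch keeps the top-scoring partial clue chains at each step. The toy graph and the scoring table are hypothetical stand-ins for CluSTeR's historical facts and its RL-learned policy, not the actual model.

```python
# Illustrative beam search over chains of historical facts.
# The scoring table below is a hypothetical stand-in for the
# RL-learned policy described in the text.

def beam_search(start, successors, score, beam_width=2, depth=2):
    """Keep the top-`beam_width` partial clue chains at each step."""
    beams = [([start], 0.0)]
    for _ in range(depth):
        candidates = []
        for path, total in beams:
            for nxt in successors(path[-1]):
                candidates.append((path + [nxt], total + score(path[-1], nxt)))
        if not candidates:
            break
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
    return beams

# Toy historical-fact graph: entity -> entities reachable via past facts.
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D", "E"], "D": [], "E": []}
toy_score = {("A", "B"): 0.9, ("A", "C"): 0.5, ("B", "D"): 0.3,
             ("C", "D"): 0.8, ("C", "E"): 0.1}

clues = beam_search("A", graph.__getitem__,
                    lambda u, v: toy_score[(u, v)], beam_width=2, depth=2)
```

Note that the greedy first step (A→B) is not part of the best final chain; keeping a beam of width two is what lets the higher-scoring A→C→D chain survive.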
To capture these properties effectively and efficiently, we propose a novel Recurrent Evolution network based on Graph Convolution Network (GCN), called RE-GCN, which learns the evolutional representations of entities and relations at each timestamp by modeling the KG sequence recurrently.
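The recurrent-evolution idea can be sketched as follows: at each timestamp, a one-layer mean aggregation (a stand-in for a GCN layer) updates entity embeddings from that timestamp's edges, and a gated update carries them forward. All weights here are random and the gate is fixed; this is a minimal illustration of the recurrence, not RE-GCN's actual architecture.

```python
import numpy as np

# Simplified recurrent evolution over a KG sequence: per timestamp,
# mean-aggregate neighbor messages (GCN stand-in), then blend with
# the previous embeddings via a fixed gate (GRU stand-in).

rng = np.random.default_rng(0)
num_entities, dim = 4, 8
H = rng.normal(size=(num_entities, dim))            # initial entity embeddings
W_agg = rng.normal(size=(dim, dim)) / np.sqrt(dim)  # random message weights

def gcn_step(H, edges):
    """Mean-aggregate incoming neighbor messages for each entity."""
    out = H.copy()
    for ent in range(len(H)):
        nbrs = [s for s, o in edges if o == ent]
        if nbrs:
            msg = np.mean([H[s] @ W_agg for s in nbrs], axis=0)
            out[ent] = np.tanh(msg)
    return out

def gated_update(H_prev, H_new, gate=0.5):
    """Fixed-gate recurrent update carrying history forward."""
    return gate * H_new + (1 - gate) * H_prev

# KG sequence: one edge list (subject, object) per timestamp.
kg_sequence = [[(0, 1), (2, 1)], [(1, 3)], [(3, 0), (1, 0)]]

for edges in kg_sequence:
    H = gated_update(H, gcn_step(H, edges))
```

After the loop, `H` holds evolutional entity representations that reflect the whole KG sequence rather than any single snapshot.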
However, they mainly focus on link prediction on binary relational data, where facts are usually represented as triples in the form of (head entity, relation, tail entity).
To resolve event coreference, existing methods usually calculate the similarities between event mentions and between specific kinds of event arguments.
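A minimal sketch of this similarity-based scheme: two mentions are linked when the cosine similarity of their trigger representations and the overlap of their argument sets both exceed thresholds. The embeddings, thresholds, and field names here are hand-made for illustration; real systems learn these representations and scores.

```python
import numpy as np

# Similarity-based event coreference sketch: compare trigger
# embeddings (cosine) and argument sets (Jaccard overlap).

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def coreferent(m1, m2, sim_t=0.8, arg_t=0.5):
    """Link two mentions only if both similarity checks pass."""
    return (cosine(m1["trigger"], m2["trigger"]) >= sim_t
            and jaccard(m1["args"], m2["args"]) >= arg_t)

# Toy mentions with hand-picked trigger vectors and argument sets.
m_a = {"trigger": np.array([1.0, 0.1, 0.0]), "args": {"Obama", "Chicago"}}
m_b = {"trigger": np.array([0.9, 0.2, 0.1]), "args": {"Obama", "Chicago"}}
m_c = {"trigger": np.array([0.0, 1.0, 0.2]), "args": {"IBM"}}
```

Here `m_a` and `m_b` would be merged into one event cluster, while `m_c` stays separate because both its trigger and its arguments differ.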
It aims to infer an unknown element of a partial fact, which consists of a primary triple coupled with any number of auxiliary descriptions.
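The shape of such a partial fact can be sketched as a primary triple plus a list of auxiliary (attribute, value) descriptions, with one element masked as the prediction target. The field names and the example fact below are illustrative, not taken from any particular system.

```python
# A "partial fact": primary triple + auxiliary descriptions, with the
# unknown element marked as "?". Field names are illustrative only.

fact = {
    "triple": ("AlbertEinstein", "educated_at", "?"),   # masked tail entity
    "qualifiers": [("degree", "PhD"), ("end_time", "1905")],
}

def query_position(fact):
    """Locate the unknown element the model must infer."""
    for i, elem in enumerate(fact["triple"]):
        if elem == "?":
            return ("triple", i)
    for i, (attr, val) in enumerate(fact["qualifiers"]):
        if attr == "?" or val == "?":
            return ("qualifier", i)
    return None
```

The unknown element may sit in the primary triple (as above) or in any auxiliary description, which is what distinguishes this task from plain triple completion.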
Document-level information is very important for event detection, even at the sentence level.
Predicting anchor links across social networks has important implications for an array of applications, including cross-network information diffusion and cross-domain recommendation.
Knowledge graph embedding aims to represent entities and relations in a large-scale knowledge graph as elements in a continuous vector space.
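A classic instance of this idea is a TransE-style score, under which a triple (h, r, t) is plausible when the vector h + r lands near t. The vectors below are hand-picked to make the geometry visible, not learned; this is a toy illustration of the embedding idea, not a full model.

```python
import numpy as np

# TransE-style plausibility score: -||h + r - t||.
# Higher (less negative) means the triple is more plausible.

def transe_score(h, r, t):
    return -float(np.linalg.norm(h + r - t))

# Hand-picked 2-d embeddings for illustration.
ent = {"Paris":  np.array([1.0, 0.0]),
       "France": np.array([1.0, 1.0]),
       "Berlin": np.array([0.0, 0.0])}
rel = {"capital_of": np.array([0.0, 1.0])}

good = transe_score(ent["Paris"], rel["capital_of"], ent["France"])
bad = transe_score(ent["Berlin"], rel["capital_of"], ent["France"])
```

In a trained model these vectors are optimized so that observed triples score higher than corrupted ones, which is exactly what lets the continuous space support link prediction.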