460 papers with code • 69 benchmarks • 45 datasets
Link prediction is the task of estimating the probability that a link exists between two nodes in a graph.
(Image credit: Inductive Representation Learning on Large Graphs)
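As a minimal illustration of the task, a candidate link can be scored from learned node embeddings. Everything below (the random embeddings, the dot-product score, the sigmoid read-out) is an assumed toy setup, not any particular paper's method; in practice the embeddings would come from a trained model such as GraphSAGE or node2vec.

```python
import numpy as np

rng = np.random.default_rng(0)
num_nodes, dim = 100, 16
emb = rng.normal(size=(num_nodes, dim))  # stand-in node embeddings

def link_probability(u: int, v: int) -> float:
    """Estimate P(edge u-v) as the sigmoid of the embedding dot product."""
    score = emb[u] @ emb[v]
    return 1.0 / (1.0 + np.exp(-score))

print(link_probability(3, 7))
```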
Furthermore, Cluster-GCN allows us to train much deeper GCNs without significant time and memory overhead, which leads to improved prediction accuracy: using a 5-layer Cluster-GCN, we achieve a state-of-the-art test F1 score of 99.36 on the PPI dataset, while the previous best result was 98.71.
Ranked #1 on Node Classification on Amazon2M
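A hedged sketch of the training pattern the excerpt describes: partition the graph into clusters, then run each GCN update on one cluster's induced subgraph, so per-step memory is bounded by the cluster size rather than the whole graph. The random partition and the dense numpy layer are illustrative stand-ins; the paper itself uses METIS partitions and a full training pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 8
A = (rng.random((n, n)) < 0.05).astype(float)
A = np.maximum(A, A.T)                  # symmetric toy adjacency
X = rng.normal(size=(n, d))             # node features
W = rng.normal(size=(d, d)) * 0.1       # one GCN layer's weights

clusters = np.array_split(rng.permutation(n), 10)  # stand-in for METIS

def gcn_layer(A_sub, X_sub, W):
    """One propagation step: ReLU(D^-1/2 (A+I) D^-1/2 X W)."""
    A_hat = A_sub + np.eye(len(A_sub))
    d_inv = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv[:, None] * d_inv[None, :]
    return np.maximum(A_norm @ X_sub @ W, 0.0)

for nodes in clusters:                  # one "mini-batch" per cluster
    A_sub = A[np.ix_(nodes, nodes)]     # induced subgraph only
    H = gcn_layer(A_sub, X[nodes], W)   # loss/updates would go here
```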
Recent interest in graph embedding methods has focused on learning a single representation for each node in the graph.
The task becomes more challenging on temporal knowledge graphs, where each fact is associated with a timestamp.
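For illustration only (not necessarily this paper's model), one simple way to make a translational scorer time-aware, in the spirit of TTransE, is to give each timestamp its own embedding and include it in the translation; all parameters below are random stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 16
ent = rng.normal(size=(50, dim))    # entity embeddings (stand-ins)
rel = rng.normal(size=(10, dim))    # relation embeddings
time = rng.normal(size=(30, dim))   # one embedding per timestamp

def score(h: int, r: int, t: int, tau: int) -> float:
    """Lower is better: || e_h + e_r + e_tau - e_t ||."""
    return float(np.linalg.norm(ent[h] + rel[r] + time[tau] - ent[t]))
```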
This work presents Contextualized Knowledge Graph Embedding (CoKE), a novel paradigm that takes the contextual nature of entities and relations into account and learns dynamic, flexible, and fully contextualized entity and relation embeddings.
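The sketch below illustrates only the general idea of context-dependent representations: an entity's vector is conditioned on the relation it appears with, rather than coming from one static lookup. CoKE itself uses a Transformer over edge and path sequences; the mixing matrix here is a hypothetical stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 16
ent = rng.normal(size=(100, dim))           # static entity embeddings
rel = rng.normal(size=(20, dim))            # relation embeddings
Wc = rng.normal(size=(2 * dim, dim)) * 0.1  # hypothetical context mixer

def contextual_entity(e: int, r: int) -> np.ndarray:
    """Entity representation that changes with its relation context."""
    return np.tanh(np.concatenate([ent[e], rel[r]]) @ Wc)
```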
However, FM models feature interactions only linearly, which can be insufficient for capturing the non-linear and complex structure inherent in real-world data.
Ranked #2 on Link Prediction on MovieLens 25M
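For context, this is the standard second-order FM predictor the excerpt criticizes: every pairwise interaction enters through a fixed inner product <v_i, v_j> x_i x_j, with no non-linearity on top. The weights are random stand-ins for trained parameters, and the pairwise term uses the usual O(nk) reformulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 4                        # features, latent dimension
w0, w = 0.1, rng.normal(size=n)     # bias and linear weights
V = rng.normal(size=(n, k))         # latent factors, one row per feature

def fm_predict(x: np.ndarray) -> float:
    """w0 + <w, x> + sum_{i<j} <v_i, v_j> x_i x_j."""
    linear = w0 + w @ x
    s = V.T @ x                     # shape (k,)
    pairwise = 0.5 * float(np.sum(s * s - (V ** 2).T @ (x ** 2)))
    return linear + pairwise

print(fm_predict(rng.random(n)))
```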
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
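A small numpy sketch of one masked attention head in the spirit of GAT: attention logits are computed from a shared linear transform and an attention vector, masked to the graph's edges before the softmax, so each node attends only over its neighborhood. Shapes and parameters are toy stand-ins, not the paper's reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_out = 6, 5, 4
H = rng.normal(size=(n, d_in))               # input node features
W = rng.normal(size=(d_in, d_out))           # shared linear transform
a = rng.normal(size=(2 * d_out,))            # attention vector
A = np.eye(n) + (rng.random((n, n)) < 0.3)   # toy adjacency + self-loops

Z = H @ W
logits = np.array([[a @ np.concatenate([Z[i], Z[j]]) for j in range(n)]
                   for i in range(n)])
logits = np.where(logits > 0, logits, 0.2 * logits)  # LeakyReLU
logits = np.where(A > 0, logits, -1e9)               # mask non-edges
alpha = np.exp(logits - logits.max(1, keepdims=True))
alpha /= alpha.sum(1, keepdims=True)                 # softmax per node
H_out = alpha @ Z                                    # attended features
```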
Graph embedding methods produce unsupervised node features from graphs that can then be used for a variety of machine learning tasks.
Ranked #1 on Link Prediction on YouTube (Macro F1 metric)
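The usual downstream recipe this excerpt alludes to, sketched under assumed data: combine pretrained node embeddings per candidate edge (here with a Hadamard product) and hand the resulting features to any off-the-shelf classifier. The embeddings and edge lists are random stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
emb = rng.normal(size=(1000, 32))            # pretrained node embeddings
pos = rng.integers(0, 1000, size=(500, 2))   # observed edges (label 1)
neg = rng.integers(0, 1000, size=(500, 2))   # sampled non-edges (label 0)

def edge_features(pairs: np.ndarray) -> np.ndarray:
    """Hadamard combination: one feature vector per candidate edge."""
    return emb[pairs[:, 0]] * emb[pairs[:, 1]]

X = np.vstack([edge_features(pos), edge_features(neg)])
y = np.concatenate([np.ones(len(pos)), np.zeros(len(neg))])
# X, y can now be fed to any standard classifier.
```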
We consider learning representations of entities and relations in knowledge bases (KBs) using the neural-embedding approach.
Ranked #9 on Link Property Prediction on ogbl-biokg
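One concrete instance of the neural-embedding approach is a bilinear-diagonal (DistMult-style) scorer, which rates a triple (h, r, t) as sum(e_h * w_r * e_t); whether this matches the model behind this entry is an assumption, and the parameters below are random stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 16
ent = rng.normal(size=(100, dim))   # entity embeddings
rel = rng.normal(size=(20, dim))    # diagonal relation parameters

def score(h: int, r: int, t: int) -> float:
    """Bilinear-diagonal score: higher means more plausible."""
    return float(np.sum(ent[h] * rel[r] * ent[t]))

# Rank all candidate tails for a (head=5, relation=2) query:
ranks = np.argsort(-(ent[5] * rel[2]) @ ent.T)
```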