Graph embeddings learn a mapping from a network to a vector space, while preserving relevant network properties.
Representation learning has become an invaluable approach for learning from symbolic data such as text and graphs.
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
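The masked self-attention that GATs apply over a node's neighbourhood can be sketched in a few lines; the following is a minimal NumPy toy for one layer, not the authors' implementation, and the shapes, LeakyReLU slope, and function name are assumptions:

```python
import numpy as np

def gat_layer(H, A, W, a, alpha=0.2):
    """One masked self-attention layer in the spirit of GAT (sketch).

    H: (N, F) node features; A: (N, N) adjacency with self-loops;
    W: (F, F2) shared weight matrix; a: (2*F2,) attention vector.
    Returns the aggregated node features and the attention coefficients.
    """
    Z = H @ W                                    # transform every node
    N = Z.shape[0]
    e = np.empty((N, N))
    for i in range(N):
        for j in range(N):
            s = a @ np.concatenate([Z[i], Z[j]])
            e[i, j] = s if s > 0 else alpha * s  # LeakyReLU
    e = np.where(A > 0, e, -np.inf)              # mask: only neighbours
    e -= e.max(axis=1, keepdims=True)            # numerically stable softmax
    att = np.exp(e)
    att /= att.sum(axis=1, keepdims=True)
    return att @ Z, att                          # weighted neighbour average
```

The masking step is what distinguishes this from ordinary self-attention: coefficients for non-adjacent node pairs are forced to zero, so each node aggregates only over its graph neighbourhood.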

Graphs, such as social networks, word co-occurrence networks, and communication networks, occur naturally in various real-world applications.
This paper studies the problem of embedding very large information networks into low-dimensional vector spaces, which is useful in many tasks such as visualization, node classification, and link prediction.
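One way such an embedding can preserve network structure is LINE's first-order proximity: directly connected nodes should have similar vectors. A minimal sketch of that objective (simplified and unnormalised; the function name is illustrative, not from the paper's code):

```python
import numpy as np

def line_first_order_loss(U, edges, weights):
    """First-order proximity objective in the spirit of LINE (sketch).

    U: (N, d) node embedding matrix; edges: list of (i, j) pairs;
    weights: matching edge weights. Minimising this loss pulls the
    embeddings of directly connected nodes together.
    """
    loss = 0.0
    for (i, j), w in zip(edges, weights):
        p = 1.0 / (1.0 + np.exp(-U[i] @ U[j]))  # sigmoid of inner product
        loss -= w * np.log(p)                   # weighted log-likelihood
    return loss
```

Embeddings that align connected nodes yield a smaller loss than embeddings that point them in opposite directions, which is the sense in which the low-dimensional vectors "preserve" the network.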
Recent works on representation learning for graph-structured data predominantly focus on learning distributed representations of graph substructures such as nodes and subgraphs.
Implementations and experiments of graph embedding algorithms: DeepWalk, LINE (Large-scale Information Network Embedding), node2vec, SDNE (Structural Deep Network Embedding), and struc2vec.
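DeepWalk and node2vec both start by sampling random walks over the graph and then feed the walks, treated as sentences, to a skip-gram model (e.g. gensim's Word2Vec) to learn node embeddings. A sketch of the simpler, uniform-walk corpus generation (node2vec additionally biases the transition probabilities with its p and q parameters):

```python
import random
from collections import defaultdict

def random_walks(edges, num_walks=10, walk_length=5, seed=0):
    """DeepWalk-style uniform random walks (sketch).

    edges: iterable of (u, v) undirected pairs. Returns a list of walks,
    each a list of nodes, to be used as a skip-gram training corpus.
    """
    rng = random.Random(seed)
    adj = defaultdict(list)
    for u, v in edges:          # build an undirected adjacency list
        adj[u].append(v)
        adj[v].append(u)
    walks = []
    for _ in range(num_walks):
        for start in adj:       # start one walk from every node
            walk = [start]
            while len(walk) < walk_length:
                walk.append(rng.choice(adj[walk[-1]]))  # uniform step
            walks.append(walk)
    return walks
```

Each consecutive pair in a walk is a graph edge, so node co-occurrence in the corpus mirrors graph proximity, which is what the downstream skip-gram model turns into embedding similarity.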
The design of good heuristics or approximation algorithms for NP-hard combinatorial optimization problems often requires significant specialized knowledge and trial-and-error.
Deep learning on graphs has become a popular research topic with many applications.