214 papers with code • 1 benchmark • 7 datasets
Graph embeddings learn a mapping from a network to a vector space, while preserving relevant network properties.
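As a minimal illustrative sketch (not from any of the papers below), an embedding is just a table mapping each node ID to a fixed-length vector; downstream tasks then compare nodes via vector operations such as cosine similarity. The vectors here are random placeholders, where a real method would learn them so that graph structure is preserved:

```python
import numpy as np

# Toy illustration: an embedding assigns each node a fixed-length vector.
# Random placeholders stand in for vectors a real method would learn.
rng = np.random.default_rng(0)
nodes = ["a", "b", "c", "d"]
dim = 8
embedding = {v: rng.standard_normal(dim) for v in nodes}

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_ab = cosine(embedding["a"], embedding["b"])
```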
( Image credit: GAT )
Recent interest in graph embedding methods has focused on learning a single representation for each node in the graph.
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
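A single GAT-style attention head can be sketched in a few lines of NumPy. This is a simplified reading of the masked self-attention idea, not the authors' reference implementation: attention scores are computed for every node pair, masked to the graph's edges, normalized with a softmax, and used to average neighbour features.

```python
import numpy as np

# Minimal single-head GAT-style layer (sketch under assumed shapes).
# adj[i, j] > 0 means j is attended to from i; self-loops are included.
def gat_layer(H, adj, W, a, slope=0.2):
    Wh = H @ W                               # (N, F') linear projection
    Fp = Wh.shape[1]
    # e[i, j] = LeakyReLU(a^T [Wh_i || Wh_j]), split additively
    src = Wh @ a[:Fp]                        # contribution of node i
    dst = Wh @ a[Fp:]                        # contribution of node j
    e = src[:, None] + dst[None, :]
    e = np.where(e > 0, e, slope * e)        # LeakyReLU
    e = np.where(adj > 0, e, -1e9)           # mask: attend only over edges
    e = e - e.max(axis=1, keepdims=True)     # numerically stable softmax
    att = np.exp(e)
    att = att / att.sum(axis=1, keepdims=True)
    return att @ Wh                          # attention-weighted aggregation

rng = np.random.default_rng(1)
N, F, Fp = 5, 4, 3
H = rng.standard_normal((N, F))
# Path graph with self-loops
adj = np.eye(N) + np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)
W = rng.standard_normal((F, Fp))
a = rng.standard_normal(2 * Fp)
H_out = gat_layer(H, adj, W, a)
```

The `-1e9` mask is a common trick: after the softmax, non-edges receive essentially zero attention weight, which is what makes the self-attention "masked" to the graph structure.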
Ranked #2 on Node Classification on Wiki-Vote
Graph embedding methods produce unsupervised node features from graphs that can then be used for a variety of machine learning tasks.
Ranked #1 on Link Prediction on YouTube (Macro F1 metric)
Implementations and experiments for graph embedding algorithms: DeepWalk, LINE (Large-scale Information Network Embedding), node2vec, SDNE (Structural Deep Network Embedding), and struc2vec.
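The first stage shared by DeepWalk and node2vec is sampling truncated random walks, which are then treated as "sentences" for a skip-gram model. A hedged sketch of that sampling stage, assuming an adjacency-list graph (the skip-gram training step is omitted):

```python
import random

# Sketch of uniform random-walk sampling (DeepWalk-style; node2vec would
# bias the neighbour choice). `adj` maps each node to its neighbour list.
def random_walks(adj, num_walks=2, walk_len=4, seed=42):
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        nodes = list(adj)
        rng.shuffle(nodes)            # fresh node order each pass
        for start in nodes:
            walk = [start]
            while len(walk) < walk_len:
                nbrs = adj[walk[-1]]
                if not nbrs:          # dead end: truncate the walk
                    break
                walk.append(rng.choice(nbrs))
            walks.append(walk)
    return walks

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
walks = random_walks(adj)
```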
Ranked #2 on Node Classification on Wikipedia
This paper studies the problem of embedding very large information networks into low-dimensional vector spaces, which is useful in many tasks such as visualization, node classification, and link prediction.
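LINE's first-order proximity objective can be sketched with plain SGD. This is a deliberate simplification (observed edges only, no negative sampling, no edge weighting): each step nudges the endpoints of an edge so that the sigmoid of their inner product moves toward 1.

```python
import numpy as np

# Simplified first-order-proximity training loop (illustrative only;
# LINE proper adds negative sampling and edge-weight-proportional sampling).
def line_first_order(edges, num_nodes, dim=4, lr=0.05, epochs=200, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.normal(scale=0.1, size=(num_nodes, dim))
    for _ in range(epochs):
        for i, j in edges:
            s = 1.0 / (1.0 + np.exp(-U[i] @ U[j]))  # sigmoid(u_i . u_j)
            g = 1.0 - s                             # grad of log sigmoid
            U[i], U[j] = U[i] + lr * g * U[j], U[j] + lr * g * U[i]
    return U

edges = [(0, 1), (1, 2), (2, 0), (3, 4)]
U = line_first_order(edges, num_nodes=5)
```

Each update strictly increases the inner product of the edge's endpoints, so connected node pairs end up close in the embedding space.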
Ranked #5 on Node Classification on Wikipedia
Numeric values associated to edges of a knowledge graph have been used to represent uncertainty, edge importance, and even out-of-band knowledge in a growing number of scenarios, ranging from genetic data to social networks.
In particular, we develop a novel graph embedding algorithm, High-Order Proximity preserved Embedding (HOPE for short), which scalably preserves high-order proximities of large-scale graphs and captures asymmetric transitivity.
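The core idea can be sketched by factorizing a Katz proximity matrix with an SVD, giving each node separate source and target vectors so that directed (asymmetric) proximity is preserved. Note the simplifications: a dense proximity matrix and a full SVD, whereas the paper uses a generalized SVD to reach large-scale graphs.

```python
import numpy as np

# Sketch of HOPE's idea: factorize an asymmetric proximity matrix into
# source and target embeddings (dense/full-SVD simplification).
def hope_embed(A, beta=0.1, dim=2):
    n = A.shape[0]
    # Katz proximity: sum over k >= 1 of (beta * A)^k
    S = np.linalg.inv(np.eye(n) - beta * A) @ (beta * A)
    U, sigma, Vt = np.linalg.svd(S)
    sqrt_sig = np.sqrt(sigma[:dim])
    src = U[:, :dim] * sqrt_sig      # source embeddings
    dst = Vt[:dim].T * sqrt_sig      # target embeddings (asymmetric role)
    return src, dst

# Directed toy graph: 0 -> 1 -> 2
A = np.array([[0, 1, 0], [0, 0, 1], [0, 0, 0]], dtype=float)
src, dst = hope_embed(A)
```

Here `src[i] @ dst[j]` approximates the Katz proximity from `i` to `j`, which differs from the proximity from `j` to `i`; a single symmetric embedding could not represent that.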
We present Karate Club, a Python framework combining more than 30 state-of-the-art graph mining algorithms that can solve unsupervised machine learning tasks.