# Graph Embedding

449 papers with code • 1 benchmark • 10 datasets

Graph embeddings learn a mapping from a network to a vector space, while preserving relevant network properties.

(Image credit: GAT)

## Libraries

Use these libraries to find Graph Embedding models and implementations.

## Datasets

## Subtasks

## Most implemented papers

# Graph Attention Networks

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
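The core idea can be illustrated with a minimal NumPy sketch of one masked self-attention layer: attention logits are computed for every node pair, but the softmax is restricted ("masked") to each node's graph neighbors. Function and variable names here are illustrative, not from the GAT reference implementation.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_layer(h, W, a, adj):
    """One masked self-attention layer in the spirit of GAT.
    h: (N, F) node features; W: (F, Fp) shared projection;
    a: (2*Fp,) attention vector; adj: (N, N) adjacency with self-loops."""
    z = h @ W                                   # project node features
    N = z.shape[0]
    # raw logits e_ij = LeakyReLU(a^T [z_i || z_j]) for every node pair
    e = np.array([[leaky_relu(a @ np.concatenate([z[i], z[j]]))
                   for j in range(N)] for i in range(N)])
    e = np.where(adj > 0, e, -np.inf)           # mask: attend only to neighbors
    e = e - e.max(axis=1, keepdims=True)        # numerically stable softmax
    alpha = np.exp(e)
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return alpha @ z                            # attention-weighted aggregation
```

A production layer would add learned parameters, multiple attention heads, and a nonlinearity on the output; this sketch only shows the masking mechanism the abstract refers to.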

# Poincaré Embeddings for Learning Hierarchical Representations

Representation learning has become an invaluable approach for learning from symbolic data such as text and graphs.
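Poincaré embeddings place points inside the unit ball and measure them with the hyperbolic (Poincaré) distance, which grows rapidly near the boundary and so can encode hierarchy depth. A minimal sketch of that distance function (the function name is ours):

```python
import numpy as np

def poincare_distance(u, v):
    """Hyperbolic distance between two points strictly inside the unit ball:
    d(u, v) = arcosh(1 + 2||u-v||^2 / ((1-||u||^2)(1-||v||^2)))."""
    sq_dist = np.sum((u - v) ** 2)
    denom = (1 - np.sum(u ** 2)) * (1 - np.sum(v ** 2))
    return np.arccosh(1 + 2 * sq_dist / denom)
```

The actual method also requires Riemannian optimization to keep embeddings inside the ball during training, which this sketch omits.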

# RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space

We study the problem of learning representations of entities and relations in knowledge graphs for predicting missing links.
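RotatE models each relation as an element-wise rotation in the complex plane, so a true triplet (h, r, t) should satisfy h ∘ r ≈ t with |r| = 1. A minimal scoring sketch (names are illustrative):

```python
import numpy as np

def rotate_score(h, r_phase, t):
    """Score a triplet (h, r, t): the relation is a unit-modulus complex
    rotation, so a perfect match means h * e^{i*phase} == t (score 0)."""
    r = np.exp(1j * r_phase)            # unit-modulus rotation per dimension
    return -np.linalg.norm(h * r - t)   # higher (closer to 0) is better
```

Constraining the relation to unit modulus is what lets RotatE represent symmetry, antisymmetry, inversion, and composition patterns.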

# Learning Hierarchy-Aware Knowledge Graph Embeddings for Link Prediction

HAKE is inspired by the fact that concentric circles in the polar coordinate system can naturally reflect the hierarchy.
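In the polar-coordinate view, the modulus part separates entities at different hierarchy levels (different radii) while the phase part distinguishes entities at the same level (same radius, different angle). A simplified sketch of a HAKE-style score combining the two parts, with an assumed weighting parameter `lam`:

```python
import numpy as np

def hake_score(h_m, h_p, r_m, r_p, t_m, t_p, lam=0.5):
    """Polar-coordinate scoring sketch: a modulus (radius) term for
    hierarchy depth plus a phase (angle) term for same-level entities."""
    modulus = np.linalg.norm(h_m * r_m - t_m, ord=2)          # radial match
    phase = np.linalg.norm(np.sin((h_p + r_p - t_p) / 2), ord=1)  # angular match
    return -(modulus + lam * phase)     # 0 for a perfect match
```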

# LINE: Large-scale Information Network Embedding

This paper studies the problem of embedding very large information networks into low-dimensional vector spaces, which is useful in many tasks such as visualization, node classification, and link prediction.

# Learning Combinatorial Optimization Algorithms over Graphs

The design of good heuristics or approximation algorithms for NP-hard combinatorial optimization problems often requires significant specialized knowledge and trial-and-error.

# Inductive Relation Prediction by Subgraph Reasoning

The dominant paradigm for relation prediction in knowledge graphs involves learning and operating on latent representations (i.e., embeddings) of entities and relations.

# GraphSAINT: Graph Sampling Based Inductive Learning Method

Graph Convolutional Networks (GCNs) are powerful models for learning representations of attributed graphs.
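For context, a single GCN propagation step is just symmetrically normalized neighborhood averaging followed by a linear map and nonlinearity. A minimal dense NumPy sketch (function name is ours; real GCN/GraphSAINT implementations use sparse operations and minibatch sampling):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN step: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # symmetric degree normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0)
```

GraphSAINT's contribution is to train such layers on sampled subgraphs rather than the full graph, which this sketch does not show.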

# graph2vec: Learning Distributed Representations of Graphs

Recent works on representation learning for graph structured data predominantly focus on learning distributed representations of graph substructures such as nodes and subgraphs.

# NSCaching: Simple and Efficient Negative Sampling for Knowledge Graph Embedding

Negative sampling, which samples negative triplets from non-observed ones in the training data, is an important step in knowledge graph (KG) embedding.
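The baseline this paper improves on is uniform corruption: replace the head or tail of an observed triplet with a random entity. A minimal sketch of that step (names are illustrative; NSCaching itself maintains a cache of high-scoring negatives rather than sampling uniformly):

```python
import random

def sample_negatives(triplet, entities, k):
    """Generate k negative triplets by corrupting the head or the tail
    of a positive (head, relation, tail) triplet with a random entity."""
    h, r, t = triplet
    negatives = []
    while len(negatives) < k:
        e = random.choice(entities)
        # corrupt the head or the tail with equal probability
        candidate = (e, r, t) if random.random() < 0.5 else (h, r, e)
        if candidate != triplet:              # skip the positive itself
            negatives.append(candidate)
    return negatives
```

In practice one also checks candidates against all observed triplets, not just the current positive, to avoid false negatives.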