# Node Property Prediction

50 papers with code • 5 benchmarks • 1 dataset

## Libraries

Use these libraries to find Node Property Prediction models and implementations.

## Most implemented papers

### SSD: Single Shot MultiBox Detector

Experimental results on the PASCAL VOC, MS COCO, and ILSVRC datasets confirm that SSD has comparable accuracy to methods that utilize an additional object proposal step and is much faster, while providing a unified framework for both training and inference.

### Graph Attention Networks

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
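The masked self-attention works by scoring each neighbor j of node i with LeakyReLU(a · [h_i ‖ h_j]) and softmax-normalizing the scores over the neighborhood only (the "mask"). A minimal pure-Python sketch; the attention vector `a` and the toy features are illustrative values, not from the paper:

```python
import math

def gat_attention(h_i, neighbor_feats, a):
    """Score each neighbor with LeakyReLU(a . [h_i || h_j]), then
    softmax over the neighborhood only (the attention "mask")."""
    def leaky_relu(x, slope=0.2):
        return x if x > 0 else slope * x
    scores = [leaky_relu(sum(w * v for w, v in zip(a, h_i + h_j)))
              for h_j in neighbor_feats]
    # Numerically stable softmax over the neighbor scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy example: one query node with two neighbors (made-up values)
alphas = gat_attention([1.0, 0.0], [[0.0, 1.0], [1.0, 1.0]],
                       a=[0.5, -0.25, 0.25, 0.5])
```

In the full layer, features are first transformed by a shared weight matrix W and the resulting coefficients weight a sum over neighbor features; multi-head attention repeats this K times and concatenates.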

### Semi-Supervised Classification with Graph Convolutional Networks

We present a scalable approach for semi-supervised learning on graph-structured data that is based on an efficient variant of convolutional neural networks which operate directly on graphs.
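The convolutional variant reduces to the propagation rule H' = σ(D̂^(-1/2) Â D̂^(-1/2) H W) with Â = A + I (self-loops added). A rough pure-Python sketch of one layer; the toy path graph and identity weight matrix are illustrative assumptions:

```python
import math

def gcn_layer(adj, feats, weight):
    """One GCN propagation step: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    n, fdim, odim = len(adj), len(feats[0]), len(weight[0])
    # Add self-loops: A_hat = A + I
    a_hat = [[adj[i][j] + (i == j) for j in range(n)] for i in range(n)]
    deg = [sum(row) for row in a_hat]
    # Symmetric normalization: A_norm[i][j] = A_hat[i][j] / sqrt(d_i * d_j)
    a_norm = [[a_hat[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)]
              for i in range(n)]
    # Aggregate normalized neighbor features, then linear map + ReLU
    agg = [[sum(a_norm[i][k] * feats[k][f] for k in range(n))
            for f in range(fdim)] for i in range(n)]
    return [[max(0.0, sum(agg[i][f] * weight[f][o] for f in range(fdim)))
             for o in range(odim)] for i in range(n)]

# Toy path graph 0-1-2 with 2-dim features and an identity weight matrix
adj = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
h = gcn_layer(adj, feats, [[1.0, 0.0], [0.0, 1.0]])
```

Stacking two such layers (the paper's usual setup) lets each node see its 2-hop neighborhood.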

### Modeling Relational Data with Graph Convolutional Networks

We demonstrate the effectiveness of R-GCNs as a stand-alone model for entity classification.
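R-GCNs extend GCN message passing with relation-specific weights: h'_i = σ(W₀h_i + Σ_r Σ_{j∈N_r(i)} W_r h_j / |N_r(i)|). A hedged sketch for a single node, with made-up relation names, features, and weights (the paper additionally regularizes the per-relation weights, omitted here):

```python
def rgcn_layer(node, rel_neigh, feats, w_rel, w_self):
    """R-GCN update for one node: a self-transform plus one normalized,
    relation-specific transform per relation type."""
    dim = len(feats[node])
    # Self-connection: W0 @ h_i
    out = [sum(w_self[d][f] * feats[node][f] for f in range(dim))
           for d in range(dim)]
    # Per-relation neighbor messages: W_r @ h_j, normalized by |N_r(i)|
    for rel, neigh in rel_neigh.items():
        for j in neigh:
            for d in range(dim):
                out[d] += sum(w_rel[rel][d][f] * feats[j][f]
                              for f in range(dim)) / len(neigh)
    return [max(0.0, v) for v in out]  # ReLU

# Hypothetical 2-relation toy graph around node 0
feats = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}
rel_neigh = {"cites": [1], "links": [2]}
w_self = [[1.0, 0.0], [0.0, 1.0]]
w_rel = {"cites": [[0.5, 0.0], [0.0, 0.5]],
         "links": [[0.0, 0.0], [0.0, 0.0]]}
h0 = rgcn_layer(0, rel_neigh, feats, w_rel, w_self)
```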

### Open Graph Benchmark: Datasets for Machine Learning on Graphs

We present the Open Graph Benchmark (OGB), a diverse set of challenging and realistic benchmark datasets to facilitate scalable, robust, and reproducible graph machine learning (ML) research.

### Inductive Representation Learning on Large Graphs

Low-dimensional embeddings of nodes in large graphs have proved extremely useful in a variety of prediction tasks, from content recommendation to identifying protein functions.
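GraphSAGE is inductive because it learns aggregation functions rather than per-node embeddings: sample a fixed-size neighborhood, aggregate its features, and combine with the node's own features. A simplified mean-aggregator sketch (the toy graph and features are invented, and the learned weight matrix that would follow the concatenation is omitted):

```python
import random

def sage_mean_layer(node, adj_list, feats, sample_size=2, seed=0):
    """GraphSAGE-style step: sample up to `sample_size` neighbors,
    mean-pool their features, concatenate with the node's own features."""
    rng = random.Random(seed)
    neigh = adj_list[node]
    if len(neigh) > sample_size:
        neigh = rng.sample(neigh, sample_size)
    dim = len(feats[node])
    mean = [sum(feats[j][d] for j in neigh) / len(neigh) for d in range(dim)]
    return feats[node] + mean  # a learned linear map + nonlinearity would follow

# Toy star graph centered on node 0 (hypothetical features)
adj_list = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
feats = {0: [1.0, 0.0], 1: [0.0, 2.0], 2: [0.0, 4.0], 3: [0.0, 6.0]}
rep = sage_mean_layer(0, adj_list, feats)
```

Because only the aggregator is learned, the same parameters can embed nodes never seen during training.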

### How Attentive are Graph Attention Networks?

Because GATs use a static attention mechanism, there are simple graph problems that GAT cannot express: in a controlled problem, we show that static attention hinders GAT from even fitting the training data.
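The static/dynamic distinction comes down to operator order: GAT applies the nonlinearity after the dot product with the attention vector, so every query node ranks neighbors identically; GATv2 applies it between W and a, so the preferred neighbor can depend on the query. A scalar-feature sketch with invented weights (not the paper's construction) that exhibits the flip:

```python
def leaky(x, slope=0.2):
    return x if x > 0 else slope * x

def score_gat(a, hi, hj):
    # GAT (static): LeakyReLU AFTER the dot product, so the neighbor
    # ranking is monotone in a2*hj and identical for every query hi.
    a1, a2 = a
    return leaky(a1 * hi + a2 * hj)

def score_gatv2(w, a, hi, hj):
    # GATv2 (dynamic): W mixes query and neighbor, the nonlinearity sits
    # between W and a, so the preferred neighbor can depend on hi.
    z = [leaky(w[r][0] * hi + w[r][1] * hj) for r in range(len(w))]
    return sum(ar * zr for ar, zr in zip(a, z))

W, a2 = [[1, 1], [-1, 1]], [1, -1]
# With GATv2, query hi=3 prefers neighbor hj=1 while hi=-3 prefers hj=-1
prefer_pos = score_gatv2(W, a2, 3, 1) > score_gatv2(W, a2, 3, -1)
prefer_neg = score_gatv2(W, a2, -3, -1) > score_gatv2(W, a2, -3, 1)
```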

### GraphSAINT: Graph Sampling Based Inductive Learning Method

Graph Convolutional Networks (GCNs) are powerful models for learning representations of attributed graphs.
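GraphSAINT scales GCN training by sampling small subgraphs and running a full GCN on each one, with normalization terms to keep the estimates unbiased. A bare sketch of the node-sampler variant only; the paper also has edge and random-walk samplers and the bias-correcting normalization, all omitted here:

```python
import random

def sample_node_subgraph(adj_list, budget, seed=0):
    """GraphSAINT-style node sampler sketch: draw a node budget, then
    induce the subgraph on the sampled nodes for one training step."""
    rng = random.Random(seed)
    nodes = rng.sample(sorted(adj_list), budget)
    keep = set(nodes)
    # Keep only edges whose both endpoints were sampled
    return {v: [u for u in adj_list[v] if u in keep] for v in nodes}

# Toy graph; each call would yield one mini-batch subgraph
sub = sample_node_subgraph({0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}, 2)
```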

### Combining Label Propagation and Simple Models Out-performs Graph Neural Networks

Graph Neural Networks (GNNs) are the predominant technique for learning over graphs.
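A key ingredient of this line of work is classic label propagation: known labels are clamped, and unlabeled nodes repeatedly average their neighbors' label distributions. A small sketch on an invented path graph (the paper's full Correct-and-Smooth pipeline additionally uses a simple base predictor, not shown):

```python
def label_propagation(adj_list, labels, num_iters=10):
    """Iteratively average neighbor label distributions; labeled nodes
    stay clamped to their known one-hot vectors."""
    k = len(next(iter(labels.values())))
    # Unlabeled nodes start at the uniform distribution
    state = {v: labels.get(v, [1.0 / k] * k) for v in adj_list}
    for _ in range(num_iters):
        new = {}
        for v, neigh in adj_list.items():
            if v in labels:
                new[v] = labels[v]          # clamp known labels
            else:
                new[v] = [sum(state[u][c] for u in neigh) / len(neigh)
                          for c in range(k)]
        state = new
    return state

# Path 0-1-2-3 with labels only at the endpoints
adj_list = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
labels = {0: [1.0, 0.0], 3: [0.0, 1.0]}
out = label_propagation(adj_list, labels)
```

Interior nodes converge toward the nearer endpoint's class, which is the smoothness prior the paper exploits.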

### Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks

Furthermore, Cluster-GCN allows us to train much deeper GCNs without much time and memory overhead, which leads to improved prediction accuracy: using a 5-layer Cluster-GCN, we achieve a state-of-the-art test F1 score of 99.36 on the PPI dataset, while the previous best result was 98.71 by [16].
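The core idea is to partition the graph into clusters and use each cluster's induced subgraph as a mini-batch, so memory scales with the cluster rather than the exponentially growing neighborhood. A toy sketch; the round-robin partition below is a stand-in for the graph-clustering algorithm (e.g. METIS) the paper actually uses:

```python
def cluster_batches(adj_list, num_clusters):
    """Cluster-GCN sketch: partition nodes (round-robin here, as a
    stand-in for METIS) and induce one subgraph per cluster."""
    clusters = [[] for _ in range(num_clusters)]
    for i, v in enumerate(sorted(adj_list)):
        clusters[i % num_clusters].append(v)
    batches = []
    for c in clusters:
        keep = set(c)
        # Drop edges that cross cluster boundaries
        batches.append({v: [u for u in adj_list[v] if u in keep] for v in c})
    return batches

# Toy path graph split into two mini-batch subgraphs
batches = cluster_batches({0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}, 2)
```

A real clustering keeps most edges inside clusters, which is why the partition quality (not round-robin) matters in practice.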